Column: If you work for Uber or Amazon, you may be a victim of algorithmic wage discrimination
If you’ve ever worked for an on-demand app platform, or for Amazon, or even just as an independent contractor in the last few years, there’s a good chance that you’ve been discriminated against — by an algorithm.
I’ll explain.
Let’s say I’m a delivery driver, and I pick up the lunch you ordered from your local sushi joint and drop it off on your doorstep. It takes me 15 minutes, and I get paid $5. You too are a delivery driver for the same company; you accept the same order, make the delivery in the same amount of time, at the same level of quality. How much should you get paid for your work? Five dollars, right?
Seems pretty straightforward. The notion that people should be paid the same wages for doing the same work is one of the most fundamental assumptions about a fair labor market. And yet, according to new research from Veena Dubal, a law professor at UC Hastings, on-demand app and tech companies have been undermining this crucial compact, with deeply concerning implications for the future of work.
“From Amazon to Uber to the healthcare sector,” Dubal tells me, “workers are being paid different amounts for the same amount of work that is conducted for the same amount of time.”
Now let’s say I’m a delivery driver for Uber Eats or Postmates. These companies use black-box algorithms to determine how I get paid, so the amount I earn for picking up that sushi is going to be different every time I do the same delivery — and different from another worker making the same delivery for the same company. I may make $6.50 in one set of conditions but $4.25 in another; I am given little insight into why. And another driver might never make more than $3 for doing the exact same amount of work.
Dubal calls this “algorithmic wage discrimination,” and it’s a pernicious trend that has flown under the radar for too long. The practice, she says, can reduce your pay, undermine efforts to organize your workplace and exacerbate racial and gender discrimination. And it stands to be supercharged by the rise of AI.
In her paper, forthcoming in the Columbia Law Review, Dubal details this new kind of wage discrimination and what it looks like in practice. It starts with data collection.
Companies such as Uber, Instacart and Amazon are constantly collecting reams of granular data about the contract workers who use their platforms — where they live and work, what times of day and for how long they tend to work, what their earnings targets are and which kinds of jobs they are willing to accept. Dubal says these companies “use that data to personalize and differentiate wages for workers in ways unknown to them.”
In most cases, workers are given only two choices for each job they’re offered on a platform — accept or decline — and they have no power to negotiate their rates. With the information advantage entirely on their side, companies can use the data they’ve gathered to “calculate the exact wage rates necessary to incentivize desired behaviors.”
One of those desired behaviors is staying on the road as long as possible, so that workers are available to meet the constantly fluctuating levels of demand. As such, Dubal writes, the companies are motivated “to elongate the time between sending fares to any one driver” — just so long as drivers don’t get so impatient waiting for a ride that they end their shift. Remember, Uber drivers are not paid for any time they are not “engaged,” which often amounts to as much as 40% of a shift, and they have no say in when they get offered rides, either. “The company’s machine-learning technologies may even predict the amount of time a specific driver is willing to wait for a fare,” Dubal writes.
If the algorithm can predict that a worker in the region with a higher acceptance rate, someone who has been waiting for what seems like forever at this point, will take that sushi delivery for $4 instead of $5, it may, according to the research, offer them the lower rate. If the algorithm can predict that a given worker will keep driving until they hit a daily goal of $200, Dubal says, it might lower the rates on offer, making that goal harder to hit and keeping them working longer.
This is algorithmic wage discrimination.
“It’s basically variable pay that’s personalized to individuals based on what is really, really a lot of data that’s accumulated on those workers while they’re working,” Dubal says.
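To make the mechanism concrete, here is a deliberately simplified sketch, written in Python, of the kind of personalized offer logic Dubal describes. Every function name, threshold and number in it is a hypothetical illustration; none of it is drawn from Uber’s or any other company’s actual code.

# Hypothetical illustration of personalized wage-setting, as Dubal describes it.
# The thresholds and discounts below are invented for the example; no real
# platform's code or parameters are reproduced here.

def personalized_offer(base_rate: float,
                       predicted_accept_prob_at_discount: float,
                       dollars_left_to_daily_goal: float) -> float:
    """Return the lowest offer the model predicts this worker will still accept."""
    offer = base_rate

    # If the model is confident this worker will take the job for less,
    # shave the rate.
    if predicted_accept_prob_at_discount > 0.8:
        offer *= 0.8

    # If the worker is chasing a daily earnings target, slightly lower rates
    # keep them on the road longer before they hit it.
    if dollars_left_to_daily_goal > 50:
        offer *= 0.95

    return round(offer, 2)

# Two workers, same $5 sushi delivery, different offers:
print(personalized_offer(5.00, 0.9, 120))  # frequent accepter chasing a goal: 3.8
print(personalized_offer(5.00, 0.3, 0))    # picky driver with no goal pending: 5.0

The point of the sketch is the asymmetry: the worker sees only the final number, never the inputs or the rules that produced it.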
Sergio Avedian, a veteran Uber driver and senior contributor at the gig workers’ resource the Rideshare Guy, says he has seen this phenomenon plenty and heard countless anecdotes from fellow drivers. (Avedian was not involved in Dubal’s research.)
Avedian shared an experiment he ran in which two Uber-driving brothers in Chicago sat side by side with their apps open. They recorded in real time which rates they were offered for the same ride — and one brother was consistently offered more for every trip. The brother who kept getting higher offers drove a Tesla and had a history of accepting fewer rides, while his brother had a rental hybrid sedan and a higher ride acceptance rate. This suggests that Uber’s algorithm offers higher rates to the driver with the nicer car and the pickier history, to entice him onto the road — and lower ones to the driver who is statistically more likely to accept a ride for less pay.
At Curbivore, an on-demand industry trade show held in Los Angeles, Avedian ran the experiment again, this time with four drivers — and no two of them were offered the same rate for the same work.
This variation has exploded, Avedian says, since Uber rolled out its upfront pricing model. Previously, drivers’ earnings were based on a model a lot like a cab meter’s: a combination of distance, time and base fare, plus bonuses for driving at busy times and completing a certain number of trips per week. Now, drivers are simply sent an upfront offer: a single total for what they’ll be paid for the ride.
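For a rough sense of what changed, here is a small sketch, again in Python and with rates invented purely for illustration, contrasting a transparent, meter-style payout with a single upfront number. It is not Uber’s actual pay formula.

# Illustrative only: invented rates, not Uber's actual pay formula.

def meter_style_pay(miles: float, minutes: float,
                    base: float = 1.50,
                    per_mile: float = 0.80,
                    per_minute: float = 0.20) -> float:
    """Old-style payout: any driver can reproduce it from the trip itself."""
    return round(base + per_mile * miles + per_minute * minutes, 2)

print(meter_style_pay(miles=6.0, minutes=18.0))  # 9.9 -- the math is checkable

# Upfront pricing replaces the formula with a single opaque total.
upfront_offer = 9.00  # or 17.00 for the same trip on another day; the driver
                      # has no way to see why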
As Dara Kerr reported in the Markup, when the company quietly moved its new system into dozens of major U.S. markets last year, drivers immediately had concerns. It was unclear what went into calculating the rates, and the system seemed to make it easier for Uber to take a larger cut of the fare.
In theory, upfront pricing has some real benefits — workers are given more information about the ride before they agree to take it, for instance. But in reality, Avedian says, it has amounted to an almost across-the-board pay cut. For one thing, drivers don’t get paid when, due to traffic or other obstacles, trips go longer or farther than the algorithm predicts, as they very often do. For another, it’s a hotbed for algorithmic wage discrimination.
“In cabs you get a meter,” Avedian says; you can see how the fare is calculated as the trip goes on. Uber used to be more like that. “I knew what I was going to get paid. Now I have no idea. Sometimes that trip will show up at $9 and sometimes it will show up at $17. More often $9. Why re-create the wheel?”
He’ll tell me why: It gives Uber an opportunity to find a driver willing to take the lowest possible fare. When the company sends out an upfront rate, it is essentially running an auction, Avedian says. “The algorithm will start shopping that to drivers with certain tendencies,” he says. “They’re running the best arbitrage on the planet. They’re trying to sell it to the driver for the lowest price possible.”
In the ride-hail community, drivers who accept every ride are known as “ants.” Those who wait for more lucrative rides are cherry-pickers, or pickers. Avedian is a picker himself because all the data he’s seen suggests that ants get offered lower rates — the algorithm knows it can pay them less, so it tries to do exactly that.
“It’s brilliant on their part, to be honest,” Avedian says. “They want to make sure they have the highest take rate on millions of trips per hour.”
All that nickel-and-diming adds up: In its last earnings report, Uber said it completed 2.1 billion trips in the fourth quarter of 2022, or about 23 million trips per day. If it can find drivers willing to take trips for even $1 less per ride, it’s cutting its labor costs by millions of dollars a day. That should give you an idea of how much money Uber stands to save by leaning into algorithmic wage discrimination.
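The back-of-the-envelope arithmetic, using nothing but the trip volume Uber itself reported, looks like this:

# Back-of-the-envelope math from Uber's reported Q4 2022 trip volume.
trips_per_quarter = 2_100_000_000          # 2.1 billion trips
trips_per_day = trips_per_quarter / 92     # roughly 23 million a day

savings_per_trip = 1.00                    # pay drivers just $1 less per ride
daily_savings = trips_per_day * savings_per_trip

print(f"~${daily_savings / 1e6:.0f} million a day")  # ~$23 million a day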
But it’s not just about the lowered pay. And it’s not just Uber — it’s every company that dictates the terms of employment through an app, online portal, temp office or independent contract.
“It gives them incredible flexibility,” Dubal tells me. “They can shift wages, shift algorithms according to whatever the firm needs or desires.” Furthermore, it’s “an extraordinary form of control that undermines the capability of organizing.”
One of the most successful labor campaigns of the last decade was Fight for $15. Fast-food workers saw the uniformly lackluster wages across their industry and united to call for change. Algorithmic wage discrimination makes building that kind of solidarity harder.
“A union-busting firm will always tell you they don’t want your workers coalescing around problems,” Dubal says. “They try keeping one group happy and another unhappy, making it impossible to meet and discuss an issue; and what [algorithmic wage discrimination] does is obscure any common problems a worker might have, making it hard to find common cause with co-workers.”
Contacted for comment, Uber spokesperson Zahid Arab said, “The central premise of professor Dubal’s paper about how Uber presents Upfront Fares to drivers is simply wrong. We do not tailor individual fares for individual drivers ‘as little as the system determines that they may be willing to accept.’ Moreover, factors like a driver’s race, ethnicity, Quest promotion status, acceptance rate, total earnings or prior trip history are not considered in calculating fares.”
Uber wouldn’t say what exactly does go into determining upfront pricing, which it insists is a boon to its drivers. But from where I’m sitting, it looks like another opportunity to hide its efforts to degrade wages behind proprietary technologies.
“There are drivers who will wear their vehicles out, their bodies out,” chasing diminishing returns and the algorithm’s demands, Avedian says.
Indeed. Thanks in part to algorithmic wage discrimination, a lot of workers for Uber and other on-demand app platforms don’t even make minimum wage after gas, maintenance and time spent waiting between rides are factored in. And women and minorities, who already see imbalances in pay, are likely to feel the effects even more acutely. Uber’s own internal study, for instance, found that women drivers made 7% less than men did.
“According to Uber’s own analysis, there is gender-based discrimination that arises from this algorithmically based wage setting,” Dubal says. And since the on-demand app workers who log the most hours are most likely to be minorities, this kind of wage discrimination will have an outsize effect on their earnings. “That is a very scary and very novel way of re-creating and entrenching existing gender- and race-based hierarchies.” (Again, Uber says it does not consider race or gender in setting rates.)
Worse yet, since this kind of wage discrimination is built on huge sets of data, that data can be packaged, bought and sold to other app and contract companies — signaling a bleak future in which our data and productivity records follow us around, making us vulnerable to algorithms that are constantly trying to squeeze the most productivity out of us.
“If firms can purchase and transfer all my data: how I work, where I work, how much I make — if all of that is transferable, the possibility for economic mobility is severely curtailed, especially in low-wage markets,” Dubal says. Her paper cites the “payroll connectivity platform” company Argyle, which claims to have the employment data of 80% of all gig workers on file.
If we don’t address the creep of algorithmic wage discrimination now, the practice will be normalized in the largest sectors of our economy — retail, restaurants, computer science. It risks becoming the standard for how low-wage work is remunerated, she says. It’s the beginning of a bleak, casino-like future of work, where the worker always loses, bit by bit by bit.
And the time to address it is before yet another factor is introduced into the equation: AI systems, currently all the rage, that can draw further on vast reams of data to make even more inscrutable projections about how much a worker should earn.
The combination of AI and algorithmic wage discrimination has the potential “to create a unique set of dystopian harms,” Dubal says. “It’s one more tool that employers have to create impenetrable wage-setting systems that can neither be understood nor contested.” In other words, if you haven’t experienced algorithmic wage discrimination yet, you may soon — and AI may well help deliver it to your doorstep.
Dubal’s prescription: an outright ban on using algorithms and AI to set wages. Count Avedian in, too. “Without a doubt,” he says, starting with upfront pricing: “It should be banned.”
In the interest of averting a future where no one is quite sure why they’re making the wages they are, where the amount we earn slowly circles the drain at the whim of an inscrutable algorithm over which we have no control — I have to say I concur.