Amazon delivery firms say racial bias skews customer reviews
When an Amazon.com delivery driver drops a package at someone’s home, the customer can give them a thumbs-up or a thumbs-down — feedback the company uses to help determine how much to pay the driver’s employer and whether they keep their job.
In doing so, Amazon is trusting the customer to provide an honest rating. But what if the person is biased against the driver?
That possibility is a matter of growing concern for Amazon delivery contractors who employ Black, Latino and Asian drivers. Time and again, they say, their employees of color get worse customer feedback than their white counterparts. Because the phenomenon affects some of their most productive employees, the delivery company owners suspect racial bias is to blame.
The delivery contractors say they’ve raised their concerns to multiple Amazon managers. The topic has also been discussed on Ignite, an online forum Amazon set up for delivery contractors to swap advice and discuss challenges. Amazon personnel monitor the forum and sometimes participate but have never engaged in the conversations about possible racial bias, according to the contractors. Nor has the company taken steps to address the issue, they said.
In interviews, eight current and former Amazon delivery contractors operating in Los Angeles, Seattle, Georgia, Northern California and the Northeast all described the same pattern: lower ratings for drivers of color, especially when deployed to neighborhoods where their race or ethnicity stands out. The contractors’ suspicions dovetail with decades of academic research documenting how racial, gender and age bias all influence customer impressions of service workers, from waiters to taxi drivers. Companies have been accused for years of doing too little to prevent bias from affecting customer feedback but are harvesting more of this kind of data all the time.
“To give your customer that much power over the delivery process itself, you’re assuming that customer is coming from a good-natured position,” said an Amazon contractor in Northern California, who, like other delivery company owners, requested anonymity to avoid harming his relationship with Amazon. “That’s the flaw.”
Amazon spokesperson Maria Boschetti didn’t deny that racial bias is affecting customer reviews of drivers but said the company hasn’t ignored the issue. “We take any such concerns seriously and investigate every credible complaint — reviewing all available information, then taking appropriate action based on the facts available to us,” she said in an emailed statement. Boschetti said Amazon stops delivering packages to customers who abuse or may pose a threat to drivers. She also said the company doesn’t collect or store any demographic data about drivers.
Amazon isn’t the only tech company accused of letting racial bias affect its operations. Uber Technologies was sued for allegedly firing minority drivers based on how they were rated by customers. A federal judge in March 2022 said the plaintiffs failed to prove discrimination but gave them time to amend their complaint. Airbnb has struggled for years to stop hosts from discriminating against Black people looking to rent vacation properties. After being sued by a Black man alleging such prejudice, the company implemented a range of measures, including preventing hosts from seeing a photo of would-be renters until after they had accepted their business.
But critics say Amazon hasn’t done enough to root out blind spots in the data it collects. Online merchants have complained for years that wrong-headed or malicious customer feedback can get them booted off the company’s web store. A Bloomberg investigation in 2021 revealed that Amazon algorithms were failing to capture real-world conditions — weather, bad roads, traffic — and punishing delivery drivers for delays over which they had little control.
The racial bias alleged by delivery contractors is subtle and difficult to detect in any individual response. When a customer metes out a thumbs-down, a list of checkable options pops up, including such vague choices as “driver did not follow my delivery instructions” and “driver was unprofessional” that don’t require any substantiation. But when Amazon aggregates the feedback to calculate scores, the racial divide is undeniable, the business owners say.
This kind of feedback is prone to “implicit bias” from customers who may be more forgiving of minor mistakes from people who look like them and judge those perceived as outsiders more critically, said NiCole Buchanan, a psychology professor at Michigan State University. “It’s rarely someone overtly racist trying to do harm,” she said. “It’s all done very subtly.”
Dallan Flake, a law professor at Gonzaga University in Spokane, Wash., said it’s also difficult to hold employers accountable because plaintiffs must prove the feedback has a disproportionate effect on drivers of color. Moreover, he said, the law doesn’t provide for punitive damages.
“Amazon isn’t immune to litigation, but they’re probably not worried about facing a big class-action lawsuit,” said Flake, who has studied bias in consumer feedback for years. “They have no motivation to do something that in the end could end up costing them if driver feedback scores improve.”
Amazon launched its “delivery service partner” program in 2018, inviting would-be entrepreneurs to start their own businesses. Globally, the program has grown to more than 3,000 contractors with about 275,000 drivers. Amazon doesn’t technically employ the drivers but monitors them with cameras, vehicle sensors and smartphone apps. The system keeps track of how many times a driver stops short or speeds, whether they wear a seat belt and whether they turn off the engine at each stop. Amazon scores each delivery business based on various metrics, including the number of deliveries completed, safety records and customer reviews.
The problem with relying on customer feedback to help determine overall scores, according to the delivery contractors, is that most people don’t provide a review. Feedback rates of 1% or less are common, contractors say, so a business that delivers tens of thousands of packages each week will receive reviews from just a few hundred customers. As a result, they say, a small group of people has outsize influence over scores.
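The contractors’ arithmetic is easy to illustrate. The sketch below, in Python, uses hypothetical package volumes and a hypothetical positive-share formula (Amazon has not disclosed how it actually aggregates driver feedback) to show how, at a roughly 1% response rate, a handful of thumbs-down ratings can move a score by several percentage points.

```python
# Illustrative sketch only: Amazon has not disclosed how it aggregates driver
# feedback. The volumes and the positive-share formula here are hypothetical.

packages_per_week = 20_000     # a business delivering tens of thousands of packages
feedback_rate = 0.01           # contractors say roughly 1% of customers respond
reviews = int(packages_per_week * feedback_rate)   # about 200 reviews

def positive_share(thumbs_down: int) -> float:
    """Share of responding customers who gave a thumbs-up."""
    return (reviews - thumbs_down) / reviews

# Ten additional thumbs-down responses, out of 20,000 deliveries,
# move the score by five percentage points.
print(f"{positive_share(10):.0%}")   # 95%
print(f"{positive_share(20):.0%}")   # 90%
```

Under those assumptions, ten negative responses from 20,000 customers are enough to shift the aggregate score by five percentage points, which is why the contractors say a small group of reviewers carries so much weight.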
The scorecards play a crucial role in how much delivery contractors earn. Good marks are required to receive bonuses, which often determine whether they make a profit. Poor marks can be devastating to the bottom line, according to the contractors, who say Amazon has little incentive to change the process because it saves the company money. Besides, the contractors say, they have no power to compel Amazon to make changes.
“You can’t tell an 800-pound gorilla they didn’t think this through when they put it together,” said one delivery business owner in the Atlanta metropolitan area. “They don’t want to hear about it because it works in their favor.”
The scorecards also help determine whether drivers are promoted, retained or fired. Fear of Amazon canceling their contracts could prompt some delivery companies to punish drivers with poor ratings, even if they suspect racial bias is at work, four contractors said. Delivery businesses worried about their overall scores could even fire drivers or avoid hiring people of color, they said.
Amazon contract drivers don’t know which customers leave reviews, so they have no way of knowing how their performance is being assessed. But several drivers of color told Bloomberg they’ve encountered varying degrees of hostility during their rounds.
Veronica Saxon worked as an Amazon delivery driver for about a year in Michigan until she was injured in a collision in December. The job was the first time she experienced racism, said Saxon, who is Black. At one home, she said, a man and his son pulled guns on her as she walked down the driveway to put a package on their doorstep. Saxon’s employer assured her that such incidents were rare, so she stayed on the job, and the dispatcher never sent her back to that neighborhood. Several months later, she said, a white woman followed her for four hours while Saxon was making deliveries.
“She said she was just making sure everything was OK,” Saxon said. “After four hours, I took a photo of her license plate and told her I’d call the police if she kept following me.” The woman backed off.
It’s hard for drivers to know why customers give critical feedback because there are so many variables, she said. Some might be racist; others might be mad that a driver asked them to leash a loose dog, said Saxon, who added that she was bitten twice while making Amazon deliveries.
“It might be racism and it might not be,” she said. “When I started the job, people warned me about delivering to homes with Trump flags, but they turned out to be some of my nicest customers who talked to me and gave me snacks and drinks.”
Another Black driver who has been delivering Amazon packages for three years in Atlanta said racism is a constant concern among drivers, especially in predominantly white, gated communities.
“Amazon knows what’s going on,” said the driver, who spoke on condition of anonymity because he is still working for an Amazon delivery business. “It works out in Amazon’s favor if we get more negative feedback than positive because they can use that to deprive us of bonus payments and raises.”
Some white customers get upset when he takes a photo of a package on their stoop, as required by Amazon to verify a completed delivery. Some homeowners loose their dogs as he’s approaching, and he’s been bitten twice. Others ask to see his identification. “I just point to my Amazon hat and Amazon van and ask if they want to see ID from their mailman.” Such interactions don’t happen in Black and Latino neighborhoods, he said.
The driver said his boss has discussed racism with employees and encourages them to chronicle every incident so they can provide details to Amazon.
“We expect racism and we deal with it on a daily basis, so it becomes the norm,” he said. “When you deliver in a white community, you can feel it. People just stare. They don’t smile or wave or anything. The stare-down is to subliminally tell you that you don’t belong.”
A former Amazon employee familiar with the design of the contract-driver program said the company could do more to respond to bias in driver ratings, including designing the feedback system to focus on constructive input and tamping down on potentially racist or useless commentary.
“If Amazon’s going to score people, discipline people and fire people, they’ve got to make sure that the product’s right,” said this person, who requested anonymity to protect professional relationships. “They don’t think about things like bias. It’s not in the DNA. They’re just trying to make the machine work efficiently. They’re not thinking about the human cases.”
Rooting out biased customer feedback isn’t easy. But given Amazon’s expertise in consumer behavior, said Purdue University computer science professor Aniket Bera, it should be in a better position than many to address the issue. “The company has enough data to at least partially fix the problem,” he said. “If they don’t, it’s due to business reasons like it costing too much money.”
For now, delivery companies are trying to work around the more blatant racism that their employees experience. They give drivers routes where they’re less likely to stand out. They report incidents of pulled guns and other intimidation to Amazon. At a training session conducted at a company facility in Los Angeles, a delivery contractor said instructors — fellow owners — emphasized safety and told newbies to understand the ethnic and racial makeup of their areas and to assign personnel accordingly.