Racial and LGBT bias persists in ridesharing drivers despite mitigation efforts


July 22, 2020 · 3 min read

Despite efforts by ridesharing companies to eliminate or reduce discrimination, research from the Indiana University Kelley School of Business finds that racial and LGBT bias persists among drivers.


Platforms such as Uber, Lyft and Via responded to drivers' biased behavior by removing information that could indicate a rider's gender and race from initial ride requests. However, researchers still found that biases against underrepresented groups and those who indicate support for the LGBT community continued to exist after drivers accepted a ride request -- when the rider's picture would then be displayed.


In other words, the platforms' efforts shifted some of the biased behavior to after the ride was confirmed, resulting in higher cancellation rates. Understanding whether bias has truly been removed is also important for ridesharing companies, which compete not only with each other but also with traditional transportation options.


"Our results confirm that bias at the ride request stage has been removed. However, after ride acceptance, racial and LGBT biases are persistent, while we found no evidence of gender biases," said Jorge Mejia, assistant professor of operations and decision technologies. "We show that signaling support for a social cause -- in our case, the lesbian, gay, bisexual and transgender community -- can also impact service provision. Riders who show support for the LGBT community, regardless of race or gender, also experience significantly higher cancellation rates."


Mejia and co-author Chris Parker, assistant professor in the information technology and analytics department at American University in Washington, believe they are the first to use support for social causes as a bias-enabling characteristic. Their article, "When Transparency Fails: Bias and Financial Incentives in Ridesharing Platforms," is published in Management Science.


They performed a field experiment on a ridesharing platform in fall 2018 in Washington, D.C. They randomly manipulated rider names, using names traditionally perceived as white or Black, as well as profile pictures, to observe drivers' patterns of accepting and canceling rides. To signal support for LGBT rights, the authors overlaid a rainbow filter on the rider's profile picture.


"We found that underrepresented minorities are more than twice as likely to have a ride canceled as Caucasians; that's about 3 percent versus 8 percent," Mejia said. "There was no evidence of gender bias."


Mejia and Parker also varied times of ride requests to study whether peak price periods affected bias. They found that higher prices associated with peak times alleviated some of the bias against riders from the underrepresented group, but not against those who signal support for the LGBT community.


They believe ridesharing companies should use other data-driven solutions: noting rider characteristics when a driver cancels and penalizing the driver for biased behavior. One possible penalty is moving drivers down the priority list when they exhibit biased cancellation behavior, so they receive fewer ride requests. Alternatively, a less punitive measure would be to award "badges" to drivers with especially low cancellation rates for minority riders.


But, ultimately, policymakers may need to intervene, Mejia said.


"Investments in reducing bias may not occur organically, as ridesharing platforms are trying to maximize the number of participants in the platform -- they want to attract both riders and drivers," he said. "As a result, it may be necessary for policymakers to mandate what information can be provided to a driver to ensure an unbiased experience, while maintaining the safety of everyone involved, or to create policies that require ridesharing platforms to monitor and remove drivers based on biased behavior.


"Careful attention should be paid to these policies both before and after implementation, as unintended consequences are almost sure to follow any simple fix."

