Workers ask gig economy companies to explain their algorithms


In two years, Alexandru, a 38-year-old Uber driver in London, has racked up nearly 7,000 trips and maintained an unblemished five-star customer service rating.

When other drivers complained that Uber’s system was punishing them for no reason, he didn’t believe them. “My feeling was that this was not the whole truth. They must surely be guilty of something,” he said.

But in July last year, Alexandru received his own warning from Uber’s computers, telling him he had been reported for fraudulent activity. Another warning came two weeks later. A third would result in his account being closed.

He stopped using the app for fear of being banned for good, and racked his brains over what he could have done wrong.

Like many companies in the gig economy, Uber manages its tens of thousands of UK drivers with artificial intelligence programs, which handle everything from verifying identities with facial recognition, to matching drivers with customers, to detecting fraud such as drivers deceiving passengers or sharing accounts.

Using humans to run such a huge operation would be impossible, the companies say. Food delivery company Deliveroo, for example, says in its privacy policy that manual checks on its service “simply wouldn’t be possible in the time available and given the volumes of deliveries we handle.”

But gig economy workers complain that there is little recourse when the computer makes a bad decision, and that companies don’t tell them how algorithms will evaluate them, choosing instead to keep the process largely secret.

Alexandru claimed that although he was warned about fraud, such as deliberately extending the duration or distance of a trip, he was given no explanation of what he was supposed to have done and could not find out.

“It’s like finding a warning left on your desk by your boss, but they’re unreachable, and if you knock on their door they say, ‘Stop whatever you’re doing wrong or you’ll be fired,’ but they won’t tell you what it is. You feel targeted, discriminated against,” he said.

When he called Uber for help, no one could tell him anything. “They kept saying, ‘The system can’t be wrong, what have you done?’”

Finally, he took his case to Worker Info Exchange (WIE), a workers’ rights group that lobbies companies like Uber to explain how their systems work. In October, Uber apologized, saying he had been flagged in error.

“[Uber] uses AI software as an HR service for the drivers,” said Alexandru. “The system can make you stop working, it can end your contract with Uber, it can cost you your [private hire] license.”

Uber noted that although it uses automated processes for fraud detection, decisions to fire a driver are made only after review by company staff.

Over the past eight months, more than 500 gig workers from companies like Amazon Flex, Bolt, Deliveroo, and Uber have also asked WIE to help them understand automated decisions.

Under European data protection laws, drivers have the right to know whether and how they have been subjected to automated decision-making. But so far, only 40 workers have received raw data on their work patterns, such as job acceptance rates, decline rates and ratings. No company, however, has made clear how that data is used by its computers to make decisions, WIE said.

“The key is the Kafkaesque element of what it means to have an algorithm as a boss,” said Eva Blum-Dumontet, senior researcher at advocacy group Privacy International and co-author of a new report, “Managed by Bots”, written with WIE.

“People are suspended, unable to work, for reasons they don’t know. Even the companies’ own employees don’t know why. The insanity of the situation has an impact on their mental health – the feeling of being treated like a culprit without knowing why. All the drivers I interviewed talked about it, much more than about the financial aspect.”

In Europe, regulators and courts are beginning to recognize the harms of algorithmic management practices. In January, an Italian court ruled that Deliveroo had discriminated against workers because its computers did not distinguish between unproductive workers and those who were sick or exercised their right to strike.

Italy has also fined Deliveroo and Glovo, a grocery delivery app, for failing to reveal how their computers divide tasks and evaluate performance.

In March, a Dutch court ruled that ridesharing company Ola should provide the data used to generate ‘fraud probability scores’ and ‘income profiles’ of drivers, which are used to decide on job allocation.

Despite these scattered legal victories, enforcement remains weak. “Laws – both employment laws and data privacy laws – are failing to keep up with these platforms,” said Cansu Safak, report co-author and researcher working on algorithmic surveillance at WIE. “When you try to exercise these rights, you find yourself talking to bots again.”

She said she found the industry “deeply hostile and resistant to the exercise of workers’ rights”, and called for stricter enforcement of existing laws.

Uber said, “Our technology plays an important role in ensuring the safety of all who use our platform, as well as in maximizing revenue opportunities for drivers and couriers.”

The company added that it is committed to being open and transparent about its systems, a claim disputed by campaigners, workers and union members, who said gig companies treat the way their algorithms work as a trade secret.

Privacy International and WIE will launch a public petition on Monday demanding more details from half a dozen gig platforms, such as how automated decisions are reviewed.

Their aim is to draw attention to the growing automation of all workplaces, not just the gig economy.

“Everyone feels safe; they have a good job and it won’t affect them, just those poor Uber drivers. But it’s not just about us,” Alexandru said.

“Working-class people, gig workers, the self-employed – we are the first to be affected because we are the most vulnerable. But what’s next? If AI decided whether or not an ambulance is dispatched to your house, would you feel safe without speaking to a human operator? It affects all of our lives.”
