Meta’s Gruesome Content Broke Him. Now He Wants It to Pay

A Kenyan moderator sued the company for work-related PTSD. A new ruling on his case could signal a global reckoning for Big Tech outsourcing.

In 2019, Daniel Motaung moved from South Africa to Nairobi, Kenya, for a job at an outsourcing company called Sama. He had been hired for his Zulu language skills but was unsure exactly what kind of work he’d be performing. It was only after he started working, he claims, that Motaung discovered he would be spending eight hours or more a day looking at some of the most hideous content on the internet—beheadings, child abuse, suicides—as an outsourced content moderator for Meta.

Motaung alleges that he was paid as little as $2.20 an hour to view graphic content that left him with PTSD. He describes the work as “emotionally and mentally devastating”: “I went in OK and went out not OK,” Motaung said in a statement shared by the Real Facebook Oversight Board, a group of independent civil rights advocates and experts. “It changed the person I was.” Motaung began pushing to form a union that would allow the moderators to advocate for better pay and more support for their taxing work. Just six months into the job, he was fired. So he decided to sue his former employer and Meta.

Despite Meta’s months-long effort to have it dismissed, on February 6 the Kenyan Employment and Labour Relations Court ruled that Motaung’s case against the social media company can move forward, meaning Meta could be held accountable for the psychological harm and labor violations alleged by Motaung and other outsourced content moderators.

Justice Jacob Gakeri ruled that Meta “shall not be struck” from the case, according to Kenyan news site Business Daily, opening the company up to its first substantial labor challenge outside the US.

As of 2020, it was estimated that Meta had some 15,000 moderators spread across the world through outsourcing companies. In Kenya, Meta’s outsourcing partner was Sama, though its contract with the company will end in March of this year. Should the case succeed, it could allow other large tech companies that outsource to Kenya to be held accountable for the way staff there are treated, and provide a framework for people in other countries seeking to challenge tech giants.

The case, filed by UK-based nonprofit Foxglove Legal and the Kenyan law firm Nzili and Sumbi Advocates on behalf of Motaung, alleges that the working conditions violate Kenyan law and constitute, among other things, forced labor and human trafficking because workers were “coerced by a threat of penalty to accept the unlawful circumstances they found themselves in.” 

Motaung and his lawyers want Meta to provide the same pay and mental health support to outsourced moderators that it does to its own staff. They also want Meta and Sama to undergo a human rights audit and to pay current and former moderators damages and compensation for psychological care.

Meta had argued that it should not be subject to Kenyan law because it is a foreign corporation that does not operate in Kenya. Meta and Sama did not respond to a request for comment for this article. “These companies [Meta and other Big Tech firms] seek to enter and profit from a lot of jurisdictions while simultaneously saying that they don’t answer to the courts,” says Cori Crider, director of Foxglove Legal.

Motaung’s lawyer, Mercy Mutemi, argues that Meta’s content moderation operations in Nairobi, its small group of staff there, and the fact that it makes money from Kenyan advertisers on its platform are proof that the company operates within the country. “They make money from Kenyans,” she says. Meta-owned Facebook had 9.95 million users and Instagram had 2.5 million users in Kenya in 2022.

The case is the first brought by a content moderator outside the company’s home country. In May 2020, Meta (then Facebook) reached a $52 million settlement with US-based moderators who developed PTSD from working for the company. But previous reporting has found that many of the company’s international moderators doing nearly identical work receive lower pay and less support while working in countries with fewer mental health care services and labor rights. While US-based moderators made around $15 per hour, moderators in places like India, the Philippines, and Kenya made much less, according to 2019 reporting from the Verge.

“The whole point of sending content moderation work overseas and far away is to hold it at arm’s length, and to reduce the cost of this business function,” says Paul Barrett, deputy director of the Center for Business and Human Rights at New York University, who authored a 2020 report on outsourced content moderation. But content moderation is critical for platforms to continue to operate, keeping off the platform the kind of content that would drive away users and advertisers. “Content moderation is a core vital business function, not something peripheral or an afterthought. But there’s a powerful irony from the fact that the whole arrangement is set up to offload responsibility,” he says. (A summarized version of Barrett’s report was included as evidence in the current case in Kenya on behalf of Motaung.)

Barrett says that other outsourcers, like those in the apparel industry, would find it unthinkable today to say that they bear no responsibility for the conditions in which their clothes are manufactured.

“I think technology companies, being younger and in some ways more arrogant, think that they can kind of pull this trick off,” he says.

A Sama moderator, speaking to WIRED on the condition of anonymity out of concern for retaliation, described having to review thousands of pieces of content daily, often deciding what could and could not stay on the platform in 55 seconds or less. Sometimes that content could be “something graphic, hate speech, bullying, incitement, something sexual,” they say. “You should expect anything.”

Crider, of Foxglove Legal, says that the systems and processes Sama moderators are exposed to—and that have been shown to be mentally and emotionally damaging—are all designed by Meta. (The case also alleges that Sama engaged in labor abuses through union-busting activities, but does not allege that Meta was part of this effort.)

“This is about the wider complaints about the system of work being inherently harmful, inherently toxic, and exposing people to an unacceptable level of risk,” Crider says. “That system is functionally identical, whether the person is in Mountain View, in Austin, in Warsaw, in Barcelona, in Dublin, or in Nairobi. And so from our perspective, the point is that it’s Facebook designing the system that is a driver of injury and a risk for PTSD for people.”

Crider says that in many countries, particularly those that rely on British common law, courts will often look to decisions in other, similar nations to help frame their own, and that Motaung’s case could be a blueprint for outsourced moderators in other countries. “While it doesn’t set any formal precedent, I hope that this case could set a landmark for other jurisdictions considering how to grapple with these large multinationals.”

Updated 2/6/2023 10:00 ET: This piece has been updated to reflect the decision of the court in Kenya to include Meta in Motaung's ongoing case.