
Platform Labor Conditions

Supporting means...

Fair pay and working conditions for content moderators, data labelers, and AI trainers; mental health support for workers exposed to traumatic content; transparency about the human labor in the AI pipeline; direct employment rather than outsourcing to low-wage contractors; adequate breaks and counseling; recognition of platform workers' contributions

Opposing means...

Outsourcing traumatic content moderation to low-wage countries without adequate support; paying data labelers poverty wages; hiding human labor behind AI branding; providing no mental health resources for workers viewing harmful content; imposing excessive quotas and surveillance on platform workers; using contractor structures to avoid responsibility

Recent Incidents

Content moderators from nine countries formed the Global Trade Union Alliance of Content Moderators in Nairobi, Kenya, to fight for living wages, safe working conditions, and union representation. The alliance is calling on tech companies including TikTok, Meta, Alphabet, and OpenAI to adopt mental health protections throughout their supply chains. Over 80% of workers surveyed said their employer needs to do more to support their mental health. A report titled 'The People Behind the Screens' documented traumatic, high-pressure conditions, including PTSD, depression, burnout, and suicidality among moderation workers. Workers describe pressure to review thousands of horrific videos daily, including beheadings, child abuse, and torture.

In December 2024, over 140 content moderators working for Meta through contractor Samasource Kenya were diagnosed with severe PTSD by Dr. Ian Kanyanya at Kenyatta National Hospital. Moderators reported reviewing extreme content including child sexual abuse, torture, murder, and bestiality with only about one minute per item, under close monitoring and the threat of termination. Workers reported self-harm, vomiting, and other severe psychological symptoms. Meta outsources content moderation to contractors in developing countries including Kenya and Ghana; a second wave of lawsuits emerged from Ghana moderators in April 2025.

negligent

Meta has faced multiple lawsuits from content moderators suffering severe psychological trauma. In 2020, Facebook paid $52 million to settle a US class action (Scola v. Facebook) brought by moderators employed through Accenture and other contractors who developed PTSD. In September 2024, a Kenyan court ruled that Meta can be sued in local courts, with 144 former moderators (81% diagnosed with severe PTSD) seeking $1.6 billion in compensation. Additional lawsuits from Ghana moderators allege depression, anxiety, insomnia, and substance abuse resulting from reviewing extreme content. Accenture employed more than a third of Meta's ~15,000 content moderators.

Thousands of contract workers evaluating Google's Gemini AI for accuracy and safety earn as little as $14-15 per hour in the US, far below industry standards for cognitively demanding AI evaluation work. Overseas raters in India and the Philippines report effective rates under $10 per hour after deductions. Workers face grueling deadlines and burnout, fueling accusations of exploitation in the AI evaluation pipeline.

A Washington Post investigation documented Scale AI's Remotasks platform as 'digital sweatshops' where workers train AI models for below minimum wage. Filipino taskers initially earned up to $200/week, but after expansion to India and Venezuela in 2021, pay plunged from $10/task to less than 1 cent. Of 36 workers interviewed, 34 reported delayed, reduced, or canceled payments. In March 2024, Remotasks abruptly shut down its Kenya operations, stranding thousands of workers without job security or owed wages. Three class-action lawsuits filed in late 2024 and early 2025 alleged worker misclassification, unpaid training, and 'Orwellian' surveillance.

negligent

In November 2021, OpenAI contracted Sama to hire Kenyan data labelers to remove toxic content from ChatGPT training data. Although OpenAI paid Sama $12.50/hour per worker, laborers received only $1.32-$2.00/hour. Workers were exposed to graphic content including child sexual abuse, bestiality, murder, and torture. Of 144 assessed workers, 81% were diagnosed with severe PTSD. Wellness counseling was limited due to productivity demands. Sama canceled the contract in March 2022, eight months early, and then retrenched 200 employees. In July 2023, four workers petitioned Kenya's National Assembly to investigate.