Meta Platforms—Content moderators formed first-ever global trade union alliance demanding mental health protections from Meta, TikTok, Alphabet, OpenAI
Content moderators from nine countries formed the Global Trade Union Alliance of Content Moderators in Nairobi, Kenya, to fight for living wages, safe working conditions and union representation. The alliance is calling on tech companies including TikTok, Meta, Alphabet and OpenAI to adopt mental health protections throughout their supply chains. Over 80% of workers surveyed said their employer needs to do more to support their mental health. A report titled 'The People Behind the Screens' documented traumatic, high-pressure conditions, including PTSD, depression, burnout and suicidality among moderation workers. Workers describe pressure to review thousands of horrific videos daily, including beheadings, child abuse, and torture.
Scoring Impact
| Topic | Direction | Relevance | Contribution |
|---|---|---|---|
| Mental Health | +toward | secondary | +0.50 |
| Platform Labor Conditions | +toward | primary | +1.00 |
| Tech Worker Organizing | +toward | secondary | +0.50 |
| **Overall incident score** | | | **+0.644** |
Score = avg(topic contributions) × significance (high ×1.5) × confidence (0.64)
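The formula above can be sketched in Python. This is a minimal illustration, not the scoring system's actual implementation; it assumes the displayed confidence of 0.64 is a rounded value of 0.644, which is what makes the arithmetic reproduce the stated overall score of +0.644.

```python
# Sketch of the incident-scoring formula:
# score = avg(topic contributions) x significance multiplier x confidence.
# Names and the significance table are hypothetical, inferred from the text.

SIGNIFICANCE = {"high": 1.5}  # only the "high" tier is given in the source


def incident_score(contributions, significance, confidence):
    """Average the per-topic contributions, then scale by significance and confidence."""
    avg = sum(contributions) / len(contributions)
    return avg * SIGNIFICANCE[significance] * confidence


# Topic contributions from the table: Mental Health +0.50,
# Platform Labor Conditions +1.00, Tech Worker Organizing +0.50.
# Assumed unrounded confidence of 0.644 (shown as 0.64 in the formula line).
score = incident_score([0.50, 1.00, 0.50], "high", 0.644)
print(round(score, 3))  # → 0.644
```

With an average contribution of 0.667 and the high-significance multiplier of 1.5, the pre-confidence score is 1.0, so the overall score equals the confidence term.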
Evidence (2 signals)
Content moderators from nine countries formed Global Trade Union Alliance demanding safe working conditions
Content moderators formed the first-ever global trade union alliance in Nairobi, Kenya, with workers from nine countries. The Global Trade Union Alliance of Content Moderators is calling on tech companies including TikTok, Meta, Alphabet and OpenAI to adopt mental health protections throughout their supply chains. Over 80% of workers surveyed said their employer needs to do more to support their mental health.
Global content moderators alliance released report demanding mental health protocols after documenting trauma, PTSD, and suicidality
The report, titled 'The People Behind the Screens', laid out the traumatic, high-pressure conditions workers endure and concrete steps to reduce psychological harm, including measures to address PTSD, depression, burnout and even suicidality among moderation workers. Workers described pressure to review thousands of horrific videos each day, including beheadings, child abuse, and torture.