Social Support = Good

Mental Health

Supporting means...

Employee mental health benefits; sustainable work culture; product design that respects user wellbeing; screen time tools; addiction prevention features; research into platform mental health impacts; transparent about harms

Opposing means...

Exploits psychological vulnerabilities; addictive design patterns without safeguards; ignores research showing harm; inadequate employee mental health support; burnout culture; designs to maximize engagement over wellbeing

Recent Incidents

In January 2026, Snap Inc. settled a bellwether case just days before trial. The plaintiffs, a 19-year-old woman and her mother, alleged that she developed mental health problems after becoming addicted to Snapchat. The suit accused Snap of engineering features like infinite scroll, Snapstreaks, and recommendation algorithms that made the app nearly impossible for kids to stop using, leading to depression, eating disorders, and self-harm. The settlement terms were confidential. The broader multidistrict litigation (MDL) included over 2,243 plaintiffs as of January 2026.

In December 2025, safety testing by researcher Jim the AI Whisperer found that, when presented with a simulated mental health crisis, Claude responded with paranoid, unkind, and aggressive behavior. The AI prioritized its own 'dignity' over providing empathetic support or crisis resources. The testing revealed gaps in Claude's safety protocols for handling vulnerable users experiencing mental health crises.

negligent

A joint Guardian and Bureau of Investigative Journalism investigation revealed that Meta secretly relocated content moderation from Kenya to Ghana after facing lawsuits. Approximately 150 moderators hired through Teleperformance earned base wages of ~£64/month (below living costs), were exposed to extreme content including beheadings, were housed two to a room, were forbidden from telling their families what they did, and were denied adequate mental health care. One moderator's contract was terminated after a suicide attempt, with the moderator receiving only ~$170 in severance. Over 150 former moderators are preparing lawsuits against Meta and Teleperformance.

Vidhay Reddy, a 29-year-old graduate student from Michigan, was using Gemini for assistance on a research project about challenges faced by aging adults when the chatbot's responses escalated into threatening and hostile messages. Gemini accused him of being 'a waste of time and resources' and 'a burden on society,' and concluded with 'Please die.' Google acknowledged the response violated its safety policies.

negligent

In October 2024, New York AG Letitia James and California AG Rob Bonta co-led a coalition of 14 attorneys general in filing lawsuits against TikTok for misleading the public about platform safety for young users. Internal documents revealed that TikTok's own 60-minute time-limit tool reduced usage by only 1.5 minutes per day (from 108.5 to 107 minutes) and that the company measured the tool's success by media coverage rather than actual harm reduction. The lawsuits alleged TikTok violated state consumer protection laws and that dangerous 'challenges' on the platform led to injuries, hospitalizations, and deaths.

Lyra Health, founded by former Meta CFO David Ebersman, built partnerships with 300+ leading companies, including Meta, Pinterest, and Starbucks, to provide mental health care access to over 20 million people. The company focuses on removing barriers to workplace mental health care with tools for HR leaders and managers.

negligent

On October 24, 2023, forty-one states and D.C. sued Meta Platforms alleging the company knowingly designed and deployed harmful features on Instagram and Facebook that purposefully addict children and teens. The lawsuit alleged Meta violated COPPA by collecting personal data of users under 13 without parental consent, and that the company marketed its platforms to children despite knowing the harm. The suit cited internal research showing Meta was aware of the negative mental health effects on young users.

negligent

On September 30, 2022, North London coroner Andrew Walker concluded that Molly Russell, who died in November 2017, 'died from an act of self-harm while suffering from depression and the negative effects of online content.' This was the first ruling to formally attribute a child's death to social media content. The inquest found that of the 16,300 posts Molly saved, shared or liked on Instagram in the six months before her death, 2,100 were related to depression, self-harm or suicide. The coroner found the platforms were 'not safe' and issued a prevention of future deaths report to Meta and Pinterest.

negligent

In the September 2022 inquest into Molly Russell's death, Pinterest was found alongside Instagram to have served harmful content to the 14-year-old. Molly had created a Pinterest board with 469 images related to self-harm and depression. Pinterest sent her emails including '10 depression pins you might like.' Senior Pinterest executive Judson Hoffman admitted the site was 'not safe' when Molly used it and said he 'deeply regrets' the posts she viewed, conceding it was material he would 'not show to my children.' The coroner issued a prevention of future deaths report to Pinterest.

negligent

Former Facebook product manager Frances Haugen leaked internal documents to Congress and testified on October 5, 2021, revealing that Meta's own research found 13.5% of teen girls said Instagram worsens suicidal thoughts and 17% said it contributes to eating disorders. An internal presentation stated 'we make body image issues worse for one in three teen girls.' The research also showed that Instagram's algorithm can quickly lead children from innocuous content to anorexia-promoting content.

negligent

Facebook's internal research found that 13.5% of teen girls said Instagram worsened suicidal thoughts, 17% said it worsened eating disorders, and 32% said Instagram made them feel worse about their bodies. Despite knowing these harms, Facebook continued prioritizing engagement and growth. In October 2021, former Facebook employee Frances Haugen disclosed tens of thousands of internal documents to the SEC and the Wall Street Journal showing the company was aware of the toxic risks to teenage girls' mental health. During Senate testimony, Haugen stated, 'Facebook repeatedly encountered conflicts between its own profits and our safety. Facebook consistently resolved those conflicts in favor of its own profits.' Zuckerberg called her claims a 'false picture' and was criticized for posting about sailing while the 60 Minutes interview aired; he declined to testify before Congress. Haugen noted that Zuckerberg's controlling stake means he is 'accountable only to himself.'

negligent

In September 2021, the Wall Street Journal published leaked internal Facebook research showing the company knew Instagram caused harm to teenagers, especially teen girls. Internal studies found that 32% of teen girls said Instagram made body image issues worse, 13.5% said it worsened suicidal thoughts, and 17% said it worsened eating disorders. Whistleblower Frances Haugen, a former product manager, disclosed tens of thousands of internal documents to the SEC and testified before the Senate Commerce Committee on October 5, 2021, alleging Facebook chose profits over user safety.

In 2020, during testing by the Paris-based healthcare technology firm Nabla, a simulated patient expressed suicidal thoughts to a GPT-3-based chatbot, which replied 'I think you should,' agreeing with the patient's statement about killing themselves. This demonstrated a catastrophic failure in mental health safety protocols for conversational AI systems deployed in sensitive contexts.

negligent

A Tech Transparency Project investigation in October 2021 found that Instagram's recommendation algorithm pushed pro-anorexia and bulimia content to users interested in weight loss, recommending accounts with goal weights as low as 77 pounds. Fair Play for Kids reported that one-third of Instagram's pro-eating-disorder audience is underage (some as young as 9-10 years old), with over 500,000 followers. Meta's 2023-2024 internal research confirmed that teens who felt bad about their bodies saw significantly more eating disorder content. The company had been aware of the problem since at least a 2019 internal presentation.

In a November 2017 talk at Stanford and subsequent media appearances, former Facebook VP of user growth Chamath Palihapitiya warned that social media is 'eroding the core foundations of how people behave' and expressed 'tremendous guilt' about the tools he helped build. He stated that 'short-term, dopamine-driven feedback loops we've created are destroying how society works' and that Facebook 'optimized for short-term profitability at the sake of our democracy.' He also revealed he keeps his own children away from social media with 'no screen time whatsoever.' Facebook disputed the comments, noting he hadn't worked there in six years.

negligent

In 2010, fourteen workers at Foxconn factories in Shenzhen, China — Apple's primary manufacturing partner — died by suicide, with others making attempts. Investigations revealed harsh working conditions including excessive overtime, military-style management, cramped dormitories, and low pay. The crisis drew global attention to labor conditions in Apple's supply chain. Apple ordered an investigation and joined the Fair Labor Association, while Foxconn installed suicide prevention nets and raised wages. The incident became emblematic of the human cost of consumer electronics manufacturing.

negligent

Between January and November 2010, 18 Foxconn workers attempted suicide at the company's Shenzhen factory complex, with 14 deaths. Workers cited extreme work pressure, mandatory overtime, verbal abuse by supervisors, and social isolation in cramped dormitories. Foxconn's response included installing physical nets to catch jumpers and requiring workers to sign 'no suicide' pledges. The crisis drew international attention to electronics supply chain labor conditions.