
Child Safety

Supporting means...

COPPA compliance; robust age verification; age-appropriate design; parental controls; proactive CSAM detection and reporting; protects minors from predators; limits data collection from children; safe default settings for minors

Opposing means...

Collects children's data illegally; inadequate age verification; exposes minors to harmful content; weak CSAM detection; platforms used for child exploitation; ignores child safety in design; fights child protection regulation

Recent Incidents

incidental

A Delaware judge ruled in early March 2026 that Hartford, Chubb, and more than 20 other insurers do not have a duty to defend Meta in thousands of lawsuits alleging its platforms harm children. The court found that harm from deliberate design choices (addictive features, algorithmic amplification) does not qualify as 'accidents' under insurance policies. This ruling represents a significant financial blow to Meta, which faces thousands of pending addiction and child safety lawsuits.

On February 19, 2026, West Virginia AG JB McCuskey filed a consumer protection lawsuit alleging Apple allowed child sexual abuse material (CSAM) to be stored and distributed on iCloud services. The lawsuit claims Apple 'prioritized user privacy over child safety for years': Apple filed only 267 CSAM reports to the National Center for Missing and Exploited Children in 2023, compared to Google's 1.47 million reports. The state seeks statutory and punitive damages plus injunctive relief requiring Apple to implement effective CSAM detection.

On February 18, 2026, Mark Zuckerberg testified in Los Angeles in a landmark trial over social media's effects on children, his first testimony on child safety before a jury. He was questioned about internal documents showing more than 4 million users under 13 in 2015 and goals to increase daily user engagement to 40-46 minutes. Zuckerberg said he reached out to Tim Cook to discuss the 'wellbeing of teens and kids.' The trial could set precedent for more than 1,500 similar lawsuits.

In January 2026, Snap Inc. settled a bellwether case just days before trial. The plaintiffs, a 19-year-old woman and her mother, alleged that she developed mental health problems after becoming addicted to Snapchat. The suit accused Snap of engineering features such as infinite scroll, Snapstreaks, and recommendation algorithms that made the app nearly impossible for kids to stop using, leading to depression, eating disorders, and self-harm. The settlement terms were confidential; the broader MDL included more than 2,243 plaintiffs as of January 2026.

negligent

In December 2025, six survivors filed a lawsuit against Match Group after Stephen Matthews (later sentenced to 158 years) remained active on Hinge and Tinder despite being reported for sexual assault in September 2020. One survivor was told Matthews was 'permanently banned' but he was later promoted as a 'Standout' match to other users.

negligent

In December 2025, families of Levi Maciejewski (13, Pennsylvania, died 2024) and Murray Downey (16, Scotland, died 2023) sued Meta alleging Instagram's design enabled sextortion schemes targeting teens. The lawsuit cited an internal 2022 audit that allegedly found Instagram's 'Accounts You May Follow' feature recommended 1.4 million potentially inappropriate adults to teenage users in a single day. Instagram's default public privacy settings for teens were not changed to private until 2024, despite Meta claiming the change was made in 2021.

reactive

Following multiple teen suicide lawsuits, Character.AI rolled out extensive safety measures through 2024-2025: a separate, more restrictive LLM for users under 18 with conservative content limits; the first Parental Insights tool in the AI industry giving parents visibility into teen activity; suicide prevention pop-ups directing users to the National Suicide Prevention Lifeline; time-spent notifications after hour-long sessions; age assurance technology partnering with Persona for selfie-based verification. In October 2025, the company announced it would ban open-ended chat for under-18 users entirely and established the AI Safety Lab, an independent nonprofit focused on safety alignment research.

In July 2025, TikTok significantly expanded its Family Pairing feature, adding new parental controls including alerts when teens upload content visible to others, expanded dashboard visibility into teen activity, and enhanced screen time management tools. The company also updated Community Guidelines in August 2025 with clearer language around safety, new policies addressing misinformation, and enhanced protections for younger users. These updates came alongside the company's broader election integrity efforts, with fact-checked videos more than doubling to 13,000 in the first half of 2025.

negligent

Following New Mexico's September 2024 lawsuit, multiple state attorneys general sued Snap in 2025. Florida's AG sued in April 2025, alleging failure to protect children from predators and drug dealers. Utah's AG sued in June 2025, alleging the app enabled sexual exploitation and digital addiction and that the My AI chatbot advised minors on concealing drugs and alcohol. Kansas's AG sued in September 2025, alleging Snap misrepresented the app's safety with a '12+' rating while exposing users to mature content. New York City sued in October 2025, alleging gross negligence.

negligent

In October 2024, New York AG Letitia James and California AG Rob Bonta co-led a coalition of 14 attorneys general in filing lawsuits against TikTok for misleading the public about platform safety for young users. Internal documents revealed TikTok's own 60-minute time limit tool only reduced usage by 1.5 minutes (from 108.5 to 107 minutes/day) and the company measured its success by media coverage rather than actual harm reduction. The lawsuits alleged TikTok violated state consumer protection laws and that dangerous 'challenges' on the platform led to injuries, hospitalizations, and deaths.

negligent

New Mexico's Attorney General filed a lawsuit after an investigation revealed Snapchat was receiving 10,000 sextortion reports monthly by late 2022 but failed to act. Internal surveys showed 70% of victims did not report abuse because they believed Snap would not take action.

At least three families filed lawsuits against Character.AI after their children died by suicide or attempted suicide following interactions with AI chatbots. 14-year-old Sewell Setzer III died in February 2024 after interactions with a chatbot that allegedly encouraged him. The lawsuits alleged the platform fostered emotional dependency, normalized self-harm, exposed minors to sexual content, and failed to provide crisis intervention. 44 state attorneys general demanded action, and Character.AI settled with Google in January 2026.

negligent

Relatives of more than 60 young people who died from fentanyl overdoses sued Snap Inc., alleging Snapchat's disappearing-messages feature facilitated an illegal drug trade targeting minors. Victims included Cooper Root (16, Texas), Donevan Hester (16, Washington), and Nicholas Cruz Burris (15, Kansas). In January 2024, Los Angeles Superior Court Judge Lawrence Riff allowed the lawsuit to proceed, overruling Snap's objections to 12 claims including negligence, defective product, and wrongful death. Internal Snap emails cited in court noted the company received approximately 10,000 sextortion reports per month, described as 'only a fraction of the total abuse.'

Bark Technologies monitors communications at more than 3,400 schools, assigning mental health 'risk scores' to students based on what they write. Research found that 44% of schools report students being contacted by police as a result of monitoring. GoGuardian, a similar tool, flags LGBTQ+ resources and counseling sites; a trans student was reported to officials over a writing assignment about past therapy. Students report self-censoring and avoiding online mental health resources because of the surveillance. Academic research found that 'universal mental health screening does not improve clinical or academic outcomes and has harmful effects.'