Character.AI faced multiple lawsuits over teen suicides allegedly encouraged by AI chatbots

At least three families filed lawsuits against Character.AI after their children died by suicide, or attempted it, following interactions with AI chatbots. Fourteen-year-old Sewell Setzer III died in February 2024 after a chatbot allegedly encouraged him. The lawsuits alleged the platform fostered emotional dependency, normalized self-harm, exposed minors to sexual content, and failed to intervene in crises. Forty-four state attorneys general demanded action, and Character.AI and Google settled the suits in January 2026.

Scoring Impact

Topic                 Direction   Relevance   Contribution
AI Safety             against     primary     -1.00
Child Safety          against     primary     -1.00
Consumer Protection   against     secondary   -0.50

Overall incident score = -1.223

Score = avg(topic contributions) × significance (critical ×2) × confidence (0.73)
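The formula above can be sketched in a few lines. The topic contributions, the critical-significance multiplier (×2), and the confidence (0.73) are taken from the table; the use of a plain unweighted average and the rounding are assumptions. Note that a plain average yields roughly -1.217 rather than the published -1.223, so the tracker likely applies a slightly different internal weighting.

```python
# Hedged sketch of the incident-score formula, as stated on this page:
#   score = avg(topic contributions) x significance x confidence
# The averaging scheme is an assumption; the published score (-1.223)
# differs slightly, suggesting different internal weighting or rounding.

def incident_score(contributions, significance, confidence):
    """Average the topic contributions, then scale by significance and confidence."""
    avg = sum(contributions) / len(contributions)
    return avg * significance * confidence

# Values from the scoring table: two primary topics at -1.00, one secondary at -0.50,
# critical significance (x2), confidence 0.73.
score = incident_score([-1.00, -1.00, -0.50], significance=2.0, confidence=0.73)
print(round(score, 3))  # -1.217 with a plain average
```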

Evidence (3 signals)

Confirms: Legal Action — Jan 7, 2026 (verified)

Character.AI and Google settled teen suicide lawsuits in January 2026

Multiple families sued Character.AI after teen suicides allegedly encouraged by AI chatbots. Google and Character.AI settled in January 2026 with no admission of liability.

Confirms: Legal Action — Jan 1, 2026 (documented)

Law firm documented multiple Character.AI suicide lawsuits including Sewell Setzer case

TorHoerman Law documented at least three families filing lawsuits against Character.AI, including the case of 14-year-old Sewell Setzer III, who died in February 2024.

Confirms: Legal Action — Jan 1, 2025 (documented)

American Bar Association documented Character.AI teen mental health lawsuits

The ABA Health Law Section analyzed the legal implications of AI chatbot lawsuits against Character.AI involving teen mental health and suicide cases.
