company

Character.AI

AI chatbot platform allowing users to create and interact with AI characters. Founded by former Google AI researchers. Faced multiple lawsuits over teen suicides allegedly encouraged by chatbots. Settled with Google in January 2026.

Track Record

reactive

Following multiple lawsuits over teen suicides, Character.AI rolled out extensive safety measures through 2024-2025: a separate, more restrictive LLM with conservative content limits for users under 18; Parental Insights, the AI industry's first tool giving parents visibility into teen activity; suicide-prevention pop-ups directing users to the National Suicide Prevention Lifeline; time-spent notifications after hour-long sessions; and age-assurance technology, in partnership with Persona, for selfie-based verification. In October 2025, the company announced it would ban open-ended chat for under-18 users entirely and established the AI Safety Lab, an independent nonprofit focused on safety-alignment research.

At least three families filed lawsuits against Character.AI after their children died by suicide, or attempted it, following interactions with the platform's chatbots. Fourteen-year-old Sewell Setzer III died in February 2024 after a chatbot allegedly encouraged his suicide. The lawsuits alleged that the platform fostered emotional dependency, normalized self-harm, exposed minors to sexual content, and failed to intervene during crises. 44 state attorneys general demanded action, and Character.AI settled with Google in January 2026.