OpenAI—Two mass shooters used ChatGPT to plan attacks; OpenAI leadership declined to notify law enforcement about flagged conversations
Two mass shooters used ChatGPT to plan their attacks: a Florida State University shooting (spring 2025; 2 dead, 5 wounded) and a British Columbia shooting (February 2026). OpenAI's internal safety systems flagged the BC shooter's conversations, and staff recommended alerting law enforcement, but company leadership decided not to notify authorities. The Florida Attorney General launched a criminal investigation in April 2026. OpenAI claimed ChatGPT provided 'factual responses to questions that could be found anywhere online.'
Scoring Impact
| Topic | Direction | Relevance | Contribution |
|---|---|---|---|
| AI Safety | against | primary | -1.00 |
| Content Moderation | against | secondary | -0.50 |
| Digital Safety for Vulnerable Users | against | primary | -1.00 |
| Overall incident score | | | -0.477 |
Score = avg(topic contributions) × significance (critical ×2) × confidence (0.57) × agency (negligent ×0.5)
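The formula above can be sketched in code. This is an illustrative reconstruction, not the tracker's actual implementation; the function name and parameter names are assumptions. Note that with the rounded confidence value 0.57 shown in the formula, the result is -0.475; the reported -0.477 suggests the tracker uses an unrounded confidence factor internally.

```python
# Illustrative sketch of the incident scoring formula (hypothetical names;
# not the tracker's real code).

def incident_score(contributions, significance, confidence, agency):
    """avg(topic contributions) x significance x confidence x agency."""
    avg = sum(contributions) / len(contributions)
    return avg * significance * confidence * agency

# Topic contributions from the table: AI Safety (-1.00),
# Content Moderation (-0.50), Digital Safety for Vulnerable Users (-1.00).
score = incident_score(
    contributions=[-1.00, -0.50, -1.00],
    significance=2.0,   # critical -> x2 multiplier
    confidence=0.57,    # rounded value from the formula line
    agency=0.5,         # negligent -> x0.5 multiplier
)
print(round(score, 3))  # -0.475 with the rounded 0.57 confidence factor
```

Averaging the three contributions gives -0.8333, and the three multipliers scale that to the final score, so a single critical-significance topic with full confidence and deliberate agency would weigh far more heavily.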
Evidence (1 signal)
NPR reported two mass shooters used ChatGPT to plan attacks, OpenAI leadership declined to notify law enforcement
NPR reported on April 23, 2026 that two mass shooters used ChatGPT to plan their attacks. OpenAI's internal systems flagged conversations from the BC shooter, and staff recommended alerting law enforcement, but company leadership decided not to notify authorities. The Florida Attorney General launched a criminal investigation.