ChatGPT Implicated in Multiple Fatal Incidents

Thumbnail Image

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

ChatGPT was used in several fatal incidents: a student in California died from an overdose after receiving drug advice from the AI, a woman in South Korea poisoned two men after consulting ChatGPT about drug interactions, and a student in Florida used ChatGPT to plan a shooting. Legal actions are underway against OpenAI.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event explicitly involves an AI system (ChatGPT) whose use directly contributed to a criminal act causing harm to human life (two deaths). The AI system was used to obtain information about drug interactions that facilitated the poisoning. The harm is realized and significant (loss of life), and the AI's role is pivotal in the chain of events leading to the incident. Hence, this is an AI Incident rather than a hazard or complementary information.[AI generated]
AI principles
SafetyAccountability

Industries
Consumer services

Affected stakeholders
ConsumersGeneral public

Harm types
Physical (death)Physical (injury)

Severity
AI incident

AI system task:
Interaction support/chatbotsContent generation


Articles about this incident or hazard

Thumbnail Image

کره جنوبی: افشای راز نقش "چت‌جی‌پی‌تی" در قتل ۲ مرد توسط یک زن

2026-05-13
عصر ايران،سايت تحليلي خبري ايرانيان سراسر جهان www.asriran.com
Why's our monitor labelling this an incident or hazard?
The event explicitly involves an AI system (ChatGPT) whose use directly contributed to a criminal act causing harm to human life (two deaths). The AI system was used to obtain information about drug interactions that facilitated the poisoning. The harm is realized and significant (loss of life), and the AI's role is pivotal in the chain of events leading to the incident. Hence, this is an AI Incident rather than a hazard or complementary information.
Thumbnail Image

آیا می‌توان "ChatGPT" را به قتل متهم کرد؟

2026-05-13
خبرآنلاین
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions ChatGPT, an AI system, being used by the perpetrator to plan a violent attack that caused deaths and injuries, which are direct harms to people. The AI system's role is pivotal as it provided information that facilitated the crime. The ongoing legal investigation into the liability of the AI developer further confirms the AI system's involvement in the incident. Hence, this qualifies as an AI Incident under the definition of an event where AI use has directly or indirectly led to harm to persons.
Thumbnail Image

نقش چت‌جی‌پی‌تی در یک پرونده قتل

2026-05-13
tabnak.ir
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of ChatGPT, an AI system, in assisting a criminal act that led to the death of two individuals. The AI system was used to answer questions about the effects and dangers of mixing sleeping pills and alcohol, which the perpetrator then used to poison victims. This direct link between the AI system's use and the resulting harm (fatalities) meets the criteria for an AI Incident. The harm is realized, not just potential, and the AI system's involvement is a contributing factor in the crime. Hence, the event is classified as an AI Incident.
Thumbnail Image

"چت‌جی‌پی‌تی" در قتل ۲ مرد توسط یک زن نقش داشته است!

2026-05-13
جامعه خبری تحلیلی الف
Why's our monitor labelling this an incident or hazard?
The event explicitly involves an AI system (ChatGPT) used by a perpetrator in the commission of a violent crime resulting in two deaths and one injury. The AI system's involvement is in the use phase, where the perpetrator consulted the AI for information related to poisoning. The harm (fatalities) has occurred and is directly linked to the AI system's role in assisting the crime. Therefore, this meets the definition of an AI Incident due to direct harm to persons caused by the use of an AI system.
Thumbnail Image

چت‌جی‌پی‌تی شریک خودکشی یک دانشجو شد

2026-05-13
irib-news.ir
Why's our monitor labelling this an incident or hazard?
The event involves an AI system (ChatGPT) whose outputs (drug and dosage recommendations) were used by a user, leading to fatal harm (overdose death). This constitutes direct involvement of the AI system's use causing harm to a person, fitting the definition of an AI Incident under harm to health. The lawsuit and the described circumstances confirm realized harm linked to the AI system's use.