West Bengal Teen Dies by Suicide After AI-Generated Nude Images Circulated Online


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

A teenage girl in Sonarpur, West Bengal, died by suicide after a neighbor used AI tools to create and circulate non-consensual nude images of her online. The circulation of the images led to harassment and severe mental distress, culminating in her death. Police are investigating, and the family demands strict action.[AI generated]

Why's our monitor labelling this an incident or hazard?

The AI system was used maliciously to generate nude images without consent, which were then shared online, causing harassment and mental distress. This constitutes a violation of human rights and harm to the individual, fulfilling the criteria for an AI Incident. The harm has already occurred, and the AI system's role is pivotal in the chain of events leading to the suicide.[AI generated]
AI principles
Privacy & data governance, Respect of human rights, Safety, Accountability

Industries
Media, social platforms, and marketing

Affected stakeholders
Children

Harm types
Physical (death), Psychological

Severity
AI incident

AI system task
Content generation


Articles about this incident or hazard


Bengal Student Dies By Suicide After AI-Generated Nude Images Go Viral: Cops

2025-11-28
NDTV
Why's our monitor labelling this an incident or hazard?
The AI system was used maliciously to generate nude images without consent, which were then shared online, causing harassment and mental distress. This constitutes a violation of human rights and harm to the individual, fulfilling the criteria for an AI Incident. The harm has already occurred, and the AI system's role is pivotal in the chain of events leading to the suicide.

Bengal Student Dies By Suicide After AI-Generated Nude Images Circulated Online

2025-11-28
News18
Why's our monitor labelling this an incident or hazard?
The article explicitly states that AI tools were used to generate nude images of the student, which were then shared online, leading to harassment and mental distress resulting in suicide. This constitutes a direct link between the AI system's use and harm to the individual's health and rights. The harm is realized and severe, fulfilling the criteria for an AI Incident under the definitions provided.

Bengal SHOCKER: Teen girl ends life after AI-generated nudes posted online by married neighbour; cops say accused used to...

2025-11-28
India.com
Why's our monitor labelling this an incident or hazard?
The involvement of AI is explicit in the generation of fake nude images, which were then used to harass the victim. This use of AI directly led to significant harm to the individual, including mental distress and suicide, which qualifies as injury or harm to a person. Therefore, this event meets the criteria of an AI Incident due to the direct link between AI-generated content and the resulting harm.

Bengal: Class 10 student found hanging in her room after AI-generated nude images circulated online

2025-11-28
ThePrint
Why's our monitor labelling this an incident or hazard?
An AI system was used to generate harmful content (nude images) without consent, which was then shared online causing severe psychological harm and harassment to the victim. The AI's role in creating the images is pivotal to the harm experienced. This meets the criteria for an AI Incident as the AI system's use directly led to significant harm to a person.

Teen girl commits suicide after AI-generated nude images circulated online

2025-11-28
http://www.uniindia.com/fadnavis-orders-probe-into-mumbai-pub-fire/states/news/1090400.html
Why's our monitor labelling this an incident or hazard?
The article explicitly states that AI was used to generate obscene images of the girl, which were then spread online, causing widespread humiliation and mental distress. The AI-generated content directly led to the girl's suicide, fulfilling the criteria for an AI Incident due to harm to a person (harm to health and well-being) and violation of rights. The involvement of AI in creating harmful content and the resulting tragic outcome clearly classify this as an AI Incident rather than a hazard or complementary information.

Bengal: Class 10 student found hanging in her room after AI-generated nude images circulated online

2025-11-28
NewsDrum
Why's our monitor labelling this an incident or hazard?
The AI system was used maliciously to create and distribute non-consensual, AI-generated nude images, which constitutes a violation of the victim's rights and caused direct harm to her health and well-being, leading to her death. This fits the definition of an AI Incident as the AI system's use directly led to harm to a person and violation of rights.

West Bengal Teen Commits Suicide After AI-Generated Nude Images Circulate Online

2025-11-28
thedailyjagran.com
Why's our monitor labelling this an incident or hazard?
The article explicitly states that AI tools were used to create nude images of the teenager without her consent, which were then circulated online, causing prolonged mental distress and harassment. This directly links the AI system's use to harm to the individual's health (mental health leading to suicide) and a violation of her rights. The AI system's involvement is central to the harm, fulfilling the criteria for an AI Incident. The second incident mentioned does not involve AI and is unrelated to the AI classification.