
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
A teenage girl in Sonarpur, West Bengal, died by suicide after a neighbor used AI tools to create non-consensual nude images of her and circulate them online. The resulting harassment caused her severe mental distress, leading to her death. Police are investigating, and the family is demanding strict action.[AI generated]
Why's our monitor labelling this an incident or hazard?
The AI system was used maliciously to generate nude images without consent, which were then shared online, causing harassment and mental distress. This constitutes a violation of human rights and harm to an individual, meeting the criteria for an AI incident: the harm has already occurred, and the AI system played a pivotal role in the chain of events leading to the suicide.[AI generated]