AI-Generated Nude Photos of Bengaluru Students Circulated on Instagram

In Bengaluru, AI-generated nude photos of Class 9 students were circulated on Instagram, leading to a police investigation. The images, created using photos from the victims' private Instagram accounts, were shared in a school-related group. Parents suspect an acquaintance is responsible and have filed a complaint with the Cyber Crime Cell.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event explicitly mentions AI-generated nude images created by extracting photos from social media and manipulating them to produce harmful content. This use of AI directly harmed the victims, including through privacy violations and emotional distress, which falls under violations of human rights and harm to communities. Therefore, this qualifies as an AI Incident due to the realized harm caused by the malicious use of AI systems.[AI generated]
AI principles
Privacy & data governance; Respect of human rights; Safety; Accountability

Industries
Media, social platforms, and marketing

Affected stakeholders
Children

Harm types
Psychological; Reputational; Human or fundamental rights

Severity
AI incident

AI system task
Content generation


Articles about this incident or hazard

Bengaluru: AI-generated Nude Pictures of Class 9 CBSE School Student Emerge Online, Parents File FIR - News18

2024-05-27
News18
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions AI-generated nude images created by extracting photos from social media and manipulating them to produce harmful content. This use of AI directly harmed the victims, including through privacy violations and emotional distress, which falls under violations of human rights and harm to communities. Therefore, this qualifies as an AI Incident due to the realized harm caused by the malicious use of AI systems.

PU student, 2 minor boys held over AI-morphed nude pics of Bengaluru girl | Bengaluru News - Times of India

2024-05-29
The Times of India
Why's our monitor labelling this an incident or hazard?
The article explicitly states that an AI system was used to morph the girl's photo into nude images, which were then posted on social media without her consent. This violates her rights and harms her dignity and privacy. Because the AI system's use directly led to harm to a person and a violation of rights, the event qualifies as an AI Incident.

AI-morphed naked pics of class 9 Bengaluru girl surface on Instagram: Report

2024-05-26
Hindustan Times
Why's our monitor labelling this an incident or hazard?
The event describes AI-generated morphed naked pictures of a minor girl being posted on Instagram without consent. This violates the girl's rights, including her privacy and the legal protections afforded to minors, and harms both the individual and the community. Because the AI system's use in creating these images directly led to this harm, the event qualifies as an AI Incident under the definitions provided.

Bengaluru: Top Private School in Shock After AI-generated Nude Images of Class IX Students Emerge Online

2024-05-27
News18
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions AI-generated nude images of minors being circulated online, which constitutes a violation of rights and harm to the individuals involved. Because the AI system's use directly led to this realized harm, the event meets the criteria for an AI Incident rather than a hazard or complementary information.

AI-Generated Nude Photos of Girl Student Surface on Instagram, FIR Lodged in Bengaluru

2024-05-29
Republic World
Why's our monitor labelling this an incident or hazard?
The incident directly involves the use of an AI system to create manipulated nude photos of a minor, which were then shared on Instagram, harming the victim's dignity, privacy, and mental health. This constitutes a violation of human rights and harm to the individual, fulfilling the criteria for an AI Incident. The AI system's use in generating the fake images is central to the harm caused.

Class 9 students' AI-generated nude photos circulated in Bengaluru, parents file complaint with Cyber Cell

2024-05-27
Firstpost
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions AI-generated nude photos of minors being circulated, which constitutes a violation of rights and harm to the individuals involved. The AI system's use in generating these images is central to the incident, and the direct harm caused meets the criteria for an AI Incident. The involvement of law enforcement and the ongoing investigation further confirm that the harm is serious and has been realized.

Morphed AI-Generated Photos Of A Class IX Student Of A Top Bengaluru School Surfaces Online

2024-05-28
KalingaTV
Why's our monitor labelling this an incident or hazard?
The event describes morphed images of students shared online, indicating the use of AI or digital manipulation tools to create fake nude photos. This led to direct harm to the students involved, including privacy violations and potential psychological harm. The AI system was involved in creating the morphed images, which directly caused the harm. Hence, this is an AI Incident under the definition of harm to persons and violation of rights caused by AI-generated content.

Bengaluru: AI-generated naked pics of minor girl shared on Instagram by anonymous persons, case registered

2024-05-26
News9live
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions AI-generated naked pictures of a minor girl being shared on Instagram, which directly harms her privacy and dignity and constitutes a violation of rights. The AI system's use in generating these images and their dissemination on social media directly led to harm, meeting the criteria for an AI Incident. The registration of a case by police further confirms the recognition of the harm caused. A second event mentioned in the article, the death of a student, is unrelated to AI and does not affect the classification.

Bengaluru: AI-generated nude pictures of students appears on Instagram page, parents lodge police complaint

2024-05-27
News9live
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions AI-generated nude images, indicating the involvement of an AI system in creating manipulated harmful content. The posting of these images has directly led to harm to the students, including violations of their privacy and potential psychological harm, which falls under violations of human rights and harm to individuals. Therefore, this qualifies as an AI Incident due to the realized harm caused by the malicious use of AI-generated content.