Student Faces Trial for AI-Generated Sexual Images of Schoolmates in Córdoba


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

A student in Córdoba, Argentina, used AI to create and publish manipulated sexual images of female classmates, including minors, on adult websites, identifying them by name and linking to their social media. The victims suffered psychological harm and privacy violations, prompting a criminal trial for gender-based violence.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event explicitly mentions the use of AI tools to create false sexual images by placing victims' faces on nude bodies, which were then published online with identifying information. This use of AI directly caused harm to the victims, including psychological injury and violation of their rights, fitting the definition of an AI Incident under violations of human rights and harm to persons. Therefore, this event qualifies as an AI Incident.[AI generated]
AI principles
Privacy & data governance; Respect of human rights

Industries
Media, social platforms, and marketing

Affected stakeholders
Women; Children

Harm types
Psychological; Reputational; Human or fundamental rights

Severity
AI incident

AI system task
Content generation


Articles about this incident or hazard


Young man goes to trial for creating AI images of his schoolmates and uploading them to adult sites

2026-03-19
Los Andes
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions the use of AI tools to create false sexual images by placing victims' faces on nude bodies, which were then published online with identifying information. This use of AI directly caused harm to the victims, including psychological injury and violation of their rights, fitting the definition of an AI Incident under violations of human rights and harm to persons. Therefore, this event qualifies as an AI Incident.

Goes to trial for distributing AI-created nude images of his schoolmates

2026-03-18
La Capital
Why's our monitor labelling this an incident or hazard?
The event explicitly involves the use of AI to generate manipulated images that caused psychological harm, constituting injury to health and violations of rights under applicable law. The AI system's use in creating and disseminating these images directly led to realized, significant harm, including gender-based and digital violence, which aligns with the definition of an AI Incident involving harm to persons and violations of rights.

Young man on trial for creating sexual images of schoolmates with AI and publishing them

2026-03-18
Cadena 3 Argentina
Why's our monitor labelling this an incident or hazard?
The event explicitly involves the use of AI to generate sexualized images of victims without consent, leading to psychological harm and violations of rights, including privacy violations and gender-based violence. The AI system's role in creating the manipulated images is clear and direct, and the resulting harm is realized and significant, meeting the criteria for an AI Incident.

A Córdoba student will go to trial for creating fake sexual photos of schoolmates

2026-03-18
eldiariodecarlospaz.com.ar
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions the use of AI to create false sexual images, which directly led to harm to the victims' mental health and privacy, fulfilling the criteria for an AI Incident. The harm includes psychological injury (anxiety, PTSD, social isolation) and violations of rights (privacy, dignity, protection from gender-based violence). The AI system's use is central to the incident, as it enabled the creation and dissemination of harmful content. Therefore, this is classified as an AI Incident.

He made AI images of his underage schoolmates, uploaded them to an adult website, and now goes to trial

2026-03-18
Diario Primera Linea
Why's our monitor labelling this an incident or hazard?
The event explicitly involves the use of an AI system to create manipulated images. The resulting harm is direct and significant, including psychological injury and violations of rights, and the AI system's role was pivotal in generating the content that caused it. This event therefore qualifies as an AI Incident under the OECD framework.

He made AI images of his underage schoolmates, uploaded them to an adult website, and now goes to trial

2026-03-18
Agencia Noticias Argentinas
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions the use of AI to generate manipulated images, which constitutes an AI system involvement. The use of AI here is central to the harm caused, as the AI-generated images were published and led to psychological injury and digital violence against minors. This meets the criteria for an AI Incident because the AI system's use directly led to harm to persons (psychological injury) and violations of rights (privacy, dignity, and protection from gender-based violence).

The student who generated sexual AI images of his schoolmates in Córdoba will go to trial

2026-03-18
Telefe Córdoba
Why's our monitor labelling this an incident or hazard?
The event explicitly involves the use of AI to generate manipulated sexual images (deepfake-like content) of the victims, which were then distributed, causing psychological and social harm. This fits the definition of an AI Incident because the AI system's use directly led to violations of human rights (privacy, dignity) and harm to individuals (psychological injury). The harm is realized and significant, and the AI system's role is pivotal in creating the harmful content. Therefore, this is classified as an AI Incident.