Minors in Valladolid Tried for Using AI to Create and Share Non-Consensual Nude Images of Classmates


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Ten male minors in Valladolid, Spain, are on trial for using AI to generate and distribute pornographic images that superimposed classmates' faces onto nude bodies. The AI-generated images were shared without consent, leading to charges of child pornography and moral harm, with legal consequences for the accused and psychological consequences for the victims.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event explicitly involves the use of an AI system to generate harmful content (pornographic images) that directly caused harm to individuals (moral and psychological harm to minors). The use of AI was central to the creation and dissemination of this content, leading to legal action and sanctions. This meets the criteria for an AI Incident because the AI system's use directly led to violations of rights and harm to individuals.[AI generated]
AI principles
Privacy & data governance; Respect of human rights

Industries
Media, social platforms, and marketing

Affected stakeholders
Children

Harm types
Psychological; Reputational; Human or fundamental rights

Severity
AI incident

AI system task
Content generation


Articles about this incident or hazard


Ten minors on trial in Valladolid for distributing AI-generated photos of nude female high-school classmates

2026-04-10
El Día de Valladolid
Why's our monitor labelling this an incident or hazard?
The event explicitly involves the use of an AI system to generate harmful content (pornographic images) that directly caused harm to individuals (moral and psychological harm to minors). The use of AI was central to the creation and dissemination of this content, leading to legal action and sanctions. This meets the criteria for an AI Incident because the AI system's use directly led to violations of rights and harm to individuals.

Trial begins for ten minors in Valladolid accused of using AI to distribute photos of nude female high-school classmates

2026-04-13
20 minutos
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions the use of AI to generate pornographic images of minors, which constitutes a violation of human rights and causes harm to the victims' integrity and moral well-being. The AI system's use directly led to the dissemination of harmful content, causing psychological harm and social disruption. Therefore, this qualifies as an AI Incident under the definitions provided, as the AI system's use directly led to harm (violation of rights and harm to individuals and communities).

Ten minors in the dock for using artificial intelligence to create nude photos of their female high-school classmates

2026-04-13
La Voz de Galicia
Why's our monitor labelling this an incident or hazard?
The event explicitly describes the use of an AI system to generate pornographic images of minors without consent, which is a clear violation of rights and causes moral and psychological harm. The AI system's use directly led to the creation and dissemination of harmful content, fulfilling the criteria for an AI Incident under violations of human rights and harm to communities. The legal proceedings and sanctions further confirm the materialized harm.

The minors accused of distributing photos of female classmates from their school...

2026-04-13
Europa Press
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of AI techniques to generate manipulated nude photos of minors, which is a clear violation of human rights and moral integrity. The AI system's use directly led to harm to the victims, including psychological and moral harm, and legal actions are underway. This fits the definition of an AI Incident as the AI system's use has directly led to harm (violation of rights and moral harm).

They plead not guilty to distributing nude photos of female classmates

2026-04-13
El Día de Valladolid
Why's our monitor labelling this an incident or hazard?
The event explicitly mentions the use of AI techniques to generate manipulated nude images of minors, which is a clear involvement of an AI system. The harm includes violations of moral integrity and the creation and dissemination of child pornography, which are serious legal and human rights violations. Although the accused deny involvement and material proof is lacking, the event describes realized harm and legal proceedings arising from the AI system's use. Hence, it qualifies as an AI Incident rather than a hazard or complementary information.

Ten minors on trial in Valladolid for using AI to distribute photos of nude female high-school classmates

2026-04-13
Gente Digital
Why's our monitor labelling this an incident or hazard?
The use of AI to generate and distribute non-consensual pornographic images of minors constitutes a direct violation of human rights and moral integrity, fulfilling the criteria for harm under the AI Incident definition. The AI system's use directly led to the creation and dissemination of harmful content, causing injury to the victims' dignity and mental health. The involvement of AI in the creation of these images and the resulting legal and social consequences confirm this as an AI Incident rather than a hazard or complementary information.