Minors Arrested in Madrid for Distributing AI-Generated Fake Nude Images

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

In Madrid, three minors were arrested for distributing AI-generated fake nude images of 13 adolescent girls. The images, created using mobile apps and websites, were shared via messaging apps. The Guardia Civil opened its investigation in April, leading to the minors' arrest and the referral of the case to the Juvenile Prosecutor's Office.[AI generated]

Why's our monitor labelling this an incident or hazard?

This event involves the use of AI systems to create deepfake nude images of minors and distribute them, resulting in direct harm and violation of the victims’ rights. The involvement of AI in generating illegal content and its actual distribution qualifies it as an AI Incident.[AI generated]
AI principles
Respect of human rights
Privacy & data governance
Safety
Accountability

Industries
Consumer services

Affected stakeholders
Children

Harm types
Psychological
Reputational
Human or fundamental rights

Severity
AI incident

AI system task
Content generation


Articles about this incident or hazard


Three minors arrested for distributing AI-generated nudes of 13 girls

2024-07-12
20 minutos
Why's our monitor labelling this an incident or hazard?
This event involves the use of AI systems to create deepfake nude images of minors and distribute them, resulting in direct harm and violation of the victims’ rights. The involvement of AI in generating illegal content and its actual distribution qualifies it as an AI Incident.

Three minors arrested for distributing AI-manipulated images of nude adolescents

2024-07-12
infobae
Why's our monitor labelling this an incident or hazard?
The incident involves the use of AI-based tools to generate and distribute sexualized images of minors, directly causing harm and violating criminal and human rights protections. This meets the criteria for an AI Incident, as the AI system’s use led directly to illegal distribution of child sexual content.

Three minors arrested for distributing AI-manipulated pornographic images in San Agustín de Guadalix

2024-07-12
20 minutos
Why's our monitor labelling this an incident or hazard?
This event describes the malicious use of AI systems to produce and distribute non-consensual, pornographic images of minors, directly causing harm (sexual exploitation, privacy and data protection violations, moral degradation). It is not merely a potential risk or a contextual update, but a realized incident involving AI misuse with clear legal and human-rights implications.

Three minors arrested for distributing AI-manipulated images of nude adolescents

2024-07-12
El Confidencial
Why's our monitor labelling this an incident or hazard?
This is a direct incident in which AI systems were used to create and disseminate sexualized images of minors without consent, constituting a violation of rights and child pornography laws. The misuse of AI led directly to harm and criminal charges.

Three minors arrested for distributing photographs of nude adolescents generated with artificial intelligence

2024-07-12
La Voz de Galicia
Why's our monitor labelling this an incident or hazard?
The event describes the direct misuse of AI systems to produce non-consensual, child sexual exploitation material. The generation and dissemination of these images have already occurred, causing harm (violation of minors’ rights, distribution of illicit content). Therefore this qualifies as an AI Incident.

Three minors arrested for distributing AI-generated nudes of...

2024-07-12
europa press
Why's our monitor labelling this an incident or hazard?
An AI system was explicitly used to generate sexualized images of children by “undressing” photographs via mobile apps/websites. These AI-generated images of minors were then distributed, causing actual harm (violation of minors’ rights, child pornography). This is a realized incident involving AI misuse leading directly to criminal harm.

Three minors arrested in Madrid for distributing AI-manipulated images of nude adolescents

2024-07-12
La Razón
Why's our monitor labelling this an incident or hazard?
The event involves the use of AI applications to create fake nude photographs of underage girls and distribute them, constituting direct harm (sexual exploitation, violation of minors’ rights) through misuse of an AI system. Harms have materialized, so it is classified as an AI Incident.

Three minors arrested in Madrid for distributing AI-generated nudes of 13 girls, also minors

2024-07-12
LaSexta
Why's our monitor labelling this an incident or hazard?
An AI system was explicitly used to manipulate photos to create fake pornographic images of minors, and these images were distributed among minors. The misuse of AI in generating and sharing child sexual abuse content constitutes direct harm under human rights and criminal law, making this an AI incident.

Three minors arrested in Madrid for spreading fake photos of nude adolescents manipulated with artificial intelligence

2024-07-12
Antena3
Why's our monitor labelling this an incident or hazard?
The event involves the use of an AI system to generate and distribute illicit sexual content of minors, causing real harm and violating the victims’ rights and moral integrity. This is a direct instance of AI misuse leading to illegal and harmful outcomes.

The Guardia Civil arrests three minors for "undressing" their high-school classmates with artificial intelligence

2024-07-12
EL MUNDO
Why's our monitor labelling this an incident or hazard?
The episode describes the use of AI image-manipulation tools to create and share false nude photos of minors, directly causing harm (distribution of child pornography, harassment, moral integrity violations). As an AI system’s use led to realized legal and personal harms, it is classified as an AI Incident.

Three minors arrested in Madrid for spreading AI-created images of nude young women

2024-07-12
LaVanguardia
Why's our monitor labelling this an incident or hazard?
An AI system (online AI image-generation apps) was used to produce and share illegal sexual content involving minors, constituting a direct violation of human rights (protection of minors, moral integrity) and criminal pornography laws. This harm has already occurred, so it is an AI Incident.

Three minors arrested for distributing fake photos of nude girls: they made them with artificial intelligence

2024-07-12
El Español
Why's our monitor labelling this an incident or hazard?
The incident involves deliberate misuse of AI systems (mobile/web apps using AI to undress images) to create sexualized content of minors and distribute it on social media, causing direct harm (child pornography, moral harm, privacy violations, and harassment). This meets the definition of an AI Incident, as the AI system’s use directly led to realized harm under criminal statutes.

Three minors arrested in Madrid for distributing AI-manipulated images of nude adolescents

2024-07-12
Público.es
Why's our monitor labelling this an incident or hazard?
The detained minors used AI-based applications and websites to generate sexualized, non-consensual images of underage girls, then distributed them. The AI system’s misuse directly caused harm through privacy violations, moral degradation, and creation of child pornography, triggering criminal investigation and arrests. This meets the definition of an AI Incident as the AI-enabled content generation led to actual, realized harm.

Three minors arrested in Madrid for distributing AI-generated nude photos of 13 girls

2024-07-12
El Observador
Why's our monitor labelling this an incident or hazard?
The case involves the malicious use of an AI system to generate and distribute nonconsensual nude images of minors, directly causing legal and moral harm (child exploitation, harassment, violation of data protection). The harm has materialized, making this an AI Incident.

Three minors arrested for distributing AI-manipulated images of nude adolescents

2024-07-12
El Periódico
Why's our monitor labelling this an incident or hazard?
The event describes direct misuse of AI applications to create and distribute sexual images of minors, causing actual harm through violations of personal rights and laws protecting children. The AI system’s pivotal role in producing the illicit images classifies this as an AI Incident.

Three minors arrested in Madrid for distributing AI-generated nudes of several girls

2024-07-12
OndaCero
Why's our monitor labelling this an incident or hazard?
The article explicitly states that AI techniques were used to generate fake nude images of minors, which were then distributed, constituting child pornography and other legal violations. The AI system's use directly led to harm to the victims' rights and moral integrity, fulfilling the criteria for an AI Incident.

Three minors arrested for distributing AI-manipulated images of nude adolescents

2024-07-12
Diario de Cádiz
Why's our monitor labelling this an incident or hazard?
The article describes the use of AI applications to generate fake nude images of minors, which were then distributed, causing harm to the victims. The manipulated content violated the victims' personal rights and moral integrity, with realized, direct injury to the dignity and privacy of minors and ensuing legal consequences. This event therefore qualifies as an AI Incident rather than a hazard or complementary information.

Three minors arrested for distributing fake sexual images of other underage girls

2024-07-12
Telemadrid
Why's our monitor labelling this an incident or hazard?
The article explicitly states that AI techniques were used to manipulate images to create fake sexual content of minors, which were then distributed, causing harm. This constitutes a violation of rights and harm to individuals, fulfilling the criteria for an AI Incident. The AI system's use directly led to the harm, and the event involves the development and use of AI systems for harmful purposes.

Three adolescents apprehended for distributing AI-generated intimate content of 13 underage girls

2024-07-12
Qué!
Why's our monitor labelling this an incident or hazard?
The article explicitly states that AI techniques were used to generate fake sexual images of minors, which the suspects then distributed. This use of AI directly harmed the victims, violating their rights and moral integrity, and is punishable under criminal law. The event is not merely a potential risk or a complementary update but a realized harm caused by AI misuse, meeting the criteria for an AI Incident.