Florida Man Arrested for Creating AI-Generated Deepfake Nudes from Social Media Images

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Sammarth Gautam was arrested in Florida for using AI software to generate non-consensual nude images of at least three victims by digitally removing clothing from their social media photos. The victims discovered the AI-generated deepfakes online; the resulting privacy violations and sexual exploitation prompted criminal charges.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event explicitly involves an AI system (deepfake software) used to generate fake nude images without consent, which directly led to harm to the victims' privacy and dignity. This constitutes a violation of human rights and sexual exploitation, fitting the definition of an AI Incident. The AI system's use directly caused the harm, and the event involves realized harm, not just potential harm.[AI generated]
AI principles
Privacy & data governance, Respect of human rights, Safety, Robustness & digital security, Transparency & explainability, Accountability, Human wellbeing

Industries
Media, social platforms, and marketing

Affected stakeholders
General public

Harm types
Human or fundamental rights, Psychological, Reputational

Severity
AI incident

AI system task
Content generation


Articles about this incident or hazard

Computer-generated fake nudes discovered by victims on the internet, Florida cops say

2024-04-23
Yahoo
Why's our monitor labelling this an incident or hazard?
The event explicitly involves an AI system (deepfake software) used to generate fake nude images without consent, which directly led to harm to the victims' privacy and dignity. This constitutes a violation of human rights and sexual exploitation, fitting the definition of an AI Incident. The AI system's use directly caused the harm, and the event involves realized harm, not just potential harm.
Computer-generated fake nudes discovered by victims on the Internet, Florida cops say

2024-04-24
The Star
Why's our monitor labelling this an incident or hazard?
The event explicitly involves an AI system (deepfake software) used to create fake nude images without consent, which directly led to harm in the form of privacy violations and sexual exploitation. The involvement of AI in generating the images is clear, and the harm is realized, not just potential. Therefore, this qualifies as an AI Incident under the framework, specifically a violation of human rights and privacy.
Man arrested after Hillsborough investigation of AI porn depicting several victims

2024-04-23
Tampa Bay Times
Why's our monitor labelling this an incident or hazard?
The event explicitly describes the use of AI to generate pornographic images without the victims' consent, which is a clear violation of human rights and privacy. The harm has already occurred as victims found these images online, and the perpetrator was arrested for promotion of altered sexual depictions without consent. This fits the definition of an AI Incident because the AI system's use directly led to harm to individuals' rights and dignity.
Computer-generated fake nudes discovered by victims on the internet, Florida cops say

2024-04-23
The Charlotte Observer
Why's our monitor labelling this an incident or hazard?
The article describes the creation and dissemination of AI-generated deepfake nude images without consent, which is a clear violation of personal privacy and sexual exploitation. The AI system was used to manipulate original clothed photos to produce realistic nude images, directly causing harm to the victims. This fits the definition of an AI Incident as it involves the use of an AI system leading directly to harm (violation of rights and privacy).
Man accused of creating computer-generated porn images of several victims

2024-04-23
WFLA
Why's our monitor labelling this an incident or hazard?
The event involves the use of an AI system (software that removes clothing from images) to generate pornographic images without consent, which constitutes a violation of the victims' rights and privacy. This harm has already occurred, as the images were found online and victims reported the incident. Therefore, this qualifies as an AI Incident due to the direct harm caused by the AI system's use in creating non-consensual explicit content.
Man accused of creating computer-generated pornography from victims' social media images arrested in Florida

2024-04-23
FOX 13 Tampa Bay
Why's our monitor labelling this an incident or hazard?
The suspect used software to create altered sexual depictions without consent by removing clothing from original images, a clear misuse of AI content-generation technology that violated the victims' personal rights and privacy. This fits the definition of an AI Incident because the AI system's use directly led to realized harm to the victims' rights and dignity.