Voice Actors Condemn Non-Consensual AI Deepfake Use in Skyrim Mods


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Voice actors have publicly denounced the use of AI voice-cloning tools, such as ElevenLabs, to create non-consensual pornographic mods for Skyrim. These AI-generated deepfakes exploit actors' voices without permission, violating their rights and causing reputational and personal harm; meanwhile, modding platforms have not banned such content.[AI generated]

Why's our monitor labelling this an incident or hazard?

The article explicitly describes AI-generated voice acting using voice cloning technology (an AI system) that impersonates real voice actors without their consent, especially for pornographic content. This unauthorized use directly violates the actors' rights and causes harm to them. The AI system's use has directly led to this harm, fulfilling the criteria for an AI Incident under violations of human rights and intellectual property rights. The presence of the AI system is clear, the harm is realized, and the event is not merely a potential risk or complementary information but a current incident.[AI generated]
AI principles
Accountability
Privacy & data governance
Respect of human rights
Transparency & explainability
Human wellbeing
Safety

Industries
Arts, entertainment, and recreation
Media, social platforms, and marketing

Affected stakeholders
Workers

Harm types
Reputational
Psychological
Human or fundamental rights

Severity
AI incident

AI system task
Content generation


Articles about this incident or hazard


'Voice actors are being abused' by Skyrim modding communities using AI

2023-07-06
TechRadar

Anger from voice actors as NSFW mods use AI deepfakes to replicate their voices: 'This is NOT okay.'

2023-07-06
pcgamer
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions AI deepfake technology being used to clone voice actors' voices without consent for non-consensual pornographic content, which constitutes a violation of intellectual property and personal rights. The harm is realized and ongoing, as voice actors express anger and distress, and efforts are underway to track and remove such content. The AI system's misuse directly leads to harm (violation of rights and reputational damage), fulfilling the criteria for an AI Incident rather than a hazard or complementary information. The presence of AI systems (voice cloning/deepfake AI) is clear, and the harm is direct and significant.

Video game voice actors denounce NSFW mods using AI deepfakes of their voices

2023-07-07
NME
Why's our monitor labelling this an incident or hazard?
The use of AI voice cloning to create non-consensual pornographic content directly harms the voice actors by violating their rights and causing reputational and personal harm. The AI system's outputs are central to the harm, fulfilling the criteria for an AI Incident. Although no official legal or company response is noted, the harm to the actors is realized and ongoing, not merely potential. Therefore, this event qualifies as an AI Incident due to the direct involvement of AI in causing violations of rights and harm to individuals.

Skyream: voice actors displeased by smutty game mods using AI to clone their voices | Boing Boing

2023-07-08
Boing Boing
Why's our monitor labelling this an incident or hazard?
The use of AI to clone voices without consent directly infringes on the voice actors' rights, constituting a breach of intellectual property and personality rights under applicable law. The harm is realized: the actors' voices, which are their professional assets, are being exploited without permission, causing tangible damage. This event therefore qualifies as an AI Incident due to violations of rights caused by AI use.

Voice Actors Speak Out Against Skyrim NSFW Mods Using Deepfake AI

2023-07-07
Twinfinite
Why's our monitor labelling this an incident or hazard?
The article describes the use of AI voice cloning systems to generate non-consensual deepfake content, which directly harms voice actors by violating their rights and consent. The harm is realized and ongoing, as the content is being created and distributed. The AI system's use is central to the harm, fulfilling the criteria for an AI Incident under violations of human rights and intellectual property rights. The presence of AI voice cloning is explicit, and the harm is direct and materialized.