AI Voice Cloning Threatens Voice Actors' Rights and Livelihoods

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

AI-driven voice cloning is increasingly replacing human voice actors in dubbing and entertainment, raising concerns about job loss, unauthorized use of voices, and violations of intellectual property and labor rights. Industry figures and unions are negotiating to protect actors from economic and rights-based harms caused by synthetic voice technologies.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event involves AI systems used for voice cloning, which directly affect voice actors' intellectual property and labor rights through unauthorized use of their voices and a lack of fair compensation. The article describes ongoing use and misuse of AI voice cloning technology that harms individuals and communities (voice actors and the entertainment industry). The event therefore qualifies as an AI Incident: rights violations and harm have been realized through the use of AI systems.[AI generated]
AI principles
Accountability; Fairness; Human wellbeing; Privacy & data governance; Respect of human rights; Transparency & explainability

Industries
Arts, entertainment, and recreation; Media, social platforms, and marketing

Affected stakeholders
Workers

Harm types
Economic/Property; Human or fundamental rights

Severity
AI incident

Business function:
Research and development; Other

AI system task:
Content generation


Articles about this incident or hazard

Voice actors and AI: "Maybe we are the last generation"

2024-04-04
Salzburger Nachrichten
Why's our monitor labelling this an incident or hazard?
The event involves AI systems used for voice cloning, which directly affect voice actors' intellectual property and labor rights through unauthorized use of their voices and a lack of fair compensation. The article describes ongoing use and misuse of AI voice cloning technology that harms individuals and communities (voice actors and the entertainment industry). The event therefore qualifies as an AI Incident: rights violations and harm have been realized through the use of AI systems.
Voice actors and AI: "Maybe we are the last generation"

2024-04-07
heise online
Why's our monitor labelling this an incident or hazard?
The article explicitly describes AI systems for voice cloning and synthetic voice generation that have already caused harm: human voice actors have been replaced, voices have been used without authorization, and labor disputes have arisen. These harms amount to violations of human and labor rights as well as economic harm to individuals and communities that depend on voice acting. The presence and use of AI systems are clear and central to the harms described, so the event qualifies as an AI Incident rather than a hazard or complementary information.
Voice actors worry about their professional future in the age of AI

2024-04-04
der Standard
Why's our monitor labelling this an incident or hazard?
The event involves AI systems that clone voices and generate synthetic speech, which are explicitly mentioned and currently used in media production. Their use has directly harmed the professional community of voice actors by replacing human labor, causing economic harm and potentially violating intellectual property and labor rights. The article also mentions unauthorized use of voices, which constitutes a rights violation in itself. The event therefore qualifies as an AI Incident: harm to people and their rights has been realized through AI use.
Voice actors: "Maybe we are the last generation"

2024-04-04
Die Presse
Why's our monitor labelling this an incident or hazard?
The article explicitly describes AI systems that clone human voices and are used commercially, directly harming voice actors through unauthorized use of their voices and potential job displacement. The harm includes violations of labor rights and economic harm to individuals who rely on their voices professionally, and the AI systems' use is central to it, fulfilling the criteria for an AI Incident. The concerns about voice cloning and the inability to patent a voice further underline the direct impact on rights and livelihoods. This is therefore a realized incident involving AI systems causing harm, not merely a potential hazard or complementary information.
Voice of the future: AI and perhaps the last generation of voice actors

2024-04-04
www.kleinezeitung.at
Why's our monitor labelling this an incident or hazard?
The article clearly involves AI systems used for voice cloning and synthetic voice generation, which fit the definition of an AI system. The concerns raised relate to the potential replacement of human voice actors and unauthorized use of voices, which could plausibly lead to harms such as violations of intellectual property or labor rights. However, the article does not report any specific case in which such harm has already occurred or been legally recognized. Its focus is the industry's concerns and the ongoing negotiations to address them. The event is therefore best classified as Complementary Information: context on societal and governance responses to AI's impact on the voice-acting profession rather than a report of an AI Incident or Hazard.
Artificial intelligence: Voice actors and AI: "Maybe we are the last generation"

2024-04-04
Stuttgarter-Zeitung.de
Why's our monitor labelling this an incident or hazard?
The article clearly involves AI systems used for voice cloning and synthetic voice generation, which fit the definition of an AI system. The concerns raised relate to the potential displacement of human voice actors and unauthorized use of voices, which could lead to violations of intellectual property or labor rights. However, the article describes no concrete case in which harm has already occurred through AI use; it discusses the current state of the industry, its reactions, and possible future consequences. It therefore meets the criteria for neither an AI Incident nor an AI Hazard, and instead provides contextual and industry-response information about AI's impact, fitting the definition of Complementary Information.
Voice actors and AI: "Maybe we are the last generation"

2024-04-04
Baden online
Why's our monitor labelling this an incident or hazard?
The article clearly involves AI systems (voice cloning and synthetic voice generation) and discusses their use in media dubbing. The concerns raised relate to potential violations of intellectual property and labor rights (rights to one's own voice) and to economic harm to voice actors. However, the article does not report a specific AI incident in which harm has already occurred; it mainly describes fears and ongoing negotiations to prevent such harm. The mention of voices being "stolen" on obscure platforms suggests misuse but gives no detail on direct harm or legal outcomes. The situation therefore fits the definition of an AI Hazard: AI use that could plausibly lead to harm (job loss, rights violations) in the future if left unregulated or misused.