UK Music Industry Threatens Legal Action Against AI Voice Cloning Platform Jammable


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

The British Phonographic Industry (BPI) has threatened legal action against Jammable (formerly Voicify), an AI platform offering thousands of unlicensed voice models that clone famous artists' voices to create deepfake music. The service's unauthorized use of copyrighted material has prompted industry backlash and the removal of some voice models.[AI generated]

Why's our monitor labelling this an incident or hazard?

The AI system in question is a deepfake voice cloning technology used to imitate artists' voices. The British Phonographic Industry (BPI) alleges that the system was trained using copyrighted works without authorization, constituting a breach of intellectual property rights. This is a direct violation of legal protections for artists' creative works, fulfilling the criteria for an AI Incident under the category of violations of intellectual property rights. The legal proceedings and the complaint indicate that harm has occurred or is ongoing due to the unauthorized use of copyrighted material in the AI system's development and deployment.[AI generated]
AI principles
Accountability, Privacy & data governance, Respect of human rights, Transparency & explainability, Democracy & human autonomy

Industries
Arts, entertainment, and recreation

Affected stakeholders
Business

Harm types
Economic/Property, Reputational

Severity
AI incident

AI system task
Content generation


Articles about this incident or hazard


Fake it till you make it? Not on our watch, says UK music industry

2024-03-18
Euronews English
Why's our monitor labelling this an incident or hazard?
The AI system in question is a deepfake voice cloning technology used to imitate artists' voices. The British Phonographic Industry (BPI) alleges that the system was trained using copyrighted works without authorization, constituting a breach of intellectual property rights. This is a direct violation of legal protections for artists' creative works, fulfilling the criteria for an AI Incident under the category of violations of intellectual property rights. The legal proceedings and the complaint indicate that harm has occurred or is ongoing due to the unauthorized use of copyrighted material in the AI system's development and deployment.

'Deepfake' music start-up Voicify in copyright row

2024-03-18
The Sunday Times
Why's our monitor labelling this an incident or hazard?
The event describes a start-up using AI systems to create deepfake music imitating artists' voices, which allegedly infringes copyright by training on copyrighted works. This constitutes a violation of intellectual property rights, a form of harm under the AI Incident definition. The legal action and accusations indicate that the AI system's development and use have directly led to this harm, qualifying the event as an AI Incident.

College student pulls Drake AI deepfake model after threats from U.K. music industry

2024-03-18
Fortune
Why's our monitor labelling this an incident or hazard?
The AI system developed by Jammable used copyrighted voices of artists like Amy Winehouse and Drake to generate deepfake audio without authorization, directly infringing on intellectual property rights. This has led to legal action and the removal of the AI-generated models, demonstrating realized harm. Therefore, this qualifies as an AI Incident due to the violation of rights caused by the AI system's use and deployment.

Music Industry Threatens 'Deepfake AI Music' Service With Legal Action

2024-03-20
TorrentFreak
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions an AI system (vocal cloning AI) used to generate music that allegedly infringes copyright, a violation of intellectual property rights and a form of harm under AI Incident definitions. However, the harm has not yet been realized through legal proceedings or a confirmed infringement ruling; the article instead focuses on the threat of legal action and the industry's response. This fits the definition of Complementary Information, as it details governance and societal responses to AI-related copyright issues rather than reporting a direct AI Incident or a plausible future AI Hazard. The AI system's involvement is clear, but the event centers on legal threats and industry pressure rather than an actual incident of harm or a credible imminent risk of harm.

Jammable, formerly known as Voicify, offers 3,000 AI models to clone artists' voices. Now it faces legal action from the UK's music industry

2024-03-18
Music Business Worldwide
Why's our monitor labelling this an incident or hazard?
The AI system involved is a vocal cloning service that uses AI models to replicate famous artists' voices without authorization. This unauthorized use directly infringes copyright and artists' rights, harming the artists and the music industry. The article details ongoing legal action and industry responses to these harms, confirming that the AI system's use has led to actual violations and damages. The event therefore meets the criteria for an AI Incident, as the AI system's use has directly caused harm (violation of intellectual property and publicity rights).

Jammable Faces BPI Legal Threat Over Soundalike Artist Voices

2024-03-18
Digital Music News
Why's our monitor labelling this an incident or hazard?
The AI system in question is explicitly described as an AI-powered soundalike voice platform that generates audio mimicking real artists' voices. The BPI's legal threat is based on infringement concerns, indicating that the AI system's use has directly led to a violation of intellectual property rights. Since the harm (rights infringement) is occurring or has occurred due to the AI system's use, this qualifies as an AI Incident rather than a hazard or complementary information. The event is not merely a potential risk but involves actual unauthorized use and legal dispute, meeting the criteria for an AI Incident.

Music body threatens first-of-its-kind AI deepfake suit

2024-03-19
World IP Review
Why's our monitor labelling this an incident or hazard?
The AI system (Jammable) uses voice cloning technology to replicate artists' voices without permission, infringing on copyright and intellectual property rights. This unauthorized use harms artists by exploiting their creative work and potentially damaging their income and reputation. The involvement of the AI system in creating deepfake music content is central to the harm. The BPI's legal warning and threat of lawsuit confirm that harm has occurred or is ongoing, meeting the criteria for an AI Incident involving violations of intellectual property rights.