Artists Sue OpenAI Over Dall-E 3's Use of Copyrighted Works in AI Training

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

OpenAI's Dall-E 3 image generator, praised for its advanced capabilities, has sparked lawsuits and backlash from artists who allege their copyrighted works were used without consent to train the AI. Artists claim this infringes on their intellectual property rights and threatens their livelihoods.[AI generated]

Why's our monitor labelling this an incident or hazard?

The article explicitly discusses AI systems (Dall-E 3 and similar generative AI models) that have been trained on artists' copyrighted works without permission, leading to lawsuits alleging copyright infringement. This is a direct harm related to intellectual property rights violations caused by the AI systems' development and use. The presence of ongoing legal action and artists' complaints confirms that harm has materialized, not just a potential risk. The AI system's role is pivotal as it is the use of these AI models trained on unauthorized data that caused the harm. Hence, this is an AI Incident rather than a hazard or complementary information.[AI generated]
AI principles
Accountability; Transparency & explainability; Privacy & data governance; Fairness; Respect of human rights; Human wellbeing

Industries
Arts, entertainment, and recreation; Media, social platforms, and marketing

Affected stakeholders
Workers

Harm types
Economic/Property

Severity
AI incident

Business function
Research and development

AI system task
Content generation


Articles about this incident or hazard

OpenAI won't say how many artists have opted out of training AI

2023-11-03
The Verge
Why's our monitor labelling this an incident or hazard?
The article involves AI systems in the context of training data and artists' efforts to exclude their content, but it reports no direct or indirect harm. Its focus is the process and challenges of opting out, along with tools designed to disrupt AI image generation, which are responses to AI impacts rather than incidents or hazards in themselves. It is therefore Complementary Information, providing context and updates on societal and governance responses to AI-related concerns.
Dall-E 3 is so good it's stoking an artist revolt against AI scraping

2023-11-06
The Star
Dall-E 3 Is So Good It's Stoking an Artist Revolt Against AI Scraping

2023-11-03
BNN
Why's our monitor labelling this an incident or hazard?
The article details ongoing lawsuits against AI companies for the unauthorized use of artists' copyrighted works in training AI image generators, a violation of intellectual property rights and a breach of the laws protecting them. The harm to artists' economic interests and creative control is a direct consequence of the AI systems' development and use. The article also highlights insufficient opt-out mechanisms and the creation of tools to mitigate these harms, reinforcing that the harm is occurring and recognized. It therefore qualifies as an AI Incident under the framework: the use of these AI systems has directly or indirectly led to violations of intellectual property rights and harm to artist communities.
Dall-E 3 Is So Good It's Stoking an Artist Revolt Against AI Scraping

2023-11-04
HT Tech
Why's our monitor labelling this an incident or hazard?
The article involves AI systems (Dall-E 3 and other image generators) and discusses their training on artists' works without consent, which implicates intellectual property rights. However, it does not describe a concrete AI Incident in which harm was directly or indirectly caused by an AI system's outputs or malfunction. Instead, it focuses on ongoing legal challenges, opt-out mechanisms, and protective tools, which are responses and developments within the AI ecosystem. These align with the definition of Complementary Information, as they provide supporting context and updates rather than reporting a new AI Incident or AI Hazard.