Activision Blizzard’s AI Integration Sparks Layoffs and Artist Outcry

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Activision Blizzard integrated generative AI tools like Midjourney and Stable Diffusion into concept art and in-game cosmetics for Call of Duty: Modern Warfare 3. Under former CTO Michael Vance’s direction, artists were compelled to adopt AI while the studio cut about 1,900 jobs, stoking fears of AI replacing human creatives.[AI generated]

Why's our monitor labelling this an incident or hazard?

The article explicitly mentions the use of generative AI systems (Midjourney, Stable Diffusion) in game development and links this use to the direct harm of job losses and layoffs affecting thousands of workers. This constitutes harm to people (economic and employment harm), which falls under the definition of an AI Incident. The AI system's use is directly leading to realized harm (job loss), not just a potential risk, so this is an AI Incident rather than a hazard or complementary information.[AI generated]
AI principles
Accountability
Fairness
Human wellbeing
Transparency & explainability
Democracy & human autonomy

Industries
Arts, entertainment, and recreation

Affected stakeholders
Workers

Harm types
Economic/Property
Psychological
Reputational

Severity
AI incident

Business function
Research and development
Marketing and advertisement
Sales

AI system task
Content generation


Articles about this incident or hazard

AI Is Already Taking Jobs in the Video Game Industry

2024-07-23
Wired
Activision fired artists after using AI for Call of Duty game

2024-07-25
The Hindu
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of AI systems (generative AI models) in creating game content and concept art. The layoffs of artists following AI adoption indicate harm to labor rights and employment, which fits criterion (c) of the AI Incident definition: violations of labor rights or significant harm to groups of people. The harm is realized, not just potential, as layoffs have occurred. Hence, this is an AI Incident rather than a hazard or complementary information.
Bleak report alleges that Activision already has an AI-generated cosmetic in Call of Duty -- with artists reportedly 'forced to use AI to aid in their work' and pushed into outsourcing

2024-07-25
pcgamer
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions generative AI systems being used in the creation of game art assets, with artists forced to use AI and layoffs linked to AI adoption. The sale of AI-generated cosmetics without clear disclosure may mislead consumers. These factors indicate direct involvement of AI systems in causing labor rights issues and potential consumer deception, qualifying this as an AI Incident under violations of labor rights and intellectual property rights. The harms are realized, not just potential, so it is not a hazard or complementary information.
Call Of Duty: MW3 Somehow Just Got Even Worse

2024-07-24
ScreenRant
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of generative AI tools in creating concept art and marketing materials for a major game release, which constitutes AI system involvement in development and use. While there is controversy over the impact on human talent and the lack of disclosure to consumers, no direct harm such as legal violations or physical harm is reported to have occurred. The concerns about job displacement and ethical issues constitute a credible risk of future harm. This event therefore fits the definition of an AI Hazard rather than an AI Incident or Complementary Information: its focus is on the potential negative implications of AI use, not a specific realized harm or a response to a past incident.
Activision Blizzard is reportedly already making games with AI, and quietly sold an AI-generated microtransaction in Call of Duty: Modern Warfare 3

2024-07-24
gamesradar
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of generative AI systems in creating game assets and selling AI-generated skins, which is a direct use of AI systems. The layoffs of 2D artists and the forced adoption of AI tools by remaining staff indicate harm to labor rights and employment conditions, fulfilling criterion (c) for harm: violations of human rights or labor rights. The harm is realized, not just potential, as layoffs have occurred and workers are compelled to use AI. Hence, this is an AI Incident rather than a hazard or complementary information.
Activision Blizzard has begun using AI to pick up the slack left by laid-off artists

2024-07-26
TechSpot
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions generative AI systems being used to create game content that replaced human artists, resulting in layoffs of both junior and senior staff. This is a direct use of AI leading to harm in the form of job loss, which is a violation of labor rights and causes significant harm to affected workers. The AI system's role is pivotal in this harm, meeting the criteria for an AI Incident under violations of labor rights and harm to people.
A new report suggests AI might be replacing video game workers

2024-07-24
Fast Company
Why's our monitor labelling this an incident or hazard?
The event involves AI systems (image generation AI) whose use in the video game industry could plausibly lead to significant harm in the form of job losses and economic disruption for workers. Although layoffs have occurred, the article does not establish a direct causal link between AI use and these layoffs. The harm is thus potential rather than realized. Therefore, this situation fits the definition of an AI Hazard, as the development and use of AI systems could plausibly lead to harm (job displacement) but no direct or indirect harm caused by AI has been confirmed yet.
Call of Duty Accused of Selling AI-Generated Cosmetic

2024-07-24
Game Rant
Why's our monitor labelling this an incident or hazard?
An AI system (generative AI) was used in the development and deployment of game content that was sold to consumers. This raises ethical concerns about originality and can be considered a violation of intellectual property rights under the framework. Additionally, the forced use of AI and layoffs linked to AI adoption imply harm to labor rights. Since these harms have already occurred (sale of AI-generated content without disclosure and workforce impacts), this qualifies as an AI Incident. The article does not merely speculate about future risks but reports realized impacts and ethical breaches related to AI use.
Former Activision staff concerned about gen AI tools used in development, including Midjourney

2024-07-25
GamesIndustry.biz
Why's our monitor labelling this an incident or hazard?
The article discusses the use of generative AI tools like Midjourney and Stable Diffusion in game development and highlights employee concerns about AI replacing artists and the use of copyrighted material without consent. While these raise important ethical and legal questions, the article does not report any direct or indirect harm caused by the AI systems, nor does it describe a plausible imminent risk of harm. The focus is on industry practices, employee reactions, and broader trends, which fits the definition of Complementary Information rather than an AI Incident or AI Hazard.
Activision Blizzard's AI Integration Raises Concerns of Job Cuts in Gaming Industry

2024-07-24
WinBuzzer
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of generative AI systems (Midjourney, Stable Diffusion, OpenAI tools) in game development leading to layoffs of thousands of workers, including at Activision Blizzard. The layoffs and job insecurity are direct harms linked to the AI system's deployment and use. The ethical concerns and increased unionization efforts further indicate violations of labor rights and harm to communities. Since the harm is realized and directly linked to AI use, this event meets the criteria for an AI Incident rather than a hazard or complementary information.
Activision accused of using AI skins in Modern Warfare 3

2024-07-24
ggrecon.com
Why's our monitor labelling this an incident or hazard?
The article discusses AI-generated content in game cosmetics and related workforce impacts but does not describe any direct or indirect harm caused by the AI system's development, use, or malfunction. The layoffs and use of AI tools are business and labor issues rather than AI incidents causing harm as defined. There is no indication that the AI use has led to injury, rights violations, or other harms, nor that it plausibly could lead to such harms. Therefore, this is best classified as Complementary Information providing context on AI adoption and its impact on the creative workforce in gaming.
WIRED: Activision Artists Were Forced to Use AI

2024-07-24
80.lv
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of AI systems for generating art assets and the subsequent layoffs of 1,900 employees at Activision, which indicates harm to labor rights and employment. The forced use of AI and the replacement of human artists with AI-generated content constitutes a violation of labor rights and harm to the community of workers. Therefore, this qualifies as an AI Incident due to realized harm linked to the use of AI systems in the workplace.
Activision In The Crosshairs: The Video Game Giant Is Accused Of Using Artificial Intelligence To Create Skins

2024-07-25
Bullfrag
Why's our monitor labelling this an incident or hazard?
The article involves AI systems (tools like Midjourney used for content generation) and discusses their use in production, which affects workers' job security and industry dynamics. However, it does not report any realized harm, such as layoffs directly caused by AI malfunction or misuse, nor any violation of rights or physical harm. The fears and anxieties expressed concern potential future impacts. The article also provides broader industry context on labor concerns, but its main focus is the plausible future harm from AI use in game content creation and its workforce impact. Hence, the classification is AI Hazard.