AI-Altered Runway Photo Erases Model's Racial Identity


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Model Shereen Wu accused designer Michael Costello of sharing an AI-altered runway photo that replaced her Asian-American face with that of a white model. The incident, widely publicized on social media, sparked outrage over racial erasure and dehumanization caused by misuse of AI in digital image editing.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event describes the use of AI-assisted digital editing to alter a person's face in a photograph without consent, directly causing harm through dehumanization and a violation of rights. The alteration affects the model's identity and public representation, which falls under violations of human rights and dignity. AI involvement is reasonably inferred from the description of digital face alteration. This therefore qualifies as an AI Incident: the AI system's use in altering the image caused direct harm to the individual and the wider community and violated her rights.[AI generated]
AI principles
Fairness; Respect of human rights; Transparency & explainability; Accountability; Human wellbeing; Safety

Industries
Arts, entertainment, and recreation; Media, social platforms, and marketing

Affected stakeholders
Women

Harm types
Reputational; Psychological; Human or fundamental rights

Severity
AI incident

Business function
Marketing and advertisement

AI system task
Content generation


Articles about this incident or hazard


Model alleges her face was altered with AI to appear white, says it is 'highly dehumanizing'

2023-11-04
Economic Times

Model says her face was edited with AI to look white: 'It's very dehumanizing'

2023-11-04
The Guardian
Why's our monitor labelling this an incident or hazard?
An AI system was used to alter the model's face in a way that changed her racial appearance, a direct misuse of AI-generated content that harmed the model's identity, public representation, and dignity. By erasing her racial identity, the alteration constitutes a violation of rights and harm to the community. The harm is realized, not merely potential: the model has publicly described its negative impact, and the altered image was shared widely. This event therefore qualifies as an AI Incident.

Model Calls Out 'Project Runway' Alum For Replacing Her Face With White Model On Instagram

2023-11-01
Comic Sands
Why's our monitor labelling this an incident or hazard?
The designer used AI to alter a photo, replacing the Asian-American model's face with that of a white model, a direct misuse of AI technology that resulted in racial discrimination and harm to the model's career opportunities and dignity. The harm is realized and directly linked to the AI system's use in photo editing. This fits the definition of an AI Incident, as it involves a violation of rights and harm to a person caused by AI use.