ByteDance Sued for Unlawful Biometric Data Collection by CapCut AI App


The information displayed in the AI Incidents Monitor (AIM) should not be reported as representing the official views of the OECD or of its member countries.

ByteDance faces multiple class-action lawsuits alleging its AI-powered CapCut video editing app illegally collected users' biometric data, including face and voice prints, without proper consent or disclosure. The lawsuits claim these privacy violations breach federal and state laws, including Illinois' Biometric Information Privacy Act, and were used for targeted advertising.[AI generated]

Why's our monitor labelling this an incident or hazard?

The CapCut app is an AI system used for video creation and editing, which involves processing personal data. The lawsuit alleges that ByteDance's use of this AI system has directly led to violations of privacy rights and laws, including biometric data privacy. This constitutes a violation of human rights and legal obligations, fitting the definition of an AI Incident because the AI system's use has directly led to harm through unlawful data collection and privacy breaches.[AI generated]
AI principles
Privacy & data governance
Transparency & explainability
Respect of human rights
Accountability

Industries
Media, social platforms, and marketing
Consumer services

Affected stakeholders
Consumers

Harm types
Human or fundamental rights
Reputational

Severity
AI incident

Business function
Marketing and advertisement

AI system task
Recognition/object detection
Content generation
Organisation/recommenders


Articles about this incident or hazard


TikTok's owner broke fed, state privacy laws with consumer video tool, lawsuit alleges | Biometric Update

2023-08-02
Biometric Update

ByteDance sued for allegedly collecting biometric data without consent | Engadget

2023-08-02
Engadget
Why's our monitor labelling this an incident or hazard?
The CapCut app uses AI systems for biometric data collection (face scans, voiceprints) and data processing for targeted ads. The lawsuit alleges that this AI-driven data collection occurred without informed consent, violating privacy laws and potentially exposing users to unauthorized surveillance. This constitutes a violation of human rights and legal obligations protecting biometric privacy, thus meeting the criteria for an AI Incident. The harm is realized through unlawful data collection and privacy breaches, not merely potential or future harm.

ByteDance's CapCut Allegedly Collected Biometrics, Now Faces Class-Action Lawsuit from Illinois

2023-08-03
Tech Times
Why's our monitor labelling this an incident or hazard?
CapCut is an AI-enabled video-editing app that collects biometric data (face scans, voiceprints) and other personal information for ad targeting. The alleged unauthorized collection and use of biometric data without meaningful consent constitutes a violation of privacy and legal rights, which falls under harm category (c) - violations of human rights or breach of legal obligations. The AI system's use in processing biometric data and targeting ads directly leads to this harm. Therefore, this event qualifies as an AI Incident.

ByteDance Faces BIPA Lawsuit Over Video Editing App

2023-08-02
FindBiometrics
Why's our monitor labelling this an incident or hazard?
CapCut uses AI to process biometric data (face and voice biometrics), which is an AI system. The lawsuit alleges that this data collection occurred without proper consent and disclosures, violating BIPA, a legal framework protecting biometric privacy rights. This constitutes a violation of human rights and legal obligations (harm category c). The involvement of AI in biometric data processing and the resulting privacy violations directly link the AI system's use to harm. Hence, this is classified as an AI Incident.