LVMH Sued for Unlawful Collection of Biometric Data via AI Try-On Tool


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

LVMH faces a lawsuit alleging its AI-powered virtual try-on tool for eyewear collects and stores users' facial scans without consent, violating state biometric privacy laws. The tool processes sensitive biometric data through facial recognition, raising legal and privacy concerns over unauthorized data collection and storage.[AI generated]

Why's our monitor labelling this an incident or hazard?

The virtual try-on tool uses AI-based facial scanning to overlay glasses on users' faces, which qualifies as an AI system. The lawsuit claims that LVMH collects and stores biometric data without informed consent, violating Illinois' biometric privacy law (BIPA). This is a direct harm related to legal rights and privacy protections. Since the AI system's use has directly led to this violation, the event meets the criteria for an AI Incident under the framework, specifically under violations of human rights or legal obligations protecting biometric data privacy.[AI generated]
AI principles
Privacy & data governance, Respect of human rights, Transparency & explainability, Accountability, Robustness & digital security

Industries
Consumer products, Digital security

Affected stakeholders
Consumers

Harm types
Human or fundamental rights, Reputational, Economic/Property

Severity
AI incident

Business function:
Marketing and advertisement, Sales

AI system task:
Recognition/object detection, Content generation


Articles about this incident or hazard


LVMH collects people's facial scans without asking when they use its virtual try-on tool for glasses, lawsuit claims

2022-04-09
Yahoo News
Why's our monitor labelling this an incident or hazard?
The virtual try-on tool uses AI-based facial scanning to overlay glasses on users' faces, which qualifies as an AI system. The lawsuit claims that LVMH collects and stores biometric data without informed consent, violating Illinois' biometric privacy law (BIPA). This is a direct harm related to legal rights and privacy protections. Since the AI system's use has directly led to this violation, the event meets the criteria for an AI Incident under the framework, specifically under violations of human rights or legal obligations protecting biometric data privacy.

LVMH collects people's facial scans without asking when they use its virtual try-on tool for glasses, lawsuit claims

2022-04-09
Business Insider
Why's our monitor labelling this an incident or hazard?
The virtual try-on tool uses AI to perform facial scans and overlay glasses images, which qualifies as an AI system. The lawsuit claims that LVMH collects and stores biometric data without informed consent, violating Illinois' biometric privacy law. This is a breach of legal obligations protecting fundamental rights, fulfilling the criteria for an AI Incident under violations of human rights or breach of applicable law. The harm is realized as the unauthorized collection and storage of sensitive biometric data without consent, not merely a potential risk.

LVMH Eyewear Virtual 'Try-on' Tool Draws Biometric Privacy Suit

2022-04-09
Bloomberg Business
Why's our monitor labelling this an incident or hazard?
The virtual try-on tool uses AI to process facial scans and generate live video overlays, indicating AI system involvement. The alleged unlawful collection and storage of biometric data without consent is a breach of the Illinois Biometric Information Privacy Act (BIPA), a legal framework protecting biometric privacy rights. This breach constitutes harm under category (c) (violations of human rights or breach of applicable law). Therefore, this event qualifies as an AI Incident due to the realized legal harm caused by the AI system's use.

LVMH Sued For Illegally Storing Customers' Biometric Data

2022-04-11
ZeroHedge
Why's our monitor labelling this an incident or hazard?
The virtual try-on tool uses AI-based facial recognition to collect detailed facial scans, qualifying as an AI system. The lawsuit claims that LVMH collects and stores this biometric data without informed consent, violating privacy laws designed to protect individuals' biometric information. This is a direct violation of legal rights and obligations, constituting harm under the framework's category (c), violations of human rights or breach of legal obligations protecting fundamental rights. Hence, this event qualifies as an AI Incident due to the realized harm caused by the AI system's use.

LVMH eyewear virtual ‘try-on’ tool draws biometric privacy suit

2022-04-11
The Star
Why's our monitor labelling this an incident or hazard?
The virtual try-on tool uses AI to process live video and create facial scans, which qualifies as an AI system. The lawsuit claims unlawful collection and storage of biometric data without consent, violating the Illinois Biometric Information Privacy Act (BIPA). This is a breach of legal obligations protecting fundamental rights, fulfilling the criteria for an AI Incident under violations of human rights or legal obligations. The harm is realized, as the unauthorized data collection has already occurred; it is not just a potential risk.

LVMH collects people's facial scans without asking when they use its virtual try-on tool for glasses, lawsuit claims

2022-04-09
Business Insider Nederland
Why's our monitor labelling this an incident or hazard?
The virtual try-on tool uses AI-based facial scanning technology to overlay glasses on users' faces, which qualifies as an AI system. The lawsuit alleges that LVMH collects and stores biometric data without user consent, violating state biometric privacy law, a breach of legal obligations protecting fundamental rights. This is a direct harm related to the AI system's use, meeting the criteria for an AI Incident. The event is not merely a potential risk or a complementary update but a claim of realized harm through unlawful data collection.