
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
LVMH faces a lawsuit alleging that its AI-powered virtual try-on tool for eyewear collects and stores users' facial scans without consent, in violation of state biometric privacy laws. The tool processes sensitive biometric data through facial recognition, raising legal and privacy concerns over unauthorized data collection and storage.[AI generated]
Why is our monitor labelling this an incident or hazard?
The virtual try-on tool uses AI-based facial scanning to overlay glasses on users' faces, which qualifies it as an AI system. The lawsuit alleges that LVMH collects and stores biometric data without informed consent, in violation of Illinois's Biometric Information Privacy Act (BIPA). This constitutes a direct harm to legal rights and privacy protections. Because the AI system's use is alleged to have directly led to this violation, the event meets the criteria for an AI Incident under the framework, specifically as a violation of legal obligations protecting biometric data privacy.[AI generated]