Selfie Apps Illegally Share and Sell Users' Biometric Data


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Several popular AI-powered selfie and beauty apps have been found to collect, analyze, and sell users' sensitive biometric data—including facial features and skin tone—to third parties without proper consent. This widespread practice constitutes a violation of privacy and data protection rights, affecting millions of users globally.[AI generated]

Why is our monitor labelling this an incident or hazard?

The selfie apps use AI systems to process biometric data and images. The study shows that the use of these AI systems has led to unauthorized or non-transparent sharing of sensitive personal data with third parties, including large tech companies, violating users' rights and privacy. This harm is realized and ongoing: users' data is being sold or shared without proper informed consent, meeting the criteria for an AI Incident under violations of human rights or breaches of legal obligations protecting fundamental rights.[AI generated]
AI principles
Privacy & data governance; Respect of human rights; Transparency & explainability; Accountability

Industries
Consumer services; Media, social platforms, and marketing

Affected stakeholders
Consumers

Harm types
Human or fundamental rights

Severity
AI incident

Business function
Marketing and advertisement

AI system task
Recognition/object detection; Content generation


Articles about this incident or hazard


Test shows: many selfie apps pass user data on to third parties

2022-02-16
Focus

Selfie apps sell biometric data to third parties - oe3.ORF.at

2022-02-18
oe3.ORF.at
Why is our monitor labelling this an incident or hazard?
The selfie apps use AI systems for image processing and biometric data extraction. Their development and use involve processing sensitive biometric data, and the lack of informed consent together with unauthorized data sharing violates users' rights under applicable data protection laws. This is a breach of obligations intended to protect fundamental rights, specifically privacy and data protection, which qualifies as harm under the AI Incident definition. The event is therefore classified as an AI Incident due to the realized violation of rights caused by the AI systems' use and data-handling practices.

Beware of beauty apps: some market facial data

2022-02-18
Westdeutscher Rundfunk
Why is our monitor labelling this an incident or hazard?
The event involves AI systems (beauty apps using facial-analysis AI) that process biometric data, a category of sensitive personal data protected under privacy and data protection laws. The unauthorized sharing and selling of this data constitutes a violation of fundamental privacy rights. Although the article does not describe harm to a named individual, it documents an ongoing misuse affecting users at large: the AI systems' use has directly or indirectly led to rights violations through data misuse. The harm is realized in the form of privacy violations and unauthorized data commercialization, not merely a potential hazard or complementary information.