Thailand's Biometric Data Collection of Myanmar Nationals Raises AI-Driven Privacy Concerns

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Thailand's government is collecting iris and facial biometrics from Myanmar nationals using AI-enabled systems to streamline healthcare services. More than 10,000 individuals have been affected, sparking concern among rights groups about potential privacy violations, human rights risks, and the misuse of, or security vulnerabilities in, sensitive biometric data.[AI generated]

Why's our monitor labelling this an incident or hazard?

The biometric data collection system relies on AI technologies for facial and iris recognition, qualifying it as an AI system. The event centres on the deployment and use of this system and the privacy and security concerns raised by activists and experts. Although no actual harm or violation has been documented, misuse or unauthorized sharing of the data could plausibly lead to human rights violations or privacy breaches, constituting an AI Hazard. This event is therefore best classified as an AI Hazard rather than an AI Incident or Complementary Information.[AI generated]
AI principles
Privacy & data governance; Respect of human rights; Robustness & digital security; Transparency & explainability; Accountability

Industries
Healthcare, drugs, and biotechnology; Government, security, and defence; Digital security

Affected stakeholders
General public

Harm types
Human or fundamental rights

Severity
AI hazard

Business function
Citizen/customer service

AI system task
Recognition/object detection


Articles about this incident or hazard

Thailand's biometric data collection stirs debate for Myanmar nationals

2024-05-08
Radio Free Asia
Why's our monitor labelling this an incident or hazard?
The biometric data collection system relies on AI technologies for facial and iris recognition, qualifying it as an AI system. The event centres on the deployment and use of this system and the privacy and security concerns raised by activists and experts. Although no actual harm or violation has been documented, misuse or unauthorized sharing of the data could plausibly lead to human rights violations or privacy breaches, constituting an AI Hazard. This event is therefore best classified as an AI Hazard rather than an AI Incident or Complementary Information.
Thailand's biometric scheme for Myanmar nationals

2024-05-09
The Thaiger
Why's our monitor labelling this an incident or hazard?
The biometric data collection program uses AI-enabled biometric recognition systems (iris and facial recognition) to process personal data. Its deployment has already affected more than 10,000 individuals, raising privacy and human rights concerns that constitute harm under the framework. Reports of malware targeting biometric data and of a biometric system glitch that caused chaos at a major airport further point to AI system malfunction and cybersecurity vulnerabilities leading to operational disruption. Together, these factors demonstrate direct and indirect harms from the development, use, and malfunction of AI systems, fitting the definition of an AI Incident.
Thai government to collect iris and face biometrics from Myanmar nationals

2024-05-08
Biometric Update
Why's our monitor labelling this an incident or hazard?
An AI system is involved, as biometric scanning and data linking typically rely on AI technologies. The event concerns the use of AI systems in data collection and management. Although no direct harm has been reported, the article highlights plausible risks of human rights violations, privacy breaches, and misuse of sensitive biometric data, which could lead to harm. This situation therefore fits the definition of an AI Hazard: the development and use of AI systems in this context could plausibly lead to human rights violations or privacy breaches in the future.