Myanmar Junta Expands AI-Powered Surveillance Using Chinese Facial Recognition Technology

The information displayed in the AIM (AI Incidents Monitor) should not be reported as representing the official views of the OECD or of its member countries.

Myanmar's military junta is expanding the use of Chinese-built surveillance cameras with AI-driven facial recognition across multiple cities. These systems, sourced from companies like Dahua, Huawei, and Hikvision, enable the tracking and repression of activists and resistance groups, raising serious concerns about ongoing human rights violations and state-led population surveillance.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event involves the deployment and use of AI-powered facial recognition systems by Myanmar's junta for surveillance purposes. The AI system's use is directly linked to potential and ongoing human rights violations and harm to communities, as it enables tracking and repression of activists and dissenters. The article provides credible sources and expert opinions highlighting these harms. The AI system's role is pivotal in these harms, meeting the criteria for an AI Incident. Although some details are not independently verified, the described use and context strongly indicate realized harm rather than just potential risk, thus excluding classification as an AI Hazard or Complementary Information.[AI generated]
AI principles
Privacy & data governance; Respect of human rights; Transparency & explainability; Accountability; Democracy & human autonomy; Safety; Fairness; Robustness & digital security

Industries
Government, security, and defence; Digital security

Affected stakeholders
Civil society; General public

Harm types
Human or fundamental rights; Psychological; Public interest

Severity
AI incident

Business function:
Monitoring and quality control

AI system task:
Recognition/object detection


Articles about this incident or hazard

Exclusive-Myanmar's junta rolls out Chinese camera surveillance systems in more cities -sources

2022-07-11
Yahoo News
Exclusive-Myanmar's junta rolls out Chinese camera surveillance systems in more cities -sources

2022-07-10
Yahoo Sports Canada
Why's our monitor labelling this an incident or hazard?
The event involves AI systems explicitly: facial recognition technology integrated into surveillance cameras, which are AI systems capable of identifying individuals. The use of these AI systems by Myanmar's junta for surveillance and repression directly leads to violations of human rights and breaches of fundamental rights, fulfilling the criteria for an AI Incident. The harms are ongoing and realized, not merely potential, as the systems are actively installed and used to monitor and suppress dissent. Therefore, this event qualifies as an AI Incident.
Myanmar's junta rolls out Chinese camera surveillance systems in more cities -- sources | Inquirer News

2022-07-11
Inquirer.net
Why's our monitor labelling this an incident or hazard?
The event involves AI systems explicitly through the deployment of facial recognition technology, which is an AI system capable of identifying individuals in real time. The junta's use of these systems to surveil and potentially repress activists and resistance groups directly implicates violations of human rights and harm to communities. The article provides evidence of the AI system's use and the associated harms, meeting the criteria for an AI Incident. Although some details are not independently verified, the credible sources and human rights concerns justify classification as an AI Incident rather than a hazard or complementary information.
Myanmar extends Chinese camera surveillance systems in more cities

2022-07-11
Prothomalo
Why's our monitor labelling this an incident or hazard?
The event involves the use of AI systems (facial recognition technology) in surveillance cameras. The deployment by an authoritarian government with a history of repression suggests that the AI system's use is directly linked to potential or ongoing violations of human rights and harm to communities. Given the context of the junta's control and the expansion of these systems, the AI system's use plausibly leads to harm as defined under human rights violations and harm to communities. Therefore, this qualifies as an AI Incident due to the direct or indirect harm caused by the AI system's use in surveillance and repression.
Chinese-built surveillance systems are spreading across junta-ruled Myanmar

2022-07-11
South China Morning Post
Why's our monitor labelling this an incident or hazard?
The event involves AI systems explicitly: facial recognition software and AI-enabled CCTV cameras. The use of these AI systems by Myanmar's junta to surveil and track democracy activists and resistance groups directly leads to violations of human rights and harm to communities. The article provides evidence that these systems are installed and operational, with military and police using them to identify and intercept activists, which meets the criteria for an AI Incident. The harm is realized, not just potential, and the AI system's role is pivotal in enabling this surveillance and repression.
Myanmar's junta rolls out more Chinese surveillance cameras

2022-07-11
The Japan Times
Why's our monitor labelling this an incident or hazard?
The event involves the use of an AI system (facial recognition technology) by a government known for authoritarian control. The deployment of AI surveillance systems with facial recognition capabilities in public spaces can plausibly lead to violations of human rights, such as unlawful surveillance, suppression of political opposition, and breaches of privacy. Since the article describes the rollout of these systems but does not report actual incidents of harm yet, this constitutes an AI Hazard rather than an AI Incident.
Myanmar Expanding Use of Chinese-Made Facial Recognition Systems: Report

2022-07-12
The Diplomat Magazine
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of facial recognition AI systems (an AI system) by an authoritarian regime to surveil and suppress opposition, which constitutes a violation of human rights and harm to communities. The military junta's use of these systems to strengthen its hold on power and target resistance networks directly links the AI system's use to harm. Therefore, this qualifies as an AI Incident under the framework, as the AI system's use has directly led to harm through repression and surveillance.
Myanmar coup: Junta installs China-made surveillance cameras in more cities - EconoTimes

2022-07-12
EconoTimes
Why's our monitor labelling this an incident or hazard?
The surveillance cameras with facial recognition capabilities qualify as AI systems because they use AI to identify and track individuals. The military's use of these systems to monitor and suppress activists directly leads to violations of human rights, fulfilling the criteria for an AI Incident. The article explicitly states that these systems are used to track movements, identify safe houses, and intercept vehicles of activists, which constitutes direct harm to communities and individuals' rights. Therefore, this event is best classified as an AI Incident.
Burmese military expanding surveillance and facial recognition | Thaiger

2022-07-11
The Thaiger
Why's our monitor labelling this an incident or hazard?
The event involves the use of AI systems (facial recognition technology) in surveillance by the military junta. The deployment and expansion of this technology is directly linked to violations of human rights, including the suppression of dissent and monitoring of private movements, which constitute harm to communities and breaches of fundamental rights. Therefore, this qualifies as an AI Incident due to the realized harm caused by the AI system's use in an oppressive context.
Myanmar's junta rolls out Chinese camera surveillance systems in more cities

2022-07-11
The Financial Express
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the installation of facial recognition technology, which is an AI system, by Myanmar's junta government. The deployment is linked to surveillance and control in cities under military rule, which is known to involve human rights abuses. The AI system's use in this context directly leads to violations of human rights and harm to communities, fulfilling the criteria for an AI Incident. The involvement is in the use of AI systems for surveillance that likely causes harm, not just a potential hazard or complementary information.
Myanmar expands facial recognition-ready surveillance 'safe cities' | Biometric Update

2022-07-12
Biometric Update
Why's our monitor labelling this an incident or hazard?
The event involves AI systems (facial recognition technology) being deployed in surveillance projects. While no direct harm is reported, the deployment in a politically sensitive context with potential for misuse suggests a credible risk of future harm, such as violations of human rights. Therefore, this qualifies as an AI Hazard rather than an AI Incident. It is not Complementary Information because it is not an update or response to a prior incident, nor is it unrelated as it clearly involves AI systems with potential for harm.
Myanmar's Junta Is Pursuing Biometric Surveillance - and So Was Its Democratic Predecessor - FindBiometrics

2022-07-12
FindBiometrics
Why's our monitor labelling this an incident or hazard?
The article mentions the use of biometric surveillance systems sourced from companies known for AI-enabled facial recognition technology. The involvement of AI in biometric identification is reasonably inferred. The deployment by Myanmar's junta, known for oppressive actions, suggests a credible risk of human rights violations and harm to communities. Since the article does not report actual harm yet but indicates a credible potential for harm, this fits the definition of an AI Hazard rather than an AI Incident. There is no indication that harm has already occurred or that this is merely complementary information or unrelated news.
Chinese-built surveillance systems spread across junta-ruled Burma, including hundreds of Huawei cameras: sources

2022-07-11
IntellAsia
Why's our monitor labelling this an incident or hazard?
The event involves the use of AI systems (facial recognition technology) developed and deployed by the Myanmar junta for surveillance purposes. The AI's role in identifying and tracking activists and resistance groups, who are targeted by the junta, directly links the AI system's use to violations of human rights and potential repression. The article provides credible sources and expert statements highlighting the risks and harms caused by these AI-enabled surveillance systems. Therefore, this qualifies as an AI Incident due to the realized harm to human rights and communities through the AI system's deployment and use.
Myanmar: Population surveillance using China cameras - Update Kenya

2022-07-12
Update Realty
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of facial recognition technology, an AI system, for population surveillance by the Myanmar military junta. The AI system's use is directly linked to harm, as it enables tracking and repression of activists and resistance groups, violating their rights and freedoms. The harm is ongoing and planned to expand, fulfilling the criteria for an AI Incident involving violations of human rights and harm to communities. The involvement of AI in surveillance and facial recognition is clear, and the harm is realized, not just potential.
Myanmar's junta rolls out Chinese camera surveillance systems in more cities

2022-07-11
kathmandupost.com
Why's our monitor labelling this an incident or hazard?
The event involves the deployment and use of AI systems (facial recognition technology) by Myanmar's junta for surveillance purposes. The AI system's use is directly linked to human rights violations, as it enables tracking and repression of activists and resistance groups. The article provides credible information about the AI system's role in facilitating these harms, meeting the criteria for an AI Incident. The harm is ongoing and systemic, not merely potential, and the AI system's role is pivotal in enabling the junta's surveillance and control measures.
Military Junta Monitors Myanmar Civilians via CCTV

2022-07-14
Liputan 6
Why's our monitor labelling this an incident or hazard?
The event involves the use of AI systems (facial recognition technology) developed and deployed by the Myanmar military junta for mass surveillance. The AI system's use directly leads to violations of human rights by enabling the military to monitor, track, and suppress activists and opposition groups. The harm is realized and ongoing, as the surveillance is actively used to repress dissent. This fits the definition of an AI Incident because the AI system's use has directly led to violations of fundamental rights under applicable law, fulfilling criterion (c) in the AI Incident definition.
Junta Installs Facial Recognition Cameras on Every Corner of Myanmar's Cities | Republika Online

2022-07-11
Republika Online
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of AI-based facial recognition technology integrated into CCTV cameras installed by the Myanmar junta. The system is used to identify and track individuals, particularly activists and resistance members, which constitutes a violation of human rights. The deployment and use of this AI system have directly led to harms such as repression and threats to activists, fulfilling the criteria for an AI Incident under violations of human rights. The involvement is in the use of the AI system for surveillance and control, causing direct harm.
Myanmar Junta Installs Chinese-Made Facial Recognition CCTV

2022-07-11
TEMPO.CO
Why's our monitor labelling this an incident or hazard?
The event involves the use of AI facial recognition systems by the Myanmar junta for surveillance, which is known to be associated with human rights abuses. While no specific harm is reported yet, the deployment of such technology in this context plausibly leads to violations of rights and harm to communities. Hence, it fits the definition of an AI Hazard rather than an AI Incident or Complementary Information.
Myanmar Military Junta Tightens Surveillance of Civilians with CCTV Technology

2022-07-11
Tempo Media
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of AI-based facial recognition technology integrated into CCTV systems for surveillance by the Myanmar military junta. The deployment is actively used to track and suppress activists and opposition groups, which constitutes a violation of human rights and fundamental freedoms. The AI system's use is directly linked to harm (human rights violations), fulfilling the criteria for an AI Incident. The involvement is in the use of the AI system for oppressive surveillance, causing realized harm rather than just potential harm or background context.
Tightening Surveillance, Myanmar Junta Installs Advanced Chinese CCTV in More Cities

2022-07-11
VOA Indonesia
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of AI-based facial recognition technology as part of the CCTV surveillance system deployed by Myanmar's junta. The system's use to monitor, identify, and potentially suppress activists and opposition groups directly implicates violations of human rights and fundamental freedoms. The AI system's role is pivotal in enabling this surveillance and repression. Hence, this qualifies as an AI Incident due to realized harm to human rights caused by the AI system's use.
Junta Ready to Deploy Made-in-China Surveillance Cameras | Republika ID

2022-07-12
republika.id
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of AI-powered facial recognition cameras by the Myanmar junta to monitor and track activists and opposition groups. This use of AI surveillance technology is linked to ongoing repression, including killings and human rights abuses by the junta. The AI system's deployment is a direct factor enabling these harms, fulfilling the criteria for an AI Incident due to violations of human rights and harm to communities. The involvement of AI in the surveillance system and its use for oppressive purposes is clear and directly connected to realized harm.