Indian Police Use AI Facial Recognition to Reunite Missing Children


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Police in the Indian state of Telangana used a facial recognition AI app as part of Operation Smile to identify thousands of missing and trafficked children and reunite them with their families, significantly mitigating harm from child trafficking and disappearance. The initiative has been praised as a major advance in addressing these issues.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event involves an AI system (a facial recognition app), explicitly mentioned in the coverage, used by police to identify missing children and reunite them with their families, directly addressing harm related to child trafficking and disappearance (human rights and welfare). The AI system's use has produced realized outcomes in the form of harm mitigation. Concerns about data privacy are noted but do not outweigh the system's direct role in addressing a significant harm. This is therefore classified as an AI Incident because the AI system's use directly addressed a serious harm involving the human rights and welfare of children.[AI generated]
Industries
Government, security, and defence

Severity
AI incident

AI system task
Recognition/object detection


Articles about this incident or hazard


Indian police use facial recognition app to reunite families with...

2020-02-14
U.S.

Indian police use facial recognition app to reunite families with...

2020-02-14
U.K.
Why's our monitor labelling this an incident or hazard?
The facial recognition app is an AI system explicitly mentioned as being used by police to scan and match children's photographs in order to reunite them with their families. This use directly addresses harm related to child trafficking and missing children, which involves violations of fundamental rights and harm to vulnerable groups. The system's deployment has led to the reunification of thousands of children, indicating realized harm mitigation rather than merely potential harm. Although concerns about data privacy and rights are noted, the article focuses on the system's positive impact in reducing harm. This event therefore qualifies as an AI Incident due to the AI system's direct role in addressing and mitigating harm to children and their rights.

Indian Police Use Facial Recognition App To Reunite Families

2020-02-14
Independent Newspapers Nigeria
Why's our monitor labelling this an incident or hazard?
The facial recognition app is an AI system explicitly mentioned and used by police to identify missing children and reunite them with families, directly addressing harm related to child trafficking and disappearance. The AI system's use has led to a positive outcome by reducing harm to children and communities. While privacy concerns are noted, no direct harm from the AI system's misuse or malfunction is reported. The event involves the use of AI leading to a significant social impact related to human rights and welfare, fitting the definition of an AI Incident due to the AI system's pivotal role in harm mitigation.

Indian police use facial recognition app to reunite families with lost children

2020-02-14
DhakaTribune
Why's our monitor labelling this an incident or hazard?
The facial recognition app is an AI system explicitly mentioned and used by police to identify missing children and reunite them with their families, directly addressing harm related to child trafficking and missing children (harm to persons). The AI system's use has produced realized positive outcomes, a form of harm mitigation rather than harm creation; because its use is directly linked to addressing a serious harm, it qualifies as an AI Incident. The article also mentions concerns about data privacy, but these are cautions rather than realized harms. The event is therefore best classified as an AI Incident due to the AI system's direct involvement in a significant social-harm context.

REUTERS - Indian police use facial recognition app to reunite families with lost children

2020-02-14
nampa.org
Why's our monitor labelling this an incident or hazard?
Facial recognition apps are AI systems that analyze images to identify individuals. The article states that the police have used this app to successfully reunite thousands of missing and trafficked children with their families, which directly addresses harm to persons by mitigating the effects of trafficking and disappearance. Therefore, this event involves the use of an AI system that has directly led to harm mitigation, qualifying it as an AI Incident under the definition of harm to persons.