ZeroEyes AI Gun Detection System Deployed to Prevent Gun Violence


The information displayed in the AI Incidents Monitor (AIM) should not be reported as representing the official views of the OECD or of its member countries.

ZeroEyes, an AI-based gun detection platform, is being deployed in Hobbs, New Mexico, and elsewhere across the US to analyze security camera feeds for visible firearms. When a gun is detected, alerts are sent within seconds to trained experts and first responders, with the aim of preventing shootings and reducing response times.[AI generated]

Why's our monitor labelling this an incident or hazard?

The article describes an AI system (ZeroEyes) that uses machine learning and computer vision to detect guns in public and private spaces. The system's deployment has enabled faster alerts and responses to potential active shooter events, which pose significant harms to human health and safety. Because the AI system plays a pivotal role in detecting guns before shots are fired, thereby preventing or mitigating injury or death, this event qualifies as an AI Incident under the framework: the AI system's use is directly linked to the protection of human life.[AI generated]
AI principles
Accountability; Fairness; Privacy & data governance; Respect of human rights; Robustness & digital security; Safety; Transparency & explainability; Democracy & human autonomy

Industries
Government, security, and defence; Digital security; IT infrastructure and hosting

Affected stakeholders
General public

Harm types
Physical (injury); Physical (death); Human or fundamental rights; Psychological; Reputational; Public interest

Severity
AI incident

Business function
Monitoring and quality control; ICT management and information security

AI system task
Recognition/object detection; Event/anomaly detection


Articles about this incident or hazard


ZeroEyes uses AI and security cameras to detect guns in public and private spaces

2023-07-31
VentureBeat

AI Threat Detection Services

2023-08-01
Trend Hunter
Why's our monitor labelling this an incident or hazard?
The article describes an AI system in active use for threat detection, which could plausibly prevent harm but does not report any incident or malfunction causing harm. There is no indication of realized harm or a near miss event. The content is primarily informational about the AI system's function and its potential benefits, without describing any AI Incident or AI Hazard. Therefore, it is best classified as Complementary Information, as it provides context and understanding of AI applications in security without reporting harm or credible risk of harm.

To These US Veterans, AI Is a Tool That Could Help Prevent Mass Shootings

2023-07-29
Post and Courier
Why's our monitor labelling this an incident or hazard?
The AI system is explicitly described as analyzing live video feeds to detect guns and alert authorities, which is a direct use of AI technology. The system's deployment and active identification of weapons indicate it is in operational use, contributing to preventing harm (injury or death) from mass shootings. Therefore, this event involves the use of an AI system that directly leads to harm prevention, fitting the definition of an AI Incident as it relates to injury or harm to people. The article does not describe a malfunction or potential future harm but an active, beneficial use of AI to reduce harm.

To These US Veterans, AI Is a Tool That Could Help Prevent Mass Shootings

2023-07-29
The Daily Courier
Why's our monitor labelling this an incident or hazard?
The AI system is explicitly described as analyzing live video feeds to detect guns and trigger alerts to law enforcement and clients, which directly relates to preventing injury or harm to people (harm category a). The system is in active use with hundreds of clients across thousands of buildings, including hospitals and military applications, and already identifies many weapons daily. This means the AI's use is directly linked to harm prevention, qualifying as an AI Incident rather than a hazard or complementary information. The article does not describe any malfunction or failure but focuses on the system's active role in harm prevention, which fits the definition of an AI Incident.

ZeroEyes uses AI and security cameras to detect guns in public and private spaces

2023-07-31
RocketNews
Why's our monitor labelling this an incident or hazard?
The article describes an AI system actively used to detect guns and prevent shootings, which is a positive safety application. There is no indication of any harm, malfunction, or misuse of the AI system leading to injury, rights violations, or other harms. The AI system's role is beneficial and preventive, and no potential hazards or incidents are reported. Therefore, the event is best classified as Complementary Information, providing context and updates on AI deployment and its societal impact without describing an AI Incident or AI Hazard.

Hobbs, New Mexico, deploys ZeroEyes' AI-based gun detection and intelligent situational awareness platform to deter and mitigate gun-related violence

2023-08-01
Police1
Why's our monitor labelling this an incident or hazard?
The AI system is explicitly involved as it analyzes video feeds to detect guns and triggers alerts that can directly influence law enforcement response to gun threats. This use of AI is intended to prevent or reduce harm to people by enabling faster intervention in potentially violent situations. Since the system is actively deployed and used to mitigate gun-related violence, it is directly linked to preventing injury or harm to persons, fitting the definition of an AI Incident. The event does not describe a hazard or potential future harm but an active deployment with a direct role in harm prevention.