AI Weapons Scanner Fails to Detect Knife, Leading to School Stabbing Incident


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Evolv Technology's AI-powered weapons scanner, sold to a New York school system for nearly $4 million, failed to detect a large knife, resulting in a student being stabbed. Investigations revealed the system frequently misses knives, raising concerns about its effectiveness and the safety risks of relying on such AI security solutions. [AI generated]

Why's our monitor labelling this an incident or hazard?

The Evolv weapons scanners use AI combined with sensor technology to detect concealed weapons. The article reports that these scanners failed to detect 42% of large knives in tests and specifically failed to detect knives used in two separate stabbing incidents in schools, resulting in physical injuries to students. This demonstrates direct harm caused by the AI system's failure to perform as intended. Therefore, this qualifies as an AI Incident due to injury to persons caused by the malfunction or inadequacy of an AI system in a critical safety application.[AI generated]
AI principles
Safety; Robustness & digital security; Accountability; Transparency & explainability; Human wellbeing; Respect of human rights

Industries
Education and training; Digital security; Robots, sensors, and IT hardware

Affected stakeholders
Children

Harm types
Physical (injury); Psychological; Reputational; Economic/Property; Public interest

Severity
AI incident

Business function
Monitoring and quality control

AI system task
Recognition/object detection; Event/anomaly detection


Articles about this incident or hazard


Weapons scanners used in schools fail to detect almost 50% of knives

2023-05-23
Daily Mail Online
Why's our monitor labelling this an incident or hazard?
The Evolv weapons scanners use AI combined with sensor technology to detect concealed weapons. The article reports that these scanners failed to detect 42% of large knives in tests and specifically failed to detect knives used in two separate stabbing incidents in schools, resulting in physical injuries to students. This demonstrates direct harm caused by the AI system's failure to perform as intended. Therefore, this qualifies as an AI Incident due to injury to persons caused by the malfunction or inadequacy of an AI system in a critical safety application.

A nearly $4 million AI-powered weapons scanner sold to a New York school system failed to detect knives

2023-05-23
Business Insider Nederland
Why's our monitor labelling this an incident or hazard?
The AI system (weapons scanner) was used in a real-world setting (a school) and failed to detect a dangerous weapon (knife), which directly resulted in a stabbing incident causing injury to a student. The AI system's malfunction (failure to detect the knife) is a direct contributing factor to the harm. Therefore, this qualifies as an AI Incident due to injury to a person caused by the AI system's failure.

A nearly $4 million AI-powered weapons scanner sold to a New York school system failed to detect knives

2023-05-23
Business Insider
Why's our monitor labelling this an incident or hazard?
The AI system (weapons scanner) was explicitly mentioned and is described as using AI algorithms to detect weapons. Its failure to detect a knife allowed a student to carry a weapon into the school and injure another student, causing harm to health. This is a direct link between the AI system's malfunction and realized harm, meeting the definition of an AI Incident.

AI scanner used in hundreds of US schools misses knives

2023-05-23
Yahoo Sports Canada
Why's our monitor labelling this an incident or hazard?
The Evolv Technology AI weapons scanner is explicitly described as using AI to detect weapons, including knives. The system failed to detect a large knife that was used in a stabbing attack on a student, causing serious injury. This is a direct harm to a person caused by the malfunction of the AI system. The article also notes that the system missed 42% of large knives in tests and failed to detect other knives brought into schools, indicating a systemic failure. The harm is realized and significant, involving injury to a person and failure to protect a critical infrastructure environment (schools). Therefore, this event meets the criteria for an AI Incident.

Evolv Technology's scanners come under fire after security lapses

2023-05-22
CBS News
Why's our monitor labelling this an incident or hazard?
The Evolv security scanners use AI technology to detect weapons, which is explicitly mentioned. The failure to detect a 7-inch knife that was then used in a stabbing attack constitutes a direct harm to a person caused by the AI system's malfunction or limitation. The article provides evidence of realized harm (injury from the stabbing) linked to the AI system's failure. Although the company acknowledges limitations, the harm has already occurred, making this an AI Incident rather than a hazard or complementary information.

Student Stabbed After AI Weapons Scanner Missed Knife

2023-05-23
Gizmodo
Why's our monitor labelling this an incident or hazard?
The AI weapons scanner, an AI system designed to detect weapons, failed to detect a 9-inch knife that was then used to stab a student, causing physical injury. Additionally, other knives were found undetected by the same system, indicating a malfunction or failure in its use. This directly led to harm to a person, which fits the definition of an AI Incident as the AI system's malfunction directly caused injury. The event involves the use and malfunction of an AI system leading to realized harm, not just potential harm or complementary information.

AI scanner used to detect weapons in schools 'fails to find knives'

2023-05-23
Mirror
Why's our monitor labelling this an incident or hazard?
The AI system (Evolv Technology's AI weapons scanner) was explicitly mentioned and is designed to detect weapons including knives. Its malfunction (failure to detect knives) directly led to a stabbing incident causing physical harm to a student, which is a clear harm to health. Additional failures to detect knives in other schools further confirm the system's unreliability and harm caused. This meets the definition of an AI Incident because the AI system's malfunction directly led to injury and harm to a person. The article does not merely warn of potential harm or discuss responses but reports realized harm due to the AI system's failure.

AI weapons scanner used in US schools fails to find knives, officials claim

2023-05-23
Daily Star
Why's our monitor labelling this an incident or hazard?
The Evolv AI weapons scanner is explicitly described as an AI system used for weapons detection. Its failure to detect knives, which it was purportedly designed to detect, directly led to a stabbing incident causing physical harm to a student. Additional knives were found undetected in other schools using the system, indicating a systemic failure. This constitutes an AI Incident because the AI system's malfunction has directly led to injury and harm to persons, fulfilling the criteria for harm to health (a).

AI scanner used in hundreds of US schools misses knives - MyJoyOnline.com

2023-05-23
MyJoyOnline.com
Why's our monitor labelling this an incident or hazard?
The Evolv Technology AI weapons scanner is explicitly described as using AI to detect weapons, including knives. The system's failure to detect a knife that was used in a stabbing incident constitutes a malfunction of the AI system. This malfunction directly led to physical harm to a student (multiple stab wounds) and ongoing risks as other knives were found undetected. Therefore, this event meets the criteria for an AI Incident due to direct harm to a person caused by the AI system's failure.

AI weapon scanner fails in school knife attack, raising security concerns

2023-05-23
The Thaiger
Why's our monitor labelling this an incident or hazard?
The Evolv Technology AI weapon scanner is an AI system designed to detect concealed weapons. Its failure to detect a large knife allowed a stabbing attack to occur, causing injury to a student. The system's inability to reliably detect knives, despite being marketed for weapons detection, directly contributed to harm. Additionally, the continued use of the system in other schools where knives were found only through staff reports highlights ongoing risks. Therefore, this event meets the criteria for an AI Incident due to the AI system's malfunction leading to injury and safety concerns.

Schools' AI Weapon Detection Systems Accused Of Flaws | Silicon

2023-05-23
Silicon UK
Why's our monitor labelling this an incident or hazard?
The AI system (Evolv's weapons detection system) is explicitly described as using AI to detect weapons. The system's failure to detect weapons, including large knives, has directly led to harm, including a fatal stabbing in a school. This constitutes injury or harm to a person caused by the malfunction or ineffectiveness of an AI system. The article also highlights systemic issues with overreliance on such AI systems and misleading marketing claims, reinforcing the classification as an AI Incident rather than a mere hazard or complementary information.

AI Scanner Supposed to Protect US Schools Can't Detect Knives In Almost Half the Cases - TechTheLead

2023-05-25
TechTheLead - Technology for tomorrow
Why's our monitor labelling this an incident or hazard?
The AI system (Evolv's AI scanner) is explicitly mentioned and is used for weapon detection in schools. Its failure to detect knives in nearly half the cases has directly contributed to a stabbing incident, causing injury to a student. This constitutes harm to a person resulting from the AI system's malfunction or inadequate performance. Therefore, this event qualifies as an AI Incident under the definition of harm to health caused directly or indirectly by the use or malfunction of an AI system.