
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
Allen Control Systems, led by Steven Simoni, has developed the Bullfrog, an AI-powered autonomous machine gun designed to shoot down drones. The system has been tested with, and partially deployed by, the U.S. military, demonstrating the use of AI in lethal autonomous weapons and raising concerns about potential malfunctions and the risk of harm.[AI generated]
Why is our monitor labelling this an incident or hazard?
The Bullfrog is an AI-powered autonomous weapon system designed to shoot down drones, which pose battlefield threats. Its testing and deployment with the U.S. military indicate active use of AI in lethal autonomous systems. The article reports actual use cases in which drones were shot down, as well as instances of malfunction (gun jamming) that could lead to harm. Because the AI system enables autonomous lethal force, it is directly implicated in potential injury or death, fulfilling the criteria for an AI Incident. The article does not merely discuss potential future risks; it describes ongoing development, testing, and partial deployment, with real-world implications for harm in military operations.[AI generated]