
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
Serbia is set to begin production of its domestically developed AI-enabled military drone, Pegaz, with technological assistance from China. The drones, which feature autopilot and advanced software, will be introduced into the Serbian armed forces, raising concerns about the future risks of AI-powered weaponry.[AI generated]
Why is our monitor labelling this an incident or hazard?
The event involves the development and imminent production of an AI-enabled military drone system. Although no harm has yet occurred, deploying armed drones with AI capabilities carries plausible risks of harm, including injury, disruption, or violations of rights, given their intended use in military operations. The event therefore qualifies as an AI Hazard: the system's development and intended use could plausibly lead to an AI Incident in the future. Because there is no indication that harm has already occurred, it is not an AI Incident. Nor is the article merely complementary or unrelated information, as it reports the production of an AI system with significant potential for harm.[AI generated]