
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
The Ukrainian military has approved the use of SAKER SCOUT drones equipped with artificial intelligence. These drones autonomously identify and relay enemy equipment coordinates, including camouflaged targets, to command centers, supporting reconnaissance and attack missions. Their deployment in active conflict zones raises credible risks of harm due to autonomous targeting capabilities.[AI generated]
Why is our monitor labelling this an incident or hazard?
The event involves the deployment and use of an AI system (the SAKER SCOUT drone) that autonomously performs reconnaissance and targeting functions. Its use in military operations, including coordination with kamikaze drones, implies a direct role in potential harm scenarios such as armed conflict. Although no specific harm is reported to have occurred yet, the system's use in military targeting could plausibly lead to injury or harm to persons or groups, qualifying the event as an AI Hazard. Because there is no indication that harm has already occurred or that an incident has taken place, it is not classified as an AI Incident. The event is more than general AI news or a product launch: it concerns the operational deployment of AI in a military context with plausible future harm.[AI generated]