Germany Procures AI-Enabled Kamikaze Drones for Bundeswehr

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Rheinmetall will supply the German Bundeswehr with AI-powered loitering munitions capable of autonomously identifying and attacking targets. The €300 million contract covers a large, undisclosed number of drones, with deliveries starting in 2027. The autonomous nature of these weapons poses significant risks of harm in future military operations.[AI generated]

Why's our monitor labelling this an incident or hazard?

The drones described are autonomous loitering munitions, which by definition involve AI systems capable of autonomous target engagement. The event concerns the development and planned use of these AI-enabled weapons, which could plausibly lead to injury or harm to persons and other serious consequences. Since the delivery and use are planned for the future and no harm has yet occurred, this qualifies as an AI Hazard rather than an AI Incident.[AI generated]
AI principles
Safety; Democracy & human autonomy

Industries
Government, security, and defence

Affected stakeholders
General public

Harm types
Physical (death); Physical (injury)

Severity
AI hazard

Business function
Other

AI system task
Recognition/object detection; Goal-driven organisation


Articles about this incident or hazard

Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
T-online.de
Why's our monitor labelling this an incident or hazard?
The drones described are autonomous loitering munitions, which by definition involve AI systems capable of autonomous target engagement. The event concerns the development and planned use of these AI-enabled weapons, which could plausibly lead to injury or harm to persons and other serious consequences. Since the delivery and use are planned for the future and no harm has yet occurred, this qualifies as an AI Hazard rather than an AI Incident.
100-Kilometer Range: Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
ZEIT ONLINE
Why's our monitor labelling this an incident or hazard?
The event involves AI systems (autonomous loitering munitions) whose development and intended use could plausibly lead to harm (lethal force, destruction of property, harm to communities). The drones autonomously select and engage targets, indicating AI involvement. No actual harm or incident is reported yet, as delivery and use are planned for the future. Therefore, this is an AI Hazard, reflecting the credible risk posed by deploying autonomous lethal AI systems. The article does not report any realized harm or incident, nor does it primarily focus on responses or governance measures, so it is not Complementary Information or an AI Incident.
Next Multi-Million Deal for Rheinmetall: German Arms Maker Supplies the Bundeswehr

2026-04-22
TAG24
Why's our monitor labelling this an incident or hazard?
The drones described are autonomous weapon systems, which by definition involve AI systems capable of independent decision-making to attack targets. The article reports a large procurement contract for these systems, indicating imminent deployment. Although no incident of harm is reported yet, the nature of autonomous lethal drones inherently carries a credible risk of injury, death, and violations of human rights. Hence, this is a clear AI Hazard rather than an AI Incident. It is not Complementary Information because the article focuses on the contract and system description, not on responses or updates to prior incidents. It is not Unrelated because the AI system and its potential harms are central to the event.
Bundeswehr Orders Kamikaze Drones for 300 Million Euros - Rheinmetall Delivers from Neuss

2026-04-22
Kölner Stadt-Anzeiger
Why's our monitor labelling this an incident or hazard?
The drones described are AI systems because they autonomously navigate, loiter, identify targets, and execute attacks without human intervention after launch. The article focuses on the procurement and future deployment of these systems, with no current harm reported but clear potential for lethal harm and violations of rights. The event is not a realized incident but a credible future risk, fitting the definition of an AI Hazard. It is not complementary information since the main focus is the contract and system capabilities, not a response or update to a prior incident. It is not unrelated because the AI system and its potential harms are central to the report.
Billion-Euro Contract: Rheinmetall Supplies Kamikaze Drones to the Bundeswehr

2026-04-22
Berliner Zeitung
Why's our monitor labelling this an incident or hazard?
The event involves the development and planned use of AI-enabled autonomous loitering munitions, which are weapon systems that can independently identify and attack targets. This clearly fits the definition of an AI system. The contract and production plans indicate imminent deployment, which plausibly could lead to harm such as injury or death (harm to persons) and damage to property. The article does not describe an incident where harm has already occurred, but the nature of the system and its intended use constitute a credible risk of future harm. Therefore, this event is best classified as an AI Hazard, as it plausibly could lead to an AI Incident involving injury or harm to persons and other harms associated with autonomous weapons.
Rheinmetall to Supply the Bundeswehr with Kamikaze Drones

2026-04-22
Business Insider
Why's our monitor labelling this an incident or hazard?
The article explicitly describes the development and planned delivery of kamikaze drones and unmanned surface vessels equipped with autonomous capabilities to the military. These systems qualify as AI systems due to their autonomous surveillance and attack functions. While no incident of harm is reported yet, the deployment of such weaponized AI systems inherently carries a plausible risk of causing injury, death, or other harms in military contexts. The event is about the contract and production plans, not about an actual harm event, so it is not an AI Incident. It is not complementary information since it is not an update or response to a prior incident but a new development with potential risk. Hence, it is best classified as an AI Hazard.
Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
wallstreet:online
Why's our monitor labelling this an incident or hazard?
The drones described are loitering munitions with autonomous capabilities to identify and attack targets, which reasonably implies the involvement of AI systems. The article does not report any actual harm or incidents caused by these drones yet, but their intended use as autonomous weapons carrying explosives clearly presents a plausible risk of injury, death, or property damage. The event concerns the development and planned deployment of AI-enabled lethal autonomous systems, which fits the definition of an AI Hazard. There is no indication of realized harm or incident at this stage, so it is not an AI Incident. It is not merely complementary information or unrelated news, as the focus is on the procurement and future use of AI-enabled autonomous weapons with significant risk potential.
Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
come-on.de
Why's our monitor labelling this an incident or hazard?
The drones described are AI systems because they autonomously loiter, identify, and attack targets, relying on AI software that requires continuous updates. The article focuses on the procurement and planned deployment of these AI-enabled autonomous weapons, which have a high potential for causing harm (injury or death) in military conflict. Since the drones are not yet deployed or have not caused harm as per the article, this is a plausible future risk rather than a realized incident. Hence, the event is best classified as an AI Hazard, reflecting the credible potential for harm from these AI systems.
100-Kilometer Range: Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
Schwarzwälder Bote
Why's our monitor labelling this an incident or hazard?
The drones described are AI systems because they autonomously fly, observe targets, and decide when to detonate or abort the mission. Their deployment as weapons with explosive payloads directly implicates them in potential harm to human life and property. The article explicitly states their intended use for combat and reconnaissance, with software continuously updated to maintain effectiveness against countermeasures, indicating active development of AI in a lethal context. Since delivery and use are planned for the future and no harm has yet occurred, this qualifies as an AI Hazard rather than an AI Incident.
Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
Volksstimme.de
Why's our monitor labelling this an incident or hazard?
The drones described are autonomous loitering munitions that use software to identify and engage targets without human intervention, fitting the definition of AI systems. The article focuses on the procurement and future deployment of these AI-enabled weapons, which could plausibly lead to harm (injury or death) in military operations. Since the article does not report any actual harm or incident but highlights the imminent delivery and use of these systems, it constitutes an AI Hazard rather than an AI Incident. The continuous software updates and innovation clauses underscore the evolving AI capabilities and associated risks. Therefore, this event is best classified as an AI Hazard due to the credible potential for harm inherent in autonomous lethal weapons systems.
Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
Börse Online
Why's our monitor labelling this an incident or hazard?
The article explicitly discusses AI-enabled kamikaze drones (loitering munitions) being developed and delivered to the military. These systems rely on AI software for autonomous target detection and engagement, which is continuously updated to counter adversary measures. While no actual harm or incident is reported, the deployment of such lethal autonomous weapons systems inherently carries a credible risk of causing injury, death, or violations of human rights. The event concerns the development and use of AI systems that could plausibly lead to significant harm, fitting the definition of an AI Hazard rather than an Incident or Complementary Information. It is not unrelated because AI systems are central to the drones' operation and the potential harms.
WDH: Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
Börse Online
Why's our monitor labelling this an incident or hazard?
The drones described are AI systems because they involve autonomous or semi-autonomous flight and targeting decisions, relying on continuously updated software. The article discusses their development and imminent deployment in large numbers, which could plausibly lead to harm (injury or death) in military operations. No actual harm or misuse is reported yet, so it is not an AI Incident. The article is not merely complementary information since it focuses on the delivery and capabilities of these AI-enabled weapons, highlighting the potential risks. Hence, the event is best classified as an AI Hazard.
WDH: Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
finanzen.ch
Why's our monitor labelling this an incident or hazard?
The drones described are autonomous loitering munitions that use software to identify and engage targets without human intervention, fitting the definition of AI systems. The article discusses the procurement and planned deployment of these systems by the military, with no current incident of harm reported. However, the nature of autonomous lethal weapons inherently carries a credible risk of injury, death, and violations of human rights, making this a plausible future harm scenario. Hence, it qualifies as an AI Hazard rather than an AI Incident or Complementary Information.
100-Kilometer Range: Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
Rhein-Zeitung
Why's our monitor labelling this an incident or hazard?
The event involves AI systems embedded in autonomous loitering munitions capable of independently selecting and attacking targets, which fits the definition of an AI system. The article discusses the Bundeswehr's procurement and planned deployment of these drones, emphasizing continuous software updates to maintain effectiveness. While no actual harm or incident is reported, the nature of these weapons and their autonomous capabilities plausibly could lead to injury, death, or other harms. Hence, this is not merely general AI news or complementary information but a credible AI Hazard due to the potential for significant harm inherent in autonomous weapon systems.
Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
Wetterauer-Zeitung.de
Why's our monitor labelling this an incident or hazard?
The drones described are AI systems because they autonomously loiter, identify, and attack targets, relying on continuously updated software. The event concerns the Bundeswehr acquiring these AI-enabled loitering munitions, which are lethal autonomous weapons. While the article does not report a specific incident of harm caused by these drones, the nature of the system and its intended use in warfare plausibly could lead to injury or death, qualifying as a credible future harm. Hence, this is an AI Hazard rather than an AI Incident. The article focuses on the procurement and development of these AI systems, not on a realized harm or incident. Therefore, the classification is AI Hazard.
WDH: Rheinmetall Supplies the Bundeswehr with Kamikaze Drones

2026-04-22
Boersen-Zeitung der WM Gruppe Herausgebergemeinschaft Wertpapier-Mitteilungen, Keppler, Lehmann GmbH & Co. KG (WM Gruppe)
Why's our monitor labelling this an incident or hazard?
The drones described are autonomous loitering munitions that use AI-based software to identify and attack targets without human intervention once launched. The article explicitly mentions the importance of continuously updating the software to maintain effectiveness, indicating AI system involvement in operational decision-making. The delivery and deployment of such autonomous weapons systems pose a credible risk of harm to human life, property, and communities, fulfilling the criteria for an AI Hazard. Since the article does not report any actual harm or incident resulting from their use but focuses on the procurement and future deployment, it does not qualify as an AI Incident. It is not merely complementary information because the main focus is on the acquisition and potential use of AI-enabled lethal systems with inherent risks. Therefore, the correct classification is AI Hazard.