AeroVironment Expands AI-Enabled Switchblade Loitering Munitions Line

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

AeroVironment has unveiled new AI-enabled Switchblade loitering munitions, including the Switchblade 400, 600 Block 2, and 300 Block 20, featuring autonomous target recognition and engagement. These lethal autonomous weapon systems, designed for military use, carry significant risks of future harm because of their AI-driven capabilities.[AI generated]

Why's our monitor labelling this an incident or hazard?

The Switchblade systems incorporate AI for target detection and tracking and are used in military combat, which involves direct risks of injury or death. Although no specific incident of harm is reported, the expansion and increased production of these AI-enabled autonomous weapons could plausibly lead to significant harm. The article focuses on the development and deployment of these systems rather than on a realized harm event, fitting the definition of an AI Hazard. The presence of AI systems is explicit and the potential for harm is credible and significant, but no direct or indirect harm is described as having occurred in this article.[AI generated]
AI principles
Accountability; Safety; Human wellbeing; Respect of human rights; Democracy & human autonomy

Industries
Government, security, and defence; Robots, sensors, and IT hardware

Affected stakeholders
General public

Harm types
Physical (death); Physical (injury); Human or fundamental rights

Severity
AI hazard

AI system task
Recognition/object detection; Goal-driven organisation


Articles about this incident or hazard

AeroVironment expands Switchblade loitering munition family with 3 new variants By Investing.com

2025-10-13
Investing.com India
Why's our monitor labelling this an incident or hazard?
The Switchblade systems incorporate AI for target detection and tracking and are used in military combat, which involves direct risks of injury or death. Although no specific incident of harm is reported, the expansion and increased production of these AI-enabled autonomous weapons could plausibly lead to significant harm. The article focuses on the development and deployment of these systems rather than on a realized harm event, fitting the definition of an AI Hazard. The presence of AI systems is explicit and the potential for harm is credible and significant, but no direct or indirect harm is described as having occurred in this article.
AeroVironment eyes new factory, drone launches for Switchblade

2025-10-13
Yahoo
Why's our monitor labelling this an incident or hazard?
The Switchblade loitering munition is an AI system because it autonomously navigates and selects targets for lethal strikes. The article focuses on the expansion of production and new launch/control methods, including integration with other drones, which could plausibly lead to increased harm in conflict zones. Although no specific incident of harm is described, the development and proliferation of such autonomous weapon systems inherently carry credible risks of injury, death, and violations of human rights. Hence, this is an AI Hazard rather than an AI Incident or Complementary Information. It is not unrelated, because the AI system and its potential for harm are central to the article.
AV Unveils Next Generation of Switchblade Loitering Munitions

2025-10-13
wallstreet:online
Why's our monitor labelling this an incident or hazard?
The Switchblade loitering munitions incorporate AI systems for autonomous target recognition and decision-making in lethal military applications. The development and deployment of such AI-enabled autonomous weapons systems pose a credible risk of harm, including injury or death to persons and potential violations of human rights. Although the article does not describe a specific incident of harm occurring, the nature of these AI systems and their intended use as lethal autonomous weapons means they could plausibly lead to significant harm. Therefore, this event qualifies as an AI Hazard under the framework, as it highlights the development and introduction of AI-enabled autonomous weapons with high potential for misuse and harm.
AeroVironment Launches New Drone, Loitering Munitions | Aviation Week Network

2025-10-13
Aviation Week
Why's our monitor labelling this an incident or hazard?
The loitering munitions and drones described are autonomous or semi-autonomous systems that rely on AI for navigation, targeting, and operational functions. The development and offering for sale of such AI-enabled weapons systems with lethal capabilities constitute an AI Hazard because they could plausibly lead to injury, death, or other harms in military conflicts. Although no specific harm is reported as having occurred yet, the nature of these systems and their intended use imply credible risks of future harm, fitting the definition of an AI Hazard rather than an Incident or Complementary Information.
AUSA 2025: AV launches new Switchblade for LASSO

2025-10-13
Janes.com
Why's our monitor labelling this an incident or hazard?
The Switchblade 400 is an AI system, as it includes an NVIDIA-based processor supporting automatic and aided target recognition, indicating AI-driven autonomous functions. The event concerns the development and introduction of a new AI-enabled weapon system with lethal capabilities, which could plausibly lead to harm if used in conflict. However, no actual harm or incident is reported in the article, only the announcement and specifications of the system. This fits the definition of an AI Hazard: the system's development and intended use could plausibly lead to harm, but no harm has yet occurred or been reported.
AV Unveils Next Generation of Switchblade® Loitering Munitions

2025-10-13
Eagle-Tribune
Why's our monitor labelling this an incident or hazard?
The Switchblade loitering munitions are AI systems designed for autonomous or semi-autonomous lethal operations. The announcement of new models expands the availability and capabilities of such weapons, which inherently carry risks of causing harm in military contexts. Although no actual harm or incident is described, the development and proliferation of these AI-enabled weapons constitute an AI Hazard due to their potential to lead to injury, violations of human rights, and other significant harms in the future.
AV Adds To Switchblade, UAS Product Lines In Pursuit Of Army LASSO, MRR Programs - Defense Daily

2025-10-13
Defense Daily
Why's our monitor labelling this an incident or hazard?
The event involves the development and deployment of AI-enabled autonomous weapon systems (loitering munitions and drones), which are known to carry significant risks of harm. Although no specific harm or incident is reported, the nature of these AI systems and their potential use in military contexts could plausibly lead to AI Incidents involving injury, disruption, or rights violations. Therefore, this event qualifies as an AI Hazard due to the credible potential for future harm stemming from these AI systems.
AeroVironment Unveils Next Generation of Switchblade® Loitering Munitions

2025-10-13
Green Stock News
Why's our monitor labelling this an incident or hazard?
The event involves AI systems explicitly described as autonomous loitering munitions with AI/ML-enabled automatic target recognition and autonomous threat engagement capabilities. These systems are designed for lethal military use, which inherently involves risks of injury or death to persons and harm to communities. Although no specific incident of harm or malfunction is reported in the article, the development and expanded production of such AI-enabled autonomous weapons could plausibly lead to significant harms, including lethal outcomes and potential violations of human rights and international humanitarian law. The article's focus on new capabilities and increased production capacity indicates a credible risk of future harm rather than a realized incident. Hence, the event fits the definition of an AI Hazard rather than an AI Incident or Complementary Information.
Armored soldiers get first live-fire work on Switchblade 600

2025-10-14
Military Times
Why's our monitor labelling this an incident or hazard?
The Switchblade 600 is an AI system, as it involves autonomous drone flight and target engagement using onboard cameras under operator control, indicating AI involvement in decision-making. The article describes its use in live-fire exercises but does not mention any injury, malfunction, or violation of rights occurring during these tests, so no AI Incident has occurred. However, the system's nature as an autonomous weapon capable of lethal force means its deployment could plausibly lead to harm (injury or death) in future use. Hence, this event is best classified as an AI Hazard, reflecting the credible risk posed by the AI system's use in military operations.
Switchblade and MQ-9 Integration Underway

2025-10-14
RayHaber | RaillyNews
Why's our monitor labelling this an incident or hazard?
The Switchblade loitering munitions involve AI systems for target recognition and autonomous or semi-autonomous operation. The article focuses on the development, production expansion, and integration of these AI-enabled weapons, which could plausibly lead to significant harm in future conflicts. Although no specific incident of harm is described, the nature of these weapons and their increased deployment constitute an AI Hazard due to the credible risk of injury, violation of rights, and harm to communities. There is no indication of a realized harm event in the article, so it does not qualify as an AI Incident. The article is not merely complementary information since it highlights the expansion and integration of AI-enabled lethal systems with potential for harm.
New Lethal Technology Takes Center Stage: 1st Cavalry Division Tests the Switchblade 600 at Fort Hood

2025-10-15
LifeZette
Why's our monitor labelling this an incident or hazard?
The Switchblade 600 is an AI system, as it involves autonomous or semi-autonomous drone operation with onboard cameras and target engagement capabilities. Its use in live-fire exercises and deployment in combat scenarios directly relates to potential harm to persons and communities through lethal force. While the article does not describe an accident or malfunction causing unintended harm, the deployment and use of such AI-enabled lethal autonomous weapon systems inherently involve direct harm potential. Since the article only describes testing and training, with no actual harm or incident occurring, it does not meet the threshold for an AI Incident. The event is therefore best classified as an AI Hazard, given the plausible future harm from the use of this AI-enabled lethal weapon system.
AeroVironment to produce 14,400 Switchblades loitering munitions per year in the US

2025-10-15
Army Recognition
Why's our monitor labelling this an incident or hazard?
The Switchblade loitering munitions incorporate AI features such as automated target recognition and autonomous control, qualifying them as AI systems. The article focuses on the expansion of production capacity to thousands of units per year, indicating a significant increase in availability and potential use. Although no direct harm is reported in this article, the deployment of AI-enabled autonomous weapons inherently carries risks of injury, death, and broader harm to communities. The event thus fits the definition of an AI Hazard, as the development and mass production of these systems could plausibly lead to AI Incidents involving harm. There is no indication of an actual incident or harm having occurred in this report, so it is not an AI Incident. Nor is it merely Complementary Information, because the main focus is on the production scale and capabilities that imply future risk, not on responses or ecosystem context. Therefore, the classification is AI Hazard.
AUSA 2025: AeroVironment showcases new variant of Switchblade loitering munition family

2025-10-15
Shephard Media
Why's our monitor labelling this an incident or hazard?
The Switchblade 400 is an AI-enabled autonomous weapon system with capabilities for target recognition and autonomous operation. Its development and proliferation represent a credible AI hazard because such systems could plausibly lead to injury, loss of life, or other harms if used in conflict. Since the article does not report any actual harm or incident but focuses on the unveiling and features of the system, it fits the definition of an AI Hazard rather than an AI Incident. The AI system's involvement lies in its development and intended use, which could plausibly lead to harm.