AI-Enabled Combat Drone Crashes During Test in California

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

A General Atomics YFQ-42A Dark Merlin, an AI-enabled semi-autonomous combat drone developed for the U.S. Air Force's Collaborative Combat Aircraft program, crashed shortly after takeoff during a test in California. No injuries occurred, but the incident halted flight testing and triggered an investigation into the AI system's malfunction.[AI generated]

Why's our monitor labelling this an incident or hazard?

The YFQ-42A is a semi-autonomous drone relying on AI for mission autonomy and flight operations. The crash during a test flight reflects a malfunction of the AI system or its integration, directly causing harm to property (the drone) and disrupting the USAF's critical military testing infrastructure. The event involves the use and malfunction of an AI system, with realized harm (crash damage and operational disruption). No injuries occurred, but harm to property and disruption of critical infrastructure are sufficient to classify this as an AI Incident under the OECD framework. Because harm has materialized, the event is not merely a hazard or complementary information, nor is it unrelated.[AI generated]
AI principles
Safety
Robustness & digital security

Industries
Government, security, and defence
Robots, sensors, and IT hardware

Affected stakeholders
Business
Government

Harm types
Economic/Property

Severity
AI incident

Business function:
Research and development

AI system task:
Goal-driven optimisation


Articles about this incident or hazard

U.S. F-47's "Buddy" YFQ-42A "Dark Merlin" Crashes During Test Flight; CCA Program Testing Paused

2026-04-08
Latest Asian, Middle-East, EurAsian, Indian News
Why's our monitor labelling this an incident or hazard?
The YFQ-42A is a semi-autonomous drone relying on AI for mission autonomy and flight operations. The crash during a test flight reflects a malfunction of the AI system or its integration, directly causing harm to property (the drone) and disrupting the USAF's critical military testing infrastructure. The event involves the use and malfunction of an AI system, with realized harm (crash damage and operational disruption). No injuries occurred, but harm to property and disruption of critical infrastructure are sufficient to classify this as an AI Incident under the OECD framework. Because harm has materialized, the event is not merely a hazard or complementary information, nor is it unrelated.
General Atomics CCA Drone Crashes After Takeoff, Company Pauses Flight Tests

2026-04-07
autoevolution
Why's our monitor labelling this an incident or hazard?
The drone is an AI system due to its autonomous capabilities and role as a collaborative combat aircraft. The crash reflects a malfunction during use, directly causing harm to property (the drone) and disruption of operations (flight tests paused). Even though no injuries occurred, the damage to the drone and the operational disruption meet the criteria for an AI Incident. The event is not merely a potential hazard or complementary information, as harm has already occurred.
Autonomous prototype fighter jet crashes in California desert

2026-04-07
Washington Times
Why's our monitor labelling this an incident or hazard?
The YFQ-42A is an autonomous fighter jet prototype employing AI and machine learning for autonomous operation. The crash during takeoff stems from a direct malfunction of this AI system. While no injuries occurred, the destruction of the prototype constitutes harm to property. The event involves the use and malfunction of an AI system, meeting the criteria for an AI Incident. The investigation and pause in testing further underscore the significance of the event. Hence, it is not merely a hazard or complementary information but an AI Incident.
General Atomics ASI Pauses YFQ-42A Flights After Mishap

2026-04-08
Aviation Week
Why's our monitor labelling this an incident or hazard?
The YFQ-42A is an autonomous uncrewed aircraft, which qualifies as an AI system due to its autonomous operation. The mishap following takeoff indicates a malfunction or failure during use. However, the article does not report any actual harm such as injury, property damage, or violation of rights. The investigation is ongoing, and operations are paused to prevent further risk. Given the plausible risk of harm from such a malfunction, but no realized harm yet, the event fits the definition of an AI Hazard rather than an AI Incident. It is not Complementary Information because the main focus is the mishap event itself, not a response or update to a prior incident. It is not Unrelated because the event involves an AI system and a safety-related incident.
BREAKING: US AI war drone crashes seconds after takeoff attempt

2026-04-07
News.az
Why's our monitor labelling this an incident or hazard?
The YFQ-42A Dark Merlin is a semi-autonomous combat aircraft with AI-enabled autonomous control capabilities. The crash during takeoff reflects a malfunction of the AI system controlling the aircraft, directly causing harm to property (the drone) and disrupting flight testing operations, which are critical infrastructure for military development. No injuries occurred, but the AI system's malfunction is central to the event. Therefore, it meets the criteria for an AI Incident.
GA-ASI Lays Out No Date for Resumption Of YFQ-42A Flights After Mishap

2026-04-07
Defense Daily
Why's our monitor labelling this an incident or hazard?
The YFQ-42A is a prototype drone likely equipped with AI for autonomous or semi-autonomous operation, thus qualifying as an AI system. The mishap during flight testing indicates a malfunction or failure in the AI system's use. There is no mention of actual injury, property damage, or other harms occurring, only that the company is investigating. Therefore, the event represents a plausible risk of harm due to AI system malfunction but no realized harm yet, fitting the definition of an AI Hazard.
General Atomics Drone Prototype Crashes in California

2026-04-07
Manufacturing.net
Why's our monitor labelling this an incident or hazard?
The drone is described as an autonomous uncrewed aircraft, which qualifies as an AI system. The crash reflects a malfunction during testing, with no injuries or other harm reported. Since no harm has occurred but the malfunction could plausibly lead to harm in other circumstances, this fits the definition of an AI Hazard. There is no indication of realized harm or violation of rights, so it is not an AI Incident. The article is not merely complementary information or unrelated, as it reports a specific event involving an AI system with potential safety implications.
Air Force Wants Nearly $1 Billion to Start Buying CCAs in 2027

2026-04-06
Air & Space Forces Magazine
Why's our monitor labelling this an incident or hazard?
The article explicitly describes the development and planned procurement of AI-enabled semi-autonomous combat drones (CCAs). While no harm or incident has occurred yet, the nature of these systems and their intended use in combat scenarios imply a plausible risk of future harm, such as injury or disruption. The article does not describe any actual harm or misuse, nor does it focus on responses or updates to prior incidents. Hence, it fits the definition of an AI Hazard, as the event plausibly could lead to an AI Incident in the future due to the deployment of autonomous weapon systems.