Australia Invests in AI-Enabled Ghost Bat Combat Drones

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

The Australian government has allocated an additional A$399 million to develop the MQ-28A Ghost Bat, an AI-enabled autonomous combat drone. The funding will support the production of three advanced Block 2 drones with enhanced autonomy, sensor, and combat capabilities, raising concerns about the future risks associated with armed AI military systems.[AI generated]

Why's our monitor labelling this an incident or hazard?

The Ghost Bat drone is an AI system as it is designed for autonomous or semi-autonomous operation in military missions, acting as a "loyal wingman" alongside crewed aircraft. The article focuses on funding and development progress without mentioning any actual harm or incident caused by the drone. However, the nature of the system—a military autonomous drone—implies a credible risk of future harm, such as injury or disruption, if deployed. Since no harm has yet occurred but plausible future harm exists, the event is best classified as an AI Hazard rather than an AI Incident or Complementary Information.[AI generated]
AI principles
Safety; Robustness & digital security; Accountability; Transparency & explainability; Respect of human rights; Democracy & human autonomy; Privacy & data governance

Industries
Government, security, and defence; Robots, sensors, and IT hardware; Mobility and autonomous vehicles; Digital security

Affected stakeholders
General public

Harm types
Physical (death); Physical (injury); Human or fundamental rights; Public interest; Psychological

Severity
AI hazard

Business function:
Research and development; Manufacturing; Other

AI system task:
Recognition/object detection; Goal-driven organisation; Reasoning with knowledge structures/planning


Articles about this incident or hazard

Loyal defence 'wingman' drones getting an extra $399m

2024-02-08
Daily Mail Online
Why's our monitor labelling this an incident or hazard?
The Ghost Bat drone is an AI system as it is designed for autonomous or semi-autonomous operation in military missions, acting as a "loyal wingman" alongside crewed aircraft. The article focuses on funding and development progress without mentioning any actual harm or incident caused by the drone. However, the nature of the system—a military autonomous drone—implies a credible risk of future harm, such as injury or disruption, if deployed. Since no harm has yet occurred but plausible future harm exists, the event is best classified as an AI Hazard rather than an AI Incident or Complementary Information.
Australia Invests $400 Million Into Next-Generation Military Combat Aircraft

2024-02-11
The Epoch Times
Why's our monitor labelling this an incident or hazard?
The event involves the development and use of AI systems in autonomous military drones capable of combat roles, which inherently carry risks of harm including injury to persons, disruption, and violations of rights if misused or malfunctioning. Although no harm has yet occurred, the nature of the AI system's intended use in military combat roles plausibly leads to potential harms such as injury, loss of life, or escalation of conflict. Therefore, this event qualifies as an AI Hazard due to the credible risk of future harm stemming from the deployment of AI-enabled autonomous combat drones.
Australia to Invest a Further $260m on Next-Generation Military Drones

2024-02-09
U.S. News & World Report
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is a military drone likely equipped with AI systems for autonomous operations, which fits the definition of an AI system. The article focuses on the investment and manufacturing of these drones without mentioning any actual harm or incident. Given the nature of AI-enabled autonomous military drones, their development and deployment could plausibly lead to harms such as injury or violations of rights in future conflicts. Therefore, this event qualifies as an AI Hazard due to the credible potential for future harm, but not an AI Incident as no harm has yet occurred.
Aus working to create killer drone fleet

2024-02-08
News.com.au
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions AI-controlled killer drones being developed and funded by the Australian government. These drones qualify as AI systems due to their autonomous combat capabilities. The event concerns the development and use of such AI systems with lethal potential, which could plausibly lead to harms such as injury or death and violations of rights. Since the harm is potential and not yet realized, this fits the definition of an AI Hazard rather than an AI Incident.
Australia to allocate $260 million for next-generation military drones

2024-02-09
Republic World
Why's our monitor labelling this an incident or hazard?
The article describes the development and production of advanced military drones with combat capabilities, which almost certainly involve AI systems for autonomous functions. While no harm or incident is reported, the potential for these AI-enabled drones to cause injury, disruption, or other harms in the future is credible. Therefore, this event constitutes an AI Hazard due to the plausible future harm from the deployment and use of these AI-powered military drones. There is no indication of an actual incident or complementary information about responses or mitigation, so AI Hazard is the appropriate classification.
Australia to invest a further $260m on next-generation military drones

2024-02-09
ThePrint
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is a military drone designed for combat roles, which almost certainly involves AI systems for autonomous operation and decision-making. The article describes the development and manufacturing investment but does not report any actual harm or incident. However, the nature of AI-enabled autonomous military drones inherently carries plausible risks of harm, including injury or disruption, making this an AI Hazard rather than an Incident or Complementary Information. It is not unrelated because the AI system involvement is reasonably inferred from the drone's autonomous combat functions.
Australia Boosts Combat Drone Program Amid Defense Tensions

2024-02-09
BNN
Why's our monitor labelling this an incident or hazard?
The event involves the development and use of AI systems in the form of combat drones, which are inherently capable of autonomous decision-making in military operations. Although no direct harm or incident is reported, the investment and deployment of these drones in a tense geopolitical environment plausibly could lead to harms such as injury, escalation of conflict, or other military-related damages. Therefore, this event fits the definition of an AI Hazard, as it could plausibly lead to an AI Incident in the future. There is no indication of realized harm or incident yet, so it is not an AI Incident. It is not merely complementary information because the focus is on the development and investment in potentially harmful AI systems rather than responses or updates to past incidents.
Australia working to create killer drone fleet

2024-02-08
The Courier Mail
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions AI-controlled killer drones being developed and funded by the Australian government. These drones are AI systems with lethal capabilities, which inherently carry risks of causing injury or death and other harms. While no incident of harm is reported yet, the development and planned deployment of such systems constitute a credible risk of future harm. This fits the definition of an AI Hazard, as the event involves the development and use of AI systems that could plausibly lead to an AI Incident involving injury, violation of rights, or other significant harms. There is no indication that harm has already occurred, so it is not an AI Incident. The article is not merely complementary information or unrelated news, as it focuses on the potential risks of these AI systems.
'We want to build our defence industry': Hastie welcomes govt's defence announcement

2024-02-08
Sky News Australia
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is an AI-enabled unmanned combat aircraft, so its development involves AI systems. However, the article only reports on the government's investment and strategic intentions without mentioning any realized harm, malfunction, or misuse. There is no indication of direct or indirect harm caused by the AI system, nor a credible risk of harm described in the article. Therefore, this event does not qualify as an AI Incident or AI Hazard. It is best classified as Complementary Information because it provides context on AI development in defense and government policy but does not report any harm or plausible harm.
Australia to invest 260M dollars in next-gen military drones

2024-02-09
Agencia Informativa Latinoamericana Prensa Latina
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is an AI-enabled military drone designed for combat roles, implying autonomous or AI-assisted functions. The article focuses on investment and manufacturing without reporting any actual harm or incidents. However, the development and deployment of such AI military systems inherently carry plausible risks of harm, including injury, disruption, or violations of human rights, especially in combat scenarios. Since no harm has yet occurred but the potential is credible and foreseeable, this event fits the definition of an AI Hazard rather than an AI Incident or Complementary Information.
Plans to build 'Ghost Bat' killer drone fleet armed to protect fighter jets

2024-02-10
The Scottish Sun
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is an AI system as it is an unmanned aerial vehicle designed for autonomous or semi-autonomous combat operations, including decision-making in dynamic environments. The article does not report any realized harm or incidents caused by these drones yet, but the nature of armed autonomous drones inherently carries plausible risks of injury, death, and broader harm in military conflict. The development and planned deployment of such lethal AI-enabled systems represent a credible future risk (AI Hazard) rather than an incident. The article also references similar drone warfare developments in Ukraine, underscoring the potential for harm. Hence, the classification as AI Hazard is appropriate.
More funding to fuel Ghost Bat

2024-02-12
GlobalSecurity.org
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is an AI system as it involves autonomous systems and integrated combat capabilities that rely on AI for operation. The article focuses on funding and development progress without reporting any realized harm or incidents. However, autonomous combat drones inherently carry plausible risks of causing injury, disruption, or other harms if deployed or malfunctioning. Since no harm has yet occurred but the system's development could plausibly lead to AI-related harm, this event fits the definition of an AI Hazard rather than an AI Incident or Complementary Information. It is not unrelated because the AI system and its potential impacts are central to the report.
'Very significant': Ghost Bat will allow military to 'undertake more dangerous missions'

2024-02-08
Gold Coast Bulletin
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is an AI system due to its autonomous operation and cooperative combat functions involving AI-enabled decision-making. The article focuses on the expected announcement and strategic implications of deploying this AI system, without describing any realized harm or incidents. While the system's use could plausibly lead to harm in the future given its military application, the article does not describe any current or past harm or malfunction. Therefore, this event is best classified as an AI Hazard, reflecting the plausible future risks associated with deploying autonomous combat AI systems.
Australia pours $400m into Loyal Wingman drone programme

2024-02-09
Air Force Technology
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is described as an autonomous combat aircraft with AI-enabled capabilities. Although no harm or incident is reported, the nature of the system—an autonomous military drone with combat roles—carries credible risks of causing injury, disruption, or other harms if deployed or malfunctioning. The article focuses on funding and development milestones, not on any realized harm or incident. Hence, it fits the definition of an AI Hazard, as the development and deployment of such AI systems could plausibly lead to AI Incidents in the future.
Albanese Govt Pumps $400M into Next-Gen Loyal Wingman Drone

2024-02-08
Mirage News
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is described as an autonomous collaborative combat aircraft with enhanced capabilities and integrated autonomous systems, indicating AI system involvement. There is no mention of any realized harm or incident caused by the drone so far. However, given its military combat role and autonomous nature, the development and deployment of such a system plausibly could lead to harms including injury, disruption, or violations of rights. The article focuses on funding and development progress without reporting any incident or harm, so it does not qualify as an AI Incident. It is not merely complementary information since the focus is on the development of a potentially hazardous AI system rather than responses or ecosystem context. Hence, the classification as AI Hazard is appropriate.
Ghost Bat Receives Increased Defence Department Funding

2024-02-12
Mirage News
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is an AI-enabled autonomous uncrewed combat aircraft system under development, involving AI for autonomous operations and integrated combat systems. Although the article does not report any actual harm or incidents, the nature of the system and its military application imply a credible risk of future harm, including injury or disruption, if deployed. Therefore, this event qualifies as an AI Hazard due to the plausible future harm from the development and deployment of autonomous combat drones.
Australia Advances MQ-28A Ghost Bat Program, Injects Additional $260M

2024-02-09
The Defense Post
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is an autonomous drone designed for reconnaissance, electronic warfare, and armed combat roles, implying the use of AI systems for autonomous navigation, decision-making, and mission execution. The funding and development of such armed autonomous drones pose plausible risks of harm, including injury or harm to persons in conflict scenarios, disruption of critical infrastructure, or other significant harms due to their military use. Although no specific harm has been reported yet, the nature of the system and its intended use as an armed autonomous drone make it a credible AI Hazard, as it could plausibly lead to AI Incidents involving harm in the future.
Next steps for the Ghost Bat

2024-02-12
The Strategist
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is an AI system as it is an autonomous UAV designed for combat roles involving complex decision-making and teaming with crewed aircraft. The article centers on the development, investment, and strategic implications of this AI system but does not report any actual harm or incident resulting from its use or malfunction. It discusses potential future operational concepts and the importance of scaling production to reduce risk, which implies plausible future risks but does not document any current harm or incident. Therefore, this event is best classified as Complementary Information, providing context and updates on AI system development and strategic planning without reporting an AI Incident or AI Hazard.
Release the Ghost Bats: Australia tries to sell AUKUS autonomous air superiority

2024-02-10
The Mandarin
Why's our monitor labelling this an incident or hazard?
The MQ-28 Ghost Bat is an AI system as it is an autonomous combat drone capable of making decisions and coordinating with other aircraft. The article discusses its development and potential deployment, which could plausibly lead to harm such as injury or death in combat, disruption of military operations, or escalation of conflict. No actual harm or incident is reported, so it does not qualify as an AI Incident. However, the development and planned sale of such autonomous lethal AI systems constitute an AI Hazard due to the credible risk of future harm inherent in their use. The article focuses on the technology's capabilities and strategic implications rather than reporting any realized harm or incident, so it is not Complementary Information or Unrelated.
Defence 'wingman' drones get extra $399 million

2024-02-08
Canberra CityNews
Why's our monitor labelling this an incident or hazard?
The article explicitly discusses the development and funding of an AI-enabled autonomous military drone system (the Ghost Bat) designed to operate as a "loyal wingman" alongside crewed aircraft. This clearly involves an AI system. There is no indication that any harm has yet occurred, so it is not an AI Incident. However, given the nature of autonomous military drones and their potential to cause injury, property damage, or other harms if misused or malfunctioning, the event plausibly could lead to an AI Incident in the future. Thus, it qualifies as an AI Hazard. The article does not focus on responses, updates, or broader ecosystem context, so it is not Complementary Information. It is not unrelated as it clearly involves AI systems in a defense context.
Australia allocates A$399 million for Ghost Bat development

2024-02-11
APDR
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is an AI-enabled autonomous combat aircraft, which qualifies as an AI system under the definition. The event concerns the development and funding of this system, which could plausibly lead to future harms given its military and autonomous nature. However, no actual harm, malfunction, or incident is reported in the article. Therefore, this event is best classified as an AI Hazard, reflecting the plausible future risk associated with the development of autonomous combat AI systems, but without any realized harm at this stage.
Australia funds three more MQ-28A Ghost Bat drones

2024-02-09
AeroTime
Why's our monitor labelling this an incident or hazard?
The MQ-28A Ghost Bat is an AI system due to its autonomous functionalities and AI-enabled operations. While no harm has been reported yet, the development and deployment of autonomous combat drones pose plausible risks of harm, including injury, disruption, or violations of rights, if misused or malfunctioning. Therefore, this event represents an AI Hazard as it plausibly could lead to AI Incidents in the future due to the nature of the system and its intended military use.
Aussies add $400M AUD for Boeing's Ghost Bat loyal wingman, to unveil an armed UAV this year

2024-02-09
Breaking Defense
Why's our monitor labelling this an incident or hazard?
The event involves AI systems (autonomous drones with loyal wingman software) in development and testing phases, with potential future military applications. There is no indication of any direct or indirect harm caused by these AI systems at this time. The article discusses future capabilities and investments without reporting any incidents or hazards. Therefore, this event is best classified as an AI Hazard because the development and potential deployment of armed autonomous drones plausibly could lead to AI incidents in the future, given their combat roles and autonomous capabilities. However, since no harm has yet occurred, it is not an AI Incident. It is not Complementary Information because it is not an update or response to a prior incident, nor is it unrelated as it clearly involves AI systems with potential for harm.