Serial Production of AI-Enabled KIZILELMA Combat Drone Begins in Turkey


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Turkey has begun serial production of KIZILELMA, an AI-powered unmanned combat aircraft. The system, showcased by Selçuk Bayraktar, marks a shift from manned to autonomous military aviation, raising concerns about the risks and potential harms associated with deploying autonomous weapon platforms.[AI generated]

Why's our monitor labelling this an incident or hazard?

The article explicitly discusses the development and serial production of KIZILELMA, an unmanned combat aircraft that relies on AI and autonomous technologies for operation. Although no harm or incident is reported, the nature of the system—a weaponized autonomous drone—implies credible potential for harm in the future, such as injury, disruption, or rights violations. The event is not a realized incident but a credible hazard due to the plausible risks associated with AI-enabled autonomous weapons. It is not complementary information because the main focus is on the production start of a potentially harmful AI system, not on responses or updates to past incidents. It is not unrelated because the AI system and its implications are central to the report.[AI generated]
AI principles
Accountability · Safety · Robustness & digital security · Respect of human rights · Transparency & explainability · Democracy & human autonomy

Industries
Government, security, and defence · Robots, sensors, and IT hardware · Digital security

Affected stakeholders
General public

Harm types
Physical (death) · Physical (injury) · Human or fundamental rights · Public interest · Psychological · Economic/Property

Severity
AI hazard

Business function
Manufacturing · Research and development

AI system task
Recognition/object detection · Goal-driven organisation · Reasoning with knowledge structures/planning


Articles about this incident or hazard


TECHNOLOGY DESK 3 - Serial production of KIZILELMA has begun

2024-10-05
Haberler

Serial production of KIZILELMA has begun

2024-10-05
TRT Haber
Why's our monitor labelling this an incident or hazard?
The KIZILELMA is an AI system as it is an unmanned combat aircraft with autonomous capabilities. The article discusses the start of serial production but does not report any harm or incidents caused by the system. Given the nature of autonomous weapons, their deployment could plausibly lead to harms such as injury, disruption, or rights violations. Since no actual harm has occurred yet, the event is best classified as an AI Hazard. It is not Complementary Information because the focus is on the production start and the potential implications, not on updates or responses to past incidents. It is not Unrelated because the system clearly involves AI and potential harm.

Serial production of the eagerly awaited KIZILELMA has begun

2024-10-05
İnternethaber
Why's our monitor labelling this an incident or hazard?
The article explicitly describes KIZILELMA as an unmanned combat aircraft with advanced AI-driven systems for flight control and mission execution. Although no incident or harm is reported, the production and potential deployment of such AI-enabled military drones inherently carry risks of harm to people, infrastructure, and rights. The event is about the start of serial production, indicating the technology is moving towards wider use, which plausibly could lead to AI Incidents in the future. Hence, it fits the definition of an AI Hazard rather than an Incident or Complementary Information. It is not unrelated because the AI system's development and use are central to the event.

A historic day for KIZILELMA... Serial production has begun!

2024-10-05
Akşam
Why's our monitor labelling this an incident or hazard?
KIZILELMA is an AI-enabled unmanned combat aircraft system, involving advanced autonomous technologies. While the article focuses on production and technological progress without reporting any actual harm or incident, the nature of the system as an autonomous weapon platform implies a credible risk of future harm. The development and serial production of such AI-powered military drones can plausibly lead to AI Incidents involving injury, violation of rights, or harm to communities if used in conflict or misused. Since no harm has yet occurred or been reported, but plausible future harm exists, the event is best classified as an AI Hazard.