Iran Unveils AI-Enabled Mohajer-10 Drone with Advanced Military Capabilities

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Iran has unveiled the Mohajer-10 drone, an advanced unmanned aerial vehicle reportedly capable of autonomous operations, electronic warfare, and carrying various munitions, including miniaturized nuclear warheads. The drone is marketed to foreign buyers and is claimed to be able to strike targets in Israel, raising concerns about potential future harm from AI-enabled military technology.[AI generated]

Why's our monitor labelling this an incident or hazard?

The 'Mohajer 10' drone is an AI-enabled system given its autonomous flight capabilities, electronic warfare, and intelligence systems. Its development and deployment for military purposes with armaments imply a credible risk of causing harm, including injury or violations of rights, if used in conflict. The article does not report any actual harm or incident caused by the drone yet, but the potential for harm is clear and significant. Hence, it fits the definition of an AI Hazard rather than an AI Incident or Complementary Information. It is not unrelated because it involves an AI system with plausible future harm.[AI generated]
AI principles
Safety; Accountability; Respect of human rights; Robustness & digital security; Transparency & explainability; Democracy & human autonomy; Human wellbeing; Sustainability

Industries
Government, security, and defence; Robots, sensors, and IT hardware; Digital security; Mobility and autonomous vehicles

Affected stakeholders
General public; Government; Workers

Harm types
Physical (death); Physical (injury); Environmental; Public interest; Human or fundamental rights; Psychological; Economic/Property

Severity
AI hazard

AI system task
Recognition/object detection; Goal-driven organisation; Reasoning with knowledge structures/planning; Event/anomaly detection


Articles about this incident or hazard

Drone

2023-08-27
Young Journalists Club news agency (YJC)

Fars News Agency – Military analysts: Iran's production of advanced drones is growing

2023-08-26
Fars News Agency
Why's our monitor labelling this an incident or hazard?
The event involves the development and deployment of an AI-enabled military drone system with advanced capabilities. Although no direct harm or incident has occurred yet, the article clearly indicates the potential for these drones to be used in hostile actions, posing a credible risk of harm to people and communities (e.g., threats to Israel and US forces). This fits the definition of an AI Hazard, as the AI system's use could plausibly lead to an AI Incident involving harm to persons or communities. There is no indication of an actual incident or realized harm, nor is the article primarily about responses or updates, so it is not an AI Incident or Complementary Information.

Fars News Agency – Iran's Shahed 136 drone and its foreign copies + photos

2023-08-30
Fars News Agency
Why's our monitor labelling this an incident or hazard?
The Shahed 136 drone is an AI system because it is an autonomous weapon system capable of conducting attack missions. The article focuses on the production and copying of this drone by other countries, highlighting the spread of this technology. While no direct or indirect harm is reported, the proliferation of such autonomous lethal drones poses a credible risk of future harm, including injury, property damage, or harm to communities. This fits the definition of an AI Hazard, as the event plausibly leads to an AI Incident in the future. There is no indication of a realized incident or complementary information about responses or mitigation, so AI Hazard is the appropriate classification.

This Iranian drone has targeted Israel + specifications

2023-08-27
Khabar Online
Why's our monitor labelling this an incident or hazard?
The drone described is an armed unmanned aerial vehicle with advanced capabilities such as electronic warfare and reconnaissance systems, which typically involve AI for autonomous navigation, target recognition, and mission execution. The article highlights its potential to strike targets in Israel, implying a military threat. Although no actual harm or attack is reported, the existence and deployment of such a drone with AI capabilities pose a credible risk of future harm, including injury, disruption, or violation of rights. Therefore, this event qualifies as an AI Hazard due to the plausible future harm from the use of AI-enabled autonomous weapon systems.

Military analysts react to Iran's unveiling of the drone

2023-08-26
Farda News
Why's our monitor labelling this an incident or hazard?
The article explicitly describes a new armed drone with advanced capabilities that almost certainly involve AI systems for autonomous navigation, targeting, and electronic warfare. While no actual harm or incident is reported, the drone's capabilities and stated potential use to threaten other countries imply a credible risk of future harm to persons and communities. This fits the definition of an AI Hazard, as the development and deployment of such AI-enabled military technology could plausibly lead to injury, conflict escalation, or other harms. There is no indication that harm has already occurred, so it is not an AI Incident. The article is not merely complementary information or unrelated news, as it focuses on the unveiling of a potentially harmful AI system with military applications.

Iran's Shahed 136 drone and its foreign copies + photos

2023-08-30
Javan Online
Why's our monitor labelling this an incident or hazard?
The Shahed 136 is an AI-enabled autonomous weapon system (a kamikaze drone) designed for ground attacks, which inherently involves risks of harm to people and property. The article discusses its deployment and copies by other countries, indicating a credible risk of harm from its use. However, no specific incident of harm or malfunction is described in the article. Therefore, the event fits the definition of an AI Hazard, as the development, use, and proliferation of this AI system could plausibly lead to harm, but no concrete incident is reported here.

This Iranian drone has targeted Israel + specifications

2023-08-27
Esfahan Emrooz
Why's our monitor labelling this an incident or hazard?
The article explicitly discusses a military drone with advanced capabilities likely involving AI systems for autonomous operation and targeting. While no actual harm or attack is reported, the drone's intended use to strike targets in Israel implies a credible risk of future harm. The event is about the development and unveiling of this AI-enabled weapon system, which fits the definition of an AI Hazard due to its plausible potential to cause harm. There is no indication that harm has already occurred, so it is not an AI Incident. The article is not merely complementary information or unrelated, as it focuses on a specific AI-enabled system with potential for harm.

Are Europeans seeking to buy Iranian drones?! Government sources: the "Mohajer-10" can carry a miniaturized warhead

2023-08-27
kayhan.london
Why's our monitor labelling this an incident or hazard?
The event involves AI systems insofar as the drones likely incorporate AI for autonomous or semi-autonomous operation, navigation, targeting, and payload delivery. The development and export of such drones with nuclear payload capability represent a significant AI Hazard because they could plausibly lead to serious harms such as injury, disruption of critical infrastructure, and violations of international law. Since no direct harm has been reported yet, but the potential for harm is credible and significant, this event is best classified as an AI Hazard rather than an AI Incident. The article focuses on the potential risks and geopolitical implications rather than an actual incident of harm caused by the AI-enabled drones.

Military analysts: Iran's production of advanced drones is growing

2023-08-26
Javan Online
Why's our monitor labelling this an incident or hazard?
The Mohajer-10 drone is an AI system given its autonomous or semi-autonomous capabilities and electronic warfare functions. The article does not report any actual harm caused yet but emphasizes the drone's potential to threaten other nations, implying plausible future harm. The development and unveiling of such advanced AI-enabled military drones constitute an AI Hazard because they could plausibly lead to injury, harm to communities, or disruption of critical infrastructure if used in conflict. There is no indication of an actual incident or harm having occurred, so it is not an AI Incident. The article is not merely complementary information since it focuses on the potential threat and capabilities of the AI system rather than a response or update. Hence, the classification is AI Hazard.

Iran unveils new unmanned combat aircraft

2023-08-23
laodong.vn
Why's our monitor labelling this an incident or hazard?
The Mohajer-10 is an AI system as it is an unmanned aerial vehicle with autonomous or semi-autonomous capabilities for navigation and weapon deployment. Its development and unveiling represent the creation and potential use of an AI system with military applications that could lead to harm. Since no actual harm or incident is reported yet, but the potential for harm is credible and significant, this event fits the definition of an AI Hazard rather than an AI Incident. The article focuses on the introduction of the system and its capabilities, implying plausible future risks rather than describing realized harm.

Iran showcases an attack UAV identical to the US MQ-9 Reaper

2023-08-23
Tin tức 24h
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the use of artificial intelligence in the development of a strategic attack UAV by Iran. Although no specific harm has yet occurred from this new UAV, the nature of the system (an AI-enabled military drone with offensive capabilities) plausibly leads to potential harms such as injury, disruption of critical infrastructure, or violations of human rights in future conflicts. Therefore, this event constitutes an AI Hazard because it involves the development and potential use of an AI system that could plausibly lead to significant harm, even though no incident has yet materialized.

Iran unveils an unmanned attack aircraft with a 2,000 km operating range

2023-08-23
Tiền Phong online newspaper
Why's our monitor labelling this an incident or hazard?
The event involves the development and unveiling of an AI-enabled armed drone system with advanced autonomous and AI-assisted capabilities. While no harm has yet occurred, the nature of the system and its intended military use plausibly could lead to AI Incidents such as injury, disruption, or violations of rights. Therefore, this event fits the definition of an AI Hazard, as it plausibly could lead to harm but no incident has yet been reported.

Iran showcases the advanced Mojaher-10 unmanned aerial vehicle

2023-08-23
VOV.vn
Why's our monitor labelling this an incident or hazard?
The Mojaher-10 UAV is an AI-enabled military system with advanced capabilities and armaments, introduced amid heightened tensions between Iran and the US. Although no actual harm or incident is reported, the AI system's development and potential use in military operations could plausibly lead to significant harms, including injury, disruption, or rights violations. The event does not describe a realized harm but highlights a credible risk associated with the AI system's deployment, fitting the definition of an AI Hazard rather than an Incident or Complementary Information.

Iran successfully develops an unmanned military aircraft (VietnamPlus)

2023-08-22
VietnamPlus
Why's our monitor labelling this an incident or hazard?
The event involves the development and deployment of a military UAV likely equipped with AI systems for autonomous or semi-autonomous operation, which fits the definition of an AI system. The article does not report any actual harm or incident caused by this UAV but highlights its capabilities and potential use in military operations. Given the weaponized nature and operational scope of the UAV, there is a credible risk that its use could lead to harms such as injury, disruption, or violations of human rights. Since no harm has yet occurred or been reported, but plausible future harm exists, the event is best classified as an AI Hazard rather than an AI Incident or Complementary Information.