Tesla FSD Beta Fails to Detect Flooded Road, Leading to Crash and Owner's Lawsuit Threat

The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

A Tesla Model 3 using Full Self-Driving (FSD) Beta drove into a flooded road in California after both the AI system and the driver ignored warning signs. The car hydroplaned and became stranded in water, causing property damage. The owner blames Tesla and plans to sue, highlighting FSD's limitations.[AI generated]

Why's our monitor labelling this an incident or hazard?

The Tesla FSD Beta is an AI system involved in the event. The AI system failed to recognize the flood warning sign and did not react appropriately to the hazardous road conditions, contributing indirectly to the vehicle being submerged and damaged. The driver also ignored warnings and did not intervene in time, which is a form of user error, but the AI system's malfunction or limitation played a pivotal role. This constitutes harm to property, meeting the criteria for an AI Incident. There is no indication of injury or other types of harm, but the property damage is sufficient for classification as an AI Incident rather than a hazard or complementary information.[AI generated]
AI principles
Safety
Robustness & digital security
Transparency & explainability
Accountability

Industries
Mobility and autonomous vehicles
Consumer products

Affected stakeholders
Consumers

Harm types
Economic/Property
Reputational

Severity
AI incident

Business function
Research and development
Monitoring and quality control

AI system task
Recognition/object detection
Goal-driven organisation
Reasoning with knowledge structures/planning


Articles about this incident or hazard

Tesla Model 3 Owner Ignores Warnings, Sends Car Into Flood Water While Using FSD

2023-08-21
Yahoo News
Submarine Mode?: 'Full Self-Driving' Software Guides Tesla into California Floodwater Pond

2023-08-23
Breitbart
Why's our monitor labelling this an incident or hazard?
The Tesla Full Self-Driving software is an AI system that autonomously controls the vehicle. In this case, the AI system failed to detect and respond to flood warning signs and hazardous road conditions, resulting in the vehicle entering floodwater and becoming partially submerged. This caused harm to property (the vehicle) and posed a risk to the driver's safety. Although the driver ignored warnings and was expected to take control, the AI system's malfunction was a direct contributing factor. Hence, the event meets the criteria for an AI Incident due to the AI system's malfunction and use leading to harm.
Watch Tesla Model 3 dive into flood water while running on FSD Beta

2023-08-22
Motor1.com
Why's our monitor labelling this an incident or hazard?
The Tesla FSD Beta is an AI system designed to autonomously assist driving. The incident describes a malfunction where the AI did not heed a flood warning and failed to reduce speed, leading to the vehicle aquaplaning and submerging in water. This caused harm to property and posed a risk to the driver's safety. The driver's overreliance on the AI system and failure to intervene also contributed. The event meets the criteria for an AI Incident as the AI system's malfunction directly led to harm (property damage and potential injury).
Tesla driver watches while FSD sends his Model 3 into a giant puddle - Autoblog

2023-08-21
Autoblog
Why's our monitor labelling this an incident or hazard?
The Tesla FSD system is an AI system involved in autonomous driving decisions. The incident occurred because the AI system did not appropriately respond to the flooded road conditions, and the driver failed to intervene as recommended. This led to the vehicle losing control and ending up in a puddle, causing harm to property and potential risk to safety. The AI system's malfunction and the user's misuse (overreliance) directly contributed to the harm, fitting the definition of an AI Incident.
Video shows Tesla FSD turning Model 3 into impromptu dinghy after hydroplaning on flooded road while owner blames Tesla and wants to sue

2023-08-22
Notebookcheck
Why's our monitor labelling this an incident or hazard?
Tesla's Full Self-Driving software is an AI system involved in the vehicle's operation. The incident resulted from the vehicle's failure to slow down and avoid a flooded road section, leading to hydroplaning and the vehicle becoming stranded. This is a direct harm related to the AI system's use and malfunction or limitations. The owner's intention to sue Tesla highlights the perceived responsibility of the AI system in the incident. Therefore, this qualifies as an AI Incident due to direct harm caused by the AI system's use and malfunction.
Tesla Model 3 Using FSD Beta Drives Right Into Flood Waters

2023-08-22
Jalopnik
Why's our monitor labelling this an incident or hazard?
The Tesla FSD Beta is an AI system involved in autonomous vehicle operation. The incident involves the AI system's malfunction in failing to detect and respond appropriately to flood warnings, leading to the vehicle entering flood waters and sustaining damage. This constitutes harm to property and potential harm to the driver, fulfilling the criteria for an AI Incident. The AI system's failure to act appropriately is a direct cause of the harm, even though the driver was expected to intervene. Hence, the event is classified as an AI Incident.
The driver of the Tesla Model 3 seems to have forgotten that drivers are still responsible when FSD is enabled

2023-08-23
Carscoops
Why's our monitor labelling this an incident or hazard?
The Tesla Full Self-Driving Beta is an AI system designed to assist driving. In this case, the system failed to recognize a flooded road hazard and did not brake or alert the driver, leading to a crash into floodwaters. The harm includes damage to the vehicle and potential risk to the driver's safety. The incident stems from the AI system's malfunction and the user's failure to remain attentive as required. Therefore, this event meets the criteria for an AI Incident due to direct harm caused by the AI system's failure and its role in the accident.
Tesla Driver Ignores Road Sign, His Car Drives Into Pond Running on FSD Beta

2023-08-21
autoevolution
Why's our monitor labelling this an incident or hazard?
The Tesla FSD Beta is an AI system designed to autonomously control the vehicle. The system's failure to detect and react to the flood warning sign and flooded road directly led to the car entering the pond and being damaged. This is a clear case where the AI system's malfunction during use caused harm to property and posed a risk to the driver's safety. The driver’s reliance on the AI system and the system’s failure to act appropriately fulfill the criteria for an AI Incident under the framework.
Tesla using Full Self Driving ends up sunk in huge puddle

2023-08-23
Driving
Why's our monitor labelling this an incident or hazard?
The Tesla Full Self-Driving system is an AI system that makes real-time driving decisions. The incident describes the vehicle being sunk due to the AI system's guidance despite flood warnings, indicating a malfunction or failure in the AI's operation. This caused harm to property (the vehicle) and potential risk to the driver, fitting the definition of an AI Incident where the AI system's use directly led to harm.
2023 Tesla Model Y Owner Gets FSD Beta, Finds Out It Doesn't Work on His EV

2023-08-21
autoevolution
Why's our monitor labelling this an incident or hazard?
The event explicitly involves an AI system (Tesla's FSD Beta, an AI-based ADAS). The harm arises from the use of the AI system—specifically, the sale and attempted use of a software feature that does not work on the customer's vehicle due to hardware incompatibility. This led to financial harm and consumer rights violations because the customer was not properly informed and was allowed to pay for a non-functional product. Although no physical harm or safety incident occurred, the harm to consumer rights and potential financial loss meets the criteria for an AI Incident. The event is not merely a product announcement or general news, nor is it a potential future harm; the harm has already occurred. Therefore, the classification is AI Incident.
Tesla's Autopilot Ends Up in Water - Lawsuits to Follow (VIDEO)

2023-08-24
B92
Why's our monitor labelling this an incident or hazard?
The Tesla Full Self Driving system is an AI system involved in autonomous vehicle operation. The incident describes a failure of the AI system to recognize a flood warning sign, leading the vehicle into a hazardous flooded area. The driver did not intervene in time, resulting in aquaplaning and stopping in deep water, which poses a direct risk of injury or harm. The AI system's malfunction and the driver's overreliance on it directly contributed to the hazardous event. This fits the definition of an AI Incident as the AI system's malfunction directly led to a harmful or dangerous situation involving potential injury or harm to a person.
Tesla Driver Ignored Traffic Signs and Ended Up in Water - Now He Wants to Sue Everyone

2023-08-24
Telegraf.rs
Why's our monitor labelling this an incident or hazard?
The Tesla Full Self Driving system is an AI system designed to assist with autonomous driving. The incident describes a malfunction or failure of this AI system to recognize a critical traffic sign, which directly contributed to the vehicle entering a flooded area and the subsequent accident. The harm includes potential injury and property damage. The driver's overreliance on the AI system and the system's failure to alert or react appropriately are central to the incident. Therefore, this qualifies as an AI Incident due to the direct involvement of an AI system leading to harm.
Tesla's Autopilot Ends Up in Water. Lawsuits to Follow | Raport.ba

2023-08-25
Raport.ba
Why's our monitor labelling this an incident or hazard?
The Tesla autopilot system is an AI system designed to assist driving. The failure to recognize the flood warning sign and the subsequent driving into deep water caused harm (property damage and risk to health). The autopilot's malfunction or inability to respond appropriately is a direct factor in the incident. Although the driver also bears responsibility, the AI system's role is pivotal. Therefore, this event meets the criteria for an AI Incident.
This Tesla Model 3 Ends Up Submerged in Water: What Happened?

2023-08-22
Motor1.com
Why's our monitor labelling this an incident or hazard?
The Tesla FSD Beta is an AI system involved in autonomous driving. Its failure to recognize or respond to the flooded road hazard directly led to the vehicle's loss of control and submersion, causing harm to property and posing a risk to the driver. The driver’s reliance on the AI system without intervention contributed to the incident. Although no injury occurred, the significant property damage and potential safety risk meet the criteria for an AI Incident. The event is not merely a hazard or complementary information, as harm has materialized due to the AI system's malfunction and use.