10,000 Australian Tesla Owners Sue Over AI 'Phantom Braking' Malfunction



Around 10,000 Tesla drivers in Australia have filed a class-action lawsuit against the company, alleging that its AI-powered Autopilot and automatic braking systems suffer from 'phantom braking': unexpected, unwarranted stops that pose safety risks. The lawsuit claims Tesla failed to address these defects, endangering drivers. [AI generated]

Why's our monitor labelling this an incident or hazard?

The Tesla Autopilot system is an AI system providing autonomous driving assistance. The reported 'phantom braking' is a malfunction of this AI system that has directly caused accidents and fatalities, constituting injury and harm to people. The lawsuit and regulatory investigations confirm the harm has occurred. Therefore, this event meets the criteria for an AI Incident due to the AI system's malfunction leading to direct harm.[AI generated]
AI principles
Safety, Robustness & digital security, Accountability, Transparency & explainability, Democracy & human autonomy

Industries
Mobility and autonomous vehicles

Affected stakeholders
Consumers

Harm types
Physical (injury), Economic/Property, Reputational

Severity
AI incident

Business function:
Monitoring and quality control, Research and development

AI system task:
Recognition/object detection, Forecasting/prediction, Goal-driven organisation, Reasoning with knowledge structures/planning


Articles about this incident or hazard


Class action by 10,000 customers against Tesla: "Imagine driving on the highway with Autopilot activated and having it brake for no apparent reason"

2025-06-12
LaVanguardia
Why's our monitor labelling this an incident or hazard?
The Tesla Autopilot system is an AI system providing autonomous driving assistance. The reported 'phantom braking' is a malfunction of this AI system that has directly caused accidents and fatalities, constituting injury and harm to people. The lawsuit and regulatory investigations confirm the harm has occurred. Therefore, this event meets the criteria for an AI Incident due to the AI system's malfunction leading to direct harm.

Some 10,000 drivers sue Tesla over automatic braking problems

2025-06-12
Diario1
Why's our monitor labelling this an incident or hazard?
The event clearly involves an AI system, Tesla's autopilot and automatic braking, which is designed to assist driving by making real-time decisions. The reported 'phantom braking' is a malfunction of this AI system causing unexpected braking without real obstacles, which is a direct safety hazard. The harm is realized or at least strongly implied, as the malfunction can cause accidents and endanger drivers. Therefore, this qualifies as an AI Incident because the AI system's malfunction has directly led to or caused harm or risk of harm to people. The presence of a large class action lawsuit further supports the materialization of harm or significant risk thereof.

Some 10,000 drivers in Australia sue Tesla over automatic braking problems

2025-06-12
Zócalo Saltillo
Why's our monitor labelling this an incident or hazard?
The automatic braking and autopilot features in Tesla vehicles are AI systems designed to assist driving by interpreting sensor data and making decisions. The reported 'phantom braking' is a malfunction of this AI system, causing unexpected braking that poses safety risks and could lead to accidents, which is harm to persons. The lawsuit alleges that Tesla was aware of these defects and did not adequately address them, indicating the AI system's malfunction has directly led to harm or risk of harm. This fits the definition of an AI Incident because the AI system's malfunction has caused or could cause injury or harm to people.

Some 10,000 drivers in Australia sue Tesla over automatic braking problems

2025-06-12
ElNacional.cat
Why's our monitor labelling this an incident or hazard?
The Tesla Autopilot and automatic braking systems qualify as AI systems because they perform autonomous driving tasks involving real-time decision-making. The reported 'phantom braking' is a malfunction of these AI systems that can cause dangerous driving conditions and potential accidents, thus directly leading to harm or risk of harm to people. The lawsuit claims that Tesla knew about these defects and did not adequately address them, indicating the AI system's malfunction is central to the harm. Therefore, this event meets the criteria for an AI Incident due to the direct link between the AI system's malfunction and potential injury or harm to drivers.

New class action against Tesla in Australia over its automatic "phantom braking"

2025-06-12
infotaller
Why's our monitor labelling this an incident or hazard?
The event involves an AI system explicitly: Tesla's autopilot and automatic braking system, which are AI-based driver assistance technologies. The 'phantom braking' is a malfunction causing unexpected braking without obstacles, posing a direct safety risk to drivers and potentially leading to accidents and injury. The lawsuit claims harm has occurred and that Tesla failed to address these issues, indicating realized harm linked to the AI system's malfunction. Hence, this is an AI Incident as the AI system's malfunction has directly led to harm or risk of harm to people.

'Phantom braking': Why drivers like Dominic want to give up their Teslas

2025-06-11
Australian Broadcasting Corporation
Why's our monitor labelling this an incident or hazard?
Tesla's Autopilot is an AI system that controls vehicle acceleration, steering, and braking. The phantom braking issue is a malfunction or erroneous behavior of this AI system, directly causing sudden braking that has led to collisions and driver distress, which are harms to health and safety. The class action lawsuit and multiple driver reports confirm that harm has occurred. The AI system's malfunction is the pivotal factor leading to these harms, meeting the criteria for an AI Incident rather than a hazard or complementary information.

More Than 10,000 Owners Are Suing Tesla In Australia

2025-06-12
Carscoops
Why's our monitor labelling this an incident or hazard?
The Tesla Autopilot system is an AI system involved in autonomous driving assistance. The reported phantom braking incidents have caused actual harm or risk of harm to drivers, including collisions, which fits the definition of injury or harm to persons. The lawsuit also challenges Tesla's claims about the AI system's capabilities, indicating a breach of obligations related to truthful marketing and consumer rights. The involvement of the AI system in causing or contributing to these harms is direct and material. Hence, this event meets the criteria for an AI Incident rather than a hazard or complementary information.

Tesla owners in Australia join class action for phantom braking, Autopilot issues affecting Model 3, Model Y

2025-06-12
Paul Tan's Automotive News
Why's our monitor labelling this an incident or hazard?
The Tesla Autopilot system is an AI system that autonomously controls vehicle functions such as emergency braking. The reported phantom braking incidents have directly caused dangerous situations, including a near collision with a truck, indicating direct harm or risk of harm to persons. The class action lawsuit highlights these harms as a result of the AI system's malfunction or erroneous behavior. Additionally, misleading claims about battery range, while relevant, do not negate the primary AI-related safety harms. Hence, this event meets the criteria for an AI Incident due to the AI system's malfunction leading to direct harm or risk thereof.

Tesla Drivers Sue Over 'Phantom Braking' -- Musk Faces Legal Heat in France and Australia

2025-06-11
International Business Times UK
Why's our monitor labelling this an incident or hazard?
The Australian lawsuit directly concerns the malfunction of Tesla's Autopilot, an AI system for driver assistance, which has caused safety issues and near-accidents, thus constituting direct harm or risk to health and safety. This fits the definition of an AI Incident as the AI system's malfunction has led to harm or risk thereof. The French lawsuit is about reputational harm linked to the CEO's politics and does not involve AI system harm or plausible future harm, so it is unrelated to AI incident or hazard classification.

Why Is Tesla Facing Legal Action In Australia?

2025-06-11
TechRound
Why's our monitor labelling this an incident or hazard?
The Tesla Autopilot and Tesla Vision system are AI systems that perform autonomous driver assistance functions. The reported sudden braking events and reduced battery range represent malfunctions or failures of these AI systems, which have directly caused safety risks (harm to persons) and economic harm (extra costs, reduced resale value). The class action lawsuit and complaints from thousands of drivers confirm that harm has occurred. The involvement of AI in the development, use, and malfunction of these systems is explicit. Hence, this event meets the criteria for an AI Incident rather than a hazard or complementary information.

Thousands In Australia Sue Tesla Over Phantom Braking Problems

2025-06-14
Jalopnik
Why's our monitor labelling this an incident or hazard?
The Tesla vehicles' phantom braking problem is caused by the AI system controlling the car's autonomous or semi-autonomous driving features. The sudden braking incidents have caused fear and some collisions, which constitute harm to people. The lawsuit and reports confirm that the AI system's malfunction has directly led to these harms. Hence, this event meets the criteria for an AI Incident as the AI system's malfunction has directly caused harm.