
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
Three EV models (a Tesla with single-pedal mode, the Lynk & Co 02, and the Xiaomi SU7) experienced malfunctions in their AI-driven automatic braking systems. Tesla's single-pedal braking failed on a highway; Lynk & Co's system applied the brakes without cause, attributed to sensor, software or lighting errors; and the Xiaomi SU7's secondary braking system misjudged the primary system as having failed. All three malfunctions posed direct safety risks.[AI generated]
Why is our monitor labelling this an incident or hazard?
The automatic braking system is an AI system that makes real-time decisions based on sensor input and software processing. The article describes incidents in which the system triggered braking without a valid reason, creating safety hazards and a risk of injury. This fits the definition of an AI incident because the AI system's malfunction or erroneous output directly led to harm, or a risk of harm, to persons. The article focuses on the causes and implications of the malfunction, indicating realized or imminent harm rather than merely potential future harm or general information.[AI generated]