
The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.
A Tesla robotaxi in Dallas malfunctioned when its autonomous driving AI missed a turn and attempted to pull over on a highway, leaving passenger Chris Ramos feeling unsafe and requiring human intervention. The incident highlights ongoing safety concerns and the technical limitations of Tesla's driverless service.[AI generated]
Why is our monitor labelling this an incident or hazard?
The Tesla robotaxi is an AI system performing autonomous driving. The event described involves the AI system malfunctioning during use, causing the vehicle to behave unsafely on a highway, which directly led to passenger discomfort and a potential risk of harm. Human intervention was necessary to mitigate the situation. This fits the definition of an AI Incident because the AI system's malfunction directly led to harm, or the risk of harm, to a person.[AI generated]