Google Street View AI Fails to Censor Nudity, Man Awarded Compensation


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Google was ordered to pay compensation to an Argentine man after its Street View AI failed to blur images of him naked outside his home, exposing his identity and causing public humiliation. The incident highlights the privacy risks and harms that can arise when AI systems for automated image processing malfunction.[AI generated]

Why's our monitor labelling this an incident or hazard?

Google Street View uses AI systems to capture and process images, including automated face blurring to protect privacy. In this case, the AI system failed to censor the man's nudity, exposing him publicly and causing harm to his privacy and dignity. The harm is a violation of personal rights and privacy, which falls under violations of human rights or breach of obligations intended to protect fundamental rights. Since the AI system's malfunction (failure to censor nudity) directly led to this harm, this qualifies as an AI Incident.[AI generated]
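The failure mode described above can be illustrated with a minimal sketch (not Google's actual pipeline, which is proprietary): a redaction step that blurs only the regions a detector flags, so any sensitive region the detector misses is published untouched. The image, region boxes, and function names here are all hypothetical.

```python
# Minimal illustrative sketch (NOT Google's pipeline): privacy redaction that
# is applied only to detector-flagged regions. A region the detector misses
# passes through unblurred -- the failure mode behind this incident.

def pixelate(image, box, block=2):
    """Coarsen pixels inside box=(top, left, bottom, right) in place."""
    top, left, bottom, right = box
    for r in range(top, bottom, block):
        for c in range(left, right, block):
            v = image[r][c]  # fill each block with its top-left value
            for rr in range(r, min(r + block, bottom)):
                for cc in range(c, min(c + block, right)):
                    image[rr][cc] = v
    return image

def redact(image, detections):
    """Blur every detected sensitive region; undetected regions pass through."""
    for box in detections:
        pixelate(image, box)
    return image

# A toy 4x4 "image" with two sensitive regions, of which the detector
# found only one -- the region at (2, 2, 4, 4) was missed.
img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
redact(img, [(0, 0, 2, 2)])
# Top-left block is now uniform (redacted); bottom-right is untouched (exposed).
```

The point of the sketch is that the overall system's privacy guarantee is only as strong as the detector's recall: redaction is conditional on detection, so a single missed detection publishes the raw content.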
AI principles
Privacy & data governance; Respect of human rights; Robustness & digital security; Safety; Accountability

Industries
Media, social platforms, and marketing; IT infrastructure and hosting; Consumer services; Digital security

Affected stakeholders
General public

Harm types
Human or fundamental rights; Psychological; Reputational

Severity
AI incident

Business function:
Monitoring and quality control

AI system task:
Recognition/object detection


Articles about this incident or hazard


Google To Pay Rs 10.9 Lakh To Man After Street View Clicked Him Naked; Internet Erupts In Hilarious Memes

2025-07-30
Mashable India

Man Wins $12,500 After Google Street View Camera Photographs Him Naked

2025-08-01
VICE
Why's our monitor labelling this an incident or hazard?
Google Street View employs AI systems for image processing and for privacy protection measures such as blurring identifiable features. The failure to blur the naked man's image led to a privacy violation, harming the individual's dignity and personal rights. This constitutes a violation of human rights and privacy protections. Since the harm is directly linked to the AI system's malfunction (failure to adequately blur sensitive content), this qualifies as an AI Incident under the framework.

A Fleeting Photo Costs Google €12,000 [translated from Arabic]

2025-08-03
Assawsana (Jordanian newspaper)
Why's our monitor labelling this an incident or hazard?
Google's Street View uses AI systems for panoramic image capture and processing. The incident involves the use of such an AI system that directly caused harm to the man's privacy and dignity by publishing an image without consent. This harm is a violation of fundamental rights protected by law. Therefore, this qualifies as an AI Incident due to the direct link between the AI system's use and the realized harm (privacy violation and reputational damage).