Flock Safety License Plate Reader Data Sharing Sparks Privacy and Rights Concerns in California


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Flock Safety's AI-powered license plate readers, used by law enforcement in California, have come under scrutiny after data was shared with federal agencies, including ICE and Border Patrol, without proper oversight. This has led to privacy violations, public backlash, and contract terminations by cities and Amazon's Ring, highlighting risks of AI surveillance misuse.[AI generated]

Why's our monitor labelling this an incident or hazard?

The article describes an AI system (Flock's license plate reader) actively used by police and communities, which has led to widespread public concern about privacy and civil liberties. The system's use has indirectly caused harm by eroding trust and raising fears of surveillance misuse, which aligns with violations of human rights and harm to communities. The termination of contracts by cities is a direct consequence of these harms. Although no physical injury or legal ruling is mentioned, the societal and rights-based harms are clear and materialized, meeting the criteria for an AI Incident rather than a hazard or complementary information.[AI generated]
AI principles
Privacy & data governance, Accountability

Industries
Government, security, and defence

Affected stakeholders
General public

Harm types
Human or fundamental rights, Public interest

Severity
AI incident

Business function
Compliance and justice

AI system task
Recognition/object detection


Articles about this incident or hazard


Cities join Amazon in ending their partnership with license-plate reader Flock following Super Bowl ad. 'Your privacy is totally fine,' says Ring CEO

2026-03-03
Fortune
Why's our monitor labelling this an incident or hazard?
The article describes an AI system (Flock's license plate reader) actively used by police and communities, which has led to widespread public concern about privacy and civil liberties. The system's use has indirectly caused harm by eroding trust and raising fears of surveillance misuse, which aligns with violations of human rights and harm to communities. The termination of contracts by cities is a direct consequence of these harms. Although no physical injury or legal ruling is mentioned, the societal and rights-based harms are clear and materialized, meeting the criteria for an AI Incident rather than a hazard or complementary information.

LAPD's relationship with Flock Safety under scrutiny from oversight body

2026-03-03
Los Angeles Times
Why's our monitor labelling this an incident or hazard?
The license plate reader system is an AI system, as it involves automated recognition and processing of license plate data. The article highlights a 'configuration error' that allowed unauthorized access to sensitive data, raising concerns about misuse and potential harm to individuals, especially vulnerable groups. While no harm is confirmed as realized, the risk of privacy violations and of surveillance data being misused by federal authorities and others is credible. The event therefore fits the definition of an AI Hazard: the AI system's use and malfunction could plausibly lead to violations of human rights and harm to communities.

Privacy concerns surround Richmond's Flock camera system ahead of vote

2026-03-03
KRON4
Why's our monitor labelling this an incident or hazard?
The Flock camera system qualifies as an AI system because it involves automated license plate reading, which requires AI-based image recognition and data processing. The concerns raised are about potential misuse of data and privacy violations, which could lead to harm such as violations of rights or harm to communities if data is shared improperly. However, the article does not report any realized harm or incident; it focuses on the debate and concerns prior to a council vote. Hence, this is best classified as an AI Hazard, reflecting the plausible future risk of harm from the AI system's use or misuse.

Residents rally to demand Richmond keep Flock camera on and extend contract

2026-03-04
ABC7 News
Why's our monitor labelling this an incident or hazard?
The event centers on the use and management of an AI system (automated license plate readers) involved in law enforcement activities with tangible outcomes (arrests, suspect identification). The unauthorized data access caused by a software error constitutes a breach of privacy and of legal obligations, which falls under violations of human rights or breach of applicable law. However, the article reports the discovery of the breach and the ensuing policy debate rather than direct harm, such as injury or an explicit rights violation, resulting from it. This event is therefore best classified as Complementary Information: it provides an update on the governance, policy considerations, and responses surrounding a previously deployed AI system, rather than reporting a new AI Incident or AI Hazard.

LA Historic Park cameras

2026-03-03
LAist
Why's our monitor labelling this an incident or hazard?
The Flock Safety license plate readers are AI systems used for surveillance. Their deployment in a public park and the sharing of collected data with law enforcement, including instances of illegal sharing with federal agencies, directly implicate violations of privacy and potentially human rights. The harms are realized, not just potential, as the article documents actual data sharing and community pushback due to surveillance fears. Therefore, this qualifies as an AI Incident due to violations of rights and harm to communities caused by the AI system's use and data handling practices.

License plate readers spark privacy concerns at LA State Historic Park in Chinatown

2026-03-03
LAist
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions AI-powered license plate readers used for surveillance, which is an AI system. The use of these systems has led to privacy concerns and documented instances of law enforcement agencies sharing data in violation of state laws, indicating a breach of rights. The AI system's deployment and data sharing practices have directly contributed to these harms. The concerns about mass surveillance and data misuse align with violations of human rights as defined in the framework. Hence, this event qualifies as an AI Incident rather than a hazard or complementary information.

LAPD's relationship with Flock Safety under scrutiny from oversight body

2026-03-04
Eagle-Tribune
Why's our monitor labelling this an incident or hazard?
The license plate reader system is an AI system, as it automates recognition and data processing. The event centers on oversight of, and concerns about, data use and sharing, which could implicate human rights or privacy violations. However, no direct harm or incident is reported yet, only a request for more information and further scrutiny. This describes a plausible risk scenario without confirmed harm at this stage, fitting the definition of Complementary Information: it provides context on the governance response to potential AI-related issues.

Everett shuts down Flock camera network after judge rules footage is public record

2026-03-02
wltx.com
Why's our monitor labelling this an incident or hazard?
The Flock camera system is an AI system as it involves automated license plate recognition technology. The event stems from the use of this AI system and the legal ruling that its data is public record. The concerns raised relate to potential misuse of the data that could harm individuals' privacy and safety, which fits the definition of plausible future harm. However, the article does not report any actual harm or incident caused by the AI system's use so far. The shutdown is a precautionary response to the ruling and ongoing legislative debate. Thus, the event is best classified as an AI Hazard, reflecting the credible risk of harm from the AI system's data being publicly accessible, but without evidence of realized harm at this time.

Cities join Amazon in ending their partnership with license-plate reader Flock following public outcry. 'Your privacy is totally fine,' says Ring CEO

2026-03-03
DNYUZ
Why's our monitor labelling this an incident or hazard?
The article describes an AI system (Flock Safety's license plate reader) used by law enforcement and communities for surveillance purposes. The system's development and intended use raised significant public concern about privacy and potential misuse, leading to contract cancellations. However, the article does not report any actual harm or violation of rights that has occurred due to the AI system's malfunction or misuse. The concerns and contract terminations are preventive and reflect plausible future harm rather than realized harm. Thus, the event fits the definition of an AI Hazard rather than an AI Incident or Complementary Information.

LAPD's relationship with Flock Safety under scrutiny from oversight body

2026-03-03
Breaking News, Latest News, US and Canada News, World News, Videos
Why's our monitor labelling this an incident or hazard?
Flock Safety's license plate reader system is an AI system that processes surveillance data. The sharing of this data with national authorities without proper consent or oversight constitutes a violation of privacy and potentially human rights, fulfilling the criteria for harm under (c) violations of human rights or breach of obligations under applicable law. The LAPD's investigation and Flock's acknowledgment indicate that harm has occurred or is ongoing. Therefore, this event qualifies as an AI Incident due to the direct or indirect harm caused by the use and misuse of the AI system.