AI-Powered Flock Cameras Used for Protest Surveillance Raise Privacy Concerns in the U.S.


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

Florida law enforcement used AI-powered Flock license plate readers to track individuals linked to political protests, raising concerns over privacy and rights violations. In Georgia, residents report privacy harms and misuse, including stalking and targeting immigrants, highlighting the risks of mass surveillance enabled by these AI systems.[AI generated]

Why's our monitor labelling this an incident or hazard?

The automated license plate reader system is an AI system used for surveillance. The misuse by the police officer accessing the system to stalk individuals constitutes a violation of rights and misuse of AI technology, which is a direct harm caused by the AI system's use. This fits the definition of an AI Incident because the AI system's use directly led to harm (privacy violations and misuse of surveillance data). The article also discusses responses and concerns but the primary event is the misuse and resulting harm, making it an AI Incident rather than a hazard or complementary information.[AI generated]
AI principles
Privacy & data governance
Respect of human rights

Industries
Government, security, and defence

Affected stakeholders
General public
Other

Harm types
Human or fundamental rights
Psychological

Severity
AI incident

Business function
Compliance and justice

AI system task
Recognition/object detection


Articles about this incident or hazard


Costa Mesa to audit license plate reader program after misuse by police officer

2026-04-25
Los Angeles Times
Why's our monitor labelling this an incident or hazard?
The automated license plate reader system is an AI system used for surveillance. The misuse by the police officer accessing the system to stalk individuals constitutes a violation of rights and misuse of AI technology, which is a direct harm caused by the AI system's use. This fits the definition of an AI Incident because the AI system's use directly led to harm (privacy violations and misuse of surveillance data). The article also discusses responses and concerns but the primary event is the misuse and resulting harm, making it an AI Incident rather than a hazard or complementary information.

Lowe's faces pressure to cut ties with Flock Safety as AI surveillance data raises serious privacy concerns

2026-04-23
Fast Company
Why's our monitor labelling this an incident or hazard?
Flock Safety's surveillance technology includes AI systems like automated license plate readers and drones that process data for law enforcement use. The reported use of this data by ICE and in sensitive investigations suggests that AI systems have indirectly led to violations of privacy and human rights. The involvement of multiple advocacy groups demanding action further underscores the recognition of harm. Therefore, this event qualifies as an AI Incident due to the realized harm related to privacy and rights violations stemming from AI surveillance data use.

College Station approves grant application tied to Flock security systems after public pushback

2026-04-24
Community Impact Newspaper
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions an AI-enabled system (automated license plate reader cameras) and discusses its use and public concerns about privacy and surveillance. However, it does not report any actual harm or incident caused by the AI system, nor does it describe a credible imminent risk of harm. The focus is on the grant approval process, public pushback, and governance measures, which aligns with Complementary Information. The event does not meet criteria for AI Incident (no realized harm) or AI Hazard (no plausible imminent harm described).

Evansville shared Flock camera data nationwide. Did officials know?

2026-04-24
Evansville Courier & Press
Why's our monitor labelling this an incident or hazard?
The Flock Safety system is an AI system that performs license plate recognition and enables complex, nationwide data searches by law enforcement. The article documents actual use of this AI system in ways that have led to privacy violations and potential breaches of legal and human rights protections, such as searches related to abortion investigations and immigration enforcement that may contravene local laws or policies. These uses have caused harm to individuals' privacy and rights, fulfilling the criteria for an AI Incident. The event is not merely a potential risk or a governance response but describes concrete harms and misuse of the AI system's capabilities.

Under Scrutiny, Flock Safety Debuts Automatic Auditing Tool

2026-04-24
Government Technology
Why's our monitor labelling this an incident or hazard?
The article focuses on a new compliance and auditing tool designed to monitor AI system usage and address privacy and legal concerns. It does not report any realized harm or direct incidents caused by the AI system, nor does it suggest a plausible future harm from the AI system itself. Instead, it highlights a governance and oversight response to prior concerns, making it Complementary Information rather than an Incident or Hazard.

Flock Camera Debate

2026-04-25
WOWO 1190 AM | 107.5 FM
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions the AI system (Flock cameras) and its use by law enforcement, indicating AI system involvement. However, the event centers on public debate and concerns about privacy and civil liberties, without any reported harm or incident caused by the AI system. There is no evidence of direct or indirect harm, nor a plausible imminent risk of harm described. The focus is on transparency, policy, and public trust issues, which aligns with Complementary Information as it relates to societal responses and governance discussions around AI use.

Florida police used Flock cameras to track No Kings protesters

2026-04-24
Pensacola News Journal
Why's our monitor labelling this an incident or hazard?
The event involves AI systems explicitly (AI-powered license plate readers) used by law enforcement to surveil and track individuals tied to protests and sensitive activities. The surveillance has led to realized harms, including violations of privacy and political rights, which are human rights violations. The article documents actual use and impact, not just potential risks. Hence, it meets the criteria for an AI Incident because the AI system's use has directly led to harm in the form of rights violations and community harm through mass surveillance and political tracking.

Flock surveillance policy gone from university website, College cites IT issue

2026-04-23
The Flat Hat
Why's our monitor labelling this an incident or hazard?
The Flock surveillance cameras are AI systems (automatic license plate readers using AI). However, the event focuses on the temporary removal of the policy page from the website due to an IT compliance issue, not on any harm caused by the AI system. There is no evidence of injury, rights violations, disruption, or other harms linked to the AI system's use or malfunction. The university's response includes transparency efforts, indicating ongoing governance and communication rather than harm. Therefore, this event is best classified as Complementary Information, as it provides context and updates related to the AI system's governance and transparency but does not report an incident or hazard.

Increasing Number of Flock Cameras Has Georgia Local Worried Over Privacy Concerns: 'How we're tracked and who benefits is inherently political'

2026-04-25
The Nerd Stash
Why's our monitor labelling this an incident or hazard?
Flock cameras are AI systems designed for surveillance and tracking, involving AI-based data processing and recognition. The article documents actual misuse and privacy violations linked to these systems, including stalking and use against immigrant populations, constituting direct harm to individuals' rights and privacy. The widespread deployment and the resulting public concern about mass surveillance and abuse of power confirm that harm is occurring. Therefore, this event qualifies as an AI Incident due to realized violations of human rights and privacy harms caused by the AI system's use.

Pensacola Police Chief Eric Winstrom defends PPD using Flock cameras

2026-04-25
Pensacola News Journal
Why's our monitor labelling this an incident or hazard?
The article involves an AI system (Flock's license plate readers with AI software) used by law enforcement. However, it does not describe any realized harm such as violations of rights, injury, or other harms directly caused by the AI system. The concerns raised are about potential privacy violations and misuse, but no incident of harm is reported. The police chief's statements about policy and oversight indicate ongoing governance efforts rather than an incident or imminent hazard. Therefore, this event is best classified as Complementary Information, providing context and updates on AI system use and governance without reporting a new incident or hazard.

Flock Safety Camera Error Turns Colorado Driver Into Frequent Police Target in Dystopian Nightmare

2026-04-26
Yahoo
Why's our monitor labelling this an incident or hazard?
The Flock Safety cameras are AI systems performing automated license plate recognition and alerting law enforcement. The incident arises from a malfunction in the system's data linkage, causing false alerts that lead to repeated police stops of an innocent individual. This constitutes indirect harm to the person's safety and freedom, fulfilling the criteria for an AI Incident. The harm is realized and ongoing, not merely potential, and stems from the AI system's malfunction and its use by law enforcement.

As AI license plate readers spread across Colorado, grassroots movements push back

2026-04-27
The Colorado Sun
Why's our monitor labelling this an incident or hazard?
The event involves the use of AI systems (ALPRs) that process images and generate searchable data about individuals' movements, which has directly led to harms such as wrongful arrests and privacy violations. The article provides concrete examples of misuse and legal challenges, indicating realized harm rather than just potential risk. The systemic nature of mass surveillance and data sharing raises significant human rights concerns. Therefore, this qualifies as an AI Incident because the AI system's use has directly or indirectly caused violations of rights and harm to communities.

Monday Morning Accounting News Brief: 990s to Get a Facelift; DOJ Gets Busy Busting Fraud | 4.27.26

2026-04-27
Going Concern
Why's our monitor labelling this an incident or hazard?
The event involves AI systems (automated license plate readers with AI capabilities) and addresses privacy and security concerns, which relate to potential violations of rights. However, the article does not report any realized harm or incident resulting from these systems, only concerns and responses. Therefore, it does not meet the criteria for an AI Incident or AI Hazard. Instead, it provides complementary information about societal and governance responses to AI-related privacy issues.

OPINION: Before expanding flock cameras, expand public oversight

2026-04-28
Dayton Daily News
Why's our monitor labelling this an incident or hazard?
The article centers on the potential risks and governance challenges associated with the use of AI-enabled ALPR camera networks but does not describe any actual harm or incident resulting from their use. It discusses plausible future harms related to privacy, data retention, and misuse of surveillance data, which could lead to violations of civil liberties if not properly managed. Therefore, the event qualifies as an AI Hazard because it plausibly could lead to an AI Incident if oversight and governance are inadequate. It is not Complementary Information because it is not an update or response to a past incident, nor is it unrelated as it clearly involves AI systems (automated license plate readers with data sharing and retention capabilities).

These AI-Powered Surveillance Cameras Are Everywhere -- and People Have Had Enough

2026-04-28
The Nation
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions AI-powered cameras and facial recognition systems used by law enforcement and private entities, which constitute AI systems. The harms described include violations of privacy, lack of legal oversight, and targeting of vulnerable groups, which are breaches of human rights and legal protections. The AI systems' development and use have directly led to these harms by enabling mass surveillance and data sharing without consent or warrants. The article also notes real-world consequences such as tracking protestors and abortion seekers, confirming that harm has materialized. Thus, this is an AI Incident rather than a hazard or complementary information.

Did your car get caught on camera? Madison County defends surveillance

2026-04-28
The Asheville Citizen Times
Why's our monitor labelling this an incident or hazard?
The article explicitly mentions an AI system (Flock Safety's automated license plate reader cameras) used by law enforcement. The concerns raised relate to privacy and potential misuse, which are plausible risks associated with AI surveillance systems. However, the article does not report any actual harm or violation caused by the AI system in this context. Instead, it documents community concerns, public meetings, and official responses, which align with the definition of Complementary Information. There is no indication of an AI Incident (harm realized) or AI Hazard (plausible future harm without current harm) as the focus is on ongoing societal and governance discussions about the technology's use and implications.

Flock Safety cameras present at Lowe's and UW-Oshkosh campus

2026-04-27
WGBA
Why's our monitor labelling this an incident or hazard?
Flock Safety cameras are AI systems used for surveillance and vehicle identification, so AI system involvement is present. However, the article does not report any harm or plausible harm caused by these systems. It mainly discusses the presence of the cameras and public opinion, without any indication of incidents or risks. Thus, the event fits the definition of Complementary Information, as it provides context and societal response to AI deployment without describing an incident or hazard.

Aspen 'holding off' on Flock cameras pending legislation

2026-04-28
Aspen Times
Why's our monitor labelling this an incident or hazard?
The event involves an AI system (Flock Safety's automated license plate readers) whose use is currently paused due to pending legislation addressing data privacy and access. While there are concerns about potential violations of constitutional rights and data misuse, the article does not report any realized harm or incident in Aspen. The focus is on the plausible future risks and regulatory responses to mitigate those risks. Therefore, this qualifies as an AI Hazard because the AI system's deployment could plausibly lead to harms such as privacy violations or unlawful surveillance if not properly regulated. It is not an AI Incident since no direct or indirect harm has yet occurred, nor is it merely complementary information or unrelated news.

What the Flock?: An Explainer on the Controversial License Plate Reading Cameras Deployed Across the Triangle

2026-04-27
INDY Week
Why's our monitor labelling this an incident or hazard?
The Flock ALPR system is an AI system that automatically captures and processes license plate and vehicle data, storing it in a centralized database accessible by multiple law enforcement agencies and private entities. The article documents direct harms including mass surveillance without individualized suspicion, privacy violations, and misuse of data for immigration enforcement and protest monitoring. These harms constitute violations of human rights and fundamental rights. The AI system's use and data sharing practices have directly led to these harms, fulfilling the criteria for an AI Incident. The article does not merely discuss potential risks or responses but details ongoing harms and misuse, distinguishing it from an AI Hazard or Complementary Information.

Man Trapped in Dystopian Nightmare Thanks to AI Surveillance Cameras Flagging His Every Move

2026-04-29
Futurism
Why's our monitor labelling this an incident or hazard?
The AI system (Flock Safety's ALPRs) is explicitly involved, as it automatically flagged the man's vehicle based on incorrect data, triggering police alerts and repeated stops. The harm is realized and ongoing, including harassment and potential safety risks, fulfilling the criteria for injury or harm to a person. The incident stems from the AI system's malfunction (erroneous data linkage) and its use in law enforcement surveillance. Hence, this is an AI Incident rather than a hazard or complementary information.

The surveillance bombardment: Lafayette teen arrested in Bay Area Flock camera vandalism spree

2026-04-29
East Bay Times
Why's our monitor labelling this an incident or hazard?
The event explicitly involves AI surveillance cameras (Flock cameras) that use AI to record images and license plates. The vandalism caused physical damage to these AI systems, including a small explosion and disruption of their operation, which constitutes harm to property and disruption of critical infrastructure (surveillance network). The AI system's presence and role are clear, and the harm is realized through vandalism. Although the harm is caused by human vandalism, it is directly linked to the AI system's use and presence. Thus, this qualifies as an AI Incident due to harm to property and disruption of operation caused by or related to the AI system.

AAUP | Remove Flock cameras and cancel the contract: It's about much more than license plates

2026-04-29
The Stanford Daily
Why's our monitor labelling this an incident or hazard?
The event involves the use of an AI-enabled system (Flock ALPR cameras) that collects and processes license plate data to track vehicle movements. This system's deployment has directly led to violations of privacy rights and has a chilling effect on freedoms of speech and assembly, which are fundamental rights. The article documents realized harms from the system's use, including unauthorized data sharing and surveillance that impacts individuals' behavior and community trust. Therefore, this qualifies as an AI Incident due to the direct harm caused by the AI system's use in mass surveillance and rights violations.

AAUP: Academic Freedom On The Line | Removing Flock cameras is about much more than license plates

2026-04-29
The Stanford Daily
Why's our monitor labelling this an incident or hazard?
The event involves AI systems in the form of automated license plate readers (ALPRs) that use AI to capture and process vehicle location data. The article details the use and misuse of this AI system, including unauthorized data sharing and surveillance concerns. While the article highlights significant potential harms related to privacy and civil liberties, it does not describe a specific incident where harm has already occurred. Instead, it emphasizes the plausible future risk and chilling effects on academic freedom and community trust. Thus, it fits the definition of an AI Hazard rather than an AI Incident or Complementary Information.

Oshkosh ends Flock Safety contract but will keep license plate readers

2026-04-29
WGBA
Why's our monitor labelling this an incident or hazard?
The article involves an AI system (automated license plate readers) and discusses its use and public concerns, but it does not describe any realized harm or incident resulting from the AI system's development, use, or malfunction. The event is about contract termination and vendor change, with public scrutiny over privacy issues, which fits the definition of Complementary Information as it provides context and societal response rather than reporting a new AI Incident or AI Hazard.

Connecticut Community Voices Concern Over Flock Safety Data

2026-04-29
Government Technology
Why's our monitor labelling this an incident or hazard?
The automated license plate cameras are AI systems used for crime detection and prevention. The concerns raised by residents and officials focus on data privacy and sharing practices, which could plausibly lead to violations of rights or harm to communities if misused. However, the article does not describe any actual harm or incident resulting from the AI system's use. The police department has taken steps to limit data sharing and audit usage, indicating ongoing governance efforts. Since no harm has materialized but plausible risks exist, this event fits the definition of an AI Hazard rather than an AI Incident or Complementary Information.

Camera crackdown? Not here: Yakima County expands Flock use

2026-04-29
AppleValleyNewsNow.com
Why's our monitor labelling this an incident or hazard?
The article explicitly involves AI systems in the form of Flock's ALPR and video surveillance cameras, which use AI to read license plates and analyze video feeds. The event concerns the use and expansion of these AI systems by law enforcement. While there is no direct evidence of harm occurring, the concerns about privacy, data sharing with federal agencies without local knowledge, and the potential for misuse of surveillance data indicate plausible future harms related to violations of privacy rights and possibly other human rights. Since no actual harm is reported, but plausible harm is credible, the event fits the definition of an AI Hazard rather than an AI Incident. It is not merely complementary information because the main focus is on the ongoing use and expansion of AI surveillance with associated risks, not on responses or updates to past incidents.

Framingham Community Members Call for Review of City's Contract with Flock Safety | The Frame

2026-04-29
accessfram.tv
Why's our monitor labelling this an incident or hazard?
The article describes an AI system (Flock Safety's ALPR cameras) actively used by law enforcement, which involves AI in its operation. The concerns raised by the community about data sharing, privacy, and surveillance represent potential risks of harm, but no actual incident of harm is reported. The event centers on community calls for review and oversight before contract renewal, highlighting plausible future harms rather than realized ones. Thus, it fits the definition of an AI Hazard rather than an AI Incident or Complementary Information.

Not a one-off: More Coloradans stopped by police after data errors triggered Flock alerts

2026-05-01
9NEWS
Why's our monitor labelling this an incident or hazard?
The event explicitly involves AI systems (Flock cameras using AI for license plate recognition and hotlist alerts) whose malfunction or misuse (due to human data entry errors in police databases) directly led to wrongful police stops and harassment of innocent people. This caused harm to individuals (stress, harassment, potential risk during police stops) and implicates violations of rights and harms to communities. The harm is realized, not just potential, and the AI system's role is pivotal in triggering police actions based on faulty data. Therefore, this qualifies as an AI Incident.
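The failure mode described in this rationale can be illustrated with a minimal sketch. This is not Flock's actual implementation; all names and data are hypothetical. The point is that a plate mistakenly entered on a hotlist matches on every subsequent camera sighting, so a single data-entry error produces repeated alerts (and repeated stops) until the underlying record is corrected.

```python
# Hypothetical sketch of hotlist matching in an ALPR-style alert pipeline.
# A plate entered in error ("ABC1234") keeps triggering alerts on every sighting.

HOTLIST = {"ABC1234"}  # stolen-vehicle plates; imagine this entry is a typo


def check_sighting(plate: str) -> bool:
    """Return True if a camera sighting of this plate should alert police."""
    return plate in HOTLIST


# The innocent driver's plate matches the erroneous record on every pass.
sightings = ["ABC1234", "XYZ9876", "ABC1234", "ABC1234"]
alerts = [p for p in sightings if check_sighting(p)]
print(len(alerts))  # 3 alerts from one bad record
```

Because the match itself is working as designed, no malfunction is visible to the system; only correcting or auditing the source record stops the alerts, which is why the audit measures mentioned elsewhere in this listing matter.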

Oakland County Commissioner pushes 12-month moratorium on Flock surveillance tech after drone vote

2026-05-01
WDIV
Why's our monitor labelling this an incident or hazard?
The Flock Safety drones involve AI systems for surveillance and real-time video analysis, which qualifies as AI system involvement. However, the article does not describe any realized harm such as privacy violations, misuse, or other direct or indirect harms resulting from the AI system's deployment. The concerns expressed by residents and the commissioner's proposal for a moratorium indicate a plausible risk of harm in the future, particularly regarding privacy and data security. Therefore, this event fits the definition of an AI Hazard, as it highlights a credible potential for harm from the use of AI surveillance technology, but no actual incident has occurred yet.

Flock Safety Employees Watched Kids' Gymnastics Room to Pitch Surveillance Tech

2026-04-30
Gadget Review
Why's our monitor labelling this an incident or hazard?
The event explicitly involves an AI surveillance system used by Flock Safety. The employees' unauthorized or ethically questionable use of live camera feeds for sales purposes directly led to harm in terms of privacy violations and ethical breaches involving children, which falls under violations of human rights and harm to communities. The misuse of AI surveillance technology in this way meets the criteria for an AI Incident because the AI system's use directly led to harm. The event is not merely a potential hazard or complementary information but a realized incident of harm due to AI misuse.

Colorado's Flock Cameras: Catching Kidnappers, Harassing Innocents, Sparking Privacy Uprising

2026-04-30
WebProNews
Why's our monitor labelling this an incident or hazard?
The event involves an AI system (Flock Safety's ALPR) that uses automated scanning and data analysis to generate law enforcement alerts. The system's malfunction (data entry errors causing wrongful identification) and use (surveillance and data sharing practices) have directly led to harms including repeated wrongful stops, privacy infringements, and public distrust. These constitute violations of rights and harms to communities, fitting the definition of an AI Incident. The article also discusses legislative and societal responses, but the primary focus is on realized harms caused by the AI system's deployment and errors, not just potential or complementary information.