AI Hallucination in Police Report Leads to Fan Ban and Public Apology


The information displayed in the AIM should not be reported as representing the official views of the OECD or of its member countries.

West Midlands Police used Microsoft's Copilot AI tool to draft a report containing false information, which led to Maccabi Tel Aviv fans being banned from a football match in Birmingham. The AI-generated inaccuracies prompted a public apology, suspension of the AI tool, and an official review into the incident.[AI generated]

Why's our monitor labelling this an incident or hazard?

The event clearly involves an AI system (Microsoft Copilot) whose malfunction (hallucination) led to inaccuracies in an official police report. This report influenced a decision that harmed a community (Maccabi supporters) by banning them from attending a match based on false information, which constitutes harm to communities and a breach of trust. The police chief's apology and suspension of the AI tool confirm the AI's role in the incident. Therefore, this qualifies as an AI Incident because the AI system's malfunction directly led to harm.[AI generated]
AI principles
Accountability; Robustness & digital security

Industries
Government, security, and defence

Affected stakeholders
General public

Harm types
Reputational; Psychological; Human or fundamental rights

Severity
AI incident

Business function
Compliance and justice

AI system task
Content generation


Articles about this incident or hazard


Police chief 'absolutely determined' to learn lessons from Maccabi...

2026-03-04
Daily Mail Online

Police chief 'absolutely determined' to learn lessons from Maccabi away fans ban

2026-03-04
Express & Star
Why's our monitor labelling this an incident or hazard?
The event clearly involves an AI system (Microsoft Copilot) whose malfunction (hallucination) led to inaccuracies in an official police report. These inaccuracies contributed to a decision that harmed the Maccabi fans and damaged public trust, which constitutes harm to communities and a violation of rights. The police chief's apology and suspension of the AI tool confirm the AI's role in the incident. Therefore, this qualifies as an AI Incident because the AI system's malfunction directly led to realized harm.

West Midlands Police chief 'absolutely determined' to learn lessons from Maccabi away fans ban

2026-03-04
Express & Star
Why's our monitor labelling this an incident or hazard?
The event clearly involves an AI system (Microsoft Copilot) whose malfunction (hallucination) led to inaccuracies in an official police report. These inaccuracies contributed to a decision that harmed community trust and possibly violated rights, fulfilling the criteria for an AI Incident. The police force's response and suspension of the AI tool are complementary information, but they do not negate the fact that harm occurred due to the AI system's malfunction. Therefore, this event qualifies as an AI Incident.

Police chief 'absolutely determined' to learn lessons from Maccabi away fans ban

2026-03-04
The Irish News
Why's our monitor labelling this an incident or hazard?
The event clearly involves an AI system (Microsoft Copilot) whose malfunction (a hallucination producing false information) directly led to harm, including misinformation that affected public safety decisions and community relations. This meets the criteria for an AI Incident because the AI system's malfunction directly caused harm to communities and public trust, and investigations and remedial measures are ongoing. The event is not merely a potential hazard or complementary information, but a realized incident involving AI-related harm.

Police chief 'absolutely determined' to learn lessons from Maccabi away fans ban

2026-03-04
Shropshire Star
Why's our monitor labelling this an incident or hazard?
The event clearly involves an AI system (Microsoft Copilot) whose malfunction (hallucination) led to inaccuracies in an official police report. This report influenced a decision that harmed a community (Maccabi fans) and damaged public trust, which fits the definition of an AI Incident as a violation of rights and harm to communities. The police's suspension of the AI tool and ongoing investigations are responses to the incident; the primary event is the AI-related harm itself, not the response or complementary information.