TL;DR
West Midlands Police say Microsoft’s Copilot produced a false match result that was included in an intelligence report. The erroneous entry, which referenced a non-existent West Ham v Maccabi Tel Aviv game, contributed to a ban on Maccabi fans at a Europa League fixture.
What happened
Craig Guildford, chief constable of West Midlands Police, acknowledged in a letter to the Home Affairs Committee that an intelligence report contained an incorrect match result generated by Microsoft’s Copilot AI assistant. The fabricated entry described a West Ham v Maccabi Tel Aviv fixture that never took place, yet it made its way into the force’s report. That intelligence material was cited when Maccabi Tel Aviv supporters were banned from a Europa League match against Aston Villa in November, after the Birmingham Safety Advisory Group assessed the fixture as “high risk,” citing prior violent clashes and hate crime offences at a separate Maccabi fixture in Amsterdam. Guildford had earlier told MPs that the error stemmed from social media scraping, not AI; his recent letter instead links the mistake to Copilot. The Verge reports that Microsoft’s interface itself warns users that “Copilot may make mistakes.” Microsoft did not respond to requests for comment in time for publication.
Why it matters
- AI-generated errors can influence real-world policing and public safety decisions when incorporated into official reports.
- The incident raises questions about verification processes for intelligence that relies on automated assistants.
- Public trust in law enforcement and event safety decisions can be undermined by demonstrable AI hallucinations.
- It underscores the need for clear accountability when third-party AI tools are used in sensitive operational contexts.
Key facts
- Craig Guildford, chief constable of West Midlands Police, said the erroneous match result arose from use of Microsoft Copilot.
- Copilot produced a fabricated West Ham v Maccabi Tel Aviv match that did not occur but was included in an intelligence report.
- The intelligence report contributed to a ban on Maccabi Tel Aviv fans attending a Europa League match against Aston Villa in November.
- The Birmingham Safety Advisory Group had classified that fixture as high risk after violent clashes and hate crime offences at a prior Maccabi match in Amsterdam.
- Guildford had previously told MPs in December that the mistake resulted from social media scraping, not AI.
- Microsoft’s Copilot interface carries a warning that “Copilot may make mistakes.”
- The Verge noted prior testing that found Copilot sometimes produced incorrect information.
- Microsoft did not respond to requests for comment before the story was published.
What to watch next
- Any follow-up actions or policy changes from West Midlands Police regarding AI use in intelligence — not confirmed in the source.
- Whether the Home Affairs Committee seeks further evidence or hearings after receiving the chief constable’s letter — not confirmed in the source.
- A public response or explanation from Microsoft about why Copilot generated the fabricated match (Microsoft had not replied by publication).
Quick glossary
- AI hallucination: When an artificial intelligence system confidently generates information that is false or not grounded in any real source.
- Copilot: Microsoft’s branded AI assistant that can generate text and summaries for users; its interface warns it can make mistakes.
- Intelligence report: A document used by authorities to assess risks and inform operational or public-safety decisions, often drawing on multiple sources.
- Safety Advisory Group: A local body that assesses risks for public events and advises on measures such as policing and spectator restrictions.
Reader FAQ
Did Copilot invent the West Ham v Maccabi Tel Aviv match?
According to the chief constable’s letter, as reported by The Verge, Copilot generated a result for a match that never occurred, and that item was included in the intelligence report.
Were Maccabi Tel Aviv fans banned because of that error?
The intelligence report that included the fabricated match was cited in decisions that led to a ban on Maccabi fans at a Europa League match; the Birmingham Safety Advisory Group had already deemed the fixture high risk after earlier incidents.
Did West Midlands Police initially admit using AI?
No. In December, Guildford told MPs that the error came from social media scraping, not AI; his later letter attributed the erroneous result to Copilot.
Has Microsoft explained why Copilot made the error?
Microsoft did not respond to requests for comment in time for publication.
Will there be disciplinary or policy changes following this incident?
Not confirmed in the source.

Sources
- UK police blame Microsoft Copilot for intelligence mistake
- Maccabi police chief on brink after using AI to ban fans
- West Midlands Police apologise for using AI in 'dodgy …
- Who is West Midlands Police Chief Constable Craig …
Related posts
- Large language models: the latest chapter in a 400‑year confidence trick
- Windows 2000 stalls a coastal Portuguese ticket kiosk after memory error
- vLLM large-scale serving hits 2.2k tok/s per H200 with Wide-EP