8 Sources
[1]
Deny, deny, admit: UK police used Copilot AI "hallucination" when banning football fans
After repeatedly denying for weeks that his force used AI tools, the chief constable of the West Midlands police has finally admitted that a hugely controversial decision to ban Maccabi Tel Aviv football fans from the UK did involve hallucinated information from Microsoft Copilot.

In October 2025, Birmingham's Safety Advisory Group (SAG) met to decide whether an upcoming football match between Aston Villa (based in Birmingham) and Maccabi Tel Aviv could be held safely. Tensions were heightened in part due to an October 2 terror attack against a synagogue in Manchester where several people were killed by an Islamic attacker. West Midlands Police, who were a key member of the SAG, argued that the upcoming football match could lead to violence in Birmingham, and they recommended banning fans from the game. The police pointed specifically to claims that Maccabi Tel Aviv fans had been violent in a recent football match in Amsterdam.

This decision was hugely controversial, and it quickly became political. To some Jews and conservatives, it looked like Jewish fans were being banned from the match even though Islamic terror attacks were the more serious source of violence. The football match went ahead on November 6 without fans, but the controversy around the ban has persisted for months.

Making it worse was the fact that the West Midlands Police narrative rapidly fell apart. According to the BBC, police claimed that the Amsterdam football match featured "500-600 Maccabi fans [who] had targeted Muslim communities the night before the Amsterdam fixture, saying there had been 'serious assaults including throwing random members of the public' into a river. They also claimed that 5,000 officers were needed to deal with the unrest in Amsterdam, after previously saying that the figure was 1,200." Amsterdam police made clear that the West Midlands account of bad Maccabi fan behavior was highly exaggerated, and the BBC recently obtained a letter from the Dutch inspector general confirming that the claims were inaccurate.

But it was one flat-out error -- a small one, really -- that has made the West Midlands Police recommendation look particularly shoddy. In a list of recent games with Maccabi Tel Aviv fans present, the police included a match between West Ham (UK) and Maccabi Tel Aviv. The only problem? No such match occurred. So where had this completely fantasized detail come from?

As an inquiry into the whole situation was mounted, Craig Guildford, the chief constable of the West Midlands Police, was hauled before Parliament in December 2025 and again in early January 2026 to answer questions. Both times, he claimed the police did not use AI -- the obvious suspect in a case like this. In December, Guildford blamed "social media scraping" gone wrong; in January, he chalked it up to some bad Googling.

"We do not use AI," he told Parliament on January 6. "On the West Ham side of things and how we gained that information, in producing the report, one of the officers would usually go to... a system, which football officers use all over the country, that has intelligence reports of previous games. They did not find any relevant information within the searches that they made for that. They basically Googled when the last time was. That is how the information came to be."

But Guildford admitted this week that this explanation was, in fact, bollocks. As he acknowledged in a letter on January 12, "I [recently] became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot." He had not intended to deceive anyone, he added, saying that "up until Friday afternoon, [I] understood that the West Ham match had only been identified through the use of Google."

This has made a bad situation even worse. Today, in the House of Commons, Home Secretary Shabana Mahmood gave a long statement on the case in which she threw Guildford under the bus and backed over him five or six times. Mahmood blamed the ban on "confirmation bias" by the police. She said the Amsterdam stories they used were "exaggerated or simply untrue." And she highlighted the fact that Guildford claimed "AI tools were not used to prepare intelligence reports," but now "AI hallucination" was said to be responsible. The whole thing was a "failure of leadership," and Guildford "no longer has my confidence," she said.

This last bit was something that everyone in the UK appears to agree on. Conservatives want Guildford to go, too, with party leaders calling for his resignation. MP Nick Timothy has been ranting for days on X about the issue, especially the fact that hallucination-prone AI tools are being used to produce security decisions. "More detail on the misuse of AI by the police," he wrote today. "They didn't just deny it to the home affairs committee. They denied it in FOI requests. They said they have no AI policy. So officers are using a new, unreliable technology for sensitive purposes without training or rules."
[2]
UK police blame Microsoft Copilot for intelligence mistake
The chief constable of one of Britain's largest police forces has admitted that Microsoft's Copilot AI assistant made a mistake in a football (soccer) intelligence report. The report, which led to Israeli football fans being banned from a match last year, included a non-existent match between West Ham and Maccabi Tel Aviv. Copilot hallucinated the game and West Midlands Police included the error in its intelligence report.

"On Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot [sic]," says Craig Guildford, chief constable of West Midlands Police, in a letter to the Home Affairs Committee earlier this week. Guildford previously denied in December that the West Midlands Police had used AI to prepare the report, blaming "social media scraping" for the error.

Maccabi Tel Aviv fans were banned from a Europa League match against Aston Villa in November last year, because the Birmingham Safety Advisory Group deemed the match "high risk" after "violent clashes and hate crime offences" at a previous Maccabi match in Amsterdam.

As Microsoft warns at the bottom of its Copilot interface, "Copilot may make mistakes." This is a pretty high profile mistake, though. We tested Copilot recently and my colleague Antonio G. Di Benedetto found that Microsoft's AI assistant often "got things wrong" and "made stuff up." We've reached out to Microsoft to comment on why Copilot made up a football match that never existed, but the company didn't respond in time for publication.
[3]
Police chief admits misleading MPs over AI role in incorrect report that led to Israeli fans' ban
Here's Chief Constable Craig Guildford's letter to Karen Bradley, chair of the Home Affairs Select Committee:

Dear Dame Karen Bradley,

I write further to my appearance at the HAC on the 1st December 2025 and 6th January 2026 and in relation to the questions raised by the Committee concerning the provenance of the inclusion of the West Ham v Maccabi Tel Aviv match in the WMP report to the Birmingham City Council Safety Advisory Group.

In preparation for the force response to the HMICFRS inquiry into this matter, on Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot. Both ACC O'Hara and I had, up until Friday afternoon, understood that the West Ham match had only been identified through the use of Google. This will be further explained in the additional material being provided to the Committee.

I would like to offer my profound apology to the Committee for this error, both on behalf of myself and that of ACC O'Hara. I had understood and been advised that the match had been identified by way of a Google search in preparation for attending HAC.
[4]
British Police Used Microsoft Copilot for Faulty Report That Led to Ban on Soccer Team Fans
A police force in the United Kingdom has admitted to relying on Microsoft's AI assistant Copilot in the fabrication of a faulty intelligence report. Late last year, the Israeli soccer team Maccabi Tel Aviv played against the British club Aston Villa in a game in Birmingham, U.K. Prior to the game, a Birmingham safety committee decided to not allow Maccabi Tel Aviv fans to attend the game, basing the decision on an intelligence report from West Midlands police deeming the match high-risk for hooliganism. The details of that report were later heavily contested by government officials.

Now, West Midlands police chief constable Craig Guildford has admitted that his subordinates relied on Microsoft Copilot to fabricate the report and failed to fact-check its findings. Specifically, the report referred to a match between Maccabi Tel Aviv and the British soccer team West Ham that was completely hallucinated by Copilot. The two teams have never played against each other, and on the day of the imaginary game, West Ham had another game against the Greek team Olympiacos, according to the U.K. Parliament's Home Affairs Committee.

"On Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot," Guildford wrote in a letter to the Home Affairs Committee, after weeks of denying the use of AI.

Artificial intelligence is far from a perfect technology. It is still particularly prone to hallucinations, and that tendency can fuel the spread of misinformation with real-life consequences; the West Midlands intelligence report incident is far from the first example. Back in October, major consulting firm Deloitte had to pay the Australian government a partial refund after delivering them a $290,000 AI-generated report riddled with fake academic research papers and court judgments.

But despite the shortcomings of AI, American big tech companies have been engaged in a full-blown effort to deploy the technology far and wide, across all parts of the workforce. "America needs to be the most aggressive in adopting AI technology of any country in the world, bar none, and that is an imperative," Nvidia CEO Jensen Huang said at a press and industry briefing in October. "We have to encourage every single company, every single student, to use AI." Huang has also recently gone on to say that talking about the risks of AI deployment was "hurtful" and "not helpful to society."

Microsoft is also aggressive about the future of AI in the workforce. The company made AI use mandatory for its employees and markets Copilot as an AI assistant to boost productivity in the workplace. The Copilot technology is widely deployed by a range of companies across the American corporate world. As of late last year, Copilot is also used by the U.S. House of Representatives.
[5]
West Midlands police chief apologises after AI error used to justify Maccabi Tel Aviv ban
Craig Guildford says he gave incorrect evidence to MPs and mistake arose from 'use of Microsoft Copilot'

The chief of West Midlands police has apologised to MPs for giving them incorrect evidence about the decision to ban Maccabi Tel Aviv football fans, saying it had been produced by artificial intelligence (AI). Craig Guildford told the home affairs select committee on Monday that the inclusion of a fictitious match between Maccabi Tel Aviv and West Ham in police intelligence "arose as a result of a use of Microsoft Copilot".

The chief constable had previously told MPs that the force did not use AI and that the mistake regarding the West Ham match, which had never taken place, was made by "one individual doing one Google search".

It comes as the home secretary, Shabana Mahmood, prepares to make a statement to MPs about the findings of a report by His Majesty's Inspectorate of Constabulary into the decision to ban Maccabi Tel Aviv fans from attending a Europa League match against Aston Villa in November. The fake West Ham match was part of intelligence that was presented to the council-led safety advisory group, which made the decision to ban away fans.

In an email to the home affairs select committee published on Wednesday, Guildford said he would like to offer his "profound apology" for the error. "I had understood and been advised that the match had been identified by way of a Google search in preparation for attending HAC. My belief that this was the case was honestly held and there was no intention to mislead the committee."
[6]
UK police chief blames AI for error in evidence over Maccabi fan ban
London (AFP) - An under-fire UK police chief on Wednesday blamed the use of AI for erroneous evidence given to MPs over the decision to ban Maccabi Tel Aviv football fans from a match against Aston Villa.

The police classified the match in Birmingham in November as "high risk", citing previous Maccabi games including a Europa League encounter in Amsterdam which sparked clashes between locals and Israeli fans. The decision to ban Maccabi fans from the Villa Park UEFA Europa League game was slammed by Prime Minister Keir Starmer, with the government ordering an independent report which is due to be published later on Wednesday.

Scrutiny has increased on West Midlands police after multiple pieces of evidence used to justify the decision proved flawed, with the force rejecting allegations that the move was motivated by politics rather than fan safety. In an intelligence report for the game, police cited a match between West Ham and Maccabi Tel Aviv -- which never took place.

When questioned about this by lawmakers earlier this month, chief constable Craig Guildford insisted that the error was the result of a Google search, and that the force had not used artificial intelligence in its research. However, in a letter to MPs on Wednesday, Guildford admitted that the erroneous information was due to the use of Microsoft Co Pilot, an AI chatbot. "I would like to offer my profound apology to the committee for this error," Guildford said, adding that there was "no intention to mislead the committee".

That came after UK media reported in December that Dutch police also disputed evidence cited by the West Midlands force to justify the ban. UK police claimed they were told that Maccabi fans were behind several violent incidents during the 2024 Amsterdam clashes -- but that intelligence was partly contradicted by Dutch politicians and police. The decision to ban the fans was also sharply criticised by Israeli politicians, who claimed that it was antisemitic.

British interior minister Shabana Mahmood will present the findings of the independent inquiry on Wednesday, which could heap further pressure on Guildford after opposition leader Kemi Badenoch called for his resignation over the debacle. The Guardian newspaper reported that the watchdog report is set to say that the force made a series of errors in how it gathered and handled intelligence while making the decision.

The match went ahead amid heavy security, but without Maccabi fans after the Israeli team turned down their away ticket allocation.
[7]
UK police apologise to parliament after admitting a major and controversial policy decision was based on AI making stuff up
The Chief Constable of West Midlands Police in the United Kingdom has apologised to members of parliament after admitting that a major policing decision had been made "as result of a use of Microsoft Co Pilot [sic]".

The decision was taken in November 2025 to ban fans from the Israeli team Maccabi Tel Aviv from attending a Europa League match against the Birmingham-based team Aston Villa. It was hugely controversial at the time, with even Prime Minister Keir Starmer questioning the move and calling it "wrong", and MPs have continued digging into why it was made. The Chief Constable, Craig Guildford, had previously insisted the police "do not use AI" but that a Google search had provided erroneous information used in a report that led to the ban: notably, the inclusion of a match between Maccabi Tel Aviv and West Ham that never happened.

Now Guildford has written to Karen Bradley, chair of the Home Affairs Select Committee, admitting that AI was used in assembling the report: "In preparation for the force response to the HMICFRS inquiry into this matter, on Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot," writes Guildford. "Both ACC O'Hara and I had, up until Friday afternoon, understood that the West Ham match had only been identified through the use of Google [...] I would like to offer my profound apology to the Committee for this error, both on behalf of myself and that of ACC O'Hara."

Guildford had previously been grilled by MPs about the inclusion of the non-existent match on December 1, 2025, at which time he said: "Within my narrative, which I have compiled over the weekend, the one assertion in relation to West Ham is completely wrong.

"I am told that is a result of some social media scraping that was done, and that is wrong. That was one element in a document that was eight or nine pages long, but we stand by the key tenets in the document."

Paul Kohler MP asked Guildford at that December hearing if the force just "did an AI search" and "whacked it into the [report]", to which Guildford responded: "No, not at all. We do a very comprehensive assessment."

On 6 January Guildford repeated the police denials. "The summation in the House -- it was a question that was asked in the House -- was that West Midlands Police may have used AI on this particular occasion," said Guildford. "We do not use AI."

Not done with digging a hole on that occasion, Guildford went on to waffle about "a system, which football officers use all over the country, that has intelligence reports of previous games", adding that "I am told that they just did a Google search on that [previous fixtures] because they could not find it in the normal system."

To briefly pause and re-focus on the topic at hand, the controversy surrounds a decision taken by Birmingham's Safety Advisory Group, which is led by West Midlands Police and Birmingham City Council, to deem the November 6 match between Maccabi Tel Aviv and Aston Villa "high risk." As a result of this Tel Aviv fans were banned, a decision that was decried as antisemitic by some senior politicians. Police at the time pointed to previous incidents at Maccabi Tel Aviv games, including fan violence during a 2024 Europa League match in Amsterdam with Ajax, to justify the decision. The Dutch authorities have questioned the UK police's interpretation of the causes of violence at the match.

But conflicting opinions over the Ajax game are one thing. It is the use of a match that never actually happened to inform this decision that is truly shocking, and it raises massive questions over the competence and command structure of the UK police. Copilot can't get basic facts right: how on Earth could it be used to inform a decision of this magnitude, and one that was always sure to attract scrutiny?

"Another day, another confession from West Midlands Police," said Conservative MP Nick Timothy on X. "Despite denials at two separate hearings, it turns out they did use AI to produce their dodgy 'intelligence' dossier. Their account of their conduct in getting Israeli fans banned from Villa Park continues to unravel."

His Majesty's Inspector of Constabulary (HMIC), Sir Andy Cooke, is due to write to the home secretary about the matter later today. That same home secretary is one of those who questioned the initial decision. And rumbling beneath it all is the toxic suggestion of antisemitism being involved in the decision. The government's independent adviser on antisemitism, Lord Mann, says: "What on earth were they doing using AI to create an untruth to back their case." He called on Chief Constable Guildford to resign and for West Midlands Police to be put under special measures by the police inspectorate.

The intelligence report referring to the non-existent match between Maccabi Tel Aviv and West Ham has not been published, though it has been quoted in parliament. "Early on in the intelligence report, it says: 'The most recent match Maccabi played in the UK was against West Ham in the Europa Conference League on 9 November 2023. This was part of the '23-24 European campaign. It marked Maccabi Tel Aviv's last competitive appearance on UK soil to date,'" said Lord Mann. "That is in the intelligence report, but that did not happen. West Ham have never played Maccabi Tel Aviv. On that day, West Ham played Olympiakos of Greece and beat them 1-0. I think Tel Aviv were playing a Ukrainian team somewhere."

The UK government has invested billions in all sorts of AI measures, and is one of those that believes it's going to somehow revolutionise the country and change the way it does everything. It's easy to look at the last year of AI humiliation, or software companies complaining that nobody likes it, and hand-wave the AI boosters away. But an incident like this, where a major political decision was made and parliament was misled as a result, surely has to be some sort of canary in the coal mine.
[8]
Police chief apologises for AI error that helped form Maccabi Tel Aviv fan ban decision
West Midlands Police's chief constable has apologised to MPs for giving them incorrect evidence over the decision to ban Maccabi Tel Aviv fans, as the Home Secretary is set to address Parliament. Force leaders have been under fire over the decision to ban supporters of the Israeli football team from attending a Europa League match against Aston Villa on Nov. 6.

Chief constable Craig Guildford wrote to the Home Affairs Committee to apologise for the mistake, after he appeared twice to give evidence over the controversy. In a letter to committee chairwoman Dame Karen Bradley, the senior police figure said that the evidence he and Assistant Chief Constable Mike O'Hara gave to the committee, that the wrong intelligence over a West Ham match with Maccabi Tel Aviv was because of a Google search, was incorrect. Instead, the "erroneous result" arose from the use of the artificial intelligence tool Microsoft Co Pilot.

Mr Guildford wrote: "Both ACC O'Hara and I had, up until Friday afternoon, understood that the West Ham match had only been identified through the use of Google.

"I would like to offer my profound apology to the Committee for this error, both on behalf of myself and that of ACC O'Hara.

"I had understood and been advised that the match had been identified by way of a Google search in preparation for attending HAC.

"My belief that this was the case was honestly held and there was no intention to mislead the Committee."

During the select committee hearing on January 6, MPs asked Mr Guildford if any artificial intelligence had been used in the force's process. He said: "There was a definite note that we've got to the bottom of in terms of the West Ham game.

"The summation, I think in the House, it was a question that was asked in the House was that, you know, you've used the AI, or West Midlands may have used AI on this particular occasion.

"We don't do that. We don't use the AI."

The police chief has faced mounting pressure and calls to resign over the ban. Maccabi Tel Aviv fans were barred from travelling to the game at Villa Park by the local Safety Advisory Group (SAG), which cited safety concerns based on advice from the police force. This included a reference by the force to a match between the Israeli club and West Ham United that never happened.

The decision by the SAG -- which is made up of representatives from the council, police and other authorities -- sparked political outrage, including from Prime Minister Sir Keir Starmer. Since then, doubts have been growing over the intelligence used by police, including disputes over the accuracy of information. Mr Guildford has insisted the decision was not politically influenced.

It comes as Home Secretary Shabana Mahmood will make a statement to MPs on Wednesday after she ordered an investigation into the move to be carried out by His Majesty's Inspectorate of Constabulary and Fire and Rescue Services. According to The Guardian, the report from chief inspector of constabulary Sir Andy Cooke will say West Midlands Police made a series of errors in how it gathered and handled intelligence.

A Home Office spokesperson said: "The Home Secretary has this morning received the Chief Inspectorate's findings into the recommendation by West Midlands Police to ban Maccabi Tel Aviv fans from attending a match against Aston Villa.

"She will carefully consider the letter and will make a statement in the House of Commons in response later today."

The power to sack Mr Guildford lies with West Midlands police and crime commissioner Simon Foster, who has said he will formally review evidence on decision-making around the ban.
After weeks of denials, West Midlands Police Chief Constable Craig Guildford admitted his force used Microsoft Copilot to produce the faulty intelligence report that led to Maccabi Tel Aviv fans being banned from a match. The AI hallucinated a non-existent game, and Guildford repeatedly misled Parliament about the force's use of AI. Home Secretary Shabana Mahmood called it a leadership failure and withdrew her confidence in the chief constable.
After repeatedly denying for weeks that his force used AI tools, Chief Constable Craig Guildford of the West Midlands Police has finally admitted that a controversial decision to impose a ban on football fans involved an AI hallucination from Microsoft Copilot. The admission came in a letter to the Home Affairs Committee on January 12, 2026, after Guildford had twice testified before Parliament that no AI technology was involved in producing the intelligence reports. [3]
In October 2025, Birmingham's Safety Advisory Group met to assess whether an upcoming Europa League football match between Aston Villa and Maccabi Tel Aviv could proceed safely. West Midlands Police, a key member of the group, argued the match posed a violence risk and recommended banning fans from attending. The force pointed to claims that Maccabi Tel Aviv fans had been violent at a recent match in Amsterdam, citing exaggerated accounts that Amsterdam police later disputed. [1]
The faulty intelligence report included a critical error that would ultimately unravel the entire case. Among a list of recent games involving Maccabi Tel Aviv fans, West Midlands Police cited a match between West Ham and Maccabi Tel Aviv. The problem was stark: no such match ever occurred. On the date of this imaginary game, West Ham was actually playing against Greek team Olympiacos. [2][4]
This AI hallucination became the smoking gun in a major scandal over sensitive security decisions. Microsoft warns at the bottom of its Copilot interface that "Copilot may make mistakes," but this particular error had real-world consequences affecting public safety and community relations. [2]
When questioned by Parliament in December 2025 and again on January 6, 2026, Guildford repeatedly denied that West Midlands Police used AI technology. In December, he blamed "social media scraping" for the error. In January, he told MPs: "We do not use AI. On the West Ham side of things and how we gained that information, in producing the report, one of the officers would usually go to... a system, which football officers use all over the country. They basically Googled when the last time was." [1]
But on Friday afternoon before sending his letter, Guildford learned the truth. "I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot," he wrote to the committee, offering a "profound apology" for providing incorrect police evidence. He insisted he had "no intention to mislead the committee" and that his belief was "honestly held." [3]
The revelation triggered a political firestorm. Home Secretary Shabana Mahmood delivered a scathing statement in the House of Commons, calling the Maccabi Tel Aviv ban a "failure of leadership" driven by "confirmation bias." She highlighted that Guildford had claimed "AI tools were not used to prepare intelligence reports," but now blamed an AI hallucination for the error. Mahmood declared that Guildford "no longer has my confidence," effectively calling for his removal. [1]
Conservatives joined the chorus demanding Guildford's resignation. MP Nick Timothy has been vocal about the dangers of using hallucination-prone AI tools for sensitive security decisions, noting that officers denied AI use not just to the Home Affairs Committee but also in Freedom of Information requests. "They said they have no AI policy. So officers are using a new, unreliable technology for sensitive purposes," he wrote. [1]
This incident mirrors other high-profile AI failures. In October, consulting firm Deloitte had to issue the Australian government a partial refund after delivering a $290,000 AI-generated report riddled with fake academic research and court judgments. [4]
The controversy raises urgent questions about how AI tools are being deployed across law enforcement and government without adequate oversight or fact-checking protocols. Microsoft Copilot is widely used across the American corporate world and was even adopted by the U.S. House of Representatives as of late last year. [4]
The case highlights the risks of confirmation bias when authorities use unreliable AI technology to justify predetermined conclusions. West Midlands Police presented exaggerated or false information about the Amsterdam incidents, claims that Dutch authorities disputed, while failing to verify basic facts produced by Microsoft Copilot. The combination of misinformation and unchecked AI output created a perfect storm that led to what many viewed as discriminatory treatment of Israeli fans in the wake of the Manchester synagogue attack. [1]
As His Majesty's Inspectorate of Constabulary investigates the decision-making process, the incident serves as a stark warning about deploying AI assistants for intelligence reports without clear policies, training, or verification procedures. The match went ahead on November 6 without fans, but the fallout continues to reverberate through British policing and government.
Summarized by Navi