10 Sources
[1]
Deny, deny, admit: UK police used Copilot AI "hallucination" when banning football fans
After repeatedly denying for weeks that his force used AI tools, the chief constable of the West Midlands police has finally admitted that a hugely controversial decision to ban Maccabi Tel Aviv football fans from the UK did involve hallucinated information from Microsoft Copilot.

In October 2025, Birmingham's Safety Advisory Group (SAG) met to decide whether an upcoming football match between Aston Villa (based in Birmingham) and Maccabi Tel Aviv could be held safely. Tensions were heightened in part due to an October 2 terror attack against a synagogue in Manchester where several people were killed by an Islamic attacker. West Midlands Police, who were a key member of the SAG, argued that the upcoming football match could lead to violence in Birmingham, and they recommended banning fans from the game. The police pointed specifically to claims that Maccabi Tel Aviv fans had been violent in a recent football match in Amsterdam.

This decision was hugely controversial, and it quickly became political. To some Jews and conservatives, it looked like Jewish fans were being banned from the match even though Islamic terror attacks were the more serious source of violence. The football match went ahead on November 6 without fans, but the controversy around the ban has persisted for months.

Making it worse was the fact that the West Midlands Police narrative rapidly fell apart. According to the BBC, police claimed that the Amsterdam football match featured "500-600 Maccabi fans [who] had targeted Muslim communities the night before the Amsterdam fixture, saying there had been 'serious assaults including throwing random members of the public' into a river. They also claimed that 5,000 officers were needed to deal with the unrest in Amsterdam, after previously saying that the figure was 1,200."
Amsterdam police made clear that the West Midlands account of bad Maccabi fan behavior was highly exaggerated, and the BBC recently obtained a letter from the Dutch inspector general confirming that the claims were inaccurate. But it was one flat-out error -- a small one, really -- that has made the West Midlands Police recommendation look particularly shoddy. In a list of recent games with Maccabi Tel Aviv fans present, the police included a match between West Ham (UK) and Maccabi Tel Aviv. The only problem? No such match occurred.

So where had this completely fantasized detail come from? As an inquiry into the whole situation was mounted, Craig Guildford, the chief constable of the West Midlands Police, was hauled before Parliament in December 2025 and again in early January 2026 to answer questions. Both times, he claimed the police did not use AI -- the obvious suspect in a case like this. In December, Guildford blamed "social media scraping" gone wrong; in January, he chalked it up to some bad Googling. "We do not use AI," he told Parliament on January 6. "On the West Ham side of things and how we gained that information, in producing the report, one of the officers would usually go to... a system, which football officers use all over the country, that has intelligence reports of previous games. They did not find any relevant information within the searches that they made for that. They basically Googled when the last time was. That is how the information came to be."

But Guildford admitted this week that this explanation was, in fact, bollocks. As he acknowledged in a letter on January 12, "I [recently] became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot." He had not intended to deceive anyone, he added, saying that "up until Friday afternoon, [I] understood that the West Ham match had only been identified through the use of Google." This has made a bad situation even worse.
Today, in the House of Commons, Home Secretary Shabana Mahmood gave a long statement on the case in which she threw Guildford under the bus and backed over him five or six times. Mahmood blamed the ban on "confirmation bias" by the police. She said the Amsterdam stories they used were "exaggerated or simply untrue." And she highlighted the fact that Guildford claimed "AI tools were not used to prepare intelligence reports," but now "AI hallucination" was said to be responsible. The whole thing was a "failure of leadership," and Guildford "no longer has my confidence," she said. This last bit was something that everyone in the UK appears to agree on. Conservatives want Guildford to go, too, with party leaders calling for his resignation. MP Nick Timothy has been ranting for days on X about the issue, especially the fact that hallucination-prone AI tools are being used to produce security decisions. "More detail on the misuse of AI by the police," he wrote today. "They didn't just deny it to the home affairs committee. They denied it in FOI requests. They said they have no AI policy. So officers are using a new, unreliable technology for sensitive purposes without training or rules."
[2]
UK police blame Microsoft Copilot for intelligence mistake
The chief constable of one of Britain's largest police forces has admitted that Microsoft's Copilot AI assistant made a mistake in a football (soccer) intelligence report. The report, which led to Israeli football fans being banned from a match last year, included a non-existent match between West Ham and Maccabi Tel Aviv. Copilot hallucinated the game and West Midlands Police included the error in its intelligence report. "On Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot [sic]," says Craig Guildford, chief constable of West Midlands Police, in a letter to the Home Affairs Committee earlier this week. Guildford previously denied in December that the West Midlands Police had used AI to prepare the report, blaming "social media scraping" for the error. Maccabi Tel Aviv fans were banned from a Europa League match against Aston Villa in November last year, because the Birmingham Safety Advisory Group deemed the match "high risk" after "violent clashes and hate crime offences" at a previous Maccabi match in Amsterdam. As Microsoft warns at the bottom of its Copilot interface, "Copilot may make mistakes." This is a pretty high profile mistake, though. We tested Copilot recently and my colleague Antonio G. Di Benedetto found that Microsoft's AI assistant often "got things wrong" and "made stuff up." We've reached out to Microsoft to comment on why Copilot made up a football match that never existed, but the company didn't respond in time for publication.
[3]
West Midlands copper chief cops it after Copilot copped out
Craig Guildford banned Israeli fans based on Microsoft's match report, told MPs 'we don't use AI,' then discovers... they did

The chief constable of West Midlands Police has retired after his force used fictional output from Microsoft Copilot in deciding to ban Israeli fans from attending a football match at Birmingham club Aston Villa. Chief Constable Craig Guildford, 52, retired from England's third-largest police force on 16 January. He was due to meet his boss, Simon Foster, Police and Crime Commissioner for the West Midlands, on January 27. He had earlier written to the chair of the House of Commons home affairs committee to apologize for incorrectly saying his officers had not used generative artificial intelligence (AI) when researching whether to block Maccabi Tel Aviv fans from attending a Europa League match against Aston Villa on 6 November 2025.

West Midlands Police made its decision to block the away fans partly based on reports of disruption at a non-existent match between Maccabi Tel Aviv and London club West Ham. On 6 January, Guildford told MPs on the home affairs committee that officers had found this material through a Google search that did not involve use of AI functions. "We do not use AI," he said in evidence to MPs. In a letter on 12 January, however, Guildford said he had since realized that the made-up material had in fact come from a generative AI tool: "I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot (sic)."

Home secretary Shabana Mahmood had earlier said she had no confidence in Guildford, although any decision on his employment was up to the West Midlands police and crime commissioner. As well as questions over where it had found material, the force was criticized for taking an anti-Israeli stance in making its decision. Generative AI tools have previously made up cases cited by lawyers in both the US and the UK.
In October, consultancy Deloitte refunded A$440,000 to the Australian government after using generative AI in writing a report that featured made-up references and footnotes.
[4]
Police chief admits misleading MPs over AI role in incorrect report that led to Israeli fans' ban
Here's Chief Constable Craig Guildford's letter to Karen Bradley, chair of the Home Affairs Select Committee:

Dear Dame Karen Bradley,

I write further to my appearance at the HAC on the 1st December 2025 and 6th January 2026 and in relation to the questions raised by the Committee concerning the provenance of the inclusion of the West Ham v Maccabi Tel Aviv match in the WMP report to the Birmingham City Council Safety Advisory Group.

In preparation for the force response to the HMICFRS inquiry into this matter, on Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot. Both ACC O'Hara and I had, up until Friday afternoon, understood that the West Ham match had only been identified through the use of Google. This will be further explained in the additional material being provided to the Committee.

I would like to offer my profound apology to the Committee for this error, both on behalf of myself and that of ACC O'Hara. I had understood and been advised that the match had been identified by way of a Google search in preparation for attending HAC.
[5]
British Police Used Microsoft Copilot for Faulty Report That Led to Ban on Soccer Team Fans
A police force in the United Kingdom has admitted to relying on Microsoft's AI assistant Copilot in fabricating a faulty intelligence report. Late last year, the Israeli soccer team Maccabi Tel Aviv played against the British club Aston Villa in a game in Birmingham, U.K. Prior to the game, a Birmingham safety committee decided not to allow Maccabi Tel Aviv fans to attend the game, basing the decision on an intelligence report from West Midlands police deeming the match high-risk for hooliganism. The details of that report were later heavily contested by government officials.

Now, West Midlands police chief constable Craig Guildford has admitted that his subordinates relied on Microsoft Copilot to fabricate the report and failed to fact-check its findings. Specifically, the report referred to a match between Maccabi Tel Aviv and the British soccer team West Ham that was completely hallucinated by Copilot. The two teams have never played against each other, and on the day of the imaginary game, West Ham had another game against the Greek team Olympiacos, according to the U.K. Parliament's Home Affairs Committee. "On Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot," Guildford wrote in a letter to the Home Affairs Committee, after weeks of denying the use of AI.

Artificial intelligence is far from a perfect technology. It is still particularly prone to hallucinations, and that tendency can fuel the spread of misinformation with real-life consequences; the West Midlands intelligence report incident is far from the first example. Back in October, major consulting firm Deloitte had to pay the Australian government a partial refund after delivering them a $290,000 AI-generated report riddled with fake academic research papers and court judgments.
But despite the shortcomings of AI, American big tech companies have been engaged in a full-blown effort to deploy the technology far and wide, across all parts of the workforce. "America needs to be the most aggressive in adopting AI technology of any country in the world, bar none, and that is an imperative," Nvidia CEO Jensen Huang said at a press and industry briefing in October. "We have to encourage every single company, every single student, to use AI." Huang has also recently gone on to say that talking about the risks of AI deployment was "hurtful" and "not helpful to society." Microsoft is also aggressive about the future of AI in the workforce. The company made AI use mandatory for its employees and markets Copilot as an AI assistant to boost productivity in the workplace. The Copilot technology is widely deployed by a range of companies across the American corporate world. As of late last year, Copilot is also used by the U.S. House of Representatives.
[6]
West Midlands police chief apologises after AI error used to justify Maccabi Tel Aviv ban
Craig Guildford says he gave incorrect evidence to MPs and mistake arose from 'use of Microsoft Copilot'

The chief of West Midlands police has apologised to MPs for giving them incorrect evidence about the decision to ban Maccabi Tel Aviv football fans, saying it had been produced by artificial intelligence (AI). Craig Guildford told the home affairs select committee on Monday that the inclusion of a fictitious match between Maccabi Tel Aviv and West Ham in police intelligence "arose as a result of a use of Microsoft Copilot". The chief constable had previously told MPs that the force did not use AI and the mistake regarding the West Ham match, which had never taken place, was made by "one individual doing one Google search".

It comes as the home secretary, Shabana Mahmood, prepares to make a statement to MPs about the findings of a report by His Majesty's Inspectorate of Constabulary into the decision to ban Maccabi Tel Aviv fans from attending a Europa League match against Aston Villa in November. The fake West Ham match was a part of intelligence that was presented to the council-led security advisory group, who made the decision to ban away fans.

In an email to the home affairs select committee published on Wednesday, Guildford said he would like to offer his "profound apology" for the error. "I had understood and been advised that the match had been identified by way of a Google search in preparation for attending HAC. My belief that this was the case was honestly held and there was no intention to mislead the committee."
[7]
UK police chief resigns over AI intel embarrassment, blames 'political and media frenzy' rather than his force's failure to fact check Copilot
UK government officials said they had "lost confidence" in the outgoing chief constable as a result of the scandal.

West Midlands Police chief constable Craig Guildford, the head of the UK police force that admitted earlier this week that it had used faulty information from Microsoft's Copilot AI assistant in its controversial decision to ban Israeli football fans from a November 2025 Europa League match, has announced his resignation (via the BBC). His retirement follows days of criticism over the scandal, with UK home secretary Shabana Mahmood and Downing Street both expressing that they had "lost confidence" in the chief constable. Rather than his force's failure to fact check a notoriously unreliable technology, Guildford said in a statement announcing his retirement that "the political and media frenzy" was responsible for his decision to resign.

"The political and media frenzy around myself and my position has become detrimental to all the great work undertaken by my officers and staff in serving communities across the West Midlands," Guildford said. "I have carefully considered my position and concluded that retirement is in the best interests of the organisation, myself and my family. It has been the honour of my career serving as the Chief Constable of West Midlands Police."

In his statement, Guildford did not apologize for the West Midlands Police's handling of the Aston Villa v Maccabi Tel Aviv match, or for its inclusion of factually inaccurate information in the preceding intelligence report. In November 2025, the Birmingham Safety Advisory Group -- led by West Midlands Police and Birmingham City Council -- made the contentious decision to ban Israeli fans from attending an upcoming Europa League match between Aston Villa and Maccabi Tel Aviv, fearing a repeat of November 2024's violent clashes over Gaza war tensions with Israeli football fans in Amsterdam.
That decision was informed by an intelligence report submitted by the West Midlands Police, which -- in addition to featuring interpretations of the 2024 Amsterdam fan violence that have since been questioned by Dutch authorities -- referenced an earlier West Ham v Maccabi Tel Aviv match in 2023. That West Ham v Maccabi Tel Aviv match, however, never occurred, which is the exact kind of error you don't want in an intelligence report. In a letter to the Home Affairs Committee earlier this week, Guildford admitted the West Ham match was fabricated by Microsoft's Copilot AI assistant, which the WMP had consulted and failed to fact check while compiling its report. "On Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Copilot," Guildford said. "Both ACC O'Hara and I had, up until Friday afternoon, understood that the West Ham match had only been identified through the use of Google."

In December, Guildford had been questioned in a Home Affairs Committee hearing investigating the handling of the Aston Villa v Maccabi Tel Aviv match. When asked by a member of Parliament whether the WMP "did an AI search, got something about West Ham, and just whacked it into" the report, Guildford said "No, not at all," claiming he instead believed the nonexistent football match to have emerged from the police force's "search through social media." In his letter, Guildford acknowledged that the WMP had indeed just whacked the results of an AI search into its intelligence report, and offered his "profound apology to the Committee for this error."

Like other LLM-driven assistants, Copilot features a disclaimer that it "may make mistakes," which you might hope would encourage the personnel of security, law enforcement, and defense organizations who are utilizing LLM technologies to exercise due caution.
But when those technologies, despite their infamous unreliability, are marketed as one-button replacements for human knowledge and problem-solving, it's hard to think of these types of oversights as being anything but inevitable. In November, Microsoft AI CEO Mustafa Suleyman said it's "mindblowing" that people aren't more impressed with generative AI. Mustafa, if you're listening: This is why. Thankfully, it's not like we're setting ourselves up for even higher stakes AI embarrassments. In other news, US secretary of defense Pete Hegseth announced this week that the Pentagon will begin integrating X's Grok AI into military networks. Yes, the same LLM platform drawing growing outrage for flooding X with generated pornographic imagery of nonconsenting people, including minors. Surely nothing could go wrong with that, either.
[8]
UK police chief blames AI for error in evidence over Maccabi fan ban
London (AFP) - An under-fire UK police chief on Wednesday blamed the use of AI for erroneous evidence given to MPs over the decision to ban Maccabi Tel Aviv football fans from a match against Aston Villa. The police classified the match in Birmingham in November as "high risk", citing previous Maccabi games including a Europa League encounter in Amsterdam which sparked clashes between locals and Israeli fans. The decision to ban Maccabi fans from the Villa Park UEFA Europa League game was slammed by Prime Minister Keir Starmer, with the government ordering an independent report which is due to be published later on Wednesday. Scrutiny has increased on West Midlands police after multiple pieces of evidence used to justify the decision proved flawed, with the force rejecting allegations that the move was motivated by politics rather than fan safety. In an intelligence report for the game, police cited a match between West Ham and Maccabi Tel Aviv -- which never took place. When questioned about this by lawmakers earlier this month, chief constable Craig Guildford insisted that the error was the result of a Google search, and that the force had not used artificial intelligence in its research. However, in a letter to MPs on Wednesday, Guildford admitted that the erroneous information was due to the use of Microsoft Co Pilot, an AI chatbot. "I would like to offer my profound apology to the committee for this error," Guildford said, adding that there was "no intention to mislead the committee". That came after UK media reported in December that Dutch police also disputed evidence cited by the West Midlands force to justify the ban. UK police claimed they were told that Maccabi fans were behind several violent incidents during the 2024 Amsterdam clashes -- but that intelligence was partly contradicted by Dutch politicians and police. The decision to ban the fans was also sharply criticised by Israeli politicians, who claimed that it was antisemitic. 
British interior minister Shabana Mahmood will present the findings of the independent inquiry on Wednesday, which could heap further pressure on Guildford after opposition leader Kemi Badenoch called for his resignation over the debacle. The Guardian newspaper reported that the watchdog report is set to say that the police made a series of errors in how it gathered and handled intelligence while making the decision. The match went ahead amid heavy security, but without Maccabi fans after the Israeli team turned down their away ticket allocation.
[9]
UK police apologise to parliament after admitting a major and controversial policy decision was based on AI making stuff up
The Chief Constable of West Midlands Police in the United Kingdom has apologised to members of parliament after admitting that a major policing decision had been made "as result of a use of Microsoft Co Pilot [sic]". The decision was taken in November 2025 to ban fans from the Israeli team Maccabi Tel Aviv from attending a Europa League match against the Birmingham-based team Aston Villa. It was hugely controversial at the time, with even Prime Minister Keir Starmer questioning the move and calling it "wrong", and MPs have continued digging into why it was made.

The Chief Constable, Craig Guildford, had previously insisted the police "do not use AI" but that a Google search had provided erroneous information used in a report that led to the ban: notably, the inclusion of a match between Maccabi Tel Aviv and West Ham that never happened. Now Guildford has written to Karen Bradley, chair of the Home Affairs Select Committee, admitting that AI was used in assembling the report: "In preparation for the force response to the HMICFRS inquiry into this matter, on Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot," writes Guildford. "Both ACC O'Hara and I had, up until Friday afternoon, understood that the West Ham match had only been identified through the use of Google [...] I would like to offer my profound apology to the Committee for this error, both on behalf of myself and that of ACC O'Hara."

Guildford had previously been grilled by MPs about the inclusion of the non-existent match on December 1, 2025, at which time he said: "Within my narrative, which I have compiled over the weekend, the one assertion in relation to West Ham is completely wrong. I am told that is a result of some social media scraping that was done, and that is wrong. That was one element in a document that was eight or nine pages long, but we stand by the key tenets in the document."
Paul Kohler MP asked Guildford at that December hearing if the force just "did an AI search" and "whacked it into the [report]", to which Guildford responded: "No, not at all. We do a very comprehensive assessment." On 6 January Guildford repeated the police denials. "The summation in the House -- it was a question that was asked in the House -- was that West Midlands Police may have used AI on this particular occasion," said Guildford. "We do not use AI." Not done with digging a hole on that occasion, Guildford went on to waffle about "a system, which football officers use all over the country, that has intelligence reports of previous games", adding that "I am told that they just did a Google search on that [previous fixtures] because they could not find it in the normal system."

To briefly pause and re-focus on the topic at hand, the controversy surrounds a decision taken by Birmingham's Safety Advisory Group, which is led by West Midlands Police and Birmingham City Council, to deem the November 6 match between Maccabi Tel Aviv and Aston Villa "high risk." As a result of this Tel Aviv fans were banned, a decision that was decried as antisemitic by some senior politicians. Police at the time pointed to previous incidents at Maccabi Tel Aviv games, including fan violence during a 2024 Europa League match in Amsterdam with Ajax, to justify the decision. The Dutch authorities have questioned the UK police's interpretation of the causes of violence at the match.

But conflicting opinions over the Ajax game is one thing. It's a match that never actually happened being used to inform this decision that is truly shocking, and raises massive questions over the competence and command structure of the UK police. Copilot can't get basic facts right: how on Earth could it be used to inform a decision of this magnitude, and one that was always sure to attract scrutiny?

"Another day, another confession from West Midlands Police," said Conservative MP Nick Timothy on X.
"Despite denials at two separate hearings, it turns out they did use AI to produce their dodgy 'intelligence' dossier. Their account of their conduct in getting Israeli fans banned from Villa Park continues to unravel." His Majesty's Inspector of Constabulary (HMIC), Sir Andy Cooke, is due to write to the home secretary about the matter later today. That same home secretary is one of those who questioned the initial decision. And rumbling beneath it all is the toxic suggestion of antisemitism being involved in the decision. The government's independent adviser on antisemitism, Lord Mann, says: "What on earth were they doing using AI to create an untruth to back their case." He called on Chief Constable Guildford to resign and for West Midlands Police to be put under special measures by the police inspectorate. The intelligence report referring to the non-existent match between Maccabi Tel Aviv and West Ham has not been published, though it has been quoted in parliament. "Early on in the intelligence report, it says: 'The most recent match Maccabi played in the UK was against West Ham in the Europa Conference League on 9 November 2023. This was part of the '23-24 European campaign. It marked Maccabi Tel Aviv's last competitive appearance on UK soil to date,'" said Lord Mann. "That is in the intelligence report, but that did not happen. West Ham have never played Maccabi Tel Aviv. On that day, West Ham played Olympiakos of Greece and beat them 1-0. I think Tel Aviv were playing a Ukrainian team somewhere." The UK government has invested billions in all sorts of AI measures, and is one of those that believes it's going to somehow revolutionise the country and change the way it does everything. It's easy to look at the last year of AI humiliation, or software companies complaining that nobody likes it, and hand-wave the AI boosters away. 
But an incident like this, where a major political decision was made and parliament was misled as a result, surely has to be some sort of canary in the coal mine.
[10]
Police chief apologises for AI error that helped form Maccabi Tel Aviv fan ban decision
West Midlands Police's chief constable has apologised to MPs for giving them an error in evidence over the decision to ban Maccabi Tel Aviv fans, as the Home Secretary is set to address Parliament. Force leaders have been under fire over the decision to ban supporters of the Israeli football team from attending a Europa League match against Aston Villa on Nov. 6. Chief constable Craig Guildford wrote to the Home Affairs Committee to apologise for the mistake, after he appeared twice to give evidence over the controversy.

In a letter to committee chairwoman Dame Karen Bradley, the senior police figure said that the evidence he and Assistant Chief Constable Mike O'Hara had given the committee, attributing the wrong intelligence about a West Ham match with Maccabi Tel Aviv to a Google search, was incorrect. Instead, the "erroneous result" arose from the use of the artificial intelligence tool Microsoft Co Pilot.

Mr Guildford wrote: "Both ACC O'Hara and I had, up until Friday afternoon, understood that the West Ham match had only been identified through the use of Google.

"I would like to offer my profound apology to the Committee for this error, both on behalf of myself and that of ACC O'Hara.

"I had understood and been advised that the match had been identified by way of a Google search in preparation for attending HAC.

"My belief that this was the case was honestly held and there was no intention to mislead the Committee."

During the select committee hearing on January 6, MPs asked Mr Guildford if any artificial intelligence had been used in the force's process. He said: "There was a definite note that we've got to the bottom of in terms of the West Ham game.

"The summation, I think in the House, it was a question that was asked in the House was that, you know, you've used the AI, or West Midlands may have used AI on this particular occasion.

"We don't do that. We don't use the AI."

The police chief has faced mounting pressure and calls to resign over the ban.
Maccabi Tel Aviv fans were barred from travelling to the game at Villa Park by the local Safety Advisory Group (SAG), which cited safety concerns based on advice from the police force. This included a reference by the force to a match between the Israeli club and West Ham United that never happened. The decision by the SAG -- which is made up of representatives from the council, police and other authorities -- sparked political outrage, including from Prime Minister Sir Keir Starmer. Since then, doubts have been growing over the intelligence used by police, including disputes over the accuracy of information. Mr Guildford has insisted the decision was not politically influenced.

It comes as Home Secretary Shabana Mahmood will make a statement to MPs on Wednesday after she ordered an investigation into the move to be carried out by His Majesty's Inspectorate of Constabulary and Fire and Rescue Services. According to The Guardian, the report from chief inspector of constabulary Sir Andy Cooke will say West Midlands Police made a series of errors in how it gathered and handled intelligence.

A Home Office spokesperson said: "The Home Secretary has this morning received the Chief Inspectorate's findings into the recommendation by West Midlands Police to ban Maccabi Tel Aviv fans from attending a match against Aston Villa.

"She will carefully consider the letter and will make a statement in the House of Commons in response later today."

The power to sack Mr Guildford lies with West Midlands police and crime commissioner Simon Foster, who has said he will formally review evidence on decision-making around the ban.
Chief Constable Craig Guildford admitted West Midlands Police used Microsoft Copilot to produce the faulty intelligence report behind the ban on Maccabi Tel Aviv fans attending a match. The AI hallucinated a non-existent game between West Ham and Maccabi Tel Aviv. After denying AI use for weeks, Guildford retired following the revelation that sensitive security decisions had relied on unverified generative AI output.
West Midlands Police has become embroiled in a controversy that highlights the risks of deploying generative AI tools for sensitive security decisions without proper oversight. Chief Constable Craig Guildford admitted in a January 12 letter to the Home Affairs Committee that Microsoft Copilot was responsible for an AI hallucination that appeared in a faulty intelligence report used to ban Israeli football fans from attending a match [1]. The report, which assessed security risks for a Europa League match between Aston Villa and Maccabi Tel Aviv on November 6, 2025, included a reference to a match between West Ham and Maccabi Tel Aviv that never occurred [2]. This fictional information generated by the AI assistant became a key piece of evidence used by Birmingham's Safety Advisory Group when it decided to prevent Maccabi Tel Aviv fans from attending the game.
The situation deteriorated significantly due to Guildford's handling of questions about how the erroneous information entered the report. When he appeared before Parliament in December 2025 and again on January 6, 2026, Guildford explicitly denied that his force used AI tools, stating "We do not use AI" [1]. He attributed the error first to "social media scraping" and later to a flawed Google search conducted by officers [2]. This pattern of misleading MPs about the AI's role continued until January 10, when Guildford learned that Microsoft Copilot had in fact been used to generate the problematic match reference. In his apology letter to the Home Affairs Committee, he explained that both he and Assistant Chief Constable O'Hara "had, up until Friday afternoon, understood that the West Ham match had only been identified through the use of Google" [4]. The admission of this irresponsible use of AI sparked immediate calls for accountability.
Home Secretary Shabana Mahmood delivered a scathing assessment of the incident in the House of Commons, declaring that Guildford "no longer has my confidence" and describing the episode as a "failure of leadership" [1]. She criticized the police for relying on information that was "exaggerated or simply untrue" and highlighted the confirmation bias that led to banning football fans despite questionable evidence. The decision to ban the Israeli club's fans had become politically charged, with some viewing it as discriminatory given that Islamic terror attacks, including an October 2 attack on a Manchester synagogue, represented a more serious security threat [1]. Conservative MPs joined calls for Guildford's resignation, with MP Nick Timothy expressing particular concern about officers using "a new, unreliable technology for sensitive purposes" without proper AI policy guidelines [1].
Craig Guildford, 52, retired from his position as chief of England's third-largest police force on January 16, just days after admitting the Microsoft Copilot error [3]. He had been scheduled to meet with Simon Foster, the Police and Crime Commissioner for the West Midlands, on January 27, but his departure came before that meeting could take place. The incident adds to a growing list of cases where AI hallucination has caused real-world harm. In October, consulting firm Deloitte refunded A$440,000 to the Australian government after delivering a report filled with fabricated references generated by AI [3]. Generative AI tools have also produced fictional legal cases cited by lawyers in both US and UK courts.
The West Midlands Police incident raises urgent questions about how organizations deploy AI tools without adequate safeguards. Microsoft markets Copilot as a productivity assistant across corporate and government sectors, and as of late last year, the U.S. House of Representatives had begun using the technology [5]. Microsoft itself warns at the bottom of its Copilot interface that "Copilot may make mistakes," yet the technology continues to spread rapidly [2]. Testing by The Verge found that Microsoft's AI assistant frequently "got things wrong" and "made stuff up" [2]. The West Midlands case demonstrates how misinformation generated by AI can influence high-stakes decisions when organizations lack proper verification protocols. The force reportedly had no AI policy in place, allowing officers to use the technology for intelligence gathering without training or oversight [1]. As tech companies push aggressive adoption of AI across all sectors, this incident serves as a stark warning about the need for accountability frameworks before deploying generative AI in contexts where errors can have serious consequences for public safety and civil liberties.