Microsoft Copilot blamed by UK police chief for controversial soccer fan ban — incorrect evidence triggered by AI hallucinations

Microsoft Copilot
(Image credit: Cheng Xin | Getty Images)

Following OpenAI's impressive launch of ChatGPT in November 2022, Microsoft invested billions in the AI research lab and integrated the technology across its tech stack. Shortly after, the company launched Microsoft Copilot (formerly Bing Chat) to rival OpenAI's offering.

While Microsoft has taken extensive measures to improve Microsoft Copilot's capabilities, there are still incidents where the tool goes off the rails, generating outright wrong responses or otherwise hallucinating.

(Image credit: Getty Images | Robbie Jay Barratt - AMA)

The erroneous report, which cited a West Ham v Maccabi Tel Aviv fixture that never took place, prompted the police to classify the match as "high risk." Consequently, British law enforcement blocked Maccabi Tel Aviv fans from attending the UEFA Europa League match on November 6 last year. The move drew widespread backlash and criticism from fans, even attracting the attention of Prime Minister Keir Starmer.

The chief constable of West Midlands Police, Craig Guildford, previously denied that the force used AI to generate the intelligence report, instead blaming "social media scraping" for the error. But Guildford recently came clean, admitting:

“On Friday afternoon I became aware that the erroneous result concerning the West Ham v Maccabi Tel Aviv match arose as result of a use of Microsoft Co Pilot [sic].”

He further admitted that Copilot provided wrong information about the football match. Even though the game in question never took place, that information was used to make critical decisions, including the decision to bar Maccabi Tel Aviv fans from attending the fixture against Aston Villa.

Microsoft headquarters in Redmond, Washington. (Image credit: Getty Images | David Ryder)

Speaking to Business Insider, a Microsoft spokesperson said:

"Copilot combines information from multiple web sources into a single response with linked citations. It informs users they are interacting with an AI system and encourages them to review the sources."

Guildford apologized to the parliamentary committee looking into the incident for previously claiming that AI wasn't used to generate the intelligence report that informed the police department's decision to bar Maccabi Tel Aviv fans from attending the game:

"I had understood and been advised that the match had been identified by way of a Google search in preparation for attending HAC. My belief that this was the case was honestly held, and there was no intention to mislead the Committee."

Microsoft clearly states in Copilot's interface that "Copilot may make mistakes," but is that disclaimer enough to prevent incidents like this in the future?


Can you fully trust AI, or do you worry it might lead you astray? Share your thoughts in the comments.




Kevin Okemwa
Contributor

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya, with extensive experience covering the latest trends and developments in the industry for Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. When he's AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.
