Don't listen to Microsoft Copilot, David Attenborough and William Shatner are very much alive

Microsoft Copilot still has issues that cause it to share false information. (Image credit: Windows Central)

What you need to know

  • When asked about notable people who died this year, Microsoft Copilot may share a list of people who are still alive.
  • In my testing, Copilot shared a list of living persons and people who died in previous years.
  • AI chatbots, including Copilot and Google Bard, have issues with sharing factually correct information and often "hallucinate."

The phenomenon was noticed by several people who took to X (formerly Twitter) and other platforms. When asked if he was okay, William Shatner jokingly responded that he was not fine after reading about his death. The Verge shared other examples, including one listing Attenborough as deceased. I've seen similar results in my testing. In addition to listing living people as dead, Copilot incorrectly stated several deaths from previous years occurred in 2024.

This is only the latest example of AI getting facts wrong. Copilot has shared false information regarding the US election in the past. Some believe that ChatGPT, which is part of what powers Copilot, has gotten less intelligent since launch. In the early days of the chatbot, Copilot, then known as Bing Chat, shared bizarre and creepy responses.

I have first-hand experience with AI chatbots spreading false information. Last year, I wrote an article about how Google Bard incorrectly stated that it had been shut down. Bing Chat then scanned my article and wrongly interpreted it to mean that Google Bard had been shut down. That saga provided a scary glimpse of chatbots feeding chatbots.

AI often struggles with logic and reasoning. That fact isn't surprising when you consider how AI works. Tools like Copilot are not actually intelligent. They're not using reasoning skills in a way that a human would. They're often tripped up by the phrasing used in prompts and miss key pieces of information in questions. Mix in AI's struggles to understand satire and you have a dangerous recipe for misinformation.

Fixing AI

Microsoft unveiled a major update to Copilot yesterday. The update is meant to make the AI assistant more personal and interactive. As explained by our Senior Editor Zac Bowden, "Microsoft really wants you to view the new Copilot as more than just an AI tool. It wants you to treat it like a friend, whether that be by asking it for advice on how to ask out a crush, venting about work, or chatting about nothing because that’s what people do."

The new Copilot has features such as "Copilot Voice" that aim to make interaction with the chatbot feel conversational. The tool can also suggest topics to discuss and share summaries of daily news.

A new interface and some voice features may help make Copilot feel more personal, but I'd prefer that my friends, human or digital, didn't share false information and claim living cultural icons are dead. Perhaps more training time will make our digital friend more factual.

Sean Endicott
News Writer and apps editor

Sean Endicott is a news writer and apps editor for Windows Central with 11+ years of experience. A Nottingham Trent journalism graduate, Sean has covered the industry’s arc from the Lumia era to the launch of Windows 11 and generative AI. Having started at Thrifter, he uses his expertise in price tracking to help readers find genuine hardware value.

Beyond tech news, Sean is a UK sports media pioneer. In 2017, he became one of the first to stream via smartphone and is an expert in AP Capture systems. A tech-forward coach, he was named 2024 BAFA Youth Coach of the Year. He is focused on using technology—from AI to Clipchamp—to gain a practical edge.