Bing Chat now has fewer 'hallucinations' following an update

Bing Chat AI (Image credit: Daniel Rubino)

What you need to know

  • Bing Chat received an update to version 96 recently.
  • The update reduces how often the chatbot will refuse to answer a question.
  • Hallucinations should also be reduced, thanks to the update.

Bing Chat recently received an update to version 96. The latest version of the chatbot is fully in production, according to Mikhail Parakhin, CEO of Advertising and Web Services at Microsoft. The chat service within Bing should now respond to more questions and generate better answers.

Parakhin highlighted two noteworthy improvements on Twitter:

  • Significant reduction in cases where Bing refuses to reply for no apparent reason
  • Reduced instances of hallucination in answers

Hallucinations, in this case, refer to Bing Chat inserting incorrect information into an otherwise accurate response. They're problematic because Bing presents the fabricated details as fact alongside correct data, making it difficult to tell true statements from false ones.

As an example, Bing Chat may not know a specific financial figure and will make one up, then present that invented number alongside other information that is correct. That exact phenomenon occurred during Microsoft's demonstration of Bing.

Reducing hallucinations should improve the chatbot's ability to answer fact-based questions.

Bing has only been in preview for a few weeks, so Microsoft is listening to feedback and adjusting the search engine in response. Early on, people figured out ways to make Bing act in bizarre ways, and Microsoft responded by restricting the search engine. It has since relaxed some of those restrictions and worked to make the search engine respond to more questions.

As a quick note, Parakhin's Twitter account is not verified, but the changes listed appear to be accurate. Mike Davidson, who has a large following within the industry and has worked on Bing and Edge, responded to some questions in the Twitter thread, and Parakhin's LinkedIn profile breaks down his role at Microsoft.

We asked Bing Chat which version it was running to verify Parakhin's tweet, but the chatbot was not helpful.


Bing Chat would not identify its version number due to confidentiality and its rules. (Image credit: Future)

Davidson shared that the team behind Bing Chat is working on a way to show the version number to those who ask.

Sean Endicott
News Writer and Apps Editor

Sean Endicott is a tech journalist at Windows Central, specializing in Windows, Microsoft software, AI, and PCs. He's covered major launches, from Windows 10 and 11 to the rise of AI tools like ChatGPT. Sean's journey began with the Lumia 740, leading to strong ties with app developers. Outside writing, he coaches American football, utilizing Microsoft services to manage his team. He studied broadcast journalism at Nottingham Trent University and is active on X @SeanEndicott_ and Threads @sean_endicott_.