
Microsoft patents tech to let you talk to dead people as chatbots

Microsoft logo at Ignite (Image credit: Windows Central)

What you need to know

  • A Microsoft patent describes technology to create chatbots of specific people.
  • The technology could be used to create chatbots of dead people, fictional characters, and relatives.
  • Similar technology has appeared in media, such as Black Mirror.

In one of the more bizarre stories we'll cover this week, a Microsoft patent recently emerged for technology that could be used to create chatbots of dead people (via The Independent). The technology would use content such as "images, voice data, social media posts, [and] electronic messages" to create a chatbot that simulates a specific person.

The concept of using AI to replicate dead people isn't new. It was famously depicted in the Black Mirror episode "Be Right Back."

Microsoft's patent even describes tech to create 2-D or 3-D models of a specific person based on images, depth information, and videos.

If this patent does lead to any tangible tech, which is always a big if when it comes to patents, it's important to note that it isn't just for creating chatbots of dead relatives. The technology could also be used in less creepy ways, such as creating a chatbot of a fictional character or historical figure.

It could be a cool experience to speak with a favorite fictional character. Having a conversation with a historical figure chatbot could also be a unique way to get people engaged while studying history, though you'd have to put some fact-checking in place to make sure people can't trick the chatbot into saying something absurd.

The patent explains:

The specific person may correspond to a past or present entity (or a version thereof), such as a friend, a relative, an acquaintance, a celebrity, a fictional character, a historical figure, a random entity, etc.

Microsoft has an interesting history with chatbots. While many chatbots on Microsoft services are used to perform specific tasks, other bots have been used in different ways. Infamously, people managed to get Microsoft's AI bot Tay to share racist messages.

Microsoft also has a chatbot called Xiaoice that focuses on being human-like, though the company announced last July that it would spin Xiaoice off into an independent company.

Sean Endicott

Sean Endicott is the news writer for Windows Central. If it runs Windows, is made by Microsoft, or has anything to do with either, he's on it. Sean's been with Windows Central since 2017 and is also our resident app expert. If you have a news tip or an app to review, hit him up at sean.endicott@futurenet.com.

7 Comments
  • "Your scientists were so preoccupied with whether they could, they didn't stop to think whether they should."
  • Hey, I've watched that episode of Black Mirror.
  • I imagine this would mainly be for historical figures. Using their journals, letters, speeches, etc. For the AI to learn from.
  • Ethics aside, this particular endeavour should only be confined to long-deceased historical figures, such as scientists and presidents, for museums. Beyond that it's too damn creepy, not to mention the potential PR disaster that could occur given the antics idiots have pulled with previous chatbots.
  • Finally! I'll be able to talk with my granny and ask her where the heck she left the house deeds.
  • The specter of Phil Spector! BOOOOOOOO!
  • Yeah, no way this gets abused in the political sphere...