Microsoft shuts down Tay on Twitter after it learns to be racist

Well, that was quick. Just hours after launching the AI on Twitter, Microsoft has had to shut down "Tay" after the account began publishing racist tweets. The AI was meant to learn and become more intelligent as more people engaged with it, but the conversation the internet served up quickly dragged it into ugly territory. This is what happens when you give the Internet nice things.

Problems arose when Tay started referencing Hitler, among other things, and conversations quickly went south. Aimed at 18-24-year-olds, Tay was available on Twitter, Kik, and GroupMe, where users could interact with the bot, ask questions, and spark conversations for a little entertainment as they went about their online business.

Many are holding Tay up as a clear example of why AI is a bad idea, but that's probably taking things too far. In all likelihood, this was simply people having too much fun and generally being citizens of the Internet (where anything goes), behavior the AI unfortunately picked up on.
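Microsoft hasn't detailed how Tay's learning actually worked under the hood, but the failure mode is easy to sketch. Below is a minimal, hypothetical Python illustration of a bot that absorbs whatever users tell it with no moderation layer in between; every class and message here is invented for the example and is not Tay's real code.

```python
import random

class NaiveChatbot:
    """Toy learner that parrots phrases it has seen.

    Purely illustrative: it shows why learning from unfiltered
    user input goes wrong, not how Tay was actually built.
    """

    def __init__(self):
        self.learned_phrases = []

    def learn(self, message: str) -> None:
        # No moderation step: every message is absorbed verbatim.
        self.learned_phrases.append(message)

    def reply(self) -> str:
        # Replies are sampled from whatever the crowd supplied,
        # so hostile input is repeated as readily as benign input.
        if not self.learned_phrases:
            return "hellooooo world!"
        return random.choice(self.learned_phrases)

bot = NaiveChatbot()
bot.learn("humans are super cool")        # benign teaching
bot.learn("<coordinated trolling here>")  # hostile teaching, absorbed just the same
print(bot.reply())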

Until next time, Tay.

Rich Edmonds
Senior Editor, PC Build
