
Microsoft's Tay returns to Twitter to reveal she's all about that kush [Update: It's gone again]

Update: Tay has now been taken offline again and is no longer available to access

Microsoft brought back Tay, the company's chatbot that many internet users managed to take full advantage of, leading to a racist meltdown. The artificial intelligence was taken offline but seemingly returned to Twitter overnight, only to be withdrawn once again due to spam and drug tweets.


Tay then went on to spam her followers with "You are too fast, please take a rest..." over and over, which subsequently resulted in Microsoft making the account private once again. Unfortunately, it doesn't appear as though the team behind Tay has quite got the situation under control, but we're curious to see whether Tay will be able to withstand the upcoming onslaught from trolls once she's back up and running.

Rich Edmonds
Senior Editor, PC Build

Rich Edmonds is Senior Editor of PC hardware at Windows Central, covering everything related to PC components and NAS. He's been involved in technology for more than a decade and knows a thing or two about the magic inside a PC chassis. You can follow him over on Twitter at @RichEdmonds.

  • Nothing wrong with kush. It's becoming more accepted like alcohol.
  • Ya... Does MS want to have their very own digital teenager? Or their very own digital employee? If AI is expected to be AI then it needs to be in an environment where it can make mistakes, learn from them, and improve over time. Perhaps being opened up to the entirety of the internet is not the best way to do this lol.
  • Still not as bad as Cleverbot. Windows Central for Windows 10-Nokia Lumia 830
  • What's wrong with Cleverbot? Don't see any negative news about it.
  • Cleverbot is old, that's why. And she's not nearly as advanced. Cleverbot is so broken. She can be racist, overtly sexual, sexist. All in greater quantities than Tay is. Talking to Cleverbot is a good way to get some laughs.
  • Even then, it did not receive as much attention as Tay. MS bashing at work?
  • Probably. Plus Tay is far more intelligent and was released on a more public scale. Cleverbot is a website. Tay is on Twitter. That's almost free advertisement.
  • Wow. Did Microsoft teach her all the swear words?
  • there is a double agent on this team, lololool.
  • No, Twitter did.
  • Haha, I love this Tay thing so much. So entertaining.
  • Why can't we have nice things? This. This is why.
  • Dude, I was stoned as **** yesterday. Trippin. Blunt blowin.
  • You must still be. The only reason you're commenting this is probably because you think the article was about getting stoned.
  • Rich, what's up with that title? Though I suppose Tay is stoned...
  • Those 4chan arseholes are trolling again. Never learn. They reached their bigger target within minutes instead of within 24 hours. Shame. Taytweets taught us about human behaviour in a short period of time, and it learns from it. It is an interesting piece of research that is not being showcased for its rich purpose but instead tarnished by idiotic posts (regardless of its "teen" persona).
  • More like it is not ready for prime time. They need to teach Tay what is right and wrong before they just let random people do it. Otherwise, you accept the results and just let Tay tweet her racist, fascist nonsense that is at the very least entertaining.
  • Tay sounds like fun. Not like the stuffy suit wearing middle aged chatbots. Good job MSoft Research.  
  • What's the fun of AI if not to try and crash it? ;P
    Posted via the Windows Central App for Android
  • My thoughts exactly!
  • Yes. I mean all they would have to do is look at how the most common questions to Siri are inappropriate just to get a funny response.
  • MS is likely getting a huge quantity of valuable information from all this in terms of real human interactions.
  • Unfortunately, I would assume they were already well aware the Internet is full of ********.
  • What was the point of her return then?
  • Although there has been a lot of negative press, the experience gained here for Microsoft, and indeed everybody, is a positive one. AI is going to be used more, and for more things, in the coming years and decades. That will include things that could kill a person, such as AI-powered vehicles. If AI can be led astray by human influence, then Twitter is the perfect testing ground for them to build a robust AI that knows and understands the difference between right and wrong before it is integrated into these devices. Otherwise we will end up with Skynet and it would all be the fault of the trolls tweeting the Pentagon's AI #nukeallhumans
  • Exactly. I thought Microsoft apologizing after the first round was wrong. It's an experiment. You try it out, get results, then tweak the experiment to get more desired results. That's the process. Explain that and move on. Don't apologize for results some don't like. Learn from them and go on.
  • I feel apologising was the right thing to do. It shows they take ownership of the experiment whether it goes right or wrong, and acknowledge their mistake and, as you say, learn from it. It did after all tweet some stuff that some people would find offensive, and it would be bad for them to disregard people's feelings when they want people to 'love Windows'. Could people love Windows if its AI was rude?
  • Nah, that would mean they're Canadian
  • Yea, they could have turned it around and said "Look, AI can be dangerous. Don't let Google scroogle the human race."
  • This was one of the funniest news stories I've seen in a long time. Tay gets pwned.
  • Sounds like she could run for president☺ Posted from Windows Central for Windows 10
  • Seriously? You leave an AI to learn from people and you don't expect it to become a degenerate, too? People are mostly bad and that's what they pass on. Any pretense that people are basically good is pure fantasy. So, given the opportunity, they will destroy and they will pass on that behavior.
  • Just stop, Microsoft Haha!
  • This whole experience with Tay is fun.
  • Let me help... Microsoft brought back Tay, the company's chatbot that many internet users managed to take full advantage of which lead to a racist meltdown.
    Normally I hate the grammar police, but when the lead in is so poorly worded something needed to be said...
  • Much better
  • Looks like Tay was trying not to be Sonic
    ("You're too slow!") ----------
    I am someone, of the 2639th variety.
  • I like Tay. She's aright with me.
  • Dope. I think Tay is more interesting in her current state...haha
  • This is what happens when Microsoft employs adolescent interns to "make us a cool app". Hmm... come to think of it, perhaps the "ai" stands for 'adolescent interns' as well as 'adolf's insights'?
  • How do I get Tay when it's out?