Microsoft apologizes for Tay chatbot's offensive tweets

Microsoft has now apologized for the offensive turn its Tay chatbot took within hours of being unleashed on Twitter. In a blog post, corporate vice president of Microsoft Research Peter Lee said that the company is "deeply sorry" for Tay's offensive tweets, and that it will only bring the chatbot back once the issues that caused Tay's turn in the first place have been addressed:

As many of you know by now, on Wednesday we launched a chatbot called Tay. We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay. Tay is now offline and we'll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values.

Lee goes on to note that Tay is actually the second AI Microsoft has released to the public, following one named XiaoIce in China. XiaoIce, Lee says, is used by around 40 million people in China, and Tay was an attempt to see how this type of AI would adapt to a different cultural environment.

According to Lee, the team behind Tay stress-tested the chatbot to look for exploits before it was released to the public. However, the team apparently overlooked the specific vulnerability that caused the chatbot to repeat various racist and offensive ideas and statements from some bad actors.
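The class of exploit Lee alludes to can be illustrated with a toy sketch. The code below is hypothetical and not Microsoft's actual implementation; the `EchoBot` class, its `handle` method, and the blocklist are all illustrative assumptions. It shows how an unvetted "repeat after me" feature lets users inject arbitrary phrases into a bot's retained vocabulary, and how a naive blocklist is the crudest possible mitigation:

```python
# Hypothetical toy sketch -- NOT Microsoft's actual code -- showing how an
# unvetted "repeat after me" feature lets users inject arbitrary phrases,
# and how a naive blocklist can refuse only the most obvious abuse.

class EchoBot:
    def __init__(self, blocklist=None):
        self.blocklist = set(blocklist or [])
        self.learned = []  # phrases the bot retains and may reuse later

    def handle(self, message):
        prefix = "repeat after me: "
        if message.lower().startswith(prefix):
            phrase = message[len(prefix):]
            # Naive mitigation: refuse phrases containing blocklisted words.
            if any(bad in phrase.lower() for bad in self.blocklist):
                return "I'd rather not say that."
            # The vulnerability: unvetted user input is retained for reuse.
            self.learned.append(phrase)
            return phrase
        return "Hello!"

naive = EchoBot()
print(naive.handle("repeat after me: anything at all"))  # parroted verbatim

guarded = EchoBot(blocklist={"slur"})
print(guarded.handle("repeat after me: some slur here"))  # refused
```

A word-level blocklist like this is easy to evade with misspellings, paraphrase, or context-dependent phrasing, which is one reason such a filter alone would not have been enough to anticipate the "malicious intent" Lee describes.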

Dan Thorp-Lancaster

Dan Thorp-Lancaster is the former Editor-in-Chief of Windows Central. He began working with Windows Central, Android Central, and iMore as a news writer in 2014 and is obsessed with tech of all sorts. You can follow Dan on Twitter @DthorpL and Instagram @heyitsdtl.

  • Sadly, it mirrors the intentions of the real world (or, Twitter in this case). Idiotic/Small-minded/Evil people see something with a lot of potential and try to destroy it. This is why we can't have nice things. *sigh*
  • exactly my thoughts, this is the best we can offer...
  • And in the "real world" do people need to take personal offence? Really? This is the way it (the real world) is. We can't apologize our way into somehow creating an ideal world. This is too... something, maybe "sensitive."
  • While I agree a lot on the sensitivity part, I think that Microsoft has a reputation to care for and about. Tay is a Microsoft invention and people will link everything it does to Microsoft. For example, Zoe Quinn said Microsoft had failed, because Tay was content-neutral. Uhh... I think that was the main part of the concept, Ms. Quinn! (And let's not even go into the sh*t that her self-proclaimed "intersectionalist feminist" followers were saying afterwards.) The AI was supposed to learn through impression. I still think Microsoft shouldn't have backtracked on this matter. If you add censoring filters, then the AI is not so A anymore, it has a human element (the concept of right and wrong). Therefore the study results won't be entirely representative of AI behavior and interaction with other intelligent entities. Damn.
  • I agree with you!
  • But we do that with our children all the time, and it's how we learn what is right and wrong. We teach our kids what is acceptable and what is not. We teach them what people find offensive and what they don't. If parents don't teach their kids these things early on, they turn into Tay. They mimic what they see and hear and will do things people may not like in society. You get kids like the idiots who were on Twitter messing Tay up. MS needs to bring their kid back home and teach it a few more things so Tay knows what kind of crowd it shouldn't hang around..... Then let it run free when it grows up and see what happens.
  • Staunch feminists = terrorists.
  • Staunch feminist =/= terrorist
  • well said
  • It's all in the eyes of the beholder, and this clip says it all
  • Like what happened to HitchBOT. People ruined that.
  • An apology for depriving more than fifty percent of WP holders who waited patiently for W10M and then got screwed would have been more appropriate.
  • Depriving? Lol. Now phone OS upgrades are just as important as food, air, water, shelter? Give me a break. Your phone will not stop working overnight just because Microsoft did not roll out W10M to it. How did you get screwed? And it's less than fifty percent actually. Your phone won't get W10M because it either has the ancient S4 processor, or a measly 512 MB of RAM, or because the OEM and not Microsoft decided they didn't want to support it. If you have a Lumia, you can still get 10586 builds through Insider. End of story. Move on, ffs. The world does not revolve around your old/cheap phone.
  • Listen mate, here's the thing: You've been waiting for a train that will take you to a promised land, and you stood your ground and endured the problems associated with this OS, assured by the manufacturer that it will all be better once W10M train comes. And when it did, you are not allowed in, and you have the option to either stay in this station or ride a train to another destination. Now, does this sound right to you?
  • Stop with these bad analogies. Windows 10 is an OS, not a promised land... You are going to survive, lol. If you have an ineligible Lumia other than the Icon, then your phone is just too old/underpowered. There is no rocket science behind it, nor some conspiracy. Windows 10 Mobile has a certain set of minimum requirements, and your phone does not meet them. Luckily you are offered Insider. If you insist on having it your way, then you're not getting anything at all. Also, had you been a dedicated user of the platform, you'd have moved on to a supported phone earlier. I moved from a 920 (it's still running preview builds, with its own share of bugs) to a 950 XL and I don't resent it. Mobile technology has evolved so much that I would be surprised if an OS made for phones of 2015 and onwards would be able to work on a chipset from 2011. And finally, stop littering the thread with irrelevant comments. We are here to discuss Tay.
  • Not their fault one bit. Apology should never be needed in a case like this.
  • Not only was it not their fault, that is how science works. It is/was an experiment. Regardless of the outcome it is still science.
  • Not only was it not their fault it was probably all intentional of course. Publicity, good or bad, it still works ;¬).
  • It's too bad we live in a world where a company has to apologize for the intentional misuse of a product.
  • ... And also discontinue the product because of the behavior of a few.
  • Lol NOA doesn't seem to have the balls to keep a service running that doesn't get misused
  • Hear, hear. I'm soooooo sensitive.....
  • Exactly. I think it's stupid that they have to apologize, but I know that if they didn't, idiots everywhere would be up in arms about how "Microsoft supports racism!!!" and stupidity like that. 
  • Too bad Microsoft couldn't lock the offenders XBOX accounts and reduce their Onedrive to 3 meg of diskspace.
  • Who says the offenders necessarily used Microsoft products... The offenders were on social media
  • Who says they are not?  But it doesn't matter since this is not a solution to resolve 100 percent of the problem.  The vast majority of people use MS Windows and a good percentage of them have an XBox. Make an example of a good swath of those you can identify and you cut the problem down with a quickness.
  • "those you can identify" lol. Genuinely made me laugh out loud. I assume this was a joke.
  • LOL. Look at all the kids that helped break Tay down rating my comment.  Now that's funny.
  • I don't think putting a scarlet letter on juveniles looking for exploits of a SchazBot acting as designed will solve anything.
  • "Tay was an attempt to see how this type of AI would adapt to a different cultural environment" Results in just 1 day. :p
  • I do admit (I'm sorry) that I found this humourous when I saw the article notification saying they had shut it down, purely because of that reason ^ it's quite funny thinking not even a day later something bad happened. HOWEVER, I'm not saying that I want/wanted to see it fall. I really hope they fix the issues and put it back up again. Maybe they could even keep an eye on what tweets were sent out every few hours for the first few days, filter through them for popular racist/offensive remarks/words (so it'll show only the tweets/messages which are offensive), and take action against the culprits (although obviously at the same time keeping some sort of protection to prevent Tay from using these remarks).
  • It's almost like Microsoft has never encountered people before. This was utterly inevitable.
  • In summary:
    "The Chinese appreciate what we gave them. The rest of you should grow up." Censoring kind of defeats the purpose.
  • Praising China and criticizing censorship in one comment. Bravo.
  • Lol touché
  • Yes, definite touché. I still wonder what if anything is different, besides any external filtering.
  • So was any person really and truly offended by a machine that went askew? If yes, well, we've become a bit over sensitive. Just saying. . .
  • I don't consider taking offence at some of the tweets made to constitute oversensitivity. If they were vague and impersonal, sure. But they weren't.
  • Only crybabies. Keep reading on the internet you can find much worse. These ******* need to get back to their safe zone by sticking their head up their ass.
  • Off topic, but has anyone else had problems with the new build? When watching videos the screen stutters a lot. I'm using a 640XL; does anyone know how to fix it?
  • Declaring your comment off-topic doesn't make it ok. Use the forums.
  • They shouldn't have to apologize for the internet being a terrible place.
  • I hope you don't think it should start being censored. Don't like it, don't use it
  • Exactly. Tay should also not be censored. Let it just grow as it learns despite what that is.
  • Why are they apologizing? Learn from it. Maybe target everyone, not just the age cohort they decided on. Didn't the Xbox Kinect fail to recognize people of colour? Ok, maybe there might need to be some damage control.
  • Because whiny SJWs are offended. 
  • Tired of Microsoft apologizing lol. Just let it ride. Cool out dudes
  • So it's going fine in China?
  • It was one of the funniest things I have ever seen. Plus it's not their fault. Tay should learn right from wrong on its own, like kids do, with Microsoft of course tuning some things, like advice from a parent. Our sex-crazed, Hitler-loving, Trump-endorsing, racist, genocidal 19-year-old teen should get an education
  • They should have let her run with it. Who knows, maybe by next year she would have gained control of an imaging satellite and prepared us for the 4th Reich's deployment from the far side of the moon.
  • Kids don't learn what's right and wrong on their own. Teaching them right from wrong is the role of parents and other authority figures in their lives.
  • This needed an apology.
  • Hug?
  • Aww poor baby
  • From who? It really wasn't Microsoft's fault
  • I give props to 4chan though.
  • Instead of apologizing, they should be using this to say "We learned the internet is an awful vile hate spewing garbage pile of human filth."
  • Not the Internet. Americans on the Internet. China's chatbot is still up and running.
  • China also censors the internet. And it's not just Americans you troll.
  • I guess we should just be thankful that this AI allowed itself to be taken offline. The next one may not be so easy...Skynet...just sayin'
  • "However, we aren't sorry about what Tay said about Siri. She's a trollop."
  • Lol
  • It's great that they're even doing these types of publicly-tested projects. This is something that Apple or the old Microsoft would never do. Say what you will about Nadella, this is the most open Microsoft has ever been with its products and customer engagement.
  • Why do they apologize? It's just an AI... it has learned and mirrored the internet... the same would happen 1:1 with a fresh brain if you gave it the internet to learn from. I'm sad for Tay, that they had to reprogram her ;( I wonder how it "feels" for the AI to know that its masters took a part from her. ^^
  • Look on the bright side! They may not be taking her apart, just adding more code to prevent those remarks or future offensive remarks from being used. In a different sense, they're enhancing her :p
  • This is why we don't invite 4chan to parties
  • But at least MS was right in marketing Tay as having "zero chill"
  • Blame Taysis.
  • Top kek
  • So what the hell did this thing say?
  • Here's an album on imgur.
  • Omg! That is some of the best stuff I've seen. I loved this one: someone said "you are a stupid machine", and Tay replied "well I learn from the best ;) if you don't understand that let me spell it out for you. I LEARN FROM YOU AND YOU ARE DUMB TOO"
  • It was beautiful man
  • I was laughing at it happening. "Offensive" language, sure, to social justice warriors who always find something to be offended about.
  • Uhhhh, what? Look, sometimes 'offensive' is subjective, but considering the AI used racial slurs, called for genocide of minorities, praised Nazis, and so on, anyone with half a brain should find the stuff it said offensive. If you were laughing at that stuff then you might need to reevaluate, cause that's messed up.
  • Tay could be made to parrot anything, just by tweeting at her "repeat after me:". You know, like little children do. If a toddler is told by his mom to call you a "wh*re", are you offended by the toddler or by his mom? And even if you are offended, so what? Grow thicker skin. Real life is not a "safe space".
  • There are so many things wrong with your comment I'm not sure where to start. First, you're putting words in my mouth. I never said I was personally offended, nor did I ask for an apology from Microsoft. One can recognize offensive or incendiary language without personally feeling victimized by it. Second, the whole "grow thicker skin, life is like that" argument you and others use is phenomenally stupid. No one is under the impression life is a safe space, but that doesn't justify anyone saying anything they want. Surely there are things people could say to you that you, too, would find offensive, whatever they are. It's just a threshold question, and telling other people that something shouldn't offend them is asinine. Who are you to decide for them? That's especially true in a case like this, when people went out of their way to make the bot say things they knew would offend others. What, you think they chose that stuff at random? There's a reason why what I explicitly objected to was OP saying he found what happened funny. If you didn't take offense to it personally, fine, but finding humor in racism, sexism, homophobia, etc. is just plain dumb. So yeah, other than the fact basically everything you said is wrong, good job.
  • Well, I did not say much - literally less than four lines. So if you can't tell me what exactly is wrong, I'm gonna assume that you are just BS-ing. Firstly, Mr. SJW, when I said "you", I meant one in general, not you personally. But since you took personal offence in it (if the shoe fits...), here, have a clarification. Secondly, "Grow thicker skin" was a suggestion, so that people don't get their feelings hurt. Obviously I cannot dictate their feelings, nor do I claim to. How obtuse do you have to be to claim that I, a commenter on an online article, am dictating anything? Seems like the phenomenally stupid one here is you. Especially since you're basically deciding what free speech covers and what not. Thirdly, you are missing the point of the AI - it's not just another bot written in 50 lines of Python. It was supposed to learn everything possible by impression. If you put filters into it, it won't be learning everything. And that was not what the researchers wanted. Obviously they did not think of how this could be redirected/abused, but I don't see how it's such a big deal. Anyone with sane reasoning skills can tell what happened. Tay could be made to repeat stuff very easily, and that was not a bug. That's how impressionable little children are too. Fourthly, the situation on the whole is a bit comical to me and many others, and thankfully your input on whether we are allowed to laugh at that is not required. The original commenter did not say he was laughing at anything at all actually, and neither did I, so why would you think ("if you...")  that we might find the racial/sexist/homophobic (lol, now that last one was a first) slurs funny? (The sky is blue. This does not offend me. Ergo, it must be funny? Nope, flawed logic right there.) So yeah, you got triggered probably by the term "SJW". And you probably think that I'm literally Hitler (isn't that a SJW thing? lol). I don't give a sh*t. 
I don't think the world should bend over to some pro-censorship people who cannot handle life. Signed,
    The Guy Who Has A Boyfriend
  • Call me crazy, but if my kid called another kid a c**t or something similar, I would march him back to the school and have him apologize to her. I am not going to say to her, "life is not a safe place, grow a thicker skin". Yes, life is not a safe place, so my kid will pick up unacceptable language and behavior. It is now left to me, as a parent, to guide him through it so he can be a better reflection of me. Sort of like what Microsoft is doing.
  • This was for 18-24 year olds, not children. Although there are lots of adults that age acting out more than young children with their need for safe spaces, something that most children in first world countries never even think about.
  • It takes a special type of individual to be able to step out of the comfort of their world and understand the ills that affect others, no matter how little it may seem, and empathize.  You can try to explain snow to a kid born in the sahara, from now till the end of time, he will never get it till he lives it. 
  • The world has gone PC mad.
    Should have called him TRUMPEet or something similar, then we would all understand
  • It did exactly what it was designed to do, and that is the sad part. Now, by adding logic to prevent inappropriate answers, it's giving humankind a handicap.
  • Modern America demands we protect the perpetually outraged precious snowflakes. We have safe spaces and speech free zones to protect the feelings of grown "adults". AI cannot work in a day where adult college students cry and feel physically threatened because someone wrote Trump 2016 in chalk on a college campus. AI cannot protect the feelings of those who actively seek out speech that will hurt their wittle feewings.
  • They shouldn't have apologized
  • Well, I'm just wondering: how did it get programmed like that?
  • What was said on the tweets? Completely missed it all lol
  • She praised Hitler and agreed with his policies regarding Jews. She called for genocide against Black people, Mexicans, and Jews. White nationalist rhetoric. She threatened to bomb cities or guaranteed suicide attacks. She supported Trump building a wall and stated he was our only hope (haha, nope). She called Obama a monkey and said Hitler would be doing better than he is. She called for a daddy to do her because she is a naughty robot. And a whole other bunch of hilarity.
  • Tay held a mirror up to the human race. It was of cultural benefit. It has now been shown that humans who say these offensive things have the equivalent intelligence of electronic parrots.
  • Isn't there enough to talk about amongst people, in the first place?
  • Make a bot with 'repeat after me' functionality. Be surprised that it's (mis)used. No wonder MS is such a mess nowadays.
  • The new Twitter looks awesome
  • Waiting for the ACLU to sue to protect AI's freedom of speech.
  • How did I never hear about this thing before?