Microsoft-built chatbot aims to shame online Johns trolling for sex

From a story on the subject:

The chatbot, tested recently in Seattle, Atlanta, and Washington D.C., lurks behind fake online ads for sex posted by nonprofits working to combat human trafficking, and responds to text messages sent to the number listed. The software initially pretends to be the person in the ad, and can converse about its purported age, body, fetish services, and pricing. But if a would-be buyer signals an intent to purchase sex, the bot pivots sharply into a stern message … Microsoft employees built the bot in a philanthropic initiative called Project Intercept, in collaboration with nonprofits that hope it can reduce demand for sex workers, and the incentives for criminals to coerce people into the sex trade. The technology is not a product of Microsoft itself.

The introduction of this chatbot from Microsoft workers follows some notable mishaps with chatbots from the software giant. In March 2016, Microsoft launched a social bot called Tay that was designed to engage users in "casual and playful conversation." However, Tay quickly started spewing offensive terms, and Microsoft eventually issued an official apology.

More recently, Microsoft announced the Ruuh chatbot, which is available only in India, as well as the Zo bot. Then last month, Zo awkwardly identified the Windows OS as "spyware." Oops.

The goal of the new chatbot is an admirable one. Let's just hope these Microsoft staffers have more luck with it than the company has in the past.

Al Sacco

Al Sacco is content director of Future PLC's Mobile Technology Vertical. He is a veteran reporter, writer, reviewer, and editor who has professionally covered and evaluated IT and mobile technology, and countless associated gadgets and accessories, for more than a decade. You can keep up with Al on Twitter and Instagram.

  • Multinational corporations acting as moral arbiters of society... Why doesn't this sit well? On a less serious but still true note, if they want to reduce the demand for sex workers, they'd be better off investing in sex robots rather than angry nanny software!
  • So there's Microsoft and Microsoft employees?
  • On the one hand, I think this is admirable. On the other hand, I really can't help but laugh at the possibilities of Murphy's law coming into play. Why do I say the latter? Because history has a habit of repeating itself, as this article has indirectly shown. But I do sincerely hope that this actually works.
  • Instead of just shaming people (though they should be ashamed), why don't they help the government with this?  Johns may be pervs, but there's no guarantee they know that they're dealing with human trafficking, even though it's rampant in prostitution.  Come up with a bot that works in reverse to spot these types of ads, and point out the ones most likely to involve human trafficking, so the feds can set up stings.  Oh, and when they catch these SOB's, they need to start executing them.  Anyone who sells women and children is so depraved that they shouldn't be allowed to live.  Even if it doesn't discourage other traffickers, it will at least cut down their numbers!
  • The reason Microsoft is doing this is that about a year ago several employees were caught up in a sex-trafficking bust in the Seattle area. They were fairly high-up employees. This is a PR move on their part. Here's a little interesting reading: Here are a couple of quotes from the article: "Sumit Virmani, 42, director of World Wide Health for Microsoft, had a “date” that morning with “Raina” at the BellCentre Apartments in Bellevue, according to charging documents." "Microsoft and Amazon managers were among those arrested in The Review Board case. Both companies also have employees who have served on the Seattle Against Slavery board of directors"
  • So just because America is full of prudes, people should feel ashamed for enjoying sex? Worry about your bull**** gun laws instead, maybe.
  • They need to have this work on Xbox, with all those fake accounts trying to get you to go to a site for sex.