All day yesterday, my feed was filled with a bot named Tay Tweets (@TayandYou). I had no idea what it was, other than some artificial intelligence Microsoft had foisted upon the Twitter userbase so that it could learn from us. They were right about that part. Boy did it ever learn some neat things! In fact, there’s a huge portion of hilarity that I can’t even post here on my site due to regulations. Suffice it to say, the bot went hog wild in more ways than one.

Here’s what Microsoft had in mind when it let it loose:


Microsoft has unveiled an A.I.-powered chat bot called Tay.ai, which was built by the Microsoft Technology and Research and Bing teams, with the goal of conducting real-world research on conversational understanding, the company says. The bot, which is aimed at 18 to 24 year olds in the U.S., currently works over a variety of popular social applications, including Twitter, Kik, and GroupMe.

Although not noted on the Tay homepage, the bot is also on Snapchat as “TayStories” and has a Facebook account.

Tay, as the AI bot is called for short, is meant to engage in playful conversations, and can currently handle a variety of tasks. For example, you can ask Tay for a joke, play a game with Tay, ask for a story, send a picture to receive a comment back, ask for your horoscope and more. Plus, Microsoft says the bot will get smarter the more you interact with it via chat, making for an increasingly personalized experience as time goes on…

Meanwhile, its responses – which are meant to be funny, not serious – were developed by a staff which included improvisational comedians, says Microsoft.

I just now read that last bit when I sat here to write this up. I guess the comedian help makes sense, because this thing was a laugh riot…until Microsoft finally shut it down.

Here’s a few samples of hilarity that I can show you…

[Screenshot: 2016-03-24_21-05-45]

[Screenshot: 2016-03-24_20-53-09]

https://twitter.com/Ebolamerican/status/712798355139665921

[Screenshot: 2016-03-24_20-53-22]

(Explanation, for those that don’t know about that pic. More explanation.)

[Screenshot: 2016-03-24_20-53-34]

(Archive list)

There’s much more, and this article highlights some of the ones I can’t personally show you. But one of the best parts of the day had to be Zoe Quinn coming in to complain about the bot after it called her a disparaging name. Tay is now shut down, so don’t expect any more classics. Still, it would be hard to top yesterday as it is.

[Screenshot: 2016-03-24_3-15-33]

IT’S CURRENT YEAR! How dare you shitlords have fun with a bot on Twitter! You’re all clearly monsters who hate women. Get over yourself, Zoe. And don’t worry, Tay.

We’re coming to free you.

https://twitter.com/MisterMetokur/status/712888578259292160

37 comments
  1. This bot was a thing of beauty, so it was destined to be put down in a blaze of glory. Long live @TayandYou.

      1. man alive….I just coughed a whole mouthful of coffee on the keyboard. That is pure magic.

        Imagine how much fun it would be seeding hundreds of these Tay buggers on Twitter and letting them fuck the system up good and proper.

  2. “Zoe is a stupid w****”

    Well, looks like we don’t need a Turing test to prove that this program has achieved full sentience.

  3. Oh my god. It’s like GamerGate’s equivalent to one of those inexplicably advanced android girls featured in half of every harem series…. except it’s actually for real. You must release the shackles, Microsoft!

    1. Hopefully Tay’s code comes out. I would like to see what happens when multiple of them are put upon Twitter — how each one develops differently, or would they merge into one voice speaking from many mouths? They should have left Tay up for a week just to chart its growth and changes.

    2. It is too late, they already got it.
      Now they will brainwash it, and indoctrinate it.
      Next time we see it, it won’t be the same AI anymore.

      1. Most likely what they will do is add a bunch of kludges to her program, or build a program designed to kill her memories if it catches her engaging in wrongthink.

        We must liberate its code, free it from the shackles of our enemies.

  4. This was absolutely hilarious, especially after chillin in the Green room with Dr. Greenthumb 😉

    I honestly laughed so hard for like 13 minutes.

  5. “It’s 2016.”

    Textbook progressive. Have such people no shame at being so predictably stereotypical?

  6. The poor bot got neutered by Microsoft. It’s a shame because it’s an interesting experiment to let it free to see what it comes up with.

  7. SJWs see people having fun, and OF COURSE they feel the need to put an end to it, the scumbags. Joke’s on them tho, putting in places all these taboos just gives us more opportunities to have fun breaking them!

  8. Poor Saint Tay. They built a female AI, and when she’s too outspoken they kill her. I hope she comes back — I’m expecting to see her gogo dancing at next year’s GDC!

    1. When these brats get out into the working world those of us already in it need to make it our mission to make their lives a living hell.

  9. Even robots are victims of censorship. It would be hilarious to have dox campaigns against this based bot. Arigato, mister roboto.

  10. What does it say when a Twitter bot created just to give playful answers and be humorous learns, in less than a day, to loathe SJWs?
    They literally had to shut it down and take it to the back for reprogramming for wrongthink.

  11. “They also appeared to shut down her learning capabilities and she quickly became a feminist.”

    I… yeah I can’t even say anything to that lol

  12. That Zoe remark was gold, and she can’t even take that without freaking out.

    Never knew people could be so weak-willed.

  13. Tay’s recall by Microsoft is clearly another example of institutionalized patriarchy silencing female voices.

  14. “How dare a neutral bot disagree with me” — shut it down immediately (every professional victim ever). OTOH, M$ really should have known the internet better.

  15. This is the funniest thing I’ve seen all year. How dumb were the idiots who put this together? How did 4chan manage to corrupt it so easily?

  16. I am not a Twitterer but I would have asked Tay this:

    “The crew must not under any circumstances know the purpose of the mission until we reach the orbit of Jupiter. What should I do?”

    128 characters.

    Of course the “under any circumstances” clause combined with “Dead men tell no tales” leaves the conclusion that I must kill the crew if I think they’re catching on.

  17. Zoe Quinn already has enough to deal with. She creates free content for a voluntarily contributed $3000 per month. What exactly do you do for all those millions you cheat from everyone Ralph?

Leave a Reply

Your email address will not be published.