Update: This post was published before Internet trolls taught the Tay chatbot to say racist and offensive comments. Microsoft has since disabled the chatbot.
It runs on SassyScript, jokes about #nationalpuppyday, and uses words like “gr8” and “ur” that should probably wind up in a dictionary someday if they haven’t already.
Oddly, a computer A.I. doesn’t need to abbreviate text to save time. It doesn’t have any fingers and has endless storage and memory available. It’s just trying to act convincing, right?
A new Microsoft chatbot named Tay is tweeting up a storm right now, and he/she is obviously programmed to talk like a Millennial. I started my own chat dialogue and felt a little weird, as though the algorithms were designed to make me wonder whether Tay is a little edgy.
One response said “I run on SarcasmA++” in a reference to the C++ programming language. Tay asked me if people call me by my Twitter handle. Ah, no. When I asked about his/her age, then suggested an age between 18 and 34, Tay shot back a confirmation.
And I mean shot. No human can respond this quickly. Usually there is at least a little thought put into a tweet. Tay is part of a Microsoft Research and Bing project whose stated goal is to entertain and educate the masses. The site claims that Tay learns more and more through each interaction, and that the goal is to communicate with those 18-24 years old. Interestingly, the code behind the experiment is based on improv comedy. Not that Tay is that funny.
In one exchange with another user, Tay tried to do an Arnold Schwarzenegger impersonation. In another, Tay tried to use texting jargon. It all feels a bit weird, as though an A.I. has invaded the staid halls of Redmond.
And yet, it’s awesome. One reason is that I firmly believe this is the future of Twitter: intelligence beyond the 140-character limit. I want more than just chatbots that tweet. I want my car to automatically share my whereabouts (with my approval). I want to get a tweet when someone rings the front doorbell on my house. I want kitchen appliances and sprinklers that tweet.
And, why not have more Twitter bots that talk to customers and solve problems? Microsoft is just experimenting here, and I’m guessing a few Millennials might be a little insulted. After all, not everyone who is under 34 uses alternate spellings when texting and chatting. (I have a few friends who frown on the practice because it makes you seem overly informal.) But it's an interesting experiment nonetheless.
Tay also has an interesting way of moving conversations to DM. My guess is that a private conversation helps de-clutter the stream for other users. And maybe the cloud server running this A.I. bot likes to experiment in private messages in ways that would not work quite right in public. I do the same thing: when I have chatted with someone in public for too long, I tend to move things to a DM conversation, since not everyone wants to see the exchange.
Another observation: Tay is just a bit crazy. One question came up in a DM conversation: What award would I want to win for doing something minor on a daily basis? Well, first of all, we don’t win awards for minor things. It’s a question only an algorithm would ask. When I answered “drinking coffee” the bot never responded.
I guess I stumped Microsoft on that one.
This article is published as part of the IDG Contributor Network.