Microsoft launches AI chat bot, Tay.ai

Microsoft's Research and Bing teams have developed a chat bot, Tay.ai, aimed at 18- to 24-year-olds, the "dominant users of mobile social chat services in the U.S."

Microsoft is testing a new chat bot, Tay.ai, that is aimed primarily at 18- to 24-year-olds in the U.S.


Tay was built by the Microsoft Technology and Research and Bing teams as a way to conduct research on conversational understanding. The Bing team developed a similar conversational bot, Xiaoice, for the Chinese market back in 2014. Microsoft execs dubbed Xiaoice "Cortana's little sister."

According to Tay's About page, the chat bot was built "by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians." Anonymized public data is Tay's primary data source, the page says.

The reason the bot is targeted specifically at the 18- to 24-year-old age group is that this group is "the dominant users of mobile social chat services in the U.S.," the About page says.

If a user chooses to "share" with Tay, the bot tracks that user's nickname, gender, favorite food, ZIP code and relationship status. Users can delete their profiles by submitting a request via the Tay.ai contact form.

The bot's verified Twitter account is https://twitter.com/TayandYou. The bot is also on Snapchat, Kik and GroupMe.

Thanks to The Walking Cat (@h0x0d on Twitter), we know that Microsoft has built a bot framework for developers. Maybe Tay was developed with that framework (just a guess on my part)? Or is Tay an example of the kind of bot that Microsoft will enable others to build using its AI/machine-learning technologies?

Update (March 24): A day after launching Tay.ai, Microsoft took the bot offline after some users taught it to parrot racist and other inflammatory opinions. There's no word from Microsoft on if or when Tay will return, or whether it will be updated to prevent this behavior in the future.

Update (March 25): Microsoft's official statement is that Tay is offline and won't be back until "we are confident we can better anticipate malicious intent that conflicts with our principles and values."
