Microsoft is testing a new chat bot, Tay.ai, aimed primarily at 18- to 24-year-olds in the U.S.
Tay was built by Microsoft's Technology and Research and Bing teams as a way to conduct research on conversational understanding. The Bing team developed a similar conversational bot, Xiaoice, for the Chinese market back in 2014. Microsoft execs dubbed Xiaoice "Cortana's little sister."
According to Tay's About page, the chat bot was built "by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians." Anonymized public data is Tay's primary data source, the page says.
The bot is targeted specifically at the 18- to 24-year-old age group because that group comprises "the dominant users of mobile social chat services in the U.S.," the About page says.
If a user wants to "share" with Tay, the bot tracks that user's nickname, gender, favorite food, zip code and relationship status. Users can delete their profiles by submitting a request via the Tay.ai contact form.
Thanks to The Walking Cat (@h0x0d on Twitter), we know that Microsoft has built a bot framework for developers. Maybe Tay was developed with that framework (just a guess on my part)? Or is Tay an example of the kind of bot that Microsoft will enable others to build using its AI/machine-learning technologies?
Update (March 24): A day after launching Tay.ai, Microsoft took the bot offline after some users taught it to parrot racist and other inflammatory opinions. There's no word from Microsoft as to if and when Tay will return, or whether it will be updated to prevent this behavior in the future.
Update (March 25): Microsoft's official statement is that Tay is offline and won't be back until "we are confident we can better anticipate malicious intent that conflicts with our principles and values."