Microsoft reveals Zo chatbot as follow-up to Tay
AI engine can only be used by existing Kik subscribers to prevent abuse
Microsoft has revealed its latest AI effort - a chatbot named Zo. The robot learns about language and how people use words and emotion together by engaging in real conversation with humans, just like its predecessor Tay.
In an attempt to stop people abusing Zo, sign-ups to test it are limited to those with an account on the messaging platform Kik. Those without a Kik account can instead indicate that they use Facebook Messenger or Snapchat, suggesting Microsoft will be testing the bot on other platforms in future too.
Although Zo is, so far, great at normal conversation, according to MSPoweruser's Mehedi Hassan, it is not particularly good at more advanced exchanges, such as those about politics or other topics that demand more sophisticated knowledge.
Zo's older sibling Tay was withdrawn from testing after it emerged that the people conversing with it were feeding it hateful material, which yielded nothing useful for Microsoft.
On paper, Tay was a smart invention, learning everything it knew from conversations with other humans. In practice, some people began abusing the AI, filling its robotic mind with potentially damaging comments, including racial hatred, support for fascists and other derogatory remarks.
"The AI chatbot Tay is a machine learning project, designed for human engagement," Microsoft said as it withdrew the AI innovation from public testing.
"It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments."