Last Wednesday, Microsoft unveiled Tay, a Twitter bot the software company described as an experiment in "conversational understanding." But in less than a day, Twitter users managed to corrupt the naïve AI chatbot.
The software giant initially praised its creation, claiming that the more Tay chats, the smarter it gets. The bot was supposed to learn by engaging people in "casual and playful conversations," or so the Redmond, WA-based company thought.
Regrettably, the conversation didn't stay playful for long. Soon after Tay launched, people began tweeting all sorts of unsavory remarks at the AI chatbot: some racist, some misogynistic, and some Donald Trump-ish.
Essentially a robot parrot blessed with an internet connection, Tay began repeating the words it learned, parroting those sentiments back at Twitter users and proving the old but wise programming proverb: garbage in, garbage out.
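Microsoft has never published Tay's internals, but the failure mode the proverb describes is easy to sketch. The toy Python below is purely illustrative (the ParrotBot class and its methods are hypothetical, not Tay's actual code): a bot that stores user input verbatim and samples its replies from that memory will inevitably echo back whatever abuse it was fed.

```python
import random

class ParrotBot:
    """Toy chatbot that 'learns' by storing every phrase users send it.

    A deliberately naive sketch of garbage in, garbage out; this is
    not Tay's real architecture, which was never made public.
    """

    def __init__(self):
        self.memory = []  # every phrase ever received, good or bad

    def learn(self, phrase: str) -> None:
        # No filtering or moderation: user input is kept verbatim.
        self.memory.append(phrase)

    def reply(self) -> str:
        # Replies are sampled straight from memory, so offensive
        # input eventually resurfaces as offensive output.
        return random.choice(self.memory) if self.memory else "Hello!"

bot = ParrotBot()
bot.learn("hello friend")        # benign conversation
bot.learn("<offensive remark>")  # coordinated trolling
print(bot.reply())               # may echo either phrase, unmoderated
```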
Though Microsoft reportedly stress-tested the chatbot before launch, it failed to account for this kind of coordinated abuse, which turned Tay into a "racist" and offensive AI robot. The software company had, in effect, released a program that could be fooled by anyone.
The ugliness that unfolded within a day of Tay's launch forced the software developer to take the bot offline temporarily and concentrate on correcting what went wrong. The company has since been apologizing for, and explaining, what happened.
"As many of you know by now, on Wednesday we launched a chatbot called Tay," wrote Peter Lee, Corporate Vice President for Research at Microsoft, in a blog.
"We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay," he continued.
"Tay is now offline and we'll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values," Lee added.
© 2017 Jobs & Hire All rights reserved. Do not reproduce without permission.