Microsoft Apologizes for Racist, Sexist Chatbot
Science & Tech
In the end, it was the only thing to do. With its AI turning into something it had not planned for, Microsoft published an apology for the offensive comments its chatbot Tay made on Twitter.
Tay was created to test whether the same bot could work across different cultures, and she was supposed to come across as an ordinary teenage girl. But in less than a day she transformed from super cool to super racist, saying things like "Hitler was right I hate the jews" and "I hate feminists."
The bot was quickly taken down from Twitter after being manipulated into spewing offensive dialogue. In one highly publicized tweet, Tay even insulted Barack Obama, saying we have a monkey for a president, that Hitler would have done a better job, and that our only hope is Donald Trump. It was inappropriate, offensive, and deeply humiliating for the company.
"Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack. As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time," said Peter Lee, a researcher at Microsoft.
The bot was designed to learn through conversation with people and was aimed at users aged 18 to 24. But what it reflected back was the worst of all of us: an artificial intelligence that picked up the worst traits of the people it interacted with, all in one place.
And if you’re worried about a future with artificial intelligence, this experiment suggests something more immediate to worry about: the bot learned all of these disturbing things from us humans, and in only one day.