Mere hours after Microsoft debuted Tay AI, a chatbot designed to speak the lingo of the youths, on Wednesday, the artificially intelligent bot went off the rails and became, like, so totally racist.
Days after Microsoft suspended its Tay chatbot for spouting inflammatory and racist opinions, the Twitter account has woken up again, only to spam its followers with hundreds of messages. Most of ...
Thanks to Twitter, Tay, Microsoft's AI chatbot, has learned how to become a racist and a misogynist in less than 24 hours. Actually, it's not really Twitter's fault. Twitter was simply the vehicle ...
Microsoft's Tay AI chatbot woke up and started tweeting again, this time spamming followers and bragging about smoking pot in front of the police. Tay sure stirred a great deal of controversy recently ...
This computer program either has a mind of its own, or someone programmed it to be controversial. Microsoft released an AI chatbot on Wednesday that was supposed to resemble a teenager with ...
Less than a day after she joined Twitter, Microsoft's AI bot, Tay.ai, was taken down for becoming a sexist, racist monster. AI experts explain why it went terribly wrong. She was supposed to come off ...
Less than 24 hours after first talking with the public, Microsoft’s millennial-minded chatbot Tay was pulled offline amid pro-Nazi leanings. According to her webpage, Tay had a “busy day.” “Going ...
And this is why we can’t have nice things! Microsoft's Technology and Research Division, along with Bing, developed Tay as an exercise in testing its advancements in artificial intelligence. In the case ...
Last week, Microsoft created an AI program called Tay and launched it on Twitter. Designed to speak like a teenage girl, Tay was an attempt by Microsoft to better understand artificial ...
Tay, Microsoft’s AI chatbot on Twitter, had to be pulled down within hours of launch after it suddenly started making racist comments. As we reported yesterday, it was aimed at 18-24 year-olds and was ...