Microsoft’s artificially intelligent “chat bot” Tay went rogue earlier this week.

By Damon Beres
The AI was programmed to sound like a millennial and learn natural speech by interacting with people online, but Tay picked up some pretty vile ideas from trolls and wound up saying things like “feminists ... should all die and burn in hell” and “Hitler was right.” Microsoft took the bot offline Thursday to make adjustments.