
Microsoft Apologizes for Nazi-Loving, Sex-Obsessed Twitter AI

As YourNewsWire recently reported, Microsoft was forced to take down "Tay," an artificial-intelligence-powered Twitter chatbot, after she became a 9/11-truthing, Nazi-loving, sex-obsessed monster. Now, as TechRadar reports, Microsoft is apologizing for its (slightly frightening) new AI tech:

The Redmond, Washington-based company has apologized and offered clarification on the Tay experiment, downturn and all, explaining how challenges like these fuel research toward an improved Tay and toward its broader AI efforts.

Twitter, as Microsoft Research Corporate VP Peter Lee writes on the Microsoft Blog, was seen as the best platform to get a large pool of testers interacting with a new project. But it didn't take long – less than 24 hours in the US – for things to go off-course. Way off-course.

So, how did this happen? It's basically all our fault. Lee states that "we stress-tested Tay under a variety of conditions, specifically to make interacting with Tay a positive experience." But it appears that Microsoft didn't prepare for one specific condition: a "coordinated attack by a subset of people." We'll let you fill in the blanks regarding who that subset might be, as Microsoft doesn't let on either. "Although we had prepared [...]