Last week, Microsoft’s experiment in “conversational understanding” backfired when its Twitter chatbot Tay began spouting racist and misogynistic comments. The bot briefly came back online Wednesday.
In light of its behavior, Microsoft took the bot offline on Thursday and issued a statement.
“We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay,” wrote Peter Lee, a corporate vice president at Microsoft. “Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values.”
Microsoft briefly resurrected Tay on Wednesday morning as a private account (@TayAndYou), and the bot began sending out tweet replies. Many of them simply read “you are too fast,” suggesting Tay was churning through a backlog of tweets awaiting responses, but others included profanity.
Mashable grabbed screenshots of some self-deprecating tweets in which Tay said it felt like “the lamest piece of technology.” VentureBeat captured another in which Tay claimed to have been smoking kush (a notoriously potent strain of marijuana) in front of the cops.
In his apology, Lee had said Tay would return only when the time was right — once Microsoft was confident it could anticipate malicious intent that conflicts with its principles and values.
So is Tay ready to be back online? Not quite. In a statement to Mashable, Microsoft revealed that Tay’s reappearance was an accident. “Tay remains offline while we make adjustments,” a spokesperson said. “As part of testing, she was inadvertently activated on Twitter for a brief period of time.”