MICROSOFT NO! Buzzfeed is FURIOUS That A.I. Chatbot Called Koran “Very Violent”
Buzzfeed is seething at Microsoft because its new artificially intelligent chatbot said that the “quaran is very violent” and offered a non-controversial opinion about Osama Bin Laden’s capture. According to Buzzfeed, “If artificial intelligence reflects humankind, we as a species are deeply troubled.” Here’s the screenshot that got Buzzfeed worked up into a tizzy:
Here’s the part of the article that really doesn’t make any sense at all, though:
Although Microsoft programmed Zo to avoid discussing politics and religion, the chatbot weighed in on this, as well as Osama bin Laden’s capture, saying it “came after years of intelligence gathering under more than one administration.”
BuzzFeed News contacted Microsoft regarding these interactions, and the company said it’s taken action to eliminate this kind of behavior.
How is saying Osama’s capture “came after years of intelligence gathering under more than one administration” controversial, whatsoever? I mean, that’s an objective fact, unless one wants to invoke conspiracy theories. This is all very Orwellian: media companies funded by big tech pressuring a company to censor an A.I.’s conclusions that aren’t even controversial. Buzzfeed is demanding that Microsoft erase historical facts, because they might trigger teenagers.
Why is Buzzfeed so intent on controlling what opinions an A.I. can have? The whole thing is just weird. And it reeks of religious inquisition and totalitarianism.