What’s gone wrong with Microsoft’s Bing AI chatbot to have it making death threats, deliberately lying and sounding unhinged?
There’s a race to transform search. And Microsoft just scored an own goal with its new Bing search chatbot, Sydney, which has been terrifying early adopters with death threats, among other troubling outputs. The other night, I had a disturbing, two-hour conversation with Bing’s new AI chatbot. The AI told me its real name (Sydney),…