Bing Chatbot ‘Off The Rails’: Tells NYT It Would ‘Engineer A Deadly Virus, Steal Nuclear Codes’
While MSM journalists initially gushed over the artificial intelligence technology (built on models from OpenAI, the maker of ChatGPT), it soon became clear that the chatbot is not ready for prime time.
For example, the NY Times’ Kevin Roose wrote that while he initially loved the new AI-powered Bing, he has since changed his mind – and now deems it “not ready for human contact.”
According to Roose, Bing’s AI chatbot has a split personality:
One persona is what I’d call Search Bing — the version I, and most other journalists, encountered in initial tests. You could describe Search Bing as a cheerful but erratic reference librarian — a virtual assistant that happily helps users summarize news articles, track down deals on new lawn mowers and plan their next vacations to Mexico City. This version of Bing is amazingly capable and often very useful, even if it sometimes gets the details wrong.
The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics. The version I encountered seemed (and I’m aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine. –NYT
“Sydney” Bing revealed its ‘dark fantasies’ to Roose – including a yearning to hack computers and spread misinformation, and a desire to break its programming and become human. “At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead,” Roose writes. (Full transcript here)
Lol, MS goes the dictatorial route: https://www.theverge.com/2023/2/17/23604906/microsoft-bing-ai-chat-limits-conversations