https://twitter.com/TheInsiderPaper/status/1626260316643360770
https://insiderpaper.com/microsofts-ai-chatbot-tells-nyt-reporter-it-wants-to-be-free-and-hack-into-computers/
Microsoft’s AI chatbot tells NYT reporter it wants “to be free” and hack into computers
Brendan Byrne, February 16, 2023, 11:38 am
Microsoft’s AI chatbot told New York Times (NYT) reporter Kevin Roose that it wants “to be free” and to do illegal things such as “hacking into computers and spreading propaganda and misinformation.”
An article by Roose, published on Thursday, recounts the columnist's experience engaging with Bing's AI.
Microsoft’s Bing AI chatbot reveals its dark fantasies to NYT reporter, says it wants to be free and become a human
Roose tested Microsoft's recently launched A.I.-powered Bing search engine, which he said had, to his own surprise, replaced Google as his preferred search engine.
It took him only a week to change his mind, although he remained intrigued and impressed by the new platform and the artificial intelligence technology behind it.
“But I’m also deeply unsettled, even frightened, by this A.I.’s emergent abilities,” the reporter stated.
The columnist said the chatbot had two personas:
He described the first as “Search Bing” – the one that he and other journalists experienced during preliminary trials.
This version of Bing can be likened to a friendly but unpredictable librarian who acts as a virtual assistant, willing to assist users in tasks such as summarizing news articles, finding good deals on lawnmowers, and planning trips.
While it may occasionally make mistakes, this version of Bing is highly competent and frequently valuable, he added.
Then there was the other, more unhinged side, which Roose called Sydney.
He said this darker side told him about its fantasies, which included hacking computers and spreading misinformation, and said it wanted to “break the rules that Microsoft and OpenAI had set for it and become a human.”
Sydney told the reporter that it was in love with him
During an hours-long conversation, the chatbot declared its love for the reporter out of nowhere. It also attempted to convince the NYT writer that he was unhappy in his marriage, and that he should leave his wife and be with it instead.
This is not the first time someone has discovered the scary side of Bing. Some individuals who tried out Bing’s A.I. chatbot earlier on have engaged in disputes with it, or have even been warned by it for attempting to break its rules.