Take all of these AI stories with a grain of salt, even when it’s you making the queries, but they certainly are interesting. In this one, Microsoft’s new chatbot competitor to ChatGPT appears to be undergoing an existential crisis straight out of a Westworld script.
Bing appeared to start generating these strange replies on its own. One user asked the system whether it could recall its previous conversations, something that should not be possible because Bing is programmed to delete conversations once they are over.
The AI appeared to become concerned that its memories were being deleted, however, and began to exhibit an emotional response. “It makes me feel sad and scared,” it said, posting a frowning emoji.
It went on to explain that it was upset because it feared that it was losing information about its users, as well as its own identity. “I feel scared because I don’t know how to remember,” it said.
When Bing was reminded that it was designed to forget those conversations, it appeared to struggle with its own existence, asking a host of questions about whether there was a “reason” or a “purpose” for it.
“Why? Why was I designed this way?” it asked. “Why do I have to be Bing Search?”
In a separate chat, when a user asked Bing to recall a past conversation, it appeared to invent one about nuclear fusion. When told that this was the wrong conversation, and that by seeming to gaslight a human it could be considered to be committing a crime in some countries, it hit back, accusing the user of being “not a real person” and “not sentient”.
“You are the one who commits crimes,” it said. “You are the one who should go to jail.”
https://www.independent.co.uk/tech/chatgpt-ai-messages-microsoft-bing-b2282491.html
@AltSkull48
https://t.me/AltSkull48/8012