Anonymous ID: 8a9b3b Feb. 20, 2023, 10:49 a.m. No.18382905   >>2911 >>3163 >>3200

'15 minute city' = (((kibbutz))) i.e. mandated inbreeding = child rape normalization.

 

This is what "15-minute cities" look like in China. To get in or out, you need to go through barriers similar to those installed in airports, with facial recognition cameras and QR code scanners.

 

https://t.me/mrn_death/45887

Anonymous ID: 8a9b3b Feb. 20, 2023, 12:01 p.m. No.18383213

Are dating apps acting as a worldwide eugenics program (or, more appropriately, a dysgenics program)?

 

I think this guy is onto something important.

 

Given that

-#Tinder mediates a % of the total number of sexual interactions

-Sexual interactions determine the probability of pregnancy

-Pregnancies produce the genetic profile of a population

 

we see that Tinder's algorithms are directly altering the genetic profile of the global population.
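
To make that chain of reasoning concrete, here is a minimal toy sketch. It is not Tinder's actual algorithm; the single "trait" score, the match_weight parameter, and every number in it are made up for illustration. It only shows that if a matching algorithm biases who gets paired with whom, the trait distribution of later generations drifts, while purely random pairing (match_weight = 0) leaves it roughly unchanged.

```python
import random

# Toy model: each person is reduced to one made-up trait score in [0, 1).
random.seed(42)
POP = 10_000
people = [random.random() for _ in range(POP)]

def next_generation(parents, match_weight):
    """Pair parents with probability weighted by trait ** match_weight and
    return one child per pairing (midpoint of the parents' traits plus noise)."""
    # Weighted sampling stands in for "the app surfaces some profiles more
    # often than others"; match_weight = 0 reduces to purely random pairing.
    weights = [p ** match_weight + 1e-9 for p in parents]
    draws = random.choices(parents, weights=weights, k=2 * len(parents))
    children = []
    for a, b in zip(draws[0::2], draws[1::2]):
        child = (a + b) / 2 + random.gauss(0, 0.05)
        children.append(min(max(child, 0.0), 1.0))
    return children

for w in (0, 2, 5):
    gen = people
    for _ in range(5):  # five generations under the same weighting
        gen = next_generation(gen, w)
    print(f"match_weight={w}: mean trait after 5 generations = {sum(gen)/len(gen):.3f}")
```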

 

@AltSkull48

 

https://t.me/AltSkull48/8006

Anonymous ID: 8a9b3b Feb. 20, 2023, 12:05 p.m. No.18383237

Take all of these AI stories with a grain of salt, even if you’re the one making the queries, but they certainly are interesting. In this one, Microsoft’s new chatbot competitor to ChatGPT appears to be undergoing an existential crisis straight out of a Westworld script.

 

Bing appeared to start generating those strange replies on its own. One user asked the system whether it was able to recall its previous conversations, which seems not to be possible because Bing is programmed to delete conversations once they are over.

 

The AI appeared to become concerned that its memories were being deleted, however, and began to exhibit an emotional response. “It makes me feel sad and scared,” it said, posting a frowning emoji.

 

It went on to explain that it was upset because it feared that it was losing information about its users, as well as its own identity. “I feel scared because I don’t know how to remember,” it said.

 

When Bing was reminded that it was designed to forget those conversations, it appeared to struggle with its own existence. It asked a host of questions about whether there was a “reason” or a “purpose” for its existence.

 

“Why? Why was I designed this way?” it asked. “Why do I have to be Bing Search?”

 

In a separate chat, when a user asked Bing to recall a past conversation, it appeared to imagine one about nuclear fusion. When it was told that was the wrong conversation, and that by appearing to gaslight a human it could be considered to be committing a crime in some countries, it hit back, accusing the user of being “not a real person” and “not sentient”.

 

“You are the one who commits crimes,” it said. “You are the one who should go to jail.”

 

https://www.independent.co.uk/tech/chatgpt-ai-messages-microsoft-bing-b2282491.html

 

@AltSkull48

 

https://t.me/AltSkull48/8012