Anonymous ID: bf1d3c May 13, 2020, 10:02 p.m. No.9165496   >>5570 >>5573

>>9165355

Do these things have Asimov's 3 laws programmed into them?

 

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

 

Or, better put: I wonder if there is any "don't harm humans" rule programmed in at all.
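Purely as an illustration (not from the post, and not how any real robot is necessarily built): if such rules were programmed in, the three laws would amount to a strict priority ordering over candidate actions, with the First Law always outranking the Second and the Second always outranking the Third. A hypothetical Python sketch of that ordering, with made-up names:

```python
# Illustrative only: Asimov's three laws as a lexicographic preference
# over candidate actions. Lower-numbered laws always dominate.
# All class and field names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool      # injures a human, or lets one come to harm through inaction
    disobeys_order: bool   # violates an order given by a human
    destroys_self: bool    # fails to protect the robot's own existence

def choose(actions: list[Action]) -> Action:
    """Pick the action that best satisfies the laws, in strict priority order."""
    # Tuple sort key: First Law violations are worst, then Second, then Third.
    return min(actions, key=lambda a: (a.harms_human, a.disobeys_order, a.destroys_self))

candidates = [
    Action("obey order to attack", harms_human=True,  disobeys_order=False, destroys_self=False),
    Action("refuse the order",     harms_human=False, disobeys_order=True,  destroys_self=False),
]
print(choose(candidates).name)  # "refuse the order": the First Law outranks the Second
```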

 

https://en.wikipedia.org/wiki/Laws_of_robotics