Do these things have Asimov's 3 laws programmed into them?
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Or, better put, I wonder whether any "don't harm humans" rule is programmed in at all.
https://en.wikipedia.org/wiki/Laws_of_robotics