The First Law: A robot may not harm, nor through inaction allow to come to harm, a human being.
The Second Law: A robot must obey the orders of a human being, where those orders do not conflict with the First Law.
The Third Law: A robot must act to protect its own existence, where this will not conflict with the First or Second Laws.

Isaac Asimov was the only one of the Big Three (Asimov, Heinlein, Clarke) that I ever actually saw in person. It was at a talk whose subject I don't even really recall, but I do remember …