  #7  
01-28-2013, 02:02 PM
ZaltysZ
Approved Member
 
Join Date: Sep 2008
Location: Lithuania
Posts: 426

Quote:
Originally Posted by KG26_Alpha
Isaac Asimov

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
The First Law has a fault, a possible dead end, because it does not cover situations in which a dilemma arises: saving one human causes harm to another, so acting violates the law, but doing nothing violates it too. Undefined behavior, anyone? That is the nastiest thing that can happen in software.
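
A rough Python sketch of that dead end (the scenario flags and function names are made up for illustration): encode the First Law as a yes/no rule over "act" vs. "do nothing", then feed it a scenario where each choice harms a human. Every branch violates the law, so the law defines no behavior at all.

Code:
# A minimal sketch, not anyone's real robot code: the First Law as a predicate.
# Hypothetical scenario flags: acting saves human A but injures human B,
# while doing nothing lets human A come to harm.

def violates_first_law(act: bool, action_harms: bool, inaction_harms: bool) -> bool:
    # "A robot may not injure a human being or, through inaction,
    #  allow a human being to come to harm."
    return action_harms if act else inaction_harms

def decide(action_harms: bool, inaction_harms: bool) -> bool:
    # Pick any choice that does not violate the law.
    for act in (True, False):
        if not violates_first_law(act, action_harms, inaction_harms):
            return act
    # Dead end: both acting and not acting violate the law.
    # The law specifies nothing for this case.
    raise RuntimeError("First Law dead end: every option harms a human")

# Saving one human injures another: both branches violate the law.
try:
    decide(action_harms=True, inaction_harms=True)
except RuntimeError as err:
    print(err)

In code you at least get an exception you can catch; the law as written gives no tie-breaking rule, so what the robot does in this case is simply unspecified.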