Robot
Law two:
- A robot must obey the orders given to it by human beings except
where such orders would conflict with the first law.
Law three:
- A robot must protect its own existence as long as such
protection does not conflict with the first or the second law.
1) Safety
- Law one states that a robot may not injure a human being or,
through inaction, allow a human being to come to harm. This
raises the question of accountability: if someone's safety is
compromised by a robot, who should be blamed? The robot, the
agent using the robot, or the maker/inventor?
2) Emotional Component
- Given how quickly technology progresses nowadays, it is not
completely impossible for robots to develop emotions.
If a problem arises because the robot deviates from the laws
specified, then the maker or inventor of the machine should be
blameworthy, since the violation means the robot was not programmed
well. On the other hand, if the robot thinks for itself, then
whatever decision it makes and whatever consequences it may bring,
the robot itself should be held responsible.