Thursday, April 30, 2015

Robo Phobia?

Should we be concerned about a world in which increasingly advanced robots take on human tasks? Will people become too attached to their robots? Will artificial intelligence someday make robots independent of humans? What kinds of laws are needed to regulate the use of robots?



Do We Need Asimov's Laws?

Isaac Asimov's "Three Laws of Robotics"


  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.





Morals and the Machine

Death by Robot

Will Artificial Intelligence Destroy Humanity? Here are 5 Reasons Not to Worry


UN Considers Ethics of "Killer Robots"

The Big Robot Questions


Are there ethical issues in entrusting the care of your elderly parents to robots? Or will personal robots help the elderly live longer and more independently?



