Do We Need Asimov's Laws?
Isaac Asimov's "Three Laws of Robotics":
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
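The laws form a strict hierarchy: each law yields to the ones above it. One way to make that precedence concrete is as a lexicographic ordering over a robot's candidate actions, as in the toy sketch below. This is my own illustration, not anything from Asimov or the articles linked here, and every name in it (`Action`, `choose_action`, the boolean fields) is hypothetical.

```python
# Toy sketch: the Three Laws read as a lexicographic preference.
# A First Law violation outweighs any number of Second Law violations,
# which in turn outweigh any Third Law violations.
from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    name: str
    harms_human: bool      # action injures a human, or inaction allows harm (Law 1)
    disobeys_order: bool   # action ignores an order given by a human (Law 2)
    endangers_self: bool   # action risks the robot's own existence (Law 3)

def choose_action(candidates: list[Action]) -> Action:
    # Tuples of booleans compare lexicographically (False < True),
    # so min() picks the action violating the highest-priority law least.
    return min(
        candidates,
        key=lambda a: (a.harms_human, a.disobeys_order, a.endangers_self),
    )

# Example: obeying a dangerous order beats self-preservation (Law 2 > Law 3).
options = [
    Action("obey_risky_order", harms_human=False, disobeys_order=False, endangers_self=True),
    Action("refuse_and_stay_safe", harms_human=False, disobeys_order=True, endangers_self=False),
]
print(choose_action(options).name)  # -> obey_risky_order
```

Even this toy version exposes the hard part: the three booleans quietly assume the robot can already predict whether an action will harm a human, which is exactly the judgment the readings below put in question.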
Further reading:
- Morals and the Machine
- Death by Robot
- Will Artificial Intelligence Destroy Humanity? Here are 5 Reasons Not to Worry
- UN Considers Ethics of "Killer Robots"
- The Big Robot Questions
Are there ethical issues in entrusting the care of your elderly parents to a robot? Or will personal robots help the elderly live longer and more independently?