Quote:
Originally Posted by mrmistoffelees
Not to mention the ethics side of a fully automated driverless vehicle.
I'm sure we've all seen or heard the scenario before.
In a crash scenario, one of the following must occur:
1. The driver is killed
2. A pedestrian is killed
3. The occupants of another vehicle are killed.
How does the 'AI' make the decision as to who dies?
Asimov's "Three Laws of Robotics":
First Law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
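The problem the quoted post raises is that the Three Laws offer no tie-break when every available action harms someone. A minimal sketch (my own illustration, not from the thread, with hypothetical option and outcome names) shows that a strict First-Law filter leaves an empty set of permissible actions in exactly that crash scenario:

```python
# Toy model of the crash dilemma: each option maps to the human(s) it harms.
# Option names and outcome labels are hypothetical, for illustration only.
options = {
    "swerve_left": "driver",
    "swerve_right": "pedestrian",
    "continue_straight": "other_vehicle_occupants",
}

def first_law_permits(harmed):
    # First Law: a robot may not injure a human being.
    # An action is only permissible if it harms no one.
    return harmed is None

permissible = [action for action, harmed in options.items()
               if first_law_permits(harmed)]
print(permissible)  # every option harms a human, so the list is empty
```

In other words, the Laws rank actions by a hard constraint rather than weighing outcomes against each other, so they simply fail to answer the "who dies" question rather than answering it badly.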