Quote:
Originally Posted by mrmistoffelees
Not to mention the ethics side of a fully automated driverless vehicle.
I'm sure we've all seen/heard the scenario before
In a crash scenario, one of the following must occur
1. The driver is killed
2. A pedestrian is killed
3. The occupants of another vehicle are killed.
How does 'AI' make the decision as to who dies?
I don't think it does decide; it just happens, and it has already happened. A pedestrian was killed, and it wasn't their fault; it was proven to be the car's. I remember one of the executives giving a statement trying to excuse it, saying something like, "well, the car has done 250,000 miles; how many people have you killed in your last 250k miles of driving?"