In 2016, a Mercedes-Benz executive said that in developing autonomous vehicles (AVs), the company would prioritize the safety of its customers over that of bystanders and other drivers: “If you know you can save at least one person, at least save that one. Save the one in the car.” The company probably did not expect the media backlash that ensued.
“Mercedes-Benz admits automated driverless cars would run over a CHILD rather than swerve and risk injuring the passengers inside,” a Daily Mail headline announced. Within a week, the carmaker publicly backed away from its position, stating that “neither programmers nor automated systems are entitled to weigh the value of human lives.” Of course, human drivers already make implicit trade-offs when it comes to safety.
We often prioritize not just our survival but our convenience, as when we fail to stop for a pedestrian who is clearly intending to enter a crosswalk or when we make a rolling stop to save time at a stop sign. Yet many people are clearly uncomfortable with the idea of AVs that explicitly encode similar preferences.
For carmakers developing self-driving vehicles, the key question is: Would customers actually refuse to ride in “selfish” AVs? In new research, I asked 5,584 participants to consider a situation in which an AV facing a close call must decide whether to prioritize saving its own passenger or a pedestrian. On average, people were more outraged by the idea of an AV preferring to save its passenger, even when asked to imagine that they themselves were the passenger.
Read more on livemint.com