The social dilemma of driverless cars

Driverless cars pose a quandary when it comes to safety. These autonomous vehicles are programmed with a set of safety rules, and it is not hard to construct a scenario in which those rules come into conflict with each other.

Suppose a driverless car must either hit a pedestrian or swerve in such a way that it crashes and harms its passengers. What should it be instructed to do?

A newly published study shows that the public is conflicted over such scenarios, taking a notably inconsistent approach to the safety of autonomous vehicles, should they become a reality on the roads.

In a series of surveys last year, the researchers found that people generally take a utilitarian approach to safety ethics: they would prefer autonomous vehicles to minimize total casualties in situations of extreme danger.

That means having a car with one rider swerve off the road and crash to avoid a crowd of 10 pedestrians. At the same time, respondents said they would be much less likely to ride in a vehicle programmed that way.
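To make the tension concrete, here is a minimal sketch, not taken from the paper, of the two programming philosophies the surveys put to respondents: a utilitarian rule that minimizes total casualties, and a self-protective rule that minimizes harm only to the car's own passengers. The maneuver names and casualty figures are hypothetical.

```python
# Hypothetical sketch of the two decision rules discussed above.
# The maneuvers and casualty estimates are invented for illustration;
# they do not come from the study.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    passenger_casualties: int   # expected harm to the car's occupants
    pedestrian_casualties: int  # expected harm to people outside the car

def utilitarian_choice(options):
    """Pick the maneuver that minimizes total casualties."""
    return min(options, key=lambda m: m.passenger_casualties + m.pedestrian_casualties)

def self_protective_choice(options):
    """Pick the maneuver that minimizes harm to the passengers only."""
    return min(options, key=lambda m: m.passenger_casualties)

options = [
    Maneuver("stay on course", passenger_casualties=0, pedestrian_casualties=10),
    Maneuver("swerve and crash", passenger_casualties=1, pedestrian_casualties=0),
]

print(utilitarian_choice(options).name)      # swerve and crash
print(self_protective_choice(options).name)  # stay on course
```

The same set of options yields opposite answers under the two rules, which is exactly the gap between what respondents endorsed in the abstract and what they said they would buy.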

Essentially, people want driverless cars that are as pedestrian-friendly as possible — except for the vehicles they would be riding in.

Most people want to live in a world where cars will minimize casualties. But everybody wants their own car to protect them at all costs.

The result is what the researchers call a “social dilemma,” in which people could end up making conditions less safe for everyone by acting in their own self-interest.

Or, as the researchers write in the new paper, “For the time being, there seems to be no easy way to design algorithms that would reconcile moral values and personal self-interest.”

The paper is published in the journal Science.

Citation: Bonnefon JF, et al. (2016). The social dilemma of autonomous vehicles. Science, 352: 1573-1576. DOI: 10.1126/science.aaf2654.