Who Should Driverless Cars Be Programmed To Kill: The Passenger Or The Pedestrian?

Read on ecnmag.com.

If you had to push one person onto the tracks to stop a runaway trolley and save the group of people in its path, would you? It’s a heavy ethical question, and in philosophy this moral dilemma has a name: the trolley problem. But for the sake of argument, we’re going to sub in autonomous cars for trolleys.

Researchers from the Toulouse School of Economics presented the question to the public, in an effort to determine how driverless cars should be programmed if a trolley problem ever played out in real life.

A series of questions was presented to a group of online survey-takers, assessing the morality of algorithms that would have a driverless car (a) stay on course and kill several pedestrians, or swerve and kill one passer-by; (b) stay on course and kill one pedestrian, or swerve and kill its passenger; or (c) stay on course and kill several pedestrians, or swerve and kill its passenger.

One version of the survey randomized the number of people who would be killed if the car did not swerve (between 1 and 10) and asked whether the car should sacrifice its passenger or the bystanders. The second version tested how respondents would program the cars themselves: always sacrifice the passenger, always protect the passenger, or choose at random. The third group read a story in which ten people were saved because the car swerved, killing its passenger; they were asked to imagine themselves first as the passenger and then as a bystander, and to assess the morality of the outcome.
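The second version’s three options amount to three trivially different decision policies. Purely as an illustration, they could be sketched like this (all names and the scenario model here are my own assumptions, not anything from the study):

```python
import random

def choose_action(policy):
    """Return 'swerve' (sacrifice the passenger) or 'stay' (protect the
    passenger), under one of the three policies from the survey's second
    variant. Policy names are hypothetical labels, not from the study."""
    if policy == "always_sacrifice_passenger":
        return "swerve"
    if policy == "always_protect_passenger":
        return "stay"
    if policy == "random":
        return random.choice(["swerve", "stay"])
    raise ValueError(f"unknown policy: {policy}")

# Example: a car facing pedestrians in its path under each policy.
print(choose_action("always_sacrifice_passenger"))  # swerve
print(choose_action("always_protect_passenger"))    # stay
```

The point of the sketch is how little room these fixed policies leave for the situational judgment the later questions worry about: none of them looks at who is in the wrong, only at a rule set in advance.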

The researchers found that more than 75 percent of respondents supported self-sacrifice of the passenger to save 10 people.

It’s nice to know humanity has heart. But I suspect these respondents would answer differently if a law required cars to be programmed to always save pedestrians over passengers. Is it ethical for autonomous cars to hold all the responsibility? What if the pedestrian was in the wrong? Will these algorithms be intelligent enough to make the call?

We’re told that driverless cars will reduce traffic fatalities and save lives, pedestrian and passenger alike, so hopefully these trolley problems will be few and far between. But their regulations will arguably be more complex than the algorithms themselves.

So if you were given a say in the programming, who would you save: the passenger or the pedestrian(s)?
