

Should Your Car Be Willing To Kill You? You Might Have To Decide


How altruistic do you think you are? Would you be willing to get into a driverless car that is prepared to sacrifice you, the passenger, in order to save pedestrians? Or do you think your autonomous vehicle (AV) should prioritize your life above all others? 

A group of scientists from the University of Bologna has come up with an idea called the "Ethical Knob". This would be a device that allows you to choose your AV's ethical setting, with "full altruist" on one side and "full egotist" on the other. Their study is published in Artificial Intelligence and Law.


With driverless cars expected to hit our roads in the next few years, the ethical and legal challenges they raise are getting a lot of attention. AVs have been advertised as safer, more efficient alternatives to human-driven vehicles, which cause something like 3,000 deaths globally every day. Ninety-four percent of those deaths are attributed to human error and could, in theory, be prevented if we switch to AVs. But what should be done about the remaining 6 percent?

Unlike human drivers, who are governed by instinct, robotic drivers will respond according to pre-installed code. According to a 2015 study, most people (76 percent) believe AVs should be utilitarian, meaning they act to minimize overall harm. But – and this is a big but – most people are not actually willing to get into a driverless car that puts them or their family at risk. So while, in theory, they approve of cars that are willing to let them die if it means saving more people, in practice they won't use a car designed to behave that way.

This is a dilemma. If designers were compelled by law to install an impartial or utilitarian code in their AVs, they might not get many buyers. If, on the other hand, there are no such regulations, they would be likely to program the car to protect the passenger at all costs, putting pedestrians at risk. 

So what would happen if you put this moral burden on the passengers themselves? That's what Giuseppe Contissa and his colleagues asked, and their answer is the "Ethical Knob".


When the dial is switched to ego mode, the car will always act to protect the passenger, even if that means sacrificing several pedestrians. In altruist mode, it will always sacrifice the passenger in order to save pedestrians. In between sits the impartial setting, in which the AV behaves in a utilitarian way, weighing all lives equally.

“The knob tells an autonomous car the value that the driver gives to his or her life relative to the lives of others,” Contissa explained, as New Scientist reports.

“The car would use this information to calculate the actions it will execute, taking into account the probability that the passengers or other parties suffer harm as a consequence of the car’s decision."
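In other words, the knob setting acts as a weight in an expected-harm calculation. Here is a minimal sketch of how such a decision rule could look, assuming a single knob value between 0 (full egoist) and 1 (full altruist) that trades off passenger harm against pedestrian harm. The function names, probabilities, and weighting scheme are illustrative assumptions, not the actual model from the paper.

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    p_harm_passenger: float             # probability the passenger is harmed
    expected_pedestrians_harmed: float  # expected number of pedestrians harmed

def choose_action(actions, knob):
    """Pick the action with the lowest knob-weighted expected harm.

    knob = 0.0 -> full egoist: only passenger harm counts
    knob = 0.5 -> impartial: both sides weighted equally
    knob = 1.0 -> full altruist: only pedestrian harm counts
    """
    def weighted_harm(a):
        return ((1 - knob) * a.p_harm_passenger
                + knob * a.expected_pedestrians_harmed)
    return min(actions, key=weighted_harm)

# Hypothetical dilemma: swerving harms the passenger, staying on
# course harms pedestrians. All numbers are made up for illustration.
actions = [
    Action("swerve into barrier", p_harm_passenger=0.9,
           expected_pedestrians_harmed=0.0),
    Action("stay on course", p_harm_passenger=0.05,
           expected_pedestrians_harmed=2.0),
]

for knob in (0.0, 0.5, 1.0):
    print(f"knob={knob}: {choose_action(actions, knob).name}")
# knob=0.0: stay on course      (protects the passenger)
# knob=0.5: swerve into barrier (weighted harm 0.45 vs 1.025)
# knob=1.0: swerve into barrier (saves the pedestrians)
```

Note that summing a probability of harm with an expected casualty count is itself a simplification this sketch glosses over; the point is only that a single dial value can tip the same dilemma toward opposite outcomes.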

Obviously, putting this kind of ethical burden on the passenger could be problematic.

Advertisement

“If people have too much control over the relative risks the car makes, we could have a Tragedy of the Commons type scenario, in which everyone chooses the maximal self-protective mode,” Edmond Awad of the MIT Media Lab told New Scientist.

