Go full altruist and sacrifice yourself, or sacrifice others in order to survive? An "ethical knob" that spells dilemma!
With the rising popularity of self-driving cars, it still isn't clear who's responsible when a vehicle crashes. Unlike humans, these cars rely only on code, not instinct, when faced with danger.
And while researchers are trying hard to teach these cars ethics, there's still a push and pull over who should survive in an accident. Should we go utilitarian and minimize total casualties? Or should we make sure that we're not the ones killed when an accident happens? Now, a team of researchers may have a solution.
“We wanted to explore what would happen if the control and the responsibility for a car’s actions were given back to the driver,” says Giuseppe Contissa at the University of Bologna in Italy.
They've designed a dial with settings ranging from “full altruist” to “full egoist,” with an impartial middle ground. “The knob tells an autonomous car the value that the driver gives to his or her life relative to the lives of others,” says Contissa. “The car would use this information to calculate the actions it will execute, taking into account the probability that the passengers or other parties suffer harm as a consequence of the car’s decision.”
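The idea Contissa describes could be sketched as a weighted expected-harm calculation. The sketch below is purely illustrative, not the researchers' actual model: the function names, the 0-to-1 knob scale, and the probability figures are all assumptions made up for this example.

```python
# Hypothetical sketch of an "ethical knob": knob=1.0 is full egoist
# (only passenger harm counts), knob=0.0 is full altruist (only others'
# harm counts), knob=0.5 is impartial. Numbers are invented for illustration.

def expected_cost(action, knob):
    """Knob-weighted expected harm of a candidate maneuver."""
    return knob * action["p_passenger_harm"] + (1.0 - knob) * action["p_other_harm"]

def choose_action(actions, knob):
    # Pick the maneuver with the lowest knob-weighted expected harm.
    return min(actions, key=lambda a: expected_cost(a, knob))

actions = [
    {"name": "swerve", "p_passenger_harm": 0.6, "p_other_harm": 0.1},
    {"name": "brake",  "p_passenger_harm": 0.2, "p_other_harm": 0.4},
]

print(choose_action(actions, knob=0.9)["name"])  # egoist setting -> "brake"
print(choose_action(actions, knob=0.1)["name"])  # altruist setting -> "swerve"
```

With the same two candidate maneuvers, an egoist setting protects the passenger while an altruist setting protects others, which is exactly the trade-off the knob is meant to expose.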
However, this can also be a problem. “If people have too much control over the relative risks the car makes, we could have a Tragedy of the Commons type scenario, in which everyone chooses the maximal self-protective mode,” says Edmond Awad of the MIT Media Lab, lead researcher on the Moral Machine project there. And having the impartial option could let people dodge moral responsibility altogether.
“It is too early to decide whether this would be a good solution,” says Awad.