
Experts offer insights on ethical questions regarding self-driving cars

In two years, self-driving cars will be reliable enough for people to sleep in them while they travel to their destination, Tesla founder Elon Musk recently said. But such a development raises multiple ethical questions, and experts are struggling to find ways to make this future a responsible one.

Several Stanford University scholars gave their insights on how self-driving technology will change the world and debated the ‘hows’ and ‘ifs’ of the matter. Talking to the Stanford News Service, they highlighted the most significant ethical questions and concerns when it comes to letting algorithms take the wheel of the car.

The ‘not productive’ trolley problem

One of the biggest ethical questions regarding self-driving cars is how an algorithm would choose between protecting the lives of the driver, the passengers, and other road users. In other words, the trolley problem:


There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person on the side track. You have two options:

1. Do nothing, and the trolley kills the five people on the main track.
2. Pull the lever, diverting the trolley onto the side track where it will kill one person.

Which is the most ethical choice?

Engineers of autonomous cars will now have to tackle this question and other, more complicated scenarios, said professors Ken Taylor and Rob Reich, the director of Stanford’s McCoy Family Center for Ethics in Society.

“Everyone is saying how driverless cars will take the problematic human out of the equation. But we think of humans as moral decision-makers. Can artificial intelligence actually replace our capacities as moral agents?” Taylor asked.

“It won’t be just the choice between killing one or killing five. Will these cars optimise for overall human welfare, or will the algorithms prioritise passenger safety or those on the road? Or imagine if automakers decide to put this decision into the consumers’ hands, and have them choose whose safety to prioritise. Things get a lot trickier,” Reich, who is also a professor of political science, added.
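To see why the choice of objective matters, here is a deliberately simplified sketch. It is not drawn from any automaker's software; the scenario, casualty counts, weights, and function names are all hypothetical, and real systems reason over uncertain sensor data rather than clean numbers. The point is only that the same decision logic picks different actions depending on whose safety the objective weights most heavily:

```python
# Toy illustration: how the chosen objective changes a trolley-style decision.
# All numbers and names are hypothetical.

def expected_harm(option, passenger_weight=1.0):
    """Score an option as weighted expected casualties (lower is better)."""
    return (passenger_weight * option["passenger_casualties"]
            + option["other_casualties"])

options = [
    {"name": "stay on course", "passenger_casualties": 0, "other_casualties": 5},
    {"name": "swerve",         "passenger_casualties": 1, "other_casualties": 0},
]

# Objective A: optimise for overall human welfare (everyone counts equally).
welfare_choice = min(options, key=lambda o: expected_harm(o, passenger_weight=1.0))

# Objective B: prioritise the passenger (their harm counts ten times as much).
passenger_first = min(options, key=lambda o: expected_harm(o, passenger_weight=10.0))

print(welfare_choice["name"])    # -> "swerve": one casualty instead of five
print(passenger_first["name"])   # -> "stay on course": the passenger is never sacrificed
```

The code itself is trivial; what matters is that the weight is a policy decision someone has to make, whether an engineer, a regulator, or, as Reich suggests, the consumer.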

But Stephen Zoepf, executive director of the Center for Automotive Research at Stanford (CARS), along with several other Stanford scholars, including mechanical engineering Professor Chris Gerdes, argue that agonising over the trolley problem isn’t helpful.


“It’s not productive. People make all sorts of bad decisions. If there is a way to improve on that with driverless cars, why wouldn’t we?” Zoepf said.

He argued that the more important ethical question is what level of risk society is willing to incur with self-driving cars on the road. For the past several months, Zoepf and his CARS colleagues have been working on a project on the ethical programming of autonomous vehicles.
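One way to frame Zoepf's question is at the level of fleet-wide casualty rates rather than individual crash scenarios. The back-of-the-envelope sketch below uses placeholder figures, not real statistics, and the per-mile rates are exactly the kind of numbers society, not a programmer, would have to judge acceptable:

```python
# Toy fleet-level risk comparison (all figures are hypothetical placeholders).

human_fatalities_per_100m_miles = 1.2      # placeholder baseline for human drivers
av_fatalities_per_100m_miles = 0.9         # placeholder estimate for a self-driving fleet
annual_miles_driven = 3_000_000_000_000    # placeholder: roughly 3 trillion miles per year

def expected_fatalities(rate_per_100m_miles, miles):
    """Convert a per-100-million-mile rate into expected fatalities over `miles`."""
    return rate_per_100m_miles * miles / 100_000_000

lives_saved = (expected_fatalities(human_fatalities_per_100m_miles, annual_miles_driven)
               - expected_fatalities(av_fatalities_per_100m_miles, annual_miles_driven))

# Even a modest per-mile improvement scales to thousands of lives per year,
# which is the intuition behind Zoepf's "why wouldn't we?" argument.
print(f"Hypothetical lives saved per year: {lives_saved:,.0f}")
```

With these made-up rates the difference works out to several thousand lives a year, which is why Zoepf and his colleagues see the deployment threshold, not the trolley problem, as the ethical question worth agonising over.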

Loss of jobs, loss of meaning

If self-driving vehicles become the norm, another ethical concern is the number of jobs that will be lost. Tesla has already announced and previewed an electric semi truck that could one day also be autonomous, a prospect that sent shivers down the spines of the millions of truck drivers who haul cargo on U.S. roads. Musk himself has acknowledged the issue, recently saying that many people derive their meaning from their employment, so no longer being needed would also be a psychological problem.

“There will be fewer and fewer jobs that a robot cannot do better. I want to be clear. These are not things that I wish would happen, these are things that I think probably will happen. With automation, there will come abundance. Almost everything will get very cheap. The much harder challenge is: ‘How do people then have meaning?’ A lot of people derive their meaning from their employment. If you’re not needed, if there’s not a need for your labour, what’s your meaning, do you feel useless? I think ultimately we will have to have some kind of universal basic income. I don’t think we’re going to have a choice,” Elon Musk said at the World Government Summit in Dubai.

More than 3.5 million truck drivers haul cargo on U.S. roads, according to the latest statistics by the American Trucking Associations, a trade association for the U.S. trucking industry.

“We have to be prepared for this job loss and know how to deal with it. That’s part of the ethical responsibility of society. What do we do with people who are displaced? But it is not only the transformation in labour. It is also the transformation in transport, private and public. We must plan for that, too,” Margaret Levi, professor of political science and the director of the Center for Advanced Study in the Behavioral Sciences, said.

She argued both tech companies and governments can and must take steps to prepare for those losses.

Whatever their stance on particular issues with self-driving cars, the scholars agree that there needs to be greater collaboration among disciplines during the development of this and other revolutionary technologies.

John Beckett
