Moral Dilemma of Self-Driving Cars: Which Lives to Save in a Crash

Would you ride in a self-driving car that has been programmed to sacrifice its passengers to save the lives of others in the event of a serious accident? New research has found that people generally approve of autonomous vehicles (AVs) governed by so-called utilitarian ethics, which would seek to minimize the total number of deaths in a crash, even if that means harming the vehicle's own occupants. But it gets more complicated than that. The study, based on surveys of U.S. residents, found that most respondents would not want to ride in such vehicles themselves and did not favor regulations that would enforce utilitarian algorithms on driverless cars.
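In plain terms, a utilitarian crash algorithm is just a rule that picks whichever emergency maneuver is expected to kill the fewest people overall. The toy Python sketch below illustrates that idea only; it is not drawn from the study or from any real AV software, and the maneuver names, fatality estimates, and the `expected_deaths` helper are all hypothetical.

```python
# Toy illustration of a utilitarian crash decision rule: among the
# maneuvers physically available in an emergency, pick the one with the
# lowest expected total number of deaths, counting passengers and
# pedestrians equally. All numbers here are hypothetical.

def expected_deaths(outcome):
    """Expected total fatalities for one candidate maneuver."""
    return outcome["passenger_deaths"] + outcome["pedestrian_deaths"]

def utilitarian_choice(maneuvers):
    """Return the maneuver that minimizes expected total deaths."""
    return min(maneuvers, key=expected_deaths)

# Hypothetical emergency: swerving sacrifices the passenger but spares
# three pedestrians; staying the course does the opposite.
maneuvers = [
    {"name": "stay_course", "passenger_deaths": 0.0, "pedestrian_deaths": 3.0},
    {"name": "swerve", "passenger_deaths": 1.0, "pedestrian_deaths": 0.0},
]

print(utilitarian_choice(maneuvers)["name"])  # -> "swerve"
```

A rule like this, which sacrifices the passenger whenever doing so saves more lives in total, is exactly what survey respondents endorsed in the abstract yet were reluctant to accept in a car they would ride in themselves.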

The researchers say this moral dilemma suggests that attempts to minimize loss of life by mandating utilitarian algorithms could actually increase casualties by slowing the adoption of this lifesaving technology.

Read the full story at Live Science