MIT Reveals Who Self-Driving Cars Should Kill: The Cat, The Elderly, or The Baby?

What would you do in this situation…?

A runaway trolley is hurtling down a set of tracks, and by pulling a lever you can choose whether it kills five people or just one.

It’s a classic problem and a test of where our morality lies.

Extend this to cars: would we prefer an out-of-control vehicle to mow down an elderly person or a child?

MIT’s Moral Machine was designed to test how we view these moral problems in light of the emergence of self-driving cars, and, having crowdsourced over 40 million moral decisions from millions of people across 233 countries and territories, it now has some answers.

  • Should a self-driving vehicle 'choose' to hit a human or a pet?

  • More lives, or fewer?

  • Women or men?

  • The young or the old?

  • Law-abiding citizens, or criminals?

The results are fascinating…