braindump2
Data Scientist

Interesting moral dilemma

So I'm very interested in AI, machine intelligence, and how bias, preferences, and morality play into designing autonomous systems. A friend of mine shared this project with me, and I wanted to see where you guys fall on the spectrum.


The scenario:


A self-driving car has brake failure. What should the car do?


https://www.moralmachine.net/

Moral Machine
A platform for public participation in and discussion of the human perspective on machine-made moral decisions
ni8joJNc3m001
Philosophy / Ethics
This was actually pretty hard. In a life-or-death situation, I'm torn. The image you shared is kinda wild but valid. Do you save the life of 1 person or of 3? I think most people would say save 3 lives. But then if the question becomes that you can only save 3 of 4 lives, do you value the lives of 2 homeless people more than, say, 1 average-income or wealthy person (as the image suggests)? Now it gets especially gut-wrenching and sticky, and people's biases come into full effect. Do you know how others responded?
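Just to make the weighting point concrete (this is my own toy illustration, not anything from the Moral Machine project): a minimal Python sketch of a purely utilitarian decision rule, where the hypothetical per-person weights are exactly where the biases you're describing get baked in.

```python
# Toy sketch (NOT from the Moral Machine project): a utilitarian rule
# scores each possible outcome by summing per-person "weights".
# The weights are where preferences and prejudice sneak in; every
# name and number below is hypothetical.

def choose_outcome(outcomes, weight):
    """Pick the outcome whose survivors carry the highest total weight."""
    return max(outcomes, key=lambda survivors: sum(weight(p) for p in survivors))

# With equal weights, the rule reduces to "save the most lives":
equal = lambda person: 1.0
print(choose_outcome(
    [["pedestrian"], ["passenger_a", "passenger_b", "passenger_c"]], equal))
# -> ['passenger_a', 'passenger_b', 'passenger_c']  (save 3 over 1)

# But once weights differ by wealth, age, etc., the same rule quietly
# values some lives over others -- the bias this thread is debating:
biased = lambda person: 3.0 if person == "wealthy_passenger" else 1.0
print(choose_outcome(
    [["wealthy_passenger"], ["homeless_a", "homeless_b"]], biased))
# -> ['wealthy_passenger']  (3.0 beats 1.0 + 1.0, flipping "save more lives")
```

Same decision procedure both times; only the weight function changed. That's the uncomfortable part: the "moral" choice ends up being whatever the weights say it is.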
