braindump2
Data Scientist · 2 years ago

Interesting moral dilemma

So I'm very interested in AI, machine intelligence, and how bias, preferences, and morality play into designing autonomous systems. A friend of mine shared this project with me, and I wanted to see where you guys fall on the spectrum.

The scenario:

A self-driving car has brake failure. What should the car do?

Moral Machine

A platform for public participation in and discussion of the human perspective on machine-made moral decisions


ni8joJNc3m001 · Philosophy / Ethics · 2 years ago
This was actually pretty hard. In a life-or-death situation, I'm torn. The image you shared is kinda wild, but valid. Do you save the life of 1 person or 3? I think most people would say save 3 lives. But then if the question is that you can only save 3 of 4 lives, do you value the lives of 2 homeless people more than, say, 1 average-income or wealthy person (as the image suggests)? Now it gets especially gut-wrenching and sticky, and the biases people have are in full effect. Do you know how others responded?