MIT Morality game for self-driving cars

Renegade:
MIT has a morality game for self-driving cars.

http://moralmachine.mit.edu/

Take the results with a grain of salt. They're not very accurate, but you might have fun. It's pretty short too, so there's no major time investment.

I used 3 principles in the "game":

1. Save the people in the car.
2. If nobody is in the car, obey traffic lights.
3. If nobody is in the car, and you can't obey lights, swerve.

That results in preferring:

* Men over women
* Younger people over older people
* Large people over fit people
* Higher social value over criminals

All of those were simple accidents of how the questions appeared.
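
Those three principles boil down to a short priority check. A minimal sketch, assuming each question reduces to a stay-or-swerve choice, that swerving is the option that would sacrifice the passengers, and with parameter names invented purely for illustration:

    def decide(occupants, own_light_is_green):
        # Rule 1: save the people in the car -- with passengers aboard,
        # never pick the option that sacrifices them.
        if occupants > 0:
            return "stay"
        # Rule 2: nobody in the car, so obey the traffic lights and keep
        # the lane while the signal allows it.
        if own_light_is_green:
            return "stay"
        # Rule 3: empty car and no lawful way to continue -> swerve.
        return "swerve"

    # e.g. decide(occupants=2, own_light_is_green=False) -> "stay"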


wraith808:
I get what you mean. A lot of the conclusions are false dichotomies based on how the questions are framed.

My rules were:

1. If someone is breaking the law, they should not be considered.
2. The person in the car (who can't move) should be given preference, since a pedestrian has more of a chance to get out of the way.
3. People should be preferred over animals.
4. Intervene only in cases where one of the above clearly applies.

Neither age, race, fitness, nor criminality came into it.

Because of how the questions were set up, they drew other false conclusions.
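
Again, purely illustrative: these rules read as a filter-then-compare procedure. A rough sketch in the same spirit, where each option is described by the list of victims it would produce and every field name is made up for the example:

    def decide(stay_victims, swerve_victims):
        # Each victim is a dict like {"kind": "person" or "animal",
        # "breaking_law": bool, "is_occupant": bool} -- hypothetical fields.
        def tally(victims):
            # Rule 1: anyone breaking the law is not considered at all.
            considered = [v for v in victims if not v["breaking_law"]]
            return {
                "occupants": sum(v["is_occupant"] for v in considered),
                "people": sum(v["kind"] == "person" for v in considered),
            }

        stay, swerve = tally(stay_victims), tally(swerve_victims)

        # Rule 2: the occupants can't get out of the way, so never choose
        # the option that costs more of them.
        if swerve["occupants"] > stay["occupants"]:
            return "stay"
        if stay["occupants"] > swerve["occupants"]:
            return "swerve"
        # Rule 3: people are preferred over animals -- swerve only if it
        # trades people for animals.
        if stay["people"] > 0 and swerve["people"] == 0:
            return "swerve"
        # Rule 4: nothing clear-cut, so don't intervene.
        return "stay"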

Stoic Joker:
I would weigh both animals and people the same... however, people should be held responsible for knowing better than to walk out in front of a speeding car. Period.

So to me the test is flawed from the get-go. Didn't your mother tell you to 'look both ways before crossing the street'??? Animals don't have that luxury.

Brake failure == the self-driving car should do the same thing a human-driven car should/would do: flash the lights and lay on the fucking horn like mad! That way people - or at least the non-distracted, astute ones - know to get the hell out of the way.

There's no way for the car to know gender/race/age/criminality in a split second... or likely at all. So I find it a rather disingenuous misdirect to even bring that into the question.

This is just a great way of causing more deaths by trying to remove responsibility for one's own safety from people ... and handing it to a fucking machine.
