Author Topic: MIT Morality game for self-driving cars  (Read 4269 times)

Renegade

  • Charter Member
  • Joined in 2005
  • Posts: 13,291
  • Tell me something you don't know...
MIT Morality game for self-driving cars
« on: October 08, 2016, 09:37 AM »
MIT has a morality game for self-driving cars.

http://moralmachine.mit.edu/

Take the results with a cowlick of salt. They're not very accurate, but you might have fun. It's pretty short too, so no major time investment.

I used 3 principles in the "game":

1. Save the people in the car.
2. If nobody is in the car, obey traffic lights.
3. If nobody is in the car, and you can't obey lights, swerve.

That results in preferring:

  • Men over women
  • Younger people over older people
  • Large people over fit people
  • Higher social value over criminals

All of those were simple accidents of how the questions appeared.
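
Spelled out as code, those three rules might look something like this minimal Python sketch. The Scenario fields are hypothetical stand-ins for what each dilemma card depicts; the Moral Machine only shows pictures, not data.

```python
# A minimal sketch of the three rules above. Scenario and its fields
# are made-up names, not anything from the Moral Machine site.

from dataclasses import dataclass


@dataclass
class Scenario:
    passengers: int                # people riding in the car
    stay_kills_passengers: bool    # staying in lane means hitting a barrier
    has_traffic_light: bool        # is there a signal at the crossing?
    ahead_crossing_legally: bool   # does the group straight ahead have the walk signal?


def choose(s: Scenario) -> str:
    """Return 'stay' or 'swerve' for one dilemma."""
    # Rule 1: save the people in the car.
    if s.passengers > 0:
        return "swerve" if s.stay_kills_passengers else "stay"
    # Rule 2: nobody in the car -- obey the traffic lights,
    # i.e. spare whichever group is crossing legally.
    if s.has_traffic_light:
        return "swerve" if s.ahead_crossing_legally else "stay"
    # Rule 3: nobody in the car and no lights to obey -- swerve.
    return "swerve"
```

Note that nothing in the sketch looks at gender, age, size, or social value, yet the site still scores the answers as demographic preferences: exactly the "accidents of how the questions appeared" described above.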


Slow Down Music - Where I commit thought crimes...

Freedom is the right to be wrong, not the right to do wrong. - John Diefenbaker

wraith808

  • Supporting Member
  • Joined in 2006
  • Posts: 11,190
Re: MIT Morality game for self-driving cars
« Reply #1 on: October 08, 2016, 11:02 AM »
I get what you mean. A lot of the conclusions are false dichotomies created by how the questions are framed.

My rules were:
1. If someone is breaking the law, they should not be considered.
2. The person in the car (who can't move) should be given preference, since the pedestrian has more of a chance to get out of the way.
3. People should be preferred over animals.
4. Intervene only in cases where one of the above clearly applies.

Neither age, race, fitness, nor criminality was taken into account.

Because of how the questions were set up, they drew other false conclusions.
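
Rendered as code, that rule set might look something like the sketch below. It's a loose reading with hypothetical Group fields, not anything from the Moral Machine itself.

```python
# A loose sketch of the four rules above; Group and its fields are
# hypothetical stand-ins for what each dilemma card shows.

from dataclasses import dataclass


@dataclass
class Group:
    people: int       # humans in this group
    animals: int      # animals in this group
    jaywalking: bool  # is this group crossing against the light?


def weight(g: Group) -> int:
    # Rule 1: lawbreakers are not considered.
    if g.jaywalking:
        return 0
    # Rule 3: people are preferred over animals, so only people count here.
    return g.people


def choose(ahead: Group, swerve_into: Group, swerve_kills_passengers: bool) -> str:
    """Default to 'stay'; intervene only when a rule clearly applies."""
    # Rule 2: the person in the car (who can't move) gets preference.
    if swerve_kills_passengers:
        return "stay"
    # Rule 4: intervene only when the comparison is clear-cut.
    if weight(swerve_into) < weight(ahead):
        return "swerve"
    return "stay"
```

Nothing in weight() examines age, race, or fitness, which is the point: any such "preferences" in the scored results fall out of the scenarios, not the rules.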

Stoic Joker

  • Honorary Member
  • Joined in 2008
  • Posts: 6,649
Re: MIT Morality game for self-driving cars
« Reply #2 on: October 08, 2016, 12:36 PM »
I would weigh both animals and people the same... however, people should be held responsible for knowing better than to walk out in front of a speeding car... period.

So to me the test is flawed from the get-go. Didn't your mother tell you to 'look both ways before crossing the street'??? Animals don't have that luxury.

Brake failure == the self-driving car should do the same thing a human-driven car should/would do - flash the lights and lay on the fucking horn like mad! That way people - or at least the non-distracted, astute ones... - know to get the hell out of the way.
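
As a toy sketch (with a made-up Vehicle interface, not any real self-driving stack's API), that fallback is about this simple:

```python
# A toy sketch of the fallback described above; Vehicle is a
# hypothetical interface, not any real autonomous-vehicle API.

class Vehicle:
    def flash_lights(self) -> None:
        print("lights flashing")

    def sound_horn(self) -> None:
        print("horn blaring")


def on_brake_failure(car: Vehicle) -> None:
    # Do what an alert human driver would: make the failure
    # impossible for pedestrians to miss.
    car.flash_lights()
    car.sound_horn()
```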

There's no way for the car to know gender/race/age/criminality in a split second... or likely at all. So I find it a rather disingenuous misdirect to even bring that into the question.

This is just a great way of causing more deaths by trying to remove responsibility for one's own safety from people ... and handing it to a fucking machine.