Interesting article in Focus magazine a month or two back about why your driverless car could choose to kill you.
It was generally all about how it would make a decision when all outcomes result in someone's death.
So for example, if you were driving and had the choice of hitting a car with 3 convicts on the run or a car with 2 ordinary people in it - assuming you had all that info, you would choose to hit the car with the three convicts - however, the logic for a driverless car would be to take whichever choice resulted in the fewest deaths...
There was also an interesting thought experiment about a guy stood by the points lever on a train track - a train is coming down the track, and if he lets it continue on its current course it will kill 5 people - if he flicks the lever it will kill only 1 person.
So the geezer pulls the lever and is then done for the murder of that one person - whereas if he had done nothing, he would not be guilty of murdering the 5 people, because he did nothing.
Just made me think that's all!