Sunday, October 25, 2015

The unanswerable question

According to Why Self-Driving Cars Must Be Programmed to Kill, automakers are releasing cars that make decisions that can lead to someone dying. Who is at fault, for insurance purposes, when the computer makes the decision has not been defined. And the way a human decides whether to avoid pedestrians instead of hitting a wall cannot be captured in a computer's algorithms.

I know that if I were unfortunate enough to have to make this type of decision, I couldn't honestly say beforehand what I would do. There are so many factors that go into how I would react, from my overall mood at that moment to everything involved with driving itself. One thing I know for sure is that I would always avoid hitting other humans over inanimate objects, but I never know what situation that reaction will place me in. My reaction may even lead to another issue through which I can avoid a collision entirely.

One question I would have for the automakers is this: once the car picks how it is going to avoid the collision, is it locked into that choice? The other option is to keep monitoring throughout the avoidance maneuver to see if conditions have changed. Many factors can change in a short period of time. A pedestrian may move in a way that aids your avoidance, so that you don't have to collide with a building. I feel this question, along with written laws and policies that clearly define blame and fault in accidents, should be answered before autonomous vehicles are legally allowed on a public roadway.
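To make the question concrete: here is a minimal sketch, in Python, of the "keep monitoring" option versus locking in the first choice. The `plan` and `avoid` helpers are hypothetical names of my own invention, not anything real automakers use, and the scenes are toy data; the point is only that re-planning on every sensor frame lets the car switch maneuvers when a pedestrian steps aside.

```python
def plan(obstacles):
    """Pick a maneuver for the current scene. Humans are always
    avoided in preference to inanimate objects (toy priority rule)."""
    if any(kind == "pedestrian" for kind, _ in obstacles):
        return "swerve"   # steer away from the person first
    if any(kind == "wall" for kind, _ in obstacles):
        return "brake"    # only an inanimate object left: brake
    return "continue"

def avoid(frames):
    """Re-plan on every sensor frame instead of locking in the
    first decision; returns the sequence of maneuvers chosen."""
    return [plan(frame) for frame in frames]

# Three successive sensor frames: (kind, distance in meters).
frames = [
    [("pedestrian", 10), ("wall", 12)],  # person ahead: swerve
    [("pedestrian", 8), ("wall", 10)],   # still there: keep swerving
    [("wall", 8)],                       # pedestrian stepped aside
]
print(avoid(frames))  # -> ['swerve', 'swerve', 'brake']
```

A car that locked in its first choice would keep swerving toward the wall even after the pedestrian moved; the re-planning version switches to braking on the third frame.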
