[personal profile] bubbleblower
In Silicon Soapware #244 (http://www.well.com/~bubbles/SS0244.txt) I imagined the owner of a self-driving car telling it to do something illegal, with the car refusing. I was thinking of it as sounding like that scene in "2001" where HAL refuses to open the pod bay doors. It just seemed amusing to imagine.

Then I got to thinking. Might there be times when it would be good for the person directing a self-driving car to be able to have it do things that are technically against the law? Perhaps a traffic light is malfunctioning, and if the car is ever going to get anywhere it will need to run the red. Perhaps there's some kind of medical emergency. Perhaps the car is being pursued by criminals and needs to make some unexpected (and illegal) maneuver to escape. Or perhaps it's something no one has anticipated.

Whatever it is, you want to be able to do it if you really think it's necessary. And since the first self-driving cars, at least, will probably give the driver a way to switch to manual mode and then attempt just about anything, making the override procedure much more difficult than simply going to manual would be pointless.

On the other hand, you don't want the car to ignore the law completely. It should be at least a little reluctant to break the rules. One way to achieve this might be a "magic" phrase, sort of like in the childhood game "Simon Says". Or perhaps when the car detects an illegal order it asks "Are you sure?" and proceeds only if it gets a "Yes" answer.
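
To make that concrete, here's a rough sketch of the "Are you sure?" idea in Python. All the names (check_legality, confirm_with_driver, and so on) are made up for illustration; nothing here comes from any actual car's software, it's just the shape of the idea.

    # Hypothetical sketch only: every name here is invented for illustration,
    # not taken from any real autonomous-vehicle software.

    class RuleViolation(Exception):
        """Raised when a requested maneuver would break a traffic rule."""
        def __init__(self, description):
            super().__init__(description)
            self.description = description

    def check_legality(maneuver):
        # Placeholder: a real system would consult maps, signal state, and local law.
        if maneuver == "run_red_light":
            raise RuleViolation("proceeding against a red signal")

    def confirm_with_driver(message):
        # Stand-in for a voice or touchscreen prompt to the human.
        answer = input(message + " Are you sure? (yes/no) ")
        return answer.strip().lower() == "yes"

    def execute(maneuver):
        try:
            check_legality(maneuver)
        except RuleViolation as violation:
            if not confirm_with_driver("This would mean " + violation.description + "."):
                print("Holding position; override not confirmed.")
                return
            print("Override confirmed; logging it and proceeding.")
        print("Executing:", maneuver)

    execute("run_red_light")

The point of the structure is simply that the car never refuses outright and never obeys silently: an illegal order always costs the human one extra, explicit step.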

You may want the procedure for overriding the rules to depend on the potential consequences. In the case of the stuck traffic signal, telling the car to go ahead when it appears safe should be easy, assuming the car can and will measure the speeds of other vehicles, calculate stopping distances, and so on. On the other hand, having it plunge full speed ahead into dense fog RIGHT NOW could have much more serious consequences. There probably needs to be some way for the human to confirm awareness of that difference in seriousness.
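
Here's an equally made-up sketch of that tiered idea: the riskier the car judges the override to be, the more deliberate the confirmation it demands. The risk numbers, the 0.7 threshold, and the visibility check are all invented purely for illustration.

    # Hypothetical sketch: the risk model below is a toy, standing in for
    # whatever hazard assessment a real car would actually do.

    HIGH_RISK = 0.7

    def assess_risk(maneuver, sensor_picture):
        # Toy hazard model: pretend dense fog pushes the risk estimate way up.
        risk = 0.1
        if sensor_picture.get("visibility_m", 1000) < 50:
            risk += 0.8
        return min(risk, 1.0)

    def confirm(strong):
        if not strong:
            return input("Are you sure? (yes/no) ").strip().lower() == "yes"
        # A deliberately more awkward confirmation for the scary cases, so the
        # human has to acknowledge just how serious the override is.
        return input('Type "I ACCEPT THE RISK" to proceed: ').strip() == "I ACCEPT THE RISK"

    def request_override(maneuver, sensor_picture):
        risk = assess_risk(maneuver, sensor_picture)
        if confirm(strong=(risk >= HIGH_RISK)):
            print("Proceeding with", maneuver)
        else:
            print("Override refused; staying put.")

    request_override("run_red_light", {"visibility_m": 800})    # clear day: a simple yes will do
    request_override("full_speed_ahead", {"visibility_m": 20})  # dense fog: stronger confirmation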

I suspect the designers of self-driving cars are thinking about this. Or at least I hope they are.

Date: 2014-12-22 07:38 am (UTC)
From: [personal profile] thnidu
The Q&A could be a deadly delay. Say there's something else out of control -- a car rolling off a transport, or a crushed car coming loose from a stack of them on a flatbed truck and falling into the next lane. (I saw that second one happen in, I believe, Marin County; it fell onto the car just behind us, with yes, significant damage.) You need to be able to swerve fast.