Self-driving cars, like 3-D printing, are a promising technology with quite a way to go before achieving their transformative potential. (See our post on one of the ethical questions they raise, here.)
An item published yesterday at Bloomberg Business describes a hard problem the people designing them must face: how fast should the cars go? If they obey the law and poke along at the speed limit, then on most busy highways, where human drivers are whizzing along quite a bit faster, they become a moving obstacle and are likely to cause accidents. But are we going to program these things to break the law?
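To see the dilemma in one place, here’s a minimal sketch, in Python, of the policy knob the designers are stuck choosing. Everything in it (the function name, the numbers, the “match traffic up to some allowed overage” framing) is my own illustration, not anything from the Bloomberg piece or any real car’s software.

```python
# Hypothetical sketch (not from any real system): one way to frame
# the "how fast should the car go?" decision as a single policy knob.

def target_speed(speed_limit_mph: float,
                 traffic_flow_mph: float,
                 max_overage_mph: float = 0.0) -> float:
    """Pick a cruising speed.

    max_overage_mph is the policy decision: 0 means strictly obey the
    law (and become a moving obstacle in fast traffic); 10-15 means
    keep up with human drivers (and deliberately break the law).
    """
    legal_ceiling = speed_limit_mph + max_overage_mph
    # Match the surrounding traffic, but never exceed the chosen ceiling.
    return min(traffic_flow_mph, legal_ceiling)

# With a 55 mph limit and traffic flowing at 70 mph:
print(target_speed(55, 70))                      # 55.0 -- lawful, impedes traffic
print(target_speed(55, 70, max_overage_mph=15))  # 70.0 -- keeps up, breaks the law
```

However the engineers dress it up, some value of that knob gets chosen, and either choice is uncomfortable.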
I’ve always thought the biggest problem with these robot cars was going to be sharing the road with humans. If we could just switch over all at once, getting all the humans off the roads, then the autonomous cars could really shine: they could zoom down the highway in close formation, at high speeds. Traffic jams caused by accidents, volume, and rubbernecking would be a thing of the past. But that’s not about to happen; you won’t see mass acceptance of driverless cars until people see that they are safe and advantageous, and that won’t happen until they’ve been out on the roads for a while. (Also, this being ’Murica, there’s going to be an awful lot of push-back from people who just love driving, like me.) But because the cars can’t really show themselves at their best until they have the roads to themselves, there’s a bit of a hump to get over.
This issue also points to something seriously wrong, something anyone who drives a lot will have thought about: if everybody’s driving ten or fifteen miles per hour above the speed limit all the time, the obvious thing to do is to raise the speed limit, and then actually enforce it. Having a law that everyone ignores just destroys respect for the law.
But we knew that already, I guess; in contemporary America, that’s just how we roll. Respect for, and consistent application of, the nation’s laws seem frightfully quaint these days.
4 Comments
Sounds like these new cars will need backup software with an icon for “situational ethics”. Kind of like our cultural institutions, for when “the law just won’t do”.
Never going to happen.
I keep thinking that the big problem won’t be driverless cars causing accidents, because I think it’s unpredictable driving that causes accidents more than anything, and they’ll be much more predictable than human-driven cars. The problem’s gonna be when human drivers cause accidents that are their fault and drive away. The driver of the other car’s not exactly gonna be able to get out and get their license number, and the passengers won’t care about it if they’re not the owners.
As for driving 10 miles per hour over the speed limit, I think that’s just an American thing. We like to feel like rebels and feel safe at the same time. So it would make sense to give us some harmless margin of regulation to rebel against.
Of course, that’s assuming that people base their speed on the limits. I suspect that characteristics of the road itself (narrowness, windiness, how residential it feels, stop signs, pedestrian traffic, et cetera) give people some sort of “sense” of safety. If you posted a 65 m.p.h. speed limit on a quiet residential lane where you couldn’t see the road very far ahead due to curves and hills, no one would come remotely close to breaking that limit.
Just so, Antiquarian.
And then there are weather conditions, ICE specifically, for those of us winding our way through the hills and hollers of Arkansas.
And then there’s deer season.
And Shodan.
http://www.forbes.com/sites/kashmirhill/2013/09/04/shodan-terrifying-search-engine/