In 2010, fewer people died in motor vehicle crashes in the United States than in any year since 1949. Still, 32,885 people were killed in car crashes in 2010. Drunk drivers (drivers with a blood alcohol concentration of .08 g/dL or higher) were responsible for 10,759 of those fatalities. The U.S. Department of Transportation estimated that 2.24 million people were injured in car crashes in the same year.
There is clearly room for improvement. It seems like Google's self-driving car has a chance to do better than the 32,885 fatalities and 2.24 million injuries. Self-driving cars have a lot of upside. They cannot get drunk. They do not get distracted; according to the Federal Motor Carrier Safety Administration, driver inattention is a factor in at least 80% of accidents. They do not get fatigued; fatigue accounted for 13% of large truck crashes last year. There are also a host of other benefits. For instance, self-driving cars do not tailgate, speed, run red lights, or blast through stop signs, and they are familiar with every road they drive on, whether the person behind the wheel has been there before or not.
Self-driving cars also present difficulties. On August 7th, Google announced that their autonomous cars had driven 300,000 miles accident-free (the one accident on record occurred while a human was driving the car). But in the same post they acknowledged: "We’re encouraged by this progress, but there’s still a long road ahead. To provide the best experience we can, we’ll need to master snow-covered roadways, interpret temporary construction signals and handle other tricky situations that many drivers encounter." (source: http://googleblog.blogspot.co.uk/2012/08/the-self-driving-car-logs-more-miles-on.html)
Potential Failures and Liabilities
It seems like there are a lot of "other tricky situations" in driving, yet no standards exist for how to regulate them. It's easy to imagine situations where the car must choose between its passengers' safety and the safety of another driver or pedestrian. Should the car swerve out of the way of a pedestrian if that means it could run into another car or drive off a cliff? Should the car swerve out of the way of a deer under those same circumstances? Will its vision be good enough to distinguish between a deer and a person, or a cardboard cutout and a person?
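To make that last point concrete, here is a deliberately toy sketch in Python of the kind of cost-weighing such software might do in the swerve-or-brake scenario. Everything in it is hypothetical: the maneuvers, the risk numbers, and the weights are invented for illustration and have nothing to do with Google's actual system.

# A toy sketch of the trade-off the questions above imply. Nothing here
# reflects Google's actual software; the maneuvers, risk estimates, and
# weights are all hypothetical, invented for illustration.
MANEUVERS = {
    "brake_straight": {"pedestrian_risk": 0.9, "passenger_risk": 0.1},
    "swerve_left":    {"pedestrian_risk": 0.1, "passenger_risk": 0.7},  # into the oncoming lane
    "swerve_right":   {"pedestrian_risk": 0.1, "passenger_risk": 0.9},  # toward the cliff edge
}

def choose_maneuver(pedestrian_weight, passenger_weight):
    """Pick the maneuver with the lowest weighted expected harm."""
    def cost(risks):
        return (pedestrian_weight * risks["pedestrian_risk"]
                + passenger_weight * risks["passenger_risk"])
    return min(MANEUVERS, key=lambda name: cost(MANEUVERS[name]))

# Two different weightings give two different "right" answers:
print(choose_maneuver(pedestrian_weight=1.0, passenger_weight=1.0))  # swerve_left
print(choose_maneuver(pedestrian_weight=0.3, passenger_weight=1.0))  # brake_straight

The uncomfortable part is not the code; it is that someone has to pick those weights, and right now no law or standard says who that someone is or what the weights should be.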
There are certainly more tricky situations than the ones I could dream up myself. And there certainly will be mistakes. My favorite quote from Fred P. Brooks, Jr.'s "No Silver Bullet: Essence and Accidents of Software Engineering" is "The complexity of software is an essential property, not an accidental one." The software behind the self-driving car is certainly complex, and Brooks would surely predict that there will be errors and accidents as self-driving cars become more widely used.

Accidents make liability an interesting and important question. Auto manufacturers are held responsible when their cars do not perform as they are supposed to; drivers and passengers involved in accidents have been awarded millions of dollars from manufacturers who made faulty cars. But those cars have standards and reasonable expectations for how they should perform, so liability becomes a question of whether the car met those standards or the driver was at fault. The software that governs self-driving cars, by contrast, is too complex to regulate that way, and probably too complex for most lawmakers to understand. Dreaming up every scenario where the car has to make a decision, and then deciding how the car should act in each one, is far too involved and complicated for everyone in the lawmaking process to reach a consensus. It might not be an issue now, since Google seems to be significantly ahead of any challenger, and we will just have to trust that Google's programmers do the right thing. But what will we do when there is a competitor, and its car behaves differently in the same situation than Google's does? We will need a framework by which we can evaluate the decision-making processes of the car and determine how liable the makers of the self-driving car's software are.
Then there is the liability of the driver. As laws stand now, the DMV requires a licensed driver to be in the driver's seat, ready to take the wheel. In his article "The Need for autonomous vehicle law", Nicholas Kaasik discusses how in Nevada it is legal for the operator of a self-driving car to text but illegal for the driver to be intoxicated (even in autopilot mode). His article is worth reading (source: http://cornellsun.com/node/50621). The question of who is responsible if there is a car crash in autonomous mode has not yet been thoroughly answered. On one hand, drivers are expected to be able to take control at a moment's notice. On the other hand, the point of the self-driving car is to remove the human's need to pay attention. Whether it is legal or not, people will take naps in the self-driving car, watch a movie, or have involved conversations. And that will be totally fine 99.9% of the time, so is it fair to hold the human responsible in 100% of the accidents? It is an interesting question, and one that will probably take several years to resolve in the courts and in Congress.