
Ten questions NHTSA should ask Google about its self-driving car

Consumer Watchdog wants to put the brakes on removing the brakes ... and steering wheel.
Written by Greg Nichols, Contributing Writer

In comments for a public meeting held by the National Highway Traffic Safety Administration (NHTSA) about automated vehicle technology, John M. Simpson, Consumer Watchdog's Privacy Project director, called on the agency to require a steering wheel, brake, and accelerator in automated vehicles so that a human driver can take control of a self-driving robot car when necessary.

Questions have been raised about why the steering wheel requirement currently exists and whether NHTSA, which oversees highway safety, will remove those rules. Consumer Watchdog's comments are designed to put the brakes on any effort to do away with those controls until automated vehicle technology improves. The group believes Google isn't being transparent enough.

Google wants to take the human driver completely out of its self-driving robot car. Perhaps that will be possible decades from now, though it's not at all clear it can ever happen. In Consumer Watchdog's view, deploying a vehicle today without a steering wheel, brake, accelerator, and a human driver capable of intervening when something goes wrong is not merely foolhardy; it is dangerous, and NHTSA's autonomous vehicle guidelines must reflect this fact.

After reviewing results from seven companies that have been testing self-driving cars in California since September 2014, Consumer Watchdog claims the cars are still not capable of dealing reliably with everyday situations. The group says that reports filed after several collisions and incidents show the cars cannot always "see" pedestrians and cyclists, traffic lights, low-hanging branches, or the proximity of parked cars, creating too great a risk of serious accidents involving pedestrians and other vehicles.

Google logged 424,331 "self-driving" miles over the 15-month reporting period. A human driver had to take over 341 times during that period, which is where Consumer Watchdog sees cause for concern.

The breakdown of those disengagements points to some of the shortcomings that remain in self-driving car tech. Google's robot technology quit 13 times because it couldn't handle the weather conditions. Twenty-three times the driver took control because of reckless behavior by another driver, cyclist, or pedestrian. The car disengaged 119 times for a "perception discrepancy" and 55 times for "an unwanted maneuver of the vehicle," such as coming too close to a parked car. The human took over from Google's robot car three times because of road construction.
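For readers who want a rough sense of what those numbers mean, here is a minimal Python sketch that computes the implied miles per disengagement and the share of each listed cause. The figures are taken from the report numbers above; note that the categories described here are only a partial breakdown, so they do not sum to the 341 total takeovers.

    # Back-of-the-envelope check of the disengagement figures cited above.
    miles = 424_331      # "self-driving" miles over the 15-month period
    takeovers = 341      # total human takeovers reported

    # Partial breakdown of causes, as described in the article.
    categories = {
        "weather conditions": 13,
        "reckless behavior by others": 23,
        "perception discrepancy": 119,
        "unwanted maneuver of the vehicle": 55,
        "road construction": 3,
    }

    print(f"Miles per disengagement: {miles / takeovers:,.0f}")   # ~1,244
    for name, count in categories.items():
        print(f"{name}: {count} ({count / takeovers:.1%} of takeovers)")
    print(f"Other/unlisted causes: {takeovers - sum(categories.values())}")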

Simpson proposes ten questions he says the agency must ask Google about its self-driving robot car program. Here they are, taken from the NHTSA comments:

1. We understand the self-driving car cannot currently handle many common occurrences on the road, including heavy rain or snow, hand signals from a traffic cop, or gestures to communicate from other drivers. Will Google publish a complete list of real-life situations the cars cannot yet understand, and how you intend to deal with them?

2. What does Google envision happening if the computer "driver" suddenly goes offline with a passenger in the car, if the car has no steering wheel or pedals and the passenger cannot steer or stop the vehicle?

3. Your programmers will literally make life and death decisions as they write the vehicles' algorithms. Will Google agree to publish its software algorithms, including how the company's "artificial car intelligence" will be programmed to decide what happens in the event of a potential collision? For instance, will your robot car prioritize the safety of the occupants of the vehicle or pedestrians it encounters?

4. Will Google publish all video from the car and technical data such as radar and lidar reports associated with accidents or other anomalous situations? If not, why not?

5. Will Google publish all data in its possession that discusses, or makes projections concerning, the safety of driverless vehicles?

6. Do you expect one of your robot cars to be involved in a fatal crash? If your robot car causes the crash, how would you be held accountable?

7. How will Google prove that self-driving cars are safer than today's vehicles?

8. Will Google agree not to store, market, sell, or transfer the data gathered by the self-driving car, or utilize it for any purpose other than navigating the vehicle?

9. NHTSA's performance standards are actually designed to promote new life-saving technology. Why is Google trying to circumvent them? Will Google provide all data in its possession concerning the length of time required to comply with the current NHTSA safety process?

10. Does Google have the technology to prevent malicious hackers from seizing control of a driverless vehicle or any of its systems?
