Automotive industry stakeholders have touted autonomous cars as being much safer than human drivers. According to the US National Highway Traffic Safety Administration (NHTSA), 94 percent of crashes are caused by human error, so it seems obvious that adding more automation will make cars safer. However, nonprofit public interest group Consumer Watchdog warns that the rush to deploy self-driving cars is dangerous.
"We're very concerned that there is a real press to rapidly deploy these things too quickly," Consumer Watchdog Privacy Project director John Simpson tells ZDNet. "There needs to be very adequate safety precautions and procedures in place."
On Wednesday Simpson, on behalf of Consumer Watchdog, sent a letter to Uber urging the transportation company to disclose the details of its self-driving vehicle fleet, which is being tested on public roads in Pittsburgh. He wrote, "You have opted to use public roads as your laboratory and with that comes a responsibility to fully disclose exactly what you are doing both when things go right and when they inevitably go wrong."
Consumer Watchdog asked Uber to publicly report all crashes involving the test vehicles, plus any technical data and videos related to any crash. The letter also requests monthly reports similar to the ones Google publishes, though not because Google's self-driving program is perfect. In fact, Simpson says he previously sent a similar letter to Google, and he is disappointed that Google hasn't released all of the data and videos from its trials. Lastly, Simpson asked Uber to publicly release "disengagement reports," annual reports describing each time the technology disengaged and a human driver had to take over.
The letter also included a list of questions that he describes as "broader and more philosophical," in an effort to find out more about Uber's long-term vision for robotic cars. For example: "Does Uber have the technology to prevent malicious hackers from seizing control of a driverless vehicle or any of its systems?"
We spoke with Simpson about Consumer Watchdog's trepidation about self-driving vehicles. He says the main concern is that autonomous features are being overhyped and rolled out too quickly, with minimal testing and oversight, because everyone -- even NHTSA -- is too enamored with the idea of self-driving cars.
In reality, he says, the technology isn't ready yet, at least not with our current infrastructure. He explains:
"If every single car on the road was a robot car and they were communicating with each other you might have, in some ways, a safer situation than you have now. But that's not what we've done, and that's not what we'll have for a long time. Right now the average age of an automobile on the road is something like 11-and-a-half years, so people are going to be holding onto their cars for quite a while. As new vehicles come out with more and more autonomous driving features, the problem is going to be how those vehicles interact with your traditional car."
He raises valid questions, such as whether autonomous vehicles should be restricted to designated lanes. Industrial robots were fenced off from humans for decades, and they are only just now starting to become mobile and collaborative (able to safely work alongside people). So why should robotic cars be any different?
Still, Uber's self-driving cars are expected to start picking up passengers on regular Pittsburgh streets any day now. They will have human drivers who are expected to intervene if needed, but as we've seen with Tesla's autonomous features, people tend to expect self-driving cars to do everything. Simpson fears that drivers are ignoring "fine print"-style warnings that they remain ultimately responsible, and are expecting the cars to do more than they can.
Regarding Tesla's Autopilot feature, he says, "Even the name itself smacks of a promise that it can't keep." In a crash disclosed in June, a man was killed when his Model S collided with a tractor trailer that crossed its path. Neither the driver nor the car's Autopilot system noticed the truck against a brightly lit sky, so the brakes were not applied.
Simpson thinks NHTSA "fell down on the job when it came to Tesla," and let the vehicles onto the marketplace without adequate oversight. NHTSA is expected to publish an updated autonomous vehicle policy this year, and that has Simpson seriously worried. He says:
"I'm afraid that rather than remembering that their fundamental role is a regulator and protector of safety for highways, the top people of NHTSA have been kidnapped -- well, not kidnapped, but sort of taken over by the industry, and they're going to be out there promoting this whole policy. Mark Rosekind, the Administrator, and Secretary of Transportation Anthony Foxx are becoming promoters of the technology, rather than careful regulators, and I think that's a problem."
While Simpson may be right about the need to proceed with caution, that doesn't mean that self-driving cars should be abandoned altogether. They certainly have the potential to create safer roads, reduce traffic, and let's not forget the convenience factor. People with disabilities such as blindness could gain independence, and commuters could sit back and relax while a robot carries them to their destination.
As automakers continue to release new cars with more autonomous features, Consumer Watchdog suggests that drivers will need better training about the technology's limitations, and any company with a self-driving program must be completely transparent.
"Answering our ten questions will help the public fully understand what Uber is attempting to do and the risks that are involved," Simpson wrote to Uber. "Full transparency about your planned self-driving robot car testing activities is the only acceptable approach."
Consumer Watchdog has not yet received a response from Uber. ZDNet has asked Uber and NHTSA for comment and will update this story if either responds.