This week federal regulators released a highly anticipated policy for autonomous vehicles. The National Highway Traffic Safety Administration (NHTSA) published a 116-page document that outlines how automakers should test and deploy self-driving cars. Safety advocates have been pushing for stronger federal guidance on vehicle automation. Meanwhile, robotic cars are already loose on American streets.
The policy is split into four parts: a 15-point safety assessment; a distinction between federal and state responsibilities; current regulatory tools; and new tools and authorities. NHTSA didn't disclose many specific details, so it's still too soon to say whether the policy strikes the right balance between ensuring safety and leaving enough freedom for innovation.
"I think it's a good first step, and a move in the right direction," says Mary "Missy" Cummings, Director of the Humans and Autonomy Lab at Duke University. In a phone interview, she tells ZDNet, "But it's still not clear how all of these things are going to be manifested."
In March, Cummings testified at a congressional hearing about the future of self-driving cars. She expressed concern about the rush to deploy autonomous systems that aren't ready for public roads. Now, she's pleased that NHTSA has released a preliminary policy, but she still has several concerns.
For example, she says, the policy states that semi-autonomous driving systems (e.g., Tesla's Autopilot) that fail to account for distracted drivers will be subject to recall, but the details are still very open-ended. Cummings is also concerned that the policy doesn't specifically address how manufacturers should handle cybersecurity or operational challenges, such as sensors that don't perform well in extreme weather conditions.
Just two weeks ago, Consumer Watchdog Privacy Project director John Simpson echoed Cummings' concerns about the rush to deploy self-driving cars. He told ZDNet, "Mark Rosekind, the Administrator, and Secretary of Transportation Anthony Foxx are becoming promoters of the technology, rather than careful regulators, and I think that's a problem."
Now Simpson seems relieved that the policy is stricter than expected. "This isn't the checkered flag to industry to irresponsibly develop robot cars that we had feared," he wrote in a press announcement. "It's not a secret, cozy process with the manufacturers, but includes a real commitment to transparency and public involvement. The administration clearly heard the concerns raised by safety advocates and has addressed many of them."
The new guidelines should ensure that self-driving vehicles undergo rigorous testing before they are deemed safe. However, no matter how reliable the technology proves in a test lab, the real unknown factor (and potential danger) is humans. Although Tesla warned customers that its Autopilot feature was only meant to "assist," drivers still hopped in the backseat and let the car drive itself down the highway.
Cummings says she personally knows many people in the human factors division at NHTSA, and she knows they are fully aware of the limitations of how humans actually interact with automated systems. "But the problem isn't NHTSA, it's the people at Uber and Tesla and GM," she explains. "Are they really fully on board with just how fragile the limitations of human attention are?"
Despite these risks, the federal government is eager to move forward with self-driving technology, which could eventually make our roads much safer and introduce a new era of mobility. In an Op-Ed on Monday, President Barack Obama wrote, "Regulation can go too far. Government sometimes gets it wrong when it comes to rapidly changing technologies. That's why this new policy is flexible and designed to evolve with new advances."