
The obstacles to autonomous vehicles: Liability, societal acceptance, and disaster stories

Self-driving vehicles have made many strides, but certain barriers are preventing widespread adoption.
Written by Conner Forrest, Contributor

If you were to talk to a technology optimist about autonomous vehicles and self-driving cars, you'd probably hear about how revolutionary the technology is and how much it will disrupt markets around the world. And it's true: autonomous vehicles will revolutionize trucking and personal transportation, among other things. But it's not going to happen tomorrow.

It will take years, maybe decades, for driverless vehicles to reach widespread adoption. The technology behind these innovations is getting closer to achieving full autonomy every day, but the industry still faces obstacles in liability and societal acceptance, as well as negative consumer response to disaster stories.

The technology is robust and getting better every day, thanks to developments like sensor fusion, said Christian Renaud, research director at 451 Research, but that same technology will play into liability issues.


"What's still an area of development is the decision-making processes that go into the corner cases," Renaud said.

The liability issue

Some 90-95 percent of driving happens along the same route you commute every day, with the same stop lights and traffic patterns, according to Renaud. Those are "relatively trivial to optimize for," he said, and they are "solved problems right now" for autonomous vehicles. But teaching the artificial intelligence (AI) systems in the car to account for a deer running in front of the car, or for someone intentionally trying to cause a wreck for insurance fraud, is a completely different problem.

Autonomous vehicles will coexist with traditional vehicles for the next 50 years or so, Renaud said, so the AI has to account for all these other models as well, including older vehicles with their own mechanical issues.

The good thing, according to Forrester research vice president Laura Koetzle, is that autonomous systems have been in development for a long time -- John Deere, for example, has been selling autonomously steering tractors since 2002. This means that government regulators and insurance companies have had a while to think about how liability works.

Usually, technology is way ahead of regulations and liability planning, Koetzle said, but in the case of autonomous vehicles, the opposite is true.

"[The] US DOT has an autonomous vehicles policy. You can go read it, it's published," Koetzle said. The UK Modern Transport bill also has explicit planning -- including an insurance framework -- for autonomous cars, Koetzle said.

However, just because regulatory bodies are considering the technology does not mean that the question of liability is settled. And, as Renaud noted, individuals still worry about being held liable for something the machine did while they weren't controlling it. Gartner research director Michael Ramsey, though, doesn't think that will be a problem.

"When it comes down to it, automakers are ultimately going to have to be responsible for a failure of the machine," Ramsey said, noting that both Audi and Volvo have expressed a similar sentiment. The bigger question, according to Ramsey, is how vehicle fault will actually be determined.

"The only threshold that exists in reality is the automakers' fear of being sued out of existence," Ramsey said.

As such, Ramsey said manufacturers are self-limiting the technology's introduction based on their own belief that it is safe. However, he also acknowledged that there is no definitive way to guarantee an automaker has performed all the necessary safety testing. Ramsey also said that regulators must build structures that assign blame for given situations.

There have been efforts to implement safety standards on the federal level, but they haven't been passed as legislation yet, Ramsey said. This sector is difficult to regulate because autonomous vehicles involve regulatory aspects that are typically handled separately at the state and federal level.

"The federal government regulates the equipment, and the state and local governments regulate the behavior -- meaning the driver," Ramsey said. "And in this case, they're the same thing."

Once the technology gets out into the wild, regulators and insurers will be forced to adapt to it, Ramsey said. He believes regulation will then start to be built on a local or state basis, based on what happens.

Insurance shifts

For human drivers, the biggest concern is how insurance will shift. Koetzle said insurers have told her the future will look much like the current process: If your car is in an accident, the insurance company will pay the claim so you can get your car fixed. On the back end, it will figure out whether the cause was software, hardware, or something else.

"In the short- to medium-term, you're not going to see any reduction in premium for the insurance customer for autonomous driving," Koetzle said, "because the cost of replacing components on a vehicle that drives autonomously is much higher than on one that doesn't."

But, as the cost of building autonomous vehicles comes down, eventually there will be "meaningful declines" in insurance costs, Koetzle said. Additionally, Koetzle said she believes that usage-based insurance will become one of the primary products offered by insurers.

Societal acceptance

Despite the potential cost-savings, today "there's obviously still a bunch of skepticism," Renaud said. Part of this has to do with a lack of education around the technology.

According to Gartner research, 55 percent of people polled in the US and Germany said they wouldn't even consider riding in a fully autonomous vehicle, but 71 percent said they would consider riding in a partially autonomous vehicle. The biggest concerns center on security and technology failure.

General societal acceptance is a "big concern," Ramsey said, but it's also the easiest barrier to address -- and it doesn't have to be tackled immediately due to the predicted slow adoption of these vehicles. During the technology's initial phase, only early adopters will use it, but it will eventually make its way to other sectors of society.

Over time, as cars evolve from level 2 to levels 3, 4, and 5 of autonomy, individuals will gradually come to accept driverless vehicles as a reality without any shock or awe, Renaud said. One day, he said, an Uber will show up at your house without a driver, but it won't come as a surprise, because the last few times the driver was just reading a book anyway, and was only there in case something went wrong.

In the future, heavily congested cities might even designate their downtown areas as autonomous-only zones to help with traffic and safety, Renaud predicted. That would encourage other cities to follow suit, which would also help with adoption.

Disaster stories

What remains, though, is the elephant in the room: the disaster stories associated with autonomous vehicles, which greatly shape public perception of the technology.

"Big public failures will play a role in public acceptance or non-acceptance of autonomous transportation," Koetzle said.

One of the most-cited disasters was a fatal accident that occurred when a man driving his Tesla Model S with Autopilot engaged struck a tractor-trailer and was decapitated. Despite the fact that the system warned the driver seven times to put his hands on the wheel before the accident, and that Tesla was eventually cleared by the NHTSA, many have placed the blame on Tesla, considering the wreck a failure of the technology.

Other accidents, like a self-driving bus experiencing an accident on its first day out, another Tesla crashing into a firetruck, a Chevy Bolt hitting a motorbike, and various small incidents here and there are held up by autonomous vehicle opponents as evidence the technology is not ready to coexist with society. However, when asked what disasters were actually the fault of the car, Renaud said, "I can't think of any, honestly."

Statistically, Renaud said, autonomous vehicles are much safer drivers than their human counterparts. What people should really worry about, Koetzle said, are cybersecurity issues.

In 2015, researchers hacked a Jeep and were able to disable the car's brakes. If these kinds of hacks were to persist, even on a small scale, it would "cause chaos," Koetzle said, particularly if a vulnerability were discovered that affected a large group of autonomous vehicles -- similar to Spectre and Meltdown in PCs. The answer, she said, is for automakers to follow the aviation industry, which physically separates entertainment systems from avionics. It's more expensive and harder to do, but it is one step toward keeping autonomous vehicles safe.
