This article was originally published on TechRepublic.
In March 2015, an Audi SQ5 began its nine-day journey from the Golden Gate Bridge to midtown Manhattan. The 3,400-mile drive involved the normal obstacles: construction, highways, city driving, lane changes.
What wasn't involved? Hands on the wheel.
Until recently, the concept of a driverless car seemed like the stuff of science fiction. But much has changed in just a few short years as the technology behind autonomous vehicles has taken huge strides. The cross-country Audi, powered by technology from Delphi, drove in autonomous mode 99 percent of the time.
The US federal government has also begun to embrace autonomous vehicles as a coming reality. In February 2016, the US Department of Transportation (US DOT) announced that it considers the AI powering Google's driverless cars (which have already logged hundreds of thousands of self-driven miles) officially a "driver"--marking a groundbreaking moment in the history of transportation. And in September 2016, the DOT unveiled guidelines for the development of autonomous vehicles, calling it "the most comprehensive national, automated vehicle policy that the world has ever seen."
The adoption of driverless vehicles has also come under scrutiny--most notably after the fatal accident that happened in May 2016 when Joshua Brown, operating his Tesla Model S in Autopilot mode, crashed into an 18-wheel tractor trailer. The incident highlighted the complex technological and social hurdles still facing the widespread adoption of driverless vehicles. Since then, Tesla has also released an update to Autopilot, which Elon Musk said would have likely prevented the accident.
Despite the debate, one fact is undisputed: Autonomous vehicles are here, and many more are coming.
In Singapore and Pittsburgh, Pennsylvania, self-driving taxis are now shuttling passengers on public roads. By 2020, more than a dozen companies plan to unveil cars with some level of autonomous driving.
SEE: Autonomous driving levels 0 to 5: Understanding the differences
Ford announced its intention to mass-produce fully-autonomous, level 5 vehicles by 2021--that's no steering wheel, no brakes, no human required. According to a report from BI Intelligence, there will be 10 million self-driving cars on the road by then.
So why are the automakers and the government getting on board now?
Autonomous driving has a long history--the first self-driving car, controlled via a radio antenna, was demonstrated almost a century ago, back in 1925. And some of the technology in use today has been around for decades. In 1989, Carnegie Mellon was already using neural networks to control its autonomous vehicles, and by the 1990s it had a vehicle that could drive cross-country.
However, a variety of factors have combined in recent years to make driverless vehicles ready for wide adoption. Gill Pratt, head of the Toyota Research Institute--itself a $1 billion investment in AI--points to several reasons why.
"The technology has finally become mature," said Jeffrey Miller, IEEE member and associate professor of engineering at the University of Southern California. Although vehicles have been equipped with intelligent features for the past couple of decades, such as pop-up headlights, automatic seatbelts, adaptive cruise control, lane correction, and more, the technology is now becoming "better, more reliable, and more cost-effective."
But even with these great leaps forward, fully-autonomous cars are several years off. There are still software and mechanical hurdles to overcome, and different companies are taking different approaches to testing out new technology.
For example, Google is still updating software on a near-daily basis, said Miller. And some companies are still negotiating the balance (and tradeoff) of power that happens when the driver needs to regain control of the car.
Jim McBride, autonomous vehicles expert at Ford, said that his company will be skipping Level 3, which involves transferring control from car to human, to get straight to Level 4, with no driver required. "We're not going to ask the driver to instantaneously intervene," said McBride. "That's not a fair proposition."
Bryant Walker Smith, professor at the University of South Carolina, also sees a major problem with Level 3. "If there's a hazard on the road and if the driver doesn't re-engage, the system doesn't have to do anything about it," he said. "That's a very scary design prospect."
Environmental factors are a concern as well. Navigating tricky situations at stop signs, reacting when unknown variables like animals run into the road, and driving safely in bad weather can all be problematic. Most systems, said Smith, have not yet been tested in snow.
Despite these barriers, Tesla has taken a unique approach when it comes to integrating new technology. A Tesla representative told TechRepublic that the company is "disruptive, in the original sense of the word."
"We have decided not to let 'perfect' be the enemy of the 'better,'" the Tesla rep said. "Instead, we will release 'better' as soon as it is available."
In addition to technical and safety questions, federal and state regulations are also still being worked out. But the federal government is not sitting idly by. In January 2016, the Obama administration announced a plan to invest nearly $4 billion into research for autonomous driving. And in a groundbreaking announcement the following month, the National Highway Traffic Safety Administration (NHTSA) allowed the AI system controlling Google's self-driving car to be considered a driver, in response to the company's November 2015 proposal to the agency.
In September 2016, the DOT announced the first-ever guidelines for autonomous driving.
Some states currently have specific laws that would ban autonomous driving--the state of New York, for instance, does not allow any hands-free driving. Without clear regulations, testing self-driving cars is a challenge. And although a few states--California, Nevada, Florida, and Michigan--currently allow autonomous vehicles on the road, California is the only one with licensing regulations at this point.
Miller told TechRepublic he thinks it's because of the pressure Silicon Valley lobbyists put on the California legislature. "They're lobbying the legislature, saying 'hey, you've gotta do something,'" said Miller. "And California doesn't want Google to go to another state to get help."
But without specific regulations, many questions remain unanswered. Autonomous driving is poised to shake up the entire car insurance industry, for example, and there are still many unanswered questions when it comes to liability. What will licensing look like? Will new drivers still be required to get traditional licenses, even if they aren't behind the wheel? What about young people, or older people with disabilities? What will be required to operate these new vehicles?
Governments need to work quickly to catch up with tech. Auto regulations are some of the strictest--and obviously for good reason, since public safety is on the line.
So where will we first start seeing driverless cars? Again, Singapore now has self-driving taxis in a limited area of the city, and Uber has unveiled its own driverless fleet in Pittsburgh. But these cars are still operated by safety engineers behind the wheel, taking notes. Ford's CEO has said that its own driverless cars will likely start out in ride-sharing fleets. And Tesla's updated Masterplan also highlights its intention to weave shared vehicles into the mix.
But beyond this, the answer is still unclear. Many predict that the trucking industry will be massively disrupted by driverless technology. Miller predicts that in the next five years, the first vehicles will be high-end luxury models bought by consumers.
And who we see owning them will depend, in a large part, on where we're looking.
"It's a cultural/societal issue whether it's accepted," Miller said. "There are countries in Europe where public transportation is the primary means of getting around--so car sharing will be popular." But in places like LA? "Car ownership is a big part of what's ingrained in society... it could take more time."
Wondering who's at the front of the race right now? While companies like Tesla and Google have gotten a lot of press lately, it's still too early to tell. "This is still an emerging technology," said Thilo Koslowski, former vice president and automotive practice leader at Gartner. Miller agrees: "The big players right now may not be the big players in a few years."
Ford says it will release self-driving cars in 2021. And Tesla says its level 5, fully-autonomous cars will be ready in 2018--and in October, announced that every Tesla now in production will have the hardware needed for full self-driving.
Beyond the question of who will be first, Koslowski isn't sure that kind of comparison makes sense, anyway.
Why? Once the technology is perfected, companies like Google will be looking for partners. Also, "anybody who's working on this helps the entire category," Koslowski said. Consumers aren't necessarily looking to buy the first model to hit the market, either. Economist Adam Grant makes a good argument for why it can be smart to wait until after the first version is out.
"There's a learning curve and a trust factor that needs to be considered," said Koslowski.
At the heart of the driverless car question is consumer acceptance. While the technology factor is necessary to get these vehicles on the road, how to make these vehicles work is just one part of the equation. The second, Koslowski told TechRepublic, is "rethinking the entire car culture we've built over the past century."
The experience of driving is about to radically change in a world of self-driving cars. And the issue of trust will be central to the adoption of the new technology. According to Koslowski, the societal aspect is the real hurdle. Miller agrees. "The core challenge," he said, "is determining whether the relevant technologies have reached a demonstrated level of socially acceptable risk."
So what is a socially-acceptable level of risk? According to Koslowski, Smith, and others, it depends on a public understanding that these cars will not be flawless. But that's not an easy thing for many people to get their heads around.
"This is the first robot that we will experience on a day-to-day basis," he said. "Loss of control and trust is what a lot of people have problems with. All of a sudden we're delegating a task that consumers have been doing for a century to a robot."
While working as Senior Advisor for Innovation to Secretary of State Hillary Clinton, Alec Ross, author of The Industries of the Future, traveled to 41 countries. He observed a deeply ingrained distrust of robots, particularly in Western society. According to Ross, the concept of robots carries "cultural baggage" that may not be easy to shake.
"How quickly can we get consumers to feel comfortable and want these technologies?" asked Koslowski. The answer depends on how realistic we are, he said.
Although safety is predicted to improve significantly--one forecast projects an 80% reduction in accidents by 2040--the idea of "safety" may need to be redefined, with responsibility shifting from driver to software. According to Smith, new areas of concern could include technical failures related to bugs, as well as security vulnerabilities--cybersecurity will become a major concern when these cars hit the road.
Still, the danger posed by cybersecurity attacks "pales in comparison to the carnage on the road today," said Smith. "I'm concerned about the vulnerability of vehicles, but I'm terrified about today's drivers in today's vehicles."
There are about 32,000 fatalities per year due to auto accidents in the United States (the average number peaked at around 55,000 per year in the early 1970s), according to data from the NHTSA. Over 90% of those accidents are caused by human error, as reported in multiple studies.
But even with the assumption that safety will improve, the combination of technical uncertainties and cultural baggage in the Western world makes it likely that we will have a low tolerance for error when it comes to safety in autonomous vehicles. Many people may find it more forgivable for a human to make a mistake on the road than for the computer that's in charge.
Smith predicted that "the first crash, the first injuries and fatalities, will be a big deal." A Tesla spokesperson echoed this point. "Any given regulator will be crucified for an issue with a feature or technology they approved that had a bug." The predictions were proven correct after influential publications like the New York Times portrayed the Tesla Autopilot crash in a critical way without taking the whole context into consideration. In that case, a major hurdle was understanding the difference between Tesla's Autopilot feature--which is more of an advanced cruise control--and a fully-autonomous car, which doesn't require the attention of the driver.
So perhaps the greatest challenge will be getting society on board--literally.
Driving cars is deeply ingrained in American culture. There's a romanticism about getting on the open road for the first time. "Most of us grew up thinking of freedom as a steering wheel and a learner's permit," said journalist Pagan Kennedy. "I vividly remember the first time I stomped on a gas pedal and zoomed out onto a highway; the car pounced with such force that I was thrown back in the seat. That sensation is still burned into my mind."
Letting go of the wheel will undoubtedly require a culture shift--letting go of the nostalgia. And when it comes to entrusting their safety to a self-driving vehicle, people have varying levels of comfort. When asked how they would feel about riding in a driverless vehicle, some respondents, like Anna Munoz, executive assistant at CBS Interactive in San Francisco, gave a flat-out "No." Ms. Munoz, presumably like many others, trusts herself over a self-driving car. "Nothing would get me to try it," she said.
Others are open to it--just not quite yet. Joe Jones, co-founder of iRobot, is willing to let a car take over if something goes wrong, yet still feels that "self-driving cars haven't yet achieved the level of reliability where I'd be comfortable without a fail-safe mechanism." And Helen Greiner, CEO of CyPhy Works, a drone company, also has mixed feelings. "I'd be comfortable driving if I had the ability to grab the wheel--in a few years," said Greiner. "If I'm going to be in the backseat and the car's fully driving, I think it's going to take me a little bit longer."
Still, some are perfectly comfortable ceding control. Roman Yampolskiy, director of Cyber Security Laboratory at the University of Louisville, puts it this way: "I had the 'privilege' of being driven by teenagers and seniors, and would pay good money to have an AI drive me instead."
Joanne Pransky, the self-dubbed "robot psychiatrist," echoed the sentiment: "What could be more life-threatening than sitting as a passenger in a car being driven by a distracted teen in training? Self-driving cars can't come soon enough!"
And despite her romantic memories of the open road, Kennedy would also "let a robot take control."
"I am a terrible driver," she said. "Just ask my boyfriend."
But whether fully-autonomous driving arrives in two years or five, there is no doubt that it is close--and no doubt that it will radically alter the way we live.
We could use smartphones as an analogy, said Smith. "If you went back 15 years and asked people if they wanted smartphones, they would say, 'No. What? Why would I spend money on that?'" he said. Many people were first skeptical of smartphones--they were difficult to use, expensive, intrusive, and some worried about the harms of radiation. But then, "suddenly, they were popular, convenient, a business necessity--[and] people adopted them," Smith said.
"When someone is losing their sight or no longer able to drive, when a fifteen-year-old wants to get to his friend's house easier, when people reliant on mass transit realize they can get someplace much quicker," said Smith, "they will embrace these [autonomous vehicles]."
Speaking at a symposium hosted in July 2016 by the Association for Unmanned Vehicle Systems International (AUVSI), Smith said that we may be thinking about the issue of driverless vehicles all wrong.
"We should be terrified," he said, "of conventional driving."