Apple and Tesla under fire over software engineer's fatal Autopilot crash

NTSB criticizes Apple for lacking a phone-use driving policy and regulators for their hands-off approach to tech.
Written by Liam Tung, Contributing Writer

The National Transportation Safety Board has criticized Tesla, Apple, and road-safety regulators in the report from its nearly two-year investigation into the fatal crash of 38-year-old Apple software engineer Walter Huang.

Huang was driving his Tesla Model X P100D SUV with Autopilot driver assistance enabled when he crashed into a barrier on March 23, 2018, in Mountain View, California. 

The NTSB found that Huang's vehicle struck a damaged safety barrier on US Highway 101 at 71 mph after the vehicle's safety systems failed to detect the barrier and instead accelerated toward it. 

It said Huang made no attempt to avoid slamming into the barrier, "most likely due to distraction" caused by a game he was playing on his iPhone. 


The board says Tesla's crash-avoidance system was "not designed to, and did not, detect the crash attenuator". Because of this, Autopilot accelerated the vehicle, and the vehicle failed to provide a crash alert and didn't activate emergency braking. 

The NTSB recommends that partial driving-automation systems be deployed in high-speed environments only if they can effectively detect potential hazards and warn drivers of them. 

It also criticizes the US Department of Transportation and the National Highway Traffic Safety Administration for taking a "nonregulatory approach" to systems like Autopilot. 

Apple too comes under fire for failing to implement a policy for driving while using a company-issued iPhone. 

"The driver was using a company-supplied phone, but his employer, Apple, did not have a policy preventing cell phone use while driving. Strong company policy, with strict consequences for using portable electronic devices while driving, is an effective strategy in helping to prevent the deadly consequences of distracted driving," writes the NTSB. 

It also calls on Apple and other phone makers to create an "engineering solution to the distracted driving problem". 

"Electronic device manufacturers have the capability to lock out highly distracting functions of portable electronic devices when being used by an operator while driving, and such a feature should be installed as a default setting on all devices," the board writes. 

California's Department of Transportation is also partially blamed for failing to repair the damaged road safety barrier in a timely manner. The NTSB concludes that Huang "most likely would have survived the collision" had the barrier not been damaged. 


But the harshest criticism is directed at Tesla for failing to include safeguards that limit the use of Autopilot in unsuitable situations. 

"If Tesla does not incorporate system safeguards that limit the use of the Autopilot system to those conditions for which it was designed, continued use of the system beyond its operational design domain is foreseeable and the risk for future crashes will remain," it writes. 

At a briefing yesterday, NTSB chairman Robert Sumwalt called out Tesla for failing to respond to the board's 2017 recommendations that automakers design driver-assist systems to prevent misuse. He said Volkswagen, Nissan, and BMW reported their efforts to meet the recommendations, but Tesla never did, according to the LA Times. 

"Tesla ignored us," said Sumwalt. "We ask recommendation recipients to respond to us within 90 days. It's been 881 days since these recommendations were sent to Tesla. We're still waiting."

ZDNet has contacted Apple and Tesla for their comments and will update this article if they respond.


The Tesla crashed into a road barrier dividing the main highway and an exit ramp on US Highway 101 in Mountain View.  

Image: NTSB
