The National Transportation Safety Board has criticized Tesla, Apple, and road-safety regulators in the report concluding its nearly two-year investigation into the fatal crash of 38-year-old Apple software engineer Walter Huang.
Huang was driving his Tesla Model X P100D SUV with the Autopilot driver-assistance system enabled when he crashed into a highway barrier on March 23, 2018, in Mountain View, California.
The board says Tesla's crash-avoidance system was "not designed to, and did not, detect the crash attenuator". As a result, Autopilot accelerated the vehicle toward the barrier, issued no crash alert, and did not activate emergency braking.
The NTSB recommends that, to be safely deployed in high-speed environments, partial driving-automation systems must be able to effectively detect potential hazards and warn drivers about them.
It also criticizes the US Department of Transportation and the National Highway Traffic Safety Administration for taking a "nonregulatory approach" to systems like Autopilot.
Apple also comes under fire for failing to implement a policy against using company-issued iPhones while driving.
"The driver was using a company-supplied phone, but his employer, Apple, did not have a policy preventing cell phone use while driving. Strong company policy, with strict consequences for using portable electronic devices while driving, is an effective strategy in helping to prevent the deadly consequences of distracted driving," writes the NTSB.
It also calls on Apple and other phone makers to create an "engineering solution to the distracted driving problem".
"Electronic device manufacturers have the capability to lock out highly distracting functions of portable electronic devices when being used by an operator while driving, and such a feature should be installed as a default setting on all devices," the board writes.
California's Department of Transportation is also partially blamed for failing to repair the damaged road-safety barrier in a timely manner. The NTSB concludes that Huang "most likely would have survived the collision" had the barrier not been damaged.
But the harshest criticism is directed at Tesla for failing to include safeguards that limit the use of Autopilot in unsuitable situations.
"If Tesla does not incorporate system safeguards that limit the use of the Autopilot system to those conditions for which it was designed, continued use of the system beyond its operational design domain is foreseeable and the risk for future crashes will remain," it writes.