On March 23, Walter Huang died when his car crashed into a concrete barrier. The Apple engineer and father of two had been driving his Tesla Model X SUV along California's Highway 101 before the fatal collision.
In the moments before the collision, his vehicle had been in Tesla's semi-autonomous Autopilot mode. The crash is the latest involving a semi-autonomous or fully autonomous vehicle: Elaine Herzberg was killed by one of Uber's autonomous vehicles as she crossed a street on March 19, and Joshua Brown died after his Tesla collided with a truck in May 2016.
Deaths involving both fully and partially self-driving vehicles are inevitable, and will become more likely as more of these vehicles take to public roads. But Huang's death has raised fresh questions about the use of autonomous vehicles, and has drawn Tesla into a reported spat with the National Transportation Safety Board (NTSB), the body responsible for investigating fatal transport accidents in the US.
After Tesla talked publicly about the accident, the NTSB removed the firm from the investigation for disclosing sensitive details too early. Tesla has countered that it removed itself from the investigation.
"Tesla violated the party agreement by releasing investigative information before it was vetted and confirmed by the NTSB," the agency said in a statement. It continued to say Tesla releasing information could lead to "speculation and incorrect assumptions". NTSB chairman Robert Sumwalt said: "Uncoordinated releases of incomplete information do not further transportation safety or serve the public interest."
In a statement emailed to Bloomberg, Tesla said it withdrew from the NTSB investigation because it believes not publishing information about the investigation "fundamentally affects public safety negatively".
Crash investigations are long, complex processes, and companies whose vehicles are involved are keen to answer questions about incidents – all the more so with developing technologies in which public trust may not be high. The NTSB says its work to determine who was at fault can take between 12 and 24 months.
"In other industries, nothing is allowed to be released until the investigation is over," says Neville Stanton a professor of human factors in transportation at the University of Southampton. "That seems to me to be quite fair."
And it's not a new problem. Following Brown's death in 2016, the National Highway Traffic Safety Administration (NHTSA) – a separate agency from the NTSB – released 153 pages of correspondence between itself and Tesla to WIRED. In the detailed but largely administrative discussions, the two organisations co-ordinated the public release of information about their investigations and how they would work.
The correspondence shows how public speculation about investigations can have an impact. In one exchange, NHTSA officials asked to discuss technical issues that "have been floated in the media". In another, a member of Tesla staff said a blog post outlining the crash would be published "in the next 30 minutes" because of the "damage" done by news stories. "I trust that you will confirm the accuracy of the relevant factual statements when asked tomorrow," Tesla told NHTSA.
After the most recent incident, Tesla issued a statement saying Huang was at fault for the crash – even though Autopilot was running.
Tesla's Autopilot setting lets the car's sensors – cameras, radar and ultrasonic detectors – control the vehicle's speed and movement between lanes. The feature is far from full self-driving: if the person behind the steering wheel takes their hands off it, the car will warn them, and failure to put hands back on the controls will eventually result in the car bringing itself to a stop.
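To make that escalation concrete, here is a minimal sketch of how a hands-on-wheel warning policy of the kind described above could work. It is purely illustrative: the function names, states and time thresholds are assumptions made for the example, not Tesla's actual logic or values.

```python
# Illustrative sketch of an escalating hands-off-wheel warning policy:
# visual warnings first, then an audible alert, then a controlled stop.
# All names and thresholds below are hypothetical, not Tesla's.

from dataclasses import dataclass

@dataclass
class MonitorConfig:
    visual_after_s: float = 15.0   # assumed: first visual warning
    audible_after_s: float = 30.0  # assumed: escalate to audible alert
    stop_after_s: float = 60.0     # assumed: begin bringing the car to a stop

def escalation_state(hands_off_s: float, cfg: MonitorConfig = MonitorConfig()) -> str:
    """Map how long the driver's hands have been off the wheel to an action."""
    if hands_off_s >= cfg.stop_after_s:
        return "controlled_stop"   # car brings itself to a halt
    if hands_off_s >= cfg.audible_after_s:
        return "audible_warning"
    if hands_off_s >= cfg.visual_after_s:
        return "visual_warning"
    return "ok"

if __name__ == "__main__":
    for t in (5, 20, 40, 75):
        print(f"{t:>3}s hands off -> {escalation_state(t)}")
```

The design point the sketch captures is that the system's response escalates with sustained inattention rather than stopping the car at the first lapse – which is also why a driver can receive several warnings over a trip without the car ever halting.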
Tesla said Huang had received several visual warnings and one audible hands-on warning during his trip, and that his hands were not on the wheel for the six seconds before the collision. "The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken," Tesla said. "Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur."
Stanton says it is inevitable that humans will take their hands off the steering wheel and stop paying attention; in the fatal Uber collision, the safety driver was looking at their phone. "In any task requiring a long period of human vigilance, we find that performance degrades," he says. "People simply can't attend when there is not much happening."
As for the investigation into Huang's death, the NTSB says it "expects" Tesla to continue to provide it with data for analysis despite no longer being party to the investigation. The company is also involved in two other ongoing crash investigations. "If we're going to prevent crashes in the future, and surely that's the aim, then we need to understand what went wrong," Stanton says. "We'll only understand what went wrong if all the data is made available to the investigating agency."
This article was originally published by WIRED UK