
Tesla partially to blame for fatal 2018 crash, says US National Transportation Safety Board

But the driver of the car was found to have been playing a video game at the time of the crash


The US National Transportation Safety Board (NTSB) has concluded a two-year investigation into a 2018 Tesla Model X crash that killed the driver, citing the “Autopilot system’s limitations” as one of the reasons for the tragedy.

The driver of the Tesla, Walter Huang, died on March 23, 2018, after his electric SUV careered into a crash barrier at 71mph in Mountain View, California. The car’s semi-autonomous Autopilot function was engaged at the time.

After the initial impact the SUV was struck by two other cars and caught fire, though the driver had been removed from the vehicle before the blaze took hold. Huang, 38, was taken to hospital, where he died of his injuries.

As well as highlighting the limitations of Tesla’s Autopilot function, the NTSB found the American company at fault for ineffectively monitoring the driver’s engagement.

The Autopilot function is designed to be used only when the driver is prepared to take manual control at all times. Huang’s Model X was equipped with a steering wheel that can sense when the driver’s hands are on it.

“Tesla needs to develop applications that more effectively sense the driver’s level of engagement and that alert drivers who are not engaged,” said the report.
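
To make the NTSB’s point concrete, the sketch below shows, in Python, what an escalating hands-off warning loop of this general kind might look like. It is a hedged illustration only: the class name, thresholds and timings are assumptions invented for the example, and bear no relation to Tesla’s real software.

    # A minimal, illustrative sketch of escalating hands-on warning logic.
    # Every name, threshold and timing here is an assumption made for the
    # example; this is not Tesla's actual implementation.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class EngagementMonitor:
        visual_warning_after: float = 15.0   # seconds hands-off before a visual alert
        audible_warning_after: float = 30.0  # seconds hands-off before an audible alert
        disengage_after: float = 45.0        # seconds hands-off before slowing the car
        _hands_off_since: Optional[float] = field(default=None, repr=False)

        def update(self, hands_on_wheel: bool, now: float) -> str:
            """Return the action the system should take on this tick."""
            if hands_on_wheel:
                self._hands_off_since = None  # driver engaged: reset the timer
                return "ok"
            if self._hands_off_since is None:
                self._hands_off_since = now   # start timing the hands-off period
            elapsed = now - self._hands_off_since
            if elapsed >= self.disengage_after:
                return "disengage_and_slow"
            if elapsed >= self.audible_warning_after:
                return "audible_warning"
            if elapsed >= self.visual_warning_after:
                return "visual_warning"
            return "ok"

    # Simulate a driver keeping their hands off the wheel for 45 seconds.
    monitor = EngagementMonitor()
    for t in range(0, 50, 5):
        print(t, monitor.update(hands_on_wheel=False, now=float(t)))

In this sketch the alerts escalate from visual to audible before the system hands back control, loosely mirroring the “several visual and one audible” warning sequence that Tesla later described in this case.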

Robert Sumwalt, chairman of the NTSB, stressed that consumers need to understand the difference between driver assistance and full autonomy.

“This tragic crash clearly demonstrates the limitations of advanced driver assistance systems available to consumers today,” he said. “If you are selling a car with an advanced driver assistance system, you’re not selling a self-driving car. If you are driving a car with an advanced driver assistance system, you don’t own a self-driving car.”

The agency’s criticism, however, was not aimed solely at Tesla. The National Highway Traffic Safety Administration (NHTSA) was condemned for failing to fully assess the risk of increasingly popular assisted driving systems. The NTSB recommended a further evaluation of Autopilot’s efficacy, “to ensure the deployed technology does not pose an unreasonable safety risk.”

Tesla is not the only Silicon Valley behemoth to come under fire in the report. Apple, which employed the victim of the crash, was faulted for not operating a system that disables employees’ phones while they are driving.

This came after it was revealed that Huang had been playing a video game on his company-provided phone in the minutes leading up to the crash, leaving him unaware that the car was careering towards the crash barrier.

The evidence suggests that Huang was over-reliant on the Autopilot system, expecting it to act more like a pilot than an assistant, despite Tesla’s warnings about the limitations of the current technology.

The state of California has also had to shoulder its share of the blame. The crash barrier that the Tesla struck had been damaged in a collision 11 days earlier and therefore did not perform as designed. The highway operator was criticised for its slowness in repairing the barrier; had it been restored to working condition, it might have mitigated the severity of the crash.

In the spirit of its chief executive, Elon Musk, Tesla has not been silent about the incident. Four days after the accident it released a statement, which read:

“We were deeply saddened to learn that the driver of a Model X vehicle involved in an accident last Friday passed away. Safety is at the core of everything we do and every decision we make, so the loss of a life in an accident involving a Tesla vehicle is difficult for all of us. Earlier this week, Tesla proactively reached out to the authorities to offer our assistance in investigating.”

It also asserted that Tesla cars had successfully driven the same stretch of road roughly 85,000 times prior to the incident, and that Autopilot was engaged on that stretch of highway more than 200 times per day. It went on to attribute the severity of the incident to the fact that the crash barrier the Tesla collided with had been previously damaged and not repaired. It finished its post by saying:

“It is worth noting that an independent review completed by the U.S. Government over a year ago found that Autopilot reduces crash rates by 40%. Since then, Autopilot has improved further. That does not mean that it perfectly prevents all accidents — such a standard would be impossible — it simply makes them less likely to occur.”

In a further statement three days later, Tesla confirmed that the Autopilot function had been active, and revealed that the driver of the vehicle had ignored “several visual and one audible hands-on warning” before the crash.

It then reaffirmed the safety of the Autopilot software, saying that “you are 3.7 times less likely to be involved in a fatal accident” while it is engaged. Pre-empting a backlash against what some might perceive as a callous recitation of safety statistics, it finished:

“We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety.”