Tesla

Tesla Autopilot system under investigation by US safety organisation

Eleven separate incidents under scrutiny


THE NATIONAL Highway Traffic Safety Administration in the US has launched an investigation into eleven serious incidents involving Tesla’s so-called ‘Autopilot’ system.

The crashes, which occurred between January 2018 and July 2021, all involved Teslas failing to avoid emergency vehicles when coming upon the scene of other road incidents. That’s despite highly visible emergency vehicles, flares and illuminated arrow boards to warn drivers of the hazard ahead.

All of the cars in question, Tesla Models S, X, 3 and Y from 2014 to 2021, were using either Tesla’s Autopilot system or other driver-assistance features.

During the eleven incidents, seventeen injuries and one death were recorded. Separately, several other deaths involving the Autopilot system have also been recorded.

The NHTSA said that its investigation aims to “better understand the causes of certain Tesla crashes,” including “the technologies and methods used to monitor, assist and enforce the driver’s engagement with driving while Autopilot is in use.” It will also look into any other contributing factors.

Whilst Tesla does warn that “current Autopilot features require active vehicle supervision and do not make the vehicle autonomous,” at least one fatal incident – not included in the current NHTSA investigation – is known to have happened with nobody sitting in the driver’s seat at the time.

Tesla has also said that, despite the incidents, its data shows that cars using Autopilot have fewer accidents per mile than cars being driven without its driver assistance technology.

The problem, according to some analysts, involves drivers assuming that their Tesla’s driver-assistance tech is more capable than it really is and failing to pay attention to the road ahead.

In addition to the risk of injury or death to the Tesla occupants, the combination of that lack of attention and Autopilot’s potential failure to recognise obstacles poses a serious danger to other road users, who most likely have no wish to be treated like guinea pigs in the name of developing driver-assistance technology.

According to the NHTSA: “No commercially available motor vehicles today are capable of driving themselves. Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles. Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”

Tesla did not immediately respond to news of the probe, but the company’s stock fell by 5% following the announcement.