TWO PEOPLE have died in a collision in Texas involving a Tesla Model S equipped with the car maker’s semi-autonomous Autopilot system.
The two victims, a man aged around 70 and another in his late sixties who was reported to be his best friend, were seated in the front passenger and rear seats at the time of the crash on Saturday, according to the emergency services.
“There was no one in the driver’s seat,” confirmed Sergeant Cinthya Umanzoor of the local police force. Constable Mark Herman told The Wall Street Journal that police are “almost 99% sure … there was no one at the wheel of that vehicle.”
The car’s owner reversed the vehicle out of his driveway before climbing out of the driver’s seat and getting in the back, a relative said. A few hundred yards from the house, the car careered off the road and hit a tree, erupting into flames.
“Autopilot packages are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment” — Tesla
The car burned for four hours as authorities, who had little experience or training in dealing with electric vehicle fires, worked to extinguish the blaze. Around 32,000 gallons of water were used, and the emergency services were said to have contacted Tesla after the car's batteries repeatedly reignited; high-voltage lithium-ion batteries can reignite because of the stored energy they contain.
In a vehicle safety report released over the weekend, Tesla claimed it had registered one accident per 4.19 million miles driven with the Autopilot function engaged during the first quarter of 2021, compared to one accident per 3.45 million miles in the final quarter of 2020. Tesla cars are “engineered to be the safest cars in the world”, it said.
Elon Musk, Tesla’s outspoken CEO, took to Twitter to claim that driving a Tesla with Autopilot engaged reduces the risk of an accident nearly tenfold.
Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle https://t.co/6lGy52wVhC
— Elon Musk (@elonmusk) April 17, 2021
On Monday evening, Musk again made his case on the social network, alleging that “data logs recovered so far show Autopilot was not enabled and this car did not purchase FSD [Full Self-Driving].
“Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.”
Speaking to Reuters, Constable Herman said that Musk’s tweet regarding the data logs was the first news police had heard from Tesla representatives.
“If he is tweeting that out, if he has already pulled the data, he hasn’t told us that,” Herman told the agency. “We will eagerly wait for that data.”
Replying to Musk's tweet, a professor at Duke University's Humans and Autonomy Laboratory said that research conducted at the institution found that the Tesla Model 3 will drive without lane markings or the driver's hands on the steering wheel.
our recent road tests show that not only will a Tesla Model 3 drive for at least 30 seconds with no lane markings and no hands on the steering wheel, but that the autopilot occasionally will surreptitiously turn itself off https://t.co/n1J8976Gph
— Missy Cummings (@missy_cummings) April 19, 2021
In the UK it is a legal requirement that drivers remain in control of their vehicle at all times, with their hands on the wheel.
Tesla maintains that “Autopilot packages are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.”
Tesla steering wheels have sensors that detect when a driver's hands are not on the steering wheel, although cheap steering wheel weights are available online that can hoodwink the technology. Safety groups have called on Tesla to introduce in-car cameras and motion-detecting technology to ensure that a driver is present behind the wheel at all times.
The tragedy at the weekend is the latest in a long list of unwanted press appearances for the Autopilot technology. In March last year the US National Transportation Safety Board (NTSB) found Tesla to be partially responsible for the death of a 38-year-old man in Los Angeles in 2018, citing the "limitations of advanced driver assistance systems available to consumers today," and Tesla's ineffective monitoring of drivers.
The US National Highway Traffic Safety Administration (NHTSA) has already opened an investigation into a Model S crash that killed two people in a Los Angeles suburb on December 29 last year, although a spokesperson would not confirm whether or not Autopilot was activated.
In March, the agency said that it has opened 27 investigations into crashes involving Tesla vehicles, with at least three of those occurring recently.
Tesla has been banned from advertising its Autopilot technology in Germany after a court in Munich found that the car maker’s claims about the system were misleading and exaggerated.
Tesla did not initially respond to Driving.co.uk's request for comment regarding this weekend's fatal collision.
- After reading that two people have died in a Tesla self-driving car crash, you might be interested in reading that the car maker now accepts Bitcoin as payment.
- The car maker was named in a security camera hack in March.
- Tesla’s new U-shaped steering wheel is legal in the UK, the Department for Transport has confirmed.