Tesla scraps ultrasonic sensors, new models to rely on cameras for autonomy

Just watch the bumper...


Tesla has announced that it will no longer fit ultrasonic sensors to its cars, meaning that headline driver assistance features such as parking assist, automated parking and remote summoning will be unavailable on certain new models in the short term.

From later this year, the Tesla Model 3 and Model Y will be sold worldwide without the currently fitted ultrasonic sensors, with plans to drop the sensors from the Model S and Model X from 2023.

Ultrasonic sensors work like echo-location to detect nearby objects and are found on most new cars, most commonly as front and rear parking sensors. As a result, some of Tesla’s parking assistance functions, including “Smart Summon”, will be disabled in the near term on all Teslas sold without the sensors.


Smart Summon is a function that is operated through a smartphone app and allows the car to drive itself remotely to the user at low speeds over short distances, such as in a car park.

In the future, Tesla says its cars will instead rely on eight cameras (one rear-facing, a rearward-looking camera on each front wing, a sideways-facing camera on each B-pillar and a trio of forward-facing cameras behind the windscreen) for all of its driver assistance, collision warning and automated driving systems.

Once the camera technology improves, the company says it may reinstate the soon-to-be-deleted functions that currently rely on ultrasonic sensors (USS). The way it intends to improve that technology, it suggested, is by crowdsourcing data gleaned from cars currently running test versions of its misleadingly named Full Self-Driving driver assistance software.

“Along with the removal of USS we have simultaneously launched our vision-based occupancy network — currently used in Full Self-Driving (FSD) Beta — to replace the inputs generated by USS,” Tesla said in a statement.

What the company aims to do is to use the data gathered from owners who have signed up to use the beta (test) version of its FSD system in real-world scenarios to help its camera-fed software “learn” and improve.


The move follows Tesla’s decision last year to scrap radar sensors from its automated emergency braking and adaptive cruise control (Autopilot) systems and rely solely on cameras. That runs very much against the grain: for safety and redundancy, most car-makers employ an overlapping array of cameras, ultrasonic sensors, radar and sometimes laser-based lidar (light detection and ranging) in their advanced driver assistance and safety systems.

Critics of camera-based driver assistance systems say they become next to useless after dark. Other observers suggest that Tesla’s decision to drop both radar and ultrasonic sensors is a cost-cutting measure, as well as a way to reduce the number of computer chips its cars require amid the ongoing global semiconductor shortage, which shows few signs of abating.

Tesla’s move away from radar-based driver assistance and safety systems has landed it in hot water with regulators in the United States, with the National Highway Traffic Safety Administration (NHTSA) currently running an investigation into thousands of cases of “phantom braking”.

Most of the phantom braking cases appear to have occurred after Tesla scrapped radar, when the entirely camera-based forward collision warning and automated emergency braking systems threw up false warnings due to shadows and other factors they mistakenly perceived as obstacles, causing the car to brake suddenly, sometimes at motorway speeds.

Tesla’s radar-based safety systems haven’t been without their issues either, though. The most recent NHTSA inquiry is the third such investigation by the body into Tesla; another is examining the propensity of the company’s earlier, radar-equipped cars to crash into emergency vehicles when Autopilot was engaged and the driver was not paying sufficient attention to intervene.

The NHTSA had earlier launched a further investigation into the now-disabled Tesla feature that allowed occupants to play video games on the car’s infotainment screen while the car was in motion.

Under the Society of Automotive Engineers’ (SAE) framework of vehicle autonomy, Tesla’s self-driving capability currently stands at Level 2, meaning that while the driver assistance systems can take over steering, braking and acceleration under certain conditions, they require the driver’s full attention at all times, ready to intervene should the systems deactivate.

Mercedes is in the process of rolling out a feature known as Drive Pilot on its S-Class models, which represents Level 3 automation, meaning that vehicles equipped with this system can navigate around traffic, detect weather conditions and automatically merge when lanes of traffic end.

The system is limited to certain geo-fenced areas, however, and only works at speeds up to 37mph despite relying on lidar, radar, ultrasonic sensors and a significant array of cameras.


There are, at present, no Level 3 or Level 4 autonomous driving systems known to be in development that rely entirely on cameras; almost all of them, like Mercedes’ system, use a combination of cameras, ultrasonic sensors, radar and lidar. All of which prompts the question of whether Tesla’s Level 2 Full Self-Driving system will ever live up to its current misnomer.
