
What Tesla’s 2 Million Vehicle Recall Means For Road Safety

And what the largest recall in Tesla history says about Autopilot software.

by Kristi Pahr

In mid-December, Tesla announced the largest recall in its history, covering nearly every vehicle the company has sold in the past 10 years, to address concerns with the Autopilot system, which Tesla CEO Elon Musk said in 2016 allowed Teslas to “drive autonomously with greater safety than a person. Right now.”

Tesla uses “Autopilot” as a catch-all term for its entire suite of semi-autonomous driving features, including Traffic-Aware Cruise Control, Autosteer, Auto Lane Change, Autopark, and Summon, along with other programs still in beta testing or not yet released.

After an extensive two-year investigation into crashes that happened while Autopilot was in use, the National Highway Traffic Safety Administration (NHTSA) found ample reason to push Tesla to issue a recall addressing problems with Autosteer, the software that lets the car steer itself semi-autonomously while the driver supervises.

“In certain circumstances when Autosteer is engaged, the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse of the SAE Level 2 advanced driver-assistance feature,” said Tesla in a statement announcing the recall. “At no cost to customers, affected vehicles will receive an over-the-air (OTA) software remedy. The remedy will incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous supervisory responsibility whenever Autosteer is engaged.”

In short, the NHTSA believes that Autopilot does not adequately ensure that drivers are paying attention to the road, and the update purports to fix that problem by giving drivers additional warnings when they may be misusing Autosteer/Autopilot.
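Tesla has not published the internal details of those new controls, but driver-monitoring logic in SAE Level 2 systems generally follows an escalation pattern: if the system detects no sign of engagement, such as torque on the steering wheel, within a timeout, it issues increasingly urgent alerts and ultimately disengages. The sketch below is a purely hypothetical illustration of that pattern; every name, signal, and threshold in it is an assumption made for illustration, not Tesla’s implementation.

# Hypothetical sketch of a Level 2 driver-engagement escalation loop.
# All names, signals, and thresholds are illustrative assumptions,
# not Tesla's actual code or calibration.
from dataclasses import dataclass

@dataclass
class DriverState:
    seconds_since_wheel_torque: float  # time since hands-on-wheel was last detected
    eyes_on_road: bool                 # e.g., inferred from a cabin camera

def engagement_action(state: DriverState) -> str:
    """Map the driver's apparent engagement to an escalating response."""
    if state.eyes_on_road and state.seconds_since_wheel_torque < 10:
        return "none"                  # driver appears engaged
    if state.seconds_since_wheel_torque < 20:
        return "visual_alert"          # dashboard reminder
    if state.seconds_since_wheel_torque < 30:
        return "audible_alert"         # escalating chime
    return "disengage_and_slow"        # return control, slow the vehicle

if __name__ == "__main__":
    for t in (5.0, 15.0, 25.0, 40.0):
        print(t, engagement_action(DriverState(t, eyes_on_road=False)))

The point of the escalation is the one the NHTSA raised: a single passive warning is easy to ignore, so the alerts must become harder to ignore the longer the driver stays disengaged.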

Concerns with Autopilot, explained

Rolling Stone reporting suggests that Tesla and Musk likely oversold Autopilot’s abilities, leading consumers to believe they could be literally or metaphorically asleep at the wheel on any road. The owner’s manuals for Tesla vehicles state that Autopilot and Autosteer should only be used on controlled-access highways where traffic flows in only one direction, such as interstates. Critics say the manufacturer did not do enough to ensure drivers did not misuse the system.

Critics have also expressed concern that the update does not do enough to stop the same problems from recurring. An earlier investigation by The Washington Post found that at least eight fatal accidents occurred on roads where Autopilot does not operate reliably, while others happened when drivers were not prepared to intervene after the system made a mistake.

The fix will arrive as an over-the-air software update. According to Tesla’s statement, the added controls and alerts are meant to hold drivers to “their continuous driving responsibility whenever Autosteer is engaged, which includes keeping their hands on the steering wheel and paying attention to the roadway.”

Does the Tesla recall do enough?

In a CleanTechnica article, Dan O’Dowd, a frequent critic of Tesla and founder of the safety advocacy group the Dawn Project, said: “The correct solution is to ban Tesla’s defective software, not to force people to watch it more closely. NHTSA’s recall misses the point that Tesla must address and fix the underlying safety defects in its self-driving software to prevent further deaths.”

According to Tesladeaths.com, a watchdog site that compiles data on all reported Tesla accidents resulting in the death of a driver, vehicle occupant, pedestrian, cyclist, or other motorist, the Autopilot function has allegedly been engaged in accidents that have caused 42 deaths since 2013.

Of those 42 Autopilot-related deaths, 16% were reportedly pedestrians or cyclists, and almost half were occupants of other vehicles.
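For concreteness, here is the back-of-the-envelope arithmetic behind those shares, a minimal sketch assuming the reported figures of 42 total deaths and 16% pedestrians or cyclists, and reading “almost half” as roughly 50%:

# Back-of-the-envelope arithmetic from the figures reported above.
# Assumes 42 total deaths, 16% pedestrians/cyclists, and treats
# "almost half" as roughly 50% occupants of other vehicles.
total_deaths = 42
pedestrians_cyclists = round(total_deaths * 0.16)     # about 7 people
other_vehicle_occupants = round(total_deaths * 0.50)  # about 21 people
remainder = total_deaths - pedestrians_cyclists - other_vehicle_occupants

print(f"Pedestrians/cyclists: ~{pedestrians_cyclists}")
print(f"Occupants of other vehicles: ~{other_vehicle_occupants}")
print(f"Remaining (Tesla drivers and occupants): ~{remainder}")

In other words, roughly seven of the dead were pedestrians or cyclists, about 21 were riding in other vehicles, and around 14 were Tesla drivers or occupants.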

Despite those numbers, a Tesla spokesperson said, “We at Tesla believe that we have a moral obligation to continue improving our already best-in-class safety systems. At the same time, we also believe it is morally indefensible not to make these systems available to a wider set of consumers, given the incontrovertible data that shows it is saving lives and preventing injury.”

Autopilot-style driver-assistance features are becoming increasingly popular, not just in luxury brands like Tesla but also in mainstream brands like Ford, Chevrolet, and GMC. The NHTSA itself says that automation can increase road safety. But until manufacturers build in checks, alerts, and other safeguards that ensure driver engagement and allow the driver to intervene when the system makes a mistake, the danger to pedestrians, cyclists, and other drivers will grow as automation features become more common.

“NHTSA’s investigation remains open as we monitor the efficacy of Tesla’s remedies and continue to work with the automaker to ensure the highest level of safety,” said an NHTSA spokesperson. “Automated technology holds great promise for improving safety but only when it is deployed responsibly; today’s action is an example of improving automated systems by prioritizing safety.”
