Tesla vehicles sit on the lot at a Tesla dealership in Austin, Texas, on April 15, 2024.
Brandon Bell | Getty Images
The National Highway Traffic Safety Administration is pressing Tesla for answers about changes the company made to its Autopilot driver assistance system following a voluntary software recall in December that affected about 2 million vehicles in the U.S.
Tesla must meet a deadline of July 1 to provide information to the regulator or face fines of up to $135.8 million, according to a letter the NHTSA sent to the company on May 6.
The recall was intended to improve Tesla’s driver-engagement systems, which monitor whether drivers are safely using Autopilot features such as traffic-aware cruise control, lane keeping and Autosteer. Since the recall, at least 20 Tesla vehicles have been involved in crashes in which the system was thought to be in use, according to a filing on the NHTSA’s website.
The “recall remedy” probe follows a three-year investigation by the agency, which found that safety issues with Tesla Autopilot contributed to at least 467 collisions and 14 deaths from January 2018 through August 2023.
The NHTSA had concluded that drivers involved in those crashes “were not sufficiently engaged in the driving task and that the warnings provided by Autopilot when Autosteer was engaged did not adequately ensure that drivers maintained their attention on the driving task.”
Driver-engagement systems, sometimes known as driver-monitoring systems, in Tesla vehicles include torque sensors in the steering wheel to detect whether drivers are keeping their hands on the wheel, and in-cabin cameras that monitor a driver’s gaze. These systems are supposed to alert inattentive drivers to pay attention and stay ready to steer or brake at any time.
The NHTSA is seeking detailed crash data from the electric vehicle maker covering the period since the agency issued the recall update on Autopilot, including data and video stored in or streamed from its cars and retained by the company.
The agency is also asking for records about Tesla’s engineering teams and their approach to “safety defect determination decision making,” “issue investigation,” “action design including human factors considerations (initial and modifications),” and “testing.”
Tesla is in the middle of a massive reorganization and sweeping layoffs. The company hasn’t disclosed how many jobs in its Autopilot and vehicle-safety engineering teams may have been cut.
For about a decade, CEO Elon Musk has been promising that Tesla is on the cusp of a self-driving breakthrough. With sales of Tesla EVs dropping in the first quarter, Musk has been focusing investors’ attention on his dream of a future full of Tesla artificial intelligence products, including robotaxis and “sentient” humanoid robots that can do factory work.