The federal government’s top auto safety regulator is looking into why Tesla did not issue a recall last month when it updated its car software to improve its ability to detect stopped emergency vehicles, such as police cars and fire trucks.
The regulator, the National Highway Traffic Safety Administration, has also asked Tesla for data on its Full Self-Driving software, which the company has allowed a small number of owners to test on public roads.
NHTSA opened a formal investigation over the summer into 12 crashes in which Tesla vehicles operating in Autopilot mode – a driver-assistance system that can steer, brake and accelerate a vehicle on its own – failed to detect stopped emergency vehicles with flashing lights in low-light conditions.
In a letter to Tesla on Tuesday, the agency reminded the company that federal law requires automakers to initiate recalls when they discover defects that pose a safety risk.
NHTSA asked the company for details on a software update distributed in late September that modified the Autopilot feature and improved its detection of emergency lights.
The letter asks Tesla to state whether it intends to issue a recall in connection with the update and, if not, to explain any legal or technical reasons for declining to do so.
“Any manufacturer that provides an over-the-air update to mitigate a defect that poses an unreasonable risk to motor vehicle safety must promptly submit an accompanying recall notice,” the agency said in the letter.
The letter was sent by Gregory Magno, chief of the Vehicle Defects Division within NHTSA’s Office of Defects Investigation, to Eddie Gates, Tesla’s director of field quality.
NHTSA also asked Tesla to provide the number of owners who have received the Full Self-Driving software and copies of any agreements the company has with those owners. Tesla’s chief executive, Elon Musk, has described Full Self-Driving as technology that allows cars to drive themselves in most situations. But the software is not capable of driving a car without the driver’s active involvement.