Tesla faced many questions about its Autopilot technology after a Florida driver died in 2016 when the system's sensors and cameras failed to see and brake for a tractor-trailer crossing the road.
Now the company is facing more scrutiny over Autopilot than it has at any point in the past five years, even as Tesla and its chief executive, Elon Musk, have long maintained that the system makes their cars safer than others. Federal officials are examining a series of recent accidents involving Teslas that were either using Autopilot or might have been using it.
Last week, the National Highway Traffic Safety Administration confirmed that it was investigating 23 such crashes. In one crash this month, a Tesla Model Y rear-ended a police car that had stopped on a highway near Lansing, Mich. The driver, who was not seriously injured, had been using Autopilot, the police said.
In February in Detroit, under circumstances similar to the 2016 Florida crash, a Tesla drove beneath a tractor-trailer that was crossing the road, tearing the roof off the car. The driver and a passenger were seriously injured. Officials have not yet said whether the driver had turned on Autopilot.
The NHTSA is also reviewing a Feb. 27 crash near Houston in which a Tesla ran into a police car that had stopped on a highway. It is not clear whether the driver was using Autopilot. The police said the car did not slow down before the collision.
Autopilot is a computerized system that uses radar and cameras to detect lane markings, other vehicles and objects in the road. It can steer, brake and accelerate automatically with little input from the driver. Tesla has said it should be used only on divided highways, but videos on social media show drivers using Autopilot on various kinds of roads.
“We need to see the results of the investigations first, but these incidents are the latest examples showing that the advanced cruise-control features Tesla has are not very good at detecting and then stopping for a vehicle that is stopped in a highway circumstance,” said Jason Levine, executive director of the Center for Auto Safety, a group created in the 1970s by Consumers Union and Ralph Nader.
This new scrutiny arrives at a pivotal moment for Tesla. After reaching a record high this year, its share price has fallen about 20 percent amid signs that the company's electric cars are losing market share to traditional automakers. Ford Motor's Mustang Mach-E and the Volkswagen ID.4 recently arrived in showrooms and are considered serious challengers to the Model Y.
The outcome of the current investigations is important not only to Tesla but also to other technology and auto companies that are working on autonomous cars. While Mr. Musk has frequently suggested that the widespread use of such vehicles is near, Ford, General Motors and Waymo, a division of Alphabet, Google's parent company, have said that moment could be years or even decades away.
Bryant Walker Smith, a professor at the University of South Carolina who has advised the federal government on automated driving, said it was important to develop advanced technologies to reduce traffic fatalities, which now number about 40,000 a year. But he said he had concerns about Autopilot, and about how Tesla's naming and marketing imply that drivers can safely turn their attention away from the road.
“There is an incredible disconnect between what the company and its founder are saying and letting people believe, and what their system is actually capable of,” he said.
Tesla, which has disbanded its public relations department and generally does not respond to inquiries from reporters, did not return phone calls or emails seeking comment. And Mr. Musk did not respond to questions sent to him on Twitter.
The company has not publicly addressed the recent crashes. While it can determine whether Autopilot was on at the time of an accident, because its cars constantly send data to the company, it has not said whether the system was in use.
The company has argued that its cars are very safe, claiming that its own data shows Teslas are involved in fewer accidents per mile driven, and even fewer when Autopilot is in use. It has also said it tells drivers that they must pay close attention to the road when using Autopilot and should always be ready to retake control of their cars.
A federal investigation of the fatal 2016 crash in Florida found that Autopilot had failed to recognize a white semi trailer against a bright sky, and that the driver was able to use it even though he was not on a highway. Autopilot kept the car traveling at 74 miles per hour even as the driver, Joshua Brown, ignored several warnings to keep his hands on the steering wheel.
A second fatal crash occurred in Florida in 2019 under similar circumstances: a Tesla crashed into a tractor-trailer while Autopilot was engaged. Investigators determined that the driver did not have his hands on the steering wheel before impact.
While the NHTSA has not forced Tesla to recall Autopilot, the National Transportation Safety Board concluded that the system “played a major role” in the 2016 Florida crash. It also said the technology lacked safeguards to prevent drivers from taking their hands off the steering wheel or looking away from the road. The safety board reached a similar conclusion when it investigated a 2018 crash in California.
By comparison, a similar system from General Motors, Super Cruise, monitors a driver's eyes and switches off if the person looks away from the road for more than a few seconds. That system can be used only on major highways.
In a Feb. 1 letter, the chairman of the National Transportation Safety Board, Robert Sumwalt, criticized the NHTSA for not doing more to evaluate Autopilot and for not requiring Tesla to add safeguards that prevent drivers from misusing the system.
The new administration in Washington could take a firmer stance on safety. The Trump administration did not seek to impose many regulations on autonomous vehicles and sought to loosen other rules the auto industry disliked, including fuel-economy standards. By contrast, President Biden has appointed an acting NHTSA administrator, Steven Cliff, who previously worked at the California Air Resources Board, which frequently clashed with the Trump administration over regulations.
Concerns about Autopilot could dissuade some car buyers from paying Tesla for a more advanced version, Full Self-Driving, which the company sells for $10,000. Many customers have paid for it in the expectation of being able to use it in the future; Tesla made the option operational on about 2,000 cars in a “beta,” or test, version starting late last year, and Mr. Musk recently said the company would soon make it available on more cars. Full Self-Driving is supposed to be able to operate Tesla cars in cities and on local roads, where driving conditions are made more complex by oncoming traffic, intersections, traffic lights, pedestrians and cyclists.
Despite their names, Autopilot and Full Self-Driving have major limitations. Their software and sensors cannot control cars in many situations, which is why drivers must keep their eyes on the road and their hands on or near the steering wheel at all times.
In a letter to California's Department of Motor Vehicles last November, a Tesla lawyer acknowledged that Full Self-Driving struggled to react to a wide range of driving situations and should not be considered a fully autonomous driving system.
The system is “not capable of recognizing or responding” to certain “circumstances and events,” Eric C. Williams, Tesla's associate general counsel, wrote. “These include static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, occlusions, adverse weather, complicated or adversarial vehicles in the driving paths, unmapped roads.”
Mr. Levine of the Center for Auto Safety has complained to federal regulators that the names Autopilot and Full Self-Driving are misleading at best and could be encouraging some drivers to be reckless.
“Autopilot suggests the car can drive itself and, more importantly, stop itself,” he said. “And they doubled down with Full Self-Driving, and again that leads consumers to believe the vehicle is capable of doing things it is not capable of doing.”