Federal highway safety investigators want Tesla to tell them how and why it developed the fix in a recall of more than 2 million vehicles equipped with the company’s Autopilot partially automated driving system.
Quick Read
- US regulators query Tesla on Autopilot recall effectiveness: The U.S. National Highway Traffic Safety Administration (NHTSA) is questioning Tesla about the development and verification of a recall fix for over 2 million vehicles equipped with its Autopilot system, following 20 reported crashes since the update.
- Concerns over driver warnings and behavior: NHTSA is scrutinizing whether the new driver warnings, implemented to improve safety on non-highway roads, are effective, especially when drivers potentially bypass monitoring systems.
- In-depth investigation into human factors and qualifications: The agency’s inquiry demands extensive details from Tesla on how human behavior science influenced the Autopilot design and the qualifications of the personnel involved in this evaluation.
- Tesla’s broader safety measures and updates: Following the recall, Tesla issued further safety updates, which NHTSA is also evaluating for their effectiveness and timing, amid ongoing concerns about the system’s use in inappropriate settings.
- Background of Autopilot scrutiny: This inquiry is part of a broader examination of Tesla’s Autopilot after multiple incidents, including crashes involving parked emergency vehicles that have raised significant safety concerns.
The Associated Press has the story:
US seeks info from Tesla on how it developed, verified whether Autopilot recall worked
Newslooks- DETROIT (AP) —
Investigators with the U.S. National Highway Traffic Safety Administration have concerns about whether the recall remedy worked because Tesla has reported 20 crashes since the remedy was sent out as an online software update in December.
The recall was also meant to address whether Autopilot should be allowed to operate on roads other than limited-access highways. The fix for that was increased warnings to the driver on roads with intersections.
But in a letter to Tesla posted on the agency’s website Tuesday, investigators wrote that they could not find a difference between the warnings to the driver to pay attention before the recall and after the new software was sent out. The agency said it will evaluate whether driver warnings are adequate, especially when a driver-monitoring camera is covered.
The agency asked for volumes of information about how Tesla developed the fix, and zeroed in on how the company used human behavior science to test the recall’s effectiveness.
The 18-page letter asks how Tesla used human behavior science in designing Autopilot, and the company’s assessment of the importance of evaluating human factors.
It also wants Tesla to identify every job involved in human behavior evaluation and the qualifications of the workers. And it asks Tesla to say whether the positions still exist.
A message was left by The Associated Press early Tuesday seeking comment from Tesla about the NHTSA letter.
Tesla is in the process of laying off about 10% of its workforce, roughly 14,000 people, in an effort to cut costs amid falling global sales. CEO Elon Musk is telling Wall Street that the company is more of an artificial intelligence and robotics firm than an automaker.
Phil Koopman, a professor at Carnegie Mellon University who studies automated driving safety, said the letter shows that the recall did little to solve problems with Autopilot and was an attempt to pacify NHTSA, which demanded the recall after more than two years of investigation.
“It’s pretty clear to everyone watching that Tesla tried to do the least possible remedy to see what they could get away with,” Koopman said. “And NHTSA has to respond forcefully or other car companies will start pushing out inadequate remedies.”
In the letter, NHTSA also asks Tesla for information about how the recall remedy addresses driver confusion over whether Autopilot has been turned off when force is put on the steering wheel. Previously, if Autopilot was deactivated, drivers might not quickly notice that they had to take over driving.
The recall added a function that gives a “more pronounced slowdown” to alert drivers when Autopilot has been disengaged. But the recall remedy doesn’t activate the system automatically — drivers have to do it. Investigators asked how many drivers have taken that step.
NHTSA is asking Tesla, “What do you mean you have a remedy and it doesn’t actually get turned on?” Koopman said.
The letter, he said, shows NHTSA is looking at whether Tesla did tests to make sure the fixes actually worked. “Looking at the remedy I struggled to believe that there’s a lot of analysis proving that these will improve safety,” Koopman said.
The agency has said it will evaluate the “prominence and scope” of Autopilot’s controls to address misuse, confusion and use in areas that the system is not designed to handle.
Safety advocates have long expressed concern that Autopilot, which can keep a vehicle in its lane and a distance from objects in front of it, was not designed to operate on roads other than limited access highways.
Tesla tells owners that the system cannot drive itself despite its name, and that drivers must be ready to intervene at all times.
The agency also says Tesla made safety updates after the recall fix was sent out, including an attempt to reduce crashes caused by hydroplaning and to reduce collisions in high speed turn lanes. NHTSA said it will “assess the timing and driving factors behind these updates, their impacts on subject vehicle performance and Tesla’s basis for not including them” in the original recall.
NHTSA began its Autopilot crash investigation in 2021, after receiving 11 reports that Teslas using Autopilot had struck parked emergency vehicles. In documents explaining why it closed the investigation because of the recall, NHTSA said it ultimately found 467 crashes involving Autopilot resulting in 54 injuries and 14 deaths.