Tesla recalls ‘Full Self-Driving’ to fix unsafe actions
By TOM KRISHER
AP Auto Writer
DETROIT (AP) — U.S. safety regulators have pressured Tesla into recalling nearly 363,000 vehicles with its “Full Self-Driving” system because it can misbehave around intersections and doesn’t always follow speed limits.
The recall, part of a larger investigation by the National Highway Traffic Safety Administration into Tesla’s automated driving systems, is the most serious action taken yet against the electric vehicle maker.
It raises questions about CEO Elon Musk’s claims that he can prove to regulators that cars equipped with “Full Self-Driving” are safer than humans, and that humans almost never have to touch the controls.
Musk at one point had promised that a fleet of autonomous robotaxis would be in use in 2020. The latest action appears to push that development further into the future.
The safety agency says in documents posted on its website Thursday that Tesla will fix the concerns with an online software update in the coming weeks. The documents say Tesla is doing the recall but does not agree with an agency analysis of the problem.
The system, which is being tested on public roads by as many as 400,000 Tesla owners, can take unsafe actions such as traveling straight through an intersection while in a turn-only lane, failing to come to a complete stop at stop signs, or going through an intersection during a yellow traffic light without proper caution, NHTSA said. The problems happen in “certain rare circumstances,” the agency wrote.
In addition, the system may not adequately respond to changes in posted speed limits, or it may not account for the driver’s adjustments in speed, the documents said.
“FSD beta software that allows a vehicle to exceed speed limits or travel through intersections in an unlawful or unpredictable manner increases the risk of a crash,” the agency said in documents.
Musk complained Thursday on Twitter, which he now owns, that calling an over-the-air software update a recall is “anachronistic and just flat wrong!” A message was left Thursday seeking further comment from Tesla, which has disbanded its media relations department.
Tesla received 18 warranty claims, filed from May 2019 through Sept. 12, 2022, that could have been caused by the software, the documents said. But the Austin, Texas, electric vehicle maker told the agency it is not aware of any deaths or injuries.
In a statement, NHTSA said it found the problems during tests performed as part of an investigation into Tesla’s “Full Self-Driving” and “Autopilot” systems, which take on some driving tasks. The investigation remains open, and the recall doesn’t address the full scope of what NHTSA is scrutinizing, the agency said.
Despite the names “Full Self-Driving” and “Autopilot,” Tesla says on its website that the cars cannot drive themselves and owners must be ready to intervene at all times.
NHTSA’s testing found that Tesla’s FSD beta “led to an unreasonable risk to motor vehicle safety based on insufficient adherence to traffic safety laws.”
Raj Rajkumar, a professor of computer engineering at Carnegie Mellon University, doubts that Tesla can fix all of the problems cited by NHTSA with a software update. The automaker, he says, relies only on cameras and artificial intelligence to make driving decisions, a system that will make mistakes.
“Cameras can miss a lot of things,” Rajkumar said. “These are not straightforward issues to fix. If they could have fixed it, they would have fixed it a long time back.”
Most other companies with self-driving vehicles use laser sensors and radar in addition to cameras to make sure vehicles see everything. “One sensing modality is not perfect by any metric,” Rajkumar said.
He questioned whether NHTSA will require testing before the software update is sent out to make sure it works. The agency said that it works closely with automakers as they develop recall remedies “to ensure adequacy.”
In documents, NHTSA says that on Jan. 25, as part of regular communications with Tesla, it told the automaker about concerns with FSD, and it asked Tesla to do a recall. On Feb. 7, Tesla decided to do the recall out of an abundance of caution, “while not concurring with the agency’s analysis.”
The recall is another in a list of problems that Tesla has with the U.S. government. In January, the company disclosed that the U.S. Justice Department had requested documents from Tesla about “Full Self-Driving” and “Autopilot.”
NHTSA has been investigating Tesla’s automated systems since June of 2016 when a driver using Autopilot was killed after his Tesla went under a tractor-trailer crossing its path in Florida. A separate probe into Teslas that were using Autopilot when they crashed into emergency vehicles started in August 2021. At least 14 Teslas have crashed into emergency vehicles while using the Autopilot system.
NHTSA has sent investigators to 35 Tesla crashes in which automated systems are suspected of being used. Nineteen people have died in those crashes, including two motorcyclists.
The agency also is investigating complaints that Teslas can brake suddenly for no reason.
Since January of 2022, Tesla has issued 20 recalls, including several that were required by NHTSA. The recalls include one from January of last year for “Full Self-Driving” vehicles being programmed to run stop signs at slow speeds.
“Full Self-Driving” went on sale late in 2016, and Musk has used the name ever since. It currently costs $15,000 to activate the system.
The recall announced Thursday covers certain 2016-2023 Model S and Model X vehicles, as well as 2017-2023 Model 3s and 2020-2023 Model Y vehicles equipped with the software, or with installation pending.
Shares of Tesla closed Thursday down 5.7%. The stock has rallied about 64% in the year to date, reversing 2022’s hefty loss.