
Opinion: Tesla’s response to auto recall isn’t reassuring

Opinion by William Wallace

(CNN) — Imagine you turn on the news and learn that a man has been killed, tragically, while using a lawnmower. It happens to be the same model that you own and use.

In this hypothetical scenario, investigators carefully analyze what happened and conclude that, while the man was partially at fault, flaws in the lawnmower’s design contributed to his death. They recommend a couple of reasonable design changes to prevent future tragedies.

The manufacturer doesn’t make these changes. In fact, the company doubles down on the flawed design. Years go by, hundreds of similar incidents occur, and more people die, including users of the lawnmower and people who just happen to be walking nearby.

Would you be okay with this? Would you feel like the manufacturer was putting your safety first?

Unfortunately, a scenario like this has played out on multiple occasions in recent years, illustrating a sense of impunity some companies seem to feel when it comes to safety laws. The current situation involving Tesla and its Autopilot suite of driver assistance features is just the latest example.

It shows that federal safety regulators need to reassert themselves and prove they can hold companies accountable in a timely manner. Congress should support these agencies with the funding, staffing and streamlined authority they need to be more nimble.

Earlier this month, after years of scrutiny by safety officials, advocates and lawmakers, Tesla recalled more than 2 million US vehicles. The National Highway Traffic Safety Administration (NHTSA) found that drivers could too easily misuse Autopilot in situations where they were not in control of the vehicle or where the system was not designed to be used. These conclusions are similar to what the National Transportation Safety Board (NTSB) has found in investigations of Tesla crashes since 2016.

Importantly, Autopilot does not make a car self-driving. It can keep the car a set distance from vehicles traveling ahead of it and provide steering support to keep the vehicle centered in the lane. But Tesla warns drivers to keep their hands on the steering wheel, remain mindful of their surroundings and always be prepared to take immediate action.

Nevertheless, according to The Washington Post, Autopilot has been associated with 736 US crashes since 2019, including at least 17 deaths and five serious injuries. Several deaths involved people outside the Tesla, such as motorcyclists, and at least 16 crashes involved Teslas colliding with stationary first responder or maintenance vehicles.

While Tesla did not concur with the NHTSA’s analysis, the company agreed to carry out the recall voluntarily in the interest of resolving the two-year investigation. The solution offered by the company is a free over-the-air software update it says will improve controls and alerts to keep drivers engaged.

Consumer Reports is in the process of evaluating Autopilot on the Tesla vehicles in our fleet following the software update. Unfortunately, our experts’ preliminary evaluation suggests the fix is insufficient, with the software not going far enough to prevent misuse or driver inattention. For example, CR’s testers were still able to use Autopilot after covering the in-car camera, and drivers can still use the feature if they’re looking away from the road.

This recall marks a critical moment for Tesla drivers and those who share the road with them. It’s essential for Tesla and the NHTSA to actually address the serious safety issues at hand by ensuring that Autopilot can be used only in situations for which it’s been designed, such as on limited-access highways, and only when the system has verified that the driver is paying attention to the road.

It’s alarming that — based on CR’s preliminary evaluation and the assessments of other safety experts — the recall might not work effectively in its current form.

It’s especially concerning because Autopilot is not alone in the marketplace. According to CR’s most recent data, active driving assistance systems are available on more than half of 2023 model-year vehicles, and few come with the safeguards they need. It’s foreseeable that safety regulators could start to see a pattern of incidents in non-Tesla vehicles, too.

But Tesla’s massive Autopilot recall also makes clear that our auto safety system is not serving consumers the way they might expect. How can people trust that their cars are designed to be safe and free of defects if a company under scrutiny takes years to carry out a recall recommended by safety experts — and then provides a remedy that might not actually fix the problem?

As consumers, we should demand more. We should credit companies that put safety first, and call on others to step up. Congress should challenge and empower the NHTSA so that it has the resources, legal tools and independence necessary to hold companies accountable more quickly and comprehensively.

Ultimately, when a safety recall is needed, it should happen in a matter of months — not years — and it should be effective. That is certainly not too much for consumers to ask.

