Tech News

Tesla Autopilot Was Uniquely Risky—and May Still Be

A federal report published today found that Tesla’s Autopilot system was involved in at least 13 fatal crashes in which drivers misused the system in ways the automaker should have foreseen—and done more to prevent. Not only that, but the report called out Tesla as an “industry outlier” because its driver assistance features lacked some of the basic precautions taken by its competitors. Now regulators are questioning whether a Tesla Autopilot update designed to fix these basic design issues and prevent fatal incidents has gone far enough.

These fatal crashes killed 14 people and injured 49, according to data collected and published by the National Highway Traffic Safety Administration, the federal road-safety regulator in the US.

At least half of the 109 “frontal plane” crashes closely examined by government engineers—those in which a Tesla crashed into a vehicle or obstacle directly in its path—involved hazards visible five seconds or more before impact. That’s enough time, government engineers concluded, for an attentive driver to have prevented the crash or at least avoided the worst of the impact.

In one such crash, a March 2023 incident in North Carolina, a Model Y traveling at highway speed struck a teenager as he exited a school bus. The teen was seriously injured and airlifted to a hospital. The NHTSA concluded that “both the bus and the pedestrian would have been visible to an attentive driver and allowed the driver to avoid or minimize the severity of this crash.”

Government engineers wrote that, throughout their investigation, they “observed a trend of avoidable crashes involving hazards that would have been visible to an attentive driver.”

According to Tesla, the post-recall fixes included stricter driver attentiveness requirements while using Autopilot, larger driver-monitoring alerts (“Please pay attention to the road”), and a suspension policy that restricts drivers’ use of the feature if the system finds they are using it improperly.

The agency will now also look into Tesla’s statements that drivers can opt in to parts of that recall fix and can easily reverse parts of it.

“Tesla should get serious about driver monitoring, and should limit its use to roads it’s built to work on, for crying out loud,” says Phil Koopman, an engineering professor at Carnegie Mellon University whose research includes self-driving car safety. Without more serious intervention, Koopman says, a cycle of investigations and recalls could continue for years.

Car safety experts with the publication Consumer Reports found in February that Tesla’s recall fixes did not prevent Autopilot misuse.

The new probe comes at a bad time for the electric automaker, which is facing its worst sales and revenue growth numbers in years. Tesla also revealed in regulatory filings last fall that it faces an investigation by the US Department of Justice into its Autopilot feature and into how it has represented the driving ranges of its battery-powered vehicles.

The timing is also exceedingly awkward for Tesla because it’s pushing even further into autonomy. On an earnings call with investors Tuesday, CEO Elon Musk seemed unperturbed by the company’s recent downturn, focusing instead on Tesla’s work in autonomy and its plans to operate a fully autonomous ride-hail service. “Really, we should be thought of as an AI or robotics company,” he said. “If you value Tesla as just like an auto company … fundamentally, it’s just the wrong framework.”

Musk has said the company will hold an unveiling event for a purpose-built Tesla robotaxi, called the Cybercab, in August. Just this week, Tesla slashed the price of its more advanced driver assistance feature, called Full Self-Driving, by a third, to $8,000. The company began offering car customers free 30-day trials of FSD last month. Now, its original, flagship automated feature is under new scrutiny.