WSJ Uncovers Hidden Autopilot Data That Explains Why Teslas Crash
Tesla has reported more than a thousand accidents to the NHTSA since 2021, according to the Wall Street Journal (WSJ). Until now, the details of these accidents have been hidden from public view. The publication reports that Tesla relies heavily on camera technology, an approach that differs from the rest of the industry.
Expert concerns and investigations into Tesla Autopilot
Autonomous driving expert Missy Cummings said back in 2016, “There is no question someone is going to die in this technology. The question is when.”
Cummings also told the WSJ, “Computer vision is such a deeply flawed technology. And we’re not doing enough to determine how to fix its gaps and how to make it recognize objects.”
Since 2021, the NHTSA has launched a series of investigations into Tesla’s Autopilot technology. However, the WSJ reports that the agency has released little information about these investigations, and the cars remain on the road. Tesla has not been forthcoming with information about Autopilot, which the company says is proprietary.
For this story, the WSJ matched crash reports from individual states with the crash data Tesla submitted to the NHTSA. It found that “long-standing concerns about the Autopilot technology are showing up on America’s roads.”
Of the 1,000-plus crashes Tesla reported to the NHTSA, the WSJ was able to uncover details on 222, and found that 44 of them occurred when Autopilot “veered suddenly.” Another 31 occurred when a Tesla failed to stop or yield for an object that appeared in front of it.
These “failure to stop” crashes, in which the Tesla drives off the road or into another vehicle, accounted for the most injuries and deaths among the cases the WSJ studied.
One of these crashes killed husband and father Steven Hendrickson. Hendrickson was driving his Tesla Model 3 to work when the vehicle crashed at full speed into an overturned semi-truck on the road. Autopilot appears to have failed to recognize the truck. Hendrickson’s wife has filed a lawsuit against Tesla.
Finding a fundamental flaw in Autopilot technology
The WSJ obtained a video of Hendrickson’s crash and asked experts to analyze it. One noted that “the kind of things that tend to go wrong with these systems are things like it was not trained on the pictures of the overturned double trailer. It just didn’t know what it was…The way machine learning works is that it was trained on a bunch of examples. If it encounters something it doesn’t have a bunch of examples for, it may have no idea what’s going on.”
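To make the expert’s point concrete, here is a minimal sketch of that failure mode. The toy classifier, its classes, and its numbers are all invented for illustration; it is nothing like Tesla’s actual software. The key behavior is that a trained model must map every input onto one of the classes it already knows:

```python
import math

# Invented "feature vectors" for the only classes this toy model was trained on.
TRAINING_CENTROIDS = {
    "car":        (1.0, 1.0),
    "pedestrian": (5.0, 5.0),
    "clear_road": (9.0, 1.0),
}

def classify(features):
    """Return the nearest known class and a naive confidence score.

    The model has no concept of "something I was never trained on";
    it can only answer with one of its known labels.
    """
    distances = {label: math.dist(features, centroid)
                 for label, centroid in TRAINING_CENTROIDS.items()}
    best = min(distances, key=distances.get)
    ranked = sorted(distances.values())
    confidence = 1.0 - ranked[0] / (ranked[1] + 1e-9)
    return best, confidence

# An "overturned trailer" was never in the training set. Its made-up
# features happen to sit closest to the "clear_road" centroid.
label, confidence = classify((9.2, 0.8))
print(label, round(confidence, 2))  # prints: clear_road 0.95
```

Under these made-up numbers, the never-before-seen obstacle looks most like “clear_road,” and the toy model reports that label with high confidence. That is the shape of the failure the expert describes.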
The publication points out a “fundamental flaw” in Autopilot’s technology: it relies on cameras and computer vision, with radar as a backup in some models. Other manufacturers’ vehicles combine cameras and computer vision with radar and LIDAR (light detection and ranging). Tesla CEO Elon Musk has said that LIDAR is unnecessary and expensive.
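The value of carrying multiple sensor types is easiest to see with a back-of-the-envelope calculation. The miss rates below are invented, and real sensor failures are not independent (fog can degrade cameras and LIDAR at the same time), so this is only a sketch of why redundancy helps:

```python
# Hypothetical per-sensor chances of missing an obstacle (invented numbers).
p_camera = 0.01
p_radar = 0.02
p_lidar = 0.02

# If failures were independent, an obstacle is missed only when every
# sensor on board misses it at once.
print(f"camera only:            {p_camera:.6f}")                      # 0.010000
print(f"camera + radar:         {p_camera * p_radar:.6f}")            # 0.000200
print(f"camera + radar + LIDAR: {p_camera * p_radar * p_lidar:.6f}")  # 0.000004
```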
John Bernall, a former Tesla employee who was fired in 2022 for posting videos of Autopilot failures, tells the WSJ, “What I noticed from these image clips is that these cameras are not calibrated properly.” The WSJ also notes that when the cameras on a vehicle aren’t seeing the same thing, the system can struggle to identify objects and obstacles. As an example, the publication cites raw data from an Autopilot crash similar to Hendrickson’s, obtained from an anonymous hacker. When a crashed pickup truck appears on the road ahead, one of the Tesla’s cameras sees it, but the other doesn’t. Even as the obstacle gets closer, the cameras fail to recognize it, and the Tesla hits it at full speed.
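Here is a hypothetical sketch of why that disagreement matters. The names, types, and decision logic below are illustrative assumptions, not Tesla’s code; the point is that when two cameras report different things, treating the road as clear is the dangerous default:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    label: str
    confidence: float

def fuse(left: Optional[Detection], right: Optional[Detection]) -> str:
    """Decide what the planner should do given two cameras' reports."""
    if left and right and left.label == right.label:
        return f"both cameras agree on {left.label}: brake"
    if left or right:
        # Only one camera reports an obstacle. That may mean miscalibration
        # or a blind spot; the conservative response is caution, not "clear".
        return "cameras disagree: slow down and alert the driver"
    return "no obstacle reported: proceed"

# One camera sees the crashed pickup; the other reports nothing.
print(fuse(Detection("overturned_pickup", 0.62), None))
```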
Conflicting views on safety
Over the years, Musk has maintained that his vehicles and the technology are safe, saying, “I do think that long-term, [Autopilot] can reduce accidents by a factor of 10…the safety per mile is better than human driving.”
On the other hand, Cummings tells the WSJ, “I am besieged with requests from families of people who have been killed in Tesla crashes. It’s really tough to explain to them that this is the way the tech was designed.”
The Department of Justice is also investigating Tesla over its marketing of Autopilot. The investigation centers on concerns that Tesla’s claims misled drivers into believing the cars have full self-driving capabilities. Tesla denies misleading the public and argues that it has never claimed its vehicles are fully autonomous.
Were you or a loved one injured in a Tesla Autopilot accident? At Plattner Verderame, P.C., we can help. Call our office or submit our contact form today to schedule your first meeting at no cost to you. We are located in Phoenix and Tempe and look forward to speaking with you.