Tesla Autopilot Tech Found Not to Blame in Fatal Crash
The company wins key Autopilot trial after fatal 2019 crash; the verdict could set a precedent
A jury in California has found Tesla’s Autopilot driver-assistance technology not to blame for a fatal crash in 2019.
The landmark case marked the first time the electric automaker faced a trial in the United States over allegations that Autopilot caused a fatality, and with more legal cases imminent, the verdict could set an important precedent.
The lawsuit, which was heard in Riverside Superior Court, alleged that Tesla knowingly supplied cars with a defective Autopilot system. This ultimately led to a crash that killed Model 3 owner Micah Lee and seriously injured two passengers, including an eight-year-old boy.
The suit detailed how Lee’s Model 3 suddenly veered off a highway east of Los Angeles while traveling at 65 mph, before hitting a palm tree and bursting into flames, all in a matter of seconds.
The plaintiffs were the two injured passengers – Lindsay Molander and her son Parker Austin – who were seeking $400 million, plus punitive damages.
Tesla denied that Autopilot was faulty and argued that it was unclear whether the system was even engaged at the time of the crash. It also suggested that Lee had been drinking beforehand. Tests conducted after the accident confirmed that he had alcohol in his bloodstream, but not enough to be considered intoxicated under state law.
After four days of deliberations, the jury decided, via a 9-3 vote, that the vehicle did not have a manufacturing defect, essentially agreeing with Tesla’s contention that driver error was to blame.
However, the plaintiffs’ lawyer Jonathan Michaels said in a statement that was first reported by Reuters: “The jury’s prolonged deliberation suggests that the verdict still casts a shadow of uncertainty.”
Over the past couple of years, scrutiny of Tesla’s driver-assistance systems, Autopilot and Full Self-Driving, has intensified from a variety of quarters.
A number of lawsuits have been filed, including one in California in September 2022 that alleged Tesla had “deceived and misled consumers regarding the current abilities of its ADAS technology.”
The software is also under investigation by the U.S. Justice Department, and in late October it emerged that Tesla had been issued subpoenas for documents relating to Autopilot and FSD.
The National Highway Traffic Safety Administration is also probing several crashes involving Teslas fitted with the tech, including some in which the cars struck stationary first-responder vehicles, and in June it demanded detailed information from the company on Autopilot.
Tesla has also been accused of falsely advertising the systems’ capabilities, and last December California made a move toward preventing it from marketing its tech as Full Self-Driving.
Despite the potentially misleading names, Tesla’s website does spell out the limitations. It reads: “Autopilot, Enhanced Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.”