Tesla Issues Response Defending Autopilot
The company says regardless of the situation, it is the driver who remains in control
Tesla has issued an uncharacteristically robust response to media criticism of its Autopilot tech.
The company, which disbanded its public relations department and generally does not comment on negative stories about its advanced driver assistance system, took exception to a Dec. 10 story in the Washington Post titled “Tesla drivers run Autopilot where it’s not intended – with deadly consequences.”
In a lengthy post on X, formerly known as Twitter – owned by Elon Musk, Tesla’s co-founder and biggest shareholder – the automaker said: “While there are many articles that do not accurately convey the nature of our safety systems, the recent Washington Post article is particularly egregious in its misstatements and lack of relevant context.”
The Post piece highlighted an incident in 2019 in Key Largo, Florida, where a couple who had stopped by the road were struck by a Tesla driving on Autopilot that the Post said had “crashed through a T intersection at about 70 mph.” One was killed; the other seriously injured.
The Post continued: “But the 2019 crash reveals a problem deeper than driver inattention. It occurred on a rural road where Tesla’s Autopilot technology was not designed to be used.”
It added: “The crash is one of at least eight fatal or serious wrecks involving Tesla Autopilot on roads where the driver assistance software could not reliably operate, according to a Post analysis of two federal databases, legal records and other public documents.”
This led to its central accusation: “Even though the company has the technical ability to limit Autopilot’s availability by geography, it has taken few definitive steps to restrict use of the software.”
Tesla’s response essentially focuses on two areas: the specifics of this crash and its software in general. But its main contention is that regardless of location, it is the driver who remains in control.
With regard to the Florida incident, for which it is facing a pending lawsuit, it accused the Post of “misreporting” and “omitting several facts.”
Tesla claimed that the complaint in the lawsuit acknowledges driver misuse and negligence; that a previous lawsuit saw the complainants settle with the driver; that the Tesla driver did not blame Tesla; that the driver had acknowledged the car was his responsibility; and that the Post did not disclose the driver was pressing the accelerator to travel at 60 mph when Autopilot was restricting the car to 45 mph.
Most tellingly of all, it points out that “contrary to the Post article, the Complaint doesn’t reference complacency or Operational Design Domain” [i.e. the conditions under which Autopilot is designed to operate].
On a more general level, Tesla reiterates that “Whether the driver chooses to engage Autosteer [part of Autopilot] or not, the driver is in control of the vehicle at all times. The driver is notified of this responsibility, consents, agrees to monitor the driving assistance, and can disengage anytime.”
And it claims to have “incontrovertible data that shows [Autopilot] is saving lives and preventing injury,” citing statistics indicating that a Tesla with the tech engaged is roughly ten times less likely to be involved in a crash than the U.S. average.
Scrutiny of Tesla’s automated tech has intensified over the past couple of years, but there are those within the industry who believe some of the criticism has been motivated by Elon Musk’s increasingly controversial profile.
Whether Tesla’s Twitter defense marks the start of a concerted pushback by the automaker remains to be seen.