
Tesla Didn’t Fix Autopilot After Fatal Crash, Engineers Say

2023-08-18 01:19

Tesla Inc. failed to fix limitations in its Autopilot system following a gruesome Florida crash that killed a driver in 2016, company engineers said in a family’s lawsuit over a very similar 2019 fatal collision that’s headed to a jury trial.

The electric-car maker didn’t make any changes to its driver-assistance technology to account for crossing traffic in the nearly three years between two high-profile accidents that killed Tesla drivers whose cars slammed into the side of trucks, according to newly revealed testimony from multiple engineers.

After years of touting autonomous driving as the way of the future, Tesla and Chief Executive Officer Elon Musk are under legal pressure from consumers, investors, regulators and federal prosecutors who are questioning whether the company has over-hyped its progress toward self-driving vehicles during the last eight years.

Tesla also is in the cross-hairs of multiple investigations by the National Highway Traffic Safety Administration over possible defects in Autopilot linked to at least 17 deaths since June 2021.

Musk vs. Experts

The trial set for October, the first for the company over a death blamed on Autopilot, will pit Musk’s repeated assertion that Teslas are the safest cars ever made against technology experts expected to testify that the company’s marketing has lulled drivers into a false sense of security.

Read More: Tesla Fatal-Crash Lawsuit to Test Musk’s Autopilot Claims

A Florida judge excused Musk from being questioned in the case last year. Still, the billionaire chief executive is “hands-on,” “very involved with the product’s definition” and “very involved with making certain decisions around how things should work” with Autopilot, according to excerpts from a 2020 deposition of Tesla’s former director of Autopilot software, Christopher “CJ” Moore.

Tesla’s attorneys didn’t immediately respond to requests for comment.

The automaker contends it has been transparent about Autopilot’s limitations, including challenges with detecting traffic crossing in front of its cars. Tesla warns in its owner’s manual and car screens that drivers must be alert and ready to take control of vehicles at any time.

Tesla prevailed earlier this year in its first trial over a non-fatal Autopilot crash when a Los Angeles jury cleared the company of wrongdoing over a woman’s claim that the driver-assistance feature in her Model S caused her to veer into the center median of a city street.

Tractor-Trailer

The case set to be presented to a jury in Palm Beach County, Florida, was brought by the family of Jeremy Banner, a 50-year-old father of three who had switched on Autopilot 10 seconds before his Model 3 plowed into the underbelly of a tractor-trailer in 2019. An investigation by the National Transportation Safety Board found that Banner probably didn’t see the truck crossing a two-lane highway on his way to work. Autopilot apparently didn’t see it either.

Despite the company’s knowledge “that there’s cross traffic or potential for cross traffic, the Autopilot at the time was not designed to detect that,” according to testimony given in 2021 by company engineer Chris Payne that was excerpted in a recent court filing. Engineer Nicklas Gustafsson provided a similar account in a 2021 deposition.

Last week, Banner’s widow revised her complaint to seek punitive damages, raising the stakes for Tesla at trial. She argues the company should have re-programmed Autopilot so that it would shut off in dangerous circumstances after Tesla driver Joshua Brown crashed into the side of a truck in 2016.

“There is evidence in the record that the defendant Tesla engaged in intentional misconduct and/or gross negligence for selling a vehicle with an Autopilot system which Tesla knew to be defective and knew to have caused a prior fatal accident,” the Banner family said in the amended complaint.

Read More: Tesla Can’t Perfect Autopilot Without a Few Deadly Crashes

One of the expert witnesses retained by the Banner family is Mary “Missy” Cummings, who recently served as an adviser to the National Highway Traffic Safety Administration. Cummings, a Duke University professor and vocal skeptic of Autopilot, said in a court filing that Tesla “is guilty of intentional misconduct and gross negligence” for failing to test and enhance Autopilot between the Brown and Banner crashes.

Tesla made “public statements that its Autopilot technology is far more capable than it actually is,” Cummings wrote.

The company said in the wake of the 2016 crash that it altered how its driver-assistance system detected potential obstructions ahead, such as the white side of the tractor-trailer that it couldn’t distinguish against a bright sky. The newer version emphasized a radar system for scanning ahead rather than camera sensors, Tesla said.

An NTSB investigation of the 2016 collision recommended that automakers limit the use of semi-autonomous systems to road conditions for which they were designed.

Trey Lytal, a lawyer for the Banner family, said Tesla allowed the “same defect” to take two lives three years apart.

“Tesla not only knew of this defect, but was warned by regulators for the US government that the system should not be used on roads with cross traffic or people would be killed,” he said in an emailed statement.

The case is Banner v. Tesla Inc., 50-2019-CA-0099662, Circuit Court of 15th Judicial Circuit, Palm Beach County, Florida.
