


A Florida jury on Friday found that flaws in Tesla’s self-driving software were partly to blame for a crash that killed a 22-year-old woman in 2019 and severely injured her boyfriend. The verdict is a significant setback for the carmaker, which is staking much of its future on developing self-driving taxis.
The jury awarded $59 million in compensatory damages to the family of the woman and $70 million to her boyfriend, plus $200 million in punitive damages. Tesla will be required to pay a third of the compensatory damages and all of the punitive damages.
The jury found that Tesla bore 33 percent responsibility for the crash, and blamed the driver, George Brian McGee, for the remainder. Mr. McGee had previously settled with the family for an undisclosed sum.
The decision comes just weeks after Tesla began limited testing of autonomous taxis in Austin, Texas. Elon Musk, the company’s chief executive, said in a conference call with investors in July that the service could cover half the population of the United States by the end of the year.
Mr. Musk, who has a history of being overly optimistic about how quickly products will become available, has said that Tesla’s growth hinges on revenue from autonomous taxis and humanoid robots rather than car sales, which have been declining.
Tesla said it would appeal the verdict.
“Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology,” the company said in a statement.
The trial, in federal court in Miami, focused attention on Tesla’s driver-assistance technology, known as Autopilot, and on whether the company’s self-driving software is safe.
It was the first federal lawsuit stemming from a fatal accident involving Autopilot to go to a jury. Tesla has settled several other cases out of court.
Tesla was sued by the family of Naibel Benavides, a college student who died on April 25, 2019, after being struck by a Tesla Model S sedan driven by Mr. McGee on a dark, two-lane road near Key Largo, Fla. Dillon Angulo, her boyfriend, was severely injured and is a plaintiff in the suit, which was filed in U.S. District Court for the Southern District of Florida.
Mr. McGee was approaching a T-intersection with Tesla’s Autopilot software activated when he dropped his phone and bent to look for it. The Tesla blew through the intersection at more than 50 miles per hour and crashed into a black S.U.V. legally parked on the far side, according to testimony.
Ms. Benavides and Mr. Angulo were standing outside the S.U.V.
Mr. McGee told police immediately after the crash that he did not notice the intersection or the stop sign posted nearby. Data from the car showed he hit the brakes 1.65 seconds before impact.
While approaching the intersection, Mr. McGee had his foot on the accelerator pedal, overriding a function of Autopilot that is capable of stopping for objects in the road.
Mr. McGee said on the witness stand that he thought Autopilot would protect him and prevent a serious crash if he made a mistake.
Brett Schreiber, who represented the plaintiffs, accused Tesla of a “misinformation campaign” that exaggerated Autopilot’s capabilities and caused drivers to become complacent. He quoted Mr. Musk as saying that the system was safer than a human being.
“The car they claimed to have invented didn’t exist,” Mr. Schreiber said during closing arguments Thursday. “They knew all along that the Autopilot was defective.”
Tesla’s lawyers sought to lay the blame on Mr. McGee.
Joel Smith, representing Tesla, noted that Mr. McGee had admitted being distracted after dropping his phone. Mr. McGee was a “reckless” and “aggressive” driver who was traveling well over the speed limit, Mr. Smith said.
“No car could have prevented” the crash, Mr. Smith said.
The plaintiffs’ lawyer said data and video from the car showed that Autopilot recognized the S.U.V., at least one pedestrian and the end of the road before the accident.
Mary Cummings, an expert on autonomous driving technology and a former safety adviser to the National Highway Traffic Safety Administration, testified that Autopilot was defective because it failed to react to obstacles it recognized and failed to ensure Mr. McGee kept his eyes on the road.
Similar driver-assistance systems made by General Motors and Ford Motor have cameras that track a driver’s eyes to make sure they are looking at the road. The version of Autopilot in Mr. McGee’s Tesla would keep operating as long as the driver touched the steering wheel occasionally, whether his eyes were on the road or not. Newer versions of Tesla’s system have cameras that monitor drivers.
Mr. Smith, the lawyer for Tesla, said the company never claimed its cars could drive without human oversight. He showed jurors excerpts from the car’s owner’s manual that warned, “It is the driver’s responsibility to stay alert, drive safely and be in control of the vehicle.”
Federal safety officials were aware of at least 211 accidents from 2018 to 2023 involving Tesla cars operating with Autopilot engaged, according to evidence presented during the trial.
The lawsuit also claimed Tesla withheld crucial data and video from Mr. McGee’s car, and only produced it after the plaintiffs recovered the data and video from the car’s computer on their own.
Mr. Smith said the data was deleted by mistake.