
Dec. 27, 2024

EV litigation, regulation will continue changing 

Experts expect product liability claims will overtake driver negligence lawsuits when fully autonomous vehicles become the norm. But assigning liability will be more difficult.

The National Highway Traffic Safety Administration on Dec. 20 announced a national reporting and review framework for manufacturers of driverless cars. The voluntary program would have manufacturers provide the agency with data about the safety and performance of vehicles equipped with automated driving systems. In return, those manufacturers would get an "exemption pathway" for vehicles with autonomous driving systems.

The Federal Motor Vehicle Safety Standards contain no requirements specific to self-driving vehicles. Automobiles that comply with existing standards can generally be equipped with automated driving systems; self-driving vehicles designed without mirrors or pedals, for example, need exemptions to operate. It remains to be seen whether the proposed rulemaking survives. Reuters reported that a document from President-elect Donald Trump's transition team advocated scrapping a car-crash reporting rule that Tesla opposes. Jason Miller, a senior adviser to the transition team, told the news wire that the recommendations came "from outsiders who have no role in charting administration policy."

Tesla has reported more than 1,500 crashes to federal safety regulators under that rule, putting the automaker in the crosshairs of NHTSA investigators. Tesla's CEO, Elon Musk, spent millions of dollars helping Trump get elected.

Litigation stemming from car accidents is expected to evolve as well. Experts said they expect product liability claims will overtake driver negligence lawsuits when fully autonomous vehicles become the norm. But assigning liability will be more difficult.

"If there was a defect with the car, then the manufacturer would be liable for the software," attorney Aaron H. Jacoby told the Daily Journal. "If there's a defect with the software or something in the autonomous driving that's going on, then that would be the software company. It could even be - I don't know if there's a way that a passenger can override what the car is doing - but if there was such an override, then the passenger could be responsible if they told the car to override, and that led to an accident."

Jacoby is managing partner of ArentFox Schiff's Los Angeles office and leader of the firm's automotive practice.

Advocates for self-driving taxis say they are safer than those operated by humans.

Waymo, which operates a fleet of robotaxis in several U.S. cities, claims that driverless vehicles will reduce traffic fatalities. A recent study by Waymo and insurer Swiss Re found that Waymo's self-driving taxis generated fewer property damage and bodily injury claims than vehicles driven by humans.

The study analyzed liability claims stemming from collisions over 25 million fully autonomous miles driven by Waymo in Austin, Los Angeles, Phoenix and San Francisco, and compared them with a human-driver baseline drawn from more than 500,000 Swiss Re claims covering 200 billion miles.

But there are various levels of automation, and things can get dicey when humans are involved. SAE International, formerly the Society of Automotive Engineers, defines six levels of driving automation, ranging from Level 0, no automation, to Level 5, full automation, meaning a car can drive itself anywhere under any conditions.

Tesla offers two self-driving technologies: Autopilot and Full Self-Driving, which is in beta testing. Autopilot is a Level 2 system, which means it can steer, brake and accelerate, but humans must monitor it at all times and intervene when necessary. Full Self-Driving, despite its name, is also classified as a Level 2 system requiring constant driver supervision. Waymo's fleet of robotaxis operates at SAE Level 4, meaning the vehicles can drive themselves under certain conditions.

In April 2023, a Los Angeles jury found human error was to blame in a lawsuit claiming the Autopilot feature in a Tesla Model S caused it to crash into a median and injure its passengers. Plaintiff Justine Hsu was driving a leased 2016 Model S 75D when Autopilot allegedly failed to recognize the center median. The air bag deployed in a "slingshot-like fashion," fracturing Hsu's jaw in several places and knocking out teeth, according to the complaint. Hsu v. Tesla Inc., 20STCV18473 (L.A. Super. Ct., filed May 14, 2020).

Tesla's attorneys argued that the car's Autopilot feature did not relieve her of her obligation to ensure the safe operation of the vehicle.

"The jury said she chose to put it in Autopilot and therefore she's negligent and Tesla gets a pass," Hsu's attorney, Alison S. Gokal of Gokal Law, told the Daily Journal.

"I'm always hesitant to speculate about where this is going because the technology is moving fast, and the court system is moving so slowly," Gokal continued. "There is still going to be driver's negligence but not at the same rate. Product defect is a much heavier load to carry than simple negligence between person to person."

In November 2023, a Riverside County jury ruled 9-3 against claims that a design defect caused a fatal crash, in the nation's first civil trial over a wrongful death involving Autopilot. The plaintiffs said Micah Lee was using Autopilot on his Tesla Model 3, traveling at 73 miles per hour, when the car veered off the freeway at a 43-degree angle, crashed into a tree and burst into flames, killing him instantly.

One of the main questions for the jury was whether a defect in the Autopilot system or driver error caused the crash. Throughout the trial, jurors heard testimony from Tesla experts that the automatic steering system cannot turn at an angle beyond 18 degrees and that it would disengage once that limit was exceeded, returning control to the driver. Tesla's counsel also argued that the crash was due to human error because Lee had a blood alcohol level of 0.05% at the time.

"Partial automation puts lots of responsibility on the driver," Stanford Law School professor Robert L. Rabin said. "We're still in a world where driver negligence may account for a large percentage of the accidents that occur with partially automated vehicles."

Rabin does not believe the tort system can deal with the issues that will arise from fully automated vehicles. He co-wrote an article for the Stanford Law Review in 2018 advocating the replacement of the tort remedy with a no-fault system akin to workers' compensation. For now, his proposal remains theoretical.

"It does seem to me what whenever autonomous vehicles become dominant, the issues of technology and safety become so complex and difficult to discern, especially with machine learning with constant updates," Rabin said.

"A no-fault type of system seems the best way, especially from a compensation perspective and a deterrence perspective.

If vehicle injuries were tracked to the companies that produced the vehicles, that would create a certain amount of deterrence and incentive for greater safety for the manufacturers responsible," he continued.

Such a system would have to be national, Rabin noted.

"You can't have tort in one state and a no-fault system in another in a highly mobile society like ours," he said.


Antoine Abou-Diwan

Daily Journal Staff Writer
antoine_abou-diwan@dailyjournal.com
