
Dec. 27, 2024

Legal battles in the electric vehicle era: surveillance, safety and liability

This year, electric vehicles became key players in California's legal system. From Teslas assisting police investigations to General Motors abandoning its robotaxi program after a tragic accident, the intersection of technology and law raised pressing questions about privacy, liability and mass surveillance.


Take a ride in a self-driving taxi with one of California's foremost automotive industry lawyers.

Legal matters involving electric vehicles in California this year included Oakland police commandeering Teslas as witnesses; General Motors exiting the robotaxi business after a Cruise vehicle badly injured a woman; SEC and DMV investigations; trade secret and investor claims; and, on the positive side, mergers and patents.

In what might be seen as a sequel with a twist to the 1987 movie "RoboCop," Oakland police detained a Tesla Model X in July and reviewed what its external cameras picked up as it sat outside a beauty supply shop during a gunfight. The high-quality footage led to two people being charged with murder within a couple of weeks.

Oakland police obtained search warrants and court permission to tow Teslas at least three times this year, and police in other cities have done the same. These cases depend on the vehicles being in Sentry Mode, in which cameras and sensors record activity around a locked, unoccupied car.

In one instance a car recorded a criminal trying to break into the vehicle next to it.

The potential legal implications, including issues related to property rights, mirror those associated with video footage from home doorbell cameras. However, there's a key difference: If the owners are unavailable, the police get warrants to take custody of the cars to ensure the recordings are preserved and a chain of custody is maintained.

Another issue is what police have a right to do with information on the recording that is not related to the crime they are investigating.

"The Fourth Amendment provides that the people have the right to be secured in their effects against unreasonable search and seizures. The purpose has always been to safeguard an individual from arbitrary government action," commented Long Beach attorney Robin D. Perry. When police want to access a Tesla's stored footage, "a judge evaluates the warrant application knowing full well that issuing the warrant would deprive a presumably innocent owner of a car, not an insignificant thing in car dependent southern California. That burden could be enormous. I was involved in a homicide case where an innocent third party was deprived of his car for months because the car was 'evidence.'"

One potential legislative fix, Perry suggested, "is to require law enforcement to notify innocent third party vehicle owners within 48 hours if their vehicle has been seized. Law enforcement should also be required, where possible, to seize only the video storage device, like a Tesla USB drive, and return the vehicle within 10 business days or be subject to fines and penalties. Such legislation would safeguard the public while ensuring law enforcement can potentially obtain valuable evidence of a crime from the technology."

Orin Kerr, the leading scholar on computer searches in the U.S., commented, "The police should rely on the car owner's permission if they can. But if the car owner is unknown, or won't give permission, a search warrant may be the only way to obtain the needed records.

"The warrant should be written narrowly based on the known date and time of the crime, so other information is not collected," Kerr said.

"But this is always a challenge with digital evidence warrants: It's a needle in the haystack problem. How do you authorize the police to find the needle without also showing them the haystack?" added Kerr, who is the William G. Simon Professor of Law at UC Berkeley but moves to Stanford in a few days.

Eugene Volokh, the Thomas M. Siebel Senior Fellow at the Hoover Institution at Stanford University, explained the law that covers such police actions. "A search warrant needs to be based on probable cause to believe that it will uncover evidence of a crime - whether or not the owner of the property is the one suspected of a crime," he wrote in an email. "And if the warrant uncovers evidence of a crime that's unrelated to what was being searched for, that will still be usable as evidence (including against the owner, even if he hadn't at first been suspected)."

Oakland police officers first utilized a Tesla for "surveillance" in a hotel parking lot where they believed a Canadian tourist's vehicle might have captured footage of a homicide, as reported by the San Francisco Chronicle. While investigating the death of a man who had been shot and stabbed, they spotted the Tesla and recognized its potential evidentiary value. Unable to reach the owner, they obtained a warrant to tow the car. However, the owner returned just as his car was being hooked up and willingly provided access to the vehicle's USB drive.

To obtain the warrant, Oakland Officer Kevin Godchaux wrote in an affidavit obtained by the Chronicle: "I know that Tesla vehicles contain external surveillance cameras in order to protect their drivers from theft and/or liability in accidents. Based on this information, I respectfully request that a warrant is authorized to seize this vehicle from the La Quinta Inn parking lot so this vehicle's surveillance footage may be searched via an additional search warrant at a secure location."

"To be sure, the police can't use a warrant for the cameras just to rummage through the car," Volokh noted. "But if they see something in plain view as they're seizing the car ... then that can indeed be used as evidence."

Markéta Sims, vice president of California Attorneys for Criminal Justice, answered a question about the car owner's rights. "If the car is not promptly returned after the seizure, the owner may bring a motion in criminal court for its return. If the police have a valid warrant, there is no basis to sue them for seizing the car," Sims explained.

Laurie L. Levenson, director of the Center for Legal Advocacy at Loyola Law School, has suggestions for ameliorating or eliminating civil rights problems arising from seizure of the cars and police review of the footage.

"I don't think the vehicle owners can claim any of their privacy rights were violated if their cameras just capture what occurs in public," she wrote in an email. "However, imagine that they also seize footage of the inside of the car, perhaps while it is parked in the owner's garage. All sorts of 'intimate activities' can occur in vehicles. A warrant would have to be careful to prescribe that only footage of public activities could be seized. Private activities should not be seized and, if necessary, perhaps the judge (or a designated special master) should be used to review the film to make sure that protected materials are not disclosed.

"It would be far better if owners who buy such cars are advised that, like cameras outside of businesses and homes, their footage might someday be seized by a warrant," Levenson wrote. "This is a new area so the courts have not yet stated what type of notice must be given to the car owners. With new technology come new challenges of our Fourth Amendment laws. The legal standard is one of 'reasonableness.' If the police just start towing all such vehicles without notice, courts may find that the practice is unreasonable and officers should wait, safeguard the car, and wait for the driver, unless there is an ongoing emergency.

"Finally," Levenson wrote, "this might be an area where the Legislature could step in to give guidance as to how the police should proceed. It would be good to have a thoughtful, informed discussion of the issue that leads to rules for the police conduct."

Perry made a similar point. "These cases really pose a fundamental challenge to the Fourth Amendment. There have been millions of Teslas sold just here in California. There are millions of video doorbells. Amazon Alexa, Siri and the like are inside of our homes and office. As these technologies continue to proliferate, the real question is what is the broad impact of mass surveillance technologies on the public's right to be secured against unreasonable search and seizures? Legally, this is an unsettled question, and the law continues to play catch up."

In the San Francisco accident case, it was the electric vehicle that got into trouble and made a bad situation worse. And so did the company that owned it. It was a scenario in which a human driver would have realized what was happening, but the robot did not.

In September, Cruise, General Motors' self-driving car unit, agreed to a $1.5 million fine after admitting to submitting a false report and failing to disclose crucial details about an October 2023 accident involving a pedestrian, as announced by the National Highway Traffic Safety Administration (NHTSA).

A Cruise robotaxi struck a pedestrian who had initially been hit by a human-driven vehicle. The robotaxi halted with the woman trapped underneath, then attempted to move to the side of the road, dragging her approximately 20 feet at around 7 miles per hour and worsening her injuries to a life-threatening extent.

The next morning, during a video conference with the federal highway agency, Cruise employees omitted any mention of the dragging. They also failed to include it in their written report, and technical issues prevented them from presenting a complete video of the crash, according to the Department of Justice.

Even after subsequently providing the video, Cruise did not amend its initial report. "That omission rendered the report inaccurate and incomplete in light of NHTSA's requirements," the DOJ stated. Cruise's management, including counsel, was ousted and its budget was cut.

As part of this year's settlement, Cruise committed to cooperating with government oversight for three years, implementing a safety compliance program, and providing annual updates to the DOJ. Additionally, the Department of Motor Vehicles suspended Cruise's permits for driverless vehicle deployment and testing, and the Securities and Exchange Commission opened an investigation.

Cruise also settled a civil lawsuit with the injured woman, agreeing to pay her at least $8 million, as reported by Fortune.

In a post about how autonomous vehicles might change the landscape of car accident lawsuits, Oakland personal injury firm Venardi Zurada LLP commented on the case: "At this point, autonomous vehicles are beholden to the same laws of traffic that their human counterparts are subject to. When they violate the rules of traffic, the company that operates the taxi can be held accountable.

"If the vehicle does something abnormal, like drag a pedestrian 20 feet while attempting to pull over, the company that owns and designed the robotaxi is responsible. Such lawsuits are generally filed under a theory of product liability ... so, there are multiple areas of personal injury law that this type of case touches upon."

On Dec. 10, GM announced it was abandoning the robotaxi business, saying it would refocus on driver-assistance systems and autonomous vehicles.

This makes Waymo the leading company in the U.S. for self-driving taxis. In October, the company started a long-term partnership with Hyundai, with legal help from O'Melveny & Myers LLP. Waymo's latest self-driving technology, called the Waymo Driver, will be put into Hyundai's IONIQ 5 SUV. These cars will gradually join Waymo's taxi service, Waymo One.

Also in October, Elon Musk unveiled Tesla's robotaxi prototype at Warner Bros. Studio in Burbank.

After that demonstration, personal injury attorney Tray Gober of Lee, Gober & Reyna commented, "Whereas a driver who was operating a car in a crash can be found guilty of manslaughter, I don't see that happening to a coder working on the AI for a Tesla Model Y being used as a robotaxi. Instead, when someone dies or is seriously hurt, it's going to have to be resolved in civil court with a product liability claim," either for defective design or defective manufacturing.

Gober is based in Texas, where Tesla headquarters moved in 2021, leaving Palo Alto. "Texas and California seem to be giving a lot of leeway to autonomous vehicles," Gober said. "In Texas, crash reports have been updated to include a section the investigating officer fills out that rates the level of autonomy of the vehicle at the time of the crash, so that shows how much the state is accommodating new technology."

The limits and liabilities of automated cars have been tested in court for several years now, and most of these cases have involved Tesla. It has more of these vehicles on the road, and its assisted-driving technology is used more often. Tesla vehicles account for nearly 70% of all incidents in which Level 2 semi-autonomous driving systems were in use, according to NHTSA reports. However, the agency's data also showed Tesla's vehicles hold some of the lowest probabilities of injury in crash tests.

Elise Sanguinetti of Arias Sanguinetti Wang & Team LLP has handled many accident cases involving Teslas. "Litigating against Elon Musk is a tall task as his deep pockets create a wall of attorneys between us and any information that may help the people who have been seriously injured or even killed while driving a Tesla," she told the Daily Journal in November. "Also, as electric cars are a newer technology, the technical portion of these cases can be difficult as the internal systems do not behave like a traditional car or truck."

The major issue jurors in California have had to resolve on this subject is whether an accident was caused by driver inattention or over-reliance on the systems, rather than inherent flaws in the technology.

The first case on this question resulted in a 9-3 win for Tesla in Riverside County in late 2023. The driver of a Tesla Model 3 had Autopilot engaged and went off the freeway, hitting a palm tree. He was killed, while his fiancée and her young son were injured.

"Cars are safe now, but not autonomous. We helped the jury understand that Autopilot is not self-driving. It's a Level 2 assistance device with demonstrated safety features," Tesla's lead attorney in the case, Michael R. Carey of Dykema Gossett PLLC, said in a Daily Journal interview in February.

But as cars become more autonomous, such as robotaxis, where humans have no way to exercise control, manufacturers might be increasing their chances of being held liable, said Atlanta attorney Ted Spaulding of Spaulding Injury Law.

Commenting on Musk's robotaxi prototype debut, Spaulding said, "The irony here is, by not having a steering wheel or brakes for humans to take over, Tesla is making it easier to say it is 100% liable for any rules of the road violations by a robotaxi vehicle that injures property or a person."

Insurance is also an interesting issue. "The owner of the robotaxi would be responsible for insurance coverage on the vehicle," Spaulding said. "However, I could see insurers balking at that, saying they are not going to insure someone as an owner of a vehicle that does not have control of the vehicle and that Tesla needs to be the insurer of these vehicles."


Laurinda Keys
