Torts/Personal Injury, Civil Litigation
Jul. 27, 2021
Tesla's Autopilot feature faces legal scrutiny after teen's death
While technology gurus have praised Tesla's Autopilot feature, the feature has put Tesla in uncharted legal territory. A recent lawsuit highlights the liability risks that companies using AI-driven technologies such as self-driving cars are likely to face in the coming years.
Ryan P. McCarl
Rushing McCarl LLP
Phone: (310) 896-5082
Email: ryan.mccarl@rushingmccarl.com
Ryan is an attorney and writer based in Los Angeles.
John M. Rushing
Partner, Rushing McCarl LLP
Phone: (310) 896-5082
While technology gurus have praised Tesla's Autopilot feature, the feature has put Tesla in uncharted legal territory. A recent lawsuit -- Escudero v. Tesla, Inc., RG21090128 (Alameda Cnty. Super. Ct., filed Feb. 26, 2021) -- highlights the liability risks that companies using AI-driven technologies such as self-driving cars are likely to face in the coming years.
In Escudero, a father is suing Tesla after his 15-year-old son was killed in a collision that the electric car manufacturer's Autopilot feature failed to prevent. The Escudero plaintiffs brought claims for products liability, negligence, and wrongful death, alleging that the Autopilot system is defective and fails to react appropriately to traffic conditions. Part of the reason the system is unsafe, the complaint alleges, is that it gives customers a false sense of security, leading them to believe that they are operating an autonomous vehicle.
Escudero is one of several ongoing personal injury cases involving Autopilot. Since 2016, there have been at least 13 crashes and three deaths allegedly linked to the system.
Tesla has stressed that the Autopilot system is meant to assist drivers and is not designed to prevent all potential crashes or other incidents. Additionally, the system is not intended to be a wholesale replacement for drivers; drivers are supposed to remain at the wheel and engaged. However, some drivers -- perhaps encouraged by marketing or hype characterizing cars outfitted with Autopilot as "self-driving" -- have predictably used the system incautiously. One driver, for example, was arrested for riding in the back seat while Autopilot was engaged.
Tesla could reduce its legal headaches by building in safety mechanisms to prevent such behavior. Its failure to do so from the outset will provide significant fodder for plaintiffs' negligence claims.
Many accidents involving Autopilot-like features will likely be attributable to some combination of technological shortcomings and driver negligence. In 2018, for example, the National Transportation Safety Board investigated a collision involving a Tesla Model S and reported that a design flaw in the Autopilot feature combined with driver inattention caused the crash.
The issue of whether a particular crash is attributable to an AI system, driver error, or both is fact-intensive and likely to lead to costly legal disputes. Where a legal claim turns on a reasonably disputed issue of fact, the claim is often more difficult and expensive to resolve because dispositive pretrial motions are less likely to succeed. Because Tesla and other manufacturers will be unable to dispose of lawsuits like Escudero through demurrers, they will likely be subjected to invasive discovery that risks exposing trade secrets or embarrassing the company.
Though the legal landscape appears favorable for plaintiffs in autonomous vehicle personal injury cases, such cases will be expensive and challenging to prosecute. A new generation of expert witnesses will be needed, and few non-insiders are likely to have a deep understanding of how the systems in question operate.
Further, to the extent that systems like Autopilot rely on AI mechanisms such as neural networks, it is usually impossible to reverse-engineer how those mechanisms make decisions. Neural networks -- complex algorithms that, given certain inputs such as sensor data, produce certain outputs such as steering adjustments -- are often called "black boxes" because even their designers cannot fully explain how they arrive at a particular prediction or course of action.
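To see why such systems resist explanation, consider the following deliberately simplified sketch of a toy neural network. Everything here is hypothetical -- the sensor inputs, weights, and steering output are invented for illustration and bear no relation to Tesla's actual software -- but the structure shows the problem: the system's "decision" is nothing more than learned numerical weights applied to inputs.

```python
# A deliberately simplified, hypothetical sketch -- not Tesla's actual
# system. A toy feedforward neural network maps invented sensor readings
# to a steering adjustment.
import numpy as np

rng = np.random.default_rng(0)

# In a real system these weights would be learned from millions of driving
# examples. Inspecting them reveals only arrays of numbers -- nothing like
# a human-readable rule such as "brake when an object is within 10 meters."
W1 = rng.normal(size=(3, 8))  # 3 sensor inputs -> 8 hidden units
W2 = rng.normal(size=8)       # 8 hidden units -> 1 steering output

def steering_adjustment(sensors: np.ndarray) -> float:
    """Map raw sensor inputs to a steering adjustment.

    The computation is just matrix arithmetic; there is no explicit,
    auditable chain of reasoning for a litigant or expert to point to.
    """
    hidden = np.tanh(sensors @ W1)  # nonlinear transformation of inputs
    return float(hidden @ W2)       # hypothetical steering-angle change

# Hypothetical inputs: [distance to obstacle (m), lane offset (m), speed (m/s)]
print(steering_adjustment(np.array([12.0, -0.3, 27.0])))
```

Even if a plaintiff obtained the complete code and weights in discovery, an expert could describe what such a network computes but not articulate why it chose one output over another in a given moment -- a serious obstacle to proving exactly how a defect caused a crash.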
For now, regulation of autonomous and semi-autonomous vehicles has been left to the tort system. The technology may be new, but the legal theories at issue -- products liability and negligence -- are very old. This is a testament to the flexibility of the common-law tort system. But as autonomous cars and other AI-driven systems become more commonplace, state and federal legislators may feel more pressure to step in to create a regulatory framework. The private sector will likely also create solutions such as new insurance products.