
Technology, Civil Litigation

Jan. 2, 2026

Assembly Bill 621: Strengthening protections against deepfake pornography

Karin Lang

Attorney
Panish | Shea | Ravipudi LLP

Email: klang@panish.law


Neda Saghafi

Attorney
Panish | Shea | Ravipudi LLP

Email: nsaghafi@panish.law



As artificial intelligence advances and digitally altered images and videos become nearly indistinguishable from reality, the real-world harm caused by this content grows ever more severe. This danger is particularly acute with deepfake pornography--AI-generated sexually explicit imagery that places someone's likeness into intimate scenes without their consent. When such manipulated media are shared online without permission, the impact on survivors is devastating: What was once private becomes forcibly public, exposing survivors to widespread humiliation, harassment, and lasting psychological, reputational and social harm. Deepfake revenge porn isn't just digital mischief--it's a profound violation of privacy and autonomy that can upend lives.

This year, California took an important step to protect the dignity and safety of survivors through Assembly Bill 621, which recasts and extends Civil Code Section 1708.86 to expand civil liability for nonconsensual deepfake pornography. First and foremost, the bill enhances monetary civil liability, increasing the maximum civil penalty recoverable by a depicted individual from $30,000 to $50,000, and from $150,000 to $250,000 for violations committed with malice.

Deepfake pornography has often been excluded from the scope of privacy torts because of its synthetic nature. Under AB 621, however, "digitized sexually explicit material" now includes any image or audiovisual work "created or substantially altered through digitization." This means that individuals portrayed in sexually explicit AI-generated videos and altered images may have grounds to recover under privacy torts.

The law also narrows a common defense: It presumes lack of consent where the creator or distributor cannot produce express written consent from the depicted adult, and it removes the consent inquiry entirely when the depicted individual is a minor. These provisions offer significant protection, reflecting the reality that survivors often have no practical ability to prevent, or meaningfully consent to, the creation of such imagery.

Perhaps most significant is the bill's reach to third-party actors. AB 621 expressly imposes liability on persons who knowingly or recklessly aid, abet or facilitate the unlawful creation or dissemination of digitized sexually explicit material. That targets not only bad-actor creators, but also platform operators, service providers and other intermediaries that turn a blind eye to exploitative networks. For years, imposing liability on such third parties was an enormous hurdle for survivors, because facilitators successfully claimed innocence, ignorance or lack of control over content. AB 621 will force platforms to reconsider business models that monetize or enable nonconsensual sexual imagery.

Looking ahead, AB 621's expanded civil protections set an important precedent for other states and policymakers grappling with the fallout of AI-enabled harms. By codifying robust remedies for survivors and closing loopholes that previously left victims without recourse, the bill affirms that technological innovation must be paired with ethical and legal guardrails. In a world where digital likenesses are increasingly vulnerable, California's approach offers a framework that centers survivors, holds actors accountable and reaffirms the fundamental rights to privacy and bodily autonomy in the digital age.

Karin Lang and Neda Saghafi are attorneys at Panish | Shea | Ravipudi LLP.
