This is the property of the Daily Journal Corporation and fully protected by copyright.

Intellectual Property

Feb. 27, 2026

The NO FAKES Act is urgent, carefully crafted and constitutionally sound

The NO FAKES Act would close a critical gap by prohibiting the nonconsensual use of digital replicas of an individual's voice or likeness and establishing the first federal intellectual property right in identity--long overdue.

Jeffrey P. Bennett

General Counsel
SAG-AFTRA



Generative artificial intelligence has exposed a structural gap in American law: There is no uniform federal right protecting individuals against the unauthorized replication of their voice or likeness. As digital replication and synthetic content become indistinguishable from authentic human performance, the absence of a coherent legal framework creates both enforcement vacuums and compliance uncertainty.

The Nurture Originals, Foster Art and Keep Entertainment Safe Act (NO FAKES Act) addresses that gap. The legislation would prohibit the nonconsensual use of digital replicas of an individual's voice or likeness in sound recordings and audiovisual works. It would establish the first federal intellectual property right in voice and likeness, a right that is long overdue.

The NO FAKES Act replaces today's fragmented state-law patchwork with a consent-based, constitutionally compliant national standard. This standard was drafted to ensure strict First Amendment compliance. The creative community, both studios and talent, relies on a strong First Amendment to tell the stories and create the content we have all enjoyed for 100 years. That reliance was foremost in mind when the bill was drafted.

The urgency of this bill cannot be overstated. Generative artificial intelligence tools allow anyone to easily create voice and likeness replicas convincing enough to fool your family. If you don't already know or believe that, take a look on social media; you will quickly find an avalanche of nonconsensual digital replicas. A core component of the NO FAKES Act is the establishment of a mandatory takedown process. Working together, platforms, content creators and artist representatives established a takedown system that recognizes the need for individuals to get nonconsensual violations removed fast without burdening the freedom of expression we all rely on.

Narrowly targeted with strong protections

The Act focuses on digital replicas, and liability attaches to their use, not their creation. Digital replicas are narrowly defined as computer-generated and highly realistic creations that are readily identifiable as a specific individual. This is not about reference or depiction; it is about a digital clone. If an individual is interested in licensing their voice or likeness, the bill includes guardrails around those licenses. The license must include written consent, and that consent must include a specific description of the intended use of the replica, with a limited term. The marketplace for licensing voice and likeness has existed for decades. The legal and business practices governing such licensing have worked, and the marketplace has flourished. However, the introduction of generative AI and digital replicas demands new statutory guardrails. The Act includes those guardrails.

Nonconsensual digital replicas v. freedom of expression

Skepticism will predictably focus on the First Amendment. Any statute regulating expressive works--particularly audiovisual media--invites scrutiny. The Act is drafted with that scrutiny in mind.

At its core, the Act regulates the misappropriation of voice and likeness, via a digital replica, for commercial exploitation or performance replacement. The Act does not suppress ideas or viewpoints. Courts have long recognized that rights of publicity, also known as name/image/likeness (NIL) rights, coexist with the First Amendment. Courts have upheld intellectual property regimes--including copyright and trademark--that necessarily regulate expressive conduct, provided they are viewpoint-neutral and incorporate appropriate limiting principles. The Supreme Court, in the 1977 case Zacchini v. Scripps-Howard Broadcasting Co., directly addressed rights of publicity and performance replacement. The Court analogized these NIL rights to copyright and affirmatively protected the individual's proprietary interest in his performance.

The NO FAKES Act makes more than a passing reference to the First Amendment. The Act explicitly lists the areas where the First Amendment will rule, and does so to ensure there is no uncertainty. This includes news, public affairs, commentary, criticism, scholarship, satire, parody and limited documentary or historical uses. However, while the First Amendment protects speech about individuals, it does not guarantee a right to appropriate their biometric identity to create a market substitute for their labor.

The First Amendment exemptions are critical for constitutionality, and they provide certainty for content creators and platforms. They are also a hard swallow for the talent who will be subjected to these at times unpleasant, albeit legal, uses. We have all seen the explosion of unfettered, unhinged commentary on social media and its effects on individuals and communities. If that is the price for a strong First Amendment, so be it, but its protections don't extend to commercial and performance theft.

A process note

The legislation has garnered bipartisan support and backing from both labor and major industry stakeholders, including the Motion Picture Association, the Recording Industry Association of America, the National Association of Broadcasters, IBM, OpenAI and Google/YouTube. It was introduced in the Senate in April 2025 and referred to the Senate Judiciary Committee.

This coalition and the bipartisan support reflect a shared recognition: Responsible AI development requires legal clarity. A predictable, consent-based regime protects individuals while giving companies a clear compliance pathway.

SAG-AFTRA

The 160,000 members of SAG-AFTRA derive their livelihood from their voices, likenesses and performances. They are the faces and voices who entertain and inform the world, and have done so for 90 years. The stakes for us are obvious. But the issues here extend beyond entertainment and media. This bill protects everyone, and its passage is urgent. A framework of national protection must be established as we rapidly enter an uncharted generative AI world already reeling in uncertainty.


