This is the property of the Daily Journal Corporation and fully protected by copyright. It is made available only to Daily Journal subscribers for personal or collaborative purposes and may not be distributed, reproduced, modified, stored or transferred without written permission. Please click "Reprint" to order presentation-ready copies to distribute to clients, to use in commercial marketing materials or for permission to post on a website.
News

Dec. 31, 2025

Courts grapple with copyright limits as AI training lawsuits surge

Courts in 2025 advanced AI copyright law, backing fair use for training models while leaving questions of market harm, licensing and authorship unresolved as lawsuits, settlements and appeals continue nationwide.

U.S. District Judge William H. Alsup

Copyright law is designed to protect authors and artists from having their work stolen. But what happens when their work isn't necessarily stolen, but rather fed to a computer algorithm that creates its own, similar work?

Attorneys, courts and tech companies have been scrambling to answer that question since OpenAI launched ChatGPT in late 2022 - and the debates continued throughout 2025.

"We have seen across the spectrum of cases no let-up in terms of the filings," Angela L. Dunning, a copyright attorney at Cleary Gottlieb Steen & Hamilton LLP, said in a video interview. "We are still seeing class actions being filed, we are still seeing individual actions being filed."

Dunning said it's well-established that style can't be protected by copyright. Instead, several cases have focused on replicas of copyrighted works used to train AI models. While the technology is new, Ian C. Ballon, an intellectual property attorney at Greenberg Traurig LLP, said courts have ruled in the past that creating an intermediate copy is a fair use as long as the final product is transformative.

"So, if something, even intermediate copy, is made in order to then create a final product, then you may have a copy that looks like the input, although the question is then, 'Is that a fair use?'" Barron commented. "There are a number of cases going back to video game decisions dealing with the use of intermediate copying in order to create something transformative as being a fair use."

While the biggest domino -- The New York Times' lawsuit against OpenAI -- is awaiting resolution, 2025 featured some notable developments, including a $1.5 billion settlement in a class action against Anthropic PBC in the Northern California federal court. Bartz et al v. Anthropic PBC, 3:24-cv-05417 (N.D. Cal., filed Aug. 19, 2024).

The case settled, pending final approval, after U.S. District Judge William H. Alsup certified a class on the grounds that Anthropic used pirated copies of the plaintiffs' work to train its AI assistant.

But according to Dunning, it's just as notable, if not more, that Alsup found AI training to be a fair use of copyrighted work, a position also taken by U.S. District Judge Vince Chhabria in a similar lawsuit against Meta. Kadrey, et al v. Meta Platforms, Inc., 3:23-cv-03417 (N.D. Cal., filed July 7, 2023).

"The big developments of this last year were the two summary judgment rulings in the Bartz v. Anthropic and Kadrey v. Meta cases, both courts holding that training on copyrighted books constitutes fair use as a matter of law," said Dunning, who represented Meta in the case.

Dunning said she expects much of the ongoing litigation to focus on whether training an AI model constitutes fair use.

That analysis turns on four factors: whether the use is transformative, the nature of the copyrighted work, the amount used and whether the use causes market harm to the copyright owners. Dunning said the first three are relatively straightforward, so she expects the remaining fights to center on market harm.

"My prediction is that many courts will find the use highly transformative for factor one, and then a lot of the activity is going to focus on whether the plaintiffs ... can demonstrate the kind of market harm that would be necessary to counter a fair use finding," Dunning said.

Historically, Dunning said, courts have weighed whether customers can use the new work as a direct substitute for the copyrighted work to determine if it causes market harm. But in the ongoing AI litigation, she's seen a new argument emerge that she expects more clarity about next year: that someone could read an AI-generated work instead of the copyrighted one, even if the two are different.

Another question that courts started to answer in 2025 is whether work produced by AI can be copyrighted.

In January, the U.S. Copyright Office issued a report on AI authorship attempting to provide preliminary answers.

"It stated some general principles that most copyright lawyers would agree with, which is something created with no human involvement would not be protectable, but something created with human involvement would, potentially, be protectable," Ballon said.

The issue made its way to the courts after an AI researcher, Stephen Thaler, attempted to copyright a work created by his AI model. His application was rejected, and he sued the Copyright Office; both the district court and the appellate court sided against him. Stephen Thaler v. Shira Perlmutter, et al, 23-5233 (D.C. Cir., March 18, 2025).

"The fundamental question they pivoted on is the interpretation of the term 'author' in the Copyright Act. Thaler said you don't need to be limited to humans, and the court said, no, the Copyright Act really exists so that you want to encourage creativity," said Jeffrey S. Kravitz, a Los Angeles mediator.

Dunning said she's aware of one AI-created piece that's been granted copyright, a work called "A Single Piece of American Cheese."

"The copyright office initially rejected the application but then the applicant put in a further response, and it was granted," Dunning said. "This is an image that was entirely AI-generated, but the author was able to demonstrate the control that went into placing each piece in this composite image."

But with dozens of cases still in progress, plenty remains to be decided going into 2026.

Kravitz said he's interested to see if companies turn to licensing agreements to train their AI models on copyrighted work -- something he said was hinted at in Thomson Reuters Enterprise Centre GmbH, et al. v. Ross Intelligence Inc., 25-8018 (3rd Cir., April 14, 2025).

"Here's where it gets interesting from my standpoint. ... They reached a settlement whereby there was an undifferentiated license to use the plaintiffs' work," Kravitz said. "You don't have to be a genius to see that a licensing system akin to what's been done with music makes economic sense, keeps it out of the courts, doesn't result in silly opinions."

Kravitz said he's also watching closely to see how AI companies choose to insure themselves against copyright lawsuits.

But the New York Times case remains the one copyright lawyers across the country are watching most closely, Dunning said.

"Everyone is watching the multidistrict litigation in New York involving claims by the New York Times and book authors against OpenAI. ... That will be the first opportunity for a court outside of the Northern District of California to weigh in at summary judgment on (large language model) training," Dunning said. New York Times Company v. Microsoft Corporation, et al, 1:23-cv-11195 (S.D. NY., filed Dec. 27, 2023).

Dunning said that case is different from the ones decided in California because it includes claims that OpenAI's models can occasionally reproduce copyrighted work when prompted.

"It's a claim that hasn't yet been addressed by any court, so I think everyone is going to be watching to see how the court assesses that claim," Dunning said.


Daniel Schrager

Daily Journal Staff Writer
daniel_schrager@dailyjournal.com

For reprint rights or to order a copy of your photo:

Email Jeremy_Ellis@dailyjournal.com for prices.
Direct dial: 213-229-5424

Send a letter to the editor:

Email: letters@dailyjournal.com