This is the property of the Daily Journal Corporation and fully protected by copyright.

Feb. 5, 2026

Major court cases over the past 30 years involving Section 230 of the Communications Decency Act

During the three decades since its enactment, the limits of Section 230 have been put to the test.

Zeran v. America Online, Inc. (4th Cir. 1997)

Testing the Communications Decency Act's boundaries, the Fourth Circuit held that section 230 of the Act immunizes internet service providers from liability for defamatory content posted by third parties, even after the provider receives notice of the defamation. In Zeran, an anonymous user posted hoax advertisements linking plaintiff Kenneth Zeran to offensive merchandise referencing the Oklahoma City bombing. Zeran sued America Online (AOL) for failing to promptly remove the postings and issue retractions. Affirming dismissal, the Fourth Circuit reasoned that imposing notice-based liability would impermissibly treat AOL as a "publisher," a role Congress expressly chose to immunize to avoid chilling online speech and discouraging voluntary self-regulation. Rejecting Zeran's attempt to distinguish "publisher" from "distributor" liability, it held that both forms of liability are protected by section 230, repudiating the idea that providers lose immunity once they are notified of defamatory content. As a foundational internet law decision, Zeran established broad provider immunity that facilitated the rise of user-generated content while also igniting enduring debate over the proper balance between free expression and accountability for online harms.

-Antoneth Fong

Carafano v. Metrosplash.com, Inc. (9th Cir. 2003)

Christianne Carafano, a television and film actress who performs under the stage name Chase Masterson, became the victim of an online impersonation when an unknown person created a false profile of her on Matchmaker.com, an internet dating service. The profile generated numerous phone calls, voicemail messages, and e-mails to Carafano's home, some of which were sexually explicit and threatening.

Carafano filed suit against Matchmaker.com, alleging claims of misappropriation of her name and likeness, invasion of privacy, defamation, and negligence. Matchmaker.com moved for summary judgment, arguing that it was immune from liability under Section 230. Carafano argued that by creating detailed profile questions with multiple-choice answers and narrative prompts, Matchmaker.com was not a neutral conduit but rather an active participant in content creation that should not receive Section 230 immunity. The Ninth Circuit concluded that Matchmaker.com was entitled to immunity, drawing a distinction between providing neutral tools to users and actually creating content. The court emphasized that Section 230 immunity applies even when an interactive computer service provides some structure or organization to user-submitted content. The answers to the questions were "provided by the user" regardless of how the questions were framed. This ruling made clear that providing a neutral framework for organizing user submissions does not strip a website operator of Section 230 protection, even when that framework might facilitate harmful content.

-Matt Sasaki

Fair Housing Council of San Fernando Valley v. Roommates.com, LLC (9th Cir. 2008)

Plaintiff Fair Housing Council alleged that Roommates.com violated the Fair Housing Act by requiring users to answer questions regarding sex and sexual orientation to use its services. As to those questions, the en banc Ninth Circuit held that Roommates.com was an information content provider, so it could not claim immunity for posting them on its website. In doing so, the Ninth Circuit made clear that, despite the sweeping immunity granted to interactive computer service providers by Section 230, website operators who create content, in whole or in part, may be subject to liability for that content. In this first en banc decision by a federal circuit court interpreting Section 230, the court noted that judges should err on the side of immunity in close cases, but that websites that materially contribute to allegedly illegal content should not be shielded, because "[t]he Communications Decency Act was not meant to create a lawless no-man's land on the Internet."

-Josh Ogle

Force v. Facebook (2d Cir. 2019)

In 2016, plaintiffs, several U.S. citizens and their representatives, brought federal civil anti-terrorism claims against Facebook, Inc. Plaintiffs alleged that they were victims of Hamas attacks in Israel and that Hamas' use of Facebook to encourage violence rendered Facebook liable for assisting those attacks. The district court dismissed the claims at the pleading stage, finding that they were barred by section 230. On appeal, plaintiffs argued that their claims did not treat Facebook as the "publisher" or "speaker" of third-party content because Facebook "developed" Hamas' content through its use of algorithms, which connected individuals interested in terrorism with each other.

The Second Circuit affirmed. It found that providing groups with a forum to communicate their messages to interested parties fell "within the heartland" of what it means to be a "publisher" within the meaning of section 230. Notably, it determined that Facebook's use of algorithms did not render it more than a mere publisher: Like matchmaking services, the court reasoned, algorithms are tools designed to match third-party information with a consumer's interests, which is an essential result of publishing. Likewise, it determined that Facebook's algorithms did not "develop" Hamas' content because Facebook did not directly and "materially" contribute to what made the content itself unlawful, e.g., by encouraging or advising users to provide the specific actionable content that forms the basis for the claim. Instead, Facebook acted as a "neutral intermediary."

-Christie Bahna

Lemmon v. Snap, Inc. (9th Cir. 2021)

Snapchat had a Speed Filter, which overlaid a real-time speedometer on photos and videos. An accident occurred when two users were traveling at speeds exceeding 100 miles per hour while one of them used Snapchat's Speed Filter to record and share their speed. Their parents brought suit against Snap, contending that the Speed Filter--by prominently displaying the vehicle's instantaneous speed and tying that metric to social rewards like "best friends" lists and likes--served as a virtual incentive for users to exceed safe speeds. Because the users were motivated to display these high speeds to their social network, the plaintiffs argued, Snap's product design foreseeably encouraged reckless behavior. Snap argued the parents' claims essentially sought to treat the company as the publisher or speaker of its users' content, which Section 230 bars.

The Ninth Circuit concluded that the parents' claim did not seek to treat Snap as a publisher or speaker of third-party information. Rather, the claim focused on Snapchat's own product design decisions. Although the app displayed user speed and other user-generated content, plaintiffs were not arguing that Snap should be liable for that content per se. Instead, they claimed Snap's design itself was negligent in encouraging unsafe behavior. Thus, the court saw the claim as one against Snap as a product designer/manufacturer, not as a publisher of user content. This ruling underscores that if a claim hinges on a platform's own feature design that allegedly encourages harmful conduct, Section 230 may not bar liability.

-Matt Sasaki
