
Technology

Feb. 6, 2026

The law that built the internet and continues to test the courts

Saturday marks 30 years since President Clinton signed the Communications Decency Act, creating Section 230's immunity shield for online platforms. Courts are still grappling with its limits in cases testing whether social media companies can be held liable for allegedly addictive algorithms. Legal scholars say the law has evolved from protecting emerging platforms to serving as a weapon for dominant tech companies, while bipartisan reform efforts stall despite consensus that something is "awry in big tech."

When Congress passed the Communications Decency Act on Feb. 1, 1996, as part of the sweeping Telecommunications Act of 1996, the internet was still a patchwork of message boards, chat rooms and early online services that few lawmakers fully understood. Buried within that statute was a short provision -- 47 U.S.C. § 230 -- that would become the most consequential law of the digital age.

Three decades later, Section 230 remains both foundational and embattled. It is credited with enabling the modern internet by shielding online platforms from liability for user-generated content. It is also blamed, often by the same critics, for everything from misinformation and harassment to market concentration and political polarization. Yet despite years of legislative threats and rhetorical attacks, the statute endures largely intact, shaped less by Congress than by courts navigating its limits case by case.

Eric Goldman, associate dean for research at Santa Clara University School of Law, credits then-Rep. Christopher Cox, R-Calif., and Sen. Ron Wyden, D-Ore., with crafting a provision that anticipated far more than most lawmakers realized.

"From my perspective, it was really deep insights by the drafters," Goldman said. "It was really an insightful strategic play, not just a happy accident." Without that framework, he added, the internet might have evolved into a "walled garden" resembling the cable television industry of the 1990s.

This Saturday marks 30 years since President Clinton signed the Act, yet courts are still grappling with Section 230 -- most recently in a Los Angeles Superior Court trial and, soon, in a federal trial in Oakland over whether social media companies can be held liable for the addictive effects of their algorithms.

Section 230 was enacted in response to a specific and narrow concern. In the early 1990s, courts treated online services inconsistently: Stratton Oakmont v. Prodigy Services Co. suggested that a platform that moderated content could be treated as a publisher and held liable for what users posted, while Cubby v. CompuServe indicated that a service that did nothing might avoid responsibility. Lawmakers feared that pairing would discourage moderation entirely.

But the statute emerged from broader ambitions about democratizing information flow, according to Olivier Sylvain, a professor at Fordham Law School and senior policy research fellow at the Knight First Amendment Institute at Columbia University. "They see great opportunities," Sylvain said of lawmakers at the time. "There are opportunities for exchanging ideas, but opportunities for innovation. You see this all in Section 230's precatory language."

Goldman views the provision as an effort to strike a delicate balance. "It was an attempt to come up with an alternative policy that would curb the problems online but not restrict future growth," he said.

The statute's core provisions sought to resolve that tension. Section 230(c)(1) states that providers and users of an "interactive computer service" shall not be treated as the publisher or speaker of information provided by another information content provider. Section 230(c)(2) separately protects platforms that, in good faith, remove or restrict content they consider objectionable.

Yet Section 230 wasn't Congress's primary focus in 1996, Sylvain noted. "What they were thinking about was Section 223. They were worried about porn. They were worried about kids' access to porn." That provision was later struck down as unconstitutional, but the sponsors of Section 230 were "very clever" in associating their deregulatory scheme with child protection.

The turning point came in 1997 with Zeran v. America Online Inc., in which the 4th U.S. Circuit Court of Appeals held that Section 230 barred claims against AOL for failing to promptly remove defamatory messages posted by a third party. Goldman said the decision arrived at a critical moment.

"Zeran really cleared the field by coming out before there were a lot of other contrary precedents," he said.

Goldman places the case within a trio of developments that allowed the statute to flourish.

"It was the combination of the Reno v. ACLU decision striking down the Communications Decency Act, the Zeran case saying Section 230 is to be read broadly, and the Digital Millennium Copyright Act notice-and-takedown provision in 1998," he said. "We needed all three of those before Section 230 could really hit its stride."

Goldman also argues that Section 230 litigation developed differently from constitutional free-speech cases.

"That's what I think distinguishes Section 230 litigation from First Amendment litigation," he said. "It's so much harder to get quick wins in constitutional litigation. Courts tread so much more cautiously there, but with Section 230, courts are emboldened by the fact that it's a statute and Congress can fix the statute if they have to. Courts are willing to make more decisive rulings even if the First Amendment would dictate the same outcome."

That dynamic, he noted, has allowed defendants to secure dismissals early and relatively inexpensively -- one reason the statute became such a formidable shield.

Over time, courts across the nation adopted that broad reading, dismissing claims ranging from defamation and negligence to emotional distress, so long as the content was created by users rather than the platform itself. By the 2010s, Section 230 had become one of the strongest liability protections in American law.

The modern internet bears little resemblance to the one Congress saw in 1996. Platforms curate feeds, recommend content, sell targeted advertising, and design products to maximize engagement. Those changes have driven a new generation of legal challenges aimed not at what users say, but at how platforms operate.

For Sylvain, the shift came around 2015, when platforms began invoking Section 230 to defend algorithmic decisions that targeted specific users. "It gets really worrisome to me," he said. "It's not about third-party content. ... They're design decisions. They're algorithmic design decisions that are distributing content to people who are likely to be interested or harmed."

Plaintiffs have increasingly framed claims around product design and consumer protection, arguing that harms stem from platform features rather than specific posts. While many claims still collide with Section 230, courts have shown growing willingness to scrutinize the distinction.

A series of cases involving terrorism-related content, human trafficking, and online marketplaces has tested the boundaries of immunity. In Lemmon v. Snap Inc., the 9th U.S. Circuit Court of Appeals allowed claims to proceed over Snapchat's Speed Filter. "The filter itself occasioned the danger," Sylvain said. "And the Ninth Circuit saw that."

Congress has also carved out exceptions. The 2018 Allow States and Victims to Fight Online Sex Trafficking Act, known as FOSTA, narrowed immunity for certain sex trafficking claims, signaling that the statute is not untouchable.

Despite years of speculation, the U.S. Supreme Court has largely avoided issuing a definitive interpretation of Section 230. Recent cases -- most notably Gonzalez v. Google LLC in 2023 -- reached the court amid expectations of a landmark ruling, only to be resolved narrowly or remanded.

That restraint has frustrated critics who argue lower courts have stretched the statute beyond recognition, but it has also preserved flexibility, allowing doctrine to evolve incrementally rather than through a single disruptive decision.

Few statutes generate as much bipartisan rhetoric. Republicans complain about politically biased moderation; Democrats fault platforms for letting harmful content proliferate. "The one thing that there has been consensus on is that there's something awry in big tech," Sylvain said.

Yet repealing Section 230 outright would expose platforms to massive liability, while narrow reforms raise difficult line-drawing problems. As Goldman sees it, the original drafters sought to avoid exactly that paralysis by giving courts room to adapt the law over time.

For litigators, Section 230 remains a powerful early-stage defense, but the practice has grown more nuanced. Plaintiffs tailor complaints around product features and independent duties; defense lawyers must explain how design choices fit within precedent. The days of reflexive immunity arguments are fading.

As Section 230 enters its fourth decade, courts are signaling that immunity has limits. What may emerge is a more calibrated doctrine that protects platforms from liability for user speech while allowing claims tied to a company's own conduct.

Sylvain worries the pattern could repeat with artificial intelligence. In his forthcoming book, "Reclaiming the Internet: How Big Tech Took Control -- and How We Can Take It Back," he cautions against creating blanket immunity for AI companies using the same innovation-first arguments from the 1990s. Only one other industry enjoys comparable protection, he noted: gun manufacturers. "That tells you a lot."

Section 230 was never meant to settle every question about responsibility online. Thirty years on, it has done something more durable: forced courts, lawmakers and litigants to confront how law adapts when technology outpaces statute. The next chapter will be written not in slogans, but in opinions -- one case at a time.

Daily Journal Staff Writer Craig Anderson contributed to this report.


David Houston
