This is the property of the Daily Journal Corporation and fully protected by copyright. It is made available only to Daily Journal subscribers for personal or collaborative purposes and may not be distributed, reproduced, modified, stored or transferred without written permission.

U.S. Supreme Court,
Technology,
Civil Rights

Dec. 7, 2022

Decision on Section 230 could restructure the internet

If the Court follows Justice Thomas and narrowly reads Section 230 only to protect “distributors” from “publisher” liability, it would expose every online intermediary – large and small – to enterprise-threatening risks.

Adam S. Sieff

Counsel, Davis Wright Tremaine LLP

Phone: (213) 633-8618

Email: adamsieff@dwt.com

Already poised to reconsider the future of First Amendment protections for online speech, the Supreme Court will soon decide a case taking aim at 47 U.S.C. § 230, the key federal statute that bolsters First Amendment protections online. This is the Supreme Court's first case interpreting Section 230, and its decision in Gonzalez v. Google LLC, No. 21-1333, could restructure how billions of people use the internet.

Section 230 Protects Free Speech

Section 230 is a free speech statute. Stating that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider," 47 U.S.C. § 230(c)(1), the law implements important background First Amendment protections.

Long before the internet, the Supreme Court recognized that imposing liability on intermediaries who provide a forum for speech also threatens the rights of speakers and readers who depend on those fora to share their messages. In Smith v. California, 361 U.S. 147 (1959), the Court struck down a law holding booksellers strictly liable for selling allegedly "obscene" books because the law functionally compelled bookstores to self-censor. As the Court explained, "[t]he bookseller's limitation in the amount of reading material with which he could familiarize himself, and his timidity in the face of his absolute criminal liability, thus would tend to restrict the public's access to forms of the printed word which the State could not constitutionally suppress directly." Id. at 153-54.

The danger with intermediary liability is that it silences all speakers. This was the principle behind New York Times v. Sullivan, 376 U.S. 254 (1964), where the Supreme Court identified the risk that intermediary liability "would discourage" publishers "from carrying" controversial content and thus "shut off an important outlet for the promulgation of information and ideas by persons who do not themselves have access to publishing facilities." Id. at 266. Such "self-censorship," the Court explained, is especially pernicious since it functions as "a censorship affecting the whole public." Id. at 279 (quoting Smith, 361 U.S. at 154). And this danger is heightened online, where intermediaries curate speech from billions of users worldwide who depend on those platforms to reach their audiences.

Congress responded to online intermediaries' acute vulnerability to becoming vehicles for censorship by enacting Section 230 "to promote rather than chill internet speech." Bennett v. Google LLC, 882 F.3d 1163, 1166 (D.C. Cir. 2018). Section 230 does this by creating a prophylactic statutory immunity that shields online intermediaries from having to either reject content or face litigation costs and potential liability. Section 230 thus operates much like a federal anti-SLAPP law for the internet, providing a threshold summary process to vindicate First Amendment rights against harassing litigation that itself chills speech. These protections give websites and other internet services the assurance they rely on to host user-generated speech, which in turn enables everyday users to share their ideas and reach wide audiences without having to build their own sites or services.

Gonzalez v. Google LLC

Reynaldo Gonzalez sued Google under the Anti-Terrorism Act, 18 U.S.C. § 2333, for the death of his daughter during an ISIS attack at a Paris bistro in November 2015. Gonzalez claims Google materially supported ISIS and contributed to his daughter's death in violation of that law through videos it published on its YouTube service, even though the videos violated YouTube's terms of use. Conceding that Section 230 immunizes Google against liability for publishing the videos, Gonzalez's theory focuses on the allegation that Google recommended the videos to users. Through its recommendation algorithms, Gonzalez claims, Google presented ISIS videos to users whose past viewing history indicated that they could be interested in such content.

Agreeing with the Second Circuit's prior consideration of this exact issue in Force v. Facebook, Inc., 934 F.3d 53, 64-72 (2d Cir. 2019), another case involving Section 230's application to Anti-Terrorism Act claims, the Ninth Circuit affirmed dismissal of Gonzalez's theory, holding that Section 230 bars claims based on injuries suffered as a result of content promoted by a social media platform's recommendation algorithms. See Gonzalez v. Google LLC, 2 F.4th 871, 894-95 (9th Cir. 2021). The court explained that Google's use of "neutral tools" to recommend third-party content did not render Google a creator or developer of that content, and thus did not remove Google from Section 230's protections for promoting third-party content. Id. at 895. Its decision aligned not only with Force but also Dyroff v. Ultimate Software Group, Inc., 934 F.3d 1093, 1095-96 (9th Cir. 2019), a similar Section 230 case involving injuries alleged to have resulted from algorithmic content promotion.

The Ninth Circuit's decision in Gonzalez applied the precedent from Dyroff and Force that claims based on content promotion seek to treat an internet platform as a "publisher" of the promoted content. But in a separate concurrence, Judge Berzon challenged this assumption, suggesting that actions platforms take to recommend content fall outside the "traditional activities of publication and distribution - such as deciding whether to publish, withdraw, or alter content." Gonzalez, 2 F.4th at 913 (Berzon, J., concurring). The late Judge Katzmann in Force likewise questioned whether Section 230 reached recommendations, contending that a platform's recommendations communicate a platform's own message about what a given user might like. Force, 934 F.3d at 82 (Katzmann, J., dissenting in part).

These concerns follow broader statements by Justice Thomas, who has questioned whether Section 230 should apply to platforms that do anything more than passively distribute third-party content, and suggested that Section 230 should not shield even those covered "distributors" from distributor-based liability. See Malwarebytes, Inc. v. Enigma Software Grp. U.S., 141 S. Ct. 13, 15-16 (2020) (Thomas, J., statement respecting denial of certiorari).

Together, these statements reflect what Judge Gould's dissent in Gonzalez called a "rising chorus of judicial voices cautioning against an overbroad reading of the scope of Section 230 immunity." Gonzalez, 2 F.4th at 926 n.9 (Gould, J., dissenting).

What's At Stake

The Supreme Court's decision to accept review in Gonzalez despite the absence of a circuit split suggests at least three other justices may share Justice Thomas's concerns.

If the Court follows Justice Thomas and narrowly reads Section 230 only to protect "distributors" from "publisher" liability, it would expose every online intermediary - large and small - to enterprise-threatening risks. Many fora would shut down; others would eliminate editorial and content-moderation controls to retain Section 230's narrowed distributor protections; and those that continued to curate content would collaterally censor massive amounts of speech - essentially anything the platform could not individually vet and approve for publication. The internet would lose its dynamism, utility, and potential, and the primary victims would be marginalized speakers with dissenting views foreclosed from public discourse.

Even if the Court follows Judge Berzon and Judge Katzmann by limiting Section 230's protections to publishing or withdrawing (but not recommending) content, it would have much the same effect. Since a site or service cannot both select and remove content without inherently promoting the content it permits, online intermediaries would have to forgo content moderation altogether or else accept unlimited liability for the speech they allow.

However the Court rules, a decision rewriting Section 230's protections would dramatically alter everyday internet use and severely curtail the internet's potential to provide the "vast democratic forums" the Supreme Court described with great optimism in Reno v. ACLU, 521 U.S. 844, 868 (1997).


