
Technology, Constitutional Law

Oct. 12, 2023

Do social media companies “speak” by curating your news feed? The jury is out.


Krista L. Baughman

Partner, Dhillon Law Group Inc.

Phone: (415) 433-1700

Email: kbaughman@dhillonlaw.com

Brooklyn Law School; Brooklyn, NY

Krista's practice focuses on First Amendment and anti-SLAPP law and litigation.

Do social media platforms “speak” when they pick and choose what user speech they disseminate? This is the critical question at the heart of Moody v. NetChoice, L.L.C. and NetChoice, L.L.C. v. Paxton, cases the Supreme Court is scheduled to hear this term. These cases will resolve a circuit split over the extent to which social media platforms enjoy First Amendment protections when they “curate” user content.

The cases stem from two state laws enacted in 2021 to regulate the activity of major social media platforms, Ch. 2021-32, Laws of Fla. (S.B. 7072), and 2021 Tex. Gen. Laws 3904 (H.B. 20). Though the details of these laws differ, each implicates two questions that the Supreme Court has agreed to address. First, whether States may lawfully restrict platforms’ ability to moderate user content – things like editing or altering user posts, banning or “deplatforming” users for their speech, limiting or eliminating exposure of user posts (e.g., “shadowbanning”), and prioritizing some content over other content. Second, whether States may require platforms to provide users with an individualized explanation each time content is removed – in Florida, a “thorough rationale” for the action (Fla. Stat. §501.2041(3)(c)), and in Texas, an explanation of “the reason the content was removed,” along with the right to an appeal (Tex. Bus. & Com. Code Ann. § 120.103(a)(1)).

Trade associations representing the platforms challenged the laws and sought preliminary injunctions, which were granted at the district court level in both States. The Eleventh Circuit upheld the preliminary injunction against Florida’s law, holding that when platforms moderate content, their “exercise of editorial judgment” is speech, and that the regulations therefore restricted speech and could not pass the heightened scrutiny required by the First Amendment. The Fifth Circuit, by contrast, held that platforms engage not in “speech” but in “conduct” when they censor content. It rejected the concept of a carve-out for “editorial discretion” as a special category of First Amendment-protected expression and instead asked whether the law either compelled the platforms to speak or restricted the platforms’ own speech – questions it answered in the negative.

The United States Solicitor General filed an amicus brief endorsing the Eleventh Circuit’s view and argued that platforms have a First Amendment right to curate content. The Solicitor General also argued that the laws’ content-moderation restrictions and individualized-explanation requirements “impermissibly burden those protected activities,” reasoning that – just like a newspaper’s compilation of an opinion page – platforms “are in the business of delivering curated compilations of speech created by others,” and that this “speech” is constitutionally protected.

Notably, the Solicitor General’s argument runs counter to the reasoning of Section 230 of the Communications Decency Act, by which Congress determined that the business of hosting user content is conduct, not speech. See 47 U.S.C.A. §230(c)(1) (no platform “shall be treated as the publisher or speaker of any information provided by” a user). Moreover, subdivision (c)(2) of CDA 230 forecloses the idea that platforms have “editorial discretion” writ large, by providing platforms with immunity only for “‘Good Samaritan’ blocking and screening of offensive material,” that which is “obscene,” “excessively violent,” “harassing,” or the like. Id., subd. (c)(2).

Yet, through a mash-up of CDA 230 arguments and editorial discretion concepts, platforms have been able to have their cake and eat it too: they can say “we are immune when we host user speech because that’s not our speech,” while also claiming “we are immune when we censor user speech because we are exercising our editorial discretion.”

But platforms don’t exercise editorial discretion like a traditional newspaper. Newspapers primarily contain their own speech, while platforms’ “curated product” is nearly bereft of any speech by the platform itself. Newspapers select, edit, and “determine the news value” of content before publication, but platforms generally censor, after the fact, only “a tiny fraction of [already disseminated] expression.” NetChoice, L.L.C. v. Paxton, 49 F.4th 439, 464-465 (5th Cir. 2022). And while newspapers “accept[] reputational and legal responsibility for the content [they] edit[]” (id. at 464), platforms expressly disavow responsibility for the content they host.

The Solicitor General argues that the laws’ content-moderation restrictions would impermissibly allow States to “enhance their citizens’ ability to express their views on social media platforms by suppressing the platforms’ ability to express their own views through the selection and curation of the content they present to the public.” But the Fifth Circuit has expressed a competing concern – that by selecting which content to host, platforms control the flow of information into users’ households and could “thus silence the voice of competing speakers with the mere flick of a switch.” Paxton, 49 F.4th at 495 (Jones, J., concurring). The Supreme Court will need to grapple with these realities as well.

The second question posed by Moody and Paxton – whether obligating platforms to provide users with individualized explanations for content removal is too burdensome – implicates some of the same considerations discussed above. While the record on this question is underdeveloped due to the pre-enforcement posture of the cases, it bears noting that some European laws already contain similar requirements, so we may soon get a preview of whether the platforms’ “burdensome” arguments hold up when pressure-tested.

Ultimately, this author doubts that posts on social media are, or ever were, meant to be expressions of the platform’s “own views” as opposed to the free exchange of information created by users. But one thing’s for sure: whether content regulation of the modern-day town square is held constitutionally permissible will have huge implications for American speech for the indefinite future.
