Constitutional Law
Dec. 8, 2025
Free speech should not endanger minors
A year after California passed SB 976 to curb minors' social-media addiction, a wave of First Amendment lawsuits from tech giants now pits the state's effort to limit algorithm-driven "addictive feeds" against platforms' claims that their curated content is protected speech.
Anne P. Mitchell
Anne P. Mitchell is an attorney, dean emeritus (cyberlaw and cyber security) at Lincoln Law School, and CEO of the Institute for Social Internet Public Policy.
Gov. Gavin Newsom signed SB 976, the "Protecting Our Kids from Social Media Addiction Act," into law just over a year ago. Since then, it has been blocked from taking effect while mired in lawsuits. First came the suit brought by the trade association NetChoice, whose members include Google, Meta and X. This month, Google, Meta and TikTok filed individual lawsuits as well, after the 9th Circuit ruled mostly in the state's favor, holding, among other things, that NetChoice didn't have associational standing to sue on behalf of its individual members.
As a matter of public policy, minors need to be protected, and many laws already do just that: requirements for safety seats and seatbelts, prohibitions on sexual conduct with minors, and restrictions on minors' access to addictive substances and pornography.
For years, research has shown that social media is genuinely addictive. Countries around the world have recognized digital addiction, with China going so far as to create "Internet addiction camps" to help teenagers kick the "electronic heroin" habit. The danger isn't just failing grades or withdrawal from in-person activities. Each year, social media is linked to countless cases of depression among adolescents and teenagers, and in some tragic cases, even to suicide. At an age when peer acceptance is both important and compelling, social media reflects a minor's social standing back to them in concentrated form. And then there's the bullying.
Of course, content on social media is generally treated as "speech," which triggers First Amendment protections -- but should it be, and does it really qualify?
At its heart, the Protecting Our Kids Act requires social media companies to implement mechanisms intended to limit minors' access to what the law calls "addictive feeds," unless the platform obtains verifiable parental consent. In this context, "addictive" can be thought of as "intended to keep the young user on the platform for as long as possible." The limiting mechanisms include, for example, turning off notifications during school hours (8 a.m. to 3 p.m.) and between midnight and 6 a.m.; limiting on-platform activity to one hour a day; and setting a minor's account to private so that only approved contacts can interact with them. These mechanisms are on by default and can be overridden with verifiable parental consent.
The provision of the Act giving social media platforms the most concern requires that a minor's default feed not include media that has been "recommended, selected, or prioritized for display based on information provided by the user, or otherwise associated with the user or the user's device, other than the user's age or status as a minor." In other words, it cannot include the most addictive feature of social media platforms: content curated and targeted to keep users scrolling. In this instance, the concern is not personalization for users generally but feeds targeted at minors.
That doesn't look like speech -- it looks like algorithmic packaging of someone else's speech. And social media platforms are reluctant to give it up because it's their prime moneymaker.
Social media platforms track nearly every user action: how long someone lingers on a post, what they click, which content they "like," and who they interact with. All of this data is fed into algorithms that curate a feed ostensibly "just for the user," but in reality the content is optimized for the platform's profit. The longer users stay scrolling through content the algorithm predicts will capture their attention, the more revenue the platform generates. Advertisers pay by the impression, and extended engagement lets the platform show more ads -- capturing more "eyeballs" and, ultimately, more dollars.
So platforms curate feeds designed to keep users on the platform for as long as possible. This is the intentionally addictive content that social media serves up to everyone, including minors.
On Nov. 13, in separate but coordinated lawsuits, Google, Meta and TikTok filed suit in the Northern District of California, alleging, among other things, that it violates their First Amendment rights for the government to tell them what they can publish and when. They argue that their choices about what to show users, including minors, "involve human decisions about what content to publish, recommend, and promote," while admitting that they use technology as a tool to implement those decisions -- decisions they make a point of calling "editorial judgments."
In its NetChoice decision, the 9th Circuit pointed out that when algorithms are involved, and those algorithms are simply responding to users' actions, the result is probably not expressive -- expressiveness being one of the tests in a First Amendment challenge. But, as the court went on to say, each platform handles matters differently, using distinct algorithms and varying degrees of human input, both from the platforms' employees and from parents of minors. That reasoning contributed to the court's finding that NetChoice lacked associational standing, prompting the individual lawsuits: each platform's distinct algorithms may -- or may not -- remove its "decisions" from the realm of human editorial judgment.
There are compelling arguments on both sides. However, I think public policy suggests that if the state can overcome the First Amendment challenges -- and I believe it can, given the ratio of algorithm-to-human involvement the platforms are serving up -- then the Act should stand. Social media platforms have little incentive to police the content they serve to minors and have demonstrated that they do not want to. Why would they, when they are earning huge profits?