This is the property of the Daily Journal Corporation and fully protected by copyright.

Technology,
Administrative/Regulatory

Jan. 2, 2026

SB 243: New safety rules for AI companion chatbots

David Lisson

Partner
Davis Polk & Wardwell LLP

Email: david.lisson@dpw.com

David Lisson also heads the firm's GenAI litigation initiative, focused on developers, enterprise technology companies, and organizations implementing GenAI. He has extensive experience representing technology companies in their highest-stakes patent, copyright, and trade-secret cases in courts nationwide.


David I. Feinstein

Counsel
Davis Polk & Wardwell LLP


Thomas Floyd

Associate
Davis Polk & Wardwell LLP



A focus of the rapidly evolving artificial intelligence (AI) regulatory landscape is AI "companion" chatbots: systems designed to mimic human behavior and interact with users across multiple sessions. Senate Bill 243, signed by Gov. Newsom on Oct. 13 and effective Jan. 1, 2026, requires companies that offer companion chatbot systems to California users ("operators") to implement new safety guardrails, including notice requirements and protocols for responding to users in crisis.

SB 243 defines "companion chatbots" as AI systems with natural language interfaces that provide humanlike responses and are "capable of meeting a user's social needs," including by sustaining relationships across multiple interactions. The law exempts several common AI chatbot applications, including those used only for certain tasks such as customer service, internal productivity, research or technical assistance. Also carved out are chatbots within video games that can discuss only game-related topics (other than certain high-risk topics related to health and sexuality), as well as voice-activated virtual assistants on consumer devices that do not sustain relationships across interactions and are not likely to elicit emotional responses from users.

There are two separate user notification requirements for covered companion chatbots. First, the operator must provide clear and conspicuous notice that the companion chatbot is not human if a reasonable person would otherwise be misled into believing it is. Second, for all users the operator knows to be minors, the system must disclose that the user is interacting with AI, provide recurring notifications every three hours, and institute measures to prevent the companion chatbot from producing sexually explicit material or encouraging the minor to engage in sexually explicit conduct.

Operators of companion chatbots must also maintain protocols for responding to users' expressions of suicidal ideation or self-harm, including, at a minimum, referring users to crisis service providers. Operators must measure for suicidal ideation using evidence-based methods, although SB 243 offers no guidance on which methods are suitable.

SB 243 also establishes a new reporting regime for operators of companion chatbots. Starting July 1, 2027, operators must submit annual reports to the Office of Suicide Prevention detailing their protocols for responding to suicidal ideation by users and for preventing the chatbot from engaging with users about suicidal ideation, as well as the number of times the operator referred users to a crisis service provider in the preceding calendar year. Additionally, operators must publish data from these reports, as well as details about their chatbots' protocols, on their websites.

Although SB 243 does not provide for civil penalties, it does create a private right of action allowing any person harmed by a violation of the law to recover the greater of actual damages or $1,000 per violation, as well as injunctive relief and reasonable attorney's fees and costs.

Given the ubiquity of AI-powered chatbots, companies should assess whether their tools constitute "companion chatbots" under SB 243 or fall within the exempted use cases. If the law applies, companies will want to document the evidence on which their protocols rely to assess user expressions of self-harm, as well as the standards they employ to evaluate explicit or similarly covered content. They will also need to monitor the required reporting metrics on a regular basis as part of their oversight and governance functions.

David Lisson is a partner, David I. Feinstein is counsel, and Thomas Floyd is an associate at Davis Polk & Wardwell LLP.
