This is the property of the Daily Journal Corporation and fully protected by copyright. It is made available only to Daily Journal subscribers for personal or collaborative purposes and may not be distributed, reproduced, modified, stored or transferred without written permission.

self-study / Data Privacy

May 1, 2026

California's companion-chatbot law may be creating discoverable records in family law cases

Hossein Berenji

Email: hfberenji@berenjilaw.com

Hossein Berenji is founder and lead attorney at Berenji Divorce and Family Law Group, focusing on complex and high-net-worth divorce matters in Los Angeles County. He holds degrees from UCLA and Loyola Law School and is an active member of several prestigious legal associations. For more details, please visit https://www.berenjifamilylaw.com/.


California Senate Bill 243 was designed to protect consumers from predatory AI companion applications. It requires companion chatbot platforms to maintain suicide prevention protocols, disclose their AI status to users and comply with reporting obligations enforced through a private right of action. What the Legislature may not have anticipated is that the platforms it regulates retain user interaction data by default, and those records may become discoverable in family law proceedings unless California extends statutory or privilege protection to cover them.

In effect, SB 243 may be contributing to the emergence of a new category of evidence: informal, highly personal digital communications created outside any recognized privilege framework, yet preserved in a form that can be sought in discovery.

This is not a settled legal conclusion. But it is a foreseeable consequence of a recent federal court ruling and the growing population of users confiding in companion chatbot platforms whose own terms of service undermine any reasonable expectation of privacy. Family law practitioners in California should understand how these developments may work together, and what that means for client counseling.

Heppner closes the privilege door

Earlier this year, Judge Jed S. Rakoff of the Southern District of New York ruled in United States v. Heppner (S.D.N.Y. 2026) that a criminal defendant could not assert privilege over materials he generated using a publicly available AI tool, which were later obtained from his own records. The court found those materials were not protected because the AI was not a lawyer, the communications were not confidential under the platform's own privacy terms, and the materials were not prepared at counsel's direction. Heppner does not establish that all AI companion chats are automatically discoverable. But it does warn that when a client uses a public chatbot outside counsel's direction, on terms that disclaim confidentiality and permit retention or disclosure, traditional privilege arguments may be weak.

Courts have reached different results on different facts, and the law in this area is still developing. In California family cases, that uncertainty matters because SB 243 regulates companion chatbot platforms without creating any evidentiary privilege for user communications.

Any client who uses a free, publicly available AI platform to vent about a custody dispute, track their finances or seek emotional support is sharing sensitive information through a platform that does not guarantee confidentiality. Family law practitioners should treat this ruling as a signal of how California courts may respond when similar questions arise in civil discovery.

The number of people affected is growing. Research from RAND indicates that about one in eight U.S. adolescents and young adults uses AI chatbots for mental health support. Many are disclosing things to a bot they would not say to a therapist or attorney. Those disclosures may or may not be discoverable depending on the circumstances, but practitioners should assume opposing counsel may seek them.

SB 243 and the records it requires

SB 243 requires companion chatbot platforms to follow suicide prevention protocols, disclose to users that they are interacting with AI and report safety data annually to the state's Office of Suicide Prevention. It also allows users to sue for violations under Business and Professions Code section 22605. The law does not require these platforms to save user chat logs. But these platforms already save that data on their own through their standard terms of service and privacy policies, and by operating in California, they are subject to California law, including civil discovery.

In a custody dispute, records of a parent expressing hopelessness, hostility or erratic behavior over months could be argued to be relevant to emotional fitness. In a support proceeding, disclosures about finances or lifestyle could be used to challenge income and expense declarations. The private right of action under section 22605 may also allow parties to seek records through civil litigation. How courts will treat those requests in a family law context remains to be tested.

Limits on discoverability practitioners should know

Before advising clients that their AI chat logs will appear in discovery, practitioners should recognize the arguments that cut the other way. A party seeking production would need to establish relevance and proportionality under California discovery standards. Depending on how the logs were generated and used, there may be colorable arguments for protection under the psychotherapist-patient privilege in Evidence Code section 1014 or through a motion for protective order under Code of Civil Procedure section 2030.090. Courts may also find that certain categories of emotional support communications fall outside the scope of permissible discovery in family law proceedings.

Competence and the intake conversation

California Rule of Professional Conduct 1.1, Comment 1, expressly requires attorneys to keep abreast of the benefits and risks associated with relevant technology. A California family law attorney who does not ask about AI usage at intake and does not advise clients of the potential discoverability of those communications may not be meeting that standard.

There is a compounding concern. AI chatbots are prone to hallucination: they regularly generate plausible but factually incorrect legal information. A client who relied on an AI platform to understand their rights may have made consequential decisions based on bad guidance and may have a logged record of that reliance. The practical advice for clients is to treat public AI chatbots like an unsecured email account and not use them to discuss the case, finances, children or emotional state. If they have already done so, counsel needs to know.

The gap the Legislature did not close

SB 243 was enacted to address consumer protection, not evidentiary consequences. It does not include a privilege provision that would shield user interaction data from civil discovery.

A targeted amendment to SB 243 could address this directly, extending qualified privilege to companion chatbot communications with appropriate carve-outs for genuine safety disclosures. That amendment does not currently exist. In its absence, practitioners should advise clients accordingly and watch for how courts handle the first contested discovery requests involving these logs.

What practitioners should do now

The practical response has two components. First, update intake protocols to include questions about AI tool use and provide explicit written advice about the potential discoverability of AI chat logs under Heppner. Second, when representing the requesting party, treat AI chat logs as a standard discovery target alongside phone records and email. The Heppner reasoning suggests that public AI platforms may not support a reasonable expectation of confidentiality.

When issuing discovery requests, ask opposing parties to identify all AI platforms used during the relevant period and request production of any interaction logs retained by those platforms under their standard data retention policies.

SB 243 set out to address consumer safety, not litigation strategy. But the platforms it regulates retain user interaction data, and that data may be reachable through civil discovery in the absence of a privilege amendment. Practitioners who understand that intersection now will be better positioned to counsel clients before the issue arises in an active case.



