This is the property of the Daily Journal Corporation and fully protected by copyright. It is made available only to Daily Journal subscribers for personal or collaborative purposes and may not be distributed, reproduced, modified, stored or transferred without written permission.

self-study / Evidence

Jan. 23, 2026

You can't cross-examine a machine

Anita Taff-Rice

Founder
iCommLaw

Technology and telecommunications

1547 Palos Verdes Mall # 298
Walnut Creek , CA 94597-2228

Phone: (415) 699-7885

Email: anita@icommlaw.com

iCommLaw® is a Bay Area firm specializing in technology, telecommunications and cybersecurity matters.


With billions of dollars at stake in litigation, parties often resort to the Battle of the Experts, in which judges and juries are confronted with testimony from two well-credentialed experts who reach completely contradictory conclusions from the same facts. Rather than serving as a tool to clarify complex material, expert witnesses are often perceived as just another litigation tool, manipulated to fit the needs of the party who pays them.

Even if the expert testimony system is flawed, at least experts can be cross-examined to expose any such bias or commercial motive. But what if "expert" evidence is generated by AI? A machine can't be cross-examined.

Increasingly, litigants are offering AI-generated evidence without an expert witness: analyses of stock trading patterns to establish causation, comparisons of digital data to determine whether two works are substantially similar in copyright litigation, and machine-learning assessments of software complexity to gauge the likelihood that code was misappropriated. The Judicial Conference's Advisory Committee on Evidence Rules (Committee) is considering proposed Rule 707, which would hold machine-generated evidence to the same standard as human expert testimony.

Proposed FRE 707 would subject machine-generated evidence offered without any witness, or through a lay witness, to all of the requirements that FRE 702 imposes on expert testimony. Federal Rule of Evidence 702 attempts to curb obvious manipulation by admitting only testimony that is based on sufficient facts, is the product of reliable methods, and reflects a reliable application of those methods to the facts of the case.

The Committee clarified that the proposed rule would not apply to standard scientific devices such as thermometers and scales, but rather to AI-generated evidence that makes predictions or draws inferences from data. The new rule is intended to address evidentiary problems such as "function creep," where an AI platform is used for purposes it was not designed for; analytical error or incompleteness; inaccuracy or bias built into the underlying data or formulas; lack of interpretability of the machine's process; and outright deepfakes.

This new proposal is a welcome shift from the Committee's ill-advised 2024 proposal to address AI-generated evidence by amending FRE 901(b)(9) to require a party challenging the authenticity of computer-generated or other electronic evidence to demonstrate that the evidence is more likely than not fabricated, or altered in whole or in part. Astonishingly, that rule would have allowed even obviously fabricated evidence if its probative value outweighed its prejudicial effect on the challenging party. Fortunately, the Committee realized that the proposal was unworkable, and the rule change did not advance.

By allowing courts to decline to admit unreliable AI-generated evidence, proposed FRE 707 would, one hopes, push the party offering such evidence to make full and detailed disclosures about how the evidence was generated. The court should require the proponent to identify the software underlying the AI platform, produce evidence of the data used to train the platform, and show that the platform applied its learning program to that data to produce reliable results. This may be difficult.

The Lawyers for Civil Justice rightly noted in comments that it might be impossible to demonstrate that AI-generated evidence is reliable without an expert testifying about the validity of the AI platform and its output. A human expert may be cross-examined to explain how an opinion was reached, including why particular data was prioritized and how confident the expert is in his or her conclusion.

Perhaps that's the point. AI-generated evidence should rarely be admitted without an associated witness who can be cross-examined. The group also noted that a rule aimed specifically at AI-generated evidence should be crafted to provide greater guidance rather than simply cross-referencing FRE 702.

The Federal Magistrate Judges Association submitted comments arguing that AI is changing rapidly and that it is premature to adopt a rule now. Instead, the group urged the Committee simply to require disclosure of AI-generated documents or evidence so that the opposing party has an opportunity to explore how the evidence was generated and to evaluate reliability concerns. While identifying AI-generated evidence is obviously necessary, identification alone does not determine whether such evidence is reliable enough to be admitted. This wait-and-see view will not help judges who are confronted with AI-generated evidence now and have little guidance on how to vet its reliability or how to address the impossibility of cross-examining a machine.

The Committee issued the proposed rule for public comment in May and said it expected to receive extensive comments. Thus far, only 17 parties have submitted comments; the comment period ends Feb. 16, 2026.

#1786


