
May 1, 2025

AI-generated exam questions caused disproportionate problems, State Bar says

California Supreme Court petition shows AI-written questions were more likely to be thrown out, confirming test takers' complaints about the February exam.


Bar exam questions generated with artificial intelligence were far more likely to confuse or mislead test takers, and to be thrown out, than those written by humans, according to a petition filed with the California Supreme Court on Tuesday.

The filing from the State Bar revealed that AI-generated questions accounted for a disproportionate share of the exam's "performance issues," reinforcing concerns from February test takers who reported strange wording, typos and legal inaccuracies in several sections of the exam.

The bar filed the petition to ask the court to approve changes designed to help people who took the February exam, the first test to feature a newly designed multiple-choice section. That exam was plagued by technical issues and complaints about its questions. In public meetings since, many test takers said some questions contained typos and wording so strange that they suspected the questions were generated by artificial intelligence.

The numbers appear to bear out these claims. According to the petition, signed by bar General Counsel Ellin Davtyan, six of the 29 questions "removed from scoring" were developed by ACS Ventures using AI. Those six represent more than 20% of the 29 questions ACS submitted for the exam, a higher removal rate than for questions created by Kaplan Test Prep or drawn from the First-Year Law Students' Examination. All six were removed because of "negative discrimination."

"A question that has a negative discrimination value (a discrimination value below zero) means that lower-performing test takers are more likely to get the item correct than higher- performing test takers, so including questions with a negative discrimination adversely impacts examination reliability," Davtyan wrote.

Four of the removed ACS questions were in the criminal law section of the exam, which several test takers and academics cited as rife with typos, clunky wording and inaccuracies. Davtyan also submitted a 182-page appendix to the court; those materials flagged eight of the 10 ACS criminal law questions as having "performance issues," a label that covers anything from a question being too easy or too difficult to one that is hard to understand. Nearly half of ACS's questions were tagged with one or more performance issues, around three times the rate of the other question sources.

Several critics of the February exam asked why the ACS questions were included. The petition offered some answers.

"On or around October 30, 2024, State Bar Admissions' staff requested that ACS draft additional questions for the February 2025 bar examination to ensure that there were a sufficient number of questions in all subtopics of the subject areas," Davtyan wrote.

She continued, "Admissions' staff identified both the topic areas, and the number of items needed. ACS drafted prompts to yield multiple-choice questions that aligned with the topic areas identified by Admissions' staff and ran the prompts through OpenAI ChatGPT."

Elizabeth T. "Eli" Edwards, a reference librarian and adjunct lecturer at UCLA School of Law, noted that in late 2023 the bar released guidance for attorneys using AI in their practices. Those best practices call for "competence and diligence" and warned against "overreliance" on technology.

"The thought that the bar allowed non-lawyers to help draft material using generative AI and this only comes out after the exam just seems appalling," Edwards said. "That's not modeling their own guidance regarding the use of generative AI."

The analysis of the questions and the discussion of technical issues experienced by test takers were intended to back up the bar's central request: that the justices lower the raw passing score for the February exam from 560 to 534.

The bar missed the Monday deadline, set by the Committee of Bar Examiners, for submitting the petition. In an email to test takers, bar staff said the delay was necessary "to ensure that the Supreme Court has all of the information it needs to fairly assess CBE's recommendations."

But many test takers and law school professors have proposed a far different response: returning to the Multistate Bar Examination multiple-choice test from the National Conference of Bar Examiners.

Robert F. Kane, a sole practitioner and adjunct professor at UC Law San Francisco, said he opposed relying on Kaplan, a company that makes its money in test preparation courses and materials.

"Basically, they're turning big sections of the bar over to Kaplan," Kane said. "I just don't understand how they would do that. There are obvious conflicts of interest."


Malcolm Maclachlan

Daily Journal Staff Writer
malcolm_maclachlan@dailyjournal.com
