State Bar & Bar Associations
Apr. 24, 2025
California bar exam plunges to new low amid scandal
The California State Bar is facing a series of controversies, including the use of AI-generated questions on its bar exam, conflicts of interest in exam validation, and the recycling of first-year exam questions, all of which undermine its credibility and trustworthiness.





Mary Basick
Assistant Dean of Academic Skills
UC Irvine School of Law

Katie Moran
Associate Professor
University of San Francisco School of Law

We thought the California State Bar hit rock bottom when it rushed to administer the February 2025 Bar Exam, deploying 200 new multiple-choice questions drafted by Kaplan on an untested online platform. It went as disastrously as feared. But a Monday night announcement revealed that the Bar had much farther to fall.
Buried beneath the announcement's positive news of recommended scoring adjustments for the unfortunate February examinees were several staggering admissions about the exam's content.
First, the State Bar admitted that a psychometrician used Artificial Intelligence to draft 23 of the multiple-choice questions scored on the February exam. Even worse, the Bar disclosed that the drafter worked for ACS Ventures, the entity hired by the State Bar to ultimately review and evaluate exam question performance. And truly astonishing: there is no indication that the drafter was a lawyer or had any legal expertise.
After the test, many examinees complained online and in public comments directed at the State Bar that the multiple-choice questions appeared disjointed, poorly drafted, and AI-generated. While we did not doubt the bar takers' experiences, the assertion about AI seemed too outrageous to be true since the California Supreme Court had only authorized Kaplan to draft the 200 multiple-choice questions. The Committee of Bar Examiners never once acknowledged those allegations prior to Monday's surprise late-night announcement, posted only two weeks before bar results will be released.
The State Bar may find their own guidance on AI instructive: "Generative AI use presents unique challenges; it uses large volumes of data, there are many competing AI models and products, and even for those who create generative AI products, there is a lack of clarity about how it works. In addition, generative AI poses the risk of encouraging greater reliance and trust on its outputs because of its purpose to generate responses and its ability to do so in a manner that projects confidence and effectively emulates human responses. A lawyer should consider these and other risks before using generative AI in providing legal services."
Second, the State Bar asserted that ACS Ventures, the firm it had hired to assess the reliability of the exam questions, concluded that the multiple-choice questions, including the ones its own psychometrician had written, were adequately valid and reliable. Using a supposedly "independent" (to use the Bar's words) psychometrician to write 13% of the scored questions and later validate those same questions is a blatant conflict of interest.
Third, the State Bar admitted that 48 of the scored multiple-choice questions were recycled from the Bar's First-Year Law Students' Exam (FYLSX). The goal of the FYLSX is to determine whether an applicant has sufficient knowledge of three first-year subjects to continue their legal studies into their second year. That is a different goal from the bar exam's, which is to assess the minimum competency of an entry-level attorney.
And get this: the Bar utilized those questions without informing the California Supreme Court. When successfully petitioning the Court to make changes to the exam in October, the Bar included the entirety of its $8.25 million contract with Kaplan, which stated that Kaplan would be responsible for writing the 200 multiple-choice questions on the exam, that Kaplan's use of Artificial Intelligence would be "de minimis," and that Kaplan would have access to some of the Bar's bank of questions from the FYLSX so that they could "use such materials when drafting multiple-choice questions." The State Bar even named its psychometrician from ACS Ventures, Chad Buckendahl, in the petition. Nowhere did the Bar mention that Buckendahl's company would write a single multiple-choice question.
The third source of multiple-choice questions was Kaplan, which wrote 100 of the scored questions on the exam. We note that only 171 of the 200 questions on the exam were scored. It is unclear who drafted the remaining 29 questions, or why Kaplan did not draft all 200 questions as stated in the contract.
It's no wonder that examinees reported feeling confused by the many differences in style, length, wording, and voice in the questions.
Finally, the Bar attempted to address multiple public comments from the two of us and other academics about the inclusion of 55 expanded areas of testable law on their exam content maps, in violation of the Business and Professions Code, which requires two years' notice before any significant change to the preparation needed for exam passage. The Bar obfuscated and employed the shotgun approach in a four-paragraph defense of their content maps. They initially claimed that "the majority of the subtopics" were fair game. Then they conceded it was "arguable" that they added new topics, but ultimately disregarded that argument because the content maps are "not exhaustive." They then backpedaled by saying that none of the new (but not new) topics were tested on the February exam. Then finally, the Bar concluded it might refine the content maps further. Which is it? Given the Bar's deliberate past omissions and the clear side-by-side comparison of the MBE Subject Matter Outline with California's Content Map, we are not persuaded by their haphazard defense.
We fear that the Bar is locked in a free fall toward a new rock bottom this July.
The Bar made irresponsible and reckless changes to the February 2025 Bar Exam. They admitted to using AI, to paying their psychometrician to write and then validate the very same questions, and to recycling questions from a past test not designed to assess competency to practice law. They have been informed that their Content Map shows clear areas of expansion of testable law. And yet their statement shows neither remorse nor awareness of their incompetence and lack of transparency. In fact, in a fitting use of passive voice indicative of the Bar's lack of accountability for these problems of their own creation, their executive director concluded, "Lessons learned are being incorporated into the July exam."
We see no reason to believe the July exam will be any better. The Bar made all of these changes without transparency, disclosure, or permission. They have committed an egregious breach of public trust. Examinees spend months of their lives preparing for this test; the State Bar failed them and betrayed fellow members of the profession.
How can we prevent another disaster and ensure a fair test for future examinees?
The California Supreme Court must step in and move us back to the validated Multistate Bar Exam (MBE) until the Bar can guarantee it can properly create and vet its own bank of exam questions. The Court should also demand the release of the 200 questions given on the February exam so that truly independent experts can review them.
The Bar has not been reliable, trustworthy, or transparent during this process. However, the Court can be.