On Thu, 24 Apr, 12:05 AM UTC
13 Sources
[1]
AI secretly helped write California bar exam, sparking uproar
On Monday, the State Bar of California revealed that it used AI to develop a portion of multiple-choice questions on its February 2025 bar exam, causing outrage among law school faculty and test takers. The admission comes after weeks of complaints about technical problems and irregularities during the exam administration, reports the Los Angeles Times.

The State Bar disclosed that its psychometrician (a person skilled in administering psychological tests), ACS Ventures, created 23 of the 171 scored multiple-choice questions with AI assistance. Another 48 questions came from a first-year law student exam, while Kaplan Exam Services developed the remaining 100 questions.

The State Bar defended its practices, telling the LA Times that all questions underwent review by content validation panels and subject matter experts before the exam. "The ACS questions were developed with the assistance of AI and subsequently reviewed by content validation panels and a subject matter expert in advance of the exam," wrote State Bar Executive Director Leah Wilson in a press release.

According to the LA Times, the revelation has drawn strong criticism from several legal education experts. "The debacle that was the February 2025 bar exam is worse than we imagined," said Mary Basick, assistant dean of academic skills at the University of California, Irvine School of Law. "I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable."

Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, called it "a staggering admission." She pointed out that the same company that drafted AI-generated questions also evaluated and approved them for use on the exam.

State bar defends AI-assisted questions amid criticism

Alex Chan, chair of the State Bar's Committee of Bar Examiners, noted that the California Supreme Court had urged the State Bar to explore "new technologies, such as artificial intelligence" to improve testing reliability and cost-effectiveness. However, the California Supreme Court told the LA Times that it "was unaware that AI had been used to draft any of the multiple-choice questions" until the State Bar's Monday press release.

The AI disclosure follows other problems with the February exam. Test takers reported being kicked off online testing platforms, experiencing screen lag and error messages, and encountering typos and confusing questions. These issues prompted a federal lawsuit against Meazure Learning, the exam administrator, and calls for an audit of the State Bar.

A unique test of dubious quality

The State Bar of California switched from the National Conference of Bar Examiners' Multistate Bar Examination to its own hybrid in-person and remote testing model last year while facing a $22 million deficit. It contracted with Kaplan Exam Services for $8.25 million to create test questions.

According to the LA Times, law professors Basick and Moran had previously raised concerns about the quality of the February exam questions, noting that the 50 practice questions released before the exam "still contain numerous errors" even after editing. Basick expressed additional concerns about the use of first-year law exam questions, arguing that an exam determining minimal competence to practice law requires a different standard than one assessing first-year law school learning.
The State Bar plans to ask the California Supreme Court to adjust test scores for February exam takers but has resisted returning to the National Conference of Bar Examiners exams for July, citing test security concerns with remote testing options. The Committee of Bar Examiners will meet on May 5 to discuss remedies, but Chan said the State Bar is unlikely to release all 200 exam questions or return to National Conference of Bar Examiners tests soon, as nearly half of California bar applicants want to keep the remote testing option.
[2]
California Bar discloses AI was used to develop some questions in problem-plagued February exam
LOS ANGELES (AP) -- The State Bar of California has disclosed that some multiple-choice questions in a problem-plagued bar exam were developed with the aid of artificial intelligence. The legal licensing body said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam. "The debacle that was the February 2025 bar exam is worse than we imagined," Mary Basick, assistant dean of academic skills at the University of California, Irvine, Law School, told the Los Angeles Times. "I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable." In February, the new exam led to complaints after many test-takers were unable to complete their bar exams. The online testing platforms repeatedly crashed before some applicants even started. Others struggled to finish and save essays, experienced screen lags and error messages and could not copy and paste text, the Times reported earlier. According to a recent presentation by the State Bar, 100 of the 171 scored multiple-choice questions were made by Kaplan and 48 were drawn from a first-year law students exam. A smaller subset of 23 scored questions were made by ACS Ventures, the State Bar's psychometrician, and developed with artificial intelligence. "We have confidence in the validity of the (multiple-choice questions) to accurately and fairly assess the legal competence of test-takers," Leah Wilson, the State Bar's executive director, told the newspaper in a statement. Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, told the newspaper, "It's a staggering admission." "The State Bar has admitted they employed a company to have a non-lawyer use AI to draft questions that were given on the actual bar exam," she said. "They then paid that same company to assess and ultimately approve of the questions on the exam, including the questions the company authored."
[3]
On California's State Bar Exam, More Questions Than Answers
The State Bar of California's new exam has been rife with problems, an A.I. controversy and now the likelihood of delayed results.

Thousands of people took the new California bar exam in February, ready to join the ranks of the state's 195,000 lawyers. But a series of missteps by the institution responsible for licensing lawyers has thrown thousands of nascent legal careers into a frustrating limbo.

First, there was the faulty testing software used during the exam. Test takers had trouble logging in. The software often crashed or was missing critical functions like copy and paste, leaving many unable to complete the exam. The organization that administers the test, the State Bar of California, had to offer adjustments of test-takers' scores and other remedies.

Then came the news that at least a handful of the multiple-choice questions had been developed with the help of artificial intelligence. To many of those who took the exam, it was hardly shocking -- they already had suspicions that A.I. had been used, based on a few questions that they said had struck them as bizarrely worded or legally unsound.

And now, California's future lawyers are likely to have to wait a little longer to find out if they made the cut. The state bar said it would need more time to obtain approval from the Supreme Court of California to adjust test scores in light of the problems. The results of the February exam had been slated to be released on Friday, but that is likely to be delayed.

"I just wanted a fair chance to be an attorney," Edward Brickell, a 32-year-old graduate of Southwestern Law School in Los Angeles who took the test, said in an interview. "And it just feels like every week there's another thing that comes out and says, like, 'We didn't give you a fair chance.'"

Mr. Brickell and others who took the test have flooded Reddit and other social media sites with horror stories and with plans to organize protests and demand accountability. On Tuesday at a state bar committee meeting, a handful of test-takers used the public-comment period to voice their displeasure and frustration.

"You guys are the body that is determining if we are competent to earn a living," one test-taker, Dan Molina, told the state bar's contracts committee at the virtual meeting. "Finances are being destroyed. Lives are being destroyed, and are about to be destroyed even more."

With a high threshold for passage, California's bar exam had long been considered one of the hardest in the nation. That threshold had been lowered in recent years. In October, the state bar obtained approval from the California Supreme Court to introduce a reworked exam, with questions developed by a new test provider and the option to allow the test to be taken remotely. The state bar made the change to save money.

The state bar had previously used exams developed and prepared by the National Conference of Bar Examiners, the organization behind the exams used by most states, which are considered the gold standard in the field. The N.C.B.E. does not allow remote testing. Test takers in California were told that the new exam would not require any substantive changes in preparation, so many of them prepared the same way they would have for the N.C.B.E. version of the test.

In November, the state bar administered an experimental exam that functioned as a test run. Those who took it reported technical difficulties. Then, a study guide released by Kaplan, the new test provider, was rife with errors.
That guide was quietly corrected and rereleased in the weeks before the exam in February. Kaplan declined to comment. In a sign that the state bar had anticipated some difficulties, it offered more than 5,000 registered test-takers the option to defer taking the exam until July, the next test date.

After the February exam, the state bar acknowledged the widespread technical failures. "We know and have stated that these issues were, and continue to be for those still testing, unacceptable in their range and severity," the State Bar of California said in a statement. "We apologize again, and we make no excuses for the failures that have occurred."

The state bar added that it would evaluate whether Meazure Learning, the vendor that provided the technology and proctoring services to administer the exam, had failed to meet its contractual obligations. It also said it would enlist a psychometrician -- a specialist who focuses on measuring intangible qualities such as knowledge or intelligence -- to come up with score adjustments for test-takers who had experienced difficulties.

The state bar's proposed test score adjustment was announced last week. The proposal lowered the raw passing score considerably. That recommendation was filed with a request for approval from the State Supreme Court on Tuesday -- three days before the results were set to be released. Given the late filing, the state bar told test-takers that the release of the exam results could be delayed, prolonging a dizzying stretch of uncertainty for many.

Buried deep in the announcement about the scoring adjustment was the new development: Some of the multiple-choice exam questions were developed not by Kaplan but by the state bar's psychometrics provider, ACS Ventures, with the assistance of artificial intelligence. ACS Ventures did not respond to a request for comment.

The state bar said that its Committee of Bar Examiners, the body that oversees the exam, had not previously been made aware of the use of A.I. The committee had been instructed by the State Supreme Court last year to explore changes to make the exam less expensive to administer, including the potential use of A.I.

"But the court has not endorsed, nor authorized, the broader use of A.I.," Alex Chan, the chairman of the Committee of Bar Examiners, said in a statement. "While A.I. may eventually play a role in the future of exam development, absent specific judicial guidance, the Committee has neither considered nor approved its use to date."

The Supreme Court said it had not been aware that the technology was used in the development of the exam and called for an investigation. The state bar has not disclosed the details of how the technology was used by ACS Ventures to assist in developing exam questions.

For Mr. Brickell and others, the disclosure that A.I. was used at all seemed to offer an explanation for some of their confusion. Some questions, he and others who took the test said, did not read as though they had been drafted by a human and listed only incorrect multiple-choice answers.

Ceren Aytekin, an aspiring entertainment lawyer, said she had also noticed peculiarities in some of the questions, but she at first refused to believe A.I. had been used. "I initially thought, 'Maybe I'm the wrong one,'" Ms. Aytekin said. "Maybe I'm putting blame on an organization that would never do this to their examinees." She added: "All the issues I spotted make so much sense with A.I. being involved. We just didn't want to believe it."
Two other large state bar associations, in New York and Illinois, said they had never used A.I. to develop questions on their exams. The N.C.B.E., which prepares the exams for New York, Illinois and most other states, said it had never used A.I. for that purpose. April Dawson, an associate dean at the Technology Law and Policy Center at the North Carolina Central University School of Law, said the use of A.I. in developing test questions was not an issue on its own. She said the problem was in the fact that it had been done without transparency. "That you would have a licensing body engage in such irresponsible conduct, it really is kind of baffling," she said. If he doesn't pass, Mr. Brickell is likely to take the exam in July. Those who fail the February exam will be able to take it then for free. The state bar has said it will not use any questions that have been developed with A.I. on the July exam. Had the exam not been offered for free in July, Mr. Brickell had contemplated taking it in another state. "I don't want to give them my bar dues as an attorney for the rest of my life," Mr. Brickell said of California's state bar. "This has soured me so much."
[4]
AI helped write bar exam questions, California state bar admits
The bar says it will ask the state supreme court to adjust scores after test-takers also faced platform crashes.

The state bar of California has disclosed that some multiple-choice questions in a problem-plagued bar exam were developed with the aid of artificial intelligence. The legal licensing body said in a news release on Monday that it will ask the California supreme court to adjust test scores for those who took its February bar exam.

"The debacle that was the February 2025 bar exam is worse than we imagined," Mary Basick, assistant dean of academic skills at the University of California, Irvine, School of Law, told the Los Angeles Times. "I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable."

In February, the new exam led to complaints after many test-takers were unable to complete their bar exams. The online testing platforms repeatedly crashed before some applicants even started. Others struggled to finish and save essays, experienced screen lags and error messages and could not copy and paste text, the Times reported earlier.

According to a recent presentation by the state bar, 100 of the 171 scored multiple-choice questions were made by Kaplan and 48 were drawn from a first-year law students exam. A smaller subset of 23 scored questions were made by ACS Ventures, the state bar's psychometrician, and developed with AI.

"We have confidence in the validity of the [multiple-choice questions] to accurately and fairly assess the legal competence of test-takers," Leah Wilson, the state bar's executive director, told the newspaper in a statement.

Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, told the newspaper: "It's a staggering admission."

"The State Bar has admitted they employed a company to have a non-lawyer use AI to draft questions that were given on the actual bar exam," she said. "They then paid that same company to assess and ultimately approve of the questions on the exam, including the questions the company authored."

Andrew Perlman, dean of Suffolk University Law School and an advisory council member of the American Bar Association taskforce on the law and artificial intelligence, said he had not heard of AI being used to develop bar exam questions or standards being put in place governing such uses. But he said he was not surprised, given the rapid growth of AI technology.

Perlman said AI can be useful for developing questions for assessment, but a critical guard rail is making sure that everything that comes from an AI tool is vetted carefully by experts in the subject matter. He expects its use to continue to grow. Although there might be public skepticism of the emerging technology in the legal profession at this time, "we will be worried in the future about the competence of lawyers who don't use these tools," Perlman predicted.
[5]
AI was used to write the California bar exam. The law community is outraged.
You've heard of AI models taking the bar exam, but this time, AI also helped write the questions. The State Bar of California revealed on Monday that it used AI to develop a portion of its exam questions, according to the LA Times.

The AI-generated exam questions were created by ACS Ventures, an independent psychometrics firm hired by the State Bar. The questions were "developed with the assistance of AI and subsequently reviewed by content validation panels and a subject matter expert in advance of the exam," announced the State Bar in a statement addressing technical glitches and question errors that test takers had previously complained about. The LA Times reported that 23 out of the 171 multiple choice questions were made by ACS Ventures. The majority of the multiple choice questions were developed by Kaplan, and a "small subset" were taken from the First-Year Law Students' Exam.

This past year, the bar was offered remotely to California-based test takers. Students and educators alike were already outraged about the remote test platform crashing and being riddled with bugs. But now, the discovery that some of the exam questions were created with AI has further fueled that outrage.

"I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable," Mary Basick, assistant dean of academic skills at UC Irvine School of Law, told the Times. "It's a staggering admission," Katie Moran, an associate professor at the University of San Francisco School of Law, told the outlet. Moran also pointed out that ACS Ventures, the firm used to craft the AI questions, was the same firm to approve the questions.

Alex Chan, who chairs the State Bar's Committee of Bar Examiners, told the outlet that the California Supreme Court had pressured the State Bar to look into "new technologies, such as artificial intelligence" as a means of improving reliability or cost-efficiency.

Automating tasks with AI has surged since the rise of generative AI -- and not just simple tasks or low-stakes work, but critical work that has very real consequences. Some suspect that the formula used to calculate the Trump Administration's tariff rates was created by ChatGPT or something similar. In 2023, two New York lawyers were sanctioned for using ChatGPT in a legal brief, which cited fake cases. And academic journals are flooded with papers that include AI-generated text. And those are just a few examples of the ones that got caught.

Generative AI's ability to rapidly write, summarize, and source information has been an irresistible way for workers to save time and effort. But it has innate hallucination problems and poses ethical issues by outsourcing work to a bot -- especially when it comes to law students whose entire careers rest on passing the bar.
[6]
California Bar discloses AI was used to develop some questions in problem-plagued February exam
The State Bar of California has disclosed that some multiple-choice questions in a problem-plagued bar exam were developed with the aid of artificial intelligence. The legal licensing body said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam. "The debacle that was the February 2025 bar exam is worse than we imagined," Mary Basick, assistant dean of academic skills at the University of California, Irvine, Law School, told the Los Angeles Times. "I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable." In February, the new exam led to complaints after many test-takers were unable to complete their bar exams. The online testing platforms repeatedly crashed before some applicants even started. Others struggled to finish and save essays, experienced screen lags and error messages and could not copy and paste text, the Times reported earlier. According to a recent presentation by the State Bar, 100 of the 171 scored multiple-choice questions were made by Kaplan and 48 were drawn from a first-year law students exam. A smaller subset of 23 scored questions were made by ACS Ventures, the State Bar's psychometrician, and developed with artificial intelligence. "We have confidence in the validity of the (multiple-choice questions) to accurately and fairly assess the legal competence of test-takers," Leah Wilson, the State Bar's executive director, told the newspaper in a statement. Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, told the newspaper, "It's a staggering admission." "The State Bar has admitted they employed a company to have a non-lawyer use AI to draft questions that were given on the actual bar exam," she said. "They then paid that same company to assess and ultimately approve of the questions on the exam, including the questions the company authored." Andrew Perlman, dean of Suffolk University Law School and an advisory council member of the American Bar Association Task Force on the Law and Artificial Intelligence, said he had not heard of AI being used to develop bar exam questions or standards being put in place governing such uses. But he said he was not surprised, given the rapid growth of AI technology. Perlman said AI can be useful for developing questions for assessment, but a critical guard rail is making sure that everything that comes from an AI tool is vetted carefully by experts in the subject matter. He expects its use to continue to grow. Although there might be public skepticism of the emerging technology in the legal profession at this time, "we will be worried in the future about the competence of lawyers who don't use these tools," Perlman predicted. © 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.
[7]
California Admits AI Was Used to Write Bar Exam Plagued by Problems
Now here's a legal-drama-worthy twist in the recent spate of dumb lawyers getting caught using AI: it turns out that the very bar exam administered to aspiring attorneys in California was itself created with the help of a large language model, The Los Angeles Times reports.

The admission was made by the State Bar of California on Monday, following complaints about the quality of the test's questions and numerous glitches experienced by test-takers when they took it in February. In a news release, the organization said that 23 of the exam's 171 scored multiple-choice questions were drafted by the firm ACS Ventures, which developed the questions "with the assistance of AI." Another 48 questions were lifted from an older version of an exam for first-year law students.

"The debacle that was the February 2025 bar exam is worse than we imagined," Mary Basick, assistant dean of academic skills at UC Irvine Law School, told the LA Times. "I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable."

Katie Moran, an associate professor at the University of San Francisco School of Law, called it a "staggering admission." The same company that used AI to draft the questions was then paid "to assess and ultimately approve of the questions on the exam, including the questions the company authored," she noted to the newspaper.

For weeks, test takers had complained that they were randomly kicked off the online platform that the bar was administered on, while screens lagged and showed error messages, per the reporting. The test itself was riddled with typos, and some questions were total nonsense. Despite these complaints -- and despite pleading guilty to AI usage -- a spokesperson for the State Bar insisted that the test questions were reviewed by content validation panels and subject matter experts.

In any case, the whole situation sounds like a mortifying catastrophe. For one, the Supreme Court of California, of which the State Bar is an administrative arm, maintains it had no idea about the use of AI to create the test questions until this week -- even though it had instructed the State Bar to explore the use of AI to "improve upon the reliability and cost-effectiveness of such testing" last fall, according to Alex Chan, chair of the State Bar's Committee of Bar Examiners.

Casting additional scrutiny on the process, Basick and Moran wrote early this month that the exam questions, which should take years to develop, appeared to have been drafted far too quickly, and that the 50 practice questions re-released just weeks before the actual exam contained numerous errors, per the LA Times.

What spurred the dubious measures sounds like a familiar tale of disastrous cost-cutting. Faced with a $22 million deficit last year, the State Bar ditched the commonly used National Conference of Bar Examiners' Multistate Bar Examination and decided to transition to a hybrid model of in-person and remote testing. To create the new test, it inked an $8.25 million deal with Kaplan Exam Services, and contracted Meazure Learning to administer it. In a fittingly legal result, Meazure Learning is now being sued by some of the students who took the glitchy exams.

The State Bar said it will ask the California Supreme Court to adjust test scores for those who took the test in February. Chan said that the Committee of Bar Examiners will meet on May 5 to discuss other remedies, but doubted that the State Bar would release the exam questions to the public or go back to the NCBE.
[8]
California Bar discloses AI was used to develop some questions in problem-plagued February exam
LOS ANGELES -- The State Bar of California has disclosed that some multiple-choice questions in a problem-plagued bar exam were developed with the aid of artificial intelligence. The legal licensing body said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam. "The debacle that was the February 2025 bar exam is worse than we imagined," Mary Basick, assistant dean of academic skills at the University of California, Irvine, Law School, told the Los Angeles Times. "I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable." In February, the new exam led to complaints after many test-takers were unable to complete their bar exams. The online testing platforms repeatedly crashed before some applicants even started. Others struggled to finish and save essays, experienced screen lags and error messages and could not copy and paste text, the Times reported earlier. According to a recent presentation by the State Bar, 100 of the 171 scored multiple-choice questions were made by Kaplan Exam Services and 48 were drawn from a first-year law students exam. A smaller subset of 23 scored questions were made by ACS Ventures, the State Bar's psychometrician, and developed with artificial intelligence. "We have confidence in the validity of the (multiple-choice questions) to accurately and fairly assess the legal competence of test-takers," Leah Wilson, the State Bar's executive director, told the newspaper in a statement. Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, told the newspaper, "It's a staggering admission." "The State Bar has admitted they employed a company to have a non-lawyer use AI to draft questions that were given on the actual bar exam," she said. "They then paid that same company to assess and ultimately approve of the questions on the exam, including the questions the company authored."
[9]
California Bar discloses AI was used to develop some questions in problem-plagued February exam
LOS ANGELES -- The State Bar of California has disclosed that some multiple-choice questions in a problem-plagued bar exam were developed with the aid of artificial intelligence. The legal licensing body said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam. "The debacle that was the February 2025 bar exam is worse than we imagined," Mary Basick, assistant dean of academic skills at the University of California, Irvine, Law School, told the Los Angeles Times. "I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable." In February, the new exam led to complaints after many test-takers were unable to complete their bar exams. The online testing platforms repeatedly crashed before some applicants even started. Others struggled to finish and save essays, experienced screen lags and error messages and could not copy and paste text, the Times reported earlier. According to a recent presentation by the State Bar, 100 of the 171 scored multiple-choice questions were made by Kaplan and 48 were drawn from a first-year law students exam. A smaller subset of 23 scored questions were made by ACS Ventures, the State Bar's psychometrician, and developed with artificial intelligence. "We have confidence in the validity of the (multiple-choice questions) to accurately and fairly assess the legal competence of test-takers," Leah Wilson, the State Bar's executive director, told the newspaper in a statement. Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, told the newspaper, "It's a staggering admission." "The State Bar has admitted they employed a company to have a non-lawyer use AI to draft questions that were given on the actual bar exam," she said. "They then paid that same company to assess and ultimately approve of the questions on the exam, including the questions the company authored."
[10]
California Bar discloses AI was used to develop some questions in problem-plagued February exam
LOS ANGELES (AP) -- The State Bar of California has disclosed that some multiple-choice questions in a problem-plagued bar exam were developed with the aid of artificial intelligence. The legal licensing body said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam. "The debacle that was the February 2025 bar exam is worse than we imagined," Mary Basick, assistant dean of academic skills at the University of California, Irvine, Law School, told the Los Angeles Times. "I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable." In February, the new exam led to complaints after many test-takers were unable to complete their bar exams. The online testing platforms repeatedly crashed before some applicants even started. Others struggled to finish and save essays, experienced screen lags and error messages and could not copy and paste text, the Times reported earlier. According to a recent presentation by the State Bar, 100 of the 171 scored multiple-choice questions were made by Kaplan and 48 were drawn from a first-year law students exam. A smaller subset of 23 scored questions were made by ACS Ventures, the State Bar's psychometrician, and developed with artificial intelligence. "We have confidence in the validity of the (multiple-choice questions) to accurately and fairly assess the legal competence of test-takers," Leah Wilson, the State Bar's executive director, told the newspaper in a statement. Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, told the newspaper, "It's a staggering admission." "The State Bar has admitted they employed a company to have a non-lawyer use AI to draft questions that were given on the actual bar exam," she said. "They then paid that same company to assess and ultimately approve of the questions on the exam, including the questions the company authored." Andrew Perlman, dean of Suffolk University Law School and an advisory council member of the American Bar Association Task Force on the Law and Artificial Intelligence, said he had not heard of AI being used to develop bar exam questions or standards being put in place governing such uses. But he said he was not surprised, given the rapid growth of AI technology. Perlman said AI can be useful for developing questions for assessment, but a critical guard rail is making sure that everything that comes from an AI tool is vetted carefully by experts in the subject matter. He expects its use to continue to grow. Although there might be public skepticism of the emerging technology in the legal profession at this time, "we will be worried in the future about the competence of lawyers who don't use these tools," Perlman predicted.
[11]
California Bar Discloses AI Was Used to Develop Some Questions in Problem-Plagued February Exam
LOS ANGELES (AP) -- The State Bar of California has disclosed that some multiple-choice questions in a problem-plagued bar exam were developed with the aid of artificial intelligence. The legal licensing body said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam. "The debacle that was the February 2025 bar exam is worse than we imagined," Mary Basick, assistant dean of academic skills at the University of California, Irvine, Law School, told the Los Angeles Times. "I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable." In February, the new exam led to complaints after many test-takers were unable to complete their bar exams. The online testing platforms repeatedly crashed before some applicants even started. Others struggled to finish and save essays, experienced screen lags and error messages and could not copy and paste text, the Times reported earlier. According to a recent presentation by the State Bar, 100 of the 171 scored multiple-choice questions were made by Kaplan and 48 were drawn from a first-year law students exam. A smaller subset of 23 scored questions were made by ACS Ventures, the State Bar's psychometrician, and developed with artificial intelligence. "We have confidence in the validity of the (multiple-choice questions) to accurately and fairly assess the legal competence of test-takers," Leah Wilson, the State Bar's executive director, told the newspaper in a statement. Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, told the newspaper, "It's a staggering admission." "The State Bar has admitted they employed a company to have a non-lawyer use AI to draft questions that were given on the actual bar exam," she said. "They then paid that same company to assess and ultimately approve of the questions on the exam, including the questions the company authored." Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
[12]
California Bar Discloses It Used AI to Develop Some Questions in Problem-Plagued Exam
The State Bar of California has disclosed that some multiple-choice questions in a problem-plagued bar exam were developed with the aid of artificial intelligence. The legal licensing body said in a news release Monday that it will ask the California Supreme Court to adjust test scores for those who took its February bar exam. "The debacle that was the February 2025 bar exam is worse than we imagined," Mary Basick, assistant dean of academic skills at the University of California, Irvine, Law School, told the Los Angeles Times. "I'm almost speechless. Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable." In February, the new exam led to complaints after many test-takers were unable to complete their bar exams. The online testing platforms repeatedly crashed before some applicants even started. Others struggled to finish and save essays, experienced screen lags and error messages and could not copy and paste text, the Times reported earlier.
[13]
The questions of the California bar exam have been made with AI - Softonic
The State Bar of California has revealed that some of the multiple-choice questions in the controversial February 2025 bar exam were developed with the help of artificial intelligence (reportedly ChatGPT). The disclosure has raised doubts about the validity and competence of the questions, especially since they were created by non-lawyers, causing astonishment and concern in the legal field. Mary Basick, assistant dean of academic skills at the University of California, Irvine School of Law, described the situation as "just unbelievable."

During the administration of the February exam, multiple technical issues affected the candidates, including crashes of the online testing platform, which prevented many of them from completing the exam. Applicants also faced errors and delays that hindered the writing and saving of their responses. In response to the irregularities, the State Bar will ask the California Supreme Court to adjust the scores of the affected exams.

For her part, Leah Wilson, the State Bar's executive director, expressed confidence in the validity of the questions, stating that they can fairly assess the legal competence of the candidates. However, the use of artificial intelligence in the creation of the questions has generated skepticism. Andrew Perlman, dean of the Suffolk University Law School, indicated that while the use of AI in developing questions can be useful, it is essential that the generated content be vetted by subject matter experts. He expects the use of AI tools in the legal profession to keep growing despite public distrust, and warned that in the future the competence of lawyers who do not use these tools may itself be called into question, opening a debate about the future of legal practice in California and beyond.
The State Bar of California's admission of using AI to develop exam questions has led to widespread criticism and concerns about the integrity of the February 2025 bar exam, compounding existing issues with the test administration.
The State Bar of California has admitted to using artificial intelligence (AI) to develop a portion of multiple-choice questions for its February 2025 bar exam, igniting a firestorm of criticism from legal education experts and test-takers alike 1 2. This revelation comes amidst ongoing complaints about technical problems and irregularities during the exam administration.
According to the State Bar, 23 out of 171 scored multiple-choice questions were created by ACS Ventures, the bar's psychometrician, with AI assistance 3. The remaining questions were developed by Kaplan Exam Services (100 questions) or drawn from a first-year law student exam (48 questions) 1.
The disclosure has drawn strong criticism from legal education experts. Mary Basick, assistant dean of academic skills at the University of California, Irvine School of Law, expressed shock, stating, "Having the questions drafted by non-lawyers using artificial intelligence is just unbelievable" 2. Katie Moran, an associate professor at the University of San Francisco School of Law, called it "a staggering admission" 1.
Critics have raised concerns about the integrity of the exam and the potential impact on test-takers' careers. The use of AI in developing exam questions has also raised ethical questions about the role of technology in high-stakes testing 5.
The AI controversy compounds existing issues with the February 2025 bar exam. Test-takers reported numerous technical problems, including online testing platforms that repeatedly crashed (in some cases before applicants could begin), screen lag and error messages, an inability to copy and paste text or to finish and save essays, and typos and confusingly worded questions.
These issues have led to a federal lawsuit against Meazure Learning, the exam administrator, and calls for an audit of the State Bar 1.
The State Bar has defended its practices, stating that all questions underwent review by content validation panels and subject matter experts before the exam 1. Leah Wilson, State Bar Executive Director, expressed confidence in the validity of the multiple-choice questions to assess legal competence fairly 2.
Alex Chan, chair of the State Bar's Committee of Bar Examiners, noted that the California Supreme Court had urged the exploration of new technologies, including AI, to improve testing reliability and cost-effectiveness 1. However, the California Supreme Court stated it was unaware of AI's use in drafting questions until the State Bar's press release 1.
The State Bar plans to ask the California Supreme Court to adjust test scores for February exam takers 2. The Committee of Bar Examiners will meet on May 5 to discuss remedies 1.
This incident raises broader questions about the use of AI in high-stakes testing and the legal profession. Andrew Perlman, dean of Suffolk University Law School, predicts that AI use in developing assessment questions will continue to grow, emphasizing the importance of careful vetting by subject matter experts 4.
As the legal community grapples with these developments, the incident highlights the need for transparency, ethical considerations, and robust quality control measures in the integration of AI technologies into professional licensing exams.
Reference
[1] "AI secretly helped write California bar exam, sparking uproar"
[2] "California Bar discloses AI was used to develop some questions in problem-plagued February exam" (Associated Press)
[3] "On California's State Bar Exam, More Questions Than Answers"
[4] "AI helped write bar exam questions, California state bar admits"