This article originally appeared in The Bar Examiner print edition, Summer 2018 (Vol. 87, No. 2), pp. 1–2.
Hon. Rebecca White Berch
As I write this, we are celebrating the results of the successful Annual Bar Admissions Conference in Philadelphia. The reviews for the conference were overwhelmingly positive. One session on the work of the ABA Commission on the Future of Legal Education, however, raised a few eyebrows. One presenter challenged the audience by asking whether law school graduates should need to take a bar exam at all. As a former regulator of bar admissions (while a member of Arizona’s Supreme Court), let me respond “yes.”
Regulators need a bar exam—or some valid and reliable way to determine which applicants for admission possess the necessary knowledge and abilities to practice law—to assist in protecting the public from unqualified lawyers. Would a diploma privilege work—that is, allowing students from in-state law schools to become members of the bar without taking an exam? Perhaps, though such a system allows law schools to vouch for their own students. While a diploma privilege may present no problem in most instances, it does present a potential conflict of interest when a school is called upon to certify those students at the very bottom of the class. How could a school refuse to certify a student who had faithfully attended for three years and had earned enough credits to graduate?
A diploma privilege also presents a potential trap for regulators. Once granted, a diploma privilege presumably applies to graduates of all ABA-approved law schools in the jurisdiction. But consider this example. Let’s say a state is considering a diploma privilege for its two relatively small, highly regarded law schools. The Court is presumably well acquainted with the curricula and faculties at those schools. Then a large, private, for-profit law school opens in the jurisdiction—a school that admits many students with marginal credentials and whose bar passage rate falls below 40 percent. Examples such as this show why a blanket privilege for those who attend an in-state law school, even one approved by the ABA, may not adequately protect the public.
Moreover, many applicants for admission to practice attend law schools outside the admitting jurisdiction. Court members cannot know in detail the rigor of the many schools those applicants attended. The bar exam helps regulators ascertain which applicants possess the basic knowledge, skills, and abilities to practice law.
It’s easy to criticize the bar exam—much easier, in fact, than defending it. While anecdotes suffice for an attack, a thoughtful response to most criticisms is usually nuanced and may (horrors!) involve psychometrics. For example, most have heard of someone who failed to pass the bar exam who seemed to have all the qualifications necessary to successfully practice law. Kathleen Sullivan notably failed to pass the California bar exam while serving as the dean of Stanford Law School.1 Based on such anecdotal examples, some argue that a two-day test provides little insight into whether an applicant is capable of practicing law.
But is that accurate? The research about standardized testing suggests that it is not. The Wall Street Journal published an article earlier this year titled “The Gatekeeper Tests,” by Nathan Kuncel and Paul Sackett, professors of industrial-organizational psychology at the University of Minnesota, which addressed the utility of the SAT and ACT tests in predicting success in college and later in life. The authors concluded that “[s]tandardized tests tell us a lot about an applicant’s likely academic performance and eventual career success. . . . [O]ur own research and that of others in the field show conclusively that a few hours of assessment do yield useful information for admissions decisions.”2
They went on to say that "[t]ests also predict outcomes beyond college. . . . [T]he tests predict not only grades but also several other important outcomes, including faculty evaluations, research accomplishments, degree attainment, performance on comprehensive exams and professional licensure."3 And they were addressing a three-hour standardized test. How much more might we learn about an applicant from a two-day test that examines not only knowledge and analytical ability, but writing ability and the ability to choose among facts, organize a response, and reason to a correct conclusion?
The bar exam does not, as NCBE concedes, test every skill necessary to succeed as a lawyer. But neither does the SAT or the ACT. Yet Kuncel and Sackett nonetheless concluded that "[t]hough originally intended as a measure of 'book smarts,' [the SAT and ACT] also correlated with successful outcomes at both school and work." For example, they noted that in a 2008 study, the top 1% of performers at age 13, when viewed 20 years later, were "very highly accomplished, with high incomes, major awards and career accomplishments." And the study found that "[t]hose in the top quarter of the top 1% were more likely . . . to have high incomes, patents, doctorates and published literary works and STEM research" than those in the bottom quarter of the top 1%, showing a high degree of predictive ability.4
In short, the research simply doesn't support those who claim that a two-day bar exam really can't tell regulators much—or anything—about the knowledge, skills, and abilities of test takers. Instead, the research reveals that such tests can offer legitimate information from which to predict future performance. And, of course, law schools regularly rely on the results of a three-and-a-half-hour standardized test—the LSAT—when selecting students for admission.
Detractors also claim that the MBE is merely a test of rote memorization. It’s not. No exam question would, for instance, ask an examinee to simply regurgitate the rule regarding offers or acceptances in contract. Rather, an exam question would present a fact pattern that requires the examinee to analyze whether, given the facts, an offer has been made and accepted. Does the applicant need to understand what makes up an offer and how an acceptance might occur? Yes, of course. But is that “memorization,” or application of appropriate legal knowledge?
The debate will no doubt continue. But my years with NCBE have convinced me that a high-quality test, thoughtfully written by experts and thoroughly vetted, can provide insight into an applicant's knowledge, skill, and ability to practice law.
My year as chair of the NCBE Board of Trustees is rapidly coming to an end. It has been a year of great change and excitement, and of new ideas. Although change there has been, there has also been a continued commitment to excellence among the NCBE staff and volunteers. My thanks to all of you for your dedication.
The year could not have been the success it has been without the leadership of NCBE’s president and CEO, Judith Gundersen, and the wonderful NCBE staff. While all are outstanding, I wish to specially thank Kellie Early, Laurie Lutz, Vicki Millard, Ellen Embertson-Merrill, Mark Albanese, and Kent Brye for their assistance this year. And, of course, Claire Guback, for making these Letters from the Chair easier to read. The NCBE Board of Trustees has also been supportive, and I am grateful for the creative input it has provided on a wide range of issues during the past year. Next year brings a new leadership team for the Board: Chair Michele Gavagni of Florida, Chair-Elect Cynthia Martin of Missouri, and Secretary Bucky Askew of Georgia. The Board couldn’t be in better hands. And thanks to all of you, our Bar Examiner readers, especially those of you who toil in the fields writing, grading, and administering bar exams. We couldn’t do it without you.
Hon. Rebecca White Berch