This article originally appeared in The Bar Examiner print edition, Fall 2019 (Vol. 88, No. 3), pp. 25–29.

NCBE produces four exams, three of which are components of the bar examination in user jurisdictions and together compose the Uniform Bar Examination (UBE):

  • The Multistate Bar Examination (MBE)
  • The Multistate Essay Examination (MEE)
  • The Multistate Performance Test (MPT)

The fourth exam is a stand-alone test administered three times per year covering established standards related to a lawyer’s professional conduct:

  • The Multistate Professional Responsibility Examination (MPRE)

The question-writing process for NCBE’s four exams involves multiple stages of drafting, review, and revision by the 65 members of NCBE’s 10 exam drafting committees: seven MBE committees (one for each subject) and one committee each for the MEE, the MPT, and the MPRE.

The work of these volunteer members—from 25 jurisdictions, including practicing attorneys, judges, and faculty members from 35 different law schools—is supported by NCBE’s attorney editors and is integral to the fulfillment of NCBE’s mission to the jurisdictions.

How questions are written and selected for the MBE and for the two written components of the bar exam has been addressed by NCBE staff members in prior articles; in this two-part section, we bring you a unique perspective on the process from members of the drafting committees responsible for writing the questions for NCBE’s four exams.

In this first part, two members of NCBE’s drafting committees—one for the MBE and one for the MPRE—walk through the multiple-choice question-writing process and share their experiences as members of an NCBE drafting committee.

In the second part, in the upcoming Winter issue, learn about the essay and performance test question-writing process from two members of NCBE’s MEE and MPT drafting committees.

DID YOU KNOW?

All questions for NCBE’s four exams are pretested so that their performance can be evaluated before they are used as live questions on a future exam. The pretest process is an important step in verifying that the questions demonstrate acceptable statistical characteristics.

For NCBE’s multiple-choice exams, questions are pretested by appearing on an exam as unscored questions:

  • 25 of the 200 questions on each MBE are pretest questions
  • 10 of the 60 questions on each MPRE are pretest questions

These pretest questions are indistinguishable from the rest and are not used to calculate the examinee’s score.

Examinees’ performance on the pretest questions is carefully evaluated to determine whether these questions meet NCBE statistical and psychometric standards. For example, the question cannot be too easy or too difficult, and it must show evidence that examinees who answered it correctly also tended to obtain high scores on the entire exam.
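NCBE’s exact statistical criteria are not published, but the two properties just described correspond to standard psychometric indices: item difficulty (the proportion of examinees who answer an item correctly) and item discrimination (often measured as the correlation between success on an item and performance on the rest of the test). The Python sketch below illustrates how such indices might be computed from a matrix of pretest responses; the function name, toy data, and flagging thresholds are invented for illustration and are not NCBE’s actual standards.

```python
import numpy as np

def item_statistics(responses: np.ndarray) -> list[dict]:
    """Illustrative pretest screening statistics for each item.

    responses is a (num_examinees x num_items) 0/1 matrix, where 1 means
    the examinee answered that item correctly. The thresholds below are
    hypothetical examples, not NCBE's actual statistical standards.
    """
    stats = []
    total_scores = responses.sum(axis=1)
    for i in range(responses.shape[1]):
        item = responses[:, i]
        difficulty = item.mean()  # proportion correct: too high = too easy
        # Discrimination: do examinees who get this item right also tend
        # to score well on the rest of the test? (The item is excluded
        # from the total to avoid correlating it with itself.)
        discrimination = np.corrcoef(item, total_scores - item)[0, 1]
        stats.append({
            "item": i,
            "difficulty": round(float(difficulty), 2),
            "discrimination": round(float(discrimination), 2),
            # Hypothetical screen: not too easy or too hard, and
            # positively related to overall performance.
            "acceptable": bool(0.25 <= difficulty <= 0.90
                               and discrimination >= 0.15),
        })
    return stats

# Toy data: 5 examinees x 4 items (real pretest samples are far larger).
demo = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
])
for row in item_statistics(demo):
    print(row)
```

An item flagged by such a screen is not automatically discarded; as described in the articles that follow, it returns to the drafting committee to be revised or retired.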


Drafting MBE Items: A Truly Collaborative Process

By Timothy Davis

In 2003, I was invited to attend a meeting of NCBE’s Multistate Bar Examination (MBE) Contracts Drafting Committee, one of the seven drafting committees for the seven subjects covered on the MBE. As I observed the deliberations, the collaborative nature of the process in which committee members were engaged became readily apparent. Over time, I would come to learn more about this collaborative process that culminates in professionally crafted questions that appear on the MBE—and experience how that process has facilitated my developing expertise, affording me the opportunity to be of greater service to law students and professors.

Multistate Bar Examination (MBE)

  • A six-hour, 200-question multiple-choice examination covering seven substantive areas of law: Civil Procedure, Constitutional Law, Contracts, Criminal Law and Procedure, Evidence, Real Property, and Torts. (Visit the NCBE website for a complete MBE Subject Matter Outline.)
  • Produced by NCBE since 1972.
  • Administered by 54 user jurisdictions as part of the bar examination.
  • Developed and scored by NCBE.
  • Purpose: to assess the extent to which an examinee can apply fundamental legal principles and legal reasoning to analyze given fact patterns.
  • 2019–2020 MBE Drafting Committees (one for each of the seven subjects covered on the MBE): 44 members from 23 jurisdictions (Arizona, Arkansas, California, Colorado, District of Columbia, Florida, Idaho, Iowa, Louisiana, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, New Jersey, New York, North Carolina, Ohio, Oklahoma, Pennsylvania, Texas, Virginia, Wisconsin), including 7 practicing attorneys, 9 judges, and 28 faculty members from 24 different law schools.

The Initial Stage: Independently Drafting Questions According to Best Practice Principles

Development of an MBE item, or question, from drafting to administration is a process that can take up to three years. The initial stage of this process begins with individual effort. Each member of the seven drafting committees drafts items independently. Drafters are provided with test specifications that identify the topics to be tested within each subject area (e.g., from the MBE Contracts subject-matter outline, this might be “formation of contracts” or “contract content and meaning”). With this general guidance, the most difficult stage of the drafting process begins—conceptualizing an item. In generating ideas for items, drafters may turn to cases, treatises, Restatements of the Law, and their experiences as professors, practitioners, and judges.

Best Practice Principles for Drafting Items

Multiple-choice questions can be broken down into the following components: the stem (factual scenario); the lead-in (call of the question); and the options (potential answers), which consist of the key (the correct answer) and the distractors (the incorrect answers). (See the breakdown below.)

Components of a Multiple-Choice Question

  • Stem: the factual scenario
  • Lead-in: the call of the question
  • Options: the potential answers
  • Key: the correct answer
  • Distractors: the incorrect answers
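For readers who want a concrete model of this anatomy, it maps naturally onto a small record type. The Python sketch below is purely illustrative: the field names are mine, not an NCBE data format, and the validation encodes the structure described in this article (four options, exactly one of which is the key).

```python
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    """Illustrative model of the item anatomy described above."""
    stem: str            # the factual scenario
    lead_in: str         # the call of the question
    options: list[str]   # the potential answers
    key_index: int       # which option is the key (the correct answer)

    def __post_init__(self) -> None:
        # The options consist of the key and three distractors.
        if len(self.options) != 4:
            raise ValueError("an item has exactly four options")
        if not 0 <= self.key_index < len(self.options):
            raise ValueError("the key must be one of the options")

    @property
    def key(self) -> str:
        return self.options[self.key_index]

    @property
    def distractors(self) -> list[str]:
        """The three incorrect answers."""
        return [o for i, o in enumerate(self.options) if i != self.key_index]
```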

The four principles that guide drafters in item writing are based on best practices derived from psychometric research and specify that items should

  1. be clear and concise;
  2. use only the minimum number of actors and facts necessary to support the correctness of the key and the plausibility of the distractors;
  3. test core concepts rather than trivial or obscure topics; and
  4. assess examinees’ knowledge of legal doctrine and their ability to apply legal reasoning and lawyering skills and strategies rather than rote memorization.

These principles also instruct drafters to aim for a level of difficulty that corresponds to the minimum competency expected of newly licensed attorneys.

Applying the Principles to Each Component of an Item

Drafters attempt to adhere to these principles when working on each component of an item.

The Stem: Particularly in regard to an item’s first component, the stem, drafters aim for clarity. Emphasis is placed on using the minimum number of characters and the minimum amount of background information necessary to support the lead-in and the options. Drafters also seek to avoid developing stems that incorporate subtle gender, racial/ethnic, or regional biases.

The Lead-in: The lead-in, the second component of an item, should be a clear, focused question that frames an examinee’s task based on information presented in the stem. Best practices dictate that drafters write lead-ins that are positively framed complete sentences. Practices to avoid include writing lead-ins that are framed negatively or as sentence fragments or that are unfocused or add facts that do not appear in the stem.

Drafters also attempt to draft different forms of lead-ins to test particular skills. For instance, a predictive outcome lead-in (e.g., “Will the buyer prevail in a breach of contract action against the seller?”) emphasizes an examinee’s ability to synthesize law and facts to predict an outcome. In contrast, a practice-oriented lead-in (e.g., “What objection should the attorney make?”) emphasizes an examinee’s skill at developing an analytical or logical framework for solving a legal problem.

The Options: An item’s third component, the options, consists of the key and three distractors. Best practices for drafting this component include writing options that are concise, do not repeat information from the lead-in, respond to the lead-in, are parallel in style to one another, and are based solely on the information provided in the stem. Obviously, the key must be the unequivocally correct answer. At the same time, each distractor must be plausible and supported by information in the stem.

The Next Stages: Multiple Rounds of Review

NCBE Test Editor Review

The collaboration begins to unfold more fully after a committee member submits a set of drafted items to the NCBE test editor assigned to that particular committee. Committee members also provide a citation to the rule, case, or other legal authority that supports each item. Test editors, all of whom are lawyers, engage in a detailed review of each question. The editor’s review focuses on an item’s compliance with NCBE best practices, including that the information in the stem is clear, the lead-in is posed as a focused question, and the options are parallel and responsive to the lead-in. The editor also checks that the key is in fact correct, that none of the distractors is a plausible second or partial key, that the cited authority supporting the key is accurate, and that the item tests a core concept and presents a realistic factual scenario. Test editors send comments to drafters, who then have an opportunity to respond to the editor’s concerns before items are forwarded to the next stage of the process.

Outside Expert Review

New items, which become a part of each committee’s item pool, are also reviewed by two outside content experts, a practicing lawyer and a law professor, both with subject-area expertise. These reviewers engage in a process similar to that of test editors but focus primarily on whether a question tests a core concept and is realistic and whether the key represents an accurate reflection of the current law on the topic being tested. The reviewers also determine whether the item is at the appropriate level of difficulty, whether more than one option is potentially correct, whether a question is imbued with any subtle bias, and whether each distractor is plausible. External reviewers’ written comments are shared with the entire committee for discussion at the next scheduled committee meeting.

Full Committee Review and Pretesting

Drafting committees meet twice per year. In advance of each committee meeting, committee members review the materials that will be discussed. These materials consist of newly drafted items, items on which the external reviewers have commented, and pretested items that failed to meet NCBE statistical standards (as explained later). Committee members also review items that the committee chair and the test editor have selected to appear on two future MBE exam administration test forms (a “test form” being the collection of items in the order in which they are presented to examinees).

At the meeting, committee members engage in lively discussion that addresses the concerns raised by the test editor and the external reviewers. Along with editing items in response to those concerns, committee members jointly edit items in response to one another’s concerns. If editing fails to resolve a concern, an item will either be assigned to a committee member for further revision after the meeting adjourns or be retired (never placed on an exam). Revised items are typically discussed at the next scheduled meeting. Items that have been revised to address concerns, or for which there are no concerns, are promoted to pretest-ready status.

Following the meeting, the test editor and the committee chair select items to be pretested from this group of pretest-ready items. At each administration of the bar exam, 25 of the 200 items that appear in an MBE exam booklet are items that are being pretested and are therefore not used to calculate the examinee’s score. (The remaining 175 scored items are those that have already successfully passed the pretest process.) Examinees’ performance on these pretest questions is carefully evaluated to determine whether those questions meet NCBE statistical standards and can be included as scored questions on a future MBE. Pretest items that fall outside of these statistical standards are submitted for review by the committee, which has the opportunity either to edit the items at a future committee meeting or to retire them.
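Because pretest items are embedded in the booklet and look exactly like scored items, the distinction exists only in the scoring records. As a rough sketch (the function, positions, and answers below are invented for illustration, not NCBE’s scoring code), a raw score over the scored items could be computed by simply skipping the pretest positions:

```python
def raw_score(answers, answer_key, pretest_positions):
    """Count correct answers on scored items only.

    answers and answer_key are sequences of option letters for every
    booklet position; pretest_positions lists the (0-based) positions
    being pretested, whose results are analyzed separately and never
    contribute to the score. All inputs here are hypothetical.
    """
    pretest = set(pretest_positions)
    return sum(
        given == correct
        for i, (given, correct) in enumerate(zip(answers, answer_key))
        if i not in pretest
    )

# Toy example: a 6-item "booklet" with 2 embedded pretest items.
key = ["A", "C", "B", "D", "A", "B"]
examinee = ["A", "B", "D", "D", "B", "B"]
print(raw_score(examinee, key, pretest_positions=[2, 4]))  # -> 3
```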

Items that are slated to appear as scored items on future administrations are also reviewed at the semiannual meetings. Each such item would already have gone through the process described above—reviewed as a new item by a test editor, external reviewers, and the full committee; pretested; and reviewed again twice by the full committee as part of an upcoming test form.

My Experience Serving on the Committee

At the time I attended my first meeting of the MBE Contracts Drafting Committee, I did not appreciate that it would mark the beginning of one of my most rewarding professional and service endeavors and that it would also facilitate opportunities that extend beyond my committee work. The experience I have gained in drafting multiple-choice questions through my work on the committee has assisted me with the questions I draft for my students, and it has also allowed me to be of service to other law professors by leading workshops on the mechanics and best practices for drafting multiple-choice questions. It has been an honor to work with the members of the committee and NCBE staff and experience their professionalism and service not only to NCBE but also to the legal profession.

Timothy Davis is a professor at Wake Forest University Law School, where he teaches Contracts and Sales. He has served on NCBE’s MBE Contracts Drafting Committee since 2003 and as chair since 2008.

Observations of a Newer Member of the MPRE Drafting Committee

By Marcy G. Glenn

It is said that one should avoid observing the making of laws or sausages—because the process is distasteful (although the results are not). After almost a year of service on the Multistate Professional Responsibility Examination (MPRE) Drafting Committee, I can report that the making of law exams—at least the professional responsibility exam for those wishing to practice law—bears no resemblance to sausage production. Rather, it is a rewarding process undertaken by a committee that works earnestly, methodically, and collaboratively—a process that is anything but distasteful.

How I Joined the MPRE Drafting Committee

The route to my appointment to the MPRE Drafting Committee was a long one, but it was also an early and positive indication of NCBE’s measured approach to the creation of its exams. In 2003, I first met and had the opportunity to work with the committee’s then-chair, Nancy Moore, while I was chairing the Colorado Supreme Court’s Standing Committee on the Colorado Rules of Professional Conduct. In 2013, Nancy, on behalf of NCBE, invited me to attend one of the committee’s thrice-yearly meetings.

Within an hour of the first session, I knew that I hoped to someday join the committee. I listened as the members—six distinguished law professors and one former in-house attorney for a large malpractice insurer—discussed the questions that had been drafted independently by committee members in advance of the meeting. Those conversations revealed the members’ deep knowledge of legal ethics and the broader “law of lawyering,” concern for fairness to examinees, essential understanding of test development, and mutual respect. Despite spending all-consuming, eight-hour days in conference rooms, the committee members appeared to thoroughly enjoy their work.

I then served as an external reviewer—one of several practitioners and professors with ethics expertise who offer additional input on questions approved by the committee at its meetings. I also drafted a series of my own questions in preparation for my attendance at NCBE’s MPRE Question-Writing Workshop in Madison in 2018. The workshop was designed to train subject-matter experts in professional responsibility as question drafters, with the goal of increasing the number of questions in the MPRE question bank in preparation for the exam’s transition to computer-based testing. A few months later, I was invited to join the committee, its members and NCBE staff having had several years to evaluate my substantive knowledge, writing skills, judgment, and demeanor. That was one of the more intense gauntlets in my career as a lawyer.

The Committee’s Work During and Between Meetings

Since my appointment, I have attended four committee meetings. Preparation for meetings consists of reviewing draft questions, which, in turn, entails much reading, thinking, rereading, and rethinking, as well as regular cross-checking against the relevant legal authorities and, of course, revising. At the meetings, that process continues but is greatly enhanced by group discussion, occasional debate, and good humor. Committee members painstakingly work through one question after another—approving many for pretesting, assigning others to individual members for further attention in the coming months, and retiring others (that is, never placing them on an exam). (Ten of the 60 questions on each MPRE are unscored pretest questions; pretesting ensures that questions meet NCBE statistical standards before appearing on a future exam as scored questions.) The process is simultaneously exhausting and exhilarating.

Between meetings, committee members work on their specific drafting tasks. NCBE’s attorney test editors identify specific content areas to be tested from the MPRE subject-matter outline, for example “formation of client-lawyer relationship,” “current client conflicts,” or “candor to the tribunal.” Committee members—following best practice principles for drafting multiple-choice questions—write questions that test those subjects and provide a citation to the Model Rule or other legal authority on which each question is based. NCBE’s test editors provide suggested edits on each question, and then individual members and the test editors exchange plenty of redlining and comments. The committee reviews the draft questions at its meetings. Afterward, external reviewers evaluate the questions not only for content but also for bias, difficulty, clarity, and relevance, and may offer additional comments, which the committee addresses during another round of review and editing.

Of course, NCBE staff members play an enormous role in the committee’s work; committee members receive support in every step of the drafting process from the members of the Test Operations Department who attend our meetings and provide valuable input into our work product, NCBE psychometric experts who train and advise us on test development and assess the success of our questions in pretests, and many others at NCBE who enable us to focus on our work while they handle administrative details. It is a truly collective endeavor.

Transitioning the MPRE to Computer-Based Testing

In 2019, NCBE began transitioning the MPRE from a paper-based to a computer-based delivery platform. The inaugural computer-based version of the MPRE was successfully administered in August 2019, with the full transition to be completed with the March 2020 administration. NCBE staff and committee members are working hard to increase the number of questions available for use on the exam in the new computer-based format. To that end, NCBE has recruited and is training additional subject-matter experts—both practitioners and educators with expertise in the subjects tested on the MPRE—to assist in drafting questions for eventual committee review. The end goal remains constant: to fairly and meaningfully measure examinees’ knowledge and understanding of established standards related to the professional conduct of lawyers.

The End Result

When I joined the MPRE Drafting Committee, I knew that I would love the work. I could not have predicted how fulfilling the professional relationships I’ve made through the committee would be—with my fellow committee members as well as with NCBE staff members. The result of our committee’s hard work is the high-quality exam that jurisdictions rely upon to ensure that newly admitted lawyers have achieved competence in legal ethics. I am proud to be a part of the gratifying process of making that exam.

Multistate Professional Responsibility Examination (MPRE)

  • A two-hour, 60-question multiple-choice examination covering established standards related to a lawyer’s professional conduct. (Visit the NCBE website for the MPRE Subject Matter Outline.)
  • Produced by NCBE since 1980.
  • Administered by NCBE and required for admission by 54 jurisdictions.
  • Developed and scored by NCBE.
  • Purpose: to measure examinees’ knowledge and understanding of established standards related to the professional conduct of lawyers. The MPRE is not a test to determine an individual’s personal ethical values. Lawyers serve in many capacities: for example, as judges, advocates, counselors, and in other roles. The law governing the conduct of lawyers in these roles is applied in disciplinary and bar admission procedures, and by courts in dealing with issues of appearance, representation, privilege, disqualification, and contempt or other censure, and in lawsuits seeking to establish liability for malpractice and other civil or criminal wrongs committed by a lawyer while acting in a professional capacity.
  • 2019–2020 MPRE Drafting Committee: 8 members from 5 jurisdictions (Colorado, Georgia, Massachusetts, New York, Texas), including 1 practicing attorney, 1 retired practitioner, and 6 faculty members from 6 different law schools.


Marcy G. Glenn is a partner with Holland & Hart in Denver, Colorado. She has chaired the Colorado Supreme Court’s Standing Committee on the Colorado Rules of Professional Conduct since its creation in 2003, has chaired and is a current member of the Colorado Bar Association’s Ethics Committee, and has chaired and been a member of the Committee on Attorney Conduct for the United States District Court for the District of Colorado. She has served on NCBE’s MPRE Drafting Committee since 2018.

Contact us to request a PDF file of the original article as it appeared in the print edition.
