I have been studying and writing about the bar exam of late, so I appreciate the guest blogging opportunity, graciously offered by Mary Lynch, which I shall use to share some bar exam musings. Later this week, I hope to follow up with a bit more.
I noted with interest a recent New York Times feature, Is the Bar Too Low to Get into Law School? The feature offered perspectives from five legal professionals, four of whom are law professors, on how best to respond to declining bar exam passage rates. (Scores on the MBE, the anchor of the bar exam in almost every state, have declined again this year.) Two took issue with the bar exam itself, arguing for fundamental changes or its complete abolition. But Linda Sheryl Greene of the University of Wisconsin Law School argued that law schools simply need to do the work of preparing their students for the exam.
Law schools (or at least those not in the very top tier) indeed need to help their students prepare for the bar exam, but the bar exam also has to change in a way that allows law schools to do their part without the deleterious distraction of the exam’s heavy focus on recall of memorized law. Regrettably, bar exam reform efforts over the last 20 years have not focused on the one part of the exam that actually and exclusively tests lawyer competencies, requiring zero memorization of legal rules. That sadly neglected part of the exam is the performance test, which assigns a specific written lawyering task to be completed using a closed universe of factual materials and legal authorities. About one-fifth of the states do not even administer a performance test. Among states that do, the performance test remains the smallest part of the exam, accorded the least weight in scoring. It is in a very real sense the ugly stepchild of the bar exam.
The behemoth of the bar exam, the MBE, compels examinees to study and memorize a vast body of legal rules. To be fair, the MBE does not test only for knowledge of law. But every skill set evaluated by the MBE—reading comprehension and legal analysis among them—is evaluated also by the performance test. The MBE’s primary value to the overall exam is psychometric—i.e., when scores on other parts of the exam are scaled to the MBE, the overall exam achieves testing reliability. A reasonable level of testing reliability can be achieved if the MBE is weighted at 40% of the overall score. (See page 13 of this article by the National Conference of Bar Examiners’ former Director of Research.) However, the NCBE recommends 50%, a recommendation that most states follow.
What of the rest of the exam? In every state, the remaining part of the score comes mostly from answers to essay questions, which, like the MBE, require memorization and recall of legal rules. If the MBE is testing knowledge of law (and creating more than enough focus on rote memorization), what reason other than inertia is there for essay questions to retain such a significant place on bar exams? Or to remain on bar exams at all? For years, essay questions were the venue for testing knowledge of state-specific law. However, most states now use the NCBE’s Multistate Essay Examination. And, as a growing number of states adopt the Uniform Bar Examination, several are employing other means outside of the bar exam, such as a required seminar, to ensure that new lawyers are familiar with unique attributes of local law.
And that takes me back to the performance test, the most valid of the testing instruments on the bar exam. The performance test was the answer from bar examiners 20 years ago to the recommendations of the MacCrate Report, which called on law schools and bar examiners to increase their attention to lawyering skills. Since then, while the MBE and essay examinations have been expanded, the performance test has remained stagnant. That needs to change. Through careful attention to the various tasks today’s beginning lawyers must perform, examiners should be able to reinvigorate the performance test and expand its skills coverage. They should also be able to increase the inadequate weight given to the performance test in scoring.
As for legal education’s attitude and approach toward the bar, I think an exam that focuses more heavily on skills through performance testing is one that would put law schools in a better position to help their students prepare. Because performance tests do not evaluate substantive knowledge of law, bar preparation specialists in law schools can easily administer performance tests from previous bar exams to students as both formative and evaluative assessments. Legal Writing professors have been using performance test-style problems for many years, especially with first-year students. Clinical professors use them, and, yes, even some doctrinal professors do too. (I compiled a list of articles discussing the use of performance test-based problems by law professors in footnote 269 of my recent article.)