Studying Better Ways to Test Bar Applicants for Minimum Competence: Another Reason to Care about the California Bar Exam (besides the cut score debate)

In addition to her post on Law School Café about alleged correlations between bar exam scores and lawyer discipline (discussed on this blog here), Professor Deborah Merritt recently offered another bar exam-related post. This one provides intriguing historical perspective on the current need to expand the range of skill sets tested on the bar exam. Following up on points made by Professor Derek Muller, Professor Merritt discusses a 1980 study by the California Committee of Bar Examiners, cosponsored by the National Conference of Bar Examiners (NCBE), on adding a clinical component to the bar exam. Several hundred applicants who had taken the July 1980 California Bar Exam volunteered to complete an additional clinical evaluation requiring them, among other things, to interview a client and examine witnesses. Professional actors played the role of clients, akin to the standardized patient role that actors perform for clinical evaluations in medicine. The applicants were scored under an elaborate protocol.

Delving into the statistical results of the study, including comparisons between outcomes on the conventional bar exam and outcomes on the clinical evaluation, Professor Merritt illuminates how crucial it is nearly 40 years later for bar examiners to study and implement alternative assessments of skills not currently evaluated by the bar exam. She points out that, while the study’s results were by no means definitive, they at least suggest “the disturbing conclusion that a significant percentage of conventional bar passers (about two of every five) lack basic practice skills that are essential in representing clients.”

I find this discussion particularly apt in 2017, the 20th anniversary of the first administration of the Multistate Performance Test (MPT), the written skills test now a part of the bar exam in 40 states and D.C.  What started the path toward written performance testing and the MPT? A study conducted by the California Committee of Bar Examiners (cosponsored by the NCBE), possibly the same one referenced by Professor Merritt.  On the occasion of the MPT’s 10-year anniversary in 2007, the Bar Examiner, a magazine published by the NCBE, briefly described the California-based origins of the performance test and indicated that the MPT was ultimately based largely on “the California model.” (The piece, in the November 2007 edition of the Bar Examiner, is apparently not retrievable online.)

Written performance testing was the last meaningful innovation in bar exam testing. In thinking about who might lead an effort toward the next one, introducing greater clinical evaluation, possibly including oral skills, I think not of a top-down effort from the resolutely conservative NCBE, which is focused on getting as many jurisdictions as possible to adopt its Uniform Bar Exam (26 and counting as of this writing). Rather, I think of a bottom-up effort by individual states, perhaps with California in the lead, serving as laboratories for testing methods that could ultimately spread to other jurisdictions, thereby persuading or forcing the NCBE to join.

The history of written performance testing is illustrative of my point. Long before the NCBE went forward with the MPT in 1997, not just California but also Alaska and Colorado devised performance tests of their own and administered them on the bar exam. Indeed, those three states were administering performance tests in the early 1980s, playing an important initial role in advancing the cause of a needed bar exam reform. Here, for example, is a follow-up study of the 1983 California Bar Exam, discussing its two performance tests.

The biggest barrier to innovation at the state level is the NCBE’s influence, which increases with each state that adopts the UBE and thereby constrains itself to offering the conventional exam that the NCBE prescribes. Indeed, both Alaska and Colorado, two of the original performance test states from the 1980s, have adopted the UBE, meaning neither will be doing any more bar exam innovation. That leaves California (and any of the other 23 states that have yet to join the UBE, none of which matches California’s influential profile).

Why the California study or studies did not lead to some form of clinical evaluation beyond written performance tests is unclear, though two obstacles that come to mind are expense and testing reliability. Indeed, the 1980 study that Professor Merritt references summarized one of its findings as follows: “[T]he relatively low reliability, administrative difficulties, and high costs associated with most (but not necessarily all) standardized oral tasks probably precludes even considering them as possible components of a general bar examination. Written tests of clinical skills, on the other hand, are relatively easy to construct, administer, and score. Further, unlike oral tasks, the score on written tasks are moderately correlated with one another.”

It seems worthwhile to revisit those conclusions, given the passage of time and possible advances in testing methods, and given that the medical profession requires clinical evaluation of its applicants. Today, 25 years after the MacCrate Report, 20 years after the advent of the MPT, and 10 years after the Carnegie Foundation Report, the legal profession needs a better bar exam. I join Professor Merritt’s call for a national task force on the bar exam, sponsored by the AALS, the Conference of Chief Justices, the ABA Section of Legal Education and Admissions to the Bar, and maybe even the NCBE. As Professor Merritt writes, such a task force could “study current approaches to the bar exam, develop a more realistic definition of minimum competence, and explore best practices for measuring that competence.”

But I also come back to the states, and to California specifically. There is a vigorous debate going on about whether California should lower its bar exam cut score. That’s an important discussion to have. But I might suggest another one about the California Bar Exam: Shouldn’t California resist the UBE and instead conduct a new study of alternative methods for assessing today’s relevant lawyering skills that are not encompassed by the UBE?

Professor Merritt’s Blog Post on Attorney Discipline and the Bar Exam: Worth a Read!

Our blog has often posted about issues related to licensing lawyers, experiential requirements for admission, the monopolizing power of the NCBE, and the pros and cons of the UBE. Thus, I recommend to our readers an excellent post by our blogger friend Professor Deborah Merritt over at Law School Café on bar exam scores and lawyer discipline. Professor Merritt analyzes an article by Pepperdine Professors Robert Anderson and Derek Muller entitled The High Cost of Lowering the Bar. Professors Anderson and Muller opine that “lowering the bar examination passing score will likely increase the amount of malpractice, misconduct, and discipline among California lawyers.” Merritt objects to any causal inference, noting:

Two key facts, however, weigh strongly against drawing that type of causal inference. First, as Anderson and Muller point out, “[t]here is virtually no discipline in the first 10 years of practice.” If the bar exam measured qualities related to attorney discipline, one would expect to see disciplinary cases emerge during those 10 years. Wouldn’t attorneys with marginal competency (as measured by the current bar exam) reveal their deficiencies during their early practice years?

Second, attorney discipline almost never rests on lack of knowledge about legal doctrine, poor reasoning skills, or bad writing–the skills currently measured by the bar exam. Levin and her colleagues reported that attorneys most often received discipline for failing to communicate with clients (20.0%), lack of diligence (17.93%), and failure to safeguard client property (11.26%). Only 4.14% of disciplinary sanctions related to “competence”–and even some of those cases may have reflected incompetence in areas that are not tested by the bar exam.

My favorite comment by Professor Merritt provides another example from which we should not infer causality (however tempting it might be to some of us who have been hurt by patriarchy):

We should not exclude individuals from a profession based on qualities that merely correlate with misconduct.

To underscore that point, consider this: The strongest predictor of attorney discipline is the y chromosome. Male attorneys are much more likely than female ones to be disciplined. If we want to use correlations to reduce instances of attorney discipline, it would be much more efficient to ban men from the profession, subject them to special character exams, or require them to achieve a higher bar exam score than women. Those actions, of course, would raise special issues of gender discrimination–but they illustrate the drawbacks of predicting malfeasance based on correlations.

These questions and assumed correlations are important ones. Many defend the decreasing bar passage statistics as an appropriate market correction to prevent “undesirables” from entering the profession (a consumer protection argument). However, as Professor Merritt points out, there is much more to unpack here. For example, most misconduct challenges are brought against solo practitioners or small firms. This raises overlapping socio-economic questions: which lawyers could be perceived as easiest to challenge, which lawyers have the best legal defense teams, and which kinds of clients have the most reason to complain.

After teaching for over 28 years and observing which graduates pass the bar on the first try and which do not, I am skeptical of the Anderson-Muller argument. I would love to see the NCBE and other scholars engage in a socio-economic analysis of bar passage and of disciplinary misconduct.

Legislation & Regulation and the Bar Exam

Most readers of this blog will be familiar with the performance test (PT), a portion of the bar exam in 42 states and D.C. (Forty states use the Multistate Performance Test (MPT); examiners in Pennsylvania and California write and administer their own PT.) For states using the Uniform Bar Exam (UBE), the MPT counts for 20 percent of the overall exam score.

I wrote about the performance test previously here. I extolled its virtue as the only part of the exam that exclusively tests lawyering skills, requiring zero memorization of legal rules; and I bemoaned its status as the ugly step-child of the bar exam that gets next to no attention in conversations about exam reform.

Over time, bar examiners have concluded that certain substantive subjects have grown or lessened in importance to law practice, such that they have added subjects to the MBE (e.g., Federal Civil Procedure) or dropped subjects from essays (e.g., Secured Transactions, in some jurisdictions). Why not the same with skills on the PT? Is it not fair to say, for example, that a greater percentage of beginning lawyers today work in fields dominated by regulations than did in 1997 when the MPT was born? Yet the vast majority of PTs to this day test the ability to reason from cases, not from statutes or regulations without the aid of cases.

The anti-regulation bent of the current administration notwithstanding, we live in a heavily regulatory state. For lawyers in numerous specialty areas, including health care law and environmental law, for lawyers working in government agencies, and for lawyers serving as in-house compliance officers, reading, interpreting, and applying statutes and regulations are among the most important skill sets. (Compliance, by the way, has been a growing field, and compliance positions are J.D.-preferred jobs increasingly being filled by newly licensed lawyers.) Many law schools have responded to this reality by adding a 1L course on legislation and regulation to provide law students the needed foundation for practicing law in our heavily regulatory state. (A running list, accessible from here, indicates that about 30 law schools offer a course of this nature in the first year.)

In reviewing summaries of the last 28 MPT items (covering the last 14 exams, back to February 2010), I found only one whose law library consisted solely of statutes and regulations, with no cases. Typically, PTs presenting issues of statutory application include both statutes and cases in the library, and the cases provide the statutory interpretation needed to answer the issue posed. That’s still common law reasoning—a very important skill, to be sure, but not very helpful to a lawyer when the only applicable law is a statute or a regulation.

All of the above helps to explain how pleasantly surprised I was to see a purely statutory issue on the February 2017 performance test on the Pennsylvania Bar Exam. The assigned task was to write a memorandum analyzing and supporting the client’s position on three legal issues raised by opposing counsel in a motor vehicle accident case. One of the issues was whether a driver had violated the state’s law banning texting while driving. The text of the law appeared in the materials, and applicants had to dissect its language and apply it to the facts, all without the aid of case law; the cases included in the materials were relevant only to the other issues. This is basic stuff, but exactly the kind of basic stuff that beginning lawyers must be able to do well.

Teaching Tips to Think about Early in the New Semester, by Steven Friedland

With the beginning of a new semester upon us, these thoughts and tips are worth keeping in the back of everyone’s mind, whether you are a student or a professor. This post was written by Steven Friedland.

Flexibility and Mobility in Law School Learning

As a professor who has been teaching for more than two decades, I find it easy to feel like a dinosaur in classes populated by students mostly in their 20s. But within that notion lies the fact that not only ages change, but cultures do as well. It is evident that within the born-digital generation, cultural understandings, particularly involving learning, are different from mine.

While I think cross-cultural competency is more important than ever in this global era, it also applies to us teaching dinosaurs.  I learned in law school in a linear and fixed fashion – go to class, take notes, go to the library, study and prepare for the next class.  Based on studies and my own anecdotal evidence, there is an increasing preference for mobility and flexibility in learning.  I am becoming a believer in both — using Web platforms like TWEN, Blackboard or Moodle as integral parts of a course, and allowing students to have flexibility in where and when they learn.

I am now experimenting in doctrinal courses with several flex classes — audiotaped, with an option to take each over a 24-hour period in a self-paced fashion. These self-paced classes are combined with deliverables — writing an answer to a problem based on the class material and then posting it on the Web platform, or doing some other relevant task based on the material to ensure that some form of learning has occurred. So far, these classes have been well received; to my surprise, students like the flexibility of when they take class as much as the remote opportunity. I am enjoying shaking things up in this way. What is the saying? Even an old dinosaur can learn….

Note-Taking Breaks

In a law school class, there are a variety of note-takers. Some are the “court reporters,” taking down every word. Some take far fewer notes, within their own organizational schemes. Many students use computers with note-taking programs. I also have had some “deep observers,” who appear to take no notes at all.

But all students seem to rely on the notes they take in putting a course together for deep understanding, especially in the first year of school. Interestingly, teachers generally do not know how students are taking notes or whether those notes are even accurate. This is why I have started using a colleague’s technique (yes, I like borrowing good ideas from others, no hiding there) of taking “note breaks” in the middle of a doctrinal class — allowing students to check their notes with other students, particularly about important rules, principles or insights. I usually prompt the break by asking, “What were the most important points in class so far?” This has several effects. Everyone perks up, and the students appear present and engaged. Students also are more likely to ask questions about what has occurred thus far. I get useful feedback on what I have communicated well and what I have done poorly. So all the way around, I find it to be a helpful technique. When students walk out of class, they should be able to rely on and have ready access to useful notes.

Retention and Retrieval

Many studies show that experts learn differently than novices. In any educational process, the goal is to move up the scale, from unconscious incompetence, to conscious incompetence, to conscious competence, and finally to the highest level, unconscious competence. I know about the lowest level, having been there in law school and many other contexts (just thinking back on the longest years of my life taking piano lessons). The highest level of competence is epitomized by Captain Sully, the US Airways pilot who landed his commercial plane without engines in the Hudson River.

So what learning features are associated with experts? Experts recognize patterns of information, have deep understanding of material within a domain, organize their information well for ready access, and constantly self-monitor. We can learn from these characteristics in law school. It is traditional for law school professors to evaluate student performance through a single final examination (although sometimes midterms are also offered). The traditional summative evaluation framework promotes a particular type of studying. Students study like crazy just before an exam, and then dump all of their knowledge on the test. (This approach was a familiar one for me when I was in school.) To help students progress from novice to expert, though, we should teach for long-term retention and retrieval. This can occur through numerous problems and opportunities throughout a course to practice organizing and storing material before a final exam, the use of structures or outlines by which to approach topics, and a greater emphasis on mnemonics, anchor words, and other learning devices. Sometimes, in our desire to cover great swaths of material, we don’t drill as deeply as we could or should.

New York Proposes “Experiential Learning Requirements” as Condition of Licensure: CLEA and NYS Bar Committee Respond

Readers of this blog and followers of the NCBE’s expansion will remember that this past spring New York became the 16th state to adopt the Uniform Bar Examination (UBE), changing its longstanding bar admission requirements. Many voices opposed adoption, including the New York State Bar Association (NYSBA) (see the Committee on Legal Education and Admission to the Bar (CLEAB) report of 10-29-2014 and the vote of the House of Delegates), the Clinical Legal Education Association (CLEA), and the Society for American Law Teachers (SALT). Despite these and other opposing voices, the proposal was adopted, with the new changes going into effect for the July 2016 bar examination.

During discussion of the adoption of the UBE, the Court was encouraged to include clinical or experiential requirements for licensing so that lawyers admitted to the New York Bar would be ahead of the curve, a position I firmly support. On the opposite coast, California had been engaged in a multi-year process examining licensure and professional readiness, which resulted in a proposal requiring 15 credits of experiential learning before admission. In response to the movement to incorporate experiential learning into bar admission, the New York Court of Appeals formed a Task Force on Experiential Learning and Admission to the Bar. Just last month, that Task Force requested comments on its proposal that

New York adopt a new mechanism for ensuring that all applicants for admission to the bar possess the requisite skills and are familiar with the professional values for effective, ethical and responsible practice. In light of New York’s diverse applicant pool, and in an effort to accommodate the varying educational backgrounds of applicants, the Task Force suggests five separate paths by which applicants for admission can demonstrate that they have satisfied the skills competency requirement.

The New York Law Journal examined the proposal in an article found here. In addition, the Honorable Jenny Rivera, chair of the Task Force, attended a meeting of NYSBA’s CLEAB to explain the proposal and answer questions.

It is heartening that the Court is concerned about, and wants to require, the development of essential lawyering skills and the acquisition of professional values. However, without more, Pathway 1 of the current proposal will not actually ensure that applicants to the bar experience the kind of skill development and value formation that the Task Force desires. Pathway 1, referencing new ABA standards, requires schools to confirm that they have published their “plan for incorporating into their curriculum the skills and professional values that, in the school’s judgment, are required for its graduates’ basic competence and ethical participation in the legal profession.” It also requires law schools to certify that law graduate applicants for admission “have sufficient competency in those skills and sufficient familiarity with those values,” as made publicly available on the law school’s website. Although Judge Rivera believes that the certification process described in Pathway 1 can have some real bite, as pointed out in comments submitted by the Clinical Legal Education Association (11.9.15 CLEA Submission on Experiential Requirement), Pathway 1 simply mirrors the experiential training requirements already mandated by the American Bar Association.

New York’s law school deans, not unexpectedly, submitted comments supporting the “flexibility” of Pathway 1. The CLEAB report to the Experiential Task Force expressed concern that, without additional content to Pathway 1, “little will be accomplished” by the proposal. And as one member of the NYS bar committee argued, “what law school is going to admit that one of its graduates did not acquire the skills or values promised on its website?”

In my opinion, the most important concern is whether applicants to the bar have ever represented or interacted with a client, or operated as a lawyer, in a live setting under guided, experienced supervision before admission. In its comment to the Task Force, CLEA urges that a “three-credit clinical training requirement” be added for all J.D. applicants to the New York Bar. This makes sense. Law school clinics and faculty-supervised externships are designed to create the very kind of skill development and value acquisition with which the Court is concerned. And clinical faculty have developed the formative assessment tools to maximize skill and professional identity formation.

I am hopeful that, in its next iteration of the proposal, the Task Force will heed CLEA’s and CLEAB’s comments and come back with recommendations that will ensure applicants for the bar are ready to engage in competent, ethical and professional representation of New York’s citizenry, corporations, and not-for-profits.

Bar Exam Musings, Part II: Skillfully Changing the Bar Exam Narrative

There really needs to be a paradigm shift in the way the National Conference of Bar Examiners and state bar examiners approach potential reform of the exam. It should not be so novel an idea to increase the range of skills tested on the bar exam (or at least to enhance the testing of existing skills) instead of increasing the number of subjects tested. Adding Federal Civil Procedure as the seventh subject on the MBE, as the NCBE just did this year, is not helping. An expanded MBE exacerbates the already heavy imbalance in favor of testing for content knowledge over testing for professional skills.

Granted, some skills do not lend themselves to being tested on a standardized exam, but some very well could. Has the NCBE done a careful study of the skills coverage of the Multistate Performance Test akin to its review of the subject coverage of the MBE that led to the adding of Civil Procedure? I have seen little evidence that it has.

Consider a few skill sets as examples. The vast majority of newly licensed lawyers responding to a recent job analysis survey indicated that their jobs require them to investigate and gather facts. A similarly large majority indicated that their jobs require them to develop strategy for client matters. The MPT is supposed to test these skill sets, but has it? My review of the last 10 years’ worth of MPT questions suggests that it has not; rather, it has focused consistently on basic legal and factual analysis written in the form of a memo, brief, or client letter. (Not that there’s anything wrong with that; it’s just that there is something wrong with having only that.) Moreover, among the documents MPT examinees are told they could be asked to produce are a discovery plan and a witness examination plan, but I have never seen either assigned.

Surely, if the MBE deserved review to determine if it needed another subject, the MPT deserves review to determine how it can expand to test more skills and more often.

In the same vein, there is the question of whether and how to test legal research, which has gotten some attention and has been studied by the NCBE. Even legal writing, though a fundamental part of completing an answer to an MPT or essay question, is not really tested on its own merits.

Musings on the Bar Exam and Legal Education’s Attitude toward it

I have been studying and writing about the bar exam of late, so I appreciate the guest blogging opportunity, graciously offered by Mary Lynch, which I shall use to share some bar exam musings. Later this week, I hope to follow up with a bit more.

I noted with interest a recent New York Times feature, Is the Bar Too Low to Get into Law School? The feature offered perspectives from five legal professionals, four of whom are law professors, on how best to respond to declining bar exam passage rates. (Scores on the MBE, the anchor of the bar exam in almost every state, have declined again this year.) Two took issue with the bar exam itself, arguing for fundamental changes or its complete abolition. But Linda Sheryl Greene of the University of Wisconsin Law School argued that law schools simply need to do the work of preparing their students for the exam.

Law schools (or at least those not in the very top tier) indeed need to help their students prepare for the bar exam, but the bar exam also has to change in a way that allows law schools to do their part without the deleterious distraction of the exam’s heavy focus on recall of memorized law. Regrettably, bar exam reform efforts over the last 20 years have not focused on the one part of the exam that actually and exclusively tests lawyer competencies, requiring zero memorization of legal rules. That sadly neglected part of the exam is the performance test, which assigns a specific written lawyering task to be completed using a closed universe of factual materials and legal authorities. About one-fifth of the states do not even administer a performance test. Among states that do, the performance test remains the smallest part of the exam, accorded the least weight in scoring. It is in a very real sense the ugly step-child of the bar exam.

The behemoth of the bar exam, the MBE, compels examinees to study and memorize a copious number of legal rules. To be fair, the MBE does not test only for knowledge of law. But every skill set evaluated by the MBE—reading comprehension and legal analysis among them—is evaluated also by the performance test. The MBE’s primary value to the overall exam is psychometric—i.e., when scores on other parts of the exam are scaled to the MBE, the overall exam achieves testing reliability. A reasonable level of testing reliability can be achieved if the MBE is weighted at 40% of the overall score. (See page 13 of this article by the National Conference of Bar Examiners’ former Director of Research.) However, the NCBE recommends 50%, a recommendation that most states follow.
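For readers curious about the mechanics, here is a minimal sketch in Python of the linear mean/standard-deviation rescaling that “scaled to the MBE” describes, combined with a weighted total. All scores and names below are hypothetical illustrations, and real bar exam scoring involves additional psychometric steps (such as equating the MBE across administrations) that this sketch omits.

```python
# A minimal sketch (hypothetical numbers throughout) of scaling written
# scores to the MBE distribution and combining the parts with weights.
# Actual bar exam psychometrics involve equating and other adjustments
# not shown here.
import statistics

def scale_to_mbe(raw_written, mbe):
    """Linearly rescale raw written scores to the MBE's mean and spread."""
    w_mean, w_sd = statistics.mean(raw_written), statistics.stdev(raw_written)
    m_mean, m_sd = statistics.mean(mbe), statistics.stdev(mbe)
    return [m_mean + m_sd * (w - w_mean) / w_sd for w in raw_written]

# Hypothetical scores for five examinees.
mbe = [120.0, 135.5, 141.0, 150.2, 128.7]
raw_written = [55.0, 62.5, 60.0, 71.0, 58.5]

scaled_written = scale_to_mbe(raw_written, mbe)

# The NCBE-recommended weighting is 50% MBE / 50% written; swapping in
# 0.4 and 0.6 models the 40% MBE weight the article mentions.
totals = [0.5 * m + 0.5 * w for m, w in zip(mbe, scaled_written)]
print([round(t, 1) for t in totals])
```

Because the scaled written scores inherit the MBE’s mean and spread, the combined total keeps the MBE as the exam’s reliability anchor; the weights themselves remain a policy choice, which is the point of the 40% versus 50% debate.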

What of the rest of the exam? In every state, the remaining part of the score comes mostly from answers to essay questions, which, like the MBE, require memorization and recall of legal rules. If the MBE is testing knowledge of law (and creating more than enough focus on rote memorization), what reason other than inertia is there for essay questions to retain such a significant place on bar exams? Or to remain on bar exams at all? For years, essay questions were the venue for testing knowledge of state-specific law. However, most states now use the NCBE’s Multistate Essay Examination. And, as a growing number of states adopt the Uniform Bar Examination, several are employing other means outside of the bar exam, such as a required seminar, to ensure that new lawyers are familiar with unique attributes of local law.

And that takes me back to the performance test, the most valid of the testing instruments on the bar exam. The performance test was the answer from bar examiners 20 years ago to the recommendations of the MacCrate Report, which called on law schools and bar examiners to increase their attention to lawyering skills. Since then, while the MBE and essay examinations have been expanded, the performance test has remained stagnant. That needs to change. Through careful attention to the various skills today’s beginning lawyers have to perform, examiners should be able to reinvigorate the performance test and expand its skills coverage. They should also be able to increase the inadequate weight given to the performance test in scoring.

As for legal education’s attitude and approach toward the bar, I think an exam that focuses more heavily on skills through performance testing is one that would put law schools in a better position to help their students prepare. Because performance tests do not evaluate substantive knowledge of law, bar preparation specialists in law schools can easily administer performance tests from previous bar exams to students as both formative and evaluative assessments. Legal Writing professors have been using performance test-style problems for many years, especially with first-year students. Clinical professors use them, and, yes, even some doctrinal professors have too.  (I compiled a list of articles discussing the use of performance test-based problems by law professors in footnote 269 of my recent article.)
