AALS Video Series on Law Teaching

Recently, a fellow blogger sent us a very helpful tool that we wanted to share with our readers.  Last year, during the 2015 AALS Clinical Conference, a series of informative videos was created for law professors about the challenges associated with law teaching.  The entire series is about an hour long, with each individual video running only about five minutes.  These videos address some of the important pedagogical issues that law professors are currently grappling with, such as assessment, adding experiential learning to doctrinal courses, reflection, and technology.

This is the link to the entire series:

Teaching Tips to Think about Early in the New Semester – By Steven Friedland

With the beginning of a new semester upon us, these thoughts and tips are worth keeping in the back of everyone’s mind, whether you are a student or a professor.  This post was written by Steven Friedland.

Flexibility and Mobility in Law School Learning

As a professor who has been teaching for more than two decades, I find it easy to feel like a dinosaur in classes populated mostly by students in their 20s.  But behind that feeling lies the fact that not only do ages change, but cultures do as well.  It is evident that within the born-digital generation, cultural understandings, particularly those involving learning, are different from mine.

While I think cross-cultural competency is more important than ever in this global era, it also applies to us teaching dinosaurs.  I learned in law school in a linear and fixed fashion – go to class, take notes, go to the library, study and prepare for the next class.  Based on studies and my own anecdotal evidence, there is an increasing preference for mobility and flexibility in learning.  I am becoming a believer in both — using Web platforms like TWEN, Blackboard or Moodle as integral parts of a course, and allowing students to have flexibility in where and when they learn.

I am now experimenting with including several flex classes in my doctrinal courses — audiotaped, with an option to take each over a 24-hour period in a self-paced fashion.  These self-paced classes are combined with deliverables — writing an answer to a problem based on the class material and then posting it on the Web platform, or doing some other relevant task based on the material to ensure that some form of learning has occurred.  So far, these classes have been well received; to my surprise, students like the flexibility in when they take class as much as the remote opportunity. I am enjoying shaking it up in this way.  What is the saying?  Even an old dinosaur can learn….


Note-Taking Breaks

In a law school class, there are a variety of note-takers.  Some are the “court reporters,” taking down every word.  Some take far fewer notes, organized within their own schemes. Many students use computers with note-taking programs. I have also had some “deep observers,” who appear to take no notes at all.

But all students seem to rely on the notes they take in putting a course together for deep understanding, especially in the first year of school.  Interestingly, teachers generally do not know how students are taking notes or whether those notes are even accurate.  This is why I have started using a colleague’s technique (yes, I like borrowing good ideas from others, no hiding there) of taking “note breaks” in the middle of a doctrinal class — allowing students to check their notes with other students, particularly about important rules, principles or insights. I usually prompt the break by asking, “What were the most important points in class so far?”  This has several effects.  Everyone perks up, and the students appear present and engaged.  Students also are more likely to ask questions about what has occurred thus far.  I get useful feedback on what I have communicated well and what I have done poorly.  So all the way around, I find it to be a helpful technique. When students walk out of class, they should be able to rely on and have ready access to useful notes.


Retention and Retrieval

Many studies show that experts learn differently from novices.  In any educational process, the goal is to move up the scale, from unconscious incompetence, to conscious incompetence, to conscious competence, to the highest level, unconscious competence.  I know about the lowest level, having been there in law school and many other contexts (just thinking back on the longest years of my life taking piano lessons).  The highest level of competence is epitomized by Captain Sully, the US Airways pilot who landed his commercial plane without engines on the Hudson River.

So what learning features are associated with experts? Experts recognize patterns of information, have deep understanding of material within a domain, organize their information well for ready access, and constantly self-monitor.  We can learn from these characteristics in law school.  It is traditional for law school professors to evaluate student performance through a single final examination (although sometimes mid-terms are also offered).  The traditional summative evaluation framework promotes a particular type of studying.  Students study like crazy just before an exam, and then dump all of their knowledge on the test. (This approach was a familiar one for me when I was in school.) To help students progress from novice to expert, though, we should teach for long-term retention and retrieval.  This can occur through numerous problems and opportunities throughout a course to practice organizing and storing material before a final exam, the use of structures or outlines by which to approach topics, and a greater emphasis on mnemonics, anchor words and other learning devices.  Sometimes, in our desire to cover great swaths of material, we don’t drill as deeply as we could or should.

Ten Questions to Ask Yourself Before Volunteering

As a follow-up to my previous post on “-crastination”, Creativity and the Importance of Downtime, I’m sharing a copy of my favorite handout for helping all of us, students and faculty alike, learn to engage in discernment around saying no, and yes.

Ask yourself these questions

Before volunteering your time, skills & energy to ANYTHING!

  • Is there a chance I will find myself changed by this work?
  • Does this work express my values, the things I say are important to me?
  • Will this put me with people I want to know better?
  • Will doing this help me know myself better?
  • Do I enjoy thinking of myself as a person who would do this?
  • Do I have a special gift to share?
  • When I look back in a year or ten years, will I remember doing this?
  • Will this make me feel more connected or more disjointed?
  • What will I need to say NO to in order to say YES to this?
  • Will it be FUN?


Thanks to Maylin Harndon for sharing her version of this with me.




Teaching Legal Reasoning More Efficiently?

Teaching the traditional analytical skills more efficiently and effectively could provide a much needed opening for broadening the range of skills taught to all law students. In the legal academy’s version of the “Socratic method,” law teachers historically taught the analytical skills “implicitly.” They demonstrated legal reasoning by pushing students away from their raw intuitions of fairness and justice to articulate rules and exceptions, while attending carefully to the inevitable ambiguities of language.

Some law teachers suggest that the process of learning to “think like a lawyer” fundamentally requires time and practice and therefore cannot be significantly speeded up.

Yet the implicit approach has been repeatedly challenged by scholars seeking to teach legal reasoning more explicitly, by naming and explaining how it works.*  (An obsession with the goal of teaching legal reasoning more efficiently was a major thread in two phases of my own legal career when I taught first year civil procedure. I struggled both to teach skills more explicitly and to provide students with opportunities to practice them.)

A recent contribution to this quest by my colleague Jane Winn grows out of her experiment teaching common law legal reasoning to undergraduates. Students were randomly assigned to use either a well-regarded study aid or Winn’s own materials. The materials were also leavened by her own and her colleagues’ experiences teaching foreign LL.M. and J.D. students coming from legal systems rooted in the European continental legal tradition.

Winn’s effort, aimed at law students, is notable in three respects. First, at twenty-nine pages it fills an intermediate-length niche: longer than a typical class “handout,” but shorter than the various book-length alternatives. Second, it covers case briefing, outlining and exam questions, demonstrating how the three are related. Third, it grew out of an attempt to test her teaching method empirically, using random assignment to a control group. Both law students and legal educators should find it a useful contribution.

The 2015 ABA accreditation standards may provide a laboratory in which to test efforts such as Winn’s. Standard 302 now requires law schools to adopt learning outcomes that, under subsection (b), must include legal analysis and reasoning; Standard 314 requires law schools to provide students with both formative assessment (feedback) and summative assessment (final “grades”); and under Standard 315 law schools must engage in “ongoing evaluation of the program of education, learning outcomes, and assessment methods.” At its best, this combination of more intentionally articulated outcomes, feedback to students, and program evaluation could prompt law schools to evaluate the potential for greater efficiency and effectiveness in teaching legal reasoning. I remain hopeful that enough schools will approach this task rigorously and in good faith that at least some progress can be made.

*Winn’s illustrious predecessors include:

  • Leading Legal Realist Karl Llewellyn, whose The Bramble Bush: Classic Lectures on Law and Law School has been assigned to generations of law students;
  • University of Chicago Professor and President and U.S. Attorney General Edward H. Levi, author of An Introduction to Legal Reasoning, originally published in the University of Chicago Law Review and then in book form;
  • Critical Theorist and Harvard Professor Duncan Kennedy, who took the decidedly un-Harvard step of visiting at New England School of Law in his attempt to reach beyond elite students and sharpen his skill at teaching students about the “gaps, conflicts and ambiguities” that underlie the development of the common law. He shared his insights widely with former students moving into teaching careers;
  • My former colleagues Pierre Schlag and David Skover, who produced a short volume early in their careers that catalogued the Tactics of Legal Reasoning (1985).
  • Richard Michael Fischl and Jeremy Paul, Getting to Maybe: How to Excel on Law School Exams (1999)
  • Leading clinical teachers Albert J. Moore and David Binder, Demystifying The First Year of Law School: A Guide to the 1L Experience (2009)

In recent decades much of the heavy lifting in legal reasoning has devolved upon teachers of legal analysis, research and writing. Among the results is a burgeoning literature proposing variations on the syllogistic Issue-Rule-Analysis (or Application)-Conclusion approach to analyzing and writing about legal problems, as well as a variety of textbooks.


Unmasking Assumptions about Employment Outcomes and Legal Education

In an upcoming Wisconsin Law Review article, Robert Kuehn, Associate Dean for Clinical Education and Professor of Law at the Washington University Law School, presents a cogent, well-supported and thoughtful account of the limitations of, and the lessons we can learn from, the existing empirical analysis correlating student enrollment in clinical education with employment outcomes.  Kuehn’s article, entitled Measuring Legal Education’s Employment Outcomes, is particularly powerful because it provides a thorough empirical rejection of the claim, asserted by Professor Jason Yackee, that clinical coursework might actually harm employment outcomes, a claim which attracted some sound-bite attention earlier this year. In what is, perhaps, an unexpected twist, Kuehn demonstrates that using Yackee’s statistical assumptions and methodology would also produce negative correlations for students who participate on law journals or in moot court competitions.  Kuehn argues that one cannot draw any reliable conclusion from Yackee’s 2013 model, and perhaps not from any nationwide statistical model – as opposed to a particularized analysis of one school – about the likely effect of clinical courses (or other activities like law journal or moot court) on employment, and surely not the negative effect Yackee posits. Kuehn points out that, as to clinical coursework, the available evidence (through surveys) indicates that such experiences do aid some students in securing employment.

If you, like me, still become a bit nervous about how much you actually remember from undergraduate statistics courses, do not be alarmed by this post!  You will find Kuehn’s article accessible and a good, quick read, even when he is using terms like “regression analysis,” “granular data” and “variable choices.”  Here are the points made in Measuring Legal Education’s Employment Outcomes that I found most helpful:

  1. Kuehn’s reminder that when one confuses correlation with causation, one is bound to come up with a “misdiagnosis.” One problem with Yackee’s analysis is the lack of granular data to calculate the true employment rate for those who took a clinic (or who did not).  In fact, the data is so poor that “the results never account for more than half of the variability in employment across schools.” (A toy simulation following this list illustrates the correlation-versus-causation point.)
  2. Kuehn’s explanation of the “confounding effect of prestige” and bar passage on employment outcomes.
  3. The problems of validity and reliability raised by analyses which employ information from ABA questionnaires, particularly those self-reports submitted prior to 2014.
  4. The fact that “13% of law schools” provide 80% of the school-funded jobs to law graduates. Not surprisingly, Kuehn found this factor biases many results if you examine nationwide statistics. And when Kuehn removes those jobs from the statistical analysis, Yackee’s correlation with clinical education falls apart even using his own assumptions and methodology.
  5. Yackee’s model yields completely different results if one uses the U.S. News lawyers/judges data versus the academic peer data to control for the possible influence of perceived prestige.
  6. Application of Yackee’s model to “Law Journals” and “Skills Competition” and U.S. News sub-groups also shows no relationship to employment outcomes!
  7. In Yackee’s model, a better ranking is “strongly associated with improved employment outcomes.” However, Kuehn points out that a “closer examination of the relationship between rank and employment indicates that this positive association, although statistically significant when applied across the entire range of top 100 schools, does not hold true for schools ranked 51 through 100 (emphasis added).” 
  8. Kuehn’s documentation of employers who require, “strongly prefer” or identify law clinic experience as a positive factor in hiring, such as the U.S. Department of Homeland Security, legal services and legal aid offices, district attorney and public defender offices, fellowships, and private law firms.
  9. Kuehn’s description of existing National Association for Law Placement (NALP) information, such as the 2011 survey of lawyers with non-profit and government offices; the NALP survey of lawyers in firms of predominantly more than 100 attorneys; the NALP survey of public interest legal employers; and the NALP 2013 presentation on the employment market reporting that “law firms say they want new graduates to have ‘more experiential learning, client-based and simulation.’”
  10. Kuehn’s provision of good information on other employer data, such as the Lexis-Nexis white paper “Hiring Partners Reveal New Attorney Readiness for Real World Practice,” Professor Neil Hamilton’s employer survey determining the relative importance of twenty-one different competencies in employer hiring decisions, and Professor Susan Wawrose’s legal employer focus groups, which found that employers prefer new hires with “well developed professional or ‘soft’ skills” along with “strong fundamental practice skills.”
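
For readers who want to see the correlation-versus-causation point in miniature, here is a small, purely illustrative simulation of the kind of confounding Kuehn describes. It is my own sketch, not Kuehn’s or Yackee’s data or model; the “prestige” and “clinic_share” variables and all of the numbers are invented for the example. It shows how a confounder that drives both clinic enrollment and employment can make clinic participation look harmful even when its true effect is zero.

```python
# Purely illustrative sketch (invented numbers, not Kuehn's or Yackee's data):
# a confounder ("prestige") drives both clinic enrollment and employment,
# while clinic enrollment itself has zero true effect on employment.
import numpy as np

rng = np.random.default_rng(0)
n_schools = 200

# Hypothetical school-level prestige score (higher = more prestigious).
prestige = rng.normal(size=n_schools)

# Assume clinic enrollment runs somewhat higher at less prestigious schools.
clinic_share = 0.5 - 0.15 * prestige + rng.normal(scale=0.1, size=n_schools)

# Employment is driven by prestige plus noise; the true clinic effect is zero.
employment = 0.7 + 0.10 * prestige + rng.normal(scale=0.05, size=n_schools)

# Naive correlation: clinic enrollment looks "bad" for employment.
naive_r = np.corrcoef(clinic_share, employment)[0, 1]

# A regression that controls for prestige recovers a clinic coefficient near zero.
X = np.column_stack([np.ones(n_schools), clinic_share, prestige])
coef, *_ = np.linalg.lstsq(X, employment, rcond=None)

print(f"naive correlation (clinic vs. employment): {naive_r:.2f}")    # strongly negative
print(f"clinic coefficient, controlling for prestige: {coef[1]:.3f}") # approximately zero
```

Once the confounder is accounted for, the apparent negative relationship disappears, which parallels Kuehn’s points about the confounding effects of prestige and school-funded jobs.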

Professor Kuehn concludes by recommending that studies could best be done on a school-by-school basis by “surveying likely employers to find out what educational experiences of students are most valued.”  He also recommends that schools “retrospectively look at various employment outcomes for graduates and any relationship” to students’ experiences while in school.

I agree with Professor Kuehn and am happy to report that Albany Law School, through its faculty Assessment Committee and Admissions Office, is currently engaged in conducting employer focus groups and analyzing what best helps our students obtain employment in their desired career paths.  Until good data and information suggest otherwise, Professor Neil Hamilton’s advice to law students, which Professor Kuehn quotes in his “must read” article, bears repeating:

In this challenging market for employment, a law student can differentiate herself from other graduates by demonstrating to legal employers that the student both understands the core competencies that legal employers and clients want and is implementing a plan to develop these competencies, including an ability to demonstrate that the student has experience with these competencies.

What’s going on in California? “TFARR-recommended” 15 credits of competency training

For those who did not closely follow the California State Bar debate on the requirement of 15 credits of competency training for bar admission (the work of the Task Force on Admissions Regulation Reform, or “TFARR”), I summarize the current status here.  (Although I am currently co-prez of the Clinical Legal Education Association, known as CLEA, this post is not written with that hat on.)  This is my own thinking, albeit informed by the excellent work of the CLEA Advocacy Committee.

The TFARR process was two-staged, over a three-year period, with opportunities for public comment throughout. CLEA participated in that process and submitted five separate comments on the proposals, which are available at http://www.cleaweb.org/advocacy under “Briefs and Other Advocacy” (documents 4-8).

In the end, TFARR recommended 15 credits of competency training which can be achieved in a variety of ways (in addition to how experiential credits can be earned under the new ABA regulations), and which include six credits of summer work. You can read the TFARR Phase II Final Report  at: http://www.calbar.ca.gov/AboutUs/PublicComment/Archives/2014PublicComment/201411.aspx

The process was completed in November 2014, with final TFARR recommendations to the State Bar Board of Trustees (responding to public comments) and unanimous adoption by the Board: http://board.calbar.ca.gov/Agenda.aspx?id=10891&tid=0&show=100008800&s=true#10013881 (agenda item 113). The TFARR Phase II Final Report represents a compromise based on extensive input.

Lately, some confusion has arisen because of a letter posted to the AALS website, authored by a non-standing committee of deans.  The confusion arises for two reasons:

  1. Neither AALS nor this special deans’ committee ever participated in the two-stage TFARR process, and so they appear to be something of “johnny-come-latelies,” and
  2. The letter mistakenly focuses on an earlier draft of the proposal, failing to recognize the compromises already reached in the final version.

I understand that efforts are underway to correct the confusion, which makes me happy, since the deans’ letter is signed by two people whom I have long admired in a variety of contexts.

Other blogs are already exploring the 15-credit proposal and its interesting and creative approach; see, for example, “Kudos to California.”  What do our readers think?

What Makes Your Subject Distinctive?

As law schools continue to develop their learning outcomes, an important question we all should consider is, “What makes my course distinctive?”  For example, in my research on assessment in legal research courses, I was struck by how much the analytical and problem-solving skills developed by legal research instruction are the same as those developed by many other courses in the law school curriculum.  That led me to ask, “What makes legal research instruction distinctive?”  The answer was not simply, as an outsider might suggest, that legal research classes teach tools for finding law (digests, Westlaw, etc.).  Rather, I was struck that legal research instruction is distinctive in the extent to which an effective legal researcher must have an appreciation for the power of taxonomies, must exercise imagination within realistic boundaries of time, cost, and purpose, must be able to ask for help, and must develop strong metacognitive practices (continually questioning, “Is this process working?”).  The difference is one of degree rather than kind, of course, but it is a distinctive difference nonetheless.

Given the narrow focus of legal education, this question of distinctiveness or “value added” seems to be the most critical question I can ask in planning my courses.  Not that the distinctive outcomes of my courses should be the sole, or even dominant, outcomes.  Legal education outcomes require an iterative process and cross-curricular experiences for students to become competent and to enable transfer of learning to new settings.  Yet understanding what makes my outcomes distinctive forces me to justify them and to consider their connections with other law school outcomes.

So what makes my outcomes in Professional Responsibility distinctive?  Certainly the identity of the anticipated users of the doctrine we are learning leads me to emphasize professional identity formation outcomes as important, if not distinctive.  In most law school courses, students are learning the law to serve others and are encouraged to use, interpret, and advocate about the law to achieve a client’s objectives.  In Professional Responsibility, the students will be using the law to advise themselves.  My outcomes include expecting that students will be able to clarify their observational standpoint when considering issues of professional ethics; recognize that self-interest clouds judgment and identify ways to gain more objectivity; and differentiate the approaches to interpreting law that one might use to advocate for a client regarding past conduct from approaches that are wise, ethical, and effective when interpreting the law to guide one’s own future conduct.  Finding effective methods to assess students’ development of these perspectives is a challenge, but I have found that simply asking students to read cases of attorney discipline and ask, “What went wrong with the attorney’s thinking?” is a good place to start.

What makes your course outcomes distinctive?  How has that led to distinctive assessment practices?
