March 30th is Final Deadline for Proposals — Emory Law’s Conference on Teaching Transactional Law and Skills (June 1-2, 2018)

Hello All.

The end of the school year is fast approaching and we want to give you one last chance to submit a proposal demonstrating what you are doing to foster excellence in the teaching of transactional law and skills.

Therefore, we’ve extended the proposal deadline through Friday, March 30, 2018.

Please submit your proposals as soon as possible. After March 30th, we’ll turn all of the proposals over to our Program Committee, who will notify those accepted and begin putting together the Program Schedule.

Even if you do not submit a proposal, please register for the conference now.

We are reaching far and wide to embrace the whole community of transactional law and skills educators, so please encourage your colleagues – including new teachers and adjunct professors (both at reduced registration fees) – to join us. It will be a wonderful time to gather, talk, share, teach, learn, and celebrate.


Sue Payne, Executive Director




Re-assessing the “drive” to measure learning outcomes

A recent New York Times editorial, "The Misguided Drive to Measure 'Learning Outcomes,'" by Molly Worthen, prompted me to revisit the purpose-driven nature of what are labeled "educational" trends, and this trend in particular – especially as it relates to legal education. Although quantifying "learning outcomes" has been pushed at the secondary and undergraduate levels for a while, it is now being required of legal educators.  As a lawyer and legal educator directed to set and measure learning outcomes, I have found myself conflicted, in part because I find the limitations and funneling nature of metrics incongruous with the trend's language of education drawn from Bloom's taxonomies.  There is also a current of politicization in the "drive." In the end, the unique responsibilities of law school faculty, both as lawyers and in planning and implementing a school's educational program, are significant. Those roles inform the use and effectiveness of setting goals and actively paying attention to our effect on students – after getting past the lingo, of course.

At the outset, I found myself skeptical of the trend's origins as an "educational" tool at all.  As the Worthen piece points out, although the push to quantify undergraduate learning is about a hundred years old, the real drive grew in the 1980s. That decade's impetus for quantifying teaching came during a time known for materialism, yuppies, the rapid growth of technology, and then-President Reagan's promise to return prayer to school. Given this origin in an era of consumerism and growing talk of "liberalism" in higher education, it is readily inferable that assessing learning objectives was the product of accreditors and the institutions they control, pursuing financial and conservative goals and fostering assessment less as an educational goal and more as a means of control. See David Clemens, "Student Learning Outcomes and the Decline of American Education," August 31, 2016.

Moreover, the idea of "measuring outcomes" is inherently a business or economics concept, or one more aptly used when discussing computers; it thus seems inapposite in legal education unless education truly is primarily a business or the language is adapted.  Outcomes, inputs, and outputs are terms used when major donors, for example, look for data-driven proof that their resources produce something.  Outputs are readily measurable.  Outcomes, the effects on the people served that occur because of the resource use, are much less readily measurable.  A donor then withdraws support when outputs don't exceed inputs or outcomes are unmet.  Worthen notes that outcomes measurement was extended to undergraduate education because accrediting agencies demanded a form of learning assessment as a means of demonstrating fitness to receive federal student financial aid – a sort of stand-in for the donor as fund provider.  The telling part of the development of learning outcomes measurement is its push from employers.  Worthen wrote, "Employers report repeatedly that many new graduates that they hire are not prepared to work, lacking the critical thinking, writing and problem-solving skills needed in today's workplace," quoting then-President Bush's 2006 Commission on the Future of Higher Education report.  I'm hard pressed to find teachers or professors who demanded the development of assessments.

The direct pressure to produce learning outcomes in legal education comes from the ABA – likewise after major law firm employers similarly decried a lack of practice readiness in law graduates. Their complaints arose concurrently with a downturn in the economy that led those same employers to reduce their in-house practice training.  With law schools pushed toward a consumer, business model of delivering education, the slide into measuring outcomes has occurred despite criticism.

And much of what is written about measuring learning outcomes, other than by its developers, is highly critical. Several early critical articles start from the position that significant aspects of education are simply not measurable because, in part, education is a process, not a product, a la Jerome Bruner. Thus, they argue, attempting to "measure" learning outcomes is inconsistent with good education. E.g., James McKernan, Some Limitations of Outcome-Based Education, 8 J. Curriculum & Supervision 343-53 (1993).  That critique suggests measuring learning outcomes assumes knowledge can be broken down into "micro-outcomes" in disregard of the epistemology of knowledge (McKernan), theorized by some, like Hubert Dreyfus, to include background and experience that cannot be readily measured. Others criticize outcomes measurement for shifting the emphasis from learning to an outcome of any kind, thereby ignoring the open-ended nature of education and encouraging rigidification of curriculum and perspective.  E.g., The Unhappiness Principle (November 29, 2012).  In a related vein, still others have criticized the move toward measuring "learning outcomes" because, referencing post-cognitive learning theory, it makes a teacher's success dependent on whether students learn regardless of what students bring to the classroom, and because targeting "outcomes" encourages teaching to the middle or bottom of a class to meet stated outcomes while ignoring other students.

Worthen notes as criticism that the "obsession with testing that dominates primary education [has] invaded universities," pointing out a shift of focus and resources to assessment rather than education.  Consistent with this concern about emphasizing testing and assessment over exploration, other authors have pointed to apprehension about how pressure on students to succeed according to externally measured standards, without cultivating personal morality, has led to law student depression and cynicism. See Lawrence Krieger, Institutional Denial About the Dark Side of Law School, and Fresh Empirical Guidance for Constructively Breaking the Silence, 52 J. Legal Educ. 112 (2002); see also Larry Natt Gantt and Benjamin Madison, Self-Directedness and Professional Formation: Connecting Two Critical Concepts in Legal Education (draft).

At least one critique focused not on educational soundness but on the regulatory purpose of outcomes assessment.  In a piece written for a conservative nonprofit institute, the author wrote, "learning outcomes and assessment are not about education at all; they are about control." David Clemens, "Student Learning Outcomes and the Decline of American Education," August 31, 2016.  The author went on to note how the political sides view outcomes differently, though both see them as a tool for control.  "The right sees [learning outcomes] as a way to enforce professor accountability, increase 'productivity,' and get rid of bad teachers and junk courses.  The left sees [learning outcomes] as a golden opportunity to promote progressivism through ideological outcomes that students must internalize in order to pass." The same author noted by way of example that outcomes such as "build awareness of the history and context of diversity and social justice in [the State]" are "not the sort of thing" that "my conservative friends had in mind" for student learning outcomes.

Assuming we in law schools somehow adapt the economics terminology to legal education, measuring learning outcomes seems to be a policy that will continue for a time.  In legal education, of great significance, of course, is the responsibility of faculty for setting the program of education. Even re-envisioning learning goals as learning outcomes, the questions remain: in law school, what is meant by education, what can and should be measured, and what is there about law and law practice that should form the basis of legal education, measured or not.  In a short piece, "What is Education? Insights from the World's Greatest Minds," Marilyn Price-Mitchell, Ph.D., offers, among other things, the following about education:

            “The principal goal of education in the schools should be creating men and women who are capable of doing new things, not simply repeating what other generations have done.” Jean Piaget

            “Education is what remains after one has forgotten what one has learned in school.” Albert Einstein

            “Education is the most powerful weapon which you can use to change the world.” Nelson Mandela

The law, unlike computers and math, is like the people who make it – reasoned but imperfect, biased, aspirational, short-sighted, flawed, reflective of society, and constantly changing; legal education goals should reflect those dynamics.  Although some schools have sought regional accreditation from the same entities that accredit undergraduate schools, legal education itself remains distinguishable from education at other institutions if only because it is governed by a set of professional standards unique to legal education.  ABA Standard 302 obligates accredited law schools to establish minimum learning outcomes in the following areas: (a) Knowledge and understanding of substantive and procedural law; (b) Legal analysis and reasoning, legal research, problem-solving, and written and oral communication in the legal context; (c) Exercise of proper professional and ethical responsibilities to clients and the legal system; and (d) Other professional skills needed for competent and ethical participation as a member of the legal profession.  "Micro-outcomes" are not required, and "understanding" is one of the outcomes under the ABA.  It seems counterintuitive and inconsistent with the goals of legal education that "micro-outcomes" set for other university degrees should somehow supersede ABA standards. For example, though "understanding" seems not to be an appropriate outcome under Middle States standards, it is clearly an outcome under ABA standards. Certainly for purposes of remaining ABA compliant, it seems appropriate to read general education standards in light of the specialized standards of a legal education, if only to ensure bar passage and ethical participation as a member of our profession.

Not surprisingly, then, beyond knowledge, analytical, and communication objectives, the ABA requires professional skills and ethical responsibility goals aimed at both clients and the legal system we serve. Consistent with the idea that the law is about people – the people who make it and the people who serve and are served by it – ABA Interpretation 302-1 further states that law schools may determine other professional skills, including, among others, "cultural competency."  Thus, while "build[ing] awareness of the history and context of diversity and social justice" may not have been what was initially in the minds of those who pushed for measuring learning outcomes as a way of "increasing productivity," awareness of cultural differences and building cross-cultural competencies is nevertheless a specifically enumerated learning outcome that law schools may set. Any outcomes created under ABA standards are significant to how students may come to learn not only the rules, but to view the law in context and in practice.

Despite the economics origin and terminology of "measuring" "outcomes," and the difficulty of measuring some learning goals in law school – at least those beyond passing the bar exam – perhaps, as a colleague suggested, we might recognize the opening to address more fully the context in which law is made and practiced.  Consistent with our oath's commitment to uphold the Constitution and to ethical participation in our profession, we can decide what can be achieved in addition to knowledge and understanding of procedural law and practice skills.  Start with the aspirational; establish goals that begin with awareness that can be measured, and then hope for the enlightenment that comes from experience after awareness, all the while recognizing we may not need to measure everything.


Deep Dive into Experiential Education in U.S. Law Schools

In May 2017, Eduardo Capulong (Montana) moderated a lunch presentation at the AALS Clinic Directors Conference on the new experiential education requirement found in ABA Standard 303, which requires one or more experiential courses totaling at least six credit hours.  Standard 303 also explains that an experiential course must be a simulation course, a law clinic, or a field placement.

For my presentation, I prepared an abbreviated history of how the ABA Standards addressed experiential education over the past several decades.  What I found most interesting as I explored the history was the hostility to experiential education by some legal educators from the inception of university-affiliated law schools in the 1870s.   After the presentation, some clinic directors and associate deans for experiential education came up and said that they found the history interesting and that I should write something up.

Well, I responded to their suggestion and started digging more into the history of experiential education in U.S. law schools.  The more I dug, the more surprises I uncovered. While I was struck by the hostility toward experiential education in law schools, some of the more surprising findings are:

– concurrent with the hostility toward experiential education was a preference, starting in the 1890s, for a law professoriate with little or no practice experience;

– the AALS membership requirements essentially served an accreditation function in the first half of the 20th century, and the AALS even applied to be the accrediting agency for law schools in 1969;

– the ABA Standards did not require law schools' educational programs to be designed to qualify their graduates for admission to the bar and to prepare them to participate effectively in the legal profession until 1993; and

– something that may not be new but is still surprising: when the ABA first required "substantial instruction" in professional skills in 2005, "substantial instruction" was interpreted to mean only one credit, and some law schools even resisted that minimal amount of experiential education!

I came to realize that the history of experiential education in law schools is primarily a history of some members of the bar, such as Robert MacCrate, and some legal educators pushing the ABA to adopt Standards to nudge law schools to require some experiential education of all law graduates.  Eventually, that digging into the history of experiential education resulted in a recent draft article, The Uneasy History of Experiential Education in U.S. Law Schools.  The article analyzes the history of the ABA's involvement in legal education leading up to the first mention of experiential education in the ABA accreditation standards. Next, it traces the development of the experiential education requirement in the ABA accreditation standards, paying particular attention to how that requirement has evolved from something law schools should offer to something law schools must require. Finally, the article concludes with some suggestions for the future of experiential education in law schools.  If any of this sounds interesting, you may want to check out the article here.  Thank you!

“Will Law Schools See a ‘Trump Bump?'” A Law Student’s Perspective.

Professor Ray Brescia of Albany Law School wrote an enticing article on the possibility that law schools are experiencing an increase in applicants in response to the 2016 election and Trump's victory. The link is below; I strongly encourage you all to give it a read.

While I personally envisioned attending law school and becoming an attorney from a young age, before the notion of a Trump America was conceived, my motivation to succeed nevertheless spiked following the election.

I was lucky to begin law school in 2016, as it is a fascinating time to learn the law and its procedures, and to consider how the President and his administration's actions may accord with or violate them.

I know I am not the only law student who feels this way. Regardless of where on the political spectrum prospective and current law students reside, we are certainly seeing a time when the motivation to make a difference is at a high. Students are taking action, whether it's writing and calling their legislators, forming political rallies to spread their message, or simply beginning to engage in these conversations. Students are motivated to be involved in law and government at both the local and national levels.

I don’t know if the increase in law school applicants is a “Trump Bump,” but I do know that law students are responding to Trump, and want to be involved.


Self-Directed Growth

Cultivating self-directed growth in law students is among the most important roles that law schools have. Sure, knowledge of legal topics and analytical skills remains a priority. However, legal academia has increasingly learned that students need more than that. A 2017 survey of law schools' stated learning outcomes demonstrated that a substantial number of schools included self-directed learning and professional development in their learning outcomes. See the Learning Outcomes Database on the St. Thomas Law School Holloran Center website.

Self-directedness is perhaps the foundational component on which students begin their growth as professionals. It begins with a commitment to evaluate oneself and to accept evaluation by others on the skills and competencies crucial to professional excellence. Dean Natt Gantt and I surveyed over 600 first-year law students at six law schools of various sizes around the country. The results showed that a surprisingly large number, over forty percent, classified themselves in categories that conceded they were not self-directed. Because such surveys usually reflect elevated self-evaluations (due to social desirability bias), the true proportion of students who lack self-directed skills is likely far greater.

We were sufficiently convinced by our work (and that of Professors Neil Hamilton and Jerry Organ) of the need to make the development of self-directedness a priority that, with our faculty's approval, we began a mandatory first-year course that starts the process of self-directed growth. We partnered with our School of Psychology and Counseling to have that school administer both personality and vocational interest testing. Students then receive evaluations from that department and later are paired with a faculty coach who stays with the student through graduation. The student and professor discuss the student's self-evaluation of her strengths and weaknesses, as well as a 360-degree evaluation that includes input from others who know the student. By the middle of the 1L spring semester, our students have developed a written plan for professional development. That plan includes steps each student will take to develop the competencies important to the legal profession. It also includes concrete plans for venturing into the legal world.

The interesting question now is how to measure students' progress in self-directedness. The preparation of written plans will help in this process. By their second and third years, we hope that students will have learned to take ownership of the need to develop the competencies of an effective lawyer and to pursue opportunities proactively. The challenge for us, and for others who have embraced learning outcomes that include self-directedness (and other professional identity competencies), is to generate reliable assessments to determine whether our programs have led to progress in those competencies. Fortunately, a group of law professors from many law schools is working together, with the support of St. Thomas' Holloran Center, to create rubrics that should allow schools to assess the degree to which students have advanced in self-directedness and the other skills associated with an effective lawyer. Rubrics and assessment tools on self-directedness, as well as on other skills associated with professional identity formation, should be available within a year. That may sound like a long time. However, we just celebrated the tenth anniversary of the Carnegie Report last year. The movement to make Carnegie's "third apprenticeship" a reality in law schools is well underway and, one only hopes, will continue to gain momentum.

Help Students Help Themselves: Make Them Put Their Phones Away

I am leading a writing lockdown right now. Am I locked in a room writing? Not quite.  But I am in a quiet room with about 25 other people, mostly students, all similarly focused on a piece of writing.  We are all hoping that at the end of these two hours, we will exit the room with an accomplished piece of work.  With careful planning and thought, we can make that outcome more likely, and not just a happy coincidence. One thing we can do to improve focus and concentration: put our phones away!

I’ve written here before about the challenges we and our students face as we navigate a distraction-filled world.  An earlier blog post and law review article in which I dealt with the negative effect of distraction on learning was titled Teaching the “Smartphone Generation”: How Cognitive Science Can Improve Learning in Law School.  At the time I wrote that law review article, I wanted to call it “Do Smartphones Make Us Dumber?”, but I was advised against that, hence the somewhat more academic and scholarly title. Since the writing of that article in 2013, more research has emerged confirming that our constant attention to phones negatively impacts our ability to pay attention. In fact, this summer I was tickled to see the headline “Are Smartphones Making Us Stupid?” (I guess I could have gone with my title).  My article argued that the constant shifting of attention between work and phone – to check email, text, check social media, and so on – was not multitasking, as many believed, but rather task switching, which negatively impacts mental efficiency.  The new article, which summarized a study recently published by the University of Texas at Austin, came to an even more distressing conclusion: “the mere presence of one’s own smartphone reduces available cognitive capacity.” The study’s authors found that cognitive capacity – the brain’s ability to hold and process data – significantly improved when a participant’s smartphone was in another room while the participant took a test gauging attentional control and cognitive processes.  Even if the phone was turned off or placed face down, the mere sight of one’s own phone seemed to induce “brain drain” by depleting finite cognitive resources.

So, back to the writing lockdown. Here’s how it works.  We invite students to sign up for a two-hour session and to bring a writing project.  We begin the session by encouraging students to put themselves in the best position to accomplish their writing goals.  First, we ask them to identify their goal, that is, to set their intention for what they will accomplish during the session. Second, we have students clear their physical space of any unwanted, unneeded, and potentially distracting material, including encouraging them to put their phones away – not just face down, but in a bag or someplace they cannot see them. I briefly explain why, referencing the recent study.  Third, we guide them in two minutes of deep breathing, helping them prepare mentally for the work at hand. We suggest that distracting thoughts be jotted down so they will not be forgotten but also need not nag at them while they are working.  Finally, we tell them to dive into the work.  This process, I have found, helps set the tone for a productive writing “lockdown.”

By the way, I did not bring my phone to the lockdown, and I accomplished my goal, too:  I wrote this post!

Stone Soup:  Do the Best Continuing Education Programs and Conference Sessions You Can

Don’t you hate it when presenters just talk at you for a whole program?

Adult learners generally do.

That’s why everyone suggests using interactive formats in which the audience regularly participates and doesn’t have just five minutes at the end to ask questions.

I’m sure that most readers of this blog who give presentations are keenly aware of this phenomenon and try to be as interactive as possible, sometimes asking the audience questions during the presentations.

This post describes the Stone Soup Project idea of using continuing education programs to produce and share knowledge about actual practice.

Enrich Your Programs

When your audiences consist of experienced practitioners, you can make your presentations shared learning experiences.  The presenters and members of the audience all can contribute valuable knowledge about actual practice.

As a presenter, you decide what ideas you want to convey.  You also can be strategic in planning questions to elicit things that you want to learn and that would be of interest to the audience.

One of the challenges in using educational programs to collect data is a tension between the goals of having speakers provide material to participants and gathering information from them.  Participants generally want to get information and ideas from the speakers and would be disappointed if the speakers skimp in their presentations.  On the other hand, experienced practitioners often want to share their experiences and learn from their colleagues’ experiences.  So the trick is to find a good balance of presenting and eliciting information.

If you want to use your program to elicit information from the audience, I suggest that you plan to make a record of the discussion and to distribute it after the program.  This preserves the ideas, which otherwise might fade in people’s memories.  It’s easy, doesn’t take a lot of time, and can create real value.

Record the Discussions

In this post, I described how I arranged to record and disseminate CLE presentations about lawyering with planned early negotiation.  I recruited a program organizer to take notes on a laptop, and I gave him this short document describing what to do.  Then I used his notes to write the blog post.

As an alternative, one could make an audio recording, though this approach has some potential problems.  The recording may not yield clear, audible language if the audience is widely dispersed in a large room.  Use of an audio recording also may trigger the need for review by a faculty member’s institutional review board (IRB), as there may be more confidentiality concerns with an audio recording.  By contrast, my instructions to the notetaker were to omit any identifying information, and I told the audience that they could ask that their comments not be included in the notes.

Although a senior staffer at my school’s IRB told me that I didn’t need IRB review or approval, I followed the general principles of ethical research.  I produced this document to be given to participants when they checked into the program.  It includes the essence of informed consent documents without some of the Miranda-warning-type language.  I also described the process at the beginning of my presentation, as illustrated in my PowerPoint slides.

Faculty using educational programs to collect and disseminate information might consult with their IRBs to determine what, if anything, they need to do to comply with any IRB requirements.

Distribute Insights from the Programs

After a presentation, you can prepare materials to distribute to the participants (and perhaps others).  I like to weave the notes into a short document similar to a magazine article or blog post, in which I may add my comments and additional resources.  A simpler alternative is just to distribute unedited notes, though that may not be as useful for readers.

If you are presenting at a continuing education program, your host may arrange to email your summary to the participants and/or post it on its website.

If you present at a conference, you can circulate a sheet for people to provide their legible email addresses, which you can use to distribute the summary.
