In an upcoming Wisconsin Law Review article, Robert Kuehn, Associate Dean for Clinical Education and Professor of Law at the Washington University Law School, offers a cogent, well-supported and thoughtful analysis of the limitations of, and the lessons we can learn from, the existing empirical work correlating student enrollment in clinical education with employment outcomes. Kuehn’s article, entitled Measuring Legal Education’s Employment Outcomes, is particularly powerful because it provides a thorough empirical rejection of the claim, asserted by Professor Jason Yackee and given some sound-bite attention earlier this year, that clinical coursework might actually harm employment outcomes. In a perhaps unexpected twist, Kuehn demonstrates that applying Yackee’s statistical assumptions and methodology would also produce negative correlations for students who participate on law journals or in moot court competitions. Kuehn argues that one cannot draw any reliable conclusion from Yackee’s 2013 model (and perhaps not from any nationwide statistical model, as opposed to a particularized analysis of one school) about the likely effect of clinical courses, or of other activities like law journal or moot court, on employment, and certainly not the negative effect Yackee posits. As to clinical coursework, Kuehn points out that the available evidence, drawn from surveys, indicates that such experiences do help some students secure employment.
If you, like me, still become a bit nervous about how much you actually remember from undergraduate statistics courses, do not be alarmed by this post! You will find Kuehn’s article accessible and a good, quick read, even when he is using terms like “regression analysis,” “granular data” and “variable choices.” Here are the points made in Measuring Legal Education’s Employment Outcomes which I found most helpful:
- Kuehn’s reminder that when one confuses correlation with causation, one is bound to come up with a “misdiagnosis.” One problem with Yackee’s analysis is the lack of granular data to calculate the true employment rate for those who took a clinic (or who did not). In fact, the data is so poor that “the results never account for more than half of the variability in employment across schools.”
- Kuehn’s explanation of the “confounding effect of prestige” and bar passage on employment outcomes.
- The problems of validity and reliability raised by analyses which employ information from ABA questionnaires, particularly those self-reports submitted prior to 2014.
- The fact that “13% of law schools” provide 80% of the school-funded jobs to law graduates. Not surprisingly, Kuehn found this factor biases many results if you examine nationwide statistics. And when Kuehn removes those jobs from the statistical analysis, Yackee’s correlation with clinical education falls apart even using his own assumptions and methodology.
- Yackee’s model yields completely different results depending on whether one uses the U.S. News lawyers/judges data or the academic peer data to control for the possible influence of perceived prestige.
- Application of Yackee’s model to the “Law Journals,” “Skills Competition” and U.S. News sub-groups also shows no relationship to employment outcomes!
- In Yackee’s model, a better ranking is “strongly associated with improved employment outcomes.” However, Kuehn points out that a “closer examination of the relationship between rank and employment indicates that this positive association, although statistically significant when applied across the entire range of top 100 schools, does not hold true for schools ranked 51 through 100 (emphasis added).”
- Kuehn’s documentation of employers who require, “strongly prefer” or identify law clinic experience as a positive factor in hiring, such as the U.S. Department of Homeland Security, legal services and legal aid offices, district attorneys’ and public defenders’ offices, fellowships and private law firms.
- Kuehn’s description of existing National Association for Law Placement (NALP) information, such as the 2011 survey of lawyers in non-profit and government offices; the NALP survey of lawyers in firms of predominantly more than 100 attorneys; the NALP survey of public interest legal employers; and the NALP 2013 presentation on the employment market reporting that “law firms say they want new graduates to have ‘more experiential learning, client-based and simulation.’”
- Kuehn’s provision of good information on other employer sources, such as the LexisNexis white paper Hiring Partners Reveal New Attorney Readiness for Real World Practice; Professor Neil Hamilton’s employer survey to determine the relative importance of twenty-one different competencies in employer hiring decisions; and Professor Susan Wawrose’s legal employer focus groups, which found employers prefer new hires with “well developed professional or ‘soft’ skills” along with “strong fundamental practice skills.”
Professor Kuehn concludes by recommending that studies could best be done on a school-by-school basis by “surveying likely employers to find out what educational experiences of students are most valued.” He also recommends that schools “retrospectively look at various employment outcomes for graduates and any relationship” to students’ experiences while in school.
I agree with Professor Kuehn and am happy to report that Albany Law School, through its faculty Assessment Committee and Admissions Office, is currently conducting employer focus groups and analyzing what best helps our students obtain employment in their desired career paths. Until good data and information suggest otherwise, Professor Neil Hamilton’s advice to law students, which Professor Kuehn quotes in his “must read” article, bears repeating:
In this challenging market for employment, a law student can differentiate herself from other graduates by demonstrating to legal employers that the student both understands the core competencies that legal employers and clients want and is implementing a plan to develop these competencies, including an ability to demonstrate that the student has experience with these competencies.