“Mixing It Up: Interweaving Lecture/Lesson and Retrieval Practice for Better Test Results.”

A colleague recently shared a blog entry aimed at secondary school students on better studying. The entry encouraged students to engage with the material and practice retrieval as part of learning[i] – without quite labeling this active learning. It is grounded in the idea, supported by studies, that most high school and college students spend most of their “study” time re-reading and re-highlighting material.[ii] Re-reading and highlighting give students the “illusion” of having mastered the material[iii] rather than actually helping them learn it. This entry is a brief pitch for still more varied and active studying through retrieval practice, along with some optimism that secondary schools encouraging this kind of practice are better equipping their students to become our students.

Retrieval practice, or the “testing effect,” is the learning benefit that occurs when students are tested or required to retrieve information while they are studying.[iv] Accepting for this entry’s purposes the studies’ conclusion that retrieval practice aids learning, when – and perhaps how – to engage in that retrieval practice seem apt questions.

A recent study by Weinstein, Nunes, and Karpicke, “On the Placement of Practice Questions During Study” (2016),[v] assessed whether practicing retrieval interspersed within a lesson or unit, or waiting until the end of a lesson or unit to practice retrieval, led to greater learning. The study also compared students who had practiced some retrieval questions with students who had no practice questions but merely re-studied material, in both online and live-class settings. The study used three experiments to assess student memory, first during the study phase through quizzes and then, after a delay, through a test.[vi]

The results were interesting and varied depending on whether and when students had practice questions before being tested. The results also seemed to vary depending on whether the material was presented online, so students could interact with it, or in a classroom, although the authors did not discuss this distinction.[vii] Students given practice questions interspersed in a lesson scored higher on an immediate post-lesson quiz than students given practice questions at the end of the lesson, just before the quiz. The authors attribute this difference to the closeness of the questions to the material presented, among other things.[viii] In all three experiments, though, student scores on a test given after a delay of at least a week were roughly the same, regardless of whether students had practice questions during or at the end of a lecture. The time delay before testing seemed to level out any short-term interspersed-question advantage, so that students who received practice questions during a lecture and those who practiced at the end of a lesson had similar delayed test scores. That is good news for mixing up when to give practice questions.

But in all three experiments, students who received some practice questions – whether during a lecture or at the end of a lesson – performed significantly better on the delayed test than students who had no practice questions and simply “re-studied” material. Students who were never called upon to practice retrieval and apply information, but merely re-studied, scored lowest on the delayed tests.[ix]

This consistency – students performing better short-term with questions interspersed during lectures, and similarly long-term with practice questions either interspersed or at the end – led the authors to conclude that practice questions aided student learning. They left for future study whether questions interspersed during a lecture, coupled with questions also given at the end, might yield the greatest benefit on delayed tests.[x]

The Weinstein study results, and the suggestions for improving retention and possibly learning, are interesting not only because they posit study tools that are demonstrably more effective than re-reading or highlighting alone, but also because those results demonstrate the fluidity of the cognitive domain taxonomy on which many teaching and learning exercises are based (Bloom’s taxonomy, as revised by Anderson and Krathwohl). Briefly, the taxonomy is a hierarchical[xi] classification, or ordering, of cognitive skills. The cognitive domain taxonomy is most often, though not always, visualized as a pyramid with knowledge (Bloom) or remembering (Anderson-Krathwohl) at the base. That pyramid, though, is not exactly representative of learning – at least at the college level, as these referenced studies seem to reflect. What can also be inferred from the studies on retrieval practice is that learning occurs not through knowledge or remembering alone, but happens more effectively when the other classifications (create, evaluate, analyze, apply, understand) – maybe even all of them – are engaged and interwoven in students’ active practice. Put a different way: students who rely on flashcards or outlines alone will likely be less successful on tests than those who are more actively engaged in their learning.

As an example of interwoven practice, the guide mentioned above, “A Brief Guide,” sets out five ways of studying that its authors, based on research, suggest are more effective than repeating or re-reading. Those study tools are retrieval practice, questions & answers, concrete examples, spaced practice, and interleaving. Although the guide is aimed at secondary-school students, the tools will sound familiar to those studying teaching and learning theory at the undergraduate and graduate levels. Not surprisingly, the guide suggests that students use all the learning tools rather than just re-study.

The first tool, retrieval practice (remembering), is defined more narrowly in the guide than in the studies. It encourages students to try to recall what they have learned from a chapter by, among other things, creating flashcards. The key to this retrieval practice, of course, is that the student, and not someone else, create the flashcards, because preparing the cards is the stronger retrieval practice, not repeating or reciting from them.

The second and third tools, questions & answers and creating concrete examples for abstract concepts, encourage students to ask questions about material – not mere recall questions, but questions about how and why – and then relate those to life (evaluate, analyze, apply, understand). The guide encourages self-study, which requires students to practice all this questioning on their own and may or may not allow for verification. The Weinstein study, though, demonstrates the efficacy of including those same kinds of questions and hypotheticals both during a class and after class, and in any event demonstrates the importance of some practice questions, whether during or after class.[xii]

The last two study tools involve spacing practice and mixing it up. Spaced practice means doing some studying every day rather than cramming before a test – something most law students seem to learn eventually, even if through trial and error. Mixing it up, or “interleaving,” means students challenge their thinking by making retrieval more difficult through practicing different topics during a given study period.

So, what can we gain from these studies and the guide? It seems the old cliché “practice makes perfect” – or at least proficient – is still valid.

I have to add that I found it heartening that a guide such as this is being offered at the secondary school level, perhaps better preparing students for graduate-level study. And while many schools have finished or nearly finished final exams, graduating 3Ls have the bar exam to look forward to, so there is still time to improve study patterns.

[i] “How Should Students Revise: A Brief Guide,” https://chronotopeblog.com/2018/05/05/how-should-students-revise-a-brief-guide/.

[ii] Karpicke, Butler, & Roediger, “Metacognitive Strategies in Student Learning: Do Students Practise Retrieval When They Study on Their Own?” (2009), http://learninglab.psych.purdue.edu/downloads/2009_Karpicke_Butler_Roediger.pdf (citations omitted) (surveying student study strategies: rereading notes – the largest percentage of students; doing practice problems and using flashcards – the second largest group; rewriting notes and studying with a group – the third most students; “memorise” next; followed by making outlines, practicing self-recall, and thinking of real-life examples in a very distant last place among students’ self-reported learning strategies).

[iii] Koriat & Bjork, “Illusions of Competence in Monitoring One’s Own Knowledge,” https://bjorklab.psych.ucla.edu/wp-content/uploads/sites/13/2016/07/Koriat_RBjork_2005.pdf.

[iv] Id. These studies, which focus on how to study, inevitably rely on the premise that roughly half of the material stored in short-term memory is lost within twenty-four hours, and that more successful learning requires information to be stored in longer-term memory and made easily accessible – sometimes known as “working memory.” Baddeley, A.D., & Hitch, G., “Working Memory,” in G.H. Bower (ed.), The Psychology of Learning and Motivation: Advances in Research and Theory (Vol. 8, pp. 47-89) (1974).

[v] Weinstein, Nunes, & Karpicke, “On the Placement of Practice Questions During Study,” 22 J. Experimental Psych.: Applied No. 1, 72-84 (2016), http://learninglab.psych.purdue.edu/downloads/2016_Weinstein_Nunes_Karpicke_JEPA.pdf. To be sure, these experiments did not involve the kinds of material law students must master, and the authors did posit a distinction between content of differing complexity. The authors used written questions requiring short answers as their measuring tool; they did not include any kind of oral Q & A or hypothetical examination as may be more common in law school.

[vi] Id. The study remained focused by ruling out such factors as diverse educational backgrounds, prior knowledge of the subject matter, variation in question difficulty, and complexity of the learned material. The participants were undergraduate students; the subject matter was always the same; the questions required short-answer responses; and the questions were identical across the experiments. The authors leave the impact of information complexity and different question formats unstudied. They also note a possible positive psychological motivation for students who had practice questions and received feedback before taking the delayed test: students who have already practiced successfully may be more motivated to succeed on the delayed test than those who have not. The authors leave that psychological benefit to future study.

[vii] The material taught and tested included in-text citation requirements in APA style.

[viii] Perhaps this proximity reflects the general sense that short-term memory is strongest during the first twenty-four hours after learning.

[ix] Id.

[x] See Weinstein, supra n. v.

[xi] Bloom’s taxonomies were originally stacked – meaning that to achieve the next level of cognition, the level below must first be achieved. Stacking is emphasized less today.

[xii] One suggestion the authors included was that students were more successful when they received immediate feedback after practice questions, compared with the delayed test, where feedback came only after grading.
