BLOOMINGTON – In a first-of-its-kind study, Indiana University researchers have created a new model for studying how particular teaching practices work in the classroom. The study not only answered questions about the optimal timing of feedback on student assignments but also expanded the boundaries and methods of current research on educational practice.
In 2019-20, Ben Motz and Emily Fyfe carried out the first ManyClasses experiment to explore the question of optimal timing for feedback on student assignments. ManyClasses is a massive-scale approach that can ultimately inform more precise recommendations for what works — not just in an abstract “classroom,” but in a wide range of actual classrooms across a broad and varied educational landscape.
“ManyClasses is an educational experiment embedded in dozens of classrooms, spanning student populations, institutions, disciplines and course formats,” said Motz, a research scientist in the IU Bloomington College of Arts and Sciences’ Department of Psychological and Brain Sciences. “By emphasizing diversity across independent samples, the results of a ManyClasses study will help researchers and teachers infer whether a research finding can reliably generalize.”
The paper will be published online the week of July 12 in Advances in Methods and Practices in Psychological Science.
Drawing on tools of digital education that have become ubiquitous since the onset of the COVID-19 pandemic, the study enlisted 2,081 student participants from 38 classes spanning 17 disciplines and 15 campuses at five universities: the University of Minnesota, the University of Michigan, the University of Nebraska-Lincoln, Penn State University and Indiana University. All five universities are members of Unizin, a consortium focused on advancing student success and providing platforms for digital learning, and all contributed learner data aggregated through the Unizin Data Platform.
For the ManyClasses study, participating teachers provided each student with immediate feedback on half of their class assignments and delayed feedback on the other half. The researchers collected the data through Canvas, the online learning management system used by the participating institutions.
Fyfe said that immediate feedback has widely been considered the more effective strategy. However, data from the ManyClasses experiment showed no difference between immediate and delayed feedback.
“The main conclusion of the study — to the great surprise of many teachers — is that there is no overall effect of feedback timing that spans all learning environments,” said Fyfe, an assistant professor in the Department of Psychological and Brain Sciences. “The findings should provide some comfort to teachers. If they take a few days to return feedback, there is no evidence that the delay will hamper their students’ progress, and in some cases, the delay might even be helpful.”
Beyond its specific findings, the ManyClasses study expands the boundaries and methods of current research on educational practices. The approach addresses a long-standing limitation: lab studies often don't match what happens in actual classrooms, while single-classroom studies are not broadly applicable.
“Only a third of replication studies in education find evidence consistent with the original results, which casts doubt on how results can reliably generalize to other contexts,” Motz said. “ManyClasses offers new ways to gauge the efficacy of a single teaching practice across many classroom situations.”
Participating instructors agreed that the study was significant.
“The sheer size and scope of the study were really remarkable,” Penn State assistant teaching professor Melina Czymoniewicz-Klippel said. “The researchers did a wonderful job of recruiting instructors from a range of universities, disciplines and modes of class delivery so that they could gather that large sample size and tease apart some of the factors that may be influencing feedback and student learning outcomes.”
The research team is already preparing for the next ManyClasses study. To expand its reach and streamline the process, Motz is also developing an experimental research tool that integrates with the Canvas Learning Management System, called Terracotta. (He recently won a prize for developing an app called Boost that helps students manage their assignments.)
A prepublication release of the accepted manuscript is now posted to the open access preprint service, PsyArXiv. Additional study authors include Janelle Sherman and Robert L. Goldstone in the IU Department of Psychological and Brain Sciences; Joshua R. de Leeuw, Vassar College; and Paulo F. Carvalho, Carnegie Mellon University. Motz is also affiliated with the IU Pervasive Technology Institute.
This study was supported with supplemental funding from the Department of Psychological and Brain Sciences and the University Information Technology Services’ division of Learning Technologies at IU.