Addressing a problem of practice through research partnerships: How should you prepare students for online learning?
Jacqueline Zweig, PhD, senior research scientist at Education Development Center, conducts quantitative research in partnership with education agencies to provide new insights into online learning.
Erin Stafford, project director, facilitates partnerships between researchers and practitioners to address questions of policy and practice.
Prior to the COVID-19 global pandemic, online course-taking had become widespread in K–12 and postsecondary education across the U.S. Then last spring, most schools and districts across the nation suddenly transitioned to various forms of remote learning to continue serving students during COVID-19. Now, as uncertainty about in-person schooling continues this fall, the use of online courses is expected to expand. Preparing for this possibility, educators are actively seeking evidence-based guidance on online learning, only to discover that rigorous research in this area is limited.
To provide educators with the guidance they need, it is essential to establish more research partnerships among online programs, state and local education agencies, EdTech firms, and researchers. True collaborations, in which all parties benefit from partnering, can identify what works for which students and why through rigorous research that makes full use of the wealth of data gathered through learning management systems.
As an example, our research team at Education Development Center (EDC) recently partnered with the Michigan Department of Education and Michigan Virtual to conduct a randomized controlled trial (RCT) to understand the impact of an orientation on subsequent online course completion rates for high school students taking a supplemental online course for the first time. Orientations are a frequently cited best practice and are even required for accreditation, but there is little information about what constitutes an orientation and little evidence of orientations’ effectiveness, particularly with high school students.
To advance understanding of what works to prepare students for online courses, we studied Michigan Virtual’s orientation, Strategies for Online Success (SOS). SOS is an asynchronous orientation to learning online that includes five components: a pre-assessment, three interactive modules, and a post-assessment. Its goal is to prepare students for the transition from taking courses in person to taking them online.
Using data from the Michigan Department of Education and Michigan Virtual on 1,781 first-time online students, we found no significant differences in online course outcomes between students assigned to SOS and those who received the typical supports, which included access to a unit on the basics of using the learning management system. In both groups, nearly 75% of students completed their online courses (that is, earned at least 60% of the course points), 10% dropped their course during the grace period, and 15% did not complete their course or dropped it after the grace period.
While there was no main effect of being assigned to SOS on course outcomes, further analyses suggested that the timing of enrollment may shape the impact of SOS on the likelihood that students drop during the grace period. One hypothesis is that the orientation screened out students who enrolled before the course start date but may not have felt prepared to take an online course. Guided by our study’s findings, Michigan Virtual concluded that students, particularly those who enroll after the course has already started, should not have to wait until they are enrolled to understand what to expect when learning in an online environment. As a result, Michigan Virtual has made SOS available to all students on its website at any time rather than tying it to enrollment. This approach allows students to access the orientation, possibly in consultation with a mentor at their school and/or a caregiver, before enrolling in their online course.
This study is one of only a handful related to K–12 online learning to use an experimental design, and we would not have been able to conduct it without our partnership with the online program and the state department of education. Our partners identified the research question and the problem of practice for us to study. They needed the answer urgently to inform Michigan Virtual’s approach to onboarding students and to understand how to support other online programs in the state. As a result of this collaboration, the research findings are more relevant and useful to the field and are already informing practice in Michigan.
In addition, Michigan Virtual and the Michigan Department of Education have been instrumental in sharing the findings with other online programs that may be considering how to orient students. At conferences like the Digital Learning Annual Conference, our presentations have sparked conversations among researcher and practitioner colleagues and highlighted policy and practice questions related to online learning that still need to be studied. A few of these questions follow:
Should an orientation be optional or mandatory?
What content is important for an orientation? Is an orientation focused on how to use the specific learning management system more effective than an orientation focused on skills for learning online?
What is the purpose of an orientation? Should it screen out students who are not prepared to learn online, or should it prepare all students to be successful?
When are students enrolling in online courses? Do you need to differentiate students’ onboarding experiences based on when they enroll?
What supports are being provided throughout the course to help students be successful in online learning? Are they effective?
To answer these and other practically relevant questions, we believe there need to be more partnerships among researchers, state and local education agencies, EdTech firms, and online learning programs. Together we can better understand students’ needs, study interventions, and make sense of data in the rapidly changing world of online learning.
This research was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305L170008 awarded to Education Development Center (PI: Jacqueline Zweig). The opinions expressed are those of the authors and do not represent views of the Institute or the U.S. Department of Education. This research result used data collected and maintained by the Michigan Department of Education (MDE) and/or Michigan’s Center for Educational Performance and Information (CEPI). Results, information and opinions solely represent the analysis, information and opinions of the author(s) and are not endorsed by, or reflect the views or positions of, grantors, MDE and CEPI or any employee thereof.