This is what journalistic malpractice looks like
By Raymond Rose and John Watson
Raymond Rose has been involved with online learning for over two decades. He was part of the team that created the first virtual high school in the US. He is currently focused on digital accessibility issues.
COVID-19 cases are spiking, and schools that had returned to onsite attendance are again shifting to remote learning. For many district leaders, it probably feels like Groundhog Day as they revisit so many of the issues that first arose last spring and resurfaced in the early fall.
Here’s what’s not helping: shoddy analysis that passes for a serious examination of remote learning issues. That this example comes from a senior associate partner at Bellwether Education Partners is bad enough. That the poor analysis is likely to make district leaders’ challenges even more difficult is inexcusable.
Writing at The 74, Chad Aldeman examined “How Much Learning Time Are Students Getting?” and found the answer to be: “In 7 Large School Districts, Less Than Normal — and in 3, They’re Getting More.”
Some of the text is far more damning than the headline. For example, “Through its remote learning schedule, I calculated that Fairfax [VA] was offering less than half of a typical school year to my first-grade son.”
Sounds bad, right? It does sound bad, if 1) you don’t read closely, or 2) you have no idea how online learning actually works.
Why? Because his analysis is based on the wrong data, as he explains:
“I looked for sample schedules in 10 of the country’s largest school districts for students in fifth, eighth and 11th grades, to get a range of ages. These were not all easy to find. Some districts provided clear communications about what they might be able to expect for their children, while others buried the information in teacher labor agreements. In a few cases, mostly in the South, the districts did not provide a revised schedule because there was no change from prior years. From those schedules I could find, I compared the number of hours of live instruction the district planned for each group of students with the state’s requirement for the amount of school time children should receive in a normal year…seven of the 10 districts — Los Angeles; Clark County, Nevada; Wake County, North Carolina; New York City; Montgomery County, Maryland; Fairfax; and Chicago — are all planning to deliver far less instructional time to students than normal.” (emphasis added)
Many of our readers will immediately spot the problem with this analysis. “Live instruction” is only one form of online instruction, and basing an estimate of total instructional time on live instruction alone makes no sense.
For experienced online schools and course providers, far more instruction is delivered via asynchronous methods (videos, text, discussion boards, and other materials) than via live video. The ratio of synchronous to asynchronous instruction varies with many factors, including grade level, academic subject, and school and teacher preferences. But it is clear that mainstream districts have been using far more live-streamed video than experienced online educators do, and there is good reason to think that an over-reliance on live video is sub-optimal.
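To see why counting only live hours is misleading, consider a simple hypothetical. The numbers below are purely illustrative and not drawn from any district’s actual schedule: a district scheduling two hours of live instruction plus three hours of asynchronous work per day is delivering a full instructional day, but a tally of live hours alone would report it as offering 40 percent of normal time.

```python
# Illustrative sketch only -- these hours are hypothetical, not from any real district schedule.
live_hours_per_day = 2.0       # synchronous (live video) instruction
async_hours_per_day = 3.0      # videos, readings, discussion boards, projects
required_hours_per_day = 5.0   # assumed state instructional-time requirement for a normal day

live_only_share = live_hours_per_day / required_hours_per_day
total_share = (live_hours_per_day + async_hours_per_day) / required_hours_per_day

print(f"Counting live instruction only: {live_only_share:.0%} of required time")  # 40%
print(f"Counting all instruction:       {total_share:.0%} of required time")      # 100%
```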
Many district leaders understand this and have been shifting toward learning management systems, online content, and other diverse online instructional strategies, with less reliance on live video. We have heard from several leaders who told us that they are challenged, in part, by parents, school boards, and state policymakers who equate remote learning with live-streamed video. District leaders not only have to figure out the best mix of instruction; they also have to explain their choices, often to a skeptical audience. It is also incumbent on the online learning field to help make clear what high-quality online learning looks like.
These jobs are hard enough already, without shoddy analysis from a generally respected source making things more difficult.