How large are public education enrollment declines?
There’s been quite a bit of attention to declines in public school enrollments. Some of this attention is specific to certain districts, but the national-level declines are more interesting, because they might have significant effects on policy and practice.
There’s little question that enrollments are declining. But the gap between some of what’s been published and other data sources is large. At the upper end of the estimates, the decline would be hugely impactful, forcing major changes on districts: reductions in teachers and staff through layoffs and/or attrition, possible school closures, and likely a shift toward districts more actively seeking to increase enrollments via new schools and programs. At the lower end of the estimates, the pressure would likely be small enough, and spread over a long enough time period, that the decline per se would not be a major driver of change.
So which is it, large or small?
A major data source in the “large” camp is a series of reports from Tyton Partners, with funding from the Walton Family Foundation. These are both prominent organizations, and I’ve seen and heard these data points brought up several times.
From “Part 1” of the School Disrupted series:
“Tyton’s analysis reveals that traditional district public schools have seen a significant decline in enrollment since 2019, a trend which is accelerating. We estimate a nine percent (9%) decrease in district public school enrollment from Spring 2021 to Spring 2022 – a decline of more than four million students.” (emphasis added)
But the paragraph above actually understates the larger story in the report, because it is followed by this graphic, which shows a 14.6% decrease in current enrollment compared to pre-pandemic. (To their credit, Tyton makes the data sources quite clear: “Pre-COVID-19 figures are 2020 NCES estimates; Fall 2020, Spring 2021, and Spring 2022 enrollment are estimated from proprietary survey data.”)
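As a quick sanity check on how the two headline figures relate (the 45 million district enrollment base below is my own ballpark assumption, not a number from the report), the arithmetic does hang together:

```python
# Rough consistency check on the Tyton claim that a 9% decline equals
# "more than four million students." The enrollment base below is my own
# ballpark assumption, not a figure from the report.
assumed_district_enrollment = 45_000_000  # assumed pre-decline district (non-charter) enrollment
decline_rate = 0.09

students_lost = decline_rate * assumed_district_enrollment
print(f"9% of ~45M district students is ~{students_lost / 1e6:.1f}M students")
# ~4.1M, so the two figures are at least internally consistent with a
# district enrollment base in the mid-40-millions.
```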
The report goes on to explain why this is a big deal:
“Such a decline in district public school enrollment has important consequences for all K-12 education. Lower student enrollment decreases state and federal funding for public schools while most school costs remain constant. Reduced revenue from serving fewer students may result in fewer or lower quality learning spaces and environments, which snowballs into more dissatisfaction with public schools and continued decline in enrollment. And, declining public school enrollment increases pressure on charter and private schools to support an influx of new students. These schools may not yet be prepared to serve more students, particularly those with special needs.”
(Note that the idea that lower enrollment would lead to lower funding for traditional public schools is a reasonable assumption, but not a certainty. Some states have bolstered funding for districts with declining enrollment, and might do so across the board to maintain overall public education funding at roughly current levels.)
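To illustrate why the quoted passage treats this as a squeeze, here is a rough sketch with entirely invented numbers: per-pupil revenue falls with enrollment, while only the portion of costs that can scale in the short run falls with it. The district size, per-pupil revenue, and fixed-cost share below are assumptions for illustration only, not figures from any of these sources.

```python
# Hypothetical district: all numbers are illustrative assumptions.
students_before = 10_000
students_after = 9_100          # a 9% decline, matching the Tyton estimate
per_pupil_revenue = 14_000      # assumed per-pupil revenue tied to enrollment
fixed_cost_share = 0.80         # assumed share of costs that can't scale down quickly

revenue_before = students_before * per_pupil_revenue
revenue_after = students_after * per_pupil_revenue
revenue_lost = revenue_before - revenue_after

total_cost = revenue_before     # assume the budget was balanced before the decline
short_run_cost_savings = (1 - fixed_cost_share) * total_cost * (1 - students_after / students_before)

print(f"Revenue lost:           ${revenue_lost:,.0f}")
print(f"Short-run cost savings: ${short_run_cost_savings:,.0f}")
print(f"Budget gap:             ${revenue_lost - short_run_cost_savings:,.0f}")
```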
The study goes on to estimate enrollment increases in charter, private, and homeschool sectors:
As the real bloggers say, this is big, if true. In our digital learning space, charter schools enroll a disproportionate number of online students compared to traditional public schools. If more students leave traditional public districts for charter schools, more students are likely to find their way into online schools. In addition, homeschooling is often conflated with students learning from home while enrolled in an online charter school, possibly pushing online student numbers even higher. Those pressures in turn might lead traditional districts to start online/hybrid schools, or to better support the ones they already run.
But is it accurate? Other sources suggest that the decline is far smaller. The American Enterprise Institute recently released a study looking at the effects of remote learning on enrollment declines, which included this graphic:
The key here, for our purposes, isn’t that enrollment declines were larger in the “most remote” category (although that is useful data in its own right). The main point is that none of these categories comes anywhere close to a 9% (or 14.6%, for that matter) decrease. The largest appears to be about 5%, the middle category a little over 2%, and the smallest just above 0%. We can’t simply average those numbers because the category sizes are not equal, but they still appear to suggest an overall decline of no more than about 3.5%, and probably less.
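To make the averaging point concrete, here is a minimal sketch of a weighted average using made-up category shares. The decline figures are my approximate readings of the AEI graphic, and the shares are assumptions; the real shares would shift the result somewhat, but nowhere near 9%.

```python
# Approximate declines read off the AEI graphic (see caveats in the text).
declines = {"most remote": 0.05, "middle": 0.022, "least remote": 0.005}

# Hypothetical enrollment shares for each category; the real shares would
# come from the AEI study and would change the weighted result.
shares = {"most remote": 0.25, "middle": 0.40, "least remote": 0.35}

simple_average = sum(declines.values()) / len(declines)
weighted_average = sum(declines[k] * shares[k] for k in declines)

print(f"Simple average decline:   {simple_average:.1%}")
print(f"Weighted average decline: {weighted_average:.1%}")
# With these assumed weights, the overall decline lands around 2-3%,
# well below the 9% (or 14.6%) figures in the Tyton report.
```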
Another source, the Burbio enrollment tracker, also doesn’t show declines of that magnitude. As with the AEI data, it’s not a direct comparison, but if a 9–15% drop were occurring it would be showing up there in some form or fashion.
NCES itself doesn’t seem to back up these numbers either, in its fast facts or overall enrollment data sets. These are not as up to date as the Tyton numbers, and the comparison gets confusing because the NCES totals include charter schools, so to compare accurately you must back the charter school enrollments out.
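The adjustment itself is straightforward subtraction. Here is a sketch with placeholder values; the actual totals would come from the NCES data sets, and none of the numbers below should be read as real NCES figures.

```python
# Back charter enrollment out of NCES public school totals so the comparison
# is district-only, matching what the Tyton report measures.
# All values below are placeholders, not actual NCES figures.
nces = {
    "2019": {"total_public": 50_800_000, "charter": 3_400_000},
    "2021": {"total_public": 49_400_000, "charter": 3_700_000},
}

district_2019 = nces["2019"]["total_public"] - nces["2019"]["charter"]
district_2021 = nces["2021"]["total_public"] - nces["2021"]["charter"]

pct_change = (district_2021 - district_2019) / district_2019
print(f"District-only enrollment change: {pct_change:.1%}")
```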
It’s hard to say what’s going on. A couple of possibilities are:
Tyton researchers have found an important, very large declining-enrollment trend in their surveys, which will show up elsewhere once those slower-to-update data sources publish more recent numbers. In this scenario we are undergoing a seismic shift that will manifest in many changes to all of public education and to digital learning.
Tyton’s numbers are inaccurate, and while an enrollment drop exists, it is much smaller than they are reporting.
If any readers see a flaw in my analysis, or have additional information, please let me know! (info@evergreenedgroup.com)
In the meantime, I would suggest that anyone referencing the Tyton numbers should acknowledge that those data appear to be in conflict with other reputable sources.