Opinion: No straight line links remote learning to lower achievement

The average U.S. student missed half a year of learning in math due to the pandemic. "We’re not going to make up for half a year of learning with a few extra days of instruction or by providing tutors to 5% to 10% of kids," said Harvard's Thomas Kane. (Jim Wilson/The New York Times)

The recent release of national scores showing drops in math and reading sparked criticism of how long school districts remained virtual during the pandemic.

Is that criticism fair?

Overall, districts that delayed the return to in-person instruction saw, on average, greater declines on the National Assessment of Educational Progress, a benchmarking test given every two years. However, there is enough variation to make education researchers wary of sweeping indictments of remote learning.

“There is nothing in this data where we can draw a straight line between time spent in remote learning, in and of itself, and student achievement,” said Peggy Carr, commissioner of the National Center for Education Statistics, which, as the statistical wing of the U.S. Department of Education, administers NAEP.

“Even places that were in-person all year still, on average, had very large learning losses on the order of a third to half of a grade level,” said Stanford education professor Sean Reardon, a senior fellow at the Stanford Institute for Economic Policy Research.

Reardon and other Stanford researchers teamed up with the Center for Education Policy Research at Harvard University to create an Education Recovery Scorecard that overlays local and national test data and allows a district-level analysis of pandemic learning loss. They relied on state tests including the Georgia Milestones to measure district-level changes in academic skills, and the NAEP test served “as a kind of Rosetta stone that lets us put these changes on the same scale,” said Reardon.

The Education Recovery Scorecard shows Georgia students lost over four months of learning in math between 2019 and 2022 and nearly two months in reading. Some districts lost more in math. Newton County lost the equivalent of nearly eight months, while Clayton and Savannah-Chatham lost about seven months.

The academic decline patterns in Georgia and other states reveal an unequal impact of COVID-19 on low-income communities, which endured greater economic hardship. “Children in affluent districts didn’t emerge unscathed but were much less harmed by the pandemic than children in more disadvantaged communities,” said Reardon.

“A lot of things were happening that made it hard for kids to learn. One of them seems to be the extent to which schools were open or closed, but that’s only one among many factors,” said Reardon. For example, Florida and Texas largely remained open and experienced smaller declines than the national average in math, but so did some states that relied on virtual learning.

“There was definitely not a one-to-one correspondence between the share of the year schools were remote in a state and the magnitude of their losses,” said Thomas Kane, faculty director of Harvard’s Center for Education Policy Research. “One clear outlier was California, which had smaller losses than most other states, and yet they were remote the longest on average.”

Still to be evaluated are differences in the quality of remote instruction and the toll that rolling school closings, quarantines, and teacher and student absences took on in-person instruction.

Academic declines may be better seen as a symptom of a much larger constellation of ways COVID-19 set kids back. “If we had systematic measures of kids’ mental health and well-being and sense of safety in the world, we probably would see similarly sized declines across the board,” said Reardon.

Kane urged school leaders to consider these losses and the evidence on the efficacy of their intervention strategies. What many districts will find, he said, is that what they’re planning is insufficient to the challenge. “The average kid missed half a year of learning in math. We’re not going to make up for half a year of learning with a few extra days of instruction or by providing tutors to 5% to 10% of kids,” said Kane.

And districts should ratchet up their efforts now. “Don’t wait for spring 2023 state test results to come back to say kids are still way behind. Use these data to reconsider plans for spring and next summer,” said Kane.

Critics of standardized testing argue that NAEP declines are not a calamity and shouldn’t be treated as such. But a new study that Kane co-authored suggests a relationship between progress on NAEP and life outcomes. The study found that growth in eighth-grade math on NAEP correlated positively with high school graduation, college enrollment, and lifetime earnings measured from age 28.

“Before shrugging off the test score declines, parents and policymakers should look at the specific skills on which students lost ground,” said Kane. “If the headlines had been ‘8th graders less able to subtract integers’ or ‘Students less able to recognize when to multiply or divide,’ the stakes for the economy and for students’ lives would be more apparent.”