
What if No Child Left Behind Worked and Nobody Realized It? Blame the Media.

The Fact-Check is The Seventy Four’s ongoing series that examines the ways in which journalists, politicians and leaders misuse or misinterpret education data and research. See our complete Fact-Check archive.
If the public knows one thing about No Child Left Behind (NCLB), it’s that the law probably didn’t work. It’s a widely held belief; the problem is that it’s not true.
“The public perception,” says Stanford professor Tom Dee, who has researched the law, “seems to be that No Child Left Behind has failed, but the available research evidence suggests it led to meaningful — but not transformational — changes in school performance.”
Unfortunately, the media has largely not addressed the crucial question of NCLB’s impact on students, favoring instead traditional he-said, she-said reporting or offering crude statistical analyses of its legacy.
Take a recent Washington Post news story on NCLB. Reporter Lyndsey Layton explains the law’s impact: “[NCLB’s] goals were later seen as unrealistic, and the law had unintended consequences: Many schools squeezed out art, science and other subjects to focus on math and reading; cheating scandals erupted; and some states lowered standards so their students would appear more proficient.”
There were in fact unintended consequences of the law, but the Post story never gets around to how NCLB fared on its intended consequence, namely improving student learning. No research is cited; no researchers are quoted.
When I raised this on Twitter, Layton pointed to the National Assessment of Educational Progress (NAEP), a well-regarded, low-stakes national assessment, saying student progress has been “jagged” and “incremental.” From this, she implied that NCLB hadn’t improved academic achievement.
Education researchers Morgan Polikoff, a USC professor, and Steve Glazerman, a senior fellow at the research group Mathematica, quickly jumped in to explain that it’s inappropriate to use data that way.
In an interview, Polikoff said that simply looking at trends in test score data does not allow for a counterfactual — meaning there’s no way of knowing what the scores would have looked like in the absence of NCLB. That’s why sophisticated analyses are necessary to isolate NCLB’s impact.
Dee also said that NCLB can’t be judged by looking at overall NAEP scores, in part because some states already had NCLB-style accountability before the federal law required it, while others didn’t.
Similarly, Matt Di Carlo, a senior research fellow at the Albert Shanker Institute, has written extensively about why using simplistic test score data to judge individual policies is misguided.
Instead, researchers must carefully construct control and treatment groups to determine a policy’s impact. In the case of NCLB that means, for example, comparing Catholic schools (not subject to the law) to public schools; states that had accountability policies pre-NCLB to those that didn’t; or schools just below a cutoff for NCLB sanctions to those just above it.
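To make that concrete, here is a toy difference-in-differences sketch in Python, roughly the logic behind comparing states that had accountability before NCLB to states that didn’t. Every number in it is invented for illustration: the scores are simulated, not real NAEP data, and the code is mine, not any researcher’s actual analysis.

```python
# Toy difference-in-differences sketch. States without pre-NCLB
# accountability are "treated" by the federal law; states that already
# had similar policies serve as the comparison group. All scores are
# simulated for illustration -- not real NAEP data.
import numpy as np

rng = np.random.default_rng(0)

# Simulated average math scores for 20 states per group, before and after NCLB.
treated = {"pre": 230 + rng.normal(0, 2, 20), "post": 237 + rng.normal(0, 2, 20)}
control = {"pre": 232 + rng.normal(0, 2, 20), "post": 235 + rng.normal(0, 2, 20)}

# Each group's change over time...
treated_change = treated["post"].mean() - treated["pre"].mean()
control_change = control["post"].mean() - control["pre"].mean()

# ...and the difference between those changes is the estimated effect.
# The comparison group's trend supplies the missing counterfactual:
# what treated states' scores likely would have done without the law.
effect = treated_change - control_change
print(f"Estimated effect of NCLB-style accountability: {effect:.1f} points")
```

That final subtraction is the counterfactual Polikoff describes: the comparison group’s trend stands in for what would have happened anyway.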
Polikoff and Dee agree that high-quality research on NCLB tends to find small but real gains in student achievement, particularly in math. Dee says the law produced “meaningful, important, and cost-effective improvements.”
On Twitter, Layton expressed surprise that anyone would think that NCLB had led to improved student achievement.
She and the Washington Post are not alone in ignoring the evidence. When the Senate passed a revision to NCLB in July, not a single outlet among The New York Times, The Wall Street Journal, The Los Angeles Times, The Associated Press, USA Today, and The Atlantic cited research about the law’s impacts or spoke to an academic expert.
Instead they all covered the conflict. Teachers unions said this; civil rights groups said that. Republicans want less federal involvement; Democrats want more.
There were a couple of exceptions: The Washington Post’s Wonkblog cited and quoted Stanford professor Sean Reardon. Vox had a solid write-up. NPR put together a decent summary of the research literature in 2014.
This is not a new tendency for our press corps, where false balance and conflict often trump substance. Matt Yglesias recently wrote in Vox about how the media is paying more attention to Hillary Clinton’s email debacle than to Jeb Bush’s tax plan.
In education, the same problem exists: the boxing match is covered; the research is not. It’s charters vs. traditional public schools, reformers vs. unions, opt-outers vs. accountability hawks. Empirical evidence rarely sees the light of day.
Polikoff recently sent a group of education journalists a new, particularly rigorous research paper that found positive outcomes from NCLB. Not one of them covered it, he says.
“Part of the problem is it’s not a sexy thing,” Polikoff explains.  
He’s right: Getting into the weeds of the different ways to evaluate NCLB (y’know, interrupted time series, regression discontinuities) makes readers’ eyes glaze over. But that is part of a reporter’s job: to render complex subject matter understandable so the public can make informed decisions.
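For anyone game to spend thirty seconds in those weeds, a toy regression-discontinuity sketch shows the cutoff idea in a few lines. Again, everything here is simulated: a 3-point “sanction effect” is deliberately built into the fake data so the code has something to recover.

```python
# Toy regression-discontinuity sketch. Schools just below a proficiency
# cutoff face NCLB sanctions; schools just above do not. Comparing the
# two sides right at the cutoff approximates a randomized experiment.
# All data are simulated, with a 3-point sanction effect baked in.
import numpy as np

rng = np.random.default_rng(1)
n = 500
proficiency = rng.uniform(0, 100, n)   # prior-year proficiency rate
cutoff = 50.0
sanctioned = proficiency < cutoff      # treatment status flips at the cutoff

# Later scores: smooth in proficiency, plus the baked-in jump for sanctions.
scores = 200 + 0.5 * proficiency + 3.0 * sanctioned + rng.normal(0, 2, n)

# Fit a line on each side within a bandwidth and compare predictions
# exactly at the cutoff; the gap is the estimated effect of sanctions.
bw = 10.0
below = (proficiency >= cutoff - bw) & (proficiency < cutoff)
above = (proficiency >= cutoff) & (proficiency < cutoff + bw)
fit_below = np.polyfit(proficiency[below], scores[below], 1)
fit_above = np.polyfit(proficiency[above], scores[above], 1)
jump = np.polyval(fit_below, cutoff) - np.polyval(fit_above, cutoff)
print(f"Estimated sanction effect at the cutoff: {jump:.1f} points")
```

Because nothing else about schools changes abruptly at the threshold, the jump right at the cutoff isolates the policy’s effect.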
After all, taking research, and all its complexity, seriously is necessary if we have any hope of expanding effective education policies.
