What if No Child Left Behind Worked and Nobody Realized It? Blame the Media.

September 30, 2015

Talking Points

No Child Left Behind increased student achievement, but nobody knows it because journalists won’t cover it.

The conventional wisdom on No Child Left Behind is wrong because the press ignored research

The Fact-Check is The Seventy Four’s ongoing series that examines the ways in which journalists, politicians and leaders misuse or misinterpret education data and research. See our complete Fact-Check archive.
If the public knows one thing about No Child Left Behind (NCLB), it’s that it probably didn’t work. It’s a widely held belief; the problem is it’s not true.
“The public perception,” says Stanford professor Tom Dee who has researched the law, “seems to be that No Child Left Behind has failed, but the available research evidence suggests it led to meaningful — but not transformational — changes in school performance.”
Unfortunately, the media have largely sidestepped the crucial question of NCLB’s impact on students, favoring instead traditional he-said, she-said reporting or crude statistical analyses of its legacy.
Take a recent Washington Post news story on NCLB. Reporter Lyndsey Layton explains the law’s impact: “[NCLB’s] goals were later seen as unrealistic, and the law had unintended consequences: Many schools squeezed out art, science and other subjects to focus on math and reading; cheating scandals erupted; and some states lowered standards so their students would appear more proficient.”
There were in fact unintended consequences of the law, but the Post story never gets around to how NCLB fared on its intended consequence, namely improving student learning. No research is cited; no researchers are quoted.
When I raised this on Twitter, Layton pointed to the National Assessment of Educational Progress (NAEP), a widely regarded, low-stakes national assessment, saying student progress has been “jagged” and “incremental.” From this, she implied that NCLB hadn’t improved academic achievement.
Education researchers Morgan Polikoff, a USC professor, and Steve Glazerman, a senior fellow at the research group Mathematica, quickly jumped in to explain that it’s inappropriate to use data that way.
In an interview, Polikoff said that simply looking at trends in test score data does not allow for a counterfactual — meaning there’s no way of knowing what the scores would have looked like in the absence of NCLB. That’s why sophisticated analyses are necessary to isolate NCLB’s impact.
Dee also said that NCLB can’t be judged by looking at overall NAEP scores, in part because some states already had NCLB-style accountability before the federal law required it, while others didn’t.
Similarly, Matt Di Carlo, a senior research fellow at the Albert Shanker Institute, has written extensively about why using simplistic test score data to judge individual policies is misguided.
Instead, researchers must carefully construct control and treatment groups to determine a policy’s impact. In the case of NCLB that means, for example, comparing Catholic schools (not subject to the law) to public schools; states that had accountability policies before NCLB to those that didn’t; and schools just below a cutoff for NCLB sanctions to those just above it.
Polikoff and Dee agree that high-quality research on NCLB tends to find small but real gains in student achievement, particularly in math. Dee says the law produced “meaningful, important, and cost-effective improvements.”
On Twitter, Layton expressed surprise that anyone would think that NCLB had led to improved student achievement.
She and the Washington Post are not alone in ignoring the evidence. When the Senate passed a revision to NCLB in July, none of The New York Times, The Wall Street Journal, The Los Angeles Times, The Associated Press, USA Today, or The Atlantic cited research about its impact or spoke to an academic expert.
Instead they all covered the conflict. Teachers unions said this; civil rights groups said that. Republicans want less federal involvement; Democrats want more.
There were a couple of exceptions: The Washington Post’s Wonkblog cited and quoted Stanford professor Sean Reardon. Vox had a solid write-up. NPR put together a decent summary of the research literature in 2014.
This is not a new tendency for our press corps, where false balance and conflict often trump substance. Matt Yglesias recently wrote in Vox about how the media is paying more attention to Hillary Clinton’s email debacle than Jeb Bush’s tax plan.
In education the same problem exists: the boxing match is covered, the research is not. It’s charters vs. traditional public schools, reformers vs. unions, opt-out-ers vs. accountability hawks. Empirical evidence rarely sees the light of day.
Polikoff recently sent around a new, particularly rigorous research paper, which found positive outcomes from NCLB, to a group of education journalists. Not one of them covered it, he says.
“Part of the problem is it’s not a sexy thing,” Polikoff explains.
He’s right: Getting into the weeds of the different ways to evaluate NCLB — y’know, interrupted time series, regression discontinuities — makes readers’ eyes glaze over. But that is part of a reporter’s job: to render complex subject matter understandable so the public can make informed decisions.
After all, taking research, and all its complexity, seriously is necessary if we have any hope of expanding effective education policies.