The two scariest words in all of education journalism are “research shows.”
While some writers ignore research altogether, many education journalists quite rightly want to bring empirical evidence to an education debate that is frequently based on intuition and anecdote. But often “research shows” means no such thing, and is actually more like “one high-profile study that I can easily recall shows.”
Here’s a case study: We are now solemnly told that “research shows” incentive pay for teachers “doesn’t work.”
A beginning-of-the-school-year article from Chalkbeat Colorado claims, “Research has shown that paying teachers more money to stay at schools with difficult working conditions largely hasn’t worked.” No citation is offered.
Writing on the same topic in the Los Angeles Times, journalist Kristin Rizga claims, “Politicians have tried to solve the country's teacher attrition problem by giving bonuses to teachers working in high-poverty schools or tying pay to standardized test scores. But recent research shows that such approaches have been mostly unsuccessful.”1
The tone in both cases is matter-of-fact. It’s what research shows! What — do you not believe in science?
But there’s a big problem with these specific statements: the research on the topic is far more complicated. There are many rigorous studies finding benefits of performance pay for teachers or of bonuses for those working in hard-to-staff areas:
A study in Tennessee found that teacher retention improved among those who received a performance bonus; so did a study in Washington, D.C., which also showed that teachers raised their performance in response to performance incentives.
A study of Austin’s performance pay system found gains to student achievement; so did a study of Minnesota's performance-based pay and professional development system.
Two studies in Israel found short-term gains to students of teachers who received performance bonuses.
A study of several unnamed school districts found gains to student achievement from a program that paid high-performing teachers bonuses for transferring to high-poverty schools.
A study of a North Carolina program that gave small bonuses to teachers in math, science, and special education found that it increased teacher retention.
A randomized study of schools utilizing performance pay across the country found small, statistically significant gains in reading achievement.
A study of districts across the country found that those using a performance pay system attracted more academically able teachers, as measured by SAT scores, than districts not using performance pay.
Now, we can and should have an extended debate about what these different studies mean. Were their research designs appropriate? Did they miss any unintended consequences of incentive pay? Would money spent on such programs be better used elsewhere? What about some of the conflicting evidence on performance pay?
These are discussions well worth having. But the simple-minded view that incentive pay doesn’t work is clearly misguided, and one-sentence, uncited dismissals of its effectiveness are misleading.
Even when journalists delve more deeply, they often fall back too easily on the safe and ambiguous standby “the research is mixed.” For example, a story in EdSource concluded that the impact of high school exit exams on students was “uncertain.” But in fact, the research fairly overwhelmingly shows that such exams have no positive effect on student achievement and lead to a host of negative consequences for vulnerable students, including higher rates of incarceration. One wouldn’t know this from EdSource’s article.
Journalists understandably can’t go into a nuanced review of research on every topic in the middle of a news story. But there are some best practices that all of us writers ought to follow:
Whenever saying “research shows,” always cite, ya know, research (or at least a summary of specific research). Too often we are told what research shows with no citation, or with a link to another article that doesn’t clearly state what research is being referred to.
Be wary of basing a summation of research on just one or two studies. Instead, consider multiple studies from different sources or look for meta-analyses, which analyze a topic based on many pieces of research. At the very least, when focusing on one study, emphasize its limitations.
Try to find time to write a standalone article on the research regarding important topics. It will always be difficult to summarize complex research on complex topics in a sentence or a couple of paragraphs. Instead, it’s crucial that empirical evidence on important topics (think charter schools, teacher evaluation, differential compensation, class size) is thoroughly discussed. (At The Seventy Four we’ve tried to do just that with our growing set of flashcards.) A bonus is that all future stories on the topic can link back to a careful research review.
It’s terrific that more and more journalists want to use evidence to inform their writing. But a little bit of research poorly used can actually be worse for the reader than no research at all.
So let’s use research, but use it well.
1. The “recent research” Rizga cites is actually an article in The Atlantic claiming, “Little research supports combat pay as an effective tool,” which in turn cites a 2011 research brief from the Center for American Progress. This brief points to yet another research review finding that some incentive pay programs have been cancelled or attracted little interest, but it does acknowledge that “studies show that teachers respond to wages in their decisions to enter and remain teaching.” Not one item in this research bridge to nowhere is a rigorous study of the effects of incentive pay on teacher retention or student achievement.