Crean Davis: In Building an Evidence Base for Research, Start Small — & Follow the Trail of Breadcrumbs Toward Larger Answers

By Allison Crean Davis | January 17, 2019

We’re in a window of time when educators, nonprofits, and researchers are talking shop about methodology, statistical analyses, and effect sizes. Federal grant competitions, several of which are open right now, often start this flurry of activity by seeking applications to fund various kinds of educational programming accompanied by rigorous evaluations. But as important as randomized controlled trials and quasi-experimental designs are to forming a body of evidence, they are just the tip of the evidence pyramid. The foundation is where an evidence base begins, and its construction is less about pillars of rigor than it is about following the breadcrumbs.

That’s right: breadcrumbs. Hansel and Gretel dropped them in the forest to provide a path back home. Developers use them to help users navigate applications and websites. And people, in general, use them to guide day-to-day decisions. Breadcrumbs, whether literal or metaphorical, are a series of connected pieces of information that show us where we are in relation to a desired destination and suggest a path forward. As someone who consumes and conducts research, I consider data points my breadcrumbs.

Data points form patterns, patterns become trails, trails lead to observations, observations guide our thinking, and, ideally, our thinking leads to a series of decisions that are better because they are informed by evidence. In the education sector, this translates to better policy, better practices, smarter use of limited resources, and more refined questions for further research. This is our destination. But the destination is elusive, because it takes time — and some meandering along the trail of breadcrumbs — for a coherent body of research to develop.

That’s why individual studies can be disorienting: Sometimes they are inconclusive, inconsistent with other findings, or not quite addressing educators’ questions. Take the question of teacher professional development: Districts invest significant time and dollars in training and support for teachers to hone their practice. But are these investments worthwhile?

Different research approaches point to different findings. On the one hand, when looking broadly at the relationship between a range of professional development approaches and teacher performance, the evidence suggests they are not worthwhile. But some experts question whether that research is confounded by poor measures or a lack of attention to the format, focus, and quality of professional development. On the other hand, when narrowing in on particular professional development tactics, like content-specific coaching, other research says they are worthwhile. Still, this research acknowledges the need to better understand the features of effective coaching and whether these can be applied at scale.

This lack of consistency in the research signals an evidence base that is still building: Researchers are dropping breadcrumbs, but these do not yet point to a conclusive destination — or answer. It means we need to seek out more evidence, replicate studies to see if results are reliable across settings, ask more refined questions, and look for new ways of studying the issue. It may also mean we need a tolerance for research that is exploratory in nature, not necessarily giving us definitive answers but pointing us in new directions. This kind of research is rarely what competitive grant funding rewards, but there are other reasons to conduct it.

Joseph Wholey, an influential scholar and thought leader in the evaluation field, describes how exploratory research can help guide short-term program decisions and inform the broader research agenda. Exploratory studies, often limited in scale and number of data sources (and hence relatively inexpensive), are better at suggesting patterns and potential directions of inquiry than they are at determining causation and arriving at firm conclusions. Nonetheless, they can point toward topics worthy of further inquiry and suggest how resources for research can best be leveraged. This is consistent with an embrace of evidence under the Every Student Succeeds Act that “demonstrates a rationale” for a program, approach, or intervention, even if it has yet to demonstrate causality. It is an entry point for an evidence base that has yet to be built.

That’s why when BELL (Building Educated Leaders for Life), a national nonprofit, had a hunch that its unique approach to professional development was having a positive impact on teacher practices, it asked us at Bellwether Education Partners to investigate through a rapid-turnaround exploratory evaluation. Of note, while most professional development for educators happens during the traditional academic year, BELL works with teachers during the summer. BELL’s version of professional development presents an alternative to what has been studied in the past: rigorous training, coaching, and instructional supports for teachers, offered between the end of one academic year and the beginning of the next.

Within months of starting our study, BELL had some encouraging data: Not only did teachers describe a transformative effect on their practices during the summer program, but 100 percent reported transferring new instructional practices back to their schools and classrooms during the full academic year. Naturally, these findings come with caveats: The sample size was small, the data were limited to self-reports, and the trend has not been replicated. But the exploratory study did its job: It produced some enticing breadcrumbs, suggesting a trail that can be explored more fully in the future.

BELL is planning a more extensive research agenda on its professional development approach, aiming for deeper understanding of its influence on teacher practices during and beyond the summer. This will bolster the evidence and guide its organizational learning, and it may contribute new observations to strengthen the larger evidence base on the impact of teacher professional development.

The moral of this story is to let the story build. No single breadcrumb can lead to an answer. But it may lead to the next breadcrumb, and the next one after that, and as a pattern emerges, the destination becomes clearer. Early in a research agenda, it is wise to consider hunches and then find a way to get evidence on them. Early, data-driven exploration may pay off big-time in building the trail to big and audacious answers.

Allison Crean Davis is a partner with Bellwether Education Partners, focusing on issues related to evaluation and planning, predictive analytics, extended learning opportunities, and Native American education. Bellwether was co-founded by Andrew Rotherham, who sits on The 74’s board of directors.
