DeArmond & Gross: It’s Time to Help Teachers Generate and Use Their Own Evidence on Digital Tools

May 2, 2017

Michael DeArmond

Michael DeArmond is a senior research analyst at the Center on Reinventing Public Education, specializing in educational governance, bureaucratic reform, and policy implementation.

Betheny Gross

Betheny Gross is a senior research analyst and research director at the Center on Reinventing Public Education (CRPE) and affiliate faculty at the School of Interdisciplinary Arts and Sciences at the University of Washington Bothell. She coordinates CRPE’s quantitative research initiatives, including analysis of portfolio districts, charter schools, and emerging teacher evaluation policies.

This essay, part seven in an ongoing series, previously appeared at The Lens, the Center on Reinventing Public Education’s blog at the University of Washington Bothell. Here are essays in the series that have previously been published at The 74:
Lake: Are We Personalizing Learning for the Students Who Need It Most?

“There are so many digital resources out there, I am lost as to which ones are good. I usually try things that some of the more technology-knowledgeable people I teach with [use].”
From “Teachers Know Best,” Bill & Melinda Gates Foundation, 2015, page 21

Teachers in the personalized learning (PL) schools we visit are using a wide range of digital tools — sometimes picking up and dropping them at a rapid clip — but their decisions about which tools to use generally aren’t guided by systematic evidence. Instead, teachers tend to rely on their colleagues for advice. That’s understandable, but it means that teachers have little assurance of a product’s effectiveness, and students tell us they feel like guinea pigs as teachers cycle through different tools.

Knowing that teachers will turn to their professional networks for advice, an urgent question for the field is: Are there ways to enrich those networks with more systematic evidence on the quality and impact of digital tools?

Since teachers are far less persuaded by research studies than by their colleagues’ firsthand experiences with an intervention, the answer probably doesn’t lie in creating a new, massive clearinghouse of products or research studies.

First, some of these clearinghouses already exist. Consumer Reports–style websites like EdSurge and Common Sense Education, for example, cover thousands of technology products across a wide range of subjects and grades. And though some organizations, like EdSurge, offer a “concierge” service to help schools and districts find digital tools, many educators won’t have access to such supports, nor will they have the time to pore over multiple websites in search of research-based tools.

Second, the K-12 educational technology marketplace is massive and growing at a rapid-fire pace. Stacey Childress, CEO of NewSchools Venture Fund, recently wrote that investments in K-12 technology companies ballooned from roughly $91 million in 2009 to $643 million in 2014. The research community simply can’t keep up with this dramatic expansion of companies and digital offerings. Products without any research behind them (much less rigorous research) will continue to be available to teachers and find their way into classrooms.

Third, even when rigorous research does happen, studies that regard a digital tool as a treatment (akin to a pill) may overlook a critical factor: a technology’s effects depend on the interaction among the technology, teachers, pedagogy, and the context in which they come together. Even if a tidy randomized trial shows positive impacts for a digital tool, it is still important to consider how that tool fits into the entire instructional program of classrooms and schools.

Given all this, helping teachers, schools, and districts learn how to generate and use evidence themselves may be a promising path toward injecting more evidence into decisions about digital tools for the classroom. Several initiatives, methods, and tools that are already available seem like a logical place to start. Examples include:

The Proving Ground initiative at Harvard University’s Center for Education Policy Research helps districts and charter school networks take a deliberate, analytical approach to gathering and using evidence on digital tools they might adopt systemwide.

The Ed Tech Rapid Cycle Evaluation (RCE) Coach, created by Mathematica Policy Research, gives schools and districts a structured way to evaluate educational technology. The tool walks practitioners through a five-step process covering everything from planning an evaluation to summarizing the results. (Program materials say the typical RCE lasts three months from start to finish.)

The Carnegie Foundation for the Advancement of Teaching has created a range of resources and hosts an annual summit to help educators and others adopt a problem- and user-centered approach to learning and improvement, one that leverages rapid testing and networked learning communities to improve classrooms and schools.

More homegrown examples are also popping up. In Colorado, districts grappling with PL have formed a network to solve shared problems jointly. The regional support agency that coordinates the network walks teachers through a Plan-Do-Study-Act inquiry cycle focused on a particular PL problem, with the goal of building this analytic process into teachers’ daily work.

At this point, we can’t say for sure that these initiatives, methods, and tools really work. But teachers’ hunger for guidance and information on digital tools in a fast-changing tech landscape certainly suggests that these and other approaches to practice-based evidence generation are worth exploring in earnest by districts and their partners.