Heffernan: How Can We Know If Ed Tech Works? By Encouraging Companies & Researchers to Share Data. Government Funding Can Help
For all the promise of education technology, one critical step for fueling its continued improvement is missing: Not enough educational technology providers are opening up their platforms so researchers and entrepreneurs can study and test what works — and what doesn’t.
The federal government should lead the way and provide more funding for technology platforms to make their data available, so researchers can examine the efficacy of specific educational approaches.
Walk into any classroom, and you will find technology in use: children working with apps on iPads, teachers presenting material with the help of PowerPoint or YouTube, or students taking a quiz on laptops.
These technologies have obvious benefits. Students can access a broad array of materials easily and quickly, and learning can be tailored to a student’s particular needs, so fewer are left behind or bored by material they’ve already mastered.
But critics worry about the cost and effectiveness of certain software; despite students’ increased access to these devices and online tools, it’s hard for teachers, parents and administrators to tell what works.
A big part of the problem is that technology platforms don’t share enough of their data. This means that when scientists want to study human learning, they often have to start from scratch and build a platform that will allow them to experiment before they can start designing their study. This inefficient process results in wasted effort, time and money.
It doesn’t need to be this way. Educational technology companies can open up the back ends of their platforms so that researchers can access data and run experiments.
I’ve done something like this myself. A few years after establishing ASSISTments, an online math program and teacher tool, with my wife, Cristina, who is now chief impact officer of the ASSISTments Foundation, I opened up the platform to academics, making the system a shared research infrastructure in which researchers can run experiments. We call the back end of the platform E-TRIALS, and it allows researchers to propose experiments that meet Institutional Review Board standards for educational research.
When ed technology companies open up their platforms in this way, research is much more effective. With more access to data, experts can do higher-quality studies and more easily share their methods and results. Researchers can also test their ideas on much larger samples and find more generalizable results — and spend their time and energy developing and executing research instead of creating a platform to run their research or recruiting a broad sample of students.
Other scientific disciplines have faced and solved such problems. In astronomy, for example, the nationally funded Green Bank Radio Telescope in West Virginia has been accessed by many researchers and was used to discover the most massive neutron star ever recorded. The U.S. has also given more than $500 million to the Large Hadron Collider project at CERN, the European research organization and particle physics laboratory.
These shared scientific instruments are valuable not just because of the powerful technology they bring to bear on scientific questions but also because of the way they focus research, foster scientific community and provide a shared framework for measurement and analysis.
A number of ed tech programs have taken promising steps forward in part because companies want to know what works in education. EduStar is one example. The program runs randomized controlled tests in the background of digital learning activities (apps, games and videos) that are available on PowerMyLearning Connect, a content platform free to all schools.
In one pilot trial, EduStar compared programs used to teach division. Within a single class period, researchers found that the more straightforward program worked better than a gamified one. The program, which has conducted 77 trials with 10,000 students in more than 40 schools, plans to scale up, building a larger and more diverse base of data.
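The mechanics of such a background trial are simple: each student is randomly assigned to one variant of a learning activity, and outcomes are compared across the groups. The sketch below is a hypothetical illustration of that idea, not EduStar's actual system; the condition names, the seeding scheme and the fabricated outcome model are all assumptions made for the example.

```python
import random
import statistics

def assign_condition(student_id, conditions=("straightforward", "gamified")):
    """Randomly assign a student to one program variant.

    Seeding by student ID keeps the assignment stable: the same
    student always lands in the same condition on repeat visits.
    """
    rng = random.Random(student_id)
    return rng.choice(conditions)

# Collect simulated post-quiz scores for each condition.
scores = {"straightforward": [], "gamified": []}
rng = random.Random(42)
for student_id in range(200):
    condition = assign_condition(student_id)
    # Fabricated outcome model purely for illustration: assume the
    # plain program yields slightly higher scores on average.
    base = 70 if condition == "straightforward" else 65
    scores[condition].append(base + rng.gauss(0, 10))

# Compare group means, as a trial analysis would (a real study
# would also report uncertainty, e.g. a confidence interval).
for condition, values in scores.items():
    print(condition, round(statistics.mean(values), 1), "n =", len(values))
```

Because assignment is random, any systematic difference in average scores can be attributed to the program variant rather than to which students happened to use it, which is what makes platform-level experiments like these informative.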
While the operators of many ed tech programs have shown interest in such education research practices, they will need more incentives to scale their approaches. Federal agencies such as the U.S. Department of Education and the National Science Foundation can lead the way and support these efforts.
The head of the U.S. Department of Education’s Institute of Education Sciences recently signaled support for such research practices. In a blog post, its director, Mark Schneider, wrote that the institute is “particularly interested in technologies that can test many more students more quickly and more cheaply. New platforms are emerging that can do this, perhaps leading to changes in our ‘standard’ model” of randomized controlled trials.
The government is no stranger to supporting innovations in research in the hard sciences, but education research is just as critical. As of 2015, the U.S. ranked 38th in math out of 71 countries, according to the Programme for International Student Assessment. Federal education funding fell 4 percent from 2010 to 2014, and under the current administration, it could fall even further; the 2020 budget proposes a 10 percent cut for the Education Department.
We can’t go backward. We need to renew our commitment to better learning, and part of that must be a commitment to better learning research practices.
Neil T. Heffernan is a professor of computer science and director of the Learning Sciences and Technologies Program at Worcester Polytechnic Institute. He is the developer of ASSISTments, a free web-hosted digital platform that provides teachers with specific insight into their students’ progress on math homework.