Who Gets Access to School Data? A Case Study in How Privacy, Politics & Budget Pressures Can Affect Education Research
Months before President Donald Trump’s campaign promise to dramatically expand the use of public funds to pay private school tuition would accelerate the debate, the Louisiana study was the first of its kind to find adverse results for students using vouchers.
Louisiana’s voucher program is one of the most tightly monitored in the nation: Participating schools can’t set admission criteria, can’t charge families tuition beyond the voucher amount, and — most important to the study — must give students the same tests as their public school counterparts. That those first-year test comparisons came back negative was quickly taken up both by voucher critics as evidence that private school choice doesn’t work and by hardline school choice supporters as proof that Louisiana’s program is overregulated.
Meanwhile, John White, the state’s high-profile schools superintendent whose pro-school-choice policies had long been scrutinized, publicly accused the Duke and MIT researchers of improperly rushing to publish their results, and The Wall Street Journal condemned them for similar reasons in an editorial.
The criticism turned on whether the researchers should have waited for additional data before publishing their findings. But even that assertion, it turns out, was complicated. Not long after the headline-grabbing study was released, Louisiana ended its data-sharing relationship with the MIT and Duke researchers, according to emails obtained by The 74 through a public records request.
Although the fallout from the voucher study was very public, less seen were the tensions, negotiations, and conflicting priorities leading up to its publication that are revealed in the dozens of emails between the researchers and Louisiana state education officials.
It’s a behind-the-scenes glimpse into the world of education research and the pursuit of data that is more complicated and subjective than the academic nature of the work might suggest. It’s also a process that is critical to producing results that shape public policy — particularly in highly politicized arenas like school choice — and deepen understanding of what is helpful and harmful to students.
Data wars are not limited to Louisiana — a showdown erupted in Arizona in April after reporters there said they could not get data from the state about its voucher program and the state schools chief took them to task before the legislature. Researchers, though, rely on individual student information that goes beyond what reporters and others can access through public records laws.
“For a program that’s ongoing, there are real issues of who gets to evaluate the program. Is it open to many teams, which I think is a good model? Or is it restricted to partners?” said MIT professor Parag Pathak, part of the team that studied Louisiana’s voucher program. “There are real broad issues in social science — it’s something that we’re all wrestling with.”
Dan Goldhaber, a professor at the University of Washington Bothell and director of the Center for Analysis of Longitudinal Data in Education Research, said getting data can sometimes come down to persuasion and pre-existing relationships.
State education departments, meanwhile, are constrained by time, resources, increasingly stringent student privacy laws, and — as one chief strategy officer acknowledged — deciding which research projects would shed light on their own policy priorities.
Assistant Superintendent Jessica Baghian said the Louisiana Department of Education is “committed to transparency and supporting research” and “shares data with reputable peer-reviewed institutions.”
Baghian said in an interview that significant data on students who participated in the state voucher program were given to the MIT and Duke research team — including for its second year — but that the department ultimately chose not to spend more time and resources answering their request for additional information after their study was published.
“I have conferred with the Superintendent [John White],” Brian Darrow of the Louisiana Department of Education wrote to the researchers in a Jan. 7, 2016, email. “We will not be sharing any additional data at this time.”
Pathak said that major sections of the second-year data received from the state were missing, which is why his team decided not to wait any longer and went ahead and published their findings based solely on the first year. After the study came out, they went back to try to get more complete second-year voucher results and were turned down.
His team, Pathak said, had “been given approval initially to look at more years of data, and then we were denied that approval.”
Multiple years of data were provided to a different team of researchers who published results not long after the MIT and Duke study came out. Their initial study also showed that students who used vouchers to attend private schools lost ground on state tests in the first year, although their later results — specifically for the third year of the scholarship program — would tell a different story.
The back-and-forth emails between the Louisiana DOE and the MIT and Duke researchers show some of the tricky conversations, strained negotiations, and gray areas that can surround data-sharing and the pending publication of controversial research.
In October 2015, Pathak emailed two DOE staff members the results from his soon-to-be-published study with a pair of colleagues showing that students who received a state-funded voucher to attend a private school saw large drops in test scores after one year compared with students who had applied for a voucher but didn’t get one. Hours later, Brian Darrow shared the results with John White and asked if he wanted to join a debriefing call, according to department emails.
An email three days later to Pathak from DOE staffer Gabriela Fighetti shows the conference call happened and included White.
“I know it wasn’t the easiest of calls,” Fighetti wrote, not indicating to what difficulty she was referring.
Fighetti also suggested that if the researchers would hold the paper, the department might be able to provide them additional data based on more years. Pathak welcomed the opportunity.
“We would be happy to include additional years of data,” he wrote in response. “We all think the study would benefit from this and would be willing to wait to release this study provided we could obtain the outcomes in a relatively short time frame.”
Fighetti soon emailed Darrow saying they should discuss with White whether to let the researchers publish with a single year of data or to provide them more years. They decided on the latter.
After internal discussion about how to provide newer data, education department staff determined that — because of the state’s student privacy laws — someone from the research team would have to come to Louisiana to receive it.
A Dec. 11 email from Pathak to Darrow acknowledged that “some of our phone calls have been tense,” but reaffirmed his willingness to hold the paper in order to add new data.
“We’re eager to be responsive to John White’s concern that our study is limited to the first year of statewide expansion,” Pathak wrote.
A few days later, Atila Abdulkadiroğlu, a researcher from Duke, flew to New Orleans to get the data. But soon there was a problem: in emails, Abdulkadiroğlu informed education department staff that the new data were not sufficient because there were no test scores in year two for too many students — 40 percent of them were missing scores, compared with just 8 percent in the first year.
A DOE staff member explained that the data only showed test scores for the year the students applied for a voucher but did not get one; in later years, if that same student didn’t reapply, their scores weren’t included.
“In contrast to our data request, LDOE provided a second-year data set that did not include lottery losers” from the program’s first year, Pathak told The 74. “This made the data useless for research purposes.”
Baghian, the assistant superintendent, said that the data were provided to the researchers as requested.
The data-sharing agreement between the state education department and the MIT and Duke team covered “2014 and 2015 test results and enrollment of earlier cohorts” of voucher applicants, meaning the students who applied for that first 2012–13 school year.
On Dec. 21, DOE staffer Laura Boudreaux responded to Abdulkadiroğlu that getting all of the data he wanted was “not something we have readily available,” and that, with the holidays approaching, they wouldn’t be able to provide the information until after the new year.
Abdulkadiroğlu replied that he understood and that “at the moment, we will have to stick to one-year follow[-up] only,” adding that “we would be very happy to come back for a follow up study with more test years.”
Boudreaux wrote back, “We would be happy to work with you on a follow up study with more test years.”
The study, based on one year of data, first appeared as a working paper through the National Bureau of Economic Research in late December 2015, and was later published online in the peer-reviewed American Economic Journal.
On Jan. 7, 2016, Abdulkadiroğlu sent a follow-up email to education department staff. “I’d like to resume our conversation on data,” he wrote, explaining what information was needed, and saying he would be willing to go to New Orleans again. “We would be happy to offer our assistance [in] any capacity in creating a new data set.”
Minutes later, Fighetti forwarded the email to Darrow, writing, “I imagine we might not be sharing more data, but curious if this came up when you spoke to John,” presumably referring to White, the state superintendent.
Meanwhile, a different department staff member responded to Abdulkadiroğlu indicating that her team would work to provide data, suggesting that they set up a conference call.
But soon after, Darrow replied to the researchers with the update that he had conferred with White and no additional data would be forthcoming. The researchers said they soon received a letter, signed by White, formally cancelling their data-sharing agreement, which had been extended in December 2015 to go through 2018.
“I don’t think [White] wanted to continue having us study the program would be my guess,” Pathak said.
Baghian, of the Louisiana DOE, told The 74 that the decision to deny data — after the study was published and the researchers again asked for second-year numbers — was simply a matter of allocating staff time.
“My team, in partnership with our portfolio team, literally worked around the clock up until the holidays to try to provide [data] files for them — only to then have them not use the files,” she said.
“The reality is we have hundreds of data requests, lots of researchers with whom we work, and we moved on in the queue to the next set of researchers who needed work,” she said. “We have had productive conversations with them since then, and have conveyed we are more than willing to work with them and share any additional data that they would like.”
Pathak said he talked with White in person in Cambridge late last year, but said, “I do not recall White indicating that the LDOE had changed their mind about data-sharing when we met briefly in November.” He noted that aside from that conversation he has not communicated with the Louisiana DOE since January 2016.
Throughout the spring of 2016, the initial study drew significant press attention. The March 2016 editorial in The Wall Street Journal said that the research “looked at data from one year — 2012–13, the first year the program debuted statewide — and the authors refused to check out more recent results, which is research malpractice.” The claim was not sourced, and appears to be contradicted by the emails obtained by The 74 showing the back-and-forth between the researchers and state education department staffers over the second-year data.
“We vehemently object to your accusation that we refused to check out more recent data and your allegation of ‘research malpractice,’ ” Pathak responded in a letter to the editor in the Journal. “Our research was conducted with the full cooperation of the Louisiana Department of Education. We have no agenda for or against choice and our work employs a gold-standard lottery-based research design that cannot be manipulated.”
A few months later, in a blog post for the Brookings Institution, White argued that it was unfair to judge the program based on a single year of data.
“In order to meet their own publishing deadlines, the researchers opted not to include available results from later years in their study,” he wrote.
When asked if they decided not to wait to publish the study because of deadline pressure, either external or self-imposed to possibly beat out competing researchers, Pathak said, “Our first best [preference] would have been to get more data. We wanted to wrap this project up — if we weren’t going to get more data, let’s write up what we had. We had a scholarly obligation to report what we had and move on to other topics.”
Pathak said he would still like to study additional years of the program, but maintained that knowing the first-year results was important in itself. He said it was common to report first-year impacts of school voucher programs, citing first-year studies done in Washington, D.C.
The Louisiana voucher research has gotten even more attention since the appointment of school choice advocate Betsy DeVos as secretary of education and Trump’s proposal to substantially expand private school choice programs, specifically vouchers.
The Louisiana Scholarship Program began in 2008 and was limited to New Orleans before expanding statewide in 2012. The initiative provides private school tuition subsidies to families in the state with incomes at or below 2.5 times the federal poverty line, or about $60,000 a year for a family of four. Students can enter the program starting in kindergarten or after attending a public school that received a C, D, or F letter grade. In 2015–16, 7,110 students received a voucher worth on average $5,856.
At the time of the study during the 2012–13 school year, more than 80 percent of students in the program were African American and participants’ average family income was less than $18,000 a year.
Louisiana’s program is distinct from many other private school choice initiatives in its test-based accountability rules, a point of controversy for some free-market voucher supporters. Private schools that enroll more than 40 students funded through the scholarship are rated on test scores just like public schools, and those with low ratings are no longer eligible to admit students using vouchers.
White argues that this accountability mechanism needed time to work, and so the program shouldn’t be judged on its earliest results. That view seemed to be borne out when data released in June showed that 514 Louisiana voucher students who stayed in private schools had made up those earlier-year deficits, performing statistically on par with their public school counterparts in math and English by year three.
White tried to put the various voucher studies into perspective — along with the fierce politics surrounding school choice — at a forum at the Urban Institute in Washington, D.C., the same day the latest Louisiana results were published.
“I think that within the Beltway and maybe in education policy circles, people come to these studies and ask, ‘Well, do voucher programs work?’ I just view that as the wrong question to ask,” White said. “The question is, ‘Are there good schools, effective schools of all governance types available to low-income and otherwise disadvantaged children?’ That is the essential question from a governance perspective, and unfortunately, it’s one that politics and the politicians and ideology largely misses because this issue has become a vessel for ideology on both sides of the aisle.”
The research that looked at years one, two, and three of the Louisiana scholarship program was done through the Education Research Alliance at Tulane University and the School Choice Demonstration Project at the University of Arkansas. The study of years one and two appeared in a peer-reviewed journal, while the third-year results were released in a paper dated June 26.
The year-two data they relied on were apparently different from what was shared with the MIT and Duke team.
“We did not see missing data for 40 percent of students,” Jon Mills, one of the researchers on the multi-year study, told The 74. “It’s surprising to hear the attrition rate was so high.”
A data-sharing agreement with the School Choice Demonstration Project covers several years of student information, including school enrollment data and test scores “for all students in the state.” The agreement states the data “will be used to generate evaluation reports from the fall of 2013 through the fall of 2018.”
Doug Harris, the director of the Research Alliance and a Tulane University professor, wrote to White in a Jan. 10, 2016, email, “We should talk at some point about the voucher studies and what [the MIT and Duke researchers] did. The way the authors handled this is unfortunate. You can trust that we would never do anything like that.”
Asked to elaborate, Harris told The 74: “I thought it was unfortunate that the earlier study was released with only one year of data, when there was good reason to think the results would be different in the second year. My understanding of the situation from LDOE was that the researchers were granted access to [a] second year of data, but that they wanted to release the results quickly.”
“In the email, I was communicating to Mr. White that this is not how we operate at the research center I direct, that we wouldn’t release results quickly if there was good reason to think, with additional feasible analysis, that the conclusion might change.”
Harris says that he feels no obligation to frame studies in a certain way to get continued access to data.
“We feel no pressure at all,” he said. “I think that’s fairly obvious since we’ve released reports that directly conflict with LDOE’s and [the state Board of Education’s] most closely held positions, especially on vouchers.”
In 2015, Harris published what may be the best-known analysis of the expansion of charter schools in New Orleans. Based on extensive data provided by the state, he concluded that the package of policies implemented in New Orleans after Hurricane Katrina had large positive effects on student achievement.
The Alliance has also produced a number of reports that paint the state’s recent policy changes in a less-flattering light, including the first- and second-year voucher results study, and a report showing that a number of New Orleans charter school principals admitted to trying to counsel out or exclude certain students.
Frank Adamson, a policy and research analyst at the Stanford Center for Opportunity Policy in Education, said he wasn’t able to get sufficient data from the state to examine the effects of its charter policy. Adamson, who co-authored a largely critical report on New Orleans’s charter schools in 2015, said the data provided by the education department were largely lacking key student demographic information that would allow comparisons based on students’ race and socioeconomic status.
He said he asked for the exact same data that had been provided to CREDO, another Stanford-based research group. Past CREDO studies of Louisiana have included a breakdown of results by student demographics, which Adamson said he could not do with the data he received.
Baghian flatly stated that the same data shared with CREDO had been provided to Adamson. “He was absolutely given the CREDO data,” she said.
Access to data has been a point of controversy in Louisiana. In 2014, a nonprofit group, Research on Reforms, won a court judgment requiring the education department to turn over the data that was provided to CREDO.
Harris notes that privacy provisions related to student data enacted in Louisiana in 2014 have made it more difficult to provide data.
“I think the state laws are … more challenging” than in other states, Harris said. “The state Department of Education is in a tough spot.”
Louisiana is hardly the only state where researchers say they don’t have access to enough data.
Goldhaber, the University of Washington Bothell professor, said that there is significant variation between places.
For instance, North Carolina provides data through a third party, Duke University, which offers access to university and nonprofit researchers. As a result, a large number of education research studies focus on the state.
“There are some states that are very forthcoming with data-sharing,” said Pathak. “North Carolina is near the forefront of those states.”
California, on the other hand, has gutted many of its data systems, making it extremely difficult for scholars to study education in the largest state in the country.
Goldhaber said that most states operate on something of an ad hoc basis: researchers may be able to get data, but only if they can convince some combination of education officials and politicians that their study is worthwhile. Sometimes that turns on an individual researcher’s relationship with key policymakers, he said.
There’s a downside to this approach, Goldhaber warns: “I would say it’s a discretionary process, which … opens the door for politics.”
“I think there is pressure, and it’s important as a researcher to make sure that you are protecting your reputation as an independent researcher and you try not to mold your findings because you know that somebody is going to appreciate them or frown upon them,” he said.
Louisiana’s concerns about the staff time needed to compile data seem to be echoed in other places.
Carrie Conaway, chief strategy and research officer at the Massachusetts Department of Education, said that she is cognizant of the manpower required when determining whether to accept a request for data.
She also said the state grants or denies requests in part based on whether the proposed research is of interest to her office.
“I take a look and think, ‘Do I think this is going to help us do a better job improving the education of a million kids in the commonwealth of Massachusetts?’ ” she said.
Conaway said that she could not recall an instance when two research groups had requested data at the same time to research the same topic, but she did note that researchers have asked to examine a topic that had already been studied, and in those cases, she has usually not granted such requests.
“It is generally not to our benefit to end up with two competing answers to the same question,” she said.
Conaway said that her department’s approach is strictly apolitical, and that the goal is to determine whether a program is effective — not simply to validate the state’s education policies.
Researchers seem to agree, in the famous words of Supreme Court Justice Louis Brandeis, that sunlight is the best disinfectant.
“Peer review and the ability to conduct research to validate reporting and claims from other institutions is the foundation of the scientific community,” said Adamson of Stanford. “Without the capacity to do that, the scientific claims from anything coming out of New Orleans … are not validated.”
Harris’s research on the effects of New Orleans’s charter school reforms was published through the Education Research Alliance, but not, to date, in a peer-reviewed journal. However, separate peer-reviewed studies — including one co-authored by Abdulkadiroğlu and Pathak — have also shown the city’s charters having positive effects on student achievement.
Harris agrees that data should be open and transparent: “I think government agencies should make data as accessible to researchers as possible within the bounds of the law and in keeping with the need to protect the privacy of individuals. This makes for transparent government and good scientific practice, allowing replication of analyses.”
Pathak notes that despite his difficulties getting further data, Louisiana is actually progressive compared with some states in sharing information with researchers.
“I’m really grateful that they allowed us to study the program” in its first year, he said. “In many settings, people don’t even allow that to take place.”