Trump recently told a group of students attending voucher schools in D.C. that they were very lucky because the graduation rate of the voucher program was 98%. That figure is far higher than the 82% reported in the program's official evaluation. But when I re-read the final evaluation report, I couldn't understand how the evaluators arrived at 82%. Newspaper accounts regularly say that the D.C. voucher program had no effect on test scores but produced a higher graduation rate. Was that true? What was the attrition rate? How did the evaluators arrive at 82%?

So I asked William Mathis of the National Education Policy Center to explain what was behind the numbers. He very kindly untangled the data for me and wrote the following:

Donald Trump’s Phantom 98% Voucher Graduation Rate

William J. Mathis

Education Secretary Betsy DeVos joined Donald Trump at the White House to pitch school vouchers, touting the "98% graduation rate" from the District of Columbia program. Now, a 98% graduation rate would be a superlative figure for any school, but coming out of urban Washington, it would be nothing short of phenomenal. Some might claim divine intervention would be required.

Here’s why: For the baseline year of 2010, the federal government’s official, national, on-time graduation rate reached an “all-time high” of 80%. When the District of Columbia’s 2010 graduation rate was compared to those of the 50 states, it came in dead last at 59%, and it still holds that dubious last-place ranking. Thus, to reach 98%, the DC voucher program would have to leap over all 50 states, including top-scoring Iowa (88%). Such a miraculous ascent rightly invites skepticism.

To sort this out, inquiring minds would first go to the source of the numbers. The president’s remarks were based on a 2010 University of Arkansas study of Washington, DC, which estimated a graduation rate of 70% for traditional public schools and 82% for voucher schools. That 82% would be pretty good given DC’s official rate of 59% for that year, but it is a long way from Trump’s imaginative 98%.

So what’s the difference between the researchers’ rate and the real rate? The University of Arkansas numbers were based on a telephone survey of parents, which had a response rate of only 63% despite some aggressive follow-up. For students who had not yet graduated, the researchers asked parents to forecast whether their student would, in fact, graduate. Since the control group had a response rate similar to that of the voucher students, the researchers concluded they could compare the groups. But this quickly runs into problems. The first is the low response rate to the telephone survey: it is reasonable to infer that respondents differ from non-respondents. The second is relying on parents’ forecasts that their children would graduate rather than using the school district’s actual count of drop-outs and non-graduates. Both errors would inflate the numbers.
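The arithmetic behind the non-response worry can be sketched with a few lines of code. The 63% response rate and the 82% respondent graduation figure come from the study; the graduation rates assumed for the unreached families are purely hypothetical, chosen to show how quickly the overall rate falls if non-respondents fare worse:

```python
# Hypothetical illustration of non-response bias.
# Knowns (from the study): 63% of families responded, and the
# respondents' implied graduation rate was 82%.
# Unknown: the graduation rate among the 37% who never answered.

response_rate = 0.63   # reported survey response rate
reported_rate = 0.82   # graduation rate among respondents

# Assumed (hypothetical) rates for non-respondents, for illustration only.
for nonrespondent_rate in (0.82, 0.70, 0.59):
    # Overall rate = weighted average of respondents and non-respondents.
    overall = (response_rate * reported_rate
               + (1 - response_rate) * nonrespondent_rate)
    print(f"non-respondents at {nonrespondent_rate:.0%} "
          f"-> overall rate {overall:.1%}")
```

If the unreached families' children graduated at, say, DC's official 59% rate, the overall figure would be roughly 73%, not 82% — which is the sense in which the survey design could inflate the headline number.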

The third problem is selection effects. The parents who elected to participate in the voucher program are parents who are more likely to be involved and motivated to advance their children’s education, and as is well known, parental involvement is a key to educational success. Parents must register for the program, and the online application requires the parent to establish an account with an email address and password. Then social security numbers, dates of birth, proof of income, proof of DC residence and tax ID numbers are required. This suggests a multitude of selection problems, including parents who are not computer literate, lack of computer access, privacy concerns and any number of other reasons that people may not want to be in a government database.

Mystifyingly, only 351 of the 1,293 students (27%) used their voucher for all years. The remaining 73% dropped out of the program, but whether they graduated is unclear. We just don’t know what happened to these students.

Trump and DeVos also failed to mention that this same study showed test scores for the voucher students remained flat, and they overlooked a newer DC study with even less favorable findings. In that federally sponsored 2017 study, test scores dropped for both the experimental and control groups, but voucher students dropped more than traditional public school students in both reading and mathematics. Further, 82% of the voucher group changed schools after the first year. All in all, there are no transcendent intercessions here. It’s just a weak design garnished with exaggeration.

While Trump argues for billions in new tax breaks for voucher schemes, there is no evidence that they are an effective reform strategy. To the contrary, the segregative effects could be quite harmful. Large-scale voucher studies in Louisiana, Indiana and Ohio also show negative results. So in light of these facts, what did the federal government do? It prohibited further studies of the program and called for greater federal support of voucher programs.