A reader wonders whether the data can be trusted when there are so many reasons to inflate the numbers.

She writes:

“Merrow finally raises the most explosive point in this memo, one that’s being passed over in commentaries. Fay noted that specific erasure patterns occurred across many classrooms. That would indicate that systematic cheating occurred at the building level, and would point toward the principals.

“Massachusetts has never requested erasure analyses, so far as I know. I wish they would. My own district is in an odd bind, because it was an education reform superstar, ranked at Level 1 (the top 20%) in 2009. We’ve been data-driven for a decade. This year, we are flagged as Level 3, the bottom 20%. I’m concerned our new principal will be blamed for testing a higher percentage of our actual students, but maybe there’s something more.

“I’m walking on eggs here. The state assistance team that came to our building quoted an improved graduation rate, and I want so much to believe it, but it didn’t make statistical sense. They told me they couldn’t divulge the denominator they used, because of the confidentiality involved in tracking students after they left the building. The gap between the number of kids in the testing rooms and the number of scores reported makes actual data analysis impossible. I have no idea what any ‘data’ anywhere mean, even if the tests did measure anything worthwhile.”