Dear Marc,

I read your latest post about NAEP scores, in which you say you are taking a long view. You dismiss the disappointing results of the 2015 NAEP, which showed almost no gains and some declines. You choose instead to look at the Long-Term Trend NAEP, which has been asking identical questions since the early 1970s. You point out that 17-year-old scores have been flat since then, which persuades you that we are in big trouble.

For readers, let me explain that there are two different versions of NAEP. The one that was recently reported is called “Main NAEP.” Its curriculum framework is updated every seven to 10 years, and the content changes accordingly. Main NAEP is given in every state every other year in reading and mathematics, and it periodically tests other subjects, such as history, civics, and science. It reports data for fourth- and eighth-grade students in individual states, enabling anyone to compare performance from state to state. It also reports on achievement gaps between white and black students and between white and Hispanic students.

Then there is the “Long-Term Trend” NAEP, which is offered every four years. The reading LTT started in 1971, the math in 1973. Unlike Main NAEP’s, its content almost never changes, although obsolete items are deleted (the one deleted item I recall from my time on the NAEP governing board was a question about S&H Green Stamps). It breaks out scores by race and gender.

Marc, you note the impressive progress made by students at ages 9 and 13, especially black and Hispanic students. But you then go on to say that at the current rate of improvement, 80% of students would not reach “proficient” for many decades, perhaps more than a century. I have to disagree with you here, because setting NAEP proficient as a goal is as unrealistic as the NCLB mandate that 100% of American students would be proficient by 2014. NAEP proficient is a very high standard of achievement. NAEP started measuring state performance in 1992, and 23 years later, Massachusetts is the only state in the nation where as many as 50% of students have reached it. Why set an impossible goal?

It is true that the scores for 17-year-old students have barely moved, but not for the reasons you cite. It is not that students get dumb as they reach senior year, but that they don’t give a hoot about a test that means nothing to them. When I was on the NAEP governing board, we devoted an entire meeting to the problem of motivating students at age 17, most of them high school seniors. Seniors doodled or made patterns on the answer sheets. They didn’t care what their score was because they knew the test didn’t matter: it didn’t affect their grades, it didn’t affect their college prospects, and they would never find out how they did. For them, it was a meaningless exercise. The board considered ways to motivate them. Suppose we offered a pizza party to encourage students to care? Suppose we offered cash prizes? We could not agree on a solution to the problem of motivating high school seniors to take seriously a test that didn’t count.

And that is why I am not surprised or alarmed by the test scores of 17-year-old students on a test that they know doesn’t matter to them.