Race to the Top placed a $4.45 billion bet that the way to improve schools was to tie teachers’ evaluations to their students’ test scores.
As it happens, the state of Tennessee has been using value-added assessment for 20 years, though the stakes have not been as high as they are now.
What can we learn from the Tennessee experience? According to Andy Spears of the Tennessee Education Report, well, gosh, sorry: nothing.
Spears has a list of lessons learned. Here are the key takeaways:
“4. Tennessee has actually lost ground in terms of student achievement relative to other states since the implementation of TVAAS.
Tennessee received a D on K-12 achievement when compared to other states based on NAEP achievement levels and gains, poverty gaps, graduation rates, and Advanced Placement test scores (Quality Counts 2011, p. 46). Educational progress made in other states on NAEP [from 1992 to 2011] lowered Tennessee’s rankings:
• from 36th/42 to 46th/52 in the nation in fourth-grade math
• from 29th/42 to 42nd/52 in fourth-grade reading
• from 35th/42 to 46th/52 in eighth-grade math
• from 25th/38 (1998) to 42nd/52 in eighth-grade reading.
5. TVAAS tells us almost nothing about teacher effectiveness.
While other states are making gains, Tennessee has remained stagnant or lost ground since 1992 — despite an increasingly heavy use of TVAAS data.
So, if TVAAS isn’t helping kids, it must be because Tennessee hasn’t been using it right, right? Wrong. While education policy makers in Tennessee continue to push the use of TVAAS for items such as teacher evaluation, teacher pay, and teacher license renewal, there is little evidence that value-added data effectively differentiates between the most and least effective teachers.
In fact, this analysis demonstrates that the difference between a value-added identified “great” teacher and a value-added identified “average” teacher is about $300 in earnings per year per student. So, not that much at all. Statistically speaking, we’d call that insignificant. That’s not to say that teachers don’t impact students. It IS to say that TVAAS data tells us very little about HOW teachers impact students.”
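For readers unsure what a “value-added” score even measures, here is a deliberately naive sketch. This is not TVAAS itself, which is a proprietary multivariate mixed model; the roster numbers and the function name are made up for illustration. The core idea is just this: predict each student’s year-end score from prior performance, and credit the teacher with the average amount by which students beat (or miss) that prediction.

```python
# Hypothetical illustration only -- NOT the actual TVAAS model.
# A naive "value-added" estimate compares each student's actual
# year-end score with a crude prediction from the prior year.

def naive_value_added(prior_scores, actual_scores):
    """Average residual over the crudest possible prediction:
    the prior-year score carried forward unchanged."""
    residuals = [a - p for a, p in zip(actual_scores, prior_scores)]
    return sum(residuals) / len(residuals)

# One teacher's roster: prior-year and current-year scale scores (invented).
prior = [210, 198, 225, 240, 205]
actual = [218, 200, 230, 238, 215]
print(naive_value_added(prior, actual))  # prints 4.6
```

Real value-added models condition on several prior years of scores and shrink estimates toward the mean, but the basic move is the same, and that residual is exactly where the statistical noise Spears describes lives: small rosters and noisy tests make the teacher-level average swing widely from year to year.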
Read the whole article.
It is one of the best, most sensible things you will read on value-added assessment. It is a shame that Tennessee has wasted more than $300 million in search of the magic metric that identifies the “best” teachers. It is ridiculous that Congress and the U.S. Department of Education wasted nearly $5 billion to do the same thing, absent any evidence at all. Just think how many libraries they might have kept open, how many health clinics they could have started, how many early childhood programs they could have initiated, how many class sizes they could have reduced for needy kids.
But let’s not confuse the DOE with actual evidence when they have hunches to go on.