Race to the Top placed a $4.45 billion bet that the way to improve schools was to tie teachers’ evaluations to their students’ test scores.
As it happens, the state of Tennessee has been using value-added assessment for 20 years, though the stakes have not been as high as they are now.
What can we learn from the Tennessee experience? According to Andy Spears of the Tennessee Education Report, well, gosh, sorry: nothing.
Spears has a list of lessons learned. Here are the key takeaways:
“4. Tennessee has actually lost ground in terms of student achievement relative to other states since the implementation of TVAAS.
Tennessee received a D on K-12 achievement when compared to other states based on NAEP achievement levels and gains, poverty gaps, graduation rates, and Advanced Placement test scores (Quality Counts 2011, p. 46). Educational progress made in other states on NAEP [from 1992 to 2011] lowered Tennessee’s rankings:
• from 36th/42 to 46th/52 in the nation in fourth-grade math
• from 29th/42 to 42nd/52 in fourth-grade reading
• from 35th/42 to 46th/52 in eighth-grade math
• from 25th/38 (1998) to 42nd/52 in eighth-grade reading.
5. TVAAS tells us almost nothing about teacher effectiveness.
While other states are making gains, Tennessee has remained stagnant or lost ground since 1992 — despite an increasingly heavy use of TVAAS data.
So, if TVAAS isn’t helping kids, it must be because Tennessee hasn’t been using it right, right? Wrong. While education policy makers in Tennessee continue to push the use of TVAAS for items such as teacher evaluation, teacher pay, and teacher license renewal, there is little evidence that value-added data effectively differentiates between the most and least effective teachers.
In fact, this analysis demonstrates that the difference between a value-added identified “great” teacher and a value-added identified “average” teacher is about $300 in earnings per year per student. So, not that much at all. Statistically speaking, we’d call that insignificant. That’s not to say that teachers don’t impact students. It IS to say that TVAAS data tells us very little about HOW teachers impact students.”
Read the whole article.
It is one of the best, most sensible things you will read on value-added assessment. It is a shame that Tennessee has wasted more than $300 million in search of the magic metric that identifies the “best” teachers. It is ridiculous that Congress and the U.S. Department of Education wasted nearly $5 billion to do the same thing, absent any evidence at all. Just think how many libraries they might have kept open, how many health clinics they could have started, how many early childhood programs initiated, how many class sizes reduced for needy kids.
But let’s not confuse the DOE with actual evidence when they have hunches to go on.
They could start by telling the truth:
“In August of 2011, the New York State Education Department hosted its first Network Team Institute. This institute was a multi-day event that trained educators from across the state in the Reform Agenda. In turn, they were to serve as “turnkey trainers” and push out the reforms.
At that first institute, participants received instruction from a variety of presenters, including Paul Bambrick-Santoyo, the managing director of Uncommon Schools Newark. The State Education Department also purchased 600 copies of his book, Driven by Data, which it gave to participants. (You can see that purchase, expenses, and presenter fees from the first institute, which I obtained through a FOIL request, by clicking here.) Various BOCES around the state then purchased books and distributed them as part of local turnkey training. I received my copy from Nassau BOCES, which purchased 475 copies to distribute. You can see that purchase, obtained through a FOIL request, here.
Data Driven Instruction is clearly a manual on how to teach to state tests. One of the schools described in the book as a “data-driven success story” is South Bronx Classical, a K-5 charter school with good test scores. Below is an excerpt that describes the practices it uses.
South Bronx Classical created an aggressive follow-up system in which students took daily math assessments and daily English assessments to track their performance in real time. This…was coupled with formal tests every two weeks that served as miniature interim assessments within the larger structure of quarterly tests.
The author then describes how students were regularly pulled out of “specials” such as gym for tutoring (p. 31).
The students of the school are as young as five years of age.”
http://www.washingtonpost.com/blogs/answer-sheet/wp/2013/10/20/common-core-the-growing-slip-between-rhetoric-and-implementation/
Obviously, the parents at the meeting (I watched) weren’t concerned about “high standards.”
They were concerned about yet another “reform” turning into an elaborate and expensive testing scheme. They should be concerned about that because that’s been the pattern for the last decade.
They can distract all they want with screams of “special interests!” but that’s a fact, and it’s also a fact that they sold “data-driven instruction” hard, right along with the Common Core in New York.
We don’t want our public schools turned into no-excuses charter chains. If they don’t want parents to believe they’re turning public schools into no-excuses charter chains, they should stop buying the books of no-excuses charter chain leaders and hiring them as consultants to drill teachers in test-taking techniques.
I wonder how wealthy Sanders became from all this.
All the money that could have supported real student success and improved the schools that need the help… Very sad for me, after teaching 25 years and supporting the public schools. Wake up, people: our schools need help now. Help them and support all our children. Public schools should be first class, in all states. Sign off… Frustrated Teacher
Reblogged this on peakmemory and commented:
As I have pointed out, value-added measures of teachers are statistically invalid. This piece from Diane Ravitch’s blog makes it clear that you should not base policy on an invalid statistical analysis.
http://peakmemory.me/2013/10/02/value-added-measures-of-teachers-are-invalid/
Diane,
$300 per kid? That would be quite a lot: $300 × 30, or even $300 × 150. I must be misunderstanding, so please correct me!
How many teachers, I wonder, get it one year and not the next, etc.
Deb
For more information see website: http://www.deborahmeier.com
Reblogged this on Transparent Christina.
Deb: The data indicate that the difference between a VAM-identified “great” teacher and an “average” one equals $300 per kid for each year that student works, over a 30-year career. So it means $9,000 over a LIFETIME, which, let’s face it, is a tiny amount. Certainly not enough to differentiate among the strongest and weakest teachers, or to tell much of anything about which teachers are doing the job “better.”
Also, to Deb’s point, here’s the article where I break down the value-added “impact.”
http://kyedreport.com/?p=33
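For readers puzzling over the arithmetic, as Deb was above, here is a minimal sketch, assuming the figures quoted in the thread (about $300 per student per year of working life, and a 30-year career; the numbers come from the post, the code is only illustration):

```python
# Back-of-the-envelope check of the figures quoted in the thread:
# roughly $300 per student per year of working life is said to separate
# a value-added "great" teacher from an "average" one.

per_student_per_year = 300   # dollars, from the analysis cited above
working_years = 30           # assumed length of a working career

# The claim as stated: per student, summed over that student's lifetime.
lifetime_per_student = per_student_per_year * working_years
print(lifetime_per_student)  # 9000 dollars per student over a lifetime

# Deb's reading multiplied by class size instead. With 30 students that
# is $300 x 30 = $9,000 per year for the whole class; a different
# quantity, though still small next to the cost of the TVAAS apparatus.
per_class_per_year = per_student_per_year * 30
print(per_class_per_year)    # 9000 dollars per year across one class of 30
```

Either way you slice it, the dollar gap is modest, which is the point of the replies above.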
The VAM provides a good lesson in itself: an analyst can come up with a good basic idea, and yet, after years of consideration of how to apply the methodology, by many people and with contracts let for millions of dollars, no one takes the time to study the process carefully enough to suggest different independent variables. The dependent variable should still be the teacher’s score (not student scores), while the independent variables (the ones presumed to be acting on the dependent variable) should have been the children’s assessments of all the things they enjoyed about the class, the specific things they learned, how well they thought the teacher knew the subject and was able to answer questions, the number of times the teacher seemed to lose her patience, and so on. With this format, you could even lightly sprinkle in some key substance questions, like “Did your teacher teach you how to solve a quadratic equation?” (followed by a request that they list the steps); because it is a questionnaire about the child’s opinion of the teacher, it would not be threatening. Another good question for third through fifth graders would be “Did you learn about using abstract thinking to solve problems?” or, maybe more appropriately, “Did she teach you much about how math might be used to solve problems?” This would also only require minor editorial changes, like from “VAM” to “CAVAM” or “CATVAM.”
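The survey-based alternative sketched in that comment could, in its simplest form, be fit as an ordinary least-squares regression. The sketch below is purely hypothetical: the questionnaire items, the synthetic data, and the weights are all invented for illustration, and nothing here reflects the actual TVAAS model.

```python
# Hypothetical sketch of the "CAVAM" idea: regress a teacher-level score
# (dependent variable) on students' questionnaire responses (independent
# variables). All names and data are invented; this is not TVAAS.
import numpy as np

rng = np.random.default_rng(0)
n_teachers = 200

# Invented questionnaire averages per teacher, on a 1-5 scale:
# enjoyment of the class, perceived subject knowledge, patience.
X = rng.uniform(1, 5, size=(n_teachers, 3))

# Simulate a teacher score loosely driven by the survey items plus noise,
# so the regression has something to recover.
true_weights = np.array([0.5, 1.0, 0.3])
y = X @ true_weights + rng.normal(0, 0.5, size=n_teachers)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n_teachers), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # intercept, then one estimated weight per questionnaire item
```

Whether student questionnaires would make better regressors than test scores is exactly the kind of question the comment says nobody paused to ask.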
You can get more from where Andy Spears got his summary at The Mismeasure of Education, particularly Part 3.
Absolutely, Jim. Good stuff!
I posted a link to this article on the TN DOE Facebook page. I was surprised that someone took the time to reply to my post. This was their reply.
For more information on TVAAS check out some tips from Tennessee teachers on how they are using it to make a difference in their classrooms.
Click to access Seating_Chart_Tetris.pdf
Click to access How_I_use_TVAAS.pdf
20 years of TVAAS and they show me a seating chart. Unbelievable!!