Peter Greene reports that the National Association of Secondary School Principals is reviewing and likely to endorse a statement rejecting VAM. The NASSP recognizes a growing body of research that shows the inaccuracy of VAM.
They cite the research, then offer recommendations:
“NASSP recommends that teacher eval include multiple measures, and that Peer Assistance and Review programs are the way to go. Teacher-constructed portfolios of student learning are also cool.
“VAMs should be used to fine tune programs and instructional methods as well as professional development on a building level, but they should not be ‘used to make key personnel decisions about individual teachers.’ Principals should be trained in how to properly interpret and use VAMmy data.”
This is an important step forward, toward professional responsibility and common sense.
Can someone please comment on the use of VAM measures in the new CAEP accreditation procedures?
I’m not sure what you mean. The whole process they are suggesting is blatantly flawed and seems designed only to create massive amounts of “paperwork.” As far as I can tell, the best thing for a school to do is to have lots of friends in high places.
What clarification are you looking for? Perhaps more important is what you and colleagues can do about this now that the procedures are hard-wired into federal policy.
CAEP standards are filled with the same accountability demands and jargon inflicted on public education. CAEP standards apply to “any entity responsible for the preparation of educators including a nonprofit or for-profit institution of higher education, a school district, an organization, a corporation, or a governmental agency.” In effect, a traditional college degree can be bypassed.
Here are a few of the standards.
1.2 Providers ensure that completers use research and evidence to develop an understanding of the teaching profession and use both to measure their P-12 students’ progress and their own professional practice.
1.3 Providers ensure that completers apply content and pedagogical knowledge as reflected in outcome assessments in response to standards of Specialized Professional Associations (SPA), the National Board for Professional Teaching Standards (NBPTS), states, or other accrediting bodies (e.g., National Association of Schools of Music – NASM).
1.4 Providers ensure that completers demonstrate skills and commitment that afford all P-12 students access to rigorous college- and career-ready standards (e.g., Next Generation Science Standards, National Career Readiness Certificate, Common Core State Standards).
1.5 Providers ensure that completers model and apply technology standards as they design, implement and assess learning experiences to engage students and improve learning; and enrich professional practice.
Here is a sample of the detail.
Example (CAEP Standard 4.1, p. 13): “The provider documents, using multiple measures, that program completers contribute to an expected level of student-learning growth. Multiple measures shall include all available growth measures (including value-added measures, student-growth percentiles, and student learning and development objectives) required by the state for its teachers and available to educator preparation providers, other state-supported P-12 impact measures, and any other measures employed by the provider.”
Add this to the mix, from p. 27: “Measures of completer impact, including available outcome data on P-12 student growth, are summarized, externally benchmarked, analyzed, shared widely, and acted upon in decision-making related to programs, resource allocation, and future direction.” There is much more, but value-added measures (VAMs) and proxies such as SLOs and SGOs were accepted by CAEP, perhaps because USDE said they had to be there.
Click to access final_board_approved1.pdf
“Teacher-constructed portfolios of student learning are also cool.” Is this a teacher portfolio or student? In any case, do we really need to mandate more “cool” things for teachers to do? Back when making teacher portfolios was the “cool” thing to do, I made one for a job search. I took it to one interview; they told me it was very nice that I had made one but they didn’t need it. Exit portfolio.
Now, peer assistance/evaluation: my experience with such a system was totally positive.
That line caught my eye also. “Cool”. WTF????
“Teacher-constructed portfolios of student learning are also cool.”
See above comment!
“VAMs should be used to fine tune programs and instructional methods as well as. . . ”
VAMs SHOULDN’T BE USED FOR JACKSHHEEEIIIIIIITTTT!!!!
VAMs are completely invalid mental masturbation of some economist’s metrics melted mental material.
How much measly mouthed machinations can these administrators come up with to not say what needs to be said (see the capitalized statement above) while appearing to “make a statement”????
Spineless Chickensheeiiittttsss those administrators.
Peer validators in Newark are shipped in from Connecticut and are being utilized to the detriment of teachers. I am with Duane on this one.
“VAM: The Scarlet Letter”
“For VAM Nobel Tolls”
VAMs are mathturbation
A bunch of voodoo spells
By folks seeking adulation
From those who give Nobels
VAM never should have left the think tank.
I am with Duane regarding “Peer Assistance and Review programs are the way to go. Teacher-constructed portfolios of student learning.”
It is completely disastrous and meaningless: praise to your face and a stab in the back from people without dignity, or, as we call them, “Spineless Chickensheeiiittttsss” administrators acting for their own profit. Back2basic
Reblogged this on peakmemory and commented:
“the National Association of Secondary School Principals is reviewing and likely to endorse a statement rejecting VAM”
A good step forward; however, I do have to say that I am not a big fan of “portfolios of student learning” as an evaluation technique.
Portfolios in Tennessee are being used as a proxy for SLOs and for VAM. Here is how the system works.
Tennessee has modified the SLO process in order to rate teachers of art, dance, music, and theater on the student “growth” they have produced.
The Tennessee initiative received state approval in 2012. This version of the SLO process includes a masked peer review of student work called “evidence collections.” Art teachers assemble these collections in a digital portfolio that also includes other documents for the evaluation. A dedicated online site facilitates the process.
Many of the criteria for submitting a portfolio are linked to the concept of “purposeful sampling.” For example, the teacher must select samples of student work from two points in time (comparable to a pretest and posttest) in order to represent student “growth.”
A teacher must submit five evidence collections. Each collection must be coded to identify the specific state standards for arts learning addressed in the lessons or units. The evidence collections must include “targets for learning” in three of the four major domains in state arts standards: Perform, Create, Respond, and Connect.
The online template offers guidance for submitting portfolios and understanding how the scoring system works. In this system, the art teacher rates the evidence collections—a form of self-evaluation. This self-evaluation becomes part of the portfolio. Then two exemplary art teachers with job-alike experience independently rate the portfolios—a form of masked peer review. These raters have been trained to use rubrics for the evaluation.
The final rating places the teacher into one of six levels of performance, from “significantly below expectations” to “significantly above expectations.” A third rater may be enlisted to ensure the final rating reflects a consensus.
In Tennessee, this “student growth” measure counts for 35 percent of a teacher’s overall evaluation. By 2013, 1,500 art teachers had been evaluated by this method. The plan is still a work-in-progress. It began with an extraordinary collaboration among teachers in the arts, community leaders in local and state agencies, scholars, and other experts in the arts and evaluation.
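To make the mechanics concrete, here is a minimal sketch of how a scoring scheme like this could be wired together. It is not Tennessee's actual rubric or consensus rule: the 1-5 scale, the one-point disagreement threshold, the averaging, and the names of the four middle performance levels are assumptions for illustration only; the five evidence collections, the six levels, the third-rater step, and the 35 percent weight come from the description above.

```python
# Illustrative sketch only -- not the actual Tennessee rubric or consensus rule.
# Assumptions (not from the description above): a 1-5 scoring scale per evidence
# collection, a one-point disagreement threshold, simple averaging, and the
# names of the four middle performance levels.
from statistics import mean
from typing import Optional

LEVELS = [
    "significantly below expectations",   # endpoint named in the description
    "below expectations",                 # assumed label
    "slightly below expectations",        # assumed label
    "at expectations",                    # assumed label
    "above expectations",                 # assumed label
    "significantly above expectations",   # endpoint named in the description
]

def portfolio_level(rater_a: list, rater_b: list,
                    rater_c: Optional[list] = None) -> str:
    """Combine two independent peer ratings of the five evidence collections;
    bring in a third rater when the first two disagree."""
    a, b = mean(rater_a), mean(rater_b)
    if rater_c is not None and abs(a - b) > 1.0:  # no consensus: use third rater
        score = mean([a, b, mean(rater_c)])
    else:
        score = mean([a, b])
    idx = min(5, int((score - 1) / 4 * 6))        # map a 1-5 score onto six levels
    return LEVELS[idx]

def overall_evaluation(growth_score: float, other_measures: float) -> float:
    """The portfolio 'growth' measure counts for 35 percent of the overall
    evaluation; everything else makes up the remaining 65 percent."""
    return 0.35 * growth_score + 0.65 * other_measures
```

The point of the sketch is only that the consensus step and the fixed 35 percent weight are simple mechanics; what the underlying ratings actually measure is the contested part.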
The process seems to have USDE approval–for the moment. As far as I know, this process ends up being impersonal in that face-to-face evaluations between teachers and their principals are not needed, at least for this component of their evaluation.