Jennifer Borgioli, whom I met via Twitter and know as DataDiva, has sent me a post about performance assessments in New York. She is responding to an earlier post about the New York Performance Standards Consortium, which has thus far not gotten permission from the state to add 19 schools to its group. The Consortium many years ago won an exemption from all state standardized testing (except for the Regents ELA exam) and relies instead on performance assessments judged by teachers and others. The article cited in the earlier post indicated that the state was reluctant to allow other schools to escape the state testing regime, which Jennifer does not contest. She believes, however, that the state is open to performance assessment within its testing regime.
In your blog, your last paragraph seemed to suggest that NYS wants to discourage the use of performance-based tasks, when in fact I don’t think that’s the case. I cannot speak to the reasons why there is hesitation to approve more schools for performance-based alternatives to the Regents, but I do know there is room for performance-based tasks in all schools in New York. In effect, there are two types of performance tasks. There are the large-scale, long-term tasks you described (portfolio assessments, etc.), which are used as exit criteria and reflect a deep understanding of the content and skills in a given domain or topic; and there are small-scale, shorter, on-demand performance tasks that require students to follow a series of steps in order to generate a product or engage in a performance.
The NYS APPR guidance documents reference performance tasks at least twice:
F5. We want to use locally‐developed performance tasks for a variety of grades and subjects that would be assessed using a rubric. Is that allowable?
Subject to local negotiation, locally‐developed performance tasks scored by a rubric could be used as a district, regional, or BOCES developed assessment wherever locally developed assessments are allowed as either a comparable growth measure or a locally selected measure provided that such assessments are rigorous and comparable as described above.
G4. Does the vested interest rule apply to pre‐tests given to establish a baseline for a SLO?
To the extent practicable, districts or BOCES should ensure that any assessments or measures, including those used for performance‐based or performance task assessments that are used to establish a baseline for student growth, are not disseminated to students before administration and that teachers and principals do not have a vested interest in the outcome of the assessments they score.
In a number of regions across the state, teachers are working together to design performance tasks that get at the most critical learning of their content area or course, as determined by the New York State Learning Standards, in a way that is as authentic as possible. These are not large-scale tasks full of student choice and authentic assessment, but neither are they traditional multiple-choice tests. Their use reflects a commitment by the participating schools to minimize the impact of APPR regulations on students. These assessments are designed by classroom teachers and will be scored by them using rubrics they create. Though we do not yet have evidence of reliability (that will come after the administration of the pre-assessments and inter-rater reliability analysis this fall), there is every reason to believe that these assessments will generate results as consistent as those generated by a machine-scored, publisher-created multiple-choice test.
It’s a small move but it’s a start.