Here is the reason for the collapse of test scores in New York City and New York State.
State officials decided that New York test scores should be aligned with the achievement levels of the federally-administered National Assessment of Educational Progress (NAEP).
This is an excerpt from a press release prepared by Mayor Bloomberg’s office:
“The new State test results are in line with previous results for students’ readiness for college and careers and show New York City students have maintained gains made over the past decade. The percentage of New York City students meeting the new, higher bar for proficiency in math (29.6 percent) is similar to the percent of students measured proficient on the 2011 National Assessment for Educational Progress (NAEP) tests (28.0 percent) – up from 20.5 percent on the NAEP test in 2003. The percentage of New York City students meeting the new, higher bar for proficiency in English (26.4 percent) is similar to the results on the most recent NAEP English test (26.5 percent), up from 22.0 percent on the NAEP test in 2003.”
Now, leave aside for the moment the odd fact that the mayor is boasting about the appallingly low percentage of students in New York City who met the new proficiency standards after a decade of his control of the schools. The key point here is that the mayor, his Chancellor Dennis Walcott, Regents’ Chancellor Merryl Tisch, and State Commissioner John King all agreed that the state and city scores should be comparable to the NAEP “proficiency” level.
That is a huge mistake. It explains why the scores are invalid.
The state didn’t just “raise the bar.” It aligned its passing mark to a completely inappropriate model.
The state scores have four levels: level 4 is the highest, level 1 is the lowest. In the present scoring scheme, students who do not reach level 3 or level 4 have “failed.”
NAEP has three achievement levels: “Advanced” is the highest (only about 3-8% of students reach this level). “Proficient” is defined by the National Assessment Governing Board as “solid academic performance for each grade assessed.” It is a very high level of academic achievement. “Basic” denotes “partial mastery” of the skills and knowledge needed at each grade tested.
“Proficient” on NAEP is what most people would consider to be the equivalent of an A. When I was a member of the NAEP governing board, we certainly considered proficient to be a very high level of achievement.
New York’s city and state officials have decided that NAEP’s “proficiency” level should be the passing mark.
They don’t understand that a student who is proficient on NAEP has attained “a very high level of academic achievement.”
Any state that expects all or most students to achieve an A on the state tests is setting most students up for failure.
If students need to reach “proficiency” just to pass, there will obviously be a very large number of students who “fail.”
B students and C students will fail.
The NAEP achievement levels have always been controversial. Many researchers and scholarly bodies have said they were unreasonably high and thus “fundamentally flawed.” That term “fundamentally flawed” occurs again and again in the literature of NAEP critics. This article by James Harvey is a good summary of these arguments.
Some on this blog have asked whether NAEP is a criterion-referenced test, and the answer is no. A criterion-referenced test is one that almost everyone can pass if they master the requisite skills. A test to get a driver’s license is a criterion-referenced test. Anyone who studies the laws can pass the written test and qualify for a driver’s license.
NAEP is not a criterion-referenced test. Massachusetts is the only state where as many as 50% of students (and only in fourth grade) are rated proficient in reading. The NAEP tests are not designed to be criterion-referenced tests; they are a mix of questions that are easy, moderate, and difficult.
The achievement levels were created when Checker Finn was chair of NAGB. I think they are defensible if people understand that the achievement levels do not represent grade levels. If the public wants a measure of “grade level,” then “basic” probably comes closest to grade level. “Proficient” is not grade level; as NAGB documents state, it represents “a very high level of academic achievement.”
More important, the NAEP achievement levels were never intended to be measures of grade level, and New York officials are wrong to interpret them as such, especially when they mistakenly use “proficient” as the passing mark.
Any state that uses NAEP “proficient” as its definition of “grade level” is making a huge mistake; it will set the bar unreasonably high and will mislabel many students and misjudge the quality of many schools.
And that is exactly what happened in the New York testing fiasco.
If the state sticks to its present course of using NAEP “proficient” as its passing mark, it will encourage criticism of the Common Core standards as unrealistic and stoke parental outrage about Common Core testing.
People know their children, and they know their own school. The politicians may convince them that American education is floundering (even if it is not), but they cannot convince parents that their own child and their own school are “failing” when parents know from their own experience that this is not true.
The corporate reformers now using the Shock Doctrine to bash the schools and disparage students may find that their tactic has backfired. They are succeeding only in adding fuel to the growing movement to stop the misuse of standardized testing.
What is happening in New York is likely to undermine public confidence in the state’s highest education officials and create new converts to the Opt-Out of Testing movement.
The Shock Doctrine may be a boomerang that helps to bring down the madness of No Child Left Behind, Race to the Top, Common Core, the Pearson empire, and every other part of the reformy enterprise.
New York may have inadvertently created the most powerful recruiting tool for the Opt Out movement.