Archives for category: NAEP

Professor Jack Hassard of Georgia State University concludes, after reviewing Tom Loveless’s report for Brookings, that the Common Core Standards have had little or no effect on NAEP math scores, just as Loveless predicted a few years ago.

The states most aligned with CCSS had the smallest gains.

Overall, eighth grade math scores show very little improvement since the Common Core was rolled out in 2010.

 

He writes:

 

Between 1990 – 2013 there was a 22 point increase in 8th grade math. Over the 23 years this amounts to about a 1 point increase per year. However, the average score increase from 2009 – 2013, the years the Common Core has been used, has only increased 0.30 points per year, much less than before the roll out of the Common Core.
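To make the comparison concrete, here is a minimal sketch of the arithmetic, using only the figures quoted above (the 22-point gain from 1990 to 2013 and the 0.30-point-per-year rate for 2009-2013). It is illustrative, not Hassard's own calculation:

```python
# Rough check of the per-year rates quoted above.
# The input figures come from the quoted passage; nothing else is NAEP data.

long_run_gain = 22        # 8th grade math scale points gained, 1990-2013 (quoted)
long_run_years = 23       # 1990 through 2013

pre_ccss_rate = long_run_gain / long_run_years   # roughly 0.96 points per year
post_rollout_rate = 0.30                         # points per year, 2009-2013 (quoted)

print(f"Long-run trend, 1990-2013:    {pre_ccss_rate:.2f} points per year")
print(f"Post-rollout rate, 2009-2013: {post_rollout_rate:.2f} points per year")
```

Run as written, the sketch simply shows that the post-rollout rate Hassard cites is less than a third of the long-run trend.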

 

Well, four years is too soon to see the radical improvements that Bill Gates and others have promised. Maybe we will have to wait a full decade to know whether the billions spent on CCSS were well spent.

 

 

Did you know that the National Assessment of Educational Progress used to test much more than reading and math, much more than academic subjects? Did you know that it was originally designed to assess student cooperation and behavior as well as skills? Did you know that the narrowing of NAEP testing is fairly recent?

Richard Rothstein knows what NAEP was supposed to be and he explained its history to the governing board of NAEP on the occasion of its 25th anniversary.

He wrote:

“Education policy in both the Bush and Obama administrations has suffered from failure to acknowledge a critical principle of performance evaluation in all fields, public and private—if an institution has multiple goals but is held accountable only for some, its agents, acting rationally, will increase attention paid to goals for which they are evaluated, and diminish attention to those, perhaps equally important, for which they are not evaluated.

“When law and policy hold schools accountable primarily for their students’ math and reading test scores, educators inevitably, and rationally, devote less instructional resources to history, the sciences, the arts and music, citizenship, physical and emotional health, social skills, a work ethic and other curricular areas.

“Over the last decade, racial minority and socio-economically disadvantaged students have suffered the most from this curricular narrowing. As those with the lowest math and reading scores, theirs are the teachers and schools who are under the most pressure to devote greater time to test prep, and less to the other subjects of a balanced instructional program.

“One way the federal government promotes this distortion is through its National Assessment of Educational Progress (NAEP), an assessment administered biennially in every state, but only in math and reading. Government officials spend considerable effort publicizing the results. They call NAEP “The Nation’s Report Card,” but no parent would be satisfied with so partial and limited a report card for his or her child.

“Twenty-five years ago, Congress created the National Assessment Governing Board (NAGB) to create NAEP policy. At NAGB’s conference today celebrating its silver anniversary, Rebecca Jacobsen and I describe (in a presentation drawn from our book with Tamara Wilder, Grading Education: Getting Accountability Right) how NAGB’s disproportionate attention to math and reading was not intended when NAEP was first administered in the early 1970s.

“In those early years, NAEP attempted to assess any goal area for which schools devote, in the words of NAEP’s designers, “15-20% of their time…, [the] less tangible areas, as well as the customary areas, in a fashion the public can grasp and understand.”

“For example, to see whether students were learning to cooperate, NAEP sent trained observers to present a game to 9-year-olds in sampled schools. In teams of four, the 9-year-olds were offered a prize to guess what was hidden in a box. Teams competed to see which, by asking questions, could identify the toy first. Team members had to agree on which questions to ask, and the role of posing questions was rotated. Trained NAEP observers rated the 9-year-olds on their skills in cooperative problem-solving and NAEP then reported on the percentage who were capable of it.

“NAEP assessors also evaluated cooperative skills of 13- and 17-year-olds. Assessors presented groups of eight students with a list of issues about which teenagers typically had strong opinions. Students were asked to reach consensus on the five most important and then write recommendations on how to resolve two of them. The list included, for 13-year-olds, such issues as whether they should have a curfew for going to bed, and for 17-year-olds, eligibility minimums for voting, drinking, and smoking. NAEP observers rated skills such as whether students gave reasons for their points of view and defended a group member’s right to hold a contrary viewpoint.”

For the full presentation, written with Rebecca Jacobsen, read this.

Late last night, I posted a commentary that connected two seemingly unrelated communications. One was an article in Slate by psychologist Laurence Steinberg, which argued that our high schools are not rigorous enough and our seniors are not learning enough, bemoaning both the kids and the schools. It happened to arrive at about the same time as a letter in my inbox from a teacher in upstate New York.

I pointed out that when I was a member of the NAEP governing board (NAGB), we devoted an entire meeting to discussing the well-known problem of seniors not caring about NAEP scores. They know that NAEP counts for nothing, and many turned in blanks or doodled or made silly patterned guesses to show their disdain for being asked to take yet another test of no significance.

When Steinberg saw my post, he tweeted (I paraphrase): when given a choice between anecdote and data, I choose data.

I responded that Mark Twain said there were lies, damn lies, and statistics.

The serious answer is that before one uses data to condemn or judge people, one should evaluate the quality of the data. In this case, I can testify to the fact that the governing board studied the 12th graders’ lack of motivation to comply. We thought about offering cash or pizza parties to students who agreed to take the test seriously. No one had a good answer. The data are not reliable.

Maybe the Puritans started the tradition of saying that the younger generation is going to hell. Why can’t we ever stop wailing about the kids? Whatever they are, they reflect the society they were born into. I expect great things from them, despite the obstacles we older folks place before them, despite our dysfunctional politics, despite adults’ misguided priorities, despite all the bad educational policies our kids must overcome.

At the time I wrote last night, little did I know that the teacher and the professor had graduated from the same college and had engaged in a similar exchange long ago.

The teacher wrote this morning:

“Thanks to Diane for posting my original letter and for working so hard on behalf of teachers and our students.

“I turned on the computer this morning to look for my letter and nearly fell out of my chair when I saw my name mentioned alongside a reference to Dr. Laurence Steinberg. You see, Dr. Steinberg and I exchanged a number of letters many years ago about an op-ed piece he had written in the Times. I’d used his piece in my classes back in the early 1990s and my students took great issue with it. They were, to put it nicely, mad as hell at Larry.

“[Dr. Steinberg..... can I call you Larry? The fact that I've bumped into you again this way in the middle of the internet after all these years is just a little weird, isn't it? We ought to get together someday. I'll buy you a coffee. Hell, I'll even pay for your lunch. We both graduated from Vassar College so we can talk about the beautiful campus there when we need a break from arguing about education.]

“Suffice to say, that if I had to re-do my correspondence with Larry again there’s definitely things I would do differently. And, I’d like to think that Larry might feel the same way.

“But, Larry, Diane couldn’t be more right. Many of these tests are bogus. And, the kids know it. They’re certainly smart enough not to waste their time, especially considering the fact that we all seem to have so much less of that time nowadays.

“I’m so tired of hearing the same old cliche, “Kids today….blah, blah, blah….” After teaching in a high school for 26 years, I can say that these “kids today” are much more serious and hardworking than my own classmates back in the 1970s. They have to be. We’ve given them no choice.

-John Ogozalek”

Something magical is happening in San Diego. It is a good school district. Teachers, administrators, and the school board are working toward common goals.

San Diego, in my view, is the best urban district in the nation.

I say this not based on test scores but on the climate for teaching and learning that I have observed in San Diego.

It’s not the weather, which of course is usually magnificent. Los Angeles too has great weather but it is constantly embroiled in turmoil, with teachers against administrators, the school board divided, and political tensions underlying every decision and policy.

San Diego went through its time of troubles in the late 1990s and early 2000s (I wrote about it in my next to last book, The Death and Life of the Great American School System, in which I devoted a chapter to the upheaval in San Diego, where corporate-style, top-down reform was birthed).

But in recent years, San Diego has elected a school board that works harmoniously with the teachers and their union. Until recently, it had a superintendent, Bill Kowba (a retired Navy admiral), who understood the value of teamwork. And with the leadership of an activist board, a new spirit of community-based reform began to take hold.

Scores went up on almost everything that was tested, but that was not what mattered most to the new (and true) reformers in San Diego. The rising test scores were the result of the new spirit of community-building that included parents, students, teachers, administrators, and the local community.

San Diego, of course, rejected Race to the Top funding. It didn’t want to make test scores more consequential than they already were.

When Superintendent Kowba retired, the San Diego school board met and immediately announced their choice of a new superintendent, without conducting a national search. The board asked Cindy Marten, one of the district’s best elementary school principals, to assume the superintendency. She was stunned, and she chastised them for not casting a wider net. But she took the job.

Cindy is a leader. She knows how to inspire and lead. She respects the work of principals and teachers, and they respect her. She also knows the importance of parent and community engagement.

Her motto, which is a playful twist on the KIPP motto, is: “Work Hard. Be Kind. Dream Big! No Excuses.”

No matter how sunny the skies for the schools, no matter how harmonious the educators, parents, and children, the business community is grumpy. It can’t get over the fact that San Diego doesn’t have a brash, disruptive superintendent who wants to test the kids until they cry “uncle,” demean the teachers, and hold everyone’s feet to the fire. It can’t accept that there is any other way to lead the schools. And it can’t give up on its favorite meme that the schools are “failing” even though they are not.

These views were expressed in full force recently when the San Diego Union Tribune, a deeply conservative newspaper, published an editorial longing for the good old days when Terry Grier was superintendent and lamenting “what might have been” if only Grier had stayed in San Diego to do what he is now doing in Houston. The UT can’t believe that San Diego let him go to Houston, where he is following the corporate reform script: handing out bonuses, firing teachers, and using test scores as a club to beat them up. Talk about being a skunk at the garden party!

There was pushback. One board member wrote a letter to the editor pointing out that the dropout rate in Houston was nearly double the dropout rate in San Diego and commending Cindy Marten for avoiding the polarizing tactics associated with certain other unnamed superintendents.

But whoa! There are also some basic facts that the Union Tribune should have noticed. On the 2013 NAEP, San Diego’s public schools outperformed Houston’s in both math and reading, in grades 4 and 8. San Diego is in the top tier of urban districts; Houston is not. San Diego’s scores on the NAEP have steadily improved over the past decade. The proportion of students who score “below basic” has dropped significantly, and the proportion who score at or above proficient has increased significantly over the same period. Why does the UT envy a lower-performing district and dismiss the solid, steady, persistent gains of its own district?

Michael Casserly, the fair-minded and careful leader of the Council of the Great City Schools, wrote an article for the newspaper applauding the success of San Diego and the leadership of Cindy Marten, but the Union Tribune failed to publish it.

Doug Porter of the San Diego Free Press wrote up the imbroglio and called out the UT for its humbug and hypocrisy. He aptly called his article “Facts Don’t Matter in Newspaper’s Quest to Demonize Public Education in San Diego.”

He wrote:

Talk about your cheap shots. It was bad enough when the UT-San Diego editorial board whipped up an attack on our city’s schools laden with misstatements, factual errors and a personal attack on Superintendent Cindy Marten. But when a nationally recognized education leader stepped forward to correct the record on her behalf, his response was deemed unworthy for publication.

It’s all very Orwellian; reality isn’t simply what Papa Doug Manchester tries to tell us it is. When his minions refuse to acknowledge something, the idea is for you to believe that it never happened.

One of the longest running narratives with our Daily Newspaper has been their dislike for the Board of Trustees at San Diego Unified. The paper’s ‘reform’ agenda for public education mirrors the libertarian/conservative wet dream of privatized charter schools, a change that means monetizing learning for corporate interests and creating a two-tiered system favoring the wealthier (and white) classes.

The reality that voters have elected and re-elected progressives to a school board that refuses to demonize teachers and puts the classroom first just is too much for them to handle. So this hatchet job is consistent with their refusal to acknowledge that SD Unified is making steady, determined progress (and is, in fact, a national leader among urban school districts).

Porter includes the full text of Mike Casserly’s supportive article about the steady progress of the San Diego public schools. This is my favorite line from his letter chastising the San Diego UT:

“So, pining for a previous superintendent is not only an affront to Ms. Marten but is akin to daydreaming about a former lover on your honeymoon.”

Porter makes only one mistake. He suggests that the school district engaged in “puffery” when it talked about its steady improvement on NAEP. I disagree. San Diego has made steady progress. On most NAEP measures, it outperforms other large city districts. This is a record to be proud of, not puffery.

San Diego now has the political climate that every district should have: a wise and experienced educator as leader; a collaborative relationship among administrators, teachers, the union, and the school board; a sense of vision about improving the education of every child and a determination to provide a good public school in every neighborhood. This is a vision far, far from the reformy effort to close down public schools and replace them with a free market. Unlike Chicago, Philadelphia, Houston, and most other urban districts, San Diego has the right vision, the right climate, and the right leadership. There is a unity of purpose focused on children that is impressive.

And that is why San Diego at this moment in time is the best urban district in the nation.

Just as the holidays began, Education Week published a very important article explaining why Common Core testing causes a collapse of test scores.

Since most people were preoccupied with preparations for the holidays, it probably didn’t get much attention. But it should have, because it unlocks the mystery of why state after state is experiencing a 30-point drop in passing rates on Common Core tests.

As Catherine Gewertz wrote:

“It’s one thing for all but a few states to agree on one shared set of academic standards. It’s quite another for them to agree on when students are “college ready” and to set that test score at a dauntingly high place. Yet that’s what two state assessment groups are doing.

“The two common-assessment consortia are taking early steps to align the “college readiness” achievement levels on their tests with the rigorous proficiency standard of the National Assessment of Educational Progress, a move that is expected to set many states up for a steep drop in scores.

“After all, fewer than four in 10 children reached the “proficient” level on the 2013 NAEP in reading and math.”

I served on the NAEP governing board for seven years. NAEP “proficient” was never considered a passing mark; it signifies excellent academic performance. Only one state in the nation, Massachusetts, has 50% of its students at NAEP proficient.

It is absurd to set such a high bar for “passing.” It is a guarantee that most students will fail.

Why do we want an education system that stigmatizes 60-70% of all students as “failures”?

Is the purpose of education to develop citizens and healthy human beings or is it to sort and rank the population for selective colleges and the workplace?

Bruce Baker has written an important post about the inability of pundits (and journalists) to read NAEP data.

Part of the misinterpretation is the fault of the National Assessment Governing Board, which supervises NAEP. It keeps a tight embargo on the scores, releasing them in advance only to reporters. It holds a press conference, where board members and one or two carefully chosen outsiders characterize the scores.

He writes:

“Nothin’ brings out good ol’ American statistical ineptitude like the release of NAEP or PISA data. Even more disturbing is the fact that the short time window between the release of state level NAEP results and city level results for large urban districts permits the same mathematically and statistically inept pundits to reveal their complete lack of short term memory – memory regarding the relevant caveats and critiques of the meaning of NAEP data and NAEP gains in particular, that were addressed extensively only a few weeks back – a few weeks back when pundit after pundit offered wacky interpretations of how recently implemented policy changes affected previously occurring achievement gains on NAEP, and interpretations of how these policies implemented in DC and Tennessee were particularly effective (as evidenced by 2 year gains on NAEP) ignoring that states implementing similar policies did not experience such gains and that states not implementing similar policies in some cases experienced even greater gains after adjusting for starting point.

“Now that we have our NAEP TUDA results, and now that pundits can opine about how DC made greater gains than NYC because it allowed charter schools to grow faster, or teachers to be fired more readily by test scores… let’s take a look at where our big cities fit into the pictures I presented previously regarding NAEP gains and NAEP starting points.

“The first huge caveat here is that any/all of these “gains” aren’t gains at all. They are cohort average score differences which reflect differences in the composition of the cohort as much as anything else. Two year gains are suspect for other reasons, perhaps relating to quirks in sampling, etc. Certainly anyone making a big deal about which districts did or did not show statistically significant differences in mean scale scores from 2011 to 2013, without considering longer term shifts is exhibiting the extremes of Mis-NAEP-ery!”

But if NAGB wanted intelligent reporting of the results, it would release them in advance not just to reporters but to qualified experts in psychometrics and statistics. Because it refuses to do this, NAEP results are reported like a horse race. Scores are up, scores are down. But most journalists never get past the trend lines and cannot find experts who have had time to review the scores and put them into context.

I have a personal beef here because I was given access to the embargoed data when I blogged at Education Week and had 4,000 readers weekly. Now, as an independent blogger with 120,000-150,000 readers weekly, I am not qualified to gain access to the data until after they are released (because I do not work for a journal like Edweek). I don’t claim to be a statistical expert like Bruce Baker, but surely the governing board of NAEP could release the data in advance to a diverse group of a dozen qualified experts to help journalists do a better job when the scores come out.

For more than two decades, we have heard that charter schools will “save” poor kids from “failing public schools.”

Most comparisons show that charter schools and public schools get about the same test scores if they serve the same demographics. When charter schools exclude English learners and students with severe disabilities, push out students with low test scores, or exclude students with behavioral issues, they are likely to boost their test scores artificially.

Nicole Blalock, who holds a Ph.D. and is a postdoctoral scholar at Arizona State University, compared the performance of charter schools and public schools on NAEP 2013.

She acknowledged the problems inherent in comparing the two sectors. Both are diverse, and demographic controls are not available.

Nonetheless, she identified some states where charter performance is better, and some where public school performance is better.

The result, as you might expect: Mixed.

Bottom line: charters are no panacea.

You can view here the results of the NAEP for urban districts, known as TUDA, or the Trial Urban District Assessment.

Five districts volunteered to take the NAEP in 2002.

Since then, the number has grown to 21 districts.

Test scores have generally risen, though not in all districts and not at the same rate.

Not surprisingly, demographics affect the scores.

Watch for changes over time in the proportion of high-poverty students.

As a New Yorker, I was very interested in the progress of what was once known as the “New York City miracle.” It disappeared.

On NAEP TUDA 2013, there was no “New York City miracle.” For almost every group and grade, scores have been stagnant since 2007. This year, the only group that saw a gain was white students in eighth grade. Black students and Hispanic students in fourth and eighth grades saw no gains at all. Black and Hispanic scores have been flat since 2005.

Knowing of Mayor Bloomberg’s large public relations staff and his pride in having “transformed” New York City’s public schools, I was curious to see how they would spin these flat results.

Here it is, in the Wall Street Journal:

“NYC Student Test Scores Rise Slower Than Other Cities”

“City Says Its Already High Scores Are Tougher to Improve”

But New York City is not number 1; it is not even number 2.

It is in sixth, seventh, or eighth place in reading and mathematics, compared with cities like Charlotte, Austin, Hillsborough County, Boston, and San Diego, yet its officials feel compelled to claim that they are just too darn accomplished to make improvements.

 

 

When I spoke in Rhode Island in October, I said that test scores were at their highest point in the past 40 years. I also said that the rate of increase had slowed after the passage of NCLB and Race to the Top. The largest recent gains occurred from 2000-2003, before the implementation of NCLB. Whoever writes the PolitiFact column for the Providence Journal claimed that my statements were “mostly false,” for reasons I did not understand, since I had the graphs from the US Department of Education to back me up.

I wrote a response, which the paper did not print.

Historian and teacher John Thompson corrected PolitiFact as well; his response is included in the previous link.

The newspaper just issued a correction, admitting its error.

It is good to set the record straight.

The Thomas B. Fordham Institute is a conservative think tank based jointly in DC and Dayton, Ohio. I was a founding board member and served on its board for many years, until 2009, when I decided I could no longer support its central focus on school choice and testing. I had tried to resign earlier but was persuaded by personal friendships to remain as an internal dissident. One of the qualities I admired about TBF was its candor in recognizing the shortcomings of its ideas and projects. In fact, when people ask me why I abandoned the right-wing crusade for choice, I often refer back to TBF’s blunt self-criticism of its own charter schools. I opposed the idea that TBF should become a charter authorizer but was outvoted. Then, over the next few years, my own illusions about charters were dashed as many of the charters we sponsored became failures.

The latest report from TBF, written by Aaron Churchill, continues the tradition of candor.

Churchill reviews the NAEP results for Ohio and acknowledges that traditional public schools significantly outperformed charter schools.

Churchill compares the performance of students eligible for free and reduced price lunch in both sectors and concludes:

“The results from this snapshot in time are not favorable to charter schools. In all four grade-subject combinations, charter school NAEP scores fall short of the non-charter school scores. And in all cases, I would consider the margin fairly wide—more so in 4th than 8th grade. In 4th grade reading, for example, non-charter students’ average score was 211, while charter students’ average score was 191, a 20 point difference.

“The difference, however, narrows in 8th grade. Charter school scores are only 5 points lower in reading and 6 in math. The standard error bars nearly overlap in 8th grade, but not quite—if the standard error bars had overlapped, the difference in scores would not have been meaningful.”
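To make the error-bar logic concrete, here is a minimal sketch of the “do the error bars overlap?” check Churchill describes. The 5-point 8th grade reading gap comes from the quoted passage, but the mean scores and standard errors below are hypothetical placeholders, not NAEP’s published values:

```python
# Minimal sketch of an error-bar overlap check between two group means.
# The 5-point gap mirrors the quoted 8th grade reading difference; the means
# and standard errors are hypothetical, not NAEP's published figures.

def error_bars_overlap(mean_a, se_a, mean_b, se_b):
    """Return True if the +/- 1 standard error intervals around the means overlap."""
    return (mean_a - se_a) <= (mean_b + se_b) and (mean_b - se_b) <= (mean_a + se_a)

noncharter_mean, noncharter_se = 267.0, 1.0   # hypothetical
charter_mean, charter_se = 262.0, 1.8         # hypothetical: 5 points lower

if error_bars_overlap(noncharter_mean, noncharter_se, charter_mean, charter_se):
    print("Error bars overlap: the gap may not be meaningful.")
else:
    print("Error bars do not overlap: the gap looks meaningful.")
```

Non-overlapping one-standard-error bars are only a rough screen for a meaningful difference; NAEP’s own significance tests account for its complex sampling design, so a sketch like this is illustrative, not a substitute for those tests.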

It was findings like these that convinced me that the proliferation of charter schools was no panacea; that most charter schools were no better than, and possibly weaker than, traditional public schools; and that an increase in charters–especially in a state like Ohio, where the charter sector is politically powerful and seldom (if ever) held accountable–would harm children and weaken American education.

And one aside about this post: I object to the idea, recently popular, that NAEP “proficient” should be treated as a reasonable goal for most children, and that anything less is disappointing. New York, which is probably not alone, has calibrated its Common Core tests to produce results aligned with NAEP “proficient,” so that anything less is considered failing. This is absurd. NAEP “proficient” represents superior achievement, not a pass-fail line. The only state in the nation that has reached the 50% mark is Massachusetts. Why set impossible and unrealistic goals? Did we learn nothing from the disaster of NCLB?
