Archives for the year of: 2014

Chalkbeat in Colorado reports that school authorities are worried about a mass opt-out by high school students in Boulder and in Douglas County and possibly other districts. The students say they have been tested nonstop during their entire school careers, and “enough is enough.” They are right.

 

This letter just in from a student leader in Colorado, who attends Fairview High School in Boulder, the epicenter of the student revolt. When the students organize and push back, they will change the national climate. Students are the true victims of our nation’s obsession with high-stakes standardized testing. It is they who are losing a real education while their schools are compelled to administer test after test, taking away a month or more of instruction and dropping the arts and other subjects that encourage creativity. When teachers and administrators protest, they can be fired. The students cannot be fired. They are powerful because they are free to voice their opinions without fear of retribution. If this time of national test mania should ever subside, it will be because students like these in Colorado stood together and demanded real education, real instruction, instruction meant to recognize their talents and to inspire them to ask questions, not to check the right boxes. As the scholar Yong Zhao writes in his latest book about Chinese education, standardized tests are inherently authoritarian; they require students to give the answer that the authorities demand. These students reject authoritarianism; they want an education that challenges them, inspires them, and brings out the best in them. And they are right. They are the Tom Paines of our time. May their numbers multiply. They act in the authentic American tradition of revolt against distant and oppressive authorities.

 

For their intelligence, their courage, and their resistance to mindless demands that destroy their education, I name these students to the honor roll of the blog. The adults are “just following orders.” The students are taking an active role in their own education.

 

Hello Ms. Ravitch,
My name is Jennifer Jun and I am a senior at Fairview High School in Boulder, Colorado. I’m writing to tell you that the senior class of our school, along with several other schools, is planning a protest of the Colorado Measures of Academic Success (CMAS) test that is expected to take place this Thursday 11/13 and Friday 11/14.

 

I have been following your blog and updates on educational issues for some time now, and I simply wanted to reach out and let you know. It would be an honor to have our event recognized by a key individual in the national education reform dialogue like you.
After extensive research and discussion, our senior class has decided that the implementation of this test did not take into account student opinions and also does not accurately reflect the Colorado social studies and science curriculum. Therefore, we students have decided to opt out of the test and gather by the school during the testing hours to protest the lack of student voice that goes into such educational reform.

 

The students have been actively initiating dialogue with the school administration and the district, and they intend to find other channels to talk to policy makers and individuals involved in implementing such tests.

 

Students have made a 3-minute informational video about the protest, which outlines additional details about the event: https://www.youtube.com/watch?v=38zAfVOu1tw&feature=youtu.be . We have also written an open letter discussing our opinions of the test: https://docs.google.com/document/d/1tbDg-SEqpYrBUwixGh4wuMu6B0YYnfftt6u-cI5dWmQ/edit?usp=sharing

 

The protest was just released to the public today, and here is one of the several articles outlining the event: http://www.dailycamera.com/boulder-county-schools/ci_26910001/boulder-valley-seniors-plan-protest-state-tests-this
Thank you for your time and for being such an active voice for the students and the betterment of education.

 

Sincerely,

 

 

Jennifer Jun
Fairview High School

jenniferjunfhs@gmail.com

The state of Pennsylvania is being sued by the following petitioners:

WILLIAM PENN SCHOOL DISTRICT; PANTHER VALLEY SCHOOL DISTRICT; THE SCHOOL DISTRICT OF LANCASTER; GREATER JOHNSTOWN SCHOOL DISTRICT; WILKES-BARRE AREA SCHOOL DISTRICT; SHENANDOAH VALLEY SCHOOL DISTRICT; JAMELLA AND BRYANT MILLER, parents of K.M., minor; SHEILA ARMSTRONG, parent of S.A., minor; TYESHA STRICKLAND, parent of E.T., minor; ANGEL MARTINEZ, parent of A.M., minor; BARBARA NEMETH, parent of C.M., minor; TRACEY HUGHES, parent of P.M.H., minor; PENNSYLVANIA ASSOCIATION OF RURAL AND SMALL SCHOOLS; and THE NATIONAL ASSOCIATION FOR THE ADVANCEMENT OF COLORED PEOPLE—PENNSYLVANIA STATE CONFERENCE.

The petitioners are represented by the Public Interest Law Center of Philadelphia and the Education Law Center of Pennsylvania.

Here is the summary of their challenge to the state:

PETITION FOR REVIEW IN THE NATURE OF AN ACTION FOR DECLARATORY AND INJUNCTIVE RELIEF

Petitioners, by and through their counsel, for their Petition for Review in the Nature of an Action for Declaratory and Injunctive Relief against Respondents, state and allege as follows:

INTRODUCTORY STATEMENT

The good Education of Youth has been esteemed by Wise men in all Ages, as the surest foundation of the happiness both of private Families and of Common-wealths. Almost all Governments have therefore made it a principal Object of their Attention, to establish and endow with proper Revenues, such Seminaries of Learning, as might supply the succeeding Age with Men qualified to serve the publick with Honour to themselves, and to their Country.1

1. From the earliest days of the Commonwealth, Pennsylvania has recognized a societal interest in public education—a charge that Respondents here are sworn to carry out. Under the Pennsylvania Constitution, Respondents have an obligation to support a thorough and efficient public school system that provides all children an equal opportunity to receive an adequate education. Through legislation and regulation, Respondents have established state academic standards that define precisely what an adequate education entails. But rather than equip children to meet those standards and participate meaningfully in the economic, civic, and social life of their communities, Respondents have adopted an irrational and inequitable school financing arrangement that drastically underfunds school districts across the Commonwealth and discriminates against children on the basis of the taxable property and household incomes in their districts.
In adopting this arrangement, Respondents have violated Article III, Section 14, of the Pennsylvania Constitution (the “Education Clause”), which requires the General Assembly to “provide for the maintenance and support of a thorough and efficient system of public education to serve the needs of the Commonwealth.” They have also violated Article III, Section 32 (the “Equal Protection Clause”), which requires Respondents to finance the Commonwealth’s public education system in a manner that does not irrationally discriminate against a class of children.

2. The General Assembly’s delegation of much of these duties to local school districts cannot elide its ultimate responsibility under the Education Clause and the Equal Protection Clause. Through this lawsuit, Petitioners seek to hold the General Assembly responsible for and accountable to its constitutional mandate.

3. Respondents are well aware that the current school financing arrangement does not satisfy that mandate. In 2006, recognizing its constitutional duty to ensure adequate school funding, the General Assembly passed Act 114, which directed the State Board of Education to conduct a comprehensive statewide “costing-out” study to determine the “basic cost per pupil to provide an education that will permit a student to meet the State’s academic standards and assessments.” Upon the study’s completion in 2007, Respondents learned that 95% of the Commonwealth’s school districts required additional funding, a shortfall that totaled $4.4 billion.
In response, the General Assembly approved a bill in 2008 that established funding targets for each school district and a formula for distributing education funds in a manner that would help ensure that all students could meet state academic standards. Even with a financial crisis sweeping the nation, Respondents were able to rely on that funding formula to begin to more equitably distribute state funds and federal stimulus money, which collectively increased funding for school districts by more than $800 million over three years. Beginning in 2011, however, Respondents abandoned the funding formula, slashed funding to districts by more than $860 million, and passed legislation to severely restrict local communities from increasing local funding. Meanwhile, the cost of meeting state academic standards continued to rise, opening a perilous and widening gap between the actual resources provided to school districts and the resources necessary to provide children in Pennsylvania an adequate education.

4. These funding cuts have had a devastating effect on students, school districts (especially less affluent school districts), teachers, and the future of the Commonwealth. The latest figures from the 2012–13 school year indicate that more than 300,000 of the approximately 875,000 students tested, including the children of the individual Petitioners in this action, are receiving an inadequate education—by Respondents’ own definition—and are unable to meet state academic standards. Specifically, these students are unable to achieve proficiency on the Pennsylvania System of School Assessment (“PSSA”) exams, which the General Assembly modified in 1999 to track state academic standards and measure student performance in reading, writing, math, and science.

5. Because of insufficient funding, Petitioner school districts are unable to provide students with the basic elements of an adequate education, such as appropriate class sizes, sufficient experienced and effective teachers, up-to-date books and technology, adequate course offerings, sufficient administrative staff, academic remediation, counseling and behavioral health services, and suitable facilities necessary to prepare students to meet state proficiency standards. In fact, the superintendent of the state’s largest school district has stated publicly that school staffing levels in 2013-14 were insufficient to provide students an adequate education.

6. Nor do Petitioner school districts have adequate resources to prepare students to pass the Keystone Exams, which measure student performance in math, science, and English. Achieving proficiency or higher on the Keystone Exams (or an equivalent project-based assessment) is a graduation requirement for all Pennsylvania students in the class of 2017 and beyond. Yet over 50% of students in the Commonwealth are currently unable to pass the Keystone Exams. Many of those students will leave high school without a diploma, hindering their ability to enter the workforce or “serve the needs of the Commonwealth.” The existing system of public education is therefore neither thorough nor efficient, as measured by the Commonwealth’s own academic standards and costing-out study.

7. What is worse, the very low levels of state funding and unusually high dependence on local taxes under the current financing arrangement have created gross funding disparities among school districts—an asymmetry that disproportionately harms children residing in districts with low property values and incomes. In fiscal year 2011, local sources provided 60% of the money that funded public education, while state appropriations accounted for only 34%. That year, only three states contributed a smaller percentage of the cost of public education than Pennsylvania.

8. As a consequence, total education expenditures per student now range from as little as $9,800 per student in school districts with low property values and incomes to more than $28,400 per student in districts with high property values and incomes, according to the Pennsylvania Department of Education’s 2012–13 data.2 This unconscionable and irrational funding disparity violates the Equal Protection Clause because it turns the caliber of public education into an accident of geography: Children in property- and income-poor districts are denied the opportunity to receive even an adequate education, while their peers in property- and income-rich districts enjoy a high-quality education.

9. This funding disparity is not justified by any difference in student needs. To the contrary, those students with the highest needs (e.g., English-language learners, students living in poverty) receive the fewest resources to prepare them to succeed. Nor is it justified by a desire to maintain local control over education. Any such “control” is illusory under the current financing arrangement because districts with low property values do not actually control the amount of resources at their disposal or the standards to which their students are held. In fact, many low-wealth districts have higher tax rates than property-rich school districts. In other words, it is not tax effort that explains the difference in funding. Rather, these underfunded districts are in areas so poor that, despite their high tax rates, they simply cannot raise enough money to improve education without more assistance from the state.

10. Petitioner Panther Valley School District (“Panther Valley”), a property-poor district, is a prime example of the funding disparity. In 2012–13, Panther Valley’s equalized millage rate of 27.8—the 27th highest of the Commonwealth’s 501 school districts—raised revenue of approximately $5,646 locally per student. Property-rich Lower Merion School District (“Lower Merion”), on the other hand, raised revenue of approximately $23,709 locally per student—four times more than Panther Valley—with an equalized millage rate of just 14.7, almost half of Panther Valley’s.

11. Although the state has made some effort to close that gap, contributing twice as much per student to Panther Valley as it did to Lower Merion, that still left Panther Valley with less than half the combined state and local funding of Lower Merion: $12,022 per student versus $26,700. Respondents cannot reasonably claim that $12,022 is adequate to educate a Panther Valley student—not when the State Board of Education’s own costing-out study showed that Panther Valley needed $13,427 per student based on 2005–06 costs. Over the past nine years, of course, those costs have only grown.

12. Given Respondents’ failure to address the funding crisis—and the ongoing harm their failure has inflicted on children throughout the Commonwealth—Petitioners ask this Court to declare the existing school financing arrangement unconstitutional and find that it violates both the Education Clause and the Equal Protection Clause. An objective framework for such an inquiry already exists. The state academic standards and student performance measures developed by Respondents beginning in 1999, as well as the costing-out study they commissioned, provide judicially manageable standards by which the Court can assess whether the General Assembly has maintained and supported “a thorough and efficient system of public education to serve the needs of the Commonwealth,” as required by the Pennsylvania Constitution.

13. Petitioners also seek an injunction compelling Respondents, after being given sufficient time to design, enact, and implement a school financing arrangement consistent with the Constitution, to halt any funding arrangement that (i) does not provide necessary, sufficient, and appropriate funding that ensures all students have an opportunity to obtain an adequate education and meet state academic standards, and (ii) irrationally discriminates against children who live in school districts with low property values and incomes.

1 Benjamin Franklin, Proposal Relating to the Education of Youth in Pennsylvania (1749), available at http://www.archives.upenn.edu/primdocs/1749proposals.html.

2 Unless otherwise noted, throughout this Complaint, “per student” is based upon Average Daily Membership (“ADM”) as reported by the Pennsylvania Department of Education. ADM refers to “all resident pupils of the school district for whom the school district is financially responsible.” It includes students in charter schools. See http://www.portal.state.pa.us/portal/server.pt/community/financial_data_elements/7672.

Professor Francesca Lopez of the University of Arizona responded to Betts and Tang’s critique of her post on the website of the National Education Policy Center.

 

She writes:

 

In September, the National Education Policy Center (NEPC) published a think-tank review I wrote on a report entitled, “A Meta-Analysis of the Literature on the Effect of Charter Schools on Student Achievement,” authored by Betts and Tang and published by the Center on Reinventing Public Education. My review examined the report, and I took the approach that a person reading the report and the review together would be in a better position to understand its strengths and weaknesses than a person reading the report in isolation. While my review includes praise of some elements of the report, there is no question that the review also points out flawed assumptions and other areas of weakness in the analyses and the presentation of those analyses. The authors of the report subsequently wrote a prolonged rebuttal claiming that I misrepresented their analysis and essentially rejecting my criticisms.

 

The rebuttal takes up 13 pages, which is considerably longer than my review. Yet these pages are largely repetitive and can be addressed relatively briefly. In the absence of sound evidence to counter the issues raised in my review, the rebuttal resorts to lengthy explanations that obscure, misrepresent, or altogether evade my critiques. What seems to most strike readers I’ve spoken with is the rebuttal’s insulting and condescending tone and wording. The next most striking element is the immoderately recurrent use of the term “misleading,” which is somehow repeated no fewer than 50 times in the rebuttal.

 

Below, I respond to each so-labeled “misleading statement” the report’s authors claim I made in my review—all 26 of them. Overall, my responses make two primary points:

 

 The report’s authors repeatedly obscure the fact that they exaggerate their findings. In their original report, they present objective evidence of mixed findings but then extrapolate their inferences to support charter schools. Just because the authors are accurate in some of their descriptions/statements does not negate the fact that they are misleading in their conclusions.

 

 The authors seem to contend that they should be above criticism if they can label their approaches as grounded in “gold standards,” “standard practice,” or “fairly standard practice.” When practices are problematic, they should not be upheld simply because someone else is doing it. My task as a reviewer was to help readers understand the strengths and weaknesses of the CRPE report. Part of that task was to attend to salient threats to validity and to caution readers when the authors include statements that outrun their evidence.

 

One other preliminary point, before turning to specific responses to the rebuttal’s long list. The authors allege that I insinuated that, because of methodological issues inherent in social science, social scientists should stop research altogether. This is absurd on its face, but I am happy to provide clarification here: social scientists who ignore details that introduce egregious validity threats (e.g., that generalizing from charter schools that are oversubscribed will introduce bias that favors charter schools) and who draw inferences from their analyses that have societal implications, despite their claims of being neutral, should act more responsibly. If they are unwilling or unable to do so, then it would indeed be beneficial if they stopped producing research.

 

What follows is a point-by-point response to the authors’ rebuttal. For each point, I briefly summarize those contentions, but readers are encouraged to read the full 13 pages. The three documents – the original review, the rebuttal, and this response – are available at http://nepc.colorado.edu/thinktank/review-meta-analysis-effect-charter. The underlying report is available at http://www.crpe.org/publications/meta-analysis-literature-effect-charter-schools-student-achievement.

 

#1. The authors claim that my statement, “This report attempts to examine whether charter schools have a positive effect on student achievement,” is misleading because: “In statistics we test whether we can maintain the hypothesis of no effect of charter schools. We are equally interested in finding positive or negative results.” It is true that it is the null hypothesis that is tested. It is also true that the report attempts to examine whether charter schools have a positive effect on student achievement.

 

Moreover, it is telling that when the null hypothesis is not rejected and no assertion regarding directionality can be made, the authors still make statements alluding to directionality (see the next “misleading statement”).

 

#2. The authors object to my pointing out when they claim positive effects when their own results show those “effects” to not be statistically significant. There is no question that the report includes statements that are written in clear and non-misleading ways. Other statements are more problematic. Just because the authors are accurate in some of their descriptions does not negate my assertion that they make “[c]laims of positive effects when they are not statistically significant.” They tested whether a time trend was significant; it was not. They then go on to say it is a positive trend in the original report, and they do it again in their rebuttal: “We estimate a positive trend but it is not statistically significant.” This sentence is misleading. As the authors themselves claim in the first rebuttal above, “In statistics we test whether we can maintain the hypothesis of no effect.” This is called null hypothesis statistical testing (NHST). In NHST, if we reject the null hypothesis, we can say it was positive/negative, higher/lower, etc. If we fail to reject the null hypothesis (what they misleadingly call “maintain”), we cannot describe it in the direction that was tested because the test told us there isn’t sufficient support to do that. The authors were unable to reject the null hypothesis, but they call it positive anyway. Including the caveat that it is not significant does not somehow lift them above criticism. Or, to put this in the tone and wording of the authors’ reply, they seem “incapable” of understanding this fundamental flaw in their original report and in their rebuttal. There is extensive literature on NHST. I am astonished they are “seemingly unaware” of it.

 

#3. My review pointed out that the report shows a “reliance on simple vote-counts from a selected sample of studies,” and the authors rebut this by claiming my statement “insinuates incorrectly that we did not include certain studies arbitrarily.” In fact, my review listed the different methods used in the report, and it does use vote counting in a section, with selected studies. My review doesn’t state or imply that they were arbitrary, but they were indeed selected.

 

#4. The authors also object to my assertion that the report includes an “unwarranted extrapolation of the available evidence to assert the effectiveness of charter schools.” While my review was clear in stating that the authors were cautious in stating limitations, I also pointed to specific places and evidence showing unwarranted extrapolation. The reply does not rebut the evidence I provided for my assertion of extrapolation.

 

#5. My review points out that the report “… finds charters are serving students well, particularly in math. This conclusion is overstated; the actual results are not positive in reading and are not significant in high school math; for elementary and middle school math, effect sizes are very small…” The authors contend that their overall presentation of results is not misleading and that I was wrong (in fact, that I “cherry picked” results and “crossed the line between a dispassionate scientific analysis and an impassioned opinion piece”) by pointing out where the authors’ presentation suggested pro-charter results where they were unwarranted. Once again, just because the authors are accurate in some of their descriptions does not negate my assertion that the authors’ conclusions are overstated. I provided examples to support my statement that appear to get lost in the authors’ conclusions. They do not rebut my examples, but instead call it “cherry picking.” I find it telling that the authors can repeatedly characterize their uneven results as showing that charters “are serving students well,” but if I point to problems with that characterization, it is somehow I, not they, who have “crossed the line between a dispassionate scientific analysis and an impassioned opinion piece.”

 

#6. I state in my review that the report includes “lottery-based studies, considering them akin to random assignment, but lotteries only exist in charter schools that are much more popular than the comparison public schools from which students are drawn. This limits the study’s usefulness in broad comparisons of all charters versus public schools.” The rebuttal states, “lottery-based studies are not ‘akin’ to random assignment. They are random assignment studies.” The authors are factually wrong. Lottery-based charter assignments are not random assignment in the sense of, e.g., random-assignment pharmaceutical studies. I detail why this is so in my review, and I would urge the authors to become familiar with the key reason lottery-based charters are not random assignment: weights are allowed. The authors provided no evidence that the schools in the study did not use weights; thus the distinct possibility exists that various students do not have the same chance of being admitted and are therefore not randomly assigned. The authors claim charter schools with lotteries are not more popular than their public school counterparts. Public schools do not turn away students because seats are filled; the authors’ assertion that charters do not need to be more popular than their public school counterparts is unsubstantiated. Parents choose a given charter school for a reason – oftentimes because the neighborhood school and other charter school options are less attractive. But beyond that, external validity (generalizing these findings to the broader population of charter schools) requires that over-enrolled charters be representative of charters that aren’t over-enrolled. That the authors test for differences does not negate the issues with their erroneous assumptions and flatly incorrect statements about lottery-based studies.

 

#7. The authors took issue with my critique that their statement, “One conclusion that has come into sharper focus since our prior literature review three years ago is that charter schools in most grade spans are outperforming traditional public schools in boosting math achievement” is an overstatement of their findings. In their rebuttal, they list an increase in the number of significant findings (which is not surprising given the larger sample size), and claim effect sizes were larger without considering confidence intervals around the reported effects. In addition to that, the authors take issue with my critique of their use of the word “positive” in terms of their non-significant trend results, which I have already addressed in #2.

 

#8. The authors take issue with my finding that their statement, “…we demonstrated that on average charter schools are serving students well, particularly in math” (p. 36) is an overstatement. I explained why this is an overstatement in detail in my review.

 

#9. The authors argue, “Lopez cites a partial sentence from our conclusion in support of her contention that we overstate the case, and yet it is she who overstates.” The full sentence that I quoted reads, “But there is stronger evidence of outperformance than underperformance, especially in math.” I quoted that full sentence, sans the “[b]ut.” They refer to this as “chopping this sentence in half,” and they attempt to defend this argument by presenting this sentence plus the one preceding it. In either case, they fail to support their contention that they did not overstate their findings. Had the authors just written the preceding sentence (“The overall tenor of our results is that charter schools are in some cases outperforming traditional public schools in terms of students’ reading and math achievement, and in other cases performing similarly or worse”), I would not have raised an issue. To continue with “But there is stronger evidence of outperformance than underperformance, especially in math” is an ideologically grounded overstatement.

 

#10. The authors claim, “Lopez seriously distorts our work by comparing results from one set of analyses with our conclusions from another section, creating an apples and oranges problem.” The section the authors are alluding to reported results of the meta-analysis. I pointed out examples of their consistent exaggeration. The authors address neither the issue I raise nor the support I offer for my assertion that they overstate findings. Instead, they conclusively claim I am “creating an apples and oranges problem.”

 

#11. The authors state, “Lopez claims that most of the results are not significant for subgroups.” They claim I neglected to report that a smaller sample contributed to the non-significance, but they missed the point. The fact that there are “far fewer studies by individual race/ethnicity (for the race/ethnicity models virtually none for studies focused on elementary schools alone, middle schools alone, or high schools) or other subgroups” is a serious limitation. The authors claim that “This in no way contradicts the findings from the much broader literature that pools all students.” However, the reason ethnicity/race is an important omission is because of the evidence of the segregative effects of charter schools. I was clear in my review in identifying my concern: the authors’ repeated contentions about the supposed effectiveness of charter schools, regardless of the caution they maintained in other sections of their report.

 

#12. The authors argue, “The claim by Lopez that most of the effects are insignificant in the subgroup analyses is incomplete in a way that misleads. She fails to mention that we conduct several separate analyses in this section, one for race/ethnicity, one for urban school settings, one for special education and one for English Learners.” Once again, the authors miss the point, as I explain in #11. The authors call my numerous examples that discredit their claims “cherry picking.” The points I raise, however, are made precisely to temper the claims made by the authors. If cherry-picking results in a full basket, perhaps there are too many cherries to be picked.

 

#13. The authors take issue that I temper their bold claims by stating that the effects they found are “modest.” To support their rebuttal, they explain what an effect of .167 translates to in percentiles, which I argued against in my review in detail. (The authors chose to use the middle school number of .167 over the other effect sizes, ranging from .023 to .10; it was the full range of results that I called “modest.”) Given their reuse of percentiles to make a point, it appears the authors may not have a clear understanding of percentiles: they are not interval-level units. An effect of .167 is not large given that it may be negligible when confidence intervals are included. That it translates into a 7 percentile “gain” when percentiles are not interval level units (and confidence bands are not reported) is a continued attempt to mislead by the authors. I detail the issues with the ways the authors present percentiles in my review. (This issue is revisited in #25, below.)

 

#14. The authors next take issue with the fact that I cite different components of their report that were “9 pages apart.” I synthesized the lengthy report (the authors call it “conflating”), and once again, the authors attempt to claim that my point-by-point account of the limitations of their report is misleading. Indeed, according to the authors, I am “incapable of understanding” a “distinction” they make. In their original 68-page report, they make many “distinctions.” They appear “incapable of understanding” that the issue I raise concerning these “distinctions” is that they were recurring themes in their report.

 

#15. The authors next find issue with the following statement: “The authors conclude that ‘charter schools appear to be serving students well, and better in math than in reading’ (p. 47) even though the report finds ‘…that a substantial portion of studies that combine elementary and middle school students do find significantly negative results in both reading and math – 35 percent of reading estimates are significantly negative, and 40 percent of math estimates are significantly negative (p. 47)’.” This is one of the places where I point out that the report overstates conclusions notwithstanding their own clear findings that should give them caution. In their rebuttal, the authors argue that I (in a “badly written paragraph”) “[insinuate] that [they] exaggerate the positive overall math effect while downplaying the percentage of studies that show negative results.” If I understand their argument correctly, they are upset that I connected the two passages with “even though the report finds” instead of their wording: “The caveat here is”. But my point is exactly that the caveat should have reined in the broader conclusion. They attempt to rebut my claim by elaborating on the sentence, yet they fail to address my critique. The authors’ rebuttal includes, “Wouldn’t one think that if our goal had been to overstate the positive effects of charter schools we would never have chosen to list the result that is the least favorable to charter schools in the text above?” I maintain the critique from my review: despite the evidence that is least favorable to charter schools, the authors claim overall positive effects for charter schools—obscuring the various results they reported. Again, just because they are clear sometimes does not mean they do not continuously obscure the very facts they reported.

 

#16. The authors take issue with the fact that my review included two sentences of commentary on a companion CRPE document that was presented by CRPE as a summary of the Betts & Tang report. As is standard with all NEPC publications, I included an endnote that included the full citation of the summary document, clearly showing an author (“Denice, P.”) other than Betts & Tang. Whether Betts & Tang contributed to, approved, or had nothing to do with the summary document, I did not and do not know.

 

#17. The next issue the authors have is that I critiqued their presentation and conclusions based on the small body of literature they included in their section entitled, “Outcomes apart from achievement.” The issue I raise with the extrapolation of findings can be found in detail in the review. The sentence from the CRPE report that seems to be the focus here reads as follows: “This literature is obviously very small, but both papers find evidence that charter school attendance is associated with better noncognitive outcomes.” To make such generalizations based on two papers (neither of which was apparently peer reviewed) is hardly an examination of the evidence that should be disseminated in a report entitled, “A Meta-Analysis of the Literature on the Effect of Charter Schools on Student Achievement.” The point of the meta-analysis document is to bring together and analyze the research base concerning charter schools. The authors claim that because they are explicit in stating that the body of literature is small, their claim is not an overstatement. As I have mentioned before, even when the authors are clear in their caveats, making assertions about the effects of charter schools with such limited evidence is indeed an overstatement. We are now seeing more and more politicians who offer statements like, “I’m not a scientist and haven’t read the research, but climate change is clearly a hoax.” The caveats do little to transform the ultimate assertion into a responsible statement.

 

Go to the link to read the rest of Professor Lopez’s response to Betts and Tang, covering the 26 points they raised.

On September 30, Francesca Lopez of the University of Arizona reviewed a study of charter schools by Julian R. Betts and Y. Emily Tang of the University of California at San Diego.

Betts and Tang here respond to Lopez’s critique of their study of charter school achievement.

The critical study by Lopez was published by the National Education Policy Center and posted on this blog.

Betts and Tang say that Lopez misrepresented their study. They write:

“First, what did we find in our meta-analysis of the charter school effectiveness literature? On average, charter school studies are revealing a positive and statistically significant difference between math achievement at charter schools and traditional public schools. We also find a positive difference for reading achievement, but this difference is not statistically significant. Second, we devote much of our paper to studying not the mean effect, but the variation across studies in the effect of attending a charter school. We find that charter schools’ effectiveness compared to nearby traditional public schools varies substantially across locations.

“What is the central claim of Lopez? She writes: “The report does a solid job describing the methodological limitations of the studies reviewed, then seemingly forgets those limits in the analysis” (p. 1). She uses words like “exaggeration” and “overstated” (p. 8) to characterize our analysis of the literature, and implies that our conclusions are not “reserved,” “responsible,” (p. 7) or “honest” (p. 7 and p. 8).

“Throughout her essay, Lopez falsely projects intentions in our words that simply are not there. We encourage interested readers to review the words that we actually wrote, in their full context, in our abstract, main paper, and our conclusion. We are confident that readers will confirm for themselves that any “overstated” conclusions of which Lopez accuses us are imagined.”

“There are serious problems with Lopez’s arguments. First, she habitually quotes our work in a selective and misleading way. Such rhetorical sleights, in which she quotes one of our sentences while ignoring the highly relevant adjacent sentences, or even cutting important words out of our sentences, overlook important parts of our analysis and result in a highly inaccurate presentation of our work. Second, her analysis contains six technical errors. These technical mistakes, some quite serious, invalidate many of Professor Lopez’s claims. An appendix to this essay exposes more than two dozen misleading or outright incorrect statements that Lopez makes in a mere 9-page essay. To give readers a sense of the scope and severity of these problems, consider the following examples:

“Example 1: A Partial and Misleading Quotation

“Lopez insinuates that we exaggerate the positive overall math effect while downplaying the percentage of studies that show negative results. She writes:
“The authors conclude that ‘charter schools appear to be serving students well, and better in math than in reading’ (p. 47) even though the report finds ‘…that a substantial portion of studies that combine elementary and middle school students do find significantly negative results in both reading and math—35 percent of reading estimates are significantly negative, and 40 percent of math estimates are significantly negative (p. 47)’”

“Here is what we actually wrote on page 47: “Examining all of these results as separate parts of a whole, we conclude that, overall, charter schools appear to be serving students well, and better in math than in reading. The caveat here is that a substantial portion of studies that combine elementary and middle school students do find significantly negative results in both reading and math—35 percent of reading estimates are significantly negative, and 40 percent of math estimates are significantly negative.”

“Lopez uses two rhetorical devices to lead readers to the perception that we overstated findings. First, she separates the two quotations, implying that we are somehow hiding the second result, when in fact we intentionally mention the positive overall mean math effect and the variation in the results across studies side by side. Second, she further misleads the reader by again cutting out part of our sentence. Instead of stating that we have a “caveat” to the positive mean math effect she removes that entire clause.

“What makes the approach of Lopez even more misleading is that in the paragraph above, we were bending over backwards to be fair. We cite only one type of study in that quotation: those that combine elementary and middle schools. (These account for about 1/7th of all the studies.) Why did we focus only on those studies in the above quotation? Because these studies were the exception to our conclusion—the ones that produced the highest percentage of studies with negative and significant estimates. Wouldn’t one think that if our goal had been to overstate the positive effects of charter schools we would never have chosen to list the result that is the least favorable to charter schools in the text above? For example, we could have stated that for elementary school studies, only 12% showed negative and significant reading results, compared to 71% showing positive and significant results. Or we could have stated that only 11% of elementary school studies showed negative and significant math results, while 61% showed positive and significant results in math.

“Lopez fails to list any of the more positive results from the other grade span categories studied that led us to our overall conclusion. We noted the exception above precisely because it was an exception. While it is worth noting, it does not refute the other evidence. By citing an exception as a reason to dismiss all of the other results, Lopez misses the main point of a statistical meta-analysis. This is a consistent pattern throughout her essay.”

Betts and Tang make 26 points about the flaws of Lopez’s analysis.

The New York Times has an excellent article by Lizette Alvarez about the growing outrage among parents against the standardized testing of their children. The article focuses on parents in Florida–whose children are being intellectually suffocated by the Jeb Bush model of punitive testing and accountability–but in fact the same complaints are increasingly heard in every state. The idea that children learn more if they are tested more has been the dogma of the ruling politicians of both parties since at least 2001, when huge majorities in Congress passed President George W. Bush’s No Child Left Behind law. Now, along comes President Obama and Secretary Arne Duncan with their Race to the Top program, and the stakes attached to testing go higher still. Now, it is not only students who are subjected to tests that label and rank them, but the jobs of principals and teachers are on the line if test scores do not go up.

 

This is the best article about the current testing mania that I have read in the New York Times. It is heartening that the revolt against the testing madness has attracted national attention in the nation’s most important newspaper. Many broadcast media use the Times as their guide to the important issues of the day.

 

Alvarez begins:

 

ROYAL PALM BEACH, Fla. — Florida embraced the school accountability movement early and enthusiastically, but that was hard to remember at a parent meeting in a high school auditorium here not long ago.

 

Parents railed at a system that they said was overrun by new tests coming from all levels — district, state and federal. Some wept as they described teenagers who take Xanax to cope with test stress, children who refuse to go to school and teachers who retire rather than promote a culture that seems to value testing over learning.

 

“My third grader loves school, but I can’t get her out of the car this year,” Dawn LaBorde, who has three children in Palm Beach County schools, told the gathering, through tears. Her son, a junior, is so shaken, she said, “I have had to take him to his doctor.” She added: “He can’t sleep, but he’s tired. He can’t eat, but he’s hungry.”

 

One father broke down as he said he planned to pull his second grader from school. “Teaching to a test is destroying our society,” he said.

 

Later in the story, she adds:

 

In Florida, which tests students more frequently than most other states, many schools this year will dedicate on average 60 to 80 days out of the 180-day school year to standardized testing. In a few districts, tests were scheduled to be given every day to at least some students.

 

The furor in Florida, which cuts across ideological, party and racial lines, is particularly striking for a state that helped pioneer accountability through former Gov. Jeb Bush. Mr. Bush, a possible presidential contender, was one of the first governors to introduce high-stakes testing and an A-to-F grading system for schools. He continues to advocate test-based accountability through his education foundation. Former President George W. Bush, his brother, introduced similar measures as governor of Texas and, as president, embraced No Child Left Behind, the law that required states to develop tests to measure progress.

 

The concerns reach well beyond first-year jitters over Florida’s version of Common Core, which is making standards tougher and tests harder. Frustrations also center on the increase this year in the number of tests ordered by the state to fulfill federal grant obligations on teacher evaluations and by districts to keep pace with the new standards. The state mandate that students use computers for standardized tests has made the situation worse because computers are scarce and easily crash.
“This is a spinning-plates act like the old ‘Ed Sullivan Show,’ ” said David Samore, the longtime principal at Okeeheelee Community Middle School in Palm Beach County. “What you are seeing now are the plates are starting to fall. Principals, superintendents, kids and teachers can only do so much. They never get to put any plates down.”

 

Imagine that: Many schools will dedicate 60-80 days this year to standardized testing! This is a bonanza for the testing industry, and a bonanza for the tech industry, which gets to sell so many millions of computers and tablets for test-taking, but it is a disaster for students. Think of it: students are losing 33-44% of the school year to testing. This is time that should be spent on instruction, on reading, on creating projects, on debating ideas, on physical exercise, on singing, dancing, painting, and drawing.

 

The testing madness is out of control. Parents know it. Teachers know it. Principals know it. Superintendents know it. The only ones who don’t know it are sitting in the Governor’s mansion and in the State Legislature, in the U.S. Department of Education, the White House, and Congress. If they had to spend 33-44% of their time taking standardized tests to measure their effectiveness, they would join with the angry parents of Florida and say “enough is enough.”

 

Yesterday, in response to a reader in Ohio, I posted an “Ohio Alert,” warning that the State Board of Education in Ohio would soon consider eliminating teachers of art, music, and physical education, librarians, social workers, and nurses in elementary schools.

 

Several commenters on the blog have disputed the claim and said it was not true.

 

This article seems to offer a definitive explanation.

 

It is NOT TRUE that the vote will be taken this week. The state board will vote on this question in December. Forgive my error!

 

What will the vote be about?

 

Patrick O’Donnell of the Cleveland Plain Dealer writes:

 

The state board will vote in December, not this week as some have claimed, on whether to eliminate requirements that local districts have a certain number of elementary art, music or physical education teachers, school counselors, library media specialists, school nurses, social workers and “visiting teachers.”

 

Administrative code requires districts to have at least five of these eight positions per 1,000 students in what some call the “5 of 8” rule. The state board is considering wiping out that rule and allowing districts to make staffing decisions on their own.

 

Tom Gunlock, the board’s vice chairman, said this morning that the proposed change isn’t to eliminate those positions, as some are charging, but to let districts make their own choices.

 

“I’m sure they’ll do what’s right for their kids,” Gunlock said.

 

He added: “For years, people have been telling me about all these unfunded mandates and that we’re telling them what to do. They keep telling me they know more about what their kids need than we do, and I agree with them.”

 

Susan Yutzey, president of the Ohio Educational Library Media Association, is urging her members to oppose the change. In a presentation on the change posted online, Yutzey said she and other organizations are “concerned that local boards and administrators will see this as an opportunity to eliminate art, music, physical education, school counselors, library media specialists, school nurses and social workers.”

 

So, if we read Mr. Gunlock’s view correctly, the state will consider dropping its requirement that districts fill five of these eight positions. Requiring that all elementary schools have teachers of the arts, nurses, librarians, physical education teachers, and social workers is “an unfunded mandate.” Schools facing budget cuts could get rid of the school nurse or the teachers of the arts or teachers of physical education or social workers or librarians. The choice would be theirs, as the state code would no longer require districts to fill at least five of these eight positions per 1,000 students.

 

If I were a parent of an elementary school age child in Ohio, I would be very alarmed that the state board is making these positions optional. For warning that the state board is even considering such a nonsensical “mandate relief,” I offer no apology.

Peter Greene knows that breaking up is hard to do. But it is happening. The people who love charters also were promoting Common Core. They had a common goal: make public schools look bad, then watch the stampede to privately-managed charters.

 

What is it about Common Core that has made it toxic? The more teachers use it, the more the polls show they don’t like it. Rhetoric to the contrary, CCSS does tell teachers how to teach, based on the likes and dislikes of the authors, few of whom were ever classroom teachers. Rhetoric to the contrary, the early grades set absurd expectations that some children will meet easily and others won’t reach for a year or two. No one on the writing team had ever taught little kids, and they seemed to have no idea that children develop at different rates. No one had any experience teaching students with disabilities, most of whom will look bad on Common Core tests. Greene points to the number of governors, like Malloy and Cuomo, who have disowned the Common Core, but I think it is better to wait and see what happens now that the election is over.

 

Greene writes:

 

The Ed Reform movement has always been a marriage of different groups whose interests and goals sometimes aligned, and sometimes did not. The Systems Guys, the Data Overlords, the Common Core Corporate Hustlers, the Charter Privateers, the Social Engineers– they agree on some things (we need to replace variable costly teachers with low-cost uniform widgets), but there are cracks in the alliance, and one seems to be turning into a fissure.

 

The Common Core Hustlers are being dumped by the Charter Privateers. It’s not an obvious break-up– the privateers haven’t texted the Core backers to say, “Hey, we need to talk.” It’s the slow, soft drop. The unreturned phone calls. The unwillingness to even say the name. Not even making eye contact when they show up at the same party. It’s awkward. It’s painful.

 

It wasn’t always like this. Charters and the Core were a match made in heaven. To spur financing and enrollment, the Charter forces needed a way to “prove” that public schools suck, and that meant finding a yardstick with which public schools could be measured and found failing. That meant some sort of standardized test, and that meant something to test them on. So, Common Core. The Core and the Tests (from which it could not, must not, be separated) would be the smoking gun, the proof that public schools were failing and that only privatizing schools would save Our Nation’s Youth.

 

The corporate folks liked it because it was another opportunity for market growth. The fake liberals liked it because it could be packaged as a way to bring equity to the poor. The fake conservatives liked it because it could be packaged as a way to use market forces to get those slacker poor folks into line. The Core and Charter really got each other. They wanted all the same things.

 

But soon, the love affair between charters and the Core started to show strain. The Core would show up late at night, smelling like Big Government. And while everybody’s friends liked the Core when it first started coming around, as they got to know it, they started whispering behind its back that it was kind of an asshole. Pretty soon, old friends like Bobby Jindal were calling the Core out in public. And when election season came, they weren’t invited to the same parties together any more. Jeb Bush had been the Core’s oldest and best friend, and even he had a huge party where Charters were held up for praise and applause and the Core wasn’t even mentioned.

 

There was no longer any denying it. When Charter walked into the cafeteria, instead of sitting down with the Core and telling friends, “You should come sit with the Core. It’s cool,” Charter would sit on the other side of the room and say, “You don’t want to sit at that table with that thing.”

 

Once the Core had been a marketing point. Public schools were bad news because they couldn’t do Common Core well enough. Now public schools are bad news because they are trying to do Common Core well enough. We used to market charters as a way to run toward the Core; now we market them as a way to run away from it.

 
None of the reformsters who now disown Common Core are dropping any other part of the reformster agenda, especially not privatization.

 

And you can bet they are not dropping high-stakes testing either, unless the public revolt gets loud enough for legislators to hear it.

 

 

Last year, the school board of Lansing, Michigan, voted to eliminate music, art, and physical education from its elementary schools. It was a budget-cutting measure. Where were the “reformers”? Silence. Do you remember the strong statement from Secretary Duncan? Neither do I.

 

After the teachers of the arts and physical education were laid off, the job of teaching those subjects was assigned to the regular classroom teachers. No specialists, no art teachers, no music teachers, no gym teachers.

 

Remember that old idea about equality of educational opportunity? This isn’t it.

 

This is one of the best quotes I have seen in a very long time:

 

“One of the saddest lessons of history is this: If we’ve been bamboozled long enough, we tend to reject any evidence of the bamboozle. We’re no longer interested in finding out the truth. The bamboozle has captured us. It’s simply too painful to acknowledge, even to ourselves, that we’ve been taken. Once you give a charlatan power over you, you almost never get it back.”

 

 
― Carl Sagan, The Demon-Haunted World: Science as a Candle in the Dark

Andrea Gabor, professor of journalism at Baruch College in New York City, recently interviewed Stuart Magruder, an architect in Los Angeles who serves on the Bond Oversight Committee of the school district. He was unusually outspoken in his criticism of John Deasy’s deal to spend construction bond money on iPads for all. For his criticism, he was briefly ousted from his unpaid position, then restored after a public outcry. He is critical of both Deasy and the teachers’ union, finding them both intransigent.

Gabor, an expert on the work of W. Edwards Deming, observed:

“As Magruder spoke of Deasy’s defeat and the union’s intransigence, I was struck by an irony: My principal purpose in traveling to Los Angeles was to attend the annual conference of the Deming Institute, which was founded in order to continue the work of W. Edwards Deming, the management guru whose ideas about systems thinking and collaborative improvement–informed by statistical theory–helped turn around struggling American industries in the 1980s.

“The unraveling in Los Angeles is just the latest example of education reformers who have yet to absorb the most valuable management lessons of the last half century–achieving lasting institutional change and improvement involves teamwork, collaboration among all the constituencies in an organization, and systems thinking. None of which have been on display in Los Angeles.”