Archives for category: Alaska

The Walton Family Foundation, the second-largest funder of privately run charter schools (the largest is the U.S. Department of Education, which dispenses $400 million a year to charters), wanted to create positive press about charter schools in Alaska. So it commissioned a study by two charter advocates, who produced the positive results Walton wanted.

Beth Zirbes teaches math and statistics in a high school in Fairbanks, Alaska. With her friend Mike Bronson, she reviewed the data in the state records and reached a different conclusion: charter schools are no better than neighborhood public schools, even though the charter students are more advantaged. Their article, with a link to their study, was published by the Anchorage Daily News.

What’s impressive about this study is that a high school math teacher bested a Harvard professor of political science. It just goes to show: Don’t be overly impressed by the author’s academic credentials. And, never believe any charter or voucher research funded by foundations that fund charters and vouchers.

Would you believe a study claiming that cigarettes do not cause cancer if the study was funded by Philip Morris or some other tobacco vendor?

Zirbes and Bronson wrote:

The governor has claimed in several newspaper pieces that Alaska charter schools are more effective than neighborhood schools, and that the charters should be modeled more widely. He’d seen reports by Paul E. Peterson and M. Danish Shakeel, sponsored by the Walton Family Foundation, showing that Alaska charter schools held top rank academically among other states on a federal test.

We value the good performance of many charter school students, but we were skeptical that charter schools were necessarily more effective at lifting students up. So we looked at state data to find out how much of the charter schools’ better scores might be attributed to the schools themselves versus what the students bring to the schools. Read our full report here.

The state’s data showed the governor’s takeaway was incorrect. He was wrong that Peterson’s study showed the superior effectiveness of Alaskan charter schools over neighborhood schools. First, Peterson’s study did not even look at neighborhood schools. Second, after we accounted for numbers of students poor enough to be eligible for reduced-price or free lunches, we found that charter schools and neighborhood schools did not statistically differ in their English language proficiency scores. Instead, the percentage of proficient students in both charter and neighborhood schools was closely related to family income.

Alaska charter schools, on average, are distinguished by high proportions of white students, higher family income and fewer English language learners. Alaska charter school student bodies, in general, don't even resemble those of Lower 48 charter schools, let alone Alaskan neighborhood schools. Unfortunately, Alaskan charter students do resemble students in other Alaskan public schools in that a majority of them score below the state standards in reading and math.

The graph shows that, among third- through ninth-grade students who took the state's 2019 PEAKS assessment of English language arts, the percentage scoring proficient or better declines as a school's percentage of students poor enough to be eligible for free or reduced-price lunch — in other words, economically disadvantaged students — increases. Each point represents a public school in an Alaska school district that has charter schools. Neighborhood schools are defined as non-charter, brick-and-mortar schools, including alternative and lottery schools managed by a school district. Data are from the Department of Education and Early Development.

Researcher Beth Zirbes, a teacher of advanced mathematics, used her skills to dissect a charter school study produced at Harvard. The study was reported by Paul Peterson in The Journal of School Choice; Peterson, like the Journal, is an outspoken advocate for charters and vouchers. The study claimed that charter schools outperform public schools, and that the charter schools in Alaska were best among all states.

The governor of Alaska cited the study as a reason to increase the number of charter schools.

Zirbes doubted that this was true and decided to do her own analysis. What she found, amazingly, was that the Harvard study ignored vital demographic factors.

She wrote:

When I first saw the results of the Harvard study concerning charter schools I was simultaneously unsurprised and skeptical. I was unsurprised as I have seen many very bright young students in my AP classes come from charter schools. I was skeptical as I suspected much of this performance could be attributed to the type of student who attends Alaska’s charter schools. As a comparative analysis of Alaska’s charter schools and neighborhood schools had not been done, I set out to do one myself.

To determine whether charter schools outperform neighborhood schools I looked at the performance of all schools on the 2018-2019 PEAKS ELA (English Language Arts) assessment from the Alaska Department of Education and Early Development's (DEED) report card to the public, as this year was within the same time frame as the data from the Harvard study. The performance of each school is given under the "2018-2019 Performance Evaluation for Alaska's Schools (PEAKS)" tab. I used the data on this page for every school in the state, which allowed me to analyze test scores and demographic characteristics such as the proportion of students who are economically disadvantaged, English language learners (ELL), and special education (SPED). Demographic characteristics are only given for the set of test-takers, and thus all summaries and analyses are for students in grades 3-9 during the 2018-2019 school year. I also removed all correspondence schools from my dataset, as these students were not included in the Harvard study, do not take NAEP tests, and have very low participation rates on state tests. For my analysis of performance, I also restricted my dataset to include schools only in districts where charters are an option, to ensure that the student populations were as similar as possible.

At first glance, it appears that charter schools are more successful than neighborhood schools. At charter schools 52.5% (1,866 out of 3,554) of students were proficient on the ELA assessment versus 40.1% (18,655 out of 46,574) of students at neighborhood schools. However, these differences could be explained by the differences in demographics of the student bodies at these schools. To rule this out as a potential issue, statisticians control for these variables in their mathematical models. We can then ask, do the charter schools outperform neighborhood schools that have similar characteristics? Or do charter schools do any better than we would expect, given their student populations? In short, the answer is no, they do not. When I fit a model which controlled for socioeconomic status alone, the type of school (charter versus neighborhood) was not significant. (For anyone who knows statistics, the p-value associated with type of school was 0.57. It wasn’t even close.) In summary, there is no evidence that charter schools outperform neighborhood schools in terms of ELA proficiency once we consider their socioeconomic make-up.
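The kind of check Zirbes describes — regressing school-level proficiency on socioeconomic status plus a charter indicator and asking whether the indicator is significant — can be illustrated in miniature. This is a hedged sketch, not her actual code or data: the synthetic schools, coefficients, and the normal approximation for the p-value are all invented for illustration, constructed so that (as in her finding) proficiency depends on the share of disadvantaged students while "charter" has no true effect.

```python
import math
import random

random.seed(0)

# Synthetic school-level data: proficiency depends on the share of
# economically disadvantaged students; the charter flag has no true effect.
# Charters are also given lower disadvantage rates, mirroring the real pattern.
schools = []
for _ in range(200):
    charter = random.random() < 0.1
    disadv = random.uniform(0.05, 0.45) if charter else random.uniform(0.10, 0.90)
    prof = 0.65 - 0.5 * disadv + random.gauss(0, 0.05)
    schools.append((1.0, disadv, 1.0 if charter else 0.0, prof))  # intercept, x1, x2, y

k = 3  # predictors: intercept, % disadvantaged, charter indicator

# Ordinary least squares via the normal equations (X'X) b = X'y.
XtX = [[sum(r[i] * r[j] for r in schools) for j in range(k)] for i in range(k)]
Xty = [sum(r[i] * r[3] for r in schools) for i in range(k)]

def solve(a, b):
    """Solve a small linear system by Gauss-Jordan elimination with pivoting."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col:
                f = m[r][col] / m[col][col]
                for c in range(col, n + 1):
                    m[r][c] -= f * m[col][c]
    return [m[i][n] / m[i][i] for i in range(n)]

beta = solve(XtX, Xty)

# Standard errors: se_j = sqrt(s2 * (X'X)^-1[j][j]), with the inverse's
# diagonal obtained by solving against unit vectors.
resid = [r[3] - sum(beta[i] * r[i] for i in range(k)) for r in schools]
s2 = sum(e * e for e in resid) / (len(schools) - k)
inv_diag = [solve(XtX, [1.0 if i == j else 0.0 for i in range(k)])[j] for j in range(k)]
se = [math.sqrt(s2 * d) for d in inv_diag]

# Two-sided p-value for the charter coefficient, using a normal
# approximation to the t distribution (reasonable at n = 200).
z = beta[2] / se[2]
p_charter = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"charter coefficient: {beta[2]:+.3f}, p ~ {p_charter:.2f}")
```

Because the simulated charter effect is zero, the fitted charter coefficient hovers near zero with a large p-value, while the disadvantage coefficient is recovered near its true value — the same shape as the "p = 0.57, it wasn't even close" result above.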

During my data exploration, I discovered that charter schools, on average, have very different student bodies than neighborhood schools. Charter schools have far fewer economically disadvantaged students, far fewer ELL students, and are comparable to neighborhood schools in terms of SPED populations. Here is a summary of how these populations differ for all Alaskan students in the relevant grades in all of Alaska's brick-and-mortar schools for the 2018-2019 school year:

  • Neighborhood schools were 52.2% economically disadvantaged (30,780 out of 58,929 students) compared to 31.3% in charter schools (1,219 out of 3,895 students).
  • Neighborhood schools were 15.5% ELL (9,150 out of 58,929 students) compared to 9.3% in charter schools (363 out of 3,895 students).
  • Neighborhood schools were 16.3% SPED (9,162 out of 58,929 students) compared to 13.7% in charter schools (532 out of 3,895 students).

However, these summaries are highly influenced by a few outliers and obscure some large discrepancies, especially in terms of the economically disadvantaged and ELL students.

  • Of the charter schools, 46.4% (13 out of 28) have economically disadvantaged rates below 20%, compared to just 3.5% of neighborhood schools (15 out of 426).
  • Only 10.7% of charter schools (3 out of 28) have ELL percentages above 10%, compared to 36.9% of neighborhood schools (157 out of 426).
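The shares in these bullets follow directly from the raw counts quoted alongside them. As a quick sanity check — a trivial sketch, not part of the original analysis — they can be recomputed:

```python
# Recompute the percentages quoted above from the raw school counts.
def pct(part, whole):
    """Percentage rounded to one decimal place, as quoted in the text."""
    return round(100 * part / whole, 1)

# Schools with economically disadvantaged rates below 20%:
print(pct(13, 28), pct(15, 426))    # charters vs. neighborhood: 46.4 3.5
# Schools with ELL percentages above 10%:
print(pct(3, 28), pct(157, 426))    # charters vs. neighborhood: 10.7 36.9
```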

Even if we did a comparison of charter schools and neighborhood schools and found that charter schools did better, we still cannot conclude charter schools are causing the performance difference we observe. A comparative study like this is an example of an observational study and because it is impossible to control for all confounding factors, such as parental involvement, we can’t conclude success is caused by the school type. To definitively conclude that charter schools were causing the observed difference in success compared to neighborhood schools we would have to randomly assign some students to go to a charter school and some students to go to neighborhood schools. After some time, we would then compare the results. Obviously, this is impractical as many charter schools do not have busing, require volunteer hours, can remove students for poor attendance, and some do not even have lunch services.

Before the state uses the results of the Harvard study to change the approval process for charter schools, we need to understand whether charters are better and, if so, why. So far, I have not seen convincing evidence that charter schools outperform neighborhood schools when we control for various student characteristics. I have an idea for further study which I believe should be completed before any changes to policy are made. We can compare the performance of students who were admitted to charter schools via the lottery with that of students who applied but did not get in and attended their neighborhood schools instead. The admitted group is likely similar to the group that applied but was not admitted. This would be as close to a randomized experiment as one could hope to have. From this one could determine whether various factors, such as class size and teaching methodology, were causing differences in performance. Additionally, we could use the results of such a study to determine which factors correlate with success and apply those strategies in all our schools. As more than 90% of the students at our in-person schools are in neighborhood schools, such reforms would be more wide-reaching than simply adding a few more charter schools.


This is a heartening story in The Nation about the effective activism of Alaskans, who persuaded Senator Lisa Murkowski to oppose DeVos.

They bombarded her with calls, emails, etc.

The question for Senator Murkowski and Senator Collins, who say they will vote against DeVos on the Senate floor, is why they didn't vote against her in committee. If her nomination had been voted down in committee, it would never have reached the full Senate. The HELP Committee endorsed her by a vote of 12-11. If only one of them had voted no, DeVos would now be history.

But they cannily approved her in committee, then announced they would vote no once their votes were no longer pivotal.

If every Republican votes for DeVos except for these two, the Senate will have a tie, 50-50. Mike Pence will then cast the tiebreaker and DeVos will be confirmed.

DeVos will become the first Cabinet nominee in history to be confirmed by a tie-breaking vote from the Vice President.

I am not ready to offer any awards to Murkowski or Collins. Either one of them could have put an end to her candidacy in committee, and they didn’t. These are not profiles in courage.


Mercedes Schneider tells the wonderful, wacky, mad story of the backhoe that cut through a fiber optic cable and canceled testing in the state of Alaska.

Remember those reliable tests that teachers gave? Pencils always came through. 

Politico reported this morning:

ALASKA HAD DOUBTS BEFORE TESTING SEASON BEGAN: The Alaska Department of Education was concerned about this year’s computer-based statewide student assessments even before last week’s Internet connectivity problems, Interim Commissioner Susan McCauley told Morning Education. “We had very shaky confidence going into this assessment, but from an administrative standpoint, assumed it would be fine,” McCauley said. Alaska had already decided in February that it would begin the search for a new testing vendor for next year and beyond, and this experience made clear their need for an institution that can provide “high-quality, useful data for Alaska parents and educators.” The state canceled testing entirely last week: http://politico.pro/201LR38.

– “While there’s no way we could have predicted this outcome,” McCauley added, “there should have been a plan in place” in the event of a situation like this. “We learned a lot to better prepare us for future assessments.” The University of Kansas’ Center for Education Testing and Evaluation noted that other students using its tests Monday were having no trouble: http://bit.ly/1S5n2Np.

– In addition to Internet woes, the testing system also failed on another front: Students who were disrupted were often taken back to the start of the assessment rather than where they left off. Because of this, the data surrounding test completion rates pose a problem as well. The state reports that just 8.2 percent of students completed the entire ELA portion of the Alaska Measures of Progress assessment, and just 5 percent completed the whole math portion. Only 4 percent of students were able to complete all stages of the Alaska Science Assessment exam. And those could be too high. Brian Laurent, the Department’s data management supervisor, notes that these figures must be taken with “extreme caution,” since there is no way to assess whether the percentages reflect actual completion of the exam or “‘completion’ as interpreted by the testing platform.” Whatever the real figures, they will surely fall below the required 95 percent of students who must be tested according to federal law. [http://politico.pro/1QFIhua]