High school rankings by popular media usually take into account how many students take AP exams. Some high schools push students to take AP courses whether or not they are prepared, just to satisfy the rankings. But are AP courses an appropriate measure of high quality?
A reader responded to an earlier post about the Tucson BASIS charter schools by questioning the value of AP courses and tests:
“Here is the essence of what Tim Steller wrote about BASIS-Tucson: ‘the Basis schools require students to take eight AP courses before graduation, take six AP tests and pass at least one…That naturally helps Basis place high in the U.S. News rankings.’ And it is ALL about the rankings, and about the College Board’s Advanced Placement program (which Diane neglected to mention).
Steller adds this important point in his article about BASIS, made by an education consultant: “AP has pulled the wool over people’s eyes across the nation…”
Actually, it’s the College Board that has “pulled the wool over people’s eyes.” About AP, to be sure. But also about the SAT and PSAT, and Accuplacer, the placement test used by more than 60 percent of community colleges. They’re all mostly worthless, more hype than reality.
Consider the Advanced Placement program, pushed shamelessly by the College Board, and by Jay Mathews at The Washington Post (Mathews started the Challenge Index, a ranking of high schools based on the number of AP tests they give).
A 2002 National Research Council study of AP courses and tests found them to be a “mile wide and an inch deep” and inconsistent with research-based principles of learning.
A 2004 study by Geiser and Santelices found that “the best predictor of both first- and second-year college grades” is unweighted high school grade point average, and a high school grade point average “weighted with a full bonus point for AP…is invariably the worst predictor of college performance.”
A 2005 study (Klopfenstein and Thomas) found AP students “…generally no more likely than non-AP students to return to school for a second year or to have higher first semester grades.” Moreover, the authors wrote that “close inspection of the [College Board] studies cited reveals that the existing evidence regarding the benefits of AP experience is questionable,” and “AP courses are not a necessary component of a rigorous curriculum.”
A 2006 MIT faculty report noted that “there is ‘a growing body of research’ that students who earn top AP scores and place out of Institute introductory courses end up having ‘difficulty’ when taking the next course.”
Two years prior, Harvard “conducted a study that found students who are allowed to skip introductory courses because they have passed a supposedly equivalent AP course do worse in subsequent courses than students who took the introductory courses at Harvard” (Seebach, 2004).
Dartmouth found that high scores on AP psychology tests do NOT translate into college readiness for the next-level course. Indeed, students admit that “You’re not trying to get educated; you’re trying to look good,” and “The focus is on the test and not necessarily on the fundamental knowledge of the material.”
Students know that AP is far more about gaming the college acceptance process than it is about learning.
In The ToolBox Revisited (2006), Adelman wrote about those who had misstated his original ToolBox (1999) work: “With the exception of Klopfenstein and Thomas (2005), a spate of recent reports and commentaries on the Advanced Placement program claim that the original ToolBox demonstrated the unique power of AP course work in explaining bachelor’s degree completion. To put it gently, this is a misreading.”
Adelman goes on to say that “Advanced Placement has almost no bearing on entering postsecondary education,” and when examining and statistically quantifying the factors that relate to bachelor’s degree completion, Advanced Placement does NOT “reach the threshold level of significance.”
The 2010 book “AP: A Critical Examination” noted that “Students see AP courses on their transcripts as the ticket ensuring entry into the college of their choice,” yet “there is a shortage of evidence about the efficacy, cost, and value of these programs.” And this: AP has become “the juggernaut of American high school education,” but “the research evidence on its value is minimal.”
As Geiser (2007) notes, “systematic differences in student motivation, academic preparation, family background and high-school quality account for much of the observed difference in college outcomes between AP and non-AP students.” College Board-funded studies do not control well for these student characteristics (even the College Board concedes that “interest and motivation” are keys to “success in any course”).
Klopfenstein and Thomas (2010) find that when these demographic characteristics are controlled for, the claims made for AP disappear.
Yet the myths, especially about AP, the SAT, and the PSAT, endure.
Meanwhile, the College Board is promoting the Common Core and says it has “aligned” (cough, wink) its products with it. And people believe it. Stopping corporate-style “reform” and the Common Core is easier said than done. Parents, students and educators are going to have to remove the wool from over their eyes. And that means abandoning blind belief in the College Board and the products it peddles.”