We have had an interesting conversation on the blog about the value of AP courses. It was tied to Jay Mathews’ use of AP courses to rank the quality of high schools: the more AP courses, the better the high school.

I have made two points clear: One, when I was in high school in the 1950s, there were no AP courses, and my children graduated high school without ever taking one. Thus, I have no personal experience with AP. Two, I strongly object to the College Board marketing AP courses on the spurious grounds that they promote equity and civil rights. The College Board is making millions by doing so. It should be as honest as those selling cars, beer, and cigarettes.

Our friend and reader “Democracy” posted this comment:

“It sure is interesting that the pro-AP commenters on this thread do not – and cannot – cite any solid evidence that Advanced Placement is any more than hype. To be sure – there are good AP teachers. Also to be sure – as a program – AP just is NOT what the proponents claim. Far from it.

Much of the AP hype that exists in the U.S. can be traced to Jay Mathews, who slobbers unabashedly over the “merits” of AP. Mathews not only misrepresents the research on AP but also publishes the yearly Challenge Index, which purportedly lists the “best” high schools in America based solely on how many AP tests they give.

Jay Mathews writes that one of the reasons his high school rankings “have drawn such attention” is that “only half of students who go to college get to take college-level courses in high school.” What he does NOT say is that another main reason his rankings draw scrutiny is that they are phony; they are without merit. Sadly, far too many parents, educators, and School Board members have bought into the “challenge” index that Mathews sells.

The Challenge Index is – and always has been – a phony list that doesn’t do much except to laud AP courses and tests. The Index is based on Jay Mathews’ equally dubious assumption that AP is inherently “better” than other high school classes in which students are encouraged and taught to think critically.

As more students take AP (and many more are doing so, having been told that it offers “rigor” and is college-level), more are failing the tests. In 2010, for example, 43 percent of AP test scores were a 1 or a 2. The Kool-Aid drinkers argue that “even students who score poorly in A.P. were better off.” Mathews says this too. But it’s flat-out wrong.

The basis for their claim is a College Board-funded study in Texas. But a more robust study (Dougherty & Mellor, 2010) of AP course and test-takers found that “students – particularly low-income students and students of color – who failed an AP exam were no more likely to graduate from college than were students who did not take an AP exam.” Other studies that have tried to tease out the effects of AP while controlling for demographic variables find that “the impact of the AP program on various measures of college success was found to be negligible.”

More colleges and universities are either refusing to accept AP test scores for credit, or they are limiting credit to a score of 5 on an AP test. The reason is that they find most students awarded AP credit are simply not well prepared.

Former Stanford School of Education Dean Deborah Stipek wrote in 2002 that AP courses were nothing more than “test preparation courses,” and they too often “contradict everything we know about engaging instruction.” The National Research Council, in a study of math and science AP courses and tests, agreed, writing that “existing programs for advanced study [AP] are frequently inconsistent with the results of the research on cognition and learning.” And a four-year study at the University of California found that while AP is increasingly an “admissions criterion,” there is no evidence that the number of AP courses taken in high school has any relationship to performance in college.

In The ToolBox Revisited (2006), Clifford Adelman scolded those who had misrepresented his original ToolBox research by citing the importance of AP “in explaining bachelor’s degree completion.” Adelman said, “To put it gently, this is a misreading.” Moreover, in statistically analyzing the factors contributing to the earning of a bachelor’s degree, Adelman found that Advanced Placement did not reach the “threshold level of significance.”

College Board executives often say that if high schools implement AP courses and encourage more students to take them, then (1) more students will be motivated to go to college and (2) high school graduation rates will increase. Researchers Kristin Klopfenstein and Kathleen Thomas “conclude that there is no evidence to back up these claims.”

In fact, the unintended consequences of pushing more AP may lead to just the reverse. As a 2010 book on AP points out, “research…suggests that many of the efforts to push the program into more schools — a push that has been financed with many millions in state and federal funds — may be paying for poorly-prepared students to fail courses they shouldn’t be taking in the first place…not only is money being misspent, but the push may be skewing the decisions of low-income high schools that make adjustments to bring the program in — while being unable to afford improvements in other programs.”

Do some students “benefit” from taking AP courses and tests? Sure. But the students who benefit most are those already poised to succeed: “students who are well-prepared to do college work and come from the socioeconomic groups that do the best in college are going to do well in college.”

So, why do students take AP? Because they’ve been told to. Because they’re “trying to look good” to colleges in the “increasingly high-stakes college admission process,” and because, increasingly, “high schools give extra weight to AP courses when calculating grade-point averages, so it can boost a student’s class rank.” It’s become a rather depraved, stupid circle.

One student who got caught up in the AP hype cycle (taking 3 AP courses as a junior and 5 as a senior) and received credit for only one AP course in college reflected on his AP experience. He said nothing about “rigor” or “trying to be educated” or the quality of instruction, but remarked: “if i didn’t take AP classes, it’s likely I wouldn’t have gotten accepted into the college I’m attending next year…If your high school offers them, you pretty much need to take them if you want to get into a competitive school. Or else, the admissions board will be concerned that you didn’t take on a ‘rigorous course load.’ AP is a scam to get money, but there’s no way around it. In my opinion, high schools should get rid of them…”

Jay Mathews calls AP tests “incorruptible.” But what do students actually learn from taking these “rigorous” AP tests?

For many, not much. One student remarked, after taking the World History AP test, “dear jesus… I had hoped to never see “DBQ” ever again, after AP world history… so much hate… so much hate.” And another added, “I was pretty fond of the DBQ’s, actually, because you didn’t really have to know anything about the subject, you could just make it all up after reading the documents.” Another AP student related how the “high achievers” in his school approached AP tests:

“The majority of high-achieving kids in my buddies’ and my AP classes couldn’t have given less of a crap. They showed up for most of the classes, sure, and they did their best to keep up with the grades because they didn’t want their GPAs to drop, but when it came time to take the tests, they drew pictures on the AP Calc, answered just ‘C’ on the AP World History, and would finish sections of the AP Chem in, like, 5 minutes. I had one buddy who took an hour-and-a-half bathroom break during World History. The cops were almost called. They thought he was missing.”

An AP reader (grader), one of those “experts” cited by Mathews, notes this: “I read AP exams in the past. Most memorable was an exam book with $5 taped to the page inside and the essay just said ‘please, have mercy.’ But I also got an angry breakup letter, a drawing of some astronauts, all kinds of random stuff. I can’t really remember it all… I read so many essays in such compressed time periods that it all blurs together when I try to remember.”

Dartmouth no longer gives credit for AP test scores. It found that 90 percent of those who scored a 5 on the AP psychology test failed a Dartmouth Intro to Psych exam. A 2006 MIT faculty report noted “there is ‘a growing body of research’ that students who earn top AP scores and place out of institute introductory courses end up having ‘difficulty’ when taking the next course.” Mathews called this an isolated study. But two years prior, Harvard “conducted a study that found students who are allowed to skip introductory courses because they have passed a supposedly equivalent AP course do worse in subsequent courses than students who took the introductory courses at Harvard” (Seebach, 2004).

When Dartmouth announced its new AP policy, Mathews ranted and whined that “The Dartmouth College faculty, without considering any research, has voted to deny college credit for AP.” Yet it is Jay who continually ignores and diminishes research that shows that Advanced Placement is not what it is hyped up to be.

In his rant, Mathews again linked to a 2009 column of his extolling the virtues of the book “Do What Works” by Tom Luce and Lee Thompson. In “Do What Works,” Luce and Thompson accepted at face value the inaccuracies spewed in “A Nation At Risk” (the Sandia Report undermined virtually everything in it). They wrote that “accountability” systems should be based on rewards and punishments, and that such systems provide a “promising framework, and federal legislation [NCLB] promotes this approach.” Luce and Thompson called NCLB’s 100 percent proficiency requirement “bold and valuable” and “laudable” and “significant” and “clearly in sight.” Most knowledgeable people called it stupid and impossible.

Luce and Thompson wrote that “data clearly points to an effective means” to increase AP participation: “provide monetary rewards for students, teachers, and principals.” This flies in the face of almost all contemporary research on motivation and learning.

As I’ve noted before, College Board-funded research is more than simply suspect. The College Board continues to perpetrate the fraud that the SAT actually measures something important other than family income. It doesn’t. Shoe size would work just as well.

[For an enlightening read on the SAT, see: http://www.theatlantic.com/magazine/archive/2005/11/the-best-class-money-can-buy/4307/]

The College Board produced a “study” purporting to show that PSAT scores predicted AP test scores. A seemingly innocuous statement, however, undermined its validity. The authors noted that “the students included in this study are of somewhat higher ability than…test-takers” in the population to which they generalized. That “somewhat higher ability” actually meant students in the sample were a full standard deviation above those 9th and 10th graders who took the PSAT. Even then, the basic conclusion of the “study” was that students who scored well on the PSAT had about a 50-50 chance of getting a “3,” the equivalent of a C-, on an AP test.

A new (2013) study from Stanford notes that “increasingly, universities seem to be moving away from awarding credit for AP courses.” The study pointed out that “the impact of the AP program on various measures of college success was found to be negligible.” And it adds this: “definitive claims about the AP program and its impact on students and schools are difficult to substantiate.” But you wouldn’t know that by reading Jay Mathews or listening to the College Board, which derives more than half of its income from AP.

What the College Board doesn’t like to admit is that it sells “hundreds of thousands of student profiles to schools; they also offer software and consulting services that can be used to set crude wealth and test-score cutoffs, to target or eliminate students before they apply…That students are rejected on the basis of income is one of the most closely held secrets in admissions.” Clearly, College Board-produced AP courses and tests are not an “incorruptible standard.” Far from it.

The College Board routinely coughs up “research studies” to show that its test products are valid and reliable. The problem is that independent, peer-reviewed research doesn’t back them up. The SAT and PSAT are shams. Colleges often use PSAT scores as a basis for sending solicitation letters to prospective students. However, as a former admissions officer noted, “The overwhelming majority of students receiving these mailings will not be admitted in the end.” But the College Board rakes in cash from the tests, and colleges keep all that application money.

Some say – and it sure does look that way – that the College Board, in essence, has turned the admissions process “into a profit-making opportunity.”

Mathews complains about colleges that no longer award AP credit. He says (wink), “Why drop credit for all AP subjects without any research?” Yet again and again he discounts all the research.

Let’s do a quick research review.

A 2002 National Research Council study of AP courses and tests was an intensive two-year, 563-page content analysis. The main study committee was composed of 20 members who are not only experts in their fields but also top-notch researchers. Most also write on effective teaching and learning. Even more experts served on content panels for each discipline (biology, chemistry, physics, math), along with NRC staff. Mathews didn’t like the fact that the researchers concluded that AP courses and tests were a “mile wide and an inch deep” and that they did not comport with well-established, research-based principles of learning. He dismissed that study as the cranky “opinion of a few college professors.”

The main finding of a 2004 Geiser and Santelices study was that “the best predictor of both first- and second-year college grades” is unweighted high school grade point average, and a high school grade point average “weighted with a full bonus point for AP…is invariably the worst predictor of college performance.” And yet – as commenters noted here – high schools add on the bonus. The state of Virginia requires it.

Klopfenstein and Thomas (2005) found that AP students are “…generally no more likely than non-AP students to return to school for a second year or to have higher first semester grades.” Moreover, they write that “close inspection of the [College Board] studies cited reveals that the existing evidence regarding the benefits of AP experience is questionable,” and “AP courses are not a necessary component of a rigorous curriculum.”
In other words, there’s no need for the AP imprimatur to have thoughtful, inquiry-oriented learning.

Philip Sadler said in 2009 that his research found “students who took and passed an A.P. science exam did about one-third of a letter grade better than their classmates with similar backgrounds who did not take an A.P. course.” Sadler also wrote in the 2010 book “AP: A Critical Examination” that “Students see AP courses on their transcripts as the ticket ensuring entry into the college of their choice,” yet, “there is a shortage of evidence about the efficacy, cost, and value of these programs.” Sadly, AP was written into No Child Left Behind and Race to the Top, and it is very much a mainstay of corporate-style education “reform,” touted by the likes of ExxonMobil and the US Chamber of Commerce.

For years, Mathews misrepresented Clifford Adelman’s 1999 ToolBox. As Klopfenstein and Thomas wrote in 2005, “it is inappropriate to extrapolate about the effectiveness of the AP Program based on Adelman’s work alone.” In the 2006 ToolBox Revisited, Adelman issued his own rebuke:

“With the exception of Klopfenstein and Thomas (2005), a spate of recent reports and commentaries on the Advanced Placement program claim that the original ToolBox demonstrated the unique power of AP course work in explaining bachelor’s degree completion. To put it gently, this is a misreading.”

The book “AP: A Critical Examination” (2010) lays out the research that makes clear AP has become “the juggernaut of American high school education,” but “the research evidence on its value is minimal.” It is the academic equivalent of DARE (Drug Abuse Resistance Education). DARE cranks out “research” that shows its “effectiveness,” yet those studies fail to withstand independent scrutiny. DARE operates in more than 80 percent of U.S. school districts, and it has received hundreds of millions of dollars in subsidies. However, the General Accounting Office found in 2003 that “the six long-term evaluations of the DARE elementary school curriculum that we reviewed found no significant differences in illicit drug use between students who received DARE in the fifth or sixth grade (the intervention group) and students who did not (the control group).”

AP may work well for some students, especially those who are already “college-bound to begin with” (Klopfenstein and Thomas, 2010). As Geiser (2007) notes, “systematic differences in student motivation, academic preparation, family background and high-school quality account for much of the observed difference in college outcomes between AP and non-AP students.” College Board-funded studies do not control well for these student characteristics (even the College Board concedes that “interest and motivation” are keys to “success in any course”). Klopfenstein and Thomas (2010) find that when these demographic characteristics are controlled for, the claims made for AP disappear.

I’m left wondering about this wonder school where “several hundred freshmen” take “both AP World and AP Psychology” and where, by graduation, students routinely “knock out the first 30 or more college credits.”

Where is this school, and what is its name? Jay Mathews will surely be interested.”