David Berliner, Regents Professor Emeritus at Arizona State University and one of the nation's most distinguished education researchers, asked me to pass along his advice to the citizens of Philadelphia.
Dear Diane,
A few weeks ago, I was heartened by your column about the return of the public schools to the citizens of Philadelphia. Since then, I’ve been mulling over four things that I wish I could communicate to them. Perhaps you can do so if you think it appropriate. I don’t know the folks there.
First, it will be difficult for teachers to show that they can turn Philadelphia’s schools into higher-achieving institutions. Teachers may help their students become stronger and more engaged learners, but they probably won’t be able to demonstrate student learning in the way that most people understand it, namely, through higher standardized achievement test scores.
The education research community clearly knows what politicians and the media don't fully grasp: teachers simply don't account for much of the variance in standardized test scores. A reasonable estimate is that teachers account for about 10 percent of the variance in standardized achievement test scores. Research also suggests that outside-of-school factors account for about six times more of that variance! We even have a Philadelphia-based study corroborating these estimates.
In a 2014 Educational Researcher article by Fantuzzo, LeBoeuf, and Rouse, 10,000 achievement test scores from Philadelphia were examined. The researchers used two sets of variables as predictors of students' standardized test scores. The first set consisted of school-level demographic variables such as race, gender, and degree of economic disadvantage. These are the kind of variables that the Coleman Report first revealed as strong influences on standardized achievement test scores.
My own research, and that of many others, has repeatedly confirmed this finding. In Fantuzzo et al., these kinds of variables predicted 63 percent of the between-school variance, quite close to the usual estimate of 60 percent of the variance in students' achievement test scores accounted for by demographic variables. Their data, then, are similar to what was found across nations in PISA and PIRLS, and similar to what other researchers find when school-level demographic variables are put into regression equations to predict variance accounted for.
But Fantuzzo et al. also added student-level variables to their equations. For each student, they knew whether the child was pre-term or low birth weight, had inadequate prenatal care, had a mother who was a teen, had high lead exposure, had a reported record of maltreatment, had ever been homeless, and had a mother with less than a high school degree. It wasn't surprising that each of these variables was a negative predictor of achievement test scores, nor was it surprising that all but one was a statistically significant predictor of the standardized achievement test scores.
When the conventionally used school-level demographic variables were combined with the student-level variables in the same equation, something quite different was revealed. The between-school variance accounted for in students' reading test scores increased from 63 percent to 77 percent. This leaves us with the task of estimating what accounts for the remaining 23 percent of the variance in these standardized achievement test scores. We can separate that remaining variance into variance accounted for by error in the measurement system (a reasonable estimate might be about 10 percent) and school effects that are independent of teacher effects (which may also reasonably be estimated at about 10 percent). Now we can account for about 97 percent of the variance. So, what percent of the variance in student test scores remains for teachers to affect? The answer is clearly almost nothing!
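To make that arithmetic explicit, here is a rough, illustrative accounting using the rounded figures in the letter; the roughly 10 percent allowances for measurement error and for school effects are estimates offered here, not findings reported by Fantuzzo et al.:

\begin{align*}
\underbrace{77\%}_{\substack{\text{demographics +}\\ \text{student risk factors}}}
\;+\; \underbrace{\sim 10\%}_{\text{measurement error}}
\;+\; \underbrace{\sim 10\%}_{\text{school effects}} &\approx 97\% \\[4pt]
100\% - 97\% &\approx 3\% \text{ of the variance left for teacher effects}
\end{align*}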
This all suggests that the good citizens of Philadelphia will probably not find whatever important things teachers might be accomplishing in their classrooms reflected in the standardized test scores routinely used in Pennsylvania. Instead, they should now think about other credible ways to judge teacher effectiveness.
Second, it won't be Philadelphia's newly taken-over schools that demonstrate how to get greater achievement from the students they serve. Schools, like teachers, may also be doing great things. But as noted, they usually affect only about 10 percent of the variance on standardized achievement tests. Given Philadelphia's high poverty rates, the variables associated with poverty may well account for most of the variance in citywide test scores, leaving little variance for the schools to affect, similar to the teacher effects just discussed.
The third point addresses the obvious question: what, then, should we do to better understand how teachers and schools affect local student outcomes? Usually we measure this via a standardized achievement test score, but as noted, that is quite likely to be inadequate for those purposes.
Instead, let's try something different and use the funds ordinarily paid to a test publisher to train selected parents who, alongside school principals or teacher leaders, could routinely observe classrooms and assist in monitoring the quality of instruction. Parents, principals, and teacher leaders can learn to evaluate the artifacts of teaching, among them teachers' tests and students' answers. Those classroom tests will show teachers' understanding of, and instructional alignment with, the desired curriculum (as evidenced in the tests' items), and they will also show the quality of teaching of that curriculum (as evidenced in the students' answers to those items).
These assessments can be conducted as informal observations, or they can use some of the more systematic and frequently used observation scales, such as those found in Danielson's and Pianta's systems. Scriven's duties-based evaluation approach is certainly worth trying. And Meier's and Knoester's recent book offers a half dozen other ways to assess students and classroom practices that don't rely on standardized achievement tests. The point is this: assessing the quality of teachers and schools can be done without using standardized achievement tests that are known to be highly insensitive to what teachers and schools accomplish.
My fourth bit of advice is directed at those who serve on the newly constituted school board. It's to remember that education outcomes are the result of much more than education policies. The school board will not achieve its improvement targets until other city and state policies better address the needs of Philadelphia's schoolchildren. Housing policies must be strengthened to eliminate segregation by race and income, and affordable housing is needed to minimize residential mobility, which impedes school achievement. Physical and behavioral health policies must be strengthened, and nurses, counselors, and social workers must be sufficiently resourced to ensure that students attend school daily and have the supports needed to fully engage in learning. Policies to assure food security are needed so students aren't preoccupied with hunger, and fair wage policies can provide income security for working families overwhelmed by living in poverty. And so on.
Getting their schools back is good for democracy in the city that took center stage in the founding of our nation's democracy. Getting those schools to function well enough that their students can take on the role of stewards of our democracy is a whole other matter. I hope this advice helps Philadelphians do just that.
David Berliner
On yesterday's Vice News there was a story about music in the Philadelphia Public Schools. As a product of this school system, I was both angry and sad at this news. The story was about a music teacher who could no longer have an orchestra because of the amount of damage to the instruments. To raise money to repair the instruments, the students were giving "concerts" with their broken instruments. What was upsetting was that the reckless, irresponsible leadership of the city and state has left the district a shell of its former self. They allowed the barbarian horde of "edupreneurs" to loot the public schools with all their lies, waste, fraud, and kickbacks for the so-called representatives.
Developers have been allowed to make a fortune by rebuilding and resegregating neighborhoods with selective charters for middle-class white families and cheap charters or crumbling public schools for poor minorities. The misguided, blind acceptance of "market-based solutions" makes money for a few at the top at the expense of many at the bottom. It is anti-democratic, racist, and sickening.
Thanks again, retired teacher, for enlightening us.
Makes me ill.
Public Schools are being starved and like the students we teach, the deformers don’t care and don’t want to GET IT…POVERTY in homes and USELESS, STUPID spending on Common Gore stuff and testing. Egads…and then the workbooks put online..plus money towards unnecessary infrastructure. RIDICULOUS. We ALL are being HAD.
Thanks for posting Berliner’s letter to you, Diane. Love it.
Thanks to David Berliner, a national treasure
Berliner is a national treasure … YES, indeed, Diane.
And thanks to you, too. You are a national treasure, too.
Ahhhh
Indeed!
David Berliner carries the Progressive torch brilliantly and reaches those who have forgotten the principles of this signature movement in American education.
late and long again.
I respect much of David Berliner's thinking, especially about better uses of the vast sums of money spent on standardized tests and test prep. Unfortunately, I see no way to cut those requirements under ESSA and under the equally pernicious focus on tests in state legislation and policy.
The Pennsylvania Association of School Administrators worries about pending state legislation. Apparently schools placed in the bottom 15% of a rating scheme will be required to offer vouchers to students, and each voucher will produce a corresponding drop in the district budget. http://www.pasa-net.org/currentupdate
The plot thickens, because the school rating system in Pennsylvania is in transition, and it also has a title which should raise some red flags. Schools will be rated using a “Future Ready PA Index,” complete with a new data dashboard, beginning in 2018.
I am not certain what the Future Ready PA Index will be. I gather that the SAS Institute may be involved in helping to set up the new system, which might preserve some variant of test-dependent VAM scores.
But there is another possibility. The title "Future Ready PA Index" rang a bell. There is a Future Ready Schools organization devoted to all things digital in learning. Moreover, the website for Future Ready Schools says that the "Pennsylvania Department of Education is a new Future Ready partner committed to supporting students and teachers through systemic digital learning planning."
On the chance that a state-wide digital initiative is in the works for Pennsylvania (as it is in the District of Columbia and 29 states), I looked into Future Ready Schools and fell into that rabbit hole.
Future Ready Schools is a branded operation of a national digital learning initiative organized by the Alliance for Excellent Education and led by Bob Wise, the former governor of West Virginia, who is leading the charge for digital learning.
Future Ready Schools is funded by the Carnegie Corporation of New York, Bill and Melinda Gates Foundation, Apple, AT&T, McGraw-Hill Education, Google for Education, Follett, Summit Learning, and Pearson.
In addition to these mega-promoters of tech in education, Future Ready Schools has three "digital content" partners: the William & Ida Friday Institute for Educational Innovation, CoSN (the Consortium for School Networking), and Common Sense Education. These are the main sources of "content" for the training offered to leaders and participants in Future Ready Schools.
In addition to these edu-businesses and foundations, Future Ready Schools has forty-three "national partners," a mix of established professional groups, networks, and businesses that are clearly enchanted with disruptive innovation of the digital kind.
Future Ready Schools asks local participants to sign one of two "pledges." This amounts to a pledge of allegiance, and it is a hallmark of projects supported by the Gates Foundation. One pledge is for participating superintendents and one is for participating principals.
Both pledges commit the signers to full-throttle attention to digital learning and allegiance to a specific program of retraining district and school employees in the Future Ready Schools ideology of digital learning. The pledge amounts to an agreement to outsource the entire architecture of the school system to an unelected organization, one supported by large corporations seeking profits from online, digital education.
The ideology of the Future Ready Schools program is evident in the training materials, including “online self-assessments” and micro-credentials (badges) issued for completing specific assignments. The training materials for district and school leaders are organized under eight topics: 1. Curriculum, Assessment, and Instruction, 2. Use of Time and Space, 3. Robust Infrastructure, 4. Data and Privacy, 5. Community Partnerships, 6. Personalized Professional Learning, 7. Budget and Resources, and 8. Collaborative Leadership.
Here is an example of a district self-assessment for Curriculum, Assessment, and Instruction. As I read these rubric-like statements seeking some level of "agreement" or "disagreement," it is obvious that there is no interest in curriculum content except as it is aligned with grade-by-grade standards for "21st Century Skills." https://dashboard.futurereadyschools.org/uploads/media/default/0001/01/2e3f80dd676db966d4652e1a99e88704bd98c47e.pdf
Recall that the meme "21st Century Skills" came to us from the tech lobbyist and master marketer Ken Kay, whose list of skills, except for those linked to information technology, was not distinctive to the 21st century. He had no concept of curriculum other than listing conventional names for subjects and a bunch of really complex ideas that he called "skills."
Back to the rabbit hole. After the self-assessment exercise, the training for each topic is structured much like a standard online course. Moreover, you may be able to receive graduate credit for your topic-centered "micro-credential" at one of five universities.
Here is an example of the course objective and requirements. The overarching task for the topic "Curriculum, Assessment, and Instruction" is: "District team will build data-driven plans to lead change in their districts using the Future Ready Framework."
Please notice the thick marketing jargon in this assignment and the placement of this assertion as if it were inevitable and desirable: "Assessments are shifting to be online, embedded, and performance-based." Scan the readings that comprise the course, and notice the platform for the program, from BloomBoard.
In addition to the Future Ready Schools package of micro-credentials, BloomBoard offers online micro-credentials for specific teaching techniques (e.g., from Doug Lemov’s Teach Like a Champion), subjects (e.g., Financial Education), and methods of planning (e.g., Backward Design).
https://bloomboard.com/microcredential/view/79642aa5-c114-4439-9f7c-ea80bec1168b
I hope that the “Future Ready PA Index” is not a stalking horse for the whole bill of goods from “Future Ready Schools.”
Ich bin ein Berliner
I wonder what Prof. Berliner thinks of E.D. Hirsch’s observation that French schools, when they had a coherent, content-rich curriculum, reduced the achievement gap, but now that they have a skills-focused American-style curriculum, the achievement gap is widening. If Hirsch is right, then Philly can make gains by replacing Common Core and other skills-centric curricula with a knowledge-centric curriculum.
Of course, all of this makes sense and is something most people know already. But what about what economists have been talking about: the factor of when in the year a child was born? The earlier in the year, the more successful they are. How does that figure into these numbers?
Of course, the bottom line is that poverty and race are clearly the indicators that matter for almost all children.