Steven J. Koutsavlis, a research associate at the National Center for the Study of Privatization in Education, reviews Audrey Watters’ new book on the history of education technology in schools. Its title is Teaching Machines: The History of Personalized Learning.
Koutsavlis writes:
On account of the pandemic, there has been a seismic shift to remote or hybrid instruction. However, long before COVID-19, forces to harness instruction to technology were at play within the American school system. In Teaching Machines: The History of Personalized Learning (MIT Press, 2021), Audrey Watters masterfully explores this story and explains the consequences technology has had on the nature and architecture of American schooling.
As many policy analysts know, Watters has been writing incisively about this important topic since 2010 on her blog, Hack Education: The History of the Future of Education Technology. With Teaching Machines, she cements a decade of lucid, riveting commentary.
In this NCSPE excerpt, Watters establishes the foundation for her analysis with a depiction of the Harvard psychology professor B.F. Skinner’s first efforts to mechanize learning. Skinner would go on to develop several teaching machines during the Sputnik era and beyond. Watters explains how he incorporated his work as a behaviorist into the design of these learning devices.
Skinner, along with other progenitors of teaching machines such as education psychologist Sidney Pressey, aimed to pioneer the automation of pedagogy, “freeing the teacher from the drudgeries of her work so that she may do more real teaching, giving to the pupil more adequate guidance in his learning,” Watters writes. In doing so, they prioritized the interests of private entities looking to engineer systems of learning at the expense of teachers and school leaders who aimed to engender more democratic modes of education.
The posture of automated learning presumes that the work of evaluating student responses and guiding them to new levels of understanding can simply be outsourced to a programmed device and does not require the nuanced touch of a seasoned practitioner with deep content knowledge. Yet the word “assessment” itself derives from the Latin assidēre, meaning “to sit down to.” The role of the teacher is to sit beside children and listen deeply and intently, not only to learn how students are approaching a particular task, question, or problem, but also to hear from them about what piques their curiosity about the work at hand and motivates them to persevere. Watters deftly details how even the most well-designed or well-intentioned teaching machines fail to achieve this. She moreover describes how critics of Skinner such as Paul Goodman raised these concerns as they saw these devices dehumanizing the educational process.
“Who, then, will watch the puzzlement on a child’s face and suddenly guess what it is that he really doesn’t understand, that has apparently nothing to do with the present problem, nor even the present subject matter?” Watters quotes approvingly from Goodman’s 1960 book, Growing Up Absurd. “And who will notice the light in his eyes and seize the opportunity to spread the glorious clarity over the whole range of knowledge; for instance, the nature of succession and series, or what grammar really is: the insightful moments that are worth years of ordinary teaching.”
Even with advanced programming and interactive computer displays, personalized teaching machines or programs may not be able to elucidate nuanced understandings of difficult concepts with struggling learners. Independent work with these programs is often unsupervised, and students may receive unauthorized assistance on particular questions instead of supplying their own authentic responses. Another recurring issue for struggling learners is the motivation to complete the tasks at all: extended independent assignments often result in fatigue and non-completion for students who are still building task stamina.
Watters also writes about the practical challenges of implementing such programs, where private demands for technocratic control over the levers of schooling have clashed with the needs of actual practitioners and students. Echoing what we see in contemporary education settings, Watters documents that programs were often rolled out in a hasty and haphazard fashion, unsupported by research evidence demonstrating their effectiveness or appropriateness for students and without adequate teacher training or support for adoption.
Programmed instruction, whether delivered by teaching machines or by their modern incarnation as computerized learning engines, Watters likewise makes clear, represents a highly systematized and standardized form of education that collides with more progressive, constructivist, and student-led pedagogical methods. Such systems also reify practices and norms within school systems that promote a highly functionalist model of education, where students are fed bits of information as they are trained to complete discrete tasks serving little more than the informational needs of private companies.
While programmed learning systems and algorithms aim to provide individualization and personalized learning, Watters demonstrates how they can conversely serve to stifle creativity and individual expression at the student, teacher, school, and system levels. “These technologies foreclose rather than foster possibilities,” Watters writes.
For longtime followers of Watters’s blog, which is now on hiatus, Teaching Machines will fulfill all expectations. For those who haven’t read Watters’s blog, this excerpt should pave the way to reading the book. Agree or not with Watters, readers will be riveted and challenged.
“Who, then, will watch the puzzlement on a child’s face and suddenly guess what it is that he really doesn’t understand, that has apparently nothing to do with the present problem, nor even the present subject matter?”
Who, indeed?
Time has given us so many child-development studies, with books written about how kids need human interaction to make academic connections. And here we are in our ‘modern’ age, cutting more and more of those human interactions down to the bone.
Oh yes yes yes.
I think the first teaching machine was the written word, and the first personalized teaching machine was the library. Much school time was spent learning how to properly access that technology, and even after many centuries of use there is still disagreement on how best to teach that skill. It will take some time before we figure out how best to use digital technology and teach students how to access it.
The first teaching machine was the human machine and the interaction of that machine with the environment and with other human machines. We are far from figuring out how digital technology can come close to duplicating this interaction. More importantly, what role do we really want digital technology to play?
Speduktr,
I was thinking that spoken language was not really a technology, but written language (and importantly the printing press) was the first technology that allowed for mass learning from an artificial source.
I suspect that technology will eventually duplicate that interaction, but technology now can enhance that interaction. By using Gradescope, for example, I can give better feedback on assignments, spend no class time returning assignments, and get assignments graded in a quarter of the time, with fewer errors than is possible without that technology and its handwriting-reading artificial intelligence.
TE, the following is apropos of nothing, but it provokes some shaking of the head. The latest Jan. 6 person appearing in court was a senior in “mathematical economics” at the University of Kentucky. She said she was looking for a place (on the Senate floor, which she didn’t recognize) to plug in her phone.
She liked the idea of infamy. She thought her future grandkids would want to know she had been in the Capitol on that day.
The Trumpers have reversed what happened on Jan 6.
The insurrectionists are heroes to them.
Those defending democracy are villains to them.
Linda,
We agree. Your post is without reason or purpose.
Like the printed word, other techno-education approaches tend not to decrease human involvement. Rather, they tend to increase the individual’s ability to access learning.
The interesting thing about learning and teaching is that labor saving devices never save labor. As soon as space is made in time by such things as books, some people begin to demand more.
I note that this is somewhat different from manufacturing. A quicker way to produce textiles created a specialization in industry, freeing (or perhaps enslaving?) workers to perform tasks other than the traditional spinning and weaving of the traditional economy. While this was going on in the industrial revolution, the complexity of work made more education necessary, just the opposite effect.
It seems to me that the paradox of education is that it does not fit the business model of economics, but you may use teaching to discuss the business model of economics.
Definition of machine-
a piece of equipment with moving parts that does work when it is given power from electricity, gasoline, etc.
Individuals are not an “it”. And, in the American democracy, people should expect more of themselves than to be described as its, earthen vessels and flocks of sheep.
TE– I can see your parallel with the written word/ printing press: digital tech allows us to spread information farther and faster. But the function is the same. Internet conversation is similar: it allows us to converse in real time with more and further-distant people.
But to me– getting back to Skinner/ learning sw– there’s a fundamental error there, in attempting to mimic/ replace the function of teaching/ learning. It’s of some help, but as an assist; a tool, not a replacement. It’s the same error as Skinner’s confusion between learning and behavior. It’s the same as thinking AI/ robots can replace the human brain.
Tech will always follow well behind what we learn about how the brain functions. The absolute killer marvel of digital tech is its ability to perform repetitive calculations far, far beyond the speed and capacity of the brain. Thus it is a wonder-tool for speeding up our learning about how the brain works (and all the body, and the interplay among brain and other bodily systems). It has been an incalculable assist to medicine already, and I count on it to further our baby-level understanding of how genes/ environment work in brain devpt, so we may advance in treating mental illness.
bethree5,
I certainly agree that technology is an assistant, but it is an assistant in the way that a tractor or a harvester is an assistant: it allows fewer people to do the same or better job than before.
Gradescope, which I wrote about above, allows me to grade and record scores in a quarter of the time that grading by hand takes. There are no data entry errors, students get more feedback, and I can feasibly do more frequent low-stakes activities in the large classes I teach. The AI in Gradescope reads the student’s handwriting and groups answers together if they all have the same short answer or numerical result. Gradescope then allows me to combine the groups if they all say similar things (say 3.5 or 7/2) and allows me to look at all the ungrouped answers and put them into groups. I can grade every answer in a group with a single keystroke. Once everything is graded, a single keystroke writes all the scores to the grade book.
As the AI gets better, I expect that it will learn that 3.5 and 7/2 and 14/4 are all the same answer and group them together. I would also expect that it will learn that “a is less efficient than b” and “b is more efficient than a” are actually the same answer and group those together. It will likely go much further than that and save me even more time, but we will have to give it more data.
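The numeric grouping TE describes can be sketched in a few lines of Python. This is a hypothetical illustration of equivalence-grouping, not Gradescope’s actual implementation: the `canonical` helper and the sample answers are invented for the example. The idea is simply to reduce each short answer to a canonical value, so that 3.5, 7/2, and 14/4 all land in the same bucket and can be graded with one action.

```python
from fractions import Fraction
from collections import defaultdict

def canonical(answer):
    """Reduce a short numeric answer to a canonical value.

    Returns a Fraction (e.g. "3.5", "7/2", and "14/4" all become 7/2),
    or None if the answer isn't a recognizable number.
    """
    text = answer.strip()
    try:
        if "/" in text:
            numerator, denominator = text.split("/")
            return Fraction(int(numerator), int(denominator))
        return Fraction(text)  # accepts decimal strings like "3.5"
    except (ValueError, ZeroDivisionError):
        return None

def group_answers(answers):
    """Bucket answers whose canonical values match.

    Each bucket can then be graded with a single action; answers that
    can't be parsed go into the None bucket for manual review.
    """
    groups = defaultdict(list)
    for answer in answers:
        groups[canonical(answer)].append(answer)
    return dict(groups)

submissions = ["3.5", "7/2", "14/4", "3.6", "three and a half"]
grouped = group_answers(submissions)
# The three equivalent forms land in one bucket; "3.6" gets its own;
# the unparseable answer falls into the manual-review (None) bucket.
```

Grouping equivalent *sentences* ("a is less efficient than b" vs. "b is more efficient than a"), as TE anticipates, is a much harder problem: it needs semantic comparison rather than arithmetic normalization, which is presumably where the training data comes in.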
Yes, TE that sounds like a wonderful tool, and thanks for explaining how its features don’t just save you correcting time, it allows you to refine & expand types of assignments. I tend to think only of CBE-type programs, which it seems to me could be used as an adjunct to in-person teaching, say for practicing what you learned in class, but apparently gets [mis]used in some districts as a substitute for the IRL part. What you are describing sounds like having a teaching assistant in your pocket! I also tend to think through my lens as a former humanities/ for-lang teacher, where only some types of homework lend themselves to a computer-grading assist.
We’ll call you when you’re six years old
And drag you to the factory
To train your brain for eighteen years
With promise of security
But then you’re free
And forty years you waste to chase the dollar sign
So you may die in Florida
At the pleasant age of sixty nine
The water’s getting hard to drink
We’ve mangled up the country side
The air will choke you when you breathe
We’re all committing suicide
But it’s alright
It’s progress folks keep pushin’ till your body rots
Will strip the earth of all its green
And then divide her into parking lots
But there’s nothing you and I can do
You and I are only two
What’s right and wrong is hard to say
Forget about it for today
We’ll stick our heads into the sand
Just pretend that all is grand
Then hope that everything turns out ok
You’re free to speak your mind my friend
As long as you agree with me
Don’t criticize the fatherland
Or those who shape your destiny
‘Cause if you do
You’ll lose your job your mind and all the friends you knew
We’ll send out all our boys in blue
They’ll find a way to silence you
But there’s nothing you and I can do
You and I are only two
What’s right and wrong is hard to say
Forget about it for today
We’ll stick our heads into the sand
Just pretend that all is grand
Then hope that everything turns out ok
Ostrich 1967 John Kay
Absolutely crucial: “The role of the teacher to sit beside children and listen deeply and intently, not only to learn how students are approaching a particular task, question, or problem, but also to hear from them about what piques their curiosity about the work at hand and motivates them to persevere. Watters deftly details how even the most well-designed or well-intentioned teaching machines fail to achieve this.” Where is student voice in programmed learning? Does anyone pushing this crap stop to think that while children are herded through digital programs, there is no opportunity for them to actually ask questions, whether for clarification or curiosity?

I have a vague memory of being introduced to programmed learning during high school in the ’60s. After moving through the program for a short time, I realized that I could answer the questions with only a part of my mind, and not understand any of the content at all. I am convinced that this is what happens for most students on digital platforms. This is the antithesis of learning.

I’m looking forward to reading Audrey Watters’ book. I have been making my way through Shoshana Zuboff’s equally crucial exposé of the aims of surveillance capitalism (also directly linked to Skinner): The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. To my mind, this dovetails completely with what the edtechdeformers have been trying, and succeeding, to accomplish with digital data mining/monitoring/profiling/profiting. This is a sinister agenda that must be thwarted.
I too recall those programmed learning books. I hated the experience. I must admit that others I have talked to were not quite so put off as I was. A best friend recalls learning stuff from her SRA reading. Not I. The hated program was put behind me as soon as I could.
I think the problem is that the purveyors of this type of learning are so taken with the platform that they lose sight of what they are teaching. The same can be said of a particular testing format. Once we decide to test a certain way, we start to gear our instruction toward that end and off we go, steered by a hand unconscious.
I remember the SRA color-coded reading cards that were once used by reading teachers in the 1970s. As I recall, they were dull as dishwater, and students hated doing them. Most computerized instruction is simply boring worksheets in an electronic delivery system.
Sheila, this is interesting & on-point: “I realized that I could answer the questions with only a part of my mind, and not understand any of the content at all.”
Using the perhaps-faux but intuitively-helpful concept of left-brain/ right-brain learning: any CBE-type program is going to be based on left-brain thinking, i.e., “thinking in words/ linear thinking/ sequencing/ mathematics/ facts/ logic.” Because that’s how computers and their software “think.” It’s binary, and linear. As you suggest, one can easily miss the overall content conveyed in an article/ book, as it’s essentially being “tested” by the sw for awareness of discrete facts or concepts presented. Such sw these days is [I’m speculating] likely based on CCSS ELA, which likewise prioritizes surface features over comprehension of content and its implications.
Understanding content involves letting what’s going on in that specific left-brain practice connect to other binary processes and to other reading and to real-life experience—which leads to understanding this specific content, and then beyond to other content gleaned from other readings, via individual thoughts/ previous conclusions reached, now expanded.
Thank you for this thoughtful post. The “teaching machine” model of education does not belong in a democracy where we expect people to become engaged citizens and informed voters. The pandemic has shown that a rather large group of voters is easily misled and swayed by misinformation. Critical thinkers are essential in a democracy.
Corporations do not care about democracy. They only care about profit. They want to produce obedient drones, not creative thinkers. If the only education available to the children of the proletariat is sitting in front of screens all day, we will have lost a democratic education that provides access and opportunity for all. Expansion of cyber instruction will lead to a tiered system of opportunity. The children of the wealthy will continue to get a world class education while the children of the working class will be conditioned by teaching machines to stay in their place and not question anything. This is social engineering, not education.
“Corporations do not care about democracy. ”
There is an irony here. Nineteenth Century Classical Liberals made the argument that corporations pushed the idea of democracy. Men like David Livingstone, the Scot who went to Africa as a missionary and thereby opened the continent for European exploitation, would have argued that commerce was a catalyst for freedom from conflict and human rapacity. What followed was the nasty affair of European domination of Africa over the next several decades.
Even in the face of this experience, some will make the same argument today.
retired teacher– “Corporations do not care about democracy. They only care about profit. They want to produce obedient drones, not creative thinkers.”
I have some trouble with this line of thinking, although I hear it often here from many commenters. Full disclosure: my corporate experience, though extensive, is only with engineering/ construction firms. “Obedient drones, not creative thinkers” are not welcome employees at such places, cannot even do the work. The description sounds to me more like Amazon box-stuffers/ carriers/ loaders, who are mere antecedents to the robots that will inevitably replace them.
We can make other parallels: might work for Walmart cashiers, but not for Starbucks servers/ cashiers, where personality/ service are part of the job, and creativity may even be rewarded. When it comes to retail, perhaps what I’m observing is just our usual herding into services for the rich vs the poor: those with smallest wallets will have to put up with obedient drones without a bone of creativity… Meanwhile the corporations themselves cannot function without highly-educated creative and innovative people at supervisory/ mgt/ above levels.
So… you may be right that corporations can find the people they need without democracy… or not. IMHO, it’s the same paradigm we observe with charter schools: for-profits feeding off the benefits provided by public services while simultaneously bleeding them dry. It’s not sustainable.
“Who, then, will watch the puzzlement on a child’s face and suddenly guess what it is that he really doesn’t understand”
Is the child wearing a mask?
I blame Bill Gates for being one of the biggest advocates and pushers of tech in schools and according to one expert, Gates’ worldview is deeply flawed.
“Anthropology Expert Says Bill Gates’ Positive World View Is Deeply Flawed”
https://observer.com/2019/01/jason-hickel-bill-gates-world-view-flawed/
Pinker, Gates and Epstein…of course.
It’s America’s sorrow that Melinda believes the press she pays for- the propaganda that she is better than her ex-husband.
Hickel’s argument: “Prior to colonization, most of the world’s population lived in ‘subsistence economies’ where they relied not on money, but actual resources such as land, livestock and a system of sharing and reciprocity. They had little if any money, but then they didn’t need it in order to live well—so it makes little sense to claim that they were poor,” Hickel wrote.
That seems like a real stretch to me. We are meant to believe that prior to 1500, most people “lived well.” Life expectancy was 30-40 years. 1/2 of children died by age 15 [half of those in infancy]. The average fertility rate was 5 or more children, yet population did not increase. The part of the world we’re familiar with—Western Europe—was a feudal society, where a few in fact “lived well,” and the vast majority—did not. Hickel seems to view pre-colonial Europe as some kind of hippy-dippy-happy commune.
Also interesting by Audrey Watters on her blog: “Luddite Sensibilities’ and the Future of Education”, 29 July 2020
http://hackeducation.com/2020/07/29/luddite-sensibilities
Thank you, John. That is wonderful!
Apologies for redundancies to those who’ve known this of me from past posts:
I was the tech liaison and coach for the six sites associated with our special education D75 school in NYC for more than a decade. I applied for and won very large grants which ushered in what was then “state of the art” technology for all of our classes. I then searched for, reviewed, and bought programs. Gave PDs/personal training sessions to get the teachers up to snuff on the new mediums, as well.
Of equal importance were my responsibilities, earlier on, related to finding and purchasing curriculum programs. I’d attend the book/curriculum fairs and bring back brochures and suggestions to the admins.
I say “of equal importance” because, when it comes to taking autonomy away from a teacher (which tech does, indeed, do)…there’s more than one way to skin a cat.
The curriculum publishers’ presentations became more script-centered as time went on. “And the best thing is: you, the teacher, don’t have to do ANYTHING!” was one of the common catch phrases. When we’d ask, “What if we WANT to do something?”, we’d be referred to the “Differentiation” blocks (often with a purple background for some reason) in the script. Or we’d get no answer at all.
Tech took that ball, pumped it up with steroids, and ran with it. I could make this much longer, but “Please don’t get him started” def applies here. Suffice it to say that my colleagues and I were witnesses to and participants in the gradual shift from “tech serving the teachers’ needs” to “teacher serving tech as a monitor to the process”.
Skinner may actually have believed in his theories; but once you add in the profit motive inherent in the private sector, and coat the whole mess with politicians allied with big-money/anti-union interests, the motives become much less altruistic.