Blue Cereal Education is the name of an educator-blogger in the Tulsa area. He or she has helpfully reproduced a graphic from the website of the Oklahoma State Department of Education that will show you, in a flash, how teaching and learning are being systematically destroyed in this country by robots who pretend to be humans.
It is called “Ms. Bullen’s Data-Rich Year,” but it might as well be called “The End of Teaching as We Know It as We Collect Data and Pretend It Matters.”
Here are a few of the 15 steps to a data-rich classroom:
“(7) You are expected to create an IEP for each and every one of your students before school even begins! (Step Two) Setting aside the fact that this is insane, it’s still nine full steps before Step Eleven, where an ‘early warning system’ (which appears to be an iPad app) will send an alert to a strange man in the room that Joey is off-track, or failing. Presumably the strange man will tell Ms. Bullen, who can call Joey’s very involved parents in to look at the full-sized mural she’s devoted to the Chutes & Ladders version of Joey’s educational journey. Thank god there’s finally a way to know when students are failing – other than the fact that they’re, for example, failing.
“(8) You are expected to immediately discard the approximately 170 IEP’s you’ve spent weeks creating so you can “adjust instruction on the fly” (Step Three) based solely and exclusively on the perceived reactions of Joey. We can only hope the 34 other students in the room are not offended at the impact this must have on their individualized learning experience. At the same time, this is a great moment – it’s the only point in All 18 Steps that assumes for even an instant that you (represented here by Ms. Bullen) have any idea what you’re doing without consulting a few dozen spreadsheets of data. But don’t worry – you won’t be stuck teaching ‘on the fly’ for long!
“(9) You will have plenty of time to meet one-on-one with each of your students (Step Six) to discuss their behavior, attendance data (which is different from attendance… how?), and performance, as well as what Joey’s parents want for him – during the one moment in which his overly involved parents are conspicuously absent. You’ll set some individualized goals for the year to replace that IEP you developed before you met him, then threw out in Step Three.
“Assuming you have approximately 168 students, and that each of these meetings takes about 10 minutes, that’s only about… 28 hours each week. Or is it each month? I’m not sure how often this one is supposed to happen. Let’s assume it’s just once – it’s not like Joey’s performance, behavior, goals, or attendance are likely to change throughout the year. So we’ll just use that extra 28 hours floating around during, say… October. Nothing that important happens in October anyway.
“(10) I’m not sure what “Data Coaches” are (Step Seven), although each school apparently has several (they must share office space with all the Tutors and Trainers – no wonder Oklahoma schools are so darned inefficient with how they spend district money!) Apparently while teachers celebrate their one collective decent idea, the Data Coaches do some sort of ceremonial handshake – or perhaps it’s a dance. I’m not familiar with that culture, but I’d really like to see that. There simply aren’t enough dances based on hard educational data.”
Now that is only four of the 15 steps that the State Education Department includes in its graphic.
Taken together, the steps in the graphic demonstrate a system that cares nothing about education, nothing about children, and nothing about teachers.
Perhaps it was put together by a computer or by someone who wants to promote home schooling.

“as well as what Joey’s parents want for him – during the one moment in which his overly involved parents are conspicuously absent”
This is more than a little insulting, and would seem to reflect the true feelings of this “teacher”. I’m surprised you reproduced it verbatim.
One of the problems with data-driven systems like this, as well as with individualized learning for every child, is the expectation to document so frequently what happens with each child, in a manner that a computer can break down, such as a Likert scale.
It’s much harder and messier to promote group work and collaboration, because each individual student’s contribution is difficult to distill unless you define the roles so specifically that they, too, are “data driven,” which takes away from children’s creativity in solving problems.
One of the major gripes I have with this whole system is that in order to be data driven, you require standardization for both tracking and comparison purposes (either to compare to other students, or to compare against an artificially generated benchmark). It is VERY hard to generate real and meaningful assignments that target individual students’ needs while also being constrained by data demands.
It’s as if teachers weren’t overworked before, when we were just generating lessons and running our own classes and tests. Now we need to generate the paperwork to prove that every one of those other things we did, and are still doing, is backed up by what seems like a statistical thesis for each child. None of this data collection is passive, and it’s very invasive on classroom design.
Well said. The cure is itself a disease that has metastasized through U.S. K-12 education. The deformers like to talk about their “creative disruption.” Well, that they have accomplished. They have managed to disrupt the normal functioning of our schools to an astonishing degree. Years ago, I was running a large editorial department. I decided to start using project software that generated Gantt and PERT charts to track what we were doing. I soon found that I was spending half my time updating the stupid charts. I threw the software out and went back to management by walking around. Breathtakingly more effective.
But oh, my GANTT AND PERT CHARTS were lovely. They were the envy of the other managers. They were a big hit at meetings with superiors. There was just one little problem: they interfered enormously with getting my job done–with the humane and scholarly interactions that are the soul of an editorial project.
Diane posted a piece by a Tulsa teacher who nailed the “IEP for all” mentality.
That poster states everything that the state feels about teachers: they must be single, immature and unprofessional enough to be interested in cartoons, and willing to be spoken to in such a condescending manner… I can’t believe that the state printed and distributed that… Plus, most OK teachers come to Texas because they automatically get a raise… and we are one of the lower-paying states, so imagine what OK pays.
I think many people, especially parents and teachers, are missing a critical point. The problem isn’t about making data collection work with education. The problem is that education is being defined (or re-defined) in terms of those activities that are compatible with data collection and computer use. As has happened historically in many areas of human activity, when machines and techniques are introduced, the activities are then defined in terms of the machines and techniques, not the other way around, as is often claimed.
The real issue we have to face is the definition of education. People have obtained fantastic educations for millennia without “data”, computers, or technology in general. Education used to focus on the development of the intellect, which really doesn’t require much in the way of devices—clay tablets did quite well for a long time; often students didn’t have books or scrolls but listened attentively as the teacher read to them.
But modernity, especially in America, has become enthralled with the idea of machine- and technique-driven “efficiency” that essentially promises something for nothing. That rarely works, unless we subtly start defining the tasks in terms of the machines and techniques, because ideas about efficiency are often inappropriate, or can be defined only vaguely, for most tasks involving human intellectual activities. In short, we keep cramming square pegs into round holes.
Bingo.
Also, this shifts IEP creation to the teacher, whereas it has always been the province of SPED. I would not be surprised if, soon, non-SPED-trained teachers are expected to create these plans for SPED students. That eliminates paying more for SPED.
I am also concerned about the liability for the teacher: if I do not keep up with this untenable amount of documentation, will my contract not be renewed? When do I find time to do the essay grading and writing conferences that I am required to do? The TELPAS that I am required to do… the teaching to the test that I am required to do… It just never ends.
Who is advocating for teachers to have a realistic weekly workload? Oh wait, they get summers off… but wait, that is when the IEPs and the data analysis are created…
In my state all G&T students have IEPs. Would you expect that a SPED-trained teacher would typically be trained in creating IEPs for all the students that require them?
Yes, that is exactly the problem. Too much emphasis on what can be easily measured and reported up the hierarchy and little or no emphasis on what is most important to student learning and daily instructional decisions.
Dr. Ravitch,
The Data Quality Campaign produced this graphic (not the Oklahoma Dept of Ed, but I’m thrilled they’re using it!) to show what it looks like when a teacher uses data. Too often, we hear that people think “using data” is an event that happens once a year, when, in fact, it happens every day and involves a LOT more than just slicing and dicing a state test score. A lot of data use is simply about communicating with parents and colleagues to better understand whether students are on track to meet their goals. By no means does this illustration attempt to define teaching. It simply provides a more accurate picture of data use.
Here is a link to the paper that the DQC published this past February, which discusses the need to better support our teachers with the skills necessary to effectively use data: http://dataqualitycampaign.org/files/DQC-Data%20Literacy%20Brief.pdf. The graphic was published as part of that paper, on page 5.
We convened a large group of partners for a full year to develop the content, the graphic, the recommendations, the definition of data literacy, and several other materials. They are listed on page 9: AACTE, The Education Trust, NEA, NASDTEC, WestEd, CCSSO, NCTQ, AASA, RI Department of Education, Oregon Department of Education, AIR, Bill & Melinda Gates Foundation, Michael and Susan Dell Foundation, NBPTS, NISL, 50CAN, George W Bush Institute, The Achievement Network, US Department of Education.
I’d be happy to discuss the graphic or the recommendations in the paper. Given today’s report from NWEA, we know that teachers are asking for better supports, and we hope the recommendations in our paper are a good beginning.
Paige Kowalski
Director, State Policy and Advocacy
Data Quality Campaign
Paige,
Thanks for the correction. This obsession with data is a nightmare. It seems to destroy what good teaching is about.
Teachers have been using data from the beginning of teaching. I’m not sure what you think is so new about this, except for the fact that it’s exceptionally confusing and time- and energy-intensive for teachers to use your particular model. Human beings don’t need fancy data processing to give each other feedback.
Diane said it. A nightmare. All this belongs to a really backward, dangerous vision of education as acquisition of a bullet list of skills that can be simply and easily and independently quantified. It’s not data-based decision making. It’s numerology. It is to teaching and learning what astrology is to astronomy, phrenology to the cognitive sciences of personality. All one has to do is to compare any authentic, real-world reading and writing to the InstaWriting and InstaReading done on the tests that generate this “data” to understand what I mean there. Students are not widgets. Learning to read and write and think is not a singular, invariant process with singular, invariant outcomes. That’s utter nonsense that only a technocratic philistine who has not actually noticed the variety of thinking, reading, and writing in the world would take at all seriously. Schools are not factories, and kids are not machine parts to be identically milled to specification. This sort of process would work for producing screws and bolts. It doesn’t work for creating independent, intrinsically motivated, unique readers, writers, thinkers, and learners.
The most well-vetted standardized test in history was probably the SAT. It was supposed to be a test of aptitude for college. So it was called the Scholastic Aptitude Test. There was a slight problem, however: it did not predict achievement in college at all well. It was invalid for that purpose. So they renamed it the Scholastic Achievement Test. But it was invalid for that purpose as well. It turned out to be correlated with g–the “general intelligence factor” measured by IQ tests. So they renamed it the Scholastic Reasoning Test, or simply the SAT.
And yet people put enormous faith in the “data” generated by standardized tests far, far less well vetted than the SAT was. What a joke.
As I say, the Education Deformers’ “data-based decision making” is NUMEROLOGY. Purest GIGO nonsense. But a lot of noneducators take this stuff seriously, and in the process, they have driven out much of what was great in our classrooms. The writing process/writing workshop model and extended research and reporting, for example, have been almost entirely supplanted by teaching formulaic InstaWriting of responses to test prompts.
Great job, deformers.
A simple glance at the ELA “standards” measured by these tests should make it quite clear that this “data-based decision making” is a “garbage in” process. Attainment in the English language arts includes attainment of world knowledge (knowledge of what) and procedural knowledge (knowledge of how) and a great deal of implicit acquisition of abilities not explicitly taught or learned (of vocabulary, for example, and of grammatical competence). The CC$$ list contains ALMOST NO world knowledge. It’s a list of skills, for the most part. And those skills are so vaguely formulated that they cannot be rationally operationalized sufficiently to allow for anything like valid assessment. And, of course, they are almost all couched in terms of explicit learning. (Those who think that vocabulary and grammar are learned via explicit instruction, for the most part, really need to take an introductory linguistics course or read some introductory texts like Radford’s Minimalist Syntax or Carnie’s Syntax: A Generative Introduction or Roeper’s The Prism of Grammar or George Miller’s The Science of Words.) Furthermore, the essential world knowledge component of attainment in ELA will vary enormously from person to person, and rightly so.
So, since the “standards” are supposed to be a list of measurable outcomes, and since they leave out much of what’s important in ELA (all that world knowledge and all that implicit acquisition), they cannot validly measure attainment in ELA and so the “data” derived from the tests of those standards will be invalid as measurement of said attainment. QED.
One can also arrive at the same conclusion by noting that the ELA components of these standardized tests are supposed to be measures of reading and writing ability but that for these tests people don’t do anything remotely resembling reading and writing as it is actually done in the real world. And again, since what is supposed to be measured by these tests is not being done on them (and cannot be done on them, given their structure), they cannot, again, be valid measurements of what they purport to measure. Again, QED.
Numerology-based decision making. Taking over a school near you.
Paige,
“. . . to better support our teachers with the skills necessary to effectively use data. . . ”
Such a love of “data” betrays a lack of understanding of the teaching and learning process, which is an aesthetic, non-quantifiable activity of human interaction. To “numerize” that process, which is what your data chart attempts to do to help “support our teachers. . . ,” shows a fundamental lack of knowledge, an ignorance of the “data” gleaned from your, as B. Sheperd puts it, “Numerology-based decision making.”
To understand the errors of your methodology, I advise you to read and understand Noel Wilson’s acclaimed, never refuted nor rebutted historic study “Educational Standards and the Problem of Error” found at: http://epaa.asu.edu/ojs/article/view/577/700
Brief outline of Wilson’s “Educational Standards and the Problem of Error” and some comments of mine. (updated 6/24/13 per Wilson email)
1. A quality cannot be quantified. Quantity is a sub-category of quality. It is illogical to judge/assess a whole category by only a part (sub-category) of the whole. The assessment is, by definition, lacking in the sense that “assessments are always of multidimensional qualities. To quantify them as one dimensional quantities (numbers or grades) is to perpetuate a fundamental logical error” (per Wilson). The teaching and learning process falls in the logical realm of aesthetics/qualities of human interactions. In attempting to quantify educational standards and standardized testing we are lacking much information about said interactions.
2. A major epistemological mistake is that we attach, with great importance, the “score” of the student, not only onto the student but also, by extension, the teacher, school and district. Any description of a testing event is only a description of an interaction, that of the student and the testing device at a given time and place. The only correct logical thing that we can attempt to do is to describe that interaction (how accurately or not is a whole other story). That description cannot, by logical thought, be “assigned/attached” to the student as it cannot be a description of the student but the interaction. And this error is probably one of the most egregious “errors” that occur with standardized testing (and even the “grading” of students by a teacher).
3. Wilson identifies four “frames of reference,” each with distinct assumptions (an epistemological basis) about the assessment process from which the “assessor” views the interactions of the teaching and learning process: the Judge (think of a college professor who “knows” the students’ capabilities and grades them accordingly), the General Frame (think of standardized testing that claims to have a “scientific” basis), the Specific Frame (think of learning by objective, as in computer-based learning, where a correct answer is required before moving on to the next screen), and the Responsive Frame (think of an apprenticeship in a trade or a medical residency program, where the learner interacts with the “teacher” with constant feedback). Each category has its own sources of error, and more error is introduced into the process when the assessor confuses and conflates the categories.
4. Wilson elucidates the notion of “error”: “Error is predicated on a notion of perfection; to allocate error is to imply what is without error; to know error it is necessary to determine what is true. And what is true is determined by what we define as true, theoretically by the assumptions of our epistemology, practically by the events and non-events, the discourses and silences, the world of surfaces and their interactions and interpretations; in short, the practices that permeate the field. . . Error is the uncertainty dimension of the statement; error is the band within which chaos reigns, in which anything can happen. Error comprises all of those eventful circumstances which make the assessment statement less than perfectly precise, the measure less than perfectly accurate, the rank order less than perfectly stable, the standard and its measurement less than absolute, and the communication of its truth less than impeccable.”
In other words, all the logical errors involved in the process render any conclusions invalid.
5. The test makers/psychometricians, through all sorts of mathematical machinations, attempt to “prove” that these tests (based on standards) are valid: errorless, or supposedly at least with minimal error [they aren’t]. Wilson turns the concept of validity on its head and focuses on just how invalid the machinations, the tests, and the results are. He is an advocate for the test taker, not the test maker. In doing so he identifies thirteen sources of “error,” any one of which renders the test making/giving/disseminating of results invalid. A basic logical premise is that once something is shown to be invalid it is just that, invalid, and no amount of “fudging” by the psychometricians/test makers can alleviate that invalidity.
6. Having shown the invalidity, and therefore the unreliability, of the whole process, Wilson concludes, rightly so, that any result/information gleaned from the process is “vain and illusory.” In other words, start with an invalidity, end with an invalidity (except by sheer chance every once in a while, like a blind and anosmic squirrel that finds the occasional acorn, a result may be “true”), or, to put it in more mundane terms, crap in, crap out.
7. And so what does this all mean? I’ll let Wilson have the second to last word: “So what does a test measure in our world? It measures what the person with the power to pay for the test says it measures. And the person who sets the test will name the test what the person who pays for the test wants the test to be named.”
In other words, it measures “‘something,’ and we can specify some of the ‘errors’ in that ‘something,’ but still don’t know [precisely] what the ‘something’ is.” The whole process harms many students, as the social rewards for some are not available to others who “don’t make the grade (sic).” Should American public education have the function of sorting and separating students so that some may receive greater benefits than others, especially considering that the sorting and separating devices, educational standards and standardized testing, are so flawed not only in concept but in execution?
My answer is NO!!!!!
One final note with Wilson channeling Foucault and his concept of subjectivization:
“So the mark [grade/test score] becomes part of the story about yourself and with sufficient repetitions becomes true: true because those who know, those in authority, say it is true; true because the society in which you live legitimates this authority; true because your cultural habitus makes it difficult for you to perceive, conceive and integrate those aspects of your experience that contradict the story; true because in acting out your story, which now includes the mark and its meaning, the social truth that created it is confirmed; true because if your mark is high you are consistently rewarded, so that your voice becomes a voice of authority in the power-knowledge discourses that reproduce the structure that helped to produce you; true because if your mark is low your voice becomes muted and confirms your lower position in the social hierarchy; true finally because that success or failure confirms that mark that implicitly predicted the now self evident consequences. And so the circle is complete.”
In other words, students “internalize” what those “marks” (grades/test scores) mean, and since the vast majority of students have not developed the mental skills to counteract what the “authorities” say, they accept as “natural and normal” that “story/description” of them. Although paradoxical in a sense, “I’m an ‘A’ student” is almost as harmful as “I’m an ‘F’ student” in hindering students from becoming independent, critical, and free thinkers. And having independent, critical, and free thinkers is a threat to the current socio-economic structure of society.
Thank you for the link to Wilson and the summary of the paper!!! Bringing Foucault to the fracas is brilliant!!
On another note, are you familiar with Tinbergen’s four questions and ethology?
By understanding institutionalized education memes and their histories using ethology and Tinbergen’s four questions, Wilson’s work goes to another level academically and scientifically.
And one more thing… the “mark” also leads to “Othering,” or demonizing behavior toward those not scoring well. And as a former E/BD teacher, I had to deal with the “demonized” learner after the fact. The tools I was to use were restraint, timeout, and locked quiet rooms, or LCRs… windowless solitary confinement! There were other accountability devices, but torture and shaming, or othering, was central. Once I understood how inappropriate the permissible memes were, I had to quit teaching. It’s like, the humiliation will continue until your attitude and educational acumen improve! Crazy!
I know an art teacher whose job is to teach 400 grade-four students every two weeks. The observation that education is totally distorted by the data managers is absolutely true. This is a great commentary… a perfect script for a stand-up comedy gig, but unfortunately this nonsense is proliferating.
There was a time when anyone in education would have guffawed the peddlers of such nonsense as this infographic off the national stage. It’s just appalling that such technocratic philistines now have command and control of our schools. Just appalling. This stuff is self-parodying.
Enjoy!
And I’m sure none of those students have mastered the art of keeping multiple windows open so you can flip back and forth between what you’re supposed to be doing (when the teacher is looking) and your game/chat room/porn site/whatever.
That line is probably in the Amplify pitch.
I love the elderly lady saying, “You can’t stay in teaching and go with the old ways.” That’s wonderful.
These sharepoint servers for the classroom are a GREAT idea. I wonder, though, about their security. Important to look into that. I would be pretty upset if someone used one of these to distribute inappropriate materials to my students, colleagues, administrators, parents.
Google’s pretty good about access-point security, assuming the user uses the two-step protocol. In fact, I think Google Docs is probably more secure than many or even most corporate workplaces.
Of course, what happens on Google’s servers stays on Google’s servers. In time, I think this stuff will make inBloom’s longitudinal database aspirations look like child’s play.
FLERP!
“Of course, what happens on Google’s servers stays on Google’s servers.”
Of course, my sarcasmometer may be a tad warped at the moment and need to be recalibrated, but if anyone believes that statement, I’ve got some great oceanfront lots over at Lake of the Ozarks in Central Missouri. They’re selling like margaritas on Cinco de Mayo. Contact our sales rep so that you too may be suckered, oops, I mean may buy into the buy of a lifetime.
Gee. Zuz.
Insane.
Congrats, deformers, you have made the word “data,” a perfectly respectable word, into one of the most loathed four-letter words in the English language.
from the Rheeformish Lexicon:
data chat. Local-level meeting to enforce the will of the Common Core Curriculum Commissariat and Ministry of Truth (C^4MiniTru). See waterboarding.
data-driven decision making. Rheeformish numerology.
data wall. Public shaming device and demotivational tool; the equivalent, in schools, of targets and production figures for pig iron, etc., continually broadcast by every Fascist regime.
technocratic Philistinism. Replacement for the quaint values of humane scholarship and research, teaching and learning; another name for the Rheeformish faith.
VAM. Value-Added Measurement, or Vacuity-of-curriculum-and-pedagogy Acceleration Mechanism; means for enforcing the reduction of the complex, unquantifiable, humane enterprise of teaching and learning to a number intended to measure the extent to which a teacher has
a) effectively narrowed his or her curricula to the bullet list of “standards”;
b) based his or her pedagogy on extrinsic punishment and reward;
c) robotically parroted his or her canned scripts;
d) modeled for his or her students proper obsequiousness to superiors; and
e) identically milled his or her differing students to specification, via test preparation, thereby inuring them to the performance of meaningless, alienating tasks and so preparing them for the low-wage service jobs of the future.
See data-driven decision making, Powerpointing of U.S. K-12 education, and technocratic Philistinism.
I love how the deformers blithely dismiss the subtle observations that human beings make, all the time, as “anecdotes.” Oh, the hubris!!! Think of how simply and easily you can discern another’s emotional state. Now, think of the difficulty that AI researchers have encountered in developing devices to do that–to do a billion other things that people do simply and easily. I think of a friend of mine who recently had to take a week of her class time to do reading records assessment of her students. At the end of the week, she had learned PRECISELY NOTHING that she did not already know about her students’ reading ability. However, she had met a requirement placed upon her by administrators who had to have “data.” Of course, her students lost a week of instructional time and spent that week doing something in itself really demotivating. And the typical teacher will now spend a quarter of the school year doing precisely such crap. Pretests and practice tests and benchmark tests and post tests and state assessments and test prep and more test prep and more test prep and activities in basals that are, surprise, modeled on the questions in the crappy standardized assessments. And it’s just as bad for the administrators. They are spending all their time, these days, in data chats and evaluations and reporting. All this has gotten entirely out of hand.
And this is what happens in education. A few years back, the business world was all abuzz about key performance indicators. Some were using stack ranking approaches based on them. These bad ideas have a way of filtering down like radioactive fallout onto our schools in wave after wave of idiotic “reforms.” Remember behavioral objectives? The idiotic reform of yesteryear. Same thing here.
just give a guy a KPI . . . life in the alternate Rheeality of Education Deform is that simple
“When the right thing can only be measured poorly, it tends to cause the wrong thing to be measured, only because it can be measured well. And it is often much worse to have a good measurement of the wrong thing—especially when, as is so often the case, the wrong thing will in fact be used as an indicator of the right thing—than to have poor measurements of the right things.” —pioneering statistician John Tukey