I recently posted about a new partnership between the National PTA and the Data Quality Campaign. In response, our wonderful reader-researcher Laura Chapman dug deep into the money flow and produced this commentary:
Ah, data. You can be sure the PTA is uninformed about the data being collected with their tax dollars. Here are some facts that are not widely publicized.
Between 2005 and early 2011, the Gates Foundation invested $75 million in a major advocacy campaign for data gathering, aided by the National Governors Association, the Council of Chief State School Officers, Achieve, and The Education Trust, most of them recipients of Gates money. During the same period, the Gates Foundation also awarded grants totaling $390,493,545 for projects to gather data and build systems for reporting on teacher effectiveness. This multi-faceted campaign, called the Teacher Student Data Link (TSDL), envisioned the linked data serving eight purposes:
1. Determine which teachers help students become college-ready and successful,
2. Determine characteristics of effective educators,
3. Identify programs that prepare highly qualified and effective teachers,
4. Assess the value of non-traditional teacher preparation programs,
5. Evaluate professional development programs,
6. Determine variables that help or hinder student learning,
7. Plan effective assistance for teachers early in their career, and
8. Inform policy makers of best value practices, including compensation.
Gates and his friends intended to document and rate the work of teachers, and a bit more: they wanted data whose collection required a major restructuring of teachers’ work, so that everything in the new system of education would rest on data-gathering and surveillance.
The TSDL system (in use in many states) required that all courses be identified by standards for achievement and alphanumeric codes for data entry. All responsibilities for learning had to be assigned to one or more “teachers of record” in charge of a student or class. A teacher of record was assigned a unique identifier (think barcode) for an entire career in teaching. A record would be generated whenever a teacher of record had some specified proportion of responsibility for a student’s learning activities.
Learning activities had to be defined by the performance measures (e.g., cut scores for proficiency) for each particular standard in every subject and grade level. The TSDL system was designed to enable period-by-period tracking of teachers and students every day, including “tests, quizzes, projects, homework, classroom participation, or other forms of day-to-day assessments and progress measures,” a level of surveillance that proponents claimed was comparable to business practices (TSDL, 2011, “Key Components”).
The system was and is intended to keep current and longitudinal data on the performance of teachers and individual students, as well as schools, districts, states, and educators ranging from principals to higher education faculty. Why? All of this data could be used to determine the “best value” investments to make in education, with monitoring and changes in policies to ensure improvements in outcomes. Data analyses would include as many demographic factors as possible, including health records for preschoolers.
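The record-keeping described above amounts to a particular data architecture: a career-long teacher identifier, coded courses and standards, and a linkage record generated for each learning activity. Here is a minimal sketch, in Python, of what such a record might look like; the field names and example values are my own illustrative assumptions, not the actual TSDL schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class LinkageRecord:
    """One hypothetical teacher-student link in the spirit of the TSDL design.

    All field names here are illustrative assumptions, not an official schema.
    """
    teacher_id: str              # career-long unique identifier ("barcode") for the teacher of record
    student_id: str              # unique student identifier
    course_code: str             # alphanumeric course code tied to achievement standards
    standard_code: str           # the particular standard the activity addresses
    responsibility_share: float  # teacher's proportion of responsibility for this student's learning
    activity_date: date
    activity_type: str           # "test", "quiz", "project", "homework", "participation", etc.
    score: float                 # performance measure judged against the standard's cut score

# A record like this would be generated whenever the teacher of record crosses
# the specified responsibility threshold for a student's learning activity.
example = LinkageRecord(
    teacher_id="T-000123",
    student_id="S-456789",
    course_code="MATH-ALG1-0101",
    standard_code="A-REI.3",
    responsibility_share=1.0,
    activity_date=date(2016, 3, 14),
    activity_type="quiz",
    score=0.82,
)
```

Under period-by-period tracking, every student would generate several such records per day, which is the scale of collection the system envisions.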
The Gates-funded TSDL campaign added resources to a parallel federal initiative. Between 2006 and 2015, the US Department of Education (USDE) invested nearly $900 million in the Statewide Longitudinal Data Systems (SLDS) Grant Program. Almost every state has received multi-year grants to standardize data on education. Operated by the Institute of Education Sciences, the SLDS program is “designed to aid state education agencies in developing and implementing longitudinal data systems.”
What is the point of the SLDS program? “These systems are intended to enhance the ability of States to efficiently and accurately manage, analyze, and use education data, including individual student records…to help States, districts, schools, and teachers make data-driven decisions to improve student learning, as well as facilitate research to increase student achievement and close achievement gaps” (USDE, 2011, Overview).
The most recent data-mongering activity from USDE, rationalized as “helping keep students safe and improving their learning environments,” is a suite of online School Climate Surveys (EDSCLS). The surveys will allow states, districts, and schools to “collect and act on reliable, nationally-validated school climate data in real-time” (that is, as soon as it is entered).
The School Climate Surveys are for students in grades 5-12, for instructional and non-instructional staff in the schools those students attend, and for parents/guardians. Data is stored on local data systems, not by USDE. Even so, the aim is to have national “benchmarks” online by 2017 for local and state comparisons with national scores.
Student surveys (73 questions) offer scores for the entire school disaggregated by gender, grade level, ethnicity (Hispanic/Latino or not), and race (five mentioned, combinations allowed).
The Instructional Staff Survey has 82 questions. Responses can be disaggregated by gender, grade level assignment, ethnicity, race, teaching assignment (special education or not), and years working at this school (1-3, 4-9, 10-19, 20 or more).
The Non-instructional Staff Survey has 103 questions, but 21 are only for the principal. Demographic information for disaggregated scores is the same as for instructional staff.
The Parent Survey has 43 questions, for item-by-item analysis, without any sub-scores or summary scores. Demographic information is requested for gender, ethnicity, and race.
These four surveys address three domains of school climate: Engagement, Safety, and Environment, and thirteen topics (constructs).
Engagement topics are: 1. Cultural and linguistic competence, 2. Relationships, and 3. School participation.
Safety topics are: 4. Emotional safety, 5. Physical safety, 6. Bullying/cyberbullying, 7. Substance abuse, and 8. Emergency readiness/management (item-by-item analysis, no summary score).
Environment topics are: 9. Physical environment, 10. Instructional environment, 11. Physical health (information for staff, but no scores for students), 12. Mental health, and 13. Discipline.
Almost all questions call for marking answers “yes” or “no,” or with the scale “strongly agree,” “agree,” “disagree,” “strongly disagree.” Some questions about drug, alcohol, and tobacco abuse ask for one of these responses: “Not a problem,” “Small problem,” “Somewhat a problem,” “Large problem.” None of the questions can be answered “Do not know.”
I have looked at the survey questions, developed by the American Institutes for Research (AIR), and concluded they are not ready for prime time. Here are a few of the problems.
This whole project looks like a rush job. The time for public comment on this project was extremely short. USDE did not fix flaws identified in the piloted surveys, claiming that there was no budget for revisions.
The flaws are numerous. Many of the survey questions assume that respondents have an all-encompassing and informed basis for offering judgments about school practices and policies. Some questions are so poorly crafted that they incorporate several well-known problems in survey design: packing more than one important idea into a single item, referring to abstract concepts, and assuming respondents have sufficient knowledge. Here is an example from the student survey with all three problems: “Question 8. This school provides instructional materials (e.g., textbooks, handouts) that reflect my cultural background, ethnicity, and identity.”
Many questions have no frame of reference for a personal judgment:
From the student survey:
“17. Students respect one another.”
“18. Students like one another.“
Other questions call for inferences about the thinking of others.
“50. Students at this school think it is okay to get drunk.”
Some questions assert values, then ask for agreement or disagreement.
From the parent survey:
“7. This school communicates how important it is to respect students of all sexual orientations.”
Others assume omniscience:
“41. School rules are applied equally to all students.”
Some questions seem to hold staff responsible for circumstances beyond their immediate control:
“74. [Principal Only] The following are a problem in the neighborhood where this school is located: garbage, litter, or broken glass in the street or road, on the sidewalks, or in yards.” (Strongly agree, Agree, Disagree, Strongly disagree)
Overall, the surveys and the examples of data analysis they provide are unlikely to produce “actionable interventions” as intended. The questions are so poorly crafted that they are likely to generate misleading data, with many schools cast in a very bad light. See, for example, the data on page 26 of this source: https://safesupportivelearning.ed.gov/sites/default/files/EDSCLS%20Pilot%20Debrief_FINAL.pdf
The responsibility for privacy rests with the schools, districts, and states, but everything is online. A brief inspection of the background questions should raise major questions about privacy, especially for students who identify themselves with enough detail (gender, ethnicity, race, grade level: 20 data points minimum) to produce survey answers that match only one person or a very few individuals.
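To see why those background questions matter, here is a minimal sketch, using an invented roster rather than any real survey data, of how a handful of demographic answers can narrow “anonymous” responses down to one or two students in a school.

```python
from collections import Counter

# Invented respondents (no real data): each tuple records the background
# answers (gender, grade, ethnicity, race) attached to a survey response.
respondents = [
    ("F", 5, "Not Hispanic/Latino", "White"),
    ("F", 5, "Not Hispanic/Latino", "White"),
    ("F", 5, "Not Hispanic/Latino", "White"),        # a larger cell: harder to single anyone out
    ("M", 6, "Not Hispanic/Latino", "Black"),
    ("M", 6, "Not Hispanic/Latino", "Black"),
    ("F", 7, "Not Hispanic/Latino", "Asian"),         # only one such student in the school
    ("M", 8, "Hispanic/Latino", "American Indian"),   # only one such student in the school
]

# Count how many respondents share each combination of background answers.
cell_sizes = Counter(respondents)

# Any combination shared by only one or two respondents is effectively
# identifying: the "anonymous" answers can be tied back to a known person.
for combo, count in cell_sizes.items():
    if count <= 2:
        print(f"Re-identifiable cell {combo}: {count} respondent(s)")
```

In a small school, many of the demographic cells used for disaggregation will be this small, which is exactly the privacy exposure described above.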
My advice, not just to the PTA: Stay away from these data monsters. They drown everyone in data points. Results from the School Climate Surveys are processed to make colorful charts and graphs, but they are based on fuzzy and flawed “perceptions” and unwarranted assumptions. The surveys offer 63 data points for profiling the participants, but only four possible responses to each of 283 questions of dubious technical merit.
Perhaps most important for parents: Some questions seem to breach the Family Educational Rights and Privacy Act (FERPA) for topics in student surveys, especially questions pertaining to “illegal, anti-social, self-incriminating, or demeaning behavior.” More on FERPA at http://www.ed.gov/offices/OM/fpco/ppra/index.html
It would be nice to think that FERPA really protects student privacy. But former Secretary Duncan loosened the FERPA protections in 2011, to make it easier for outsiders to obtain student data. That was the premise behind the Gates Foundation’s inBloom project, which was set to collect personally identifiable data from several states and districts and store it in a cloud managed by Amazon. That project was brought down by parental objections, which caused the states and districts to back out.

This is what happens when our elected leaders stop funding peer-reviewed scientific research and instead allow private organizations’ product R&D to run unchecked. We wind up with pseudoscience like “psychometric” “data analysis,” no more accurate or even scientific than phrenology.
I have given and taken these surveys, and I can tell you from painful experience that they are just like asking why hares eat talking pineapples that wear sleeves. One pitiful, unfortunate joke after another.
Out here in California, I think State Superintendent Torlakson understands that. So I hope and pray that diligent researchers like Laura Chapman and Diane Ravitch are able to force John King and the Gates Propaganda Army to lay off Sacramento, and let Tom Torlakson keep as much pseudoscience as possible out of my state’s blueprints for the future.
The question of whether this survey breaches privacy relates to the Protection of Pupil Rights Amendment (PPRA), not FERPA, which is an entirely different law. PPRA requires parental notification and opt out for any survey that asks students questions on potentially sensitive issues, and parental consent for federally funded surveys that ask these sorts of questions.
For more information on the three federal laws that relate most directly to student privacy (FERPA, PPRA, and COPPA), check out the Parent Coalition for Student Privacy fact sheets here: http://www.studentprivacymatters.org/ferpa_ppra_coppa/ Given that these new surveys were funded and are being distributed by the US ED, I would surmise that they should require parental consent before a district or school asks students to respond to them.
Our coalition is also very concerned about the Statewide Longitudinal Data Systems, or SLDS, and we’ve written about them here: https://www.washingtonpost.com/news/answer-sheet/wp/2015/11/12/the-astonishing-amount-of-data-being-collected-about-your-children/
Our coalition came together after defeating inBloom, which was funded by the Gates Foundation at over $100 million, a cost that does not appear to be included in the summary of spending above. The Gates Foundation continues to fund the Data Quality Campaign, as well as to help support the efforts of the Future of Privacy Forum (which, despite its name, represents the interests of the ed tech and software industry), Common Sense Media, CoSN, and many other organizations working on issues related to student privacy.
Others who would like to protect student privacy against the vast array of wealthy interests working to weaken it should check out the Parent Coalition for Student Privacy website and join our mailing list at http://www.parentcoalitionforstudentprivacy.org
Thanks for that info, Leonie!
Thanks for the correction. I did link to PPRA while speaking about FERPA. Senior glitch.
Correct on my not including Gates investments in inBloom… (and more is going on there with social media tracking).
Correct on the coordinated effort to subvert regulations/laws bearing on student privacy while pretending to “protect” data.
The new USDE school climate survey platform and questions are being shoved out the door–shoddy products but “free” and permitted under ESSA for accountability.
In any case, the work you are doing is amazing, and it gets more complicated with the borderless expansion of merchandising parading as either fun or essential for students and always, of course, evidence-based.
Some districts mandated a certain number of clicks per semester or per year on the SLDS; it didn’t matter that what was loaded into the SLDS was only barely informative. The teachers still had to click and click and click away until they reached the magic number of several hundred. This absurd make-work is certainly going to be held up as proof of how useful the SLDS is: see, every teacher used it at least 400 times a year. Teachers are required to enter data on students every week no matter what, all because parents demand it. No, the data machine demands it.
I wondered about this question asked of my first graders: “I learn things in this class.” My first thought was that I hope they answer “strongly agree,” but then I thought: the craft of teaching requires that I work at the instructional level of the child. If done well, the child learns without frustration or anxiety. The learning becomes engaging and fun. The child may not be aware of the learning going on. So then I wondered if “somewhat disagree” might be the desired answer. It really bothers me that people who do not know my children or me get to judge my effectiveness on this mumbo-jumbo. Not only are the children losing their right to privacy, but so am I. And I did not give my consent either.
“I wondered about this question asked of my first graders.”
Which question?
I would find the whole process overly intrusive even if it did provide reliable information. When they start to draw their colorful charts, the poorly constructed questions are going to be further bastardized into one- or two-word summaries. So people will look at a chart rating “school climate” covering several questions of dubious merit. Conclusions will be drawn (and carefully orchestrated) that benefit the bottom line of corporate interests first. Actual educational benefit would be an ancillary goal, to be determined only after years of following the numbers. Gates should not be allowed near a tech product, given his tendency to release beta versions for big bucks rather than a finished product. What is the hurry to push out another badly produced effort to “reform” schools? “We can’t wait! Our children deserve action now!” Balderdash! Our children deserve carefully thought-out, professionally developed, fully vetted and field-tested efforts that address needs identified by stakeholders who are invested in educating our children and not in selling products.
It’s not just the data. It’s about using data to predict. What about the data collected to predict student behavior? We are headed over a cliff. This article is related and should be a call to action, whether data is collected to predict student behavior or adult behavior: https://www.propublica.org/
Reblogged this on David R. Taylor-Thoughts on Education.
The National PTA site has a section where the PTA President posts articles. It is titled Laura’s Corner. Comments can be added to her articles.
My comment, criticizing the PTA’s involvement in the DQC, was removed after passing through the “awaiting moderation” queue.