This would be funny if it were not also unethical and outrageous.
Pearson embedded messages in certain tests to test “social-psychological” premises. Would encouraging messages raise test scores? Would “growth mindset” messages improve scores?
The answer: no.
“Education and publishing giant Pearson is drawing criticism after using its software to experiment on over 9,000 math and computer science students across the country. In a paper presented Wednesday at the American Association of Educational Research, Pearson researchers revealed that they tested the effects of encouraging messages on students that used the MyLab Programming educational software during 2017’s spring semester.
“Titled “Embedding Research-Inspired Innovations in EdTech: An RCT of Social-Psychological Interventions, at Scale,” the study placed 9,000 students using MyLab Programming into three groups, each receiving different messages from the software as they attempted to solve questions. Some students received “growth-mindset messages,” while others received “anchoring of effect” messages. (A third control group received no messaging at all.) The intent was to see if such messages encouraged students to solve more problems. Neither the students nor the professors were ever informed of the experiment, raising concerns of consent.
“The “growth mindset messages” emphasized that learning a skill is a lengthy process, cautioning students offering wrong answers not to expect immediate success. One example: “No one is born a great programmer. Success takes hours and hours of practice.” “Anchoring of effect” messages told students how much effort is required to solve problems, such as: “Some students tried this question 26 times! Don’t worry if it takes you a few tries to get it right.”
“As Education Week reports, the interventions offered seemingly no benefit to the students. Students who received no special messages attempted to solve more problems (212) than students in either the growth-mindset (174) or anchoring groups (156). The researchers emphasized this could have been due to any of a variety of factors, as the software is used differently in different schools. However, educators who spoke to Education Week were understandably more alarmed by Pearson placing thousands of unwitting minors in A/B testing for its products.
“It’s concerning that forms of low-level psychological experimentation to trigger certain behaviors appears to be happening in the ed-tech sector, and students might not know those experiments are taking place,” Ben Williamson, a professor at the University of Stirling, told the publication.”
The students who received no “encouraging messages” performed much better than those who did. Maybe the messages triggered anxiety. Maybe they were distracting. Maybe this was a dumb and unethical experiment on human subjects.

“Neither the students nor the professors were ever informed of the experiment, raising concerns of consent.”
Ya think?
Let me fix that sentence: “raising the certainty of experimentation on children with no consent, a completely unethical and possibly illegal breach of scientific protocol by Nazi ‘researchers’ at Pearson.”
By the way, these sorts of Nazi experiments will almost certainly be a part of “Personalized learning”.
Anyone who believes otherwise has not been paying attention.
Pearson is claiming that these were not children since they were college students … therefore, consenting adults. They couldn’t NOT consent, since the program was mandated by the university for those particular classes. My gosh, this whole mess just gets more awful by the minute. When will some rich family have the guts to go up against the “ed-tech giants”?
The same consent requirements apply to adults.
Fakebook did a similar “experiment” on unsuspecting Fakebook users by feeding them certain types of “news” items, effectively playing with their emotions.
A Cornell University (fake) researcher was involved in the “analysis” of the data, though Cornell claimed they had no responsibility because their (fake) researcher did not actually perform the experiment, just analyzed the data.
It’s bad enough when corporations do this stuff but when our Universities get involved it is grotesque.
Universities like Cornell who do not abide by (or even stretttttttch) the rules about experimentation on human subjects should have ALL of their Federal and state research dollars yanked.
And by the way, the Cornell researcher “collaborated in the design of the study”, so Cornell’s claim about not actually performing the experiment is simply disingenuous.
https://www.theguardian.com/science/head-quarters/2014/jul/01/facebook-cornell-study-emotional-contagion-ethics-breach
“Cornell ethics board did not pre-approve Facebook mood manipulation study”
By Gail Sullivan (Washington Post)
https://www.washingtonpost.com/news/morning-mix/wp/2014/07/01/facebooks-emotional-manipulation-study-was-even-worse-than-you-thought/?utm_term=.1dc792a10859
Consent is needed to perform research experiments on any human subject, regardless of age.
Correction: Pearsonalized Learning
Sadly, in general, experiments with human subjects that take place in educational settings are exempt from requiring consent. But this is still unethical, if not illegal. Usually, however, you need the consent of the institution in which the experiment is taking place, which did not seem to exist in this instance. However, nearly every contract with an ed-tech vendor contains a clause saying that personal student data can be used to improve the product or to create new products or services. This is likely the clause that Pearson is relying upon to argue that this was legal.
Yup, though Justin Reich in EdWeek claims this is no different from teachers who experiment on their students all the time. Please read and leave a comment. http://blogs.edweek.org/edweek/edtechresearcher/2018/04/ben_harold_an_ed_week.html
The big difference is that teachers use in-class tests to inform instruction and to report to parents. Teachers generally respect the privacy of students and never sell information about them.
Please copy. Don Stewart
If I didn’t already have enough reasons not to use software produced by publishing companies in the college mathematics classes that I teach….
“Maybe this was a dumb and unethical experiment on human subjects.” This is another sick attempt to take over the minds of our young. What unethical experiment will be tried next?
Administrators should be outraged at this. I certainly don’t want ‘encouraging’ messages to come to me from someone attempting to influence how I think.
Where anti-tech “outrage” should lie, these days, we often instead find complacence: who, after all, is helping to massively fund those who should be outraged…
I’ll go with the unethical experiment description.
Where are the protections for “HUMAN SUBJECT” research? All research done by “REAL” researchers must have this in place.
Why doesn’t PEARSON have to follow the same rules? This is SERIOUS STUFF.
Pearson obviously believes they can get away with it because they are a British company and therefore beyond the reach of the US government.
Reminder to the Brits at Pearson: we have whipped you before (more than once).
Lends new meaning to the term “Pearsonalized learning”
“Pearsonalized Learning”
Pearsonalized learning
Nazi testing
Intestinal churning
Demands divesting
This is one more step in the direction of the “surveillance state,” where technology is used to punish and exercise social control. We do not need to follow China, where they already have a wide surveillance operation that tracks people with warrants, electronically posts faces of jaywalkers for public shaming, and even limits the amount of toilet paper (nine sheets) in public bathrooms. Who can forget ‘Minority Report’? https://www.theatlantic.com/international/archive/2018/02/china-surveillance/552203/
Growth mindset is junk science.
As for testing on unsuspecting kids, come on, we are bombarded by stuff like this every minute on TV, in newspapers, on the internet. This is how advertising and propaganda works.
Does Facebook’s *uckerberk own shares in Pearson (Replace the asterisk with any letter you want but not the “Z”)?
Or, do Pearson execs own stock in technology….
Good question. Does Pearson own stock in Facebook? Do they send memos to *uckerberg with suggestions on how to boost profits?
Reminds me of PCP testing on soldiers and other individuals way back when. The victims did not really know what they were getting into. Now Pearson is using students as guinea pigs.
Long. In October 2017, nearly 60 researchers, under the leadership of the Gates-funded Data Quality Campaign and Dr. Morgan Polikoff, Associate Professor of Education at the Rossier School of Education, University of Southern California, planned to lobby Congress for specific changes to the Student Privacy Protection Act (H.R. 3157, 114th Congress), a dormant revision of FERPA awaiting reintroduction for action by the 115th Congress. Here are key points in the “let’s lobby together” letter.
“The Every Student Succeeds Act’s evidence tiers provide new opportunities for states and districts to use data to better understand their students’ needs and improve teaching and learning. FERPA must continue to permit the research and research-practice partnerships that states and districts rely on to generate and act on this evidence. Section 5(c)(6)(C), should be amended to read “the purpose of the study is limited to improving student outcomes.” Without this change, states and districts would be severely limited in the research they can conduct.”
I think that “improving student outcomes” allows a lot of room for mischief. Indeed, later in the lobby letter “student outcomes” extends to workforce programs and outcomes. Also the bill now in limbo prohibits data-gathering on social-emotional states. Some of these researchers promote the use of such tests.
“States and districts need help to build their educators’ capacities to protect student privacy, including partnering effectively with researchers and other allies with legitimate educational reasons for handling student data. In many instances, new laws and regulations are not required to enhance privacy. Instead, education entities need help with complying with existing privacy laws, which are often complex. FERPA should provide privacy protection focused technical assistance, including through the invaluable Privacy and Technical Assistance Center, to improve stakeholders’ understanding of the law’s requirements and related privacy best practices.”
I think this section of the letter is designed to keep any revision of FERPA as slim as possible, forestall other privacy laws, and to offload questions about and interpretations of FERPA to a “federal technical center” so that states and districts have “flexibility” in defining “legitimate educational reasons for handling data.”
“Support community data and research efforts. …Schools and community-based organizations like tutoring and afterschool programs need to securely share information about the students they serve. Harnessing education data’s power to improve student outcomes, as envisioned by the Every Student Succeeds Act, will require improvements to FERPA that permit schools and their community partners to better collaborate, including sharing data for legitimate educational purposes including conducting joint research.”
I think this is a really big deal, especially because the door is opened wide for any number of “community partners” including social service providers, mentors, medical staff, tutors, ad hoc workers/volunteers to be in the loop of tracking individual student “performance” with mobile devices and data dashboards. The broad and fuzzy language especially about “legitimate use” could include anything being marketed as “personalized learning,” which typically requires (and gathers) a PII–personally identifiable information from students, teachers, and others in the loop (e.g. SSNs, student numbers, biometric records, date of birth, place of birth, mother’s maiden name).
“Support evidence-use across the education and workforce pipeline. We recommend adding workforce programs to (FERPA authorized studies)…The country also needs to better understand the efficacy of workforce programs. FERPA should recognize the inherent connectivity between these areas to better meet student and worker needs.”
I think this is asking for a major distortion of the purpose of FERPA protections, which end when students reach the age of 18. Only by hoarding PII data acquired in schools and linking it to workforce data would this be feasible.
I looked at the bill, stalled in the last Congress. It is remarkably detailed. From the proposed Student Privacy Protection Act (light editing).
SEC. 6. PROHIBITION ON PSYCHOLOGICAL TESTING.
The law states: IN GENERAL….no funds provided to the Department or Federal funds provided under any applicable program shall be spent to support any survey or academic assessment allowing any of the following types of data collection via assessments or any other means, including digitally:
“(A) Any data collected via affective computing, including analysis of facial expressions, EEG brain wave patterns, skin conductance, galvanic skin response, heart-rate variability, pulse, blood volume, posture, and eye-tracking.
“(B) Any data (including any resulting from national or State assessments) that measure psychological resources, mindsets, learning strategies, effortful control, attributes, dispositions, social skills, attitudes, intrapersonal resources, or any other type of social, emotional, or psychological parameter.
“(C) Any data collected through predictive modeling to be used to detect behaviors, beliefs, or value systems, or for predicting or forecasting student outcomes.
“(D) Any type of psychological data, including assessment of non-cognitive skills or attributes, psychological resources, mindsets, learning strategies, effortful control, attitudes, dispositions, social skills, or other interpersonal or intrapersonal resources collected via any national or State student assessment.
“(3) SPECIAL RULE.—Paragraph (2) shall not apply to an applicable program carried out or funded under the Individuals with Disabilities Education Act if the data collection is required under such Act.
I think the law’s language is designed to say no federal funds for Tripod or Panorama surveys, Dweck’s Brainology (mindsets), or Duckworth’s Character Lab (now offering services to Relay Graduate School of Education, where Doug Lemov has been king). These would be cut off from federal funding, along with the DeVos-supported Neurocore treatments.
“(4) NO NATIONAL ASSESSMENT USING PSYCHOLOGICAL DATA.—No funds provided to the Department or to an applicable program may be used to pilot test, field test, implement, administer, or distribute in any way any federally sponsored national assessment collecting any psychological data or any federally sponsored research on social-emotional data in education.
“(A) ELEMENTARY SCHOOLS AND SECONDARY SCHOOLS.—No funds provided to the Department (shall be used for) video monitoring of classrooms in the school, for any purpose, including for teacher evaluation, without the approval of the local educational agency after a public hearing and the written consent of the teacher and the parents of all students in the classroom.
“(B) OTHER AGENCIES AND INSTITUTIONS.—No funds provided to the Department (shall be used for) video monitoring of classrooms in a school or institution, for any purpose, including for teacher evaluation, without a public hearing and the written consent of the teacher, and of the parents of all students in the classroom.
“(A) ELEMENTARY SCHOOLS AND SECONDARY SCHOOLS.—No funds provided to the Department (shall be used for) a school-provided computing device on which remote camera surveillance software has been installed, without first obtaining the approval of the local educational agency after a public hearing. Any such elementary school or secondary school that provides computing devices to teachers or students shall adopt a policy prohibiting the use of remote camera surveillance software on a school-supplied computing device without the written consent of the teacher and the parent of each affected student.
“(B) OTHER AGENCIES AND INSTITUTIONS.—(As above, prohibits the use of an institution-provided computing device equipped with remote camera surveillance software “without first providing a public hearing and adopting a policy prohibiting the use of remote camera surveillance software on an institution-supplied computing device without the written consent of the teacher and the parent of each affected student.”)
Unless I am mistaken, the last two provisions make it really hard for mobile tracking of students–an essential for the proponents of “any time, anywhere” learning, which is also the stated mission of fans of personalized learning and the US Office of Educational Technology.
If passed, these rules will also kill off the “School Quality Improvement Index” installed in the CORE Districts of California where social-emotional & culture-climate indicators make up forty percent of a school’s rating. https://www.sciencedirect.com/science/article/pii/S0193397316301290
In fact, about half of the researchers who signed the lobby letter were from California Institutions of higher education or independent research organizations, including the CORE-PACE alliance. Dr. Macke Raymond at Stanford’s CREDO signed the letter.
The demand for data-driven “personalized” instruction is on a collision course with the whole concept of student data privacy. I think these researchers want to protect their interest in data mining at the expense of student privacy.
Participants must know that they are involved in an experiment, and the students should have been allowed to volunteer and also to opt out at any time. The students and staff should have been notified that changes were made to the software so that they had the opportunity to opt out of the software if they chose. Yes or no? I am just putting thoughts and questions out there…
Let’s say the students did volunteer after knowing they would be involved in an experiment that involved taking a math test. If the students were not in any danger and their personal grades and classes were not affected by the tests, then isn’t it okay not to tell them that they would receive messages that may or may not pop up along the way?
This makes me think about MAP testing. The tests change according to the student’s answers. The better a student does, the more difficult or more complex the questions get. Now, as a teacher I was not told to explain this to my students. Yet, when I think about it, if I did tell them that, it might change the outcome of the test. Students who realize that their questions are getting more difficult might assume they are doing well and either keep working hard or decide that they can relax a bit, because they have already demonstrated understanding of basic concepts.
If Pearson was trying to see the effects of positive or negative comments, then they wouldn’t want to tell the students, because it might change the behavior of the participants, right?
Common Core was undoubtedly among the largest experiments ever performed on American children without the consent of their parents.
Bill Gates admitted as much when he said “It would be great if our education stuff worked, but that we won’t know for probably a decade.”
When you try something to “see how it works out”, that is the very definition of an “experiment”. And Gates and others never got permission from parents to do what they did.
Bill and Melinda Gates obviously believe in experimenting on children — as long as they are other people’s children, of course. They would never subject their own children to such experimentation.
Intrusive randomized distracting experiments which affect student responses on tests have consequences for student scores and sometimes teacher performance evaluations. Doesn’t this invalidate scores since it’s not a level playing field?
Cue Duane Swacker, who would point out that you can’t invalidate something that was invalid to begin with.
But you are right. IF the scores were valid to begin with (and that’s a very big if), introducing a confounding variable would certainly invalidate the results.
It goes against the very core principle of science, an effort to control for everything except the factor of interest.
It’s like introducing an additional force or forces other than gravity into an experiment on falling masses in a vacuum, in which some masses are subjected to the additional forces and some are not. The acceleration will no longer be that due to gravity for all the masses.
But the people making these tests don’t care about stuff like “validity.” They don’t even know what it means. And they certainly don’t know how to do science.
You did just fine explaining, SDP!
Perhaps the edudeformers believe that invalidating an invalidity would net a validity. Make sense?
I first thought this was an Onion “news” story. I mean, the Pearson “researchers” even presented this crap at a conference?