Archives for category: Data and Data Mining

New York State cut all ties with inBloom, the controversial data-mining project sponsored by the Gates Foundation and the Carnegie Corporation.

The legislature, which totally ignored parent demands for new faces on the New York Board of Regents, bowed to parent protests against the State Education Department’s determination to share confidential student data with inBloom.

In this post, Leonie Haimson describes how parents organized–not only in New York, but wherever inBloom planned to gather confidential student data–and fought back to protect their children’s privacy.

Give Haimson credit for being the spark plug that ignited parent resistance across the nation.

Normally, the federal law called FERPA would have prevented the release of the data that inBloom planned to collect, but in 2011 the U.S. Department of Education changed its regulations to permit inBloom and other data-mining ventures to access student data without parental consent.

Gates and Carnegie contracted with Rupert Murdoch’s Wireless Generation, and the plan was to put millions of student records into an electronic database managed by Amazon.com.

No one could guarantee that the data would never be hacked.

In every state and district where inBloom planned to operate, parents brought pressure on public officials, and the contracts were severed.

At present, inBloom has no known clients.

But as Haimson points out, this could change.

The thirst for data mining seems to be insatiable, and as I posted not long ago, the president of Knewton boasted that education is one sector that is ripe for data mining and that his company and Pearson would be using online tests to gather information about every student and storing it.

Protecting student privacy must remain high on every parent’s agenda.

A frequent commentator, Bob Shepherd, with many years in curriculum development, education publishing, and assessment, offers sage advice:

“The tests are infallible. They are objective measures. And we know that because they produce data. And not just any old data. Data with numbers and stuff. Very rigorously determined raw-to-scaled-score conversions and cut scores and proficiencies. Super-dooper, charterific, infallible data. Lots and lots of it. I mean lots. Tons. You wouldn’t believe the data!!! Data for days. Rivers of data. Big, big data.

“If the new tests show that 70 percent of students are failures, that’s because 70 percent of students are failures. And if the tests show that 70 percent of our students are failures, that’s because 70 percent of our teachers are failures too.

“You see? The data show that those shiftless, ungritful kids and teachers just can’t measure up to “higher standards” produced by folks with VAST experience as educators. Folks like David Coleman.

“And that’s why teachers need to be replaced with educational technology.

“And that’s why the public schools need to be closed down and replaced with private schools and charter schools.

“And that’s why the country needs to spend about 50 billion dollars making the transition to the Common Core and Big Data.

“Because the Common Core data show a 70 percent failure rate!!!

“Because numbers in a report, however they got there, are never wrong!

“Why are they never wrong? Because they are data!

“data data data data data

“You see?

“It couldn’t POSSIBLY BE that the tests are poorly conceived and written. It couldn’t possibly be that the standards are likewise poorly conceived and written. It couldn’t possibly be that what’s being called data-driven decision making is a variety of NUMEROLOGY.

“Because the masters who designed these tests and these standards are infallible. They are the best makers of tests and standards (well, if you use those terms very, very loosely) that a plutocrat’s money can buy, that is, if the plutocrat is in a hurry, and if he doesn’t really give the matter much thought. You know, if he does this in the way that ordinary, nonplutocratic folks might, say, order up a pizza.

“Glad I could straighten that out for you.

“Just remember: The DATA show that everybody failed and needs to be fired and that everything needs to be privatized.

“Oh, and lots and lots of new software and data systems need to be bought. I mean, billions of dollars worth. Billions and billions.

“You’re welcome.”


Peter Greene, in a serious vein, explains that the Common Core standards are integrally connected to the collection of data.

They can’t be changed or revised–contrary to the nationally and internationally recognized protocol for setting standards–because their purpose is to tag every student and collect data on their performance.

They cannot be decoupled from testing because testing is the means by which every student is tagged and his or her data are collected for Pearson and the big data storage warehouse monitored by Amazon or the U.S. government.

He writes:

We know from our friends at Knewton what the Grand Design is– a system in which student progress is mapped down to the atomic level. Atomic level (a term that Knewton loves deeply) means test by test, assignment by assignment, sentence by sentence, item by item. We want to enter every single thing a student does into the Big Data Bank.

But that will only work if we’re all using the same set of tags.

We’ve been saying that CCSS are limited because the standards were written around what can be tested. That’s not exactly correct. The standards have been written around what can be tracked.

The standards aren’t just about defining what should be taught. They’re about cataloging what students have done.

Remember when Facebook introduced emoticons? This was not a public service. Facebook wanted to up its data-gathering capabilities by tracking the emotional states of users. If users just defined their own emotions, the data would be too noisy, too hard to crunch. But if the user had to pick from the Facebook standard set of user emotions, then Facebook would have manageable data.

Ditto for CCSS. If we all just taught to our own local standards, the data noise would be too great. The Data Overlords need us all to be standardized, to be using the same set of tags. That is also why no deviation can be allowed. Okay, we’ll let you have 15% over and above the standards. The system can probably tolerate that much noise. But under no circumstances can you change the standards– because that would be changing the national student data tagging system, and THAT we can’t tolerate.
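Greene’s point about standardized tags can be sketched in a few lines of Python. This is a hypothetical illustration only: the local labels and the Common Core-style tag "RL.4.2" are examples chosen to show why free-form labels make aggregation noisy while a shared tag vocabulary makes it trivial.

```python
from collections import Counter

# Five districts labeling the same skill in their own words:
# aggregation yields five separate buckets of one -- pure noise.
local_labels = ["main idea", "central idea", "gist", "main point", "theme"]
print(Counter(local_labels))

# The same five records tagged with one standardized identifier
# (a hypothetical Common Core-style tag) collapse into a single
# comparable, crunchable bucket.
standardized = ["RL.4.2"] * len(local_labels)
print(Counter(standardized))
```

With local labels, a data system cannot tell that all five records describe the same skill; with one shared tag, every record is instantly comparable across districts, which is exactly the property the tagging system needs and exactly why deviation from the standard cannot be tolerated.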

This is why the “aligning” process inevitably involves all that marking of standards onto everything we do. It’s not instructional. It’s not even about accountability.

It’s about having us sit and tag every instructional thing we do so that student results can be entered and tracked in the Big Data Bank.

And that is why CCSS can never, ever be decoupled from anything. Why would Facebook keep a face-tagging system and then forbid users to upload photos?

The Test does not exist to prove that we’re following the standards. The standards exist to let us tag the results from the Test. And ultimately, not just the Test, but everything that’s done in a classroom. Standards-ready material is material that has already been bagged and tagged for Data Overlord use.

The end-game is data-tracking, not standards. And that helps to explain why CCSS was written without consultation with educators; without participation by early childhood educators or those knowledgeable about students with disabilities; why there is no appeals process, no means of revision, why they were written so hurriedly in 2009 and pushed into 45 states and D.C. by Race to the Top.

Big data will open the way to the future of education, says the CEO of Knewton.

The company is piloting its products at Arizona State University. Whatever we used to call education will cease to exist. Big data will change everything.

The so-called Big Data movement, which has been largely co-opted by the for-profit education industry, will serve as “a portal to fundamental change in how education research happens, how learning is measured, and the way various credentials are measured and integrated into hiring markets,” says Mitchell Stevens, an associate professor of education at Stanford University. “Who is at the table making decisions about these things,” he says, “is also up for grabs.”

Want to know the future? Watch Knewton: “Big Data stands to play an increasingly prominent role in the way college will work in the future. The Open Learning Initiative at Carnegie Mellon University has been demonstrating the effectiveness of autonomous teaching software for years. Major educational publishers such as Pearson, McGraw-Hill, Wiley & Sons and Cengage Learning have long been transposing their textbook content on to dynamic online platforms that are equipped to collect data from students that are interacting with it. Huge infrastructural software vendors such as Blackboard and Ellucian have invested in analytics tools that aim to predict student success based on data logged by their client universities’ enterprise software systems. And the Bill & Melinda Gates Foundation has marshaled its outsize influence in higher education to promote the use of data to measure and improve student learning outcomes, both online and in traditional classrooms. But of all the players looking to ride the data wave into higher education, Knewton stands out.”

Read more at Inside Higher Ed: http://www.insidehighered.com/news/2013/01/25/arizona-st-and-knewtons-grand-experiment-adaptive-learning#ixzz2wkgLQ1ZS

If you have been wondering why data mining matters so much, you will want to see this video.

Please note that the U.S. Department of Education’s logo is on this video.

In it, an entrepreneur named Jose Ferreira, CEO of Knewton, shares his vision for a future in which the education of every individual child is completely determined by data. Education today happens to be the most “data-mineable industry in the world,” he says.

His firm and Pearson can map out whatever your child knows and doesn’t know, design lessons, and do whatever is necessary to “teach” the concepts needed. There is nothing about your child that they don’t know, and they will know more about him or her next year than they do this year. If this is the future, then teachers will be mere technicians, if they are needed at all. What do you think?

Peter Greene saw the video and thought it was scary. He wrote: “Knewton will generate this giant data picture. Ferreira presents this the same way you’d say, ‘Once we get milk and bread at the store,’ when I suspect it’s really more on the order of ‘Once we cure cancer by using our anti-gravity skateboards,’ but never mind. Once the data maps are up and running, Knewton will start operating like a giant educational match.com, connecting Pat with a perfect educational match so that Pat’s teacher in Iowa can use the technique that some other teacher used with some other kid in Minnesota. Because students are just data-generating widgets.

“Ferreira is also impressed that the data was able to tell him that some students in a class are slow and struggling, while another student could take the final on Day 14 and get an A, and for the five billionth time I want to ask this Purveyor of Educational Revolution, ‘Just how stupid do you think teachers are? Do you think we are actually incapable of figuring those sorts of things out on our own?’”

If the answer is yes, please come to one or both of the two sessions where I am speaking on April 3. I will give the John Dewey Society lecture at the Convention Center, 100 Level, Room 114, from 4-7 pm (lots of time for discussion). My topic: “Does Evidence Matter?” Fair warning: the room holds only 600 people.

Before the Dewey lecture, I will join Philadelphia parent activist Helen Gym and Carl Grant of the University of Wisconsin (chair) in a special Presidential session from 2:15 to 3:45, on the same level in Room 121B. The title of the session is “Rising to the Challenges of Quality and Equality: The Promise of a Public Pedagogy.” If you join me at the early session, you will have to race with me to the lecture, and the room may be full.

Mercedes Schneider came across a speech that Bill Gates gave to state legislators in 2009. It lays out the blueprint for everything that has happened in education since then. Forget what you learned in civics class. Gates gave legislators their marching orders. Duncan already had his marching orders. Gates laid out $2.3 billion to create and promote the Common Core standards. His buddy Arne handed out $350 million to test Bill’s standards. All the other pieces are there: Charter schools should replace failure factories. He is a true believer in charter magic. (We now know that charters get the same results when they have the same students.) Longitudinal data systems should be created to track students. (A parent rebellion seems to have put this on the back burner for now, although everyone seems to be mining student data, from Pearson to the SAT to the ACT.) The teacher is the key to achievement (although real research says that family and family income dwarf teacher effects). Here is the man behind the curtain, the man who loves data and measurement, not children. Lock the doors, townspeople. Bill Gates wants to measure everything about your children! Ask yourself: if this guy made $60,000 a year, would anyone listen to him?

UPDATE: After this blog was posted, two privacy activists, Allison White and Leonie Haimson, advised me that the collection of confidential data about children is going forward, thanks to Arne Duncan’s loosening of privacy rights under FERPA, the legislation designed to prevent data mining. They write: “Actually at least 44 states including NY are going forward with their internal P20 longitudinal data systems – as required by federal law – which will track kids from cradle to the grave and collect their personal data from a variety of state agencies.” Leonie Haimson is the leader of Class Size Matters and Privacy Matters; Allison Breidbart White is co-author of the Protect NY State School Children Petition. Please sign and share the petition: http://bit.ly/18VBvX2

ALSO: I transposed the numbers describing what the Gates Foundation spent on Common Core: it was $2.3 billion, not $3.2 billion. A billion here, a billion there, soon you are talking real money (I think I am paraphrasing long-gone Senator Everett Dirksen of Illinois, but who knows?)

Reader Laura H. Chapman shares this exchange with a senior fellow at the Brookings Institution about the Common Core:

I had a brief email exchange with Darrell West of the Brookings about the CCSS. He wants the CCSS to be standardized so that test scores will provide “big data” for his real interest, which is an automated system of telling students what they need to do in order to master CCSS content. He wants to ensure that no one is messing around with what he regards as a perfected agenda for tests that will produce lots of data.

He is absolutely clueless about who developed the standards, who paid for them, or the role of the CCSS in the enterprise of K-12 education. He ASSUMES that these standards can and should function in the same capacity as ISO standards function for quality control in engineering–think elaborate checklists for compliance–or as instruments for quality control for entering professions such as law and medicine. He is a complete slave to the spin thrown out by the promoters of the CCSS.

He is another in a long line of economists who are in love with the idea of getting their algorithms to munch on the big data forthcoming from tests of the CCSS.

Since he was hooked on the idea that the CCSS standard-setting process settled everything that mattered (to him), I did let him know that the CCSS did not meet the minimal criteria for “setting standards” set forth by the American National Standards Institute for designing and judging any standard-setting process. These are:

1. Seeks consensus from and through a group that is open to representatives from all interested parties.

2. Solicits broad-based public review and comment on draft standards.

3. Gives careful consideration to comments and offers a public response to these comments.

4. Incorporates changes that meet the same consensus requirements as the draft standards.

5. Makes available an appeal process for any participant alleging that these principles were not respected during the standards-development process.

The Brookings has really gone over the hill with a bunch of reports on education that are free of any moral compass or academic integrity.

North Carolina officials are trying to get a refund from Pearson because of flaws in the data system that Pearson is running for the state.

Pearson is charging the state $7.1 million for its information system, but it doesn’t work.

Here are some of the problems with Pearson’s PowerSchool:

CMS POWERSCHOOL WOES

At the Observer’s request, CMS produced a summary of ongoing problems with PowerSchool.

• Transcripts: Cannot produce transcripts for mid-year graduates. System maintenance has wiped out some data for other students.

• Athletic eligibility: PowerSchool cannot generate eligibility reports. CMS created a local system.

• Driver’s license eligibility: Can’t create reports that verify students’ eligibility.

• Graduates and dropouts: Reporting systems on retention, promotion and graduation don’t work; there is no dropout reporting system.

• School activity reports: CMS has created work-around systems because of flaws in reports that track teacher qualifications and student-teacher ratios.

• Enrollment: Monthly reports that tally enrollment at each school have had glitches. The September report is used as the official snapshot of statewide enrollment. The state reported that this function was fixed in February.

Read more here: http://www.charlotteobserver.com/2014/02/28/4731119/nc-on-troubled-school-data-system.html#.UxfazMu9KSN#storylink=cpy