This mother has adapted a question that she says appears in her child’s homework assignment. Is she exaggerating? If you have a child in this grade, please chime in.
“Second grade question: If Jack and Jill go up the hill, and a bucket of olives is $5.00, then how much tapenade can John concoct with boughten oil? (Actually not far off from actual 2nd grade math homework – infer that Jack=John, understand the archaic past-tense form of buy, be inspired to look up the word ‘tapenade’, find a tapenade recipe, estimate the amount of olives in a bucket, then solve – assuming that Jack/John had five dollars.) It integrates skills (sure it does) – as if you were a 24-year-old graduate student intern at the Food Network.
“Frustrating in the extreme doesn’t begin to describe this nonsense. Parents have been posting their kids’ homework all over Facebook. The questions/problems are absurd. The only aims are to make all the teachers take expensive training, sell expensive computer systems, and make everyone feel inadequate.
“This emperor has no clothes; none of this is proven to produce results. It is an expensive and damaging program.”
I think I’d want to see the actual question and know the context in which it was asked. But it would not be the first example of a poor homework or assessment item.
What a buzzkill you are, reading and comprehending Diane’s post and following it up with such a reasonable response. I’m leaving this comment to go party with H.A. Hurley!
Wait…Tapenade context & 2nd grade? Could play and learn with a 2nd grader 24/7 and NEVER stumble on TAPENADE! I lived a wonderful life until I stumbled onto Tapenade in my 30s. Have been making and enjoying it for 30+ years. But, 2nd grade? In all fairness, my granddaughter knew what Tapenade was in 2nd grade. But, embedding it in a math problem in 2nd grade? Do we truly have nothing in the US that we think we need to teach without looking through old dusty dictionaries, randomly plucking words, and incorporating them into CCSS? Come On, People!!!
Are we preparing our little ones to hold conversations with the elderly in nursing homes, or preparing them for a life full of great things, ideas, and concepts? Nothing against the elderly (oops – I am one) or nursing homes.
I must have seen that Peyton Manning/Deion Sanders DirecTV “tapenade” commercial 50+ times last season, and I don’t even watch that much football. Tapenade is in the mainstream!
We need a copy of this and proof that it is real.
I can tell you that the questions asked of 2nd graders are above their comprehension level. These questions are better attuned to fourth grade.
As you have written, this is child abuse to knowingly ask questions that the child is most likely not able to answer.
Are you kidding me?
Parents should send such harmful and absurd nonsense to David Letterman & Jay Leno for their nightly monologues. What hinter-world contributed this information for academic rigor? Some isolated weirdo in a cabin in Upper Mongolia? Do we not have any American information that is developmentally appropriate and important for our kids to learn? I bet even Mario Batali’s boys would not know how to make tapenade. But, if they had been educated properly in 2nd grade under CCSS, they would know it by now.
Is this real? Pinch me, somebody!!?
I raised four kids. If that isn’t enough to destroy any dreams of culinary glory, I don’t know what is. Only one of the four would touch olives as a child. Tapenade did not cross my radar until they were all gone. If the child is good at sounding out words, I wonder if he asked his mother what “tape nade” is.
It is obviously Gatorade that comes out of the tap. Tapenade.
Good one, Joanna. 🙂
I would like to see the actual question, but on the surface it is absurd.
I have a 2nd grader and the math questions have been pretty straightforward, nothing like the example given here. Homework is assigned M-Th. It consists of one math question, one literacy exercise (and some are fun – for instance, one exercise is to illustrate vocabulary words; the definitions are provided, the words go along with the class reading, and they do not seem too advanced to me – the hardest one this week is precipitation), and a minimum of twenty minutes of reading is required. We choose the books for the reading portion.
In all the work my child has brought home so far this year, I found one question I did not like; it was on an enrichment math sheet he did in class. He did not do this particular problem because they don’t use calculators in school. It involved using a calculator to add two numbers, pressing the “=” button 3x, and recording the answer. I’m an old-school HP gal and my RPN calculator does not have an “=” button! (Not that I would have allowed my 2nd grader to use a calculator anyway….)
Obviously the question makers were pissed that they had to pull out Mug (TM) root beer, so they said, we’ll show those little second grade twerps… tapenade.
To assuage some here I apparently have to declare that I am not a K-12 educator, merely the father of 3 kids and husband of a former HS teacher.
If this is supposed to be funny, I don’t see the funny bit. Without the actual math question, this is really rather silly and ridiculous. As it stands it is essentially an example of Alinsky’s Rule #5. Such comments hardly add to the credibility of critics of CCSS.
Here’s one from an ELA packet that my daughter recently brought home. It’s about dividing syllables. Here are the printed instructions: “When a word has two consonants between two vowels, the syllables are divided between the two consonants. For example, number is divided like this: num/ber.” She was then given five words to divide: droplet, effect, currents, faster, and happens. Here is the link — it’s on p.13: http://books.google.com/books?id=wR7t_AWLtKMC&pg=PA152&lpg=PA152&dq=spectrum+reading+a+word+part+is+called+a+syllable&source=bl&ots=Dl0eAvS3QI&sig=EAUzu-vla0hKZRaTJj8sWt-aN1E&hl=en&sa=X&ei=xrFmUsqDI8jPkQeKvIC4BA&ved=0CCsQ6AEwAA#v=onepage&q=spectrum%20reading%20a%20word%20part%20is%20called%20a%20syllable&f=false
Following the rule, my daughter dutifully divided all five words between the consonants. The problem? In order to properly divide “faster” into syllables, you do NOT divide between the two consonants (“fas/ter”). Because the “er” is a suffix, the proper way to syllabicate faster is fast/er. Look it up in the dictionary! Say it aloud to yourself. Basically, my daughter has just learned improper grammar from a common-core-aligned workbook in school. No big deal?
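For anyone curious how mechanically that workbook rule behaves, here is a toy Python sketch of it (the function name and implementation are mine, not the workbook’s). Applied blindly, it gets four of the five words right and also produces the disputed fas/ter:

```python
# A toy implementation of the workbook's printed rule: "when a word has
# two consonants between two vowels, the syllables are divided between
# the two consonants." The function name and code are illustrative only.

VOWELS = set("aeiou")

def vccv_split(word):
    """Apply the vowel-consonant-consonant-vowel rule at its first match."""
    for i in range(len(word) - 3):
        a, b, c, d = word[i:i + 4]
        if a in VOWELS and b not in VOWELS and c not in VOWELS and d in VOWELS:
            return word[:i + 2] + "/" + word[i + 2:]
    return word  # the rule does not apply

for w in ["droplet", "effect", "currents", "faster", "happens"]:
    print(w, "->", vccv_split(w))
```

The rule yields drop/let, ef/fect, cur/rents, and hap/pens correctly, but it has no notion of suffixes, so it also yields fas/ter rather than fast/er.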
Let’s fast-forward to ELA testing time. What happens if my daughter gets something like this on the ELA exam? What should she do? Now the poor child has to sit and play mental hara-kiri with herself: which answer does the KGB of childhood testing want? Is it the grammatically incorrect one from the workbook, which is aligned with the common core, which is aligned with the test? Or is it the grammatically correct one according to the dictionary? Only one answer is right according to the KGB. Only one answer is grammatically correct according to the dictionary. Which answer gets her (and her teacher) the points? Fas/ter? Or fast/er? We will never know……….. And therein lies the rub….
That is a great catch on your part regarding fast/er! Hopefully other children who are using this workbook have similarly observant teachers or parents in their lives.
But this strikes me as being an error on the part of the textbook/curriculum provider, not something inherently wrong with standards or curriculum reform.
I understand your point. Hope you understand the Hobson’s-choice dilemma outlined in mine!
Deborah:
I really appreciate the actual concrete example. It really helps.
I would defer to linguists on this, but it seems to me that the issue is what typically happens in establishing pronunciation rules in English – there are always exceptions. For the word “master” which rhymes with “faster” the split is between the s and t – according to my American Heritage Dictionary.
More to the point, and as someone else noted, this looks like a mistake. Mistakes happen in textbooks, workbooks, and tests alike. They shouldn’t, but they do. It argues for greater care in the preparation of such material. I find it difficult to go beyond that simple conclusion and decide to throw the proverbial baby out with the bathwater.
Except that mistakes in high-stakes testing — documented in this link — have serious ramifications. And our children are being made to suffer those consequences. http://www.washingtonpost.com/blogs/answer-sheet/wp/2013/04/24/a-brief-history-of-pearsons-problems-with-testing/
Deborah:
There can be no excuses for the errors. It looks like Pearson and other test publishers need to have a far more rigorous quality control process in place. Contract provisions need to be added that penalize such errors and ensure quality control checks. In the majority of instances scores can readily be corrected for this type of error.
These types of errors will not end standardized testing.
Ah, Bernie. So what will (end standardized testing)?
Deborah:
I think I need to add a caveat at this point. To assuage some here I apparently have to declare that I am not a K-12 educator, merely the father of 3 kids and husband of a former HS teacher.
In my opinion, demonstrably effective and credible evaluations of teachers and administrators that incorporate some external validation of those evaluations would essentially eliminate the need for grade by grade standardized assessments. There may still be a legitimate need for some form of standardized evaluation at entry points into Middle School and High School – but these would primarily serve a student readiness diagnostic purpose. They may be used to provide some validation of a school’s evaluation process.
To echo Deborah’s point,
The errors in high-stakes testing (well documented in print and in the experience of the teachers who have, for one reason or another, seen the highly secretive tests) have serious consequences for students and teachers.
Because of the “test security” measures, there is little (no?) mechanism to ensure that errors are caught, corrected, etc., prior to students being denied diplomas, receiving lower grades, or being held back, or teachers being fired.
Perhaps it is time to toss the monster baby out along with its toxic bath water.
Ang:
Pre- and post-test administration error checking mechanisms should be completely independent from test security issues.
Except for rhyming, Bernie, the two words cannot be compared for syllabication. “Faster” is a comparative adjective where the -er is a suffix indicating “more fast.” In the case of “master,” -er is not a suffix. We are not suggesting “more mast.” As someone else suggested, catching these errors on the test is entirely up to the publishers, since no one else has access to the test questions.
That fact alone puts the lie to the use of these tests for the improvement of teaching. How can you use the results of a test to improve pedagogy when you do not have access to the test or the results? Talk about a way to deform teaching and learning! “You are just not good enough at reading/math. You will just have to work harder. No, we can’t tell you what to work on; just do more harder.” Of course, some kids from groups that tend to have lower scores, with year after year of sub par test scores, just quit. They know they are dumb.
2old2tch:
I am not disputing the error and I do understand why the word should not have been put in the workbook.
However, it is a massive leap to argue that because such errors have occurred and might continue to occur, that in itself warrants stopping the use of such tests to evaluate teachers’ value added. There are way more potent and compelling arguments for using such tests as major components of any evaluation of teachers, IMHO.
I was playing with you with the word stuff. The dang things (words) are much harder to pin down than a person would think. I had to think about why your rhyming was not a good example.
As to the value added stuff, my post was not referring to that but to the bogus claim that we could use the tests to improve teaching and learning. It’s impossible to use a test you can’t see to inform instruction. As to VAM, I believe I have seen posted with citations on more than one occasion that the statisticians who came up with VAM cautioned that the concept was not ready for prime time and should not be used for high stakes decisions. How anyone can justify going ahead and using the concept is beyond me.
Bernie
Would love to hear your “more potent and compelling arguments for using such tests as major components of any evaluation of teachers”
In your humble opinion, of course.
NY Teacher:
I meant to write “There are way more potent and compelling arguments against using such tests as major components of any evaluation of teachers, IMHO.
My apologies for any confusion. “For” makes no sense in context.
Deborah,
Thanks for sharing. I found the question relating to the reading assignment in that same area interesting. The passage explained why lightning and thunder occur.
One question was: The purpose of the column was:
a) to inform
b) to entertain
c) to persuade
I know (at least I highly suspect) the answer they want is a), but I also feel b) is a valid answer if the student finds the subject matter fascinating.
Bernie @ 5:51,
Should be does not equal IS.
The fact is that, due to “test security,” parents, students, and teachers are not aware of issues until too late (or ever).
And woe be unto the teacher who attempted to point out an error.
Trust me on this one.
I am not OK with students and teachers suffering consequences due to potential (and, in the experience of many, frequent) errors in super-secret, secure tests never examined by parents, teachers, or the citizens who are paying for them.
Thanks for the correction. I am a former, part time item writer for a private testing company; I wrote for many different state standards under NCLB. I must say that poorly constructed, confusing, or developmentally inappropriate items undermine the validity of standardized scores and subsequent use in teacher evaluation. When standardized tests are properly constructed, such items which might make it to a field test will almost certainly be vetted during what is typically a two year process. Many items on the Pearson math and ELA administered last April here in NY were written, in my opinion, in an intentionally confusing style using obtuse or arcane vocabulary. The ELA test in particular included confusing item stems and distractors that were not clearly wrong. There were far too many items that turned subjective opinions (most likely; best; author’s intent; etc.) into a “one right, three wrong” format. Many teachers were unsure of the correct answers on a number of vague and fuzzy items.
The math test included many items that were ridiculously convoluted. Although there may be other compelling arguments against VAM teacher evaluations, corrupt test writing, norm referencing (instead of criterion-referenced scoring), and manipulated cut scores add up to a rather important set of reasons to invalidate the entire process.
NY Teacher:
Good points. Recent Pearson tests need to be made public to facilitate the type of analysis you outlined. They also need to release the validation data for those same tests. That in my opinion is the angle to take. If the tests are flawed they will collapse under their own weight.
Math is math. The only way for lazy, incompetent testing corporations to make it more “rigorous” is to make the actual math problem harder to extract from the text. Essentially, we are supposed to believe that it’s a good idea for our children to learn to be confusing when they communicate with others, that this “Tower of Babel” model of high expectations will lead us to ever greater global competitiveness. I call it a GIANT PANTSLOAD!
It seems to me that criticism of the author of any question should be based on a question that author actually wrote. Unfortunately that is not the case here.
TE:
I absolutely agree. The great value of Deborah’s example is that it can be meaningfully discussed.
I remember getting a brand-new first-edition math book when I was 15 years old. The errata covered three pages. We learned to check the errata before doing the problems.
Thank you, all. My point — however inartfully I may have expressed it — was that discussion regarding errors our children make on teacher-driven tests is possible. It is not possible with the Pearson-driven-and-owned-trade-secret-protected (yes, they claim trade secret legal protection) standardized tests (unless, of course, there is some glaring “class action” error, e.g., with the NYC gifted and talented students). Parents and teachers never get to see the children’s work product, and never get to see the mistakes the children made, and our children never get to learn from their mistakes. And, after all, isn’t that what true learning is all about… being able to learn and grow from one’s mistakes?
Wonder what Bernie would think of these two: http://blogs.edweek.org/edweek/on_performance/2011/10/performance_or_effectiveness_a_critical_distinction_for_teacher_evaluation.html followed by http://blogs.edweek.org/edweek/on_performance/2011/11/ramifications_of_the_performanceeffectiveness_distinction_for_teacher_evaluation.html
Bernie, sell all your shares in Pearson and the other testing “giants”.
Very interesting…
CitizensArrest:
These are two interesting articles. Thanks for the links. To be clear, I am neither for nor against Pearson or any other textbook or test publisher.
The two articles you linked to capture the essence of how most I/O Psychologists – my background – view performance assessment. They are both blessedly brief, well written and informative.
I particularly like the closing two paragraphs of the second article:
If you want to measure teacher performance, then measure it directly. Doing so will force you to delineate the behaviors of interest (i.e., what you define performance to be) and increase your chances of identifying promising interventions for improving performance (and, thereby, effectiveness). Second, VAM seems to limit the definitions of both teacher effectiveness (to students’ test scores) and teacher performance (to only those behaviors that increase student achievement on tests, and this assumes that we know which behaviors those are). Thus, both effectiveness and performance as defined by VAM are likely deficient concepts.
Please do not let our concerns regarding VAM lead you to believe we are anti-testing. On the contrary, we are staunch supporters of standardized testing. Nevertheless, current VAM seems to discount the inherent complexity of teacher performance and teacher effectiveness, artificially constraining their definitions and indicators. We enthusiastically endorse the use of empirical data, but convenience (students’ test scores are available and standardized, at least within states) must not trump relevance (students’ test scores tell us little about specific teacher behaviors) when choosing data to serve as the foundation for high-stakes personnel decisions.
The nuance here is that the authors have finessed the issue of what % of a teacher evaluation should VAM be. (You need to think carefully about their statement “students’ test scores tell us little about specific teacher behaviors.” It does not say that these are not measures of teacher performance.) The authors certainly say that standardized test-based VAM should not be 100%, just like evaluating a salesperson on sales volume is overly simplistic and likely to distort actual effectiveness. For example, for those who are familiar with High Tech, the old Apollo mini computer sales folks were driven by big sales commissions while their competitor HP used to pay minimal commissions. Apollo generated high volumes per $ cost of sales and low margins, while HP had lower volume per $ cost of sales and higher margins. When HP bought Apollo, the Apollo sales people left taking a significant % of their customers with them. How you measure performance is not easy and whatever measure you use has both direct and indirect consequences.
The question is what % weight VAM should carry in an evaluation, assuming that the VAM measure is valid and reliable. As I said earlier, in line with the authors of the two articles, if you can develop an efficient, reliable and credible evaluation system without standardized tests then there is no need for standardized tests as part of the teacher evaluation process – they are expensive, potentially distort teacher behavior and impose a potentially high cost on some students. But in the absence of such a system, standardized tests may have a role to play, albeit more limited than it currently is in some places. The 50% mandated in RTTT is in my opinion way too high for classroom teachers, though it makes more sense for a school principal.
What is perplexing is that a number of commenters here believe that there is no way to evaluate a teacher’s performance. I very much doubt that the authors of the two articles would agree with that proposition.
To put it succinctly but crudely, these folks are messin’ with kids’ heads. It just isn’t right. I don’t like it when someone messes with my head and children don’t need anyone messing with theirs. It’s abusive and a cause for consternation.
Ms Cartwheel:
There is no evidence that this is a real math question. Deborah’s flawed question is real. But it is a mistake. There is no evidence to assume that people are being abusive or messing with anybody’s mind. There just isn’t.
You would change your opinion in a NY minute if the state would release the Pearson tests administered last April. The Pearson test writers are either grossly incompetent – or purposely messing with kids’ heads. Hard to believe that they’re that incompetent.
Bernie, My response was tongue in cheek but I can assure you that I have seen some equally confusing and consternating math problems in a second grade classroom recently thanks to Math Investigations.
As to the tapenade twins going up the hill, well, the reading level was a tad high for second grade, so my guess is it was a third-grade problem bringing in inferencing and the CCSS science topic 3.E.2.2: Compare Earth’s land features (including volcanoes, mountains, valleys, canyons, caverns, and islands) by using models, pictures, diagrams, and maps.
I like the “ideal” of Common Core, the interconnectivity of all learning, but the rollout of the program and the attachment to high-stakes testing is screwing with kids’ heads (my original choice for my post, but I decide to be just a wee bit less crude). Higher-order thinking skills have a place, and that place is after the basics are firmly in hand.
decided
I have a son in 3rd grade and the homework is ridiculous, as are the tests. Tonight’s homework required that he understand multiple concepts to complete one problem. It does often feel like riddles, and I need to reread the question several times. I don’t like the multiple-choice aspect of homework and tests because I see that sometimes he just picks an answer after question 15, probably because he doesn’t have the stamina after being mentally exhausted by so many problems. Overall, I can see that for many students the math is too advanced, because many have not mastered the basics and/or developmentally it’s just not appropriate. Here is one from tonight:
Carole read 28 pages from a book on Monday and 103 pages on Tuesday. Is 75 pages a reasonable answer for how many more pages Carole read on Tuesday than on Monday? Explain your answer. OMG! Really… for an 8-year-old?
This is a Pearson math sheet. Emotionally, I just prepare my son and tell him what this really is and make sure he understands what he really needs to know. It’s still madness that is frustrating for everyone.
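For what it’s worth, 103 − 28 is exactly 75, so the item appears to be probing estimation rather than computation. A rough Python sketch of the likely intended 3rd-grade reasoning (the rounding strategy is my guess at the intent, not Pearson’s rubric):

```python
# 103 - 28 is exactly 75, so the "reasonable answer" framing is really
# an estimation exercise. One plausible 3rd-grade approach (my guess at
# the intent, not Pearson's rubric): round to the nearest ten, subtract,
# and compare the estimate to the proposed answer.

def round_to_ten(n):
    return round(n / 10) * 10

monday, tuesday = 28, 103
estimate = round_to_ten(tuesday) - round_to_ten(monday)  # 100 - 30 = 70
exact = tuesday - monday                                 # 75
print(estimate, exact)  # 70 75 -- so 75 is not merely reasonable, it is exact
```

Which only underlines the objection above: asking whether the exact answer is “reasonable” is a strange way to teach estimation.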
Excellent example from a Pearson worksheet. I rest my case. Do they really expect the average 8-year-old to distinguish between how many pages vs. how many more pages? And why is the phrase “reasonable answer” used when the actual computation is rather simple and precise? And why on God’s green earth would written explanations be necessary? Aren’t mathematical explanations provided whenever a student shows their work (computations)?
Let me help this item writer:
Carol started reading a new book on Monday. She stopped reading at page 15. On Tuesday she continued reading the book until she stopped at page 40. How many pages did Carol read on Tuesday?
Show your work for full credit.
Jeanette and NY Teacher:
This is great. Concrete examples are far more persuasive than pot banging.
Jeanette, can you provide specific details on the workbook your son is using? If this is not an outlier then the editor of this series deserves some forceful and well-documented feedback, with a cautionary note about the use of such items in any tests.
Bernie
The Pearson math and ELA assessments administered here in NY were littered with such obtusely worded items. Definitely not an outlier.
I wrote several emails to NYSED requesting that they release these Pearson tests for parents, the media, and testing experts to evaluate. NYSED refused all requests based on security issues – even after the tests were scored.
NY teacher:
Then legal action is needed. There are a number of law firms that specialize in this area and who are used to abiding by legitimate needs for confidentiality while having the technical expertise available to assess the items. If the items are as flawed as they seem to be, Pearson will put up a fight but I see no way they will actually win. Any teacher who suffered an adverse consequence should have standing to push the issue. Perhaps a lawyer reading this can comment.
Many of us have been advocating the use of legal challenges to end this madness. To me it would be a slam-dunk case. Many teachers are fearful about standing up to push the issue. Much harm has been done to teacher reputations, teacher health, and especially to children. “Send lawyers, guns and money – the tests have hit the fan.”
NY Teacher: I’m a lawyer (there, I’ve said it). I’ve always thought that a massive class action against Pearson would be interesting. I would love to see the response that NYSED gave to you. Writing to NYSED, pursuant to FERPA, to get my daughter’s test papers is on my list of things to do. I’m wondering about their rationale for citing “security” issues. Whose security is NYSED trying to protect? NYSED’s? Pearson’s? Our children’s? And if NYSED is trying to protect Pearson’s interests over those of our children, then why?
Deborah:
Good questions. Firms that sell business information can have some pretty potent restrictions on the use and sharing of that information. Test publishers may enforce similar conditions for use.
The publishers of the well known test guides must have worked through these issues. ETS may also be more open to sharing their position on when and under what conditions they will share items.
It may be that you need more than the actual wording of the items to identify bad items, though it is certainly the place to start. The actual data by respondent would be very useful to spot rogue items.
Good luck.
I’ve been saying that if Pearson wants to keep their tests secret, they should not give them to anyone to take! IMHO, once my child puts pencil to paper, those tests become MINE! Pearson does not own my child’s mind or her work product. Coca-Cola can claim trade secret protection in their recipe, but they still have to divulge their ingredients if they want to sell to the public.
Deborah:
Surely the FairTest folks have worked these issues before?
Deborah
After my first email to NYSED regarding the publishing of Pearson tests in their entirety, I received a timely response in which they cited test “security” issues (sorry, I’ve deleted the emails). I responded with a second request for full transparency so that parents, the media, testing experts, and the general public could see what these 8- to 14-year-old children were up against. NYSED did not respond to my follow-up email. To me their silence speaks volumes.
Their so-called “security” issues MAY be linked to the inBloom test-data business?
Please understand that Pearson wrote the NYS math and ELA exams for state standards developed under NCLB. Those tests were readily available online, and we were actually allowed to look at them and discuss them; teachers throughout the state copied the previous years’ exams to use in their classrooms for review (much like state Regents tests were used in HS). It is this sudden veil of secrecy I find somewhat disturbing – certainly a red flag for all concerned teachers, parents, school boards, et al.
Actually it’s madness with more than a little BS mixed in. Hey, that would make a great math problem:
If Jeannette’s eight year old son was making a batch of Pearson madness one evening for homework and the next evening on his ninth birthday he decided to increase the amount of the mixture in direct proportion to the increase of his age in years, how much more BS madness would be left if his birthday party attendees ate one third of a cup each and gave one half of the remaining mixture to the family pet? Explain your answer in complete sentences. Points will be taken off for spelling.
Here is a problem my third-grader brought home (I had to read it 3 times, and it took ME forever to work this – forget an 8-year-old):
Easton has been raising vegetables in his garden all summer. He plans to sell some of his vegetables at a local farmer’s market.
He has selected 24 radishes, 30 onions, 16 heads of lettuce and 25 tomatoes to sell. He wants to display the radishes together, the onions together, the lettuce together, and the tomatoes together, and to place them in sets with equal rows for each kind of vegetable.
He plans to put each kind of vegetable in at least 2 rows. Show ALL the different ways that he can display equal rows for each kind of the vegetables at the market. Write an equation for each way you find.
Melody:
I agree that as stated and in isolation it is a very confusing question and, I suspect, conceptually way beyond most 8 year olds unless broken down for each vegetable. The idea that the student would derive a general equation is really pushing it. The notion of an equation in this instance is particularly confusing because the constraints have to be stated.
In addition the wording “Write an equation for each way you find” is bizarrely obtuse unless they simply mean to factor 24, 30, 16 and 25 and write out the resulting multiplication statements.
For example,
24 radishes = rows x columns = 2 x 12 = 3 x 8 = 4 x 6 = 6 x 4 = 8 x 3 = 12 x 2
Perhaps an elementary math teacher can explain why the problem was framed in this way.
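If the intent really is just factoring, the hand enumeration above can be checked with a few lines of Python (the function name, and the assumption of at least 2 rows AND at least 2 per row to mirror the 2 x 12 through 12 x 2 list, are mine):

```python
# A quick check of the hand enumeration above, assuming the problem just
# wants the factor pairs of each vegetable count. Requiring at least 2
# rows and at least 2 per row mirrors the 2 x 12 ... 12 x 2 list for 24.

def row_arrangements(count, minimum=2):
    """Return (rows, per_row) pairs with rows * per_row == count."""
    return [(rows, count // rows)
            for rows in range(minimum, count // minimum + 1)
            if count % rows == 0]

for veg, n in [("radishes", 24), ("onions", 30),
               ("lettuce", 16), ("tomatoes", 25)]:
    pairs = " ".join(f"{r}x{c}" for r, c in row_arrangements(n))
    print(f"{n} {veg}: {pairs}")
```

Note that under those constraints the 25 tomatoes admit only one arrangement, 5 x 5, which makes the instruction to show “ALL the different ways” read even more oddly.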
Can you provide the detailed information on this book/workbook?
These math questions are framed in an obtuse style in order to test-prep students for an equally obtuse battery of exams coming in April. Pearson must think that by writing obtuse, arcane, and confusing word problems they are increasing the rigor of math. Instead they are frustrating students with a bunch of convoluted gibberish. PARENTS: SEND BACK THESE HW ASSIGNMENTS WITH YOUR SIGNATURE AND THE PHRASE “JUNK MATH, RETURN TO SENDER.”
From what I gather, this handout is test prep for the CRA, which is a practice test for next year’s PARCC. What I typed above is all that was written on the page. I am in the land of Huffman (TN), so we are still doing TCAP this year. In addition, our kids have to do the CRA to prep for the oncoming train wreck known as Common Core. In fact, my sons are taking the CRA today. The classroom teacher is focusing on the math skills required by TCAP, and the school math coach is focusing on the CRA. It is CRAZY in TN. CRAZY!!!
Melody:
I am confused. Are you saying that the worksheet is something a teacher put together?
Are the substance and test format of the TCAP and CRA that different?
The worksheet (although I’m not sure one problem constitutes the label of worksheet… lol) is pulled from a practice test bank for practice for the CRA test. The CRA test is the precursor to the PARCC, which is what we will be using next year. So this is all Common Core math. This problem is supposed to be worked using the Common Core style of problem-solving.
The TCAP is what TN has used for years to judge student progress. It is a traditional standardized test which is taken every year in the spring. In the past few years, the TCAP scores actually factor in as part of the student’s grade, and have been used to evaluate teachers (which is ridiculous). In fact, the TCAP impacts a third-grader’s average as much as (in some cases more than) a final exam does in a college class.
Anyway, we are in a transition year, so technically the TCAP is what will still impact student grades and is still being used for teacher evaluation scores, so teachers are still having to use curriculum aligned with the TCAP. The CRA is merely “practice” for the students for next year when we go all-in with Common Core.
Honestly, all of these tests are set up to try to get students to fail. They all use wording and language that is beyond what is developmentally appropriate. Pearson is notoriously guilty.
Former third grade teacher here (and current second-grade teacher): You are correct, Bernie. The children seem to be learning about rectangular arrays. It is presented in this way to provide a “real-life” math application.
Thanks, Lehrer.
Rectangular arrays, interesting. Do your children understand this approach to multiplication as opposed to essentially repeated addition?
Yes: it’s much more concrete for them since they actually build the arrays with inch tiles. After we explore with the tiles, then we begin representing the arrays mathematically, both as models of repeated addition and area (although it isn’t called area at this point). We actually begin the teaching of rudimentary multiplication in second grade.
So Lehrer, you seem to see far fewer issues with the question that was raised?
And I say that they are messin with kids’ heads. These little guys are going to give up on it plain and simple. One vegetable would have been sufficient. Good grief, math phobics of the future rise up and take back the basics and your sanity.
Melody:
You may be right about the effect, but I would not attribute anything more than incompetence as the explanation as to why four rather than one vegetable is included in the problem. NY Teacher can probably provide a ream of examples of poorly framed questions.
Sorry my last comment was meant for the cartwheeling librarian, not Melody.
I would also like the opportunity to see the actual question so that we could have something of substance to discuss.
I find it interesting that no one has questioned whether or not this problem is even aligned to the second grade standards. The focus of the second grade common core standards is on whole numbers. The standard 2.MD.8 has students working with cents and dollars; however, the types of problems they should solve depend upon the range of numbers and the types of operations outlined in the rest of the second grade standards. It appears that this problem presented the amount of money in decimal notation. Decimal notation is introduced in the 4th grade standards through the decimal fractions of 1/10 and 1/100.
Just because someone has slapped the “common core” label onto a problem, doesn’t mean that it accurately reflects the content or the intent of the standards.
I teach second grade and I am 100% positive that this question is simply satire. Yes, my students are being subjected to so much “rigor” that some of them go home and cry, but this question reflects über-rigor to the point of insanity. I can assure you that it in no way is aligned to the CCSS.
The tapenade problem is ridiculous and, I suspect (and hope) a satire that someone found believable. In this day and age, satire and reality are overlapping.
My issue with the vegetable garden is that it is one of those “gotcha” problems that test writers seem to love. Are we trying to find out what a seven- or eight-year-old knows about rectangular arrays/repeated addition, or are we trying to figure out how complicated a problem we can create for a child to solve? In this problem, it is obvious that they are testing early multiplication concepts. Why add the other vegetables? The only reason I can see is so that some will get confused and fail the item.
Lehrer:
Your comment makes great sense. However, this may be a “teacher error” since it appears that the problem was on a worksheet not from a book or a test. But who knows. It remains an awful question.
Unfortunately, I write from experience about “gotcha” questions. I have seen many similar test items over the years. I suspect that the teacher/ worksheet publisher wrote this item based on similar test items.
NEWS FLASH: NYSED suspends PARCC tests for 2014/2015! Any future commitment to PARCC in doubt. Reasons cited: cost to districts and technology logistics.
Is this a joke?
NO lie. Fact not fiction. My guess is John King will spring this on his audience in Albany tomorrow. A great distraction to those who are ill informed. The Pearson tests everyone is so worked up over are still in play.
Not if I have my say!
Just the last in a long line of jokes (which I reiterate one too many times: they’re messin’ with everyone’s heads.) It’s the new normal.