PARCC is going away

Anonymous
New Jersey no longer uses the test. It’s just DC.
Anonymous
Why not use a national test like the Iowa Test or similar? I grew up in Baltimore County and we took it every year starting in 3rd grade or so. My son takes it in Catholic school. The school uses the scores to allocate resources. I like seeing how my kid compares to others nationally.
Anonymous
Anonymous wrote:New Jersey no longer uses the test. It’s just DC.


Yup: New Jersey is also quitting PARCC:
https://www.nj.com/education/2018/07/murphy_parcc_graduation_changes.html
Anonymous
Anonymous wrote:New Jersey no longer uses the test. It’s just DC.


And New Mexico!
Anonymous
I'm relieved this awful and meaningless test is going bye-bye.
Anonymous
Anonymous wrote:Why not use a national test like the Iowa Test or similar? I grew up in Baltimore County and we took it every year starting in 3rd grade or so. My son takes it in Catholic school. The school uses the scores to allocate resources. I like seeing how my kid compares to others nationally.


Seems sensible.
Anonymous
Anonymous wrote:I'm relieved this awful and meaningless test is going bye-bye.


It will be replaced by a different test that you will probably also consider awful and meaningless.
Anonymous
Anonymous wrote:
Anonymous wrote:I'm relieved this awful and meaningless test is going bye-bye.


It will be replaced by a different test that you will probably also consider awful and meaningless.


You can count on it!
Anonymous
Anonymous wrote:Why not use a national test like the Iowa Test or similar? I grew up in Baltimore County and we took it every year starting in 3rd grade or so. My son takes it in Catholic school. The school uses the scores to allocate resources. I like seeing how my kid compares to others nationally.


+1

But MCPS will not do this.
Anonymous
Anonymous wrote:
Anonymous wrote:Why not use a national test like the Iowa Test or similar? I grew up in Baltimore County and we took it every year starting in 3rd grade or so. My son takes it in Catholic school. The school uses the scores to allocate resources. I like seeing how my kid compares to others nationally.


+1

But MCPS will not do this.


It's not an MCPS decision.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I'm relieved this awful and meaningless test is going bye-bye.


It will be replaced by a different test that you will probably also consider awful and meaningless.


You can count on it!


There will always be complaints about whether testing should happen at all. But PARCC in particular has deeper flaws: it has taken away more instructional time, both to administer and to prepare students for; it takes months to get any feedback; even after years, people are still unclear on what the score reports mean; and so many states have now left that there are no meaningful national comparisons and the test is at real risk of being discontinued by its developer.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:The PARCC tests were better than the previous state-written assessments. And now here we are, back to new and different state-written assessments.


BS, the only issue with the state assessments was that they didn't provide national comparisons. PARCC was ill-conceived and slapped together on all levels.


Did you have anything to do with the previous MSAs? They were terrible.


They were straightforward, and throwing the baby out with the bathwater has achieved nothing.


What baby, what bathwater? The curriculum changed, so the tests had to change. Now the curriculum isn't changing (the state isn't un-adopting the Common Core State Standards), but the tests are changing anyway.


This is the comparison that was presented to parents when PARCC rolled out: https://www.montgomeryschoolsmd.org/uploadedFiles/schools/burtonsvillees/news/Attachment%20A%20Key%20Facts.pdf
Look at the two math questions. There's nothing about the new curriculum that makes the MSA question irrelevant. And although we were told the PARCC question is clearly deeper and richer, it's not. The only thing it requires is more reading comprehension. If you actually look at the math involved, there's no grand concept; it's busy work. Any thought put into the solution is wasted: just do the plugging and be done. The MSA question actually tests what the concept of remainder means in a real-world scenario. Maybe that's old hat, but it's not nothing.

Now, of course, this is making too much of a single comparison, but this is *the* sample question the PARCC people decided to release to sell their test. School systems across the country flashed this same question up to wow parents. No need for a postmortem; PARCC's dead, and we were sold a bill of goods.


The PARCC question is much more difficult. Please stop talking about math; you know nothing.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:The PARCC tests were better than the previous state-written assessments. And now here we are, back to new and different state-written assessments.


BS, the only issue with the state assessments was that they didn't provide national comparisons. PARCC was ill-conceived and slapped together on all levels.


Did you have anything to do with the previous MSAs? They were terrible.


They were straightforward, and throwing the baby out with the bathwater has achieved nothing.


What baby, what bathwater? The curriculum changed, so the tests had to change. Now the curriculum isn't changing (the state isn't un-adopting the Common Core State Standards), but the tests are changing anyway.


This is the comparison that was presented to parents when PARCC rolled out: https://www.montgomeryschoolsmd.org/uploadedFiles/schools/burtonsvillees/news/Attachment%20A%20Key%20Facts.pdf
Look at the two math questions. There's nothing about the new curriculum that makes the MSA question irrelevant. And although we were told the PARCC question is clearly deeper and richer, it's not. The only thing it requires is more reading comprehension. If you actually look at the math involved, there's no grand concept; it's busy work. Any thought put into the solution is wasted: just do the plugging and be done. The MSA question actually tests what the concept of remainder means in a real-world scenario. Maybe that's old hat, but it's not nothing.

Now, of course, this is making too much of a single comparison, but this is *the* sample question the PARCC people decided to release to sell their test. School systems across the country flashed this same question up to wow parents. No need for a postmortem; PARCC's dead, and we were sold a bill of goods.


Wow. I think the PARCC question is deeper and richer and more like what you'd see in the real world. You have to set up the question first, then solve it. That said, I'm not sure whether my 4th grader could figure it out. He could definitely figure out the MSA question, though. He can do the computation, but he sometimes has trouble figuring out what the computation is supposed to be. We are working on that.


Really, in the real world, would you bring six empty vehicles on a field trip? And what's wrong with asking a simple question of someone who's just learning a concept? Is there something about doing everything at once that's superior? Math is a tool for isolating concepts. The PARCC question is so busy it's hard to tell there's not really much going on.

Even before the math, there's just too much going on--text, tables, pictures, a question that demands three responses. Why does the bus look like a van? Don't fourth graders worry that one of the seats in the five-passenger car is the driver's? Hmm, the answers work out the same whether or not you eliminate a driver from the seat count. So the test writer thought this might be a point of confusion and decided not to state it more clearly? Again, that just penalizes the student who notices the ambiguity and hunts for clarification or takes time to check that drivers don't make a difference.

Don't sell your son (or yourself) short; this was just garbage designed to distract. Good riddance.


One of the most important parts of being good at math - abstract thinking - is knowing what to ignore and what to focus on, something you seem to be struggling with.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:The PARCC tests were better than the previous state-written assessments. And now here we are, back to new and different state-written assessments.


You didn't read or understand the article. The state is not writing the test--they are looking for vendors to write the test or propose already existing tests.


Ah. So, we're replacing one vendor-written test with another vendor-written test.


Either way doesn't matter to me. PARCC needed to go, as it was based on the belief that everyone should and could go to college--NCLB bullshit. If you read the documentation for PARCC, "career ready" is just another way of saying "college"--careers, not jobs. Like the educators quoted in the article, I believe the new test should be adaptive and should test grade-level material. It should be able to assess whether a child is below, at, or above grade level and quickly get that information to teachers and schools so that it's actually useful.


Adaptive and testing grade-level material? I'm not sure how that would work: if the student doesn't know the grade-level material, the test would "adapt" by going lower, and would therefore no longer be testing grade-level material (and vice versa). It is also very difficult to compare students when the test is adaptive and changes for each student. Adaptive tests are great for measuring individual growth and seeing what students know, but they are not great for comparisons: Larla's 100% is not equivalent to Billy's 100%, because they each had different questions measuring different outcomes.


The GRE is an adaptive test, and there are no issues with comparisons. Your outcome is not a percentage correct; it's the difficulty level you converge on.
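To make the convergence point concrete, here's a toy sketch of the staircase idea behind adaptive testing (this is an illustration only, not how the GRE or any real test actually scores; the step size and deterministic "student" are made up):

```python
def run_adaptive_test(ability, n_items=30, start=5.0, step=0.5):
    """Toy staircase: raise the difficulty after a correct answer,
    lower it after a wrong one. The simulated student is deterministic:
    they answer correctly iff the item is at or below their ability."""
    difficulty = start
    history = []
    for _ in range(n_items):
        history.append(difficulty)
        correct = difficulty <= ability
        difficulty += step if correct else -step
    # The reported score is not "percent correct" -- it's the difficulty
    # level the test settled on (here, the mean of the last ten items).
    return sum(history[-10:]) / 10

# Two students of different ability get scores on the same difficulty
# scale, even though they saw different items.
print(run_adaptive_test(ability=3.0))  # -> 3.25
print(run_adaptive_test(ability=7.0))  # -> 7.25
```

Because both scores live on the shared difficulty scale, they are directly comparable even though neither student answered the same questions.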
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:The PARCC tests were better than the previous state-written assessments. And now here we are, back to new and different state-written assessments.


You didn't read or understand the article. The state is not writing the test--they are looking for vendors to write the test or propose already existing tests.


Ah. So, we're replacing one vendor-written test with another vendor-written test.


Either way doesn't matter to me. PARCC needed to go, as it was based on the belief that everyone should and could go to college--NCLB bullshit. If you read the documentation for PARCC, "career ready" is just another way of saying "college"--careers, not jobs. Like the educators quoted in the article, I believe the new test should be adaptive and should test grade-level material. It should be able to assess whether a child is below, at, or above grade level and quickly get that information to teachers and schools so that it's actually useful.


Adaptive and testing grade-level material? I'm not sure how that would work: if the student doesn't know the grade-level material, the test would "adapt" by going lower, and would therefore no longer be testing grade-level material (and vice versa). It is also very difficult to compare students when the test is adaptive and changes for each student. Adaptive tests are great for measuring individual growth and seeing what students know, but they are not great for comparisons: Larla's 100% is not equivalent to Billy's 100%, because they each had different questions measuring different outcomes.


The GRE is an adaptive test, and there are no issues with comparisons. Your outcome is not a percentage correct; it's the difficulty level you converge on.

PARCC is completely experimental. It has not been validated as accurate, and yet it will be used to evaluate students, schools, and teachers.

Standardized test development practice requires that the testing instrument be validated. Such validation requires that the test maker show that the test correlates strongly with other accepted measures of what is being tested, both generally and specifically (that is, with regard to specific materials and/or skills being tested).

In short, PARCC results have no proven correlation to anything. The scores are worthless.