Robert Frost beats Takoma Park in Mathcounts

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


I've seen this chart before, and it is incredibly interesting because there's a huge swing between different metrics and I'm not sure what it means. So, look at MAP-M vs. PARCC ML. For MAP-M, which you would assume to be a better metric for "kids who will excel at math competitions," the numbers are pretty consistent between schools. So, a school like SSIMS has 50-something and Pyle has 40-something kids who pass whatever threshold MCPS has identified. Ditto the COGAT Q. The numbers are not that disparate between various schools.

But then when you look at PARCC (an easier test), Pyle blows every other school out of the water.

I genuinely don't know what this tells us, objectively, about where the talent is. If you go by MAP-M and COGAT Quantitative, talent seems pretty evenly distributed. But if you go by PARCC, it's not. Which test is right?


Well, PARCC measures learned skills and, like MAP-M, is a better indicator of exposure to advanced curriculum. MAP-M is also considered a better-written test, with better normative data, than PARCC by most education professionals, especially in academia. Cogat measures cognitive ability. If I were measuring raw ability, i.e., talent, I would go with MAP-M and Cogat over PARCC.


I suppose there is a loose correlation between PARCC/MAP and Cogat. Probably a better correlation between PARCC and MAP? In any case, yes, it's odd that the MAP and PARCC levels aren't proportionally similar for Frost, Cabin John, etc. But then what explains the divergence between the MAP and PARCC numbers for the ESs feeding into Pyle? PARCC was used for all kinds of things, so every school's kids had the same incentives to do well.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


Every time someone posts this, I wonder anew what the heck is wrong with Westland. It makes no sense.


Why?
Anonymous
How many students from each school could participate? I thought it was more than 10. My child's teacher (not at either of these schools, but at another MCPS middle school) said that my child had qualified. She said it was the top 16 or 17 students. (My child didn't want to participate, so I never heard anything further, but now I am curious.)
Anonymous
Anonymous wrote:How many students from each school could participate? I thought it was more than 10. My child's teacher (not at either of these schools, but at another MCPS middle school) said that my child had qualified. She said it was the top 16 or 17 students. (My child didn't want to participate, so I never heard anything further, but now I am curious.)


10 official contestants from each school, but there can be additional unofficial ones. For example, TPMS this year had 10 official 8th-grade contestants and some unofficial 6th-grade contestants. Their 7th graders probably did not qualify.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


I've seen this chart before, and it is incredibly interesting because there's a huge swing between different metrics and I'm not sure what it means. So, look at MAP-M vs. PARCC ML. For MAP-M, which you would assume to be a better metric for "kids who will excel at math competitions," the numbers are pretty consistent between schools. So, a school like SSIMS has 50-something and Pyle has 40-something kids who pass whatever threshold MCPS has identified. Ditto the COGAT Q. The numbers are not that disparate between various schools.

But then when you look at PARCC (an easier test), Pyle blows every other school out of the water.

I genuinely don't know what this tells us, objectively, about where the talent is. If you go by MAP-M and COGAT Quantitative, talent seems pretty evenly distributed. But if you go by PARCC, it's not. Which test is right?


Well, PARCC measures learned skills and, like MAP-M, is a better indicator of exposure to advanced curriculum. MAP-M is also considered a better-written test, with better normative data, than PARCC by most education professionals, especially in academia. Cogat measures cognitive ability. If I were measuring raw ability, i.e., talent, I would go with MAP-M and Cogat over PARCC.


I suppose there is a loose correlation between PARCC/MAP and Cogat. Probably a better correlation between PARCC and MAP? In any case, yes, it's odd that the MAP and PARCC levels aren't proportionally similar for Frost, Cabin John, etc. But then what explains the divergence between the MAP and PARCC numbers for the ESs feeding into Pyle? PARCC was used for all kinds of things, so every school's kids had the same incentives to do well.


Also, I would add that the Cogat scores were adjusted per MCPS's cohort/tiered-school process, while PARCC and MAP-M used straight cut-offs. The PARCC cut-off was a score of 4 or 5; MCPS never disclosed the MAP-M cut-off.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


I've seen this chart before, and it is incredibly interesting because there's a huge swing between different metrics and I'm not sure what it means. So, look at MAP-M vs. PARCC ML. For MAP-M, which you would assume to be a better metric for "kids who will excel at math competitions," the numbers are pretty consistent between schools. So, a school like SSIMS has 50-something and Pyle has 40-something kids who pass whatever threshold MCPS has identified. Ditto the COGAT Q. The numbers are not that disparate between various schools.

But then when you look at PARCC (an easier test), Pyle blows every other school out of the water.

I genuinely don't know what this tells us, objectively, about where the talent is. If you go by MAP-M and COGAT Quantitative, talent seems pretty evenly distributed. But if you go by PARCC, it's not. Which test is right?


Well, PARCC measures learned skills and, like MAP-M, is a better indicator of exposure to advanced curriculum. MAP-M is also considered a better-written test, with better normative data, than PARCC by most education professionals, especially in academia. Cogat measures cognitive ability. If I were measuring raw ability, i.e., talent, I would go with MAP-M and Cogat over PARCC.


I suppose there is a loose correlation between PARCC/MAP and Cogat. Probably a better correlation between PARCC and MAP? In any case, yes, it's odd that the MAP and PARCC levels aren't proportionally similar for Frost, Cabin John, etc. But then what explains the divergence between the MAP and PARCC numbers for the ESs feeding into Pyle? PARCC was used for all kinds of things, so every school's kids had the same incentives to do well.

PARCC and MAP are "what you learned". Cogat is "cognitive ability" -- what you are capable of.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


I've seen this chart before, and it is incredibly interesting because there's a huge swing between different metrics and I'm not sure what it means. So, look at MAP-M vs. PARCC ML. For MAP-M, which you would assume to be a better metric for "kids who will excel at math competitions," the numbers are pretty consistent between schools. So, a school like SSIMS has 50-something and Pyle has 40-something kids who pass whatever threshold MCPS has identified. Ditto the COGAT Q. The numbers are not that disparate between various schools.

But then when you look at PARCC (an easier test), Pyle blows every other school out of the water.

I genuinely don't know what this tells us, objectively, about where the talent is. If you go by MAP-M and COGAT Quantitative, talent seems pretty evenly distributed. But if you go by PARCC, it's not. Which test is right?


Well, PARCC measures learned skills and, like MAP-M, is a better indicator of exposure to advanced curriculum. MAP-M is also considered a better-written test, with better normative data, than PARCC by most education professionals, especially in academia. Cogat measures cognitive ability. If I were measuring raw ability, i.e., talent, I would go with MAP-M and Cogat over PARCC.


I suppose there is a loose correlation between PARCC/MAP and Cogat. Probably a better correlation between PARCC and MAP? In any case, yes, it's odd that the MAP and PARCC levels aren't proportionally similar for Frost, Cabin John, etc. But then what explains the divergence between the MAP and PARCC numbers for the ESs feeding into Pyle? PARCC was used for all kinds of things, so every school's kids had the same incentives to do well.

PARCC and MAP are "what you learned". Cogat is "cognitive ability" -- what you are capable of.


Pp here. Thank you.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


I've seen this chart before, and it is incredibly interesting because there's a huge swing between different metrics and I'm not sure what it means. So, look at MAP-M vs. PARCC ML. For MAP-M, which you would assume to be a better metric for "kids who will excel at math competitions," the numbers are pretty consistent between schools. So, a school like SSIMS has 50-something and Pyle has 40-something kids who pass whatever threshold MCPS has identified. Ditto the COGAT Q. The numbers are not that disparate between various schools.

But then when you look at PARCC (an easier test), Pyle blows every other school out of the water.

I genuinely don't know what this tells us, objectively, about where the talent is. If you go by MAP-M and COGAT Quantitative, talent seems pretty evenly distributed. But if you go by PARCC, it's not. Which test is right?


Well, PARCC measures learned skills and, like MAP-M, is a better indicator of exposure to advanced curriculum. MAP-M is also considered a better-written test, with better normative data, than PARCC by most education professionals, especially in academia. Cogat measures cognitive ability. If I were measuring raw ability, i.e., talent, I would go with MAP-M and Cogat over PARCC.


I suppose there is a loose correlation between PARCC/MAP and Cogat. Probably a better correlation between PARCC and MAP? In any case, yes, it's odd that the MAP and PARCC levels aren't proportionally similar for Frost, Cabin John, etc. But then what explains the divergence between the MAP and PARCC numbers for the ESs feeding into Pyle? PARCC was used for all kinds of things, so every school's kids had the same incentives to do well.


Also, I would add that the Cogat scores were adjusted per MCPS's cohort/tiered-school process, while PARCC and MAP-M used straight cut-offs. The PARCC cut-off was a score of 4 or 5; MCPS never disclosed the MAP-M cut-off.

The chart was from before they created the three levels of MCPS-adjusted Cogat scores.
Anonymous
Anonymous wrote:
Anonymous wrote:
FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


The chart was from before they created the three levels of MCPS-adjusted Cogat scores.

I'd love to see the more recent figures.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


Can anyone explain what this chart means? What do the numbers represent? They certainly do not look like test scores.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


Can anyone explain what this chart means? What do the numbers represent? They certainly do not look like test scores.


The numbers represent the cohort of highly able students in each MS cluster that MCPS identified in preparation for changing the MS magnet application process. In a memo or slide deck, MCPS stated that a PARCC score of 4 or 5 was the cutoff. MCPS did not disclose the cutoffs for MAP or Cogat scores. My guess is that kids had to score at the 90th percentile or higher at low- to mid-FARMS schools, but it may have been only the mid-to-high 80s for MAP and/or Cogat at high-FARMS schools. Most of the studies on gifted education and equity that MCPS cites in reports, memos, etc. advocate for 90+%.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


Can anyone explain what this chart means? What do the numbers represent? They certainly do not look like test scores.


As I recall, this may be a chart included in a report sent to the school board looking at the results of changing MS magnet admission from a parent-application process to one with universal screening of all 5th graders. The first year for that new process was fall 2017. The chart shows the number of students at each school identified as "highly able" by MAP, Cogat (the MS magnet admissions test), and PARCC. I am guessing that to have been highly able under any test, a 5th grader must have tested at maybe the 97th percentile or above (but others might know more). The idea was that all the students identified as highly able would benefit from the MS magnet programs, so they rolled out "enriched classes" in math and humanities for these students at their home middle schools. This evolves each year, as, apparently, does the MS magnet selection process.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


Can anyone explain what this chart means? What do the numbers represent? They certainly do not look like test scores.


The numbers represent the cohort of highly able students in each MS cluster that MCPS identified in preparation for changing the MS magnet application process. In a memo or slide deck, MCPS stated that a PARCC score of 4 or 5 was the cutoff. MCPS did not disclose the cutoffs for MAP or Cogat scores. My guess is that kids had to score at the 90th percentile or higher at low- to mid-FARMS schools, but it may have been only the mid-to-high 80s for MAP and/or Cogat at high-FARMS schools. Most of the studies on gifted education and equity that MCPS cites in reports, memos, etc. advocate for 90+%.


I think the information in this chart guided MCPS in creating the bonus for Cogat takers at the third of schools with the highest FARMS rates and the third of schools with the next-highest FARMS rates. They sought to give the schools closer to the same number of "highly able" students. But this chart results from the raw scores/percentiles derived from the central office's fall 2017 screening of all 5th-grade students.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


Can anyone explain what this chart means? What do the numbers represent? They certainly do not look like test scores.


The numbers represent the cohort of highly able students in each MS cluster that MCPS identified in preparation for changing the MS magnet application process. In a memo or slide deck, MCPS stated that a PARCC score of 4 or 5 was the cutoff. MCPS did not disclose the cutoffs for MAP or Cogat scores. My guess is that kids had to score at the 90th percentile or higher at low- to mid-FARMS schools, but it may have been only the mid-to-high 80s for MAP and/or Cogat at high-FARMS schools. Most of the studies on gifted education and equity that MCPS cites in reports, memos, etc. advocate for 90+%.


I think the information in this chart guided MCPS in creating the bonus for Cogat takers at the third of schools with the highest FARMS rates and the third of schools with the next-highest FARMS rates. They sought to give the schools closer to the same number of "highly able" students. But this chart results from the raw scores/percentiles derived from the central office's fall 2017 screening of all 5th-grade students.


That makes sense, but it would be helpful if MCPS provided the cutoff scores used to determine highly able students.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students winning math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.

What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than the other schools that now have higher-performing students. They really should swap the curricula. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.

Clueless

says you. MCPS's own stats show where the highest performers are.

FYI https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf


And does this chart hide the fact that kids at high-poverty schools are only compared to each other, or is it a straight-up comparison across the district? Sorry, can't tell.