Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Well, this is the predictable outcome of no longer admitting the best and brightest into the magnets. I don't think having OOB magnet students win math competitions ever helped the low-income and poor-performing students at TPMS anyway, so the loss of rank and prestige at TPMS in favor of a school on the western side of the county shouldn't matter to MCPS either.
What is ridiculous, though, is that MCPS offers a more rigorous STEM curriculum at TPMS than at the other schools that now have higher-performing students. They really should swap the curriculums. Schools like Frost, Pyle, and wherever the real highest-performing cohort is now located should have the more difficult curriculum, and TPMS should have the enriched curriculum, which would better serve its new cohort with a larger range of academic abilities.
Clueless
says you. MCPS's own stats show where the highest performers are.
FYI
https://www.montgomeryschoolsmd.org/uploadedFiles/schools/msmagnet/about/MS%20Magnet%20Field%20Test%20Data%20by%20Sending%20MS.pdf
I've seen this chart before, and it is incredibly interesting because there's a huge swing between the different metrics, and I'm not sure what it means. Look at MAP-M vs. PARCC ML. For MAP-M, which you would assume to be the better metric for "kids who will excel at math competitions," the numbers are pretty consistent between schools. A school like SSIMS has 50-something kids and Pyle has 40-something kids who pass whatever threshold MCPS has identified. Ditto the CogAT Quantitative. The numbers are not that disparate between the various schools.
But then when you look at PARCC (an easier test), Pyle blows every other school out of the water.
I genuinely don't know what this tells us, objectively, about where the talent is. If you go by MAP-M and COGAT Quantitative, talent seems pretty evenly distributed. But if you go by PARCC, it's not. Which test is right?