The data posted to the FB group was from fall 2021. The locally normed 85th percentile cutoff used for magnet selection changed between the 2021-22 school year and last year, increasing to an unpublicized absolute percentile for low-FARMS schools. You can keep posting that fall 2021 is recent enough for you, but suggesting that those percentiles stayed the same, given the anecdotal evidence to the contrary (all we have, with MCPS withholding the 2022-23 numbers) and reasoned explanations for why the change might have occurred, is disingenuous at best, and shilling for MCPS reticence otherwise.
+1
At our W feeder school, the top 15% are all in the 95th percentile nationally.
Yes, but these days, it's because selection from the pool is a lottery. Your kid could have the highest scores in the entire county and not be picked. You probably have something like a one-in-20 shot if you are in the pool these days.
|
DP. It's literacy. The "curriculum" isn't really important; what matters is how the children react to it. A brilliant, precocious, excited kid can write an amazing essay, and a dull, slow, bored kid can write a bad one, in response to the same book. It's not like math, where you have to change the content to pose harder problems on more topics for advanced kids, while the slower kids need more time to study and review before moving on.
Yes, they released it last year. It's recent data.
The median RIT scores that MCPS keeps updated in Parentview vary by as much as 2 or 3 points for the same grade YoY, but they are essentially stable.
And my anecdotal experience is exactly the opposite of yours. Where does that leave us? Also, who shares their kids’ scores with other adults? So weird. Most of my friends wouldn’t even know their own kids’ scores.
But not last year's data, which was different.
Great. Now we understand that you are hanging your hat on a median, which wouldn't change if the top 15% moved up while those in the middle stayed roughly the same, and that is what the anecdotal evidence suggests actually happened.
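For what it's worth, the arithmetic point is easy to check with a toy example. All of these numbers are invented for illustration, not MCPS data:

```python
import statistics

# Hypothetical scores for 20 students, sorted low to high (NOT real data).
year1 = [60, 62, 64, 65, 66, 68, 70, 71, 72, 73,
         74, 75, 76, 77, 78, 80, 82, 90, 92, 95]

# In "year 2", only the top 3 students (the top 15%) gain 10 points each;
# everyone else is unchanged.
year2 = year1[:-3] + [s + 10 for s in year1[-3:]]

# The median depends only on the middle value(s), so it is identical.
print(statistics.median(year1))  # 73.5
print(statistics.median(year2))  # 73.5
```

A stable median is simply consistent with the top of the distribution moving; it neither confirms nor rules it out.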
That can’t be right. Of course the curriculum matters: the books the students read, the pace at which they’re expected to read them, the way they’re taught to analyze the text, and the methods taught for writing about them. Before they changed the admission standard, the kids in these classes were all among the strongest readers in the county, with MAP-R scores in the high 90s. Now, kids from high-FARMS schools scoring in the 70th percentile are attending. You’re saying that teachers would give a classroom filled with readers who are 2 or 3 grade levels above the average reading level the same novel to read and analyze as the book they’d give a kid who is just on reading level?
DDP. While I see what you are saying, bringing a highly able student up to speed in reading might be easier than in math. Separately, since MAP scores are exposure-based, a highly able kid from a high-FARMS school might simply not have had the exposure, as the teachers needed to address a presumably larger group of more challenged students. That's where the idea of using local norming comes in, but its fidelity to underlying ability is tenuous; better to find and use a more directly ability-based evaluation. In any case, we shouldn't assume that those with MAP scores in the 70th percentile under more difficult conditions are less able than those scoring 95+ with high-performing cohorts and better family supports. However, failing to robustly address the needs of all those 95+ students (or 90+, or 70+ at high-FARMS schools, or students better identified in another manner) is terrible.
It leaves us with the data that MCPS recently released to the MCCPTA group from a year or two ago and the data they routinely publish to parentview.
Absolutely agree. In fact, I said pages ago in this thread that the numbers are similar to the national average, and that DCUM posters would fall over themselves to explain how and why MCPS is special and different. And here we are.