How common is a math or reading MAP score at the 99th percentile in this area?

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


Strange, I’ve heard the opposite from teachers who actually administer the tests.


Recent MCPS data has been published. They also share their district averages, which are, as it turns out, very similar to national norms. You can go with known facts or believe in gossip. This is a simple choice.


Recent MCPS data has been published...where? (Please provide a direct link or specific, easily replicable steps for access.) For which year? About which measures? In what level of detail?

If the only new piece of data being bandied about here is an MCPS average, it severely limits the conclusions that might be drawn. It certainly does not allow reliable conclusions about the distribution or about changes from year to year.

MAP scores call attention for a few reasons. Among them are:

-- Pride about high scores, whether about one's child or one's school (this can be misplaced, especially when interpreting it directly as indicative of ability, rather than achievement, and I'd suggest folks keep that to themselves in any case).

-- Concern about low scores (while I wouldn't advise ignoring that, and would suggest touching ground with a student's teacher, I'd also cushion this with an understanding of the inherent variability of individual, single-point-in-time scores for such tests, among the reasons for the "misplaced" note, above)

-- Interest in subscores and growth (which might guide more individualized teaching; this is among the most appropriate uses of MAP testing, but may not be well implemented across schools and classes)

-- Interest in average scores and distributions on a school- or county-wide basis (another relatively appropriate use, as long as large enough data sets are considered, especially longitudinally across several test periods, and evaluated in the light of other, idiosyncratic factors, especially for individual schools)

-- Concern about how a student's score might be evaluated as part of MCPS criteria-based decisions such as GT designation, magnet placement and eligibility for enriched/accelerated programming (this is usually the sticky wicket, here and elsewhere; MCPS tries very hard to keep from disclosure of related information because of the high levels of interest in these decisions, given the known limitations of their approaches, both to identification and to programming)


Sure, they didn't publish data from yesterday, but the FB data is recent enough that we can get a clear picture of things.


The data posted to the FB group was from Fall of 2021. The locally normed 85th %ile used for magnet selection changed between the 2021-22 school year and last year, increasing to an unpublicized absolute %ile for low-FARMS schools.

You can keep posting that Fall of 2021 was recent enough for you, but suggesting that those percentiles stayed the same is disingenuous at best, given the anecdotal evidence to the contrary (all we have, with MCPS withholding the 2022-23 numbers) and reasoned explanations for why the change might have occurred; otherwise, it amounts to shilling for MCPS reticence.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


Strange, I’ve heard the opposite from teachers who actually administer the tests.


Recent MCPS data has been published. They also share their district averages, which are, as it turns out, very similar to national norms. You can go with known facts or believe in gossip. This is a simple choice.


Oh, I go with the facts; not sure why you'd think otherwise. I've said so repeatedly on this thread. I was just pointing out that my conversations with teachers also support the facts.


Sorry, I meant that to support your viewpoint. Now, I can only guess at specifics, but based on what we know, a half-dozen out of 100 may score at the 99th percentile at a low-FARMS school, and although this is hardly every kid, it does exceed national norms.


I think this debate you're having all depends on where people are coming from. At our ES - a W feeder with low FARMS - I'm not kidding when I say that nearly every parent we talk to about MAP scores reports that their DC got something at or close to 99%. We're of course self-selecting who we discuss this with, but until threads like this one, I assumed that MAP was a fairly easy test to get a 99th percentile on. I'd be genuinely troubled if any of my kids scored less than 95th percentile or so. Not because I think they're super smart but because my impression is that everyone can score that high. I'm sure there are teachers at our school who would characterize it as pretty common too. District-wide, it's another story completely. But when people ask "how common is it for x to happen" I think most people answer based on their personal experience and not a nuanced look at the really-hard-to-find data.


+1
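For what it's worth, the quoted "half-dozen out of 100" figure can be sanity-checked against national norms with a quick binomial tail calculation. The 1% base rate follows from the definition of the 99th percentile; the cohort size of 100 is the quoted figure, and nothing here uses real MCPS data:

```python
from math import comb

n = 100   # cohort size, as quoted above
p = 0.01  # under national norms, 1 in 100 scores at the 99th percentile

# probability of seeing 6 or more 99th-percentile scorers in a cohort
# of 100 if the school exactly matched national norms
tail = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(6, n + 1))
print(f"{tail:.4f}")
```

A tail probability well under 0.1% means that, if the half-dozen figure is accurate, such a school really is far above national norms rather than a sampling fluke.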
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


Strange, I’ve heard the opposite from teachers who actually administer the tests.


Recent MCPS data has been published. They also share their district averages, which are, as it turns out, very similar to national norms. You can go with known facts or believe in gossip. This is a simple choice.


Oh, I go with the facts; not sure why you'd think otherwise. I've said so repeatedly on this thread. I was just pointing out that my conversations with teachers also support the facts.


Sorry, I meant that to support your viewpoint. Now, I can only guess at specifics, but based on what we know, a half-dozen out of 100 may score at the 99th percentile at a low-FARMS school, and although this is hardly every kid, it does exceed national norms.


I think this debate you're having all depends on where people are coming from. At our ES - a W feeder with low FARMS - I'm not kidding when I say that nearly every parent we talk to about MAP scores reports that their DC got something at or close to 99%. We're of course self-selecting who we discuss this with, but until threads like this one, I assumed that MAP was a fairly easy test to get a 99th percentile on. I'd be genuinely troubled if any of my kids scored less than 95th percentile or so. Not because I think they're super smart but because my impression is that everyone can score that high. I'm sure there are teachers at our school who would characterize it as pretty common too. District-wide, it's another story completely. But when people ask "how common is it for x to happen" I think most people answer based on their personal experience and not a nuanced look at the really-hard-to-find data.

At our W feeder school, the top 15% are all at or above the 95th percentile nationally.
Anonymous
Anonymous wrote:I don’t know how concentrated they are in this area, but my kid has had 99th percentile on MAPs consistently and 99th percentile on CoGat and did not get into the CES gifted program.


Yes, but these days, it's because selection from the pool is a lottery. Your kid could have the highest scores in the entire county and not be picked. You probably have something like a one-in-20 shot if you are in the pool these days.
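The "one-in-20" figure above is just seats divided by pool size in a uniform lottery. The numbers below are purely illustrative assumptions, not published MCPS counts:

```python
seats = 100   # hypothetical number of program seats (assumption)
pool = 2000   # hypothetical number of qualified students in the pool (assumption)

# in a uniform lottery, each pool member's chance is simply seats / pool,
# regardless of how high their scores are relative to others in the pool
chance = seats / pool
print(f"{chance:.0%}")  # 5%, i.e. roughly one in 20
```

That's the point being made: once you clear the pool cutoff, further score differences don't change your odds at all.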
Anonymous
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


A teacher at a CES? If so, could you comment on whether the curriculum and rigor of the program has changed after MCPS lowered the admission standard so much for kids from some schools and switched to a lottery?



DP.

It's literacy. The "curriculum" isn't really important. What matters is how the children react to it. A brilliant, precocious, excited kid can write an amazing essay, and a dull, slow, bored kid can write a bad one, in response to the same book.


It's not like math where you have to change the content to pose harder problems on more topics for advanced kids, while the slower kids need more time to study and review before moving on.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


Strange, I’ve heard the opposite from teachers who actually administer the tests.


Recent MCPS data has been published. They also share their district averages, which are, as it turns out, very similar to national norms. You can go with known facts or believe in gossip. This is a simple choice.


Recent MCPS data has been published...where? (Please provide a direct link or specific, easily replicable steps for access.) For which year? About which measures? In what level of detail?

If the only new piece of data being bandied about here is an MCPS average, it severely limits the conclusions that might be drawn. It certainly does not allow reliable conclusions about the distribution or about changes from year to year.

MAP scores call attention for a few reasons. Among them are:

-- Pride about high scores, whether about one's child or one's school (this can be misplaced, especially when interpreting it directly as indicative of ability, rather than achievement, and I'd suggest folks keep that to themselves in any case).

-- Concern about low scores (while I wouldn't advise ignoring that, and would suggest touching ground with a student's teacher, I'd also cushion this with an understanding of the inherent variability of individual, single-point-in-time scores for such tests, among the reasons for the "misplaced" note, above)

-- Interest in subscores and growth (which might guide more individualized teaching; this is among the most appropriate uses of MAP testing, but may not be well implemented across schools and classes)

-- Interest in average scores and distributions on a school- or county-wide basis (another relatively appropriate use, as long as large enough data sets are considered, especially longitudinally across several test periods, and evaluated in the light of other, idiosyncratic factors, especially for individual schools)

-- Concern about how a student's score might be evaluated as part of MCPS criteria-based decisions such as GT designation, magnet placement and eligibility for enriched/accelerated programming (this is usually the sticky wicket, here and elsewhere; MCPS tries very hard to keep from disclosure of related information because of the high levels of interest in these decisions, given the known limitations of their approaches, both to identification and to programming)


Sure, they didn't publish data from yesterday, but the FB data is recent enough that we can get a clear picture of things.


The data posted to the FB group was from Fall of 2021. The locally normed 85th %ile used for magnet selection changed between the 2021-22 school year and last year, increasing to an unpublicized absolute %ile for low-FARMS schools.

You can keep posting that Fall of 2021 was recent enough for you, but suggesting that those percentiles stayed the same is disingenuous at best, given the anecdotal evidence to the contrary (all we have, with MCPS withholding the 2022-23 numbers) and reasoned explanations for why the change might have occurred; otherwise, it amounts to shilling for MCPS reticence.


Yes, they released it last year. It's recent data.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


Strange, I’ve heard the opposite from teachers who actually administer the tests.


Recent MCPS data has been published. They also share their district averages, which are, as it turns out, very similar to national norms. You can go with known facts or believe in gossip. This is a simple choice.


Recent MCPS data has been published...where? (Please provide a direct link or specific, easily replicable steps for access.) For which year? About which measures? In what level of detail?

If the only new piece of data being bandied about here is an MCPS average, it severely limits the conclusions that might be drawn. It certainly does not allow reliable conclusions about the distribution or about changes from year to year.

MAP scores call attention for a few reasons. Among them are:

-- Pride about high scores, whether about one's child or one's school (this can be misplaced, especially when interpreting it directly as indicative of ability, rather than achievement, and I'd suggest folks keep that to themselves in any case).

-- Concern about low scores (while I wouldn't advise ignoring that, and would suggest touching ground with a student's teacher, I'd also cushion this with an understanding of the inherent variability of individual, single-point-in-time scores for such tests, among the reasons for the "misplaced" note, above)

-- Interest in subscores and growth (which might guide more individualized teaching; this is among the most appropriate uses of MAP testing, but may not be well implemented across schools and classes)

-- Interest in average scores and distributions on a school- or county-wide basis (another relatively appropriate use, as long as large enough data sets are considered, especially longitudinally across several test periods, and evaluated in the light of other, idiosyncratic factors, especially for individual schools)

-- Concern about how a student's score might be evaluated as part of MCPS criteria-based decisions such as GT designation, magnet placement and eligibility for enriched/accelerated programming (this is usually the sticky wicket, here and elsewhere; MCPS tries very hard to keep from disclosure of related information because of the high levels of interest in these decisions, given the known limitations of their approaches, both to identification and to programming)


Sure, they didn't publish data from yesterday, but the FB data is recent enough that we can get a clear picture of things.


The data posted to the FB group was from Fall of 2021. The locally normed 85th %ile used for magnet selection changed between the 2021-22 school year and last year, increasing to an unpublicized absolute %ile for low-FARMS schools.

You can keep posting that Fall of 2021 was recent enough for you, but suggesting that those percentiles stayed the same is disingenuous at best, given the anecdotal evidence to the contrary (all we have, with MCPS withholding the 2022-23 numbers) and reasoned explanations for why the change might have occurred; otherwise, it amounts to shilling for MCPS reticence.


The median RIT scores MCPS keeps updating in Parentview appear to vary by as much as 2 or 3 points for the same grade YoY, but overall they appear stable.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


Strange, I’ve heard the opposite from teachers who actually administer the tests.


Recent MCPS data has been published. They also share their district averages, which are, as it turns out, very similar to national norms. You can go with known facts or believe in gossip. This is a simple choice.


Oh, I go with the facts; not sure why you'd think otherwise. I've said so repeatedly on this thread. I was just pointing out that my conversations with teachers also support the facts.


Sorry, I meant that to support your viewpoint. Now, I can only guess at specifics, but based on what we know, a half-dozen out of 100 may score at the 99th percentile at a low-FARMS school, and although this is hardly every kid, it does exceed national norms.


I think this debate you're having all depends on where people are coming from. At our ES - a W feeder with low FARMS - I'm not kidding when I say that nearly every parent we talk to about MAP scores reports that their DC got something at or close to 99%. We're of course self-selecting who we discuss this with, but until threads like this one, I assumed that MAP was a fairly easy test to get a 99th percentile on. I'd be genuinely troubled if any of my kids scored less than 95th percentile or so. Not because I think they're super smart but because my impression is that everyone can score that high. I'm sure there are teachers at our school who would characterize it as pretty common too. District-wide, it's another story completely. But when people ask "how common is it for x to happen" I think most people answer based on their personal experience and not a nuanced look at the really-hard-to-find data.


And my anecdotal experience is exactly the opposite of your anecdotal experience. Where does that leave us? Also, who shares their kids' scores with other adults? So weird. Most of my friends wouldn't even know their kids' scores.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


Strange, I’ve heard the opposite from teachers who actually administer the tests.


Recent MCPS data has been published. They also share their district averages, which are, as it turns out, very similar to national norms. You can go with known facts or believe in gossip. This is a simple choice.


Recent MCPS data has been published...where? (Please provide a direct link or specific, easily replicable steps for access.) For which year? About which measures? In what level of detail?

If the only new piece of data being bandied about here is an MCPS average, it severely limits the conclusions that might be drawn. It certainly does not allow reliable conclusions about the distribution or about changes from year to year.

MAP scores call attention for a few reasons. Among them are:

-- Pride about high scores, whether about one's child or one's school (this can be misplaced, especially when interpreting it directly as indicative of ability, rather than achievement, and I'd suggest folks keep that to themselves in any case).

-- Concern about low scores (while I wouldn't advise ignoring that, and would suggest touching ground with a student's teacher, I'd also cushion this with an understanding of the inherent variability of individual, single-point-in-time scores for such tests, among the reasons for the "misplaced" note, above)

-- Interest in subscores and growth (which might guide more individualized teaching; this is among the most appropriate uses of MAP testing, but may not be well implemented across schools and classes)

-- Interest in average scores and distributions on a school- or county-wide basis (another relatively appropriate use, as long as large enough data sets are considered, especially longitudinally across several test periods, and evaluated in the light of other, idiosyncratic factors, especially for individual schools)

-- Concern about how a student's score might be evaluated as part of MCPS criteria-based decisions such as GT designation, magnet placement and eligibility for enriched/accelerated programming (this is usually the sticky wicket, here and elsewhere; MCPS tries very hard to keep from disclosure of related information because of the high levels of interest in these decisions, given the known limitations of their approaches, both to identification and to programming)


Sure, they didn't publish data from yesterday, but the FB data is recent enough that we can get a clear picture of things.


The data posted to the FB group was from Fall of 2021. The locally normed 85th %ile used for magnet selection changed between the 2021-22 school year and last year, increasing to an unpublicized absolute %ile for low-FARMS schools.

You can keep posting that Fall of 2021 was recent enough for you, but suggesting that those percentiles stayed the same is disingenuous at best, given the anecdotal evidence to the contrary (all we have, with MCPS withholding the 2022-23 numbers) and reasoned explanations for why the change might have occurred; otherwise, it amounts to shilling for MCPS reticence.


Yes, they released it last year. It's recent data.


But not last year's data, which was different.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


Strange, I’ve heard the opposite from teachers who actually administer the tests.


Recent MCPS data has been published. They also share their district averages, which are, as it turns out, very similar to national norms. You can go with known facts or believe in gossip. This is a simple choice.


Recent MCPS data has been published...where? (Please provide a direct link or specific, easily replicable steps for access.) For which year? About which measures? In what level of detail?

If the only new piece of data being bandied about here is an MCPS average, it severely limits the conclusions that might be drawn. It certainly does not allow reliable conclusions about the distribution or about changes from year to year.

MAP scores call attention for a few reasons. Among them are:

-- Pride about high scores, whether about one's child or one's school (this can be misplaced, especially when interpreting it directly as indicative of ability, rather than achievement, and I'd suggest folks keep that to themselves in any case).

-- Concern about low scores (while I wouldn't advise ignoring that, and would suggest touching ground with a student's teacher, I'd also cushion this with an understanding of the inherent variability of individual, single-point-in-time scores for such tests, among the reasons for the "misplaced" note, above)

-- Interest in subscores and growth (which might guide more individualized teaching; this is among the most appropriate uses of MAP testing, but may not be well implemented across schools and classes)

-- Interest in average scores and distributions on a school- or county-wide basis (another relatively appropriate use, as long as large enough data sets are considered, especially longitudinally across several test periods, and evaluated in the light of other, idiosyncratic factors, especially for individual schools)

-- Concern about how a student's score might be evaluated as part of MCPS criteria-based decisions such as GT designation, magnet placement and eligibility for enriched/accelerated programming (this is usually the sticky wicket, here and elsewhere; MCPS tries very hard to keep from disclosure of related information because of the high levels of interest in these decisions, given the known limitations of their approaches, both to identification and to programming)


Sure, they didn't publish data from yesterday, but the FB data is recent enough that we can get a clear picture of things.


The data posted to the FB group was from Fall of 2021. The locally normed 85th %ile used for magnet selection changed between the 2021-22 school year and last year, increasing to an unpublicized absolute %ile for low-FARMS schools.

You can keep posting that Fall of 2021 was recent enough for you, but suggesting that those percentiles stayed the same is disingenuous at best, given the anecdotal evidence to the contrary (all we have, with MCPS withholding the 2022-23 numbers) and reasoned explanations for why the change might have occurred; otherwise, it amounts to shilling for MCPS reticence.


The median RIT scores MCPS keeps updating in Parentview appear to vary by as much as 2 or 3 points for the same grade YoY, but overall they appear stable.


Great. Now we understand that you are hanging your hat on a median, which wouldn't change if the top 15% moved up while those in the middle stayed roughly the same; given the anecdotal evidence, that is more likely what happened.
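The statistical point here is easy to demonstrate: a median is insensitive to movement in the upper tail. A toy sketch with 100 made-up scores (not real RIT data):

```python
import statistics

scores = list(range(200, 300))  # 100 made-up scores, evenly spread
before = statistics.median(scores)

# move the top 15% of scores up by 10 points; leave the rest alone
shifted = [s + 10 if s >= 285 else s for s in scores]
after = statistics.median(shifted)

print(before, after)  # the median is identical in both cases
```

So a stable median genuinely says nothing about whether the 85th or 99th percentile cutoffs moved.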
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


A teacher at a CES? If so, could you comment on whether the curriculum and rigor of the program has changed after MCPS lowered the admission standard so much for kids from some schools and switched to a lottery?



DP.

It's literacy. The "curriculum" isn't really important. What matters is how the children react to it. A brilliant, precocious, excited kid can write an amazing essay, and a dull, slow, bored kid can write a bad one, in response to the same book.


It's not like math where you have to change the content to pose harder problems on more topics for advanced kids, while the slower kids need more time to study and review before moving on.


That can’t be right. Of course the curriculum matters. As in, the books the students read, the pace at which they’re expected to read them, the way they’re taught to analyze the text and the methods taught to write about them.

Before they changed the admission standard, the kids in these classes were all among the strongest readers in the county, with MAP-R scores in the high 90s. Now, kids from high-FARMS schools scoring in the 70th percentile are attending. You're saying that teachers would give a classroom full of readers who are 2 or 3 grade levels above the average reading level the same novel to read and analyze as they'd give a kid who is just on reading level?
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


A teacher at a CES? If so, could you comment on whether the curriculum and rigor of the program has changed after MCPS lowered the admission standard so much for kids from some schools and switched to a lottery?



DP.

It's literacy. The "curriculum" isn't really important. What matters is how the children react to it. A brilliant, precocious, excited kid can write an amazing essay, and a dull, slow, bored kid can write a bad one, in response to the same book.


It's not like math where you have to change the content to pose harder problems on more topics for advanced kids, while the slower kids need more time to study and review before moving on.


That can’t be right. Of course the curriculum matters. As in, the books the students read, the pace at which they’re expected to read them, the way they’re taught to analyze the text and the methods taught to write about them.

Before they changed the admission standard, the kids in these classes were all among the strongest readers in the county, with MAP-R scores in the high 90s. Now, kids from high-FARMS schools scoring in the 70th percentile are attending. You're saying that teachers would give a classroom full of readers who are 2 or 3 grade levels above the average reading level the same novel to read and analyze as they'd give a kid who is just on reading level?


DDP. While I see what you are saying, bringing a highly able student up to speed in reading might be easier than in math.

Separately, since MAP scores are exposure-based, a highly able kid from a high FARMS school might simply not have had the exposure, as the teachers needed to address the presumably larger group of more challenged students.

That's where the idea of using local norming comes in, but its fidelity to underlying ability is tenuous. Better to find and use a more directly ability-based evaluation.

In any case, we shouldn't assume that those with MAP scores in the 70th %ile under more difficult conditions are less able than those scoring 95+ with high-performing cohorts and better family supports. However, failing to robustly address the needs of all those 95+ students (or 90+, or 70+ at high FARMS, or those better identified in another manner) is terrible.
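Local norming, as discussed above, just means computing a percentile cutoff within a single school's cohort rather than against national norms. A minimal sketch; the RIT values are invented for illustration, not real data:

```python
import statistics

# hypothetical RIT scores for one school's grade cohort (invented numbers)
cohort = [195, 201, 204, 207, 210, 212, 215, 218, 221, 225,
          228, 230, 233, 236, 240, 243, 247, 250, 254, 260]

# statistics.quantiles with n=100 returns the 99 percentile cut points;
# index 84 is the locally normed 85th percentile for this cohort
local_85th = statistics.quantiles(cohort, n=100)[84]
print(local_85th)
```

The same raw score can clear the local cutoff at one school and miss it at another, which is exactly the fidelity-to-ability problem described above.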
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


Strange, I’ve heard the opposite from teachers who actually administer the tests.


Recent MCPS data has been published. They also share their district averages, which are, as it turns out, very similar to national norms. You can go with known facts or believe in gossip. This is a simple choice.


Oh, I go with the facts; not sure why you'd think otherwise. I've said so repeatedly on this thread. I was just pointing out that my conversations with teachers also support the facts.


Sorry, I meant that to support your viewpoint. Now, I can only guess at specifics, but based on what we know, a half-dozen out of 100 may score at the 99th percentile at a low-FARMS school, and although this is hardly every kid, it does exceed national norms.


I think this debate you're having all depends on where people are coming from. At our ES - a W feeder with low FARMS - I'm not kidding when I say that nearly every parent we talk to about MAP scores reports that their DC got something at or close to 99%. We're of course self-selecting who we discuss this with, but until threads like this one, I assumed that MAP was a fairly easy test to get a 99th percentile on. I'd be genuinely troubled if any of my kids scored less than 95th percentile or so. Not because I think they're super smart but because my impression is that everyone can score that high. I'm sure there are teachers at our school who would characterize it as pretty common too. District-wide, it's another story completely. But when people ask "how common is it for x to happen" I think most people answer based on their personal experience and not a nuanced look at the really-hard-to-find data.


And my anecdotal experience is exactly the opposite of your anecdotal experience. Where does that leave us? Also, who shares their kids' scores with other adults? So weird. Most of my friends wouldn't even know their kids' scores.


It leaves us with the data that MCPS recently released to the MCCPTA group from a year or two ago and the data they routinely publish to parentview.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Very common and I’m a teacher


Strange, I’ve heard the opposite from teachers who actually administer the tests.


Recent MCPS data has been published. They also share their district averages, which are, as it turns out, very similar to national norms. You can go with known facts or believe in gossip. This is a simple choice.


Oh, I go with the facts; not sure why you'd think otherwise. I've said so repeatedly on this thread. I was just pointing out that my conversations with teachers also support the facts.


Sorry, I meant that to support your viewpoint. Now, I can only guess at specifics, but based on what we know, a half-dozen out of 100 may score at the 99th percentile at a low-FARMS school, and although this is hardly every kid, it does exceed national norms.


I think this debate you're having all depends on where people are coming from. At our ES - a W feeder with low FARMS - I'm not kidding when I say that nearly every parent we talk to about MAP scores reports that their DC got something at or close to 99%. We're of course self-selecting who we discuss this with, but until threads like this one, I assumed that MAP was a fairly easy test to get a 99th percentile on. I'd be genuinely troubled if any of my kids scored less than 95th percentile or so. Not because I think they're super smart but because my impression is that everyone can score that high. I'm sure there are teachers at our school who would characterize it as pretty common too. District-wide, it's another story completely. But when people ask "how common is it for x to happen" I think most people answer based on their personal experience and not a nuanced look at the really-hard-to-find data.


And my anecdotal experience is exactly the opposite of your anecdotal experience. Where does that leave us? Also, who shares their kids' scores with other adults? So weird. Most of my friends wouldn't even know their kids' scores.


It leaves us with the data that MCPS recently released to the MCCPTA group from a year or two ago and the data they routinely publish to parentview.


Absolutely agree. In fact, I said pages ago in this thread that the numbers are similar to the national average and that DCUM posters would fall over themselves to explain how and why MCPS is special and different, and here we are.
Forum Index » Montgomery County Public Schools (MCPS)