Anonymous
Post 12/08/2025 14:04     Subject: Re:DC's School Report Cards are up. Any surprises?

Anonymous wrote:
Anonymous wrote:See the "DC Report Card Technical Guide" here: https://osse.dc.gov/page/dc-school-report-card-resource-library

p. 53 Student Group Weights

"The accountability system calculates each metric for each student group present in the school. Student
groups with fewer than 10 students for that metric are not included. In these cases, the student groups are
dropped from the overall metric scores. After calculating the student group metric scores, they are
aggregated based on the weights listed in Table 12 below to come up with a single metric score used in the
accountability score calculation.

Table 12: Student Group Weights
Student Group Percentage of Overall Score
All Students 30%
Economically Disadvantaged 40%
Race/Ethnicity 15%
Students with Disabilities 10%
English Learners 5%"


WOAH. So the achievement of economically disadvantaged kids is not only double counted, but actually alone counts more than the achievement of ALL kids??? (But they don't count at all if there aren't 10 students. So the group that's the hardest to grow just doesn't get counted at rich schools?) On top of that, what does "race/ethnicity" mean here? Does it mean Black? Because all of the other variables are a specific group of kids, whereas "Race/ethnicity" is a category with different groups of kids... not a single group. That is, "race/ethnicity" does not have a single measure and so is not able to be averaged in like the other categories. So it may be that on top of everything else, they are counting only the achievements of one specific racial/ethnic group, but they won't even come out and say that. That is INSANE.


They explain that in the next page of the document.

"3. Race/Ethnicity will include the following race/ethnicity groups:
a. American Indian/Alaska Native
b. Asian
c. Black or African American
d. Hispanic or Latino
e. Native Hawaiian or Other Pacific Islander
f. White
g. Two or more races.
4. Race/ethnicity groups with fewer than ten students will be excluded from the calculation. The 15%
race/ethnicity weights will be divided based on how many race/ethnicity groups are counted toward
the accountability score."

So convoluted.
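For anyone trying to follow the mechanics, here's how I read pp. 53-54, sketched in Python. The even split of the 15% race/ethnicity weight is stated in the guide; what happens to the weight of a dropped non-race group (say, fewer than 10 English Learners) is not, so the renormalization at the end is my assumption.

```python
# Sketch of the Table 12 aggregation as described on pp. 53-54 of the
# OSSE technical guide. ASSUMPTION flagged below: the guide doesn't say
# what happens to the weight of a dropped non-race group, so this
# sketch renormalizes over whatever weight remains.

MIN_N = 10
WEIGHTS = {
    "all_students": 0.30,
    "econ_disadvantaged": 0.40,
    "race_ethnicity": 0.15,   # split evenly among counted race groups
    "students_with_disabilities": 0.10,
    "english_learners": 0.05,
}

def metric_score(groups, race_groups):
    """groups: {name: (score, n)} for the four non-race groups.
    race_groups: {race: (score, n)} for the seven race/ethnicity groups."""
    weighted = total_weight = 0.0
    for name, (score, n) in groups.items():
        if n >= MIN_N:                      # groups under 10 are dropped
            weighted += WEIGHTS[name] * score
            total_weight += WEIGHTS[name]
    counted = [score for score, n in race_groups.values() if n >= MIN_N]
    if counted:
        # The 15% is divided among however many race groups are counted.
        per_group = WEIGHTS["race_ethnicity"] / len(counted)
        weighted += per_group * sum(counted)
        total_weight += WEIGHTS["race_ethnicity"]
    # ASSUMPTION: renormalize so dropped groups don't shrink the score.
    return weighted / total_weight if total_weight else 0.0
```

Note what this implies for the "rich schools" point above: if the economically disadvantaged group is under 10 students, its 40% weight simply vanishes and everything else scales up to fill the gap.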
Anonymous
Post 12/08/2025 13:43     Subject: Re:DC's School Report Cards are up. Any surprises?

Anonymous wrote:See the "DC Report Card Technical Guide" here: https://osse.dc.gov/page/dc-school-report-card-resource-library

p. 53 Student Group Weights

"The accountability system calculates each metric for each student group present in the school. Student
groups with fewer than 10 students for that metric are not included. In these cases, the student groups are
dropped from the overall metric scores. After calculating the student group metric scores, they are
aggregated based on the weights listed in Table 12 below to come up with a single metric score used in the
accountability score calculation.

Table 12: Student Group Weights
Student Group Percentage of Overall Score
All Students 30%
Economically Disadvantaged 40%
Race/Ethnicity 15%
Students with Disabilities 10%
English Learners 5%"


WOAH. So the achievement of economically disadvantaged kids is not only double counted, but actually alone counts more than the achievement of ALL kids??? (But they don't count at all if there aren't 10 students. So the group that's the hardest to grow just doesn't get counted at rich schools?) On top of that, what does "race/ethnicity" mean here? Does it mean Black? Because all of the other variables are a specific group of kids, whereas "Race/ethnicity" is a category with different groups of kids... not a single group. That is, "race/ethnicity" does not have a single measure and so is not able to be averaged in like the other categories. So it may be that on top of everything else, they are counting only the achievements of one specific racial/ethnic group, but they won't even come out and say that. That is INSANE.
Anonymous
Post 12/08/2025 13:26     Subject: Re:DC's School Report Cards are up. Any surprises?

Look, people, it's not that hard. Just ignore everything else and look at overall test scores for the school: the percentage of kids testing at grade level and the percentage testing above grade level.

DC manipulates the data to give more weight to at-risk kids, etc., so that these poorly performing schools "look" better than they are. Just ignore that and look directly at test scores.
Anonymous
Post 12/08/2025 12:35     Subject: Re:DC's School Report Cards are up. Any surprises?

See the "DC Report Card Technical Guide" here: https://osse.dc.gov/page/dc-school-report-card-resource-library

p. 53 Student Group Weights

"The accountability system calculates each metric for each student group present in the school. Student
groups with fewer than 10 students for that metric are not included. In these cases, the student groups are
dropped from the overall metric scores. After calculating the student group metric scores, they are
aggregated based on the weights listed in Table 12 below to come up with a single metric score used in the
accountability score calculation.

Table 12: Student Group Weights
Student Group Percentage of Overall Score
All Students 30%
Economically Disadvantaged 40%
Race/Ethnicity 15%
Students with Disabilities 10%
English Learners 5%"
Anonymous
Post 12/08/2025 12:29     Subject: DC's School Report Cards are up. Any surprises?

Just catching up on this thread and this is very disturbing. It essentially feels like this data is manipulated for a very specific purpose but then broadly distributed as though it is "clean" data for doing 1:1 comparisons of schools.

I say this as someone who kept my kid in a school with lower test scores than a school we got into via the lottery, in part because, looking at report card scores and trends, we believed our school was on an upward trajectory and the school we'd lotteried into was on a downward one. Because we'd also had a decent experience at our existing school, we stayed.

Well, that turned out not to be accurate, and two years later we moved to the other school, realizing we'd been compromising our kid's education at the first one. Now that we've been at the other school for a year, it's clear that it is academically leaps and bounds better than our previous school; but as young parents who were just evaluating the metrics we were given (and with only one year of PK under our belt, so limited exposure to academics in school), we just didn't know that.

So it is very problematic if this data is essentially being manipulated to make parents think that struggling schools are doing better than they are. Which I feel is what happened with our school.
Anonymous
Post 12/08/2025 12:07     Subject: DC's School Report Cards are up. Any surprises?

It's particularly crazy because they don't even give you the data they're working with to assign a "score," so it could all just be entirely made up. The chart of percentiles really illustrates this point. It's calculated from non-public data to begin with, but then the assigned scores don't even remotely match that non-public data, so it's apparently adjusted using further undisclosed data with no indication of how. How could OSSE ever be caught out if the data were entirely wrong?
Anonymous
Post 12/08/2025 12:04     Subject: DC's School Report Cards are up. Any surprises?

Anonymous wrote:Aspire DEFINITELY works this way. There are floors and ceilings for each subgroup which leads to a complicated scoring system that emphasizes at-risk, SpED, ELL populations. Is this also true for the Report Card?


It seems like it must be, because L-T gets completely hosed vs Watkins otherwise. But if achievement doesn't actually measure achievement, but instead the achievement of certain groups ... and then growth is comparing achievement scores and so also really measuring the growth of certain groups? Then these Report Cards essentially tell you where to send members of those groups. They should be way more upfront about that.
Anonymous
Post 12/08/2025 11:59     Subject: DC's School Report Cards are up. Any surprises?

Aspire DEFINITELY works this way. There are floors and ceilings for each subgroup which leads to a complicated scoring system that emphasizes at-risk, SpED, ELL populations. Is this also true for the Report Card?
Anonymous
Post 12/08/2025 11:54     Subject: DC's School Report Cards are up. Any surprises?

Something similar happens in the context of achievement scores:

Brent: ELA meeting, exceeding = 76.1% = 9 points
SWS: ELA meeting, exceeding = 74.8% = 9.6 points
Maury: ELA meeting, exceeding = 73.3% = 8.9 points
L-T: ELA meeting, exceeding = 64% = 7.2 points
CHML: ELA meeting, exceeding = 49.4% = 4.2 points
Payne: ELA meeting, exceeding = 47% = 7 points
Watkins: ELA meeting, exceeding = 37.3% = 4.7 points

Again, raw scores aren't telling the whole story. CHML gets hosed vis-a-vis Payne, for instance. (Payne gets nearly the same score as L-T despite a rate 17 points lower.) As does Brent v. SWS. Here it appears that they could be adjusting for population somehow, although that doesn't really explain CHML v. Watkins, so it has to be more than that.
Anonymous
Post 12/08/2025 11:40     Subject: DC's School Report Cards are up. Any surprises?

Anonymous wrote:Either there's something screwy with the data, or the data isn't actually showing overall achievement/growth but is somehow measured relative to population, emphasizing the performance of certain subpopulations. If that's true, so that even "achievement" isn't actually measuring achievement, then this Report Card really isn't terribly valuable for UMC parents.

Here's an example:

L-T is at 70.3 for approaching, meets, exceeds for math.

L-T is at 52.6 for meets, exceeds for math.

Watkins is at 65.9 for approaching, meets, exceeds for math.

Watkins is at 37.3 for meets, exceeds for math.

So, L-T is a little better for approaching/meets/exceeds in math (4.4 points better) and considerably better for meets/exceeds (15.3 points better). If I were a UMC parent looking for a cohort of high performers, L-T clearly has the much bigger one.

However, when you then look at the scores the Report Card assigns toward the overall rating, L-T gets a 2.4 for approaching/meets/exceeds in math and Watkins gets a 3.9. So Watkins does a little worse on the raw rate and gets a score more than 60% higher. That seems... wrong. But then we get to meets/exceeds, where L-T was MUCH better... and L-T gets a 5.5 while Watkins gets a 7.7! So despite L-T having a noticeably higher pass rate, Watkins gets a 40% higher score again.

It's also not some fluke of distribution being counted somehow, because for Watkins:
ELA: 8% 5s and 19% 1s.
Math: 10% 5s and 15% 1s.

For L-T:
ELA: 28% 5s (!) and 11% 1s.
Math: 16% 5s and 17% 1s.

So, at most, L-T has slightly more 1s in math; but way fewer 1s in ELA and WAY more 5s.

L-T does much better in ELA and does a bit better on points, so that at least makes sense.

Then let's look at growth... On raw numbers the two schools appear to perform similarly, but somehow Watkins ends up with 34 points and L-T with 27.1. The most perplexing one is median growth percentile for ELA, where L-T's 62nd percentile earns 8 points while Watkins's 57th percentile earns 8.6 points. WHAT? Can anyone make sense of this?

At the end of the day, L-T has clearly better actual achievement numbers (especially in ELA, where they aren't close) and the schools have very similar growth and absentee numbers. L-T has more kids entering than exiting; Watkins has the reverse. The soft percentage factors I take it don't really count, but they are similar for the schools too... except that L-T is notably better in sense of belonging (76 v 69) and WAY better in safety (62 v 45). Everything else is within 1 point either way; 2 to L-T and 1 to Watkins.

And then we get to the overall scores and Watkins is at 76% and L-T is at 59%... even though the raw numbers clearly give L-T an advantage in achievement and basically a draw in everything else.

How is this possible? Does that mean that the "scores" we're seeing for achievement and growth are actually themselves relative in some totally unexplained way? Or some subpopulation scores are counting more? Because that makes this pretty useless for a family trying to use these Report Cards to make decisions.


Looking at other Hill schools for ELA growth, the points only loosely track the underlying reported measure. How bizarre. And it can't be an adjustment for at-risk population, because Payne is a Title I school that gets basically half as many points as Maury for a percentile only 3 points lower! And CHML is poorer than Maury yet gets 1.7 fewer points for the same percentile.

Brent: ELA median growth percentile = 70% = 10.9 points
SWS: ELA median growth percentile = 69% = 11.6 points
L-T: ELA median growth percentile = 62% = 8 points
Watkins: ELA median growth percentile = 57% = 8.6 points
Maury: ELA median growth percentile = 54% = 8.5 points
CHML: ELA median growth percentile = 54% = 6.8 points
Payne: ELA median growth percentile = 51% = 4.2 points
Anonymous
Post 12/08/2025 11:25     Subject: DC's School Report Cards are up. Any surprises?

Anonymous wrote:Either there's something screwy with the data, or the data isn't actually showing overall achievement/growth but is somehow measured relative to population, emphasizing the performance of certain subpopulations. If that's true, so that even "achievement" isn't actually measuring achievement, then this Report Card really isn't terribly valuable for UMC parents.

Here's an example:

L-T is at 70.3 for approaching, meets, exceeds for math.

L-T is at 52.6 for meets, exceeds for math.

Watkins is at 65.9 for approaching, meets, exceeds for math.

Watkins is at 37.3 for meets, exceeds for math.

So, L-T is a little better for approaching/meets/exceeds in math (4.4 points better) and considerably better for meets/exceeds (15.3 points better). If I were a UMC parent looking for a cohort of high performers, L-T clearly has the much bigger one.

However, when you then look at the scores the Report Card assigns toward the overall rating, L-T gets a 2.4 for approaching/meets/exceeds in math and Watkins gets a 3.9. So Watkins does a little worse on the raw rate and gets a score more than 60% higher. That seems... wrong. But then we get to meets/exceeds, where L-T was MUCH better... and L-T gets a 5.5 while Watkins gets a 7.7! So despite L-T having a noticeably higher pass rate, Watkins gets a 40% higher score again.

It's also not some fluke of distribution being counted somehow, because for Watkins:
ELA: 8% 5s and 19% 1s.
Math: 10% 5s and 15% 1s.

For L-T:
ELA: 28% 5s (!) and 11% 1s.
Math: 16% 5s and 17% 1s.

So, at most, L-T has slightly more 1s in math; but way fewer 1s in ELA and WAY more 5s.

L-T does much better in ELA and does a bit better on points, so that at least makes sense.

Then let's look at growth... On raw numbers the two schools appear to perform similarly, but somehow Watkins ends up with 34 points and L-T with 27.1. The most perplexing one is median growth percentile for ELA, where L-T's 62nd percentile earns 8 points while Watkins's 57th percentile earns 8.6 points. WHAT? Can anyone make sense of this?

At the end of the day, L-T has clearly better actual achievement numbers (especially in ELA, where they aren't close) and the schools have very similar growth and absentee numbers. L-T has more kids entering than exiting; Watkins has the reverse. The soft percentage factors I take it don't really count, but they are similar for the schools too... except that L-T is notably better in sense of belonging (76 v 69) and WAY better in safety (62 v 45). Everything else is within 1 point either way; 2 to L-T and 1 to Watkins.

And then we get to the overall scores and Watkins is at 76% and L-T is at 59%... even though the raw numbers clearly give L-T an advantage in achievement and basically a draw in everything else.

How is this possible? Does that mean that the "scores" we're seeing for achievement and growth are actually themselves relative in some totally unexplained way? Or some subpopulation scores are counting more? Because that makes this pretty useless for a family trying to use these Report Cards to make decisions.


This is a great analysis.

I think that "score" is almost completely useless and I would never use it to make a school decision.

Anonymous
Post 12/08/2025 10:38     Subject: DC's School Report Cards are up. Any surprises?

Either there's something screwy with the data, or the data isn't actually showing overall achievement/growth but is somehow measured relative to population, emphasizing the performance of certain subpopulations. If that's true, so that even "achievement" isn't actually measuring achievement, then this Report Card really isn't terribly valuable for UMC parents.

Here's an example:

L-T is at 70.3 for approaching, meets, exceeds for math.

L-T is at 52.6 for meets, exceeds for math.

Watkins is at 65.9 for approaching, meets, exceeds for math.

Watkins is at 37.3 for meets, exceeds for math.

So, L-T is a little better for approaching/meets/exceeds in math (4.4 points better) and considerably better for meets/exceeds (15.3 points better). If I were a UMC parent looking for a cohort of high performers, L-T clearly has the much bigger one.

However, when you then look at the scores the Report Card assigns toward the overall rating, L-T gets a 2.4 for approaching/meets/exceeds in math and Watkins gets a 3.9. So Watkins does a little worse on the raw rate and gets a score more than 60% higher. That seems... wrong. But then we get to meets/exceeds, where L-T was MUCH better... and L-T gets a 5.5 while Watkins gets a 7.7! So despite L-T having a noticeably higher pass rate, Watkins gets a 40% higher score again.

It's also not some fluke of distribution being counted somehow, because for Watkins:
ELA: 8% 5s and 19% 1s.
Math: 10% 5s and 15% 1s.

For L-T:
ELA: 28% 5s (!) and 11% 1s.
Math: 16% 5s and 17% 1s.

So, at most, L-T has slightly more 1s in math; but way fewer 1s in ELA and WAY more 5s.

L-T does much better in ELA and does a bit better on points, so that at least makes sense.

Then let's look at growth... On raw numbers the two schools appear to perform similarly, but somehow Watkins ends up with 34 points and L-T with 27.1. The most perplexing one is median growth percentile for ELA, where L-T's 62nd percentile earns 8 points while Watkins's 57th percentile earns 8.6 points. WHAT? Can anyone make sense of this?

At the end of the day, L-T has clearly better actual achievement numbers (especially in ELA, where they aren't close) and the schools have very similar growth and absentee numbers. L-T has more kids entering than exiting; Watkins has the reverse. The soft percentage factors I take it don't really count, but they are similar for the schools too... except that L-T is notably better in sense of belonging (76 v 69) and WAY better in safety (62 v 45). Everything else is within 1 point either way; 2 to L-T and 1 to Watkins.

And then we get to the overall scores and Watkins is at 76% and L-T is at 59%... even though the raw numbers clearly give L-T an advantage in achievement and basically a draw in everything else.

How is this possible? Does that mean that the "scores" we're seeing for achievement and growth are actually themselves relative in some totally unexplained way? Or some subpopulation scores are counting more? Because that makes this pretty useless for a family trying to use these Report Cards to make decisions.
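To make the subgroup theory concrete, here's a purely hypothetical illustration using the 30/40/15/10/5 weights quoted upthread. Every subgroup rate below is invented; the actual formula isn't public. The point is only that weighting like this can flip a ranking.

```python
# Hypothetical numbers only: a sketch of how subgroup weighting with
# the 30/40/15/10/5 weights quoted upthread could rank a school with a
# lower overall proficiency rate above one with a higher overall rate.
# None of these subgroup rates are real data.

def overall(all_students, econ, race_avg, swd, ell):
    return (0.30 * all_students + 0.40 * econ +
            0.15 * race_avg + 0.10 * swd + 0.05 * ell)

# "School A": higher overall proficiency, but its economically
# disadvantaged subgroup lags far behind the school average.
school_a = overall(all_students=0.53, econ=0.20, race_avg=0.45,
                   swd=0.25, ell=0.30)

# "School B": lower overall proficiency, but subgroups cluster
# near the school average.
school_b = overall(all_students=0.37, econ=0.35, race_avg=0.38,
                   swd=0.30, ell=0.35)

# School B edges out School A despite an overall rate 16 points lower,
# because the econ-disadvantaged subgroup alone carries 40% of the score.
print(round(school_a, 4), round(school_b, 4))
```

Which would be consistent with what we're seeing: the school whose disadvantaged subgroups perform closer to its average wins, regardless of where the average sits.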
Anonymous
Post 12/08/2025 09:58     Subject: DC's School Report Cards are up. Any surprises?

The report card for my kids' school is interesting. It's OK -- basically mediocre. But the data tells a story of a school serving UMC kids very well and economically disadvantaged kids very poorly with the report card overall then showing a sort of false middle ground that reflects neither accurately.
Anonymous
Post 12/08/2025 08:38     Subject: DC's School Report Cards are up. Any surprises?

Anonymous wrote:
Anonymous wrote:I am surprised by how badly Eliot-Hine performed. There are a lot of threads on DCUM trying to convince me that EH is the equivalent of SH, which this data does not support in the least... SH gets the 84th percentile (more or less equivalent to Hardy) and EH gets the 21st.

EH was substantially behind SH in both scores (basically SH is +20% in every measure) and growth (SH above average for both; EH below for both).

EH also had 35%(!!!) of students chronically absent.

I genuinely do not mean this to bash EH, and I am glad it is getting increased neighborhood buy-in, but this Report Card presents a totally different reality than DCUM does. EH actually came out behind Jefferson, but those two are much closer, and the gap seems to be more about how you weight student achievement vs. growth.


I was also surprised by EH. Those growth scores are… not good. In the context of having a good chunk of kids with room to grow, it suggests the school is not doing a great job. Hopefully it’s just an anomaly.


I don't think EH is doing a great job educating kids; it's just doing an ok job of attracting and retaining more kids who came in prepared.

The JOW scores show the challenges of a renovation. I think the school will rebound, but it doesn't bode well for Brent.

Anonymous
Post 12/07/2025 23:10     Subject: DC's School Report Cards are up. Any surprises?

Anonymous wrote:All of the movement is because the growth scores only measure change over a single year, which is a really silly measure: a blip year can send you skyrocketing and then crashing, or vice versa. Since 2/3 of the points end up based on CAPE scores from one year, the results are always going to be wildly volatile and the percentages somewhat meaningless. Achievement and growth should both be three-year measures, perhaps double-weighting the most recent year.


+1. I was here last year defending my kid's school because the numbers were very low. They're much higher now. I do not think the school significantly improved in that period. It was better than the numbers suggested before but for all I know it's not quite as good as they suggest now.
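For what it's worth, the smoothing the earlier poster proposed is trivial arithmetic. A sketch (the scheme and weights are that poster's suggestion, not anything OSSE actually does):

```python
# Sketch of the fix proposed upthread: average over three years,
# double-weighting the most recent. This is a poster's suggestion,
# not anything OSSE actually does.

def smoothed(y1, y2, y3):
    """y1 = two years ago, y2 = last year, y3 = most recent year."""
    return (y1 + y2 + 2 * y3) / 4

# A blip year moves the smoothed score far less than a single-year
# score does: a school jumping 40 -> 45 -> 75 lands at 58.75, not 75.
print(smoothed(40, 45, 75))
```

The same weighting would dampen a one-year crash too, which is exactly the volatility people are complaining about.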