I’m only eyeballing the numbers and guessing about school size based on ES/MS/HS, but I don’t see an obvious trend other than bigger schools generally have higher counts.
What??
That’s some fuzzy math. Wow.
Sure there is, but it has nothing to do with snow.
I am surprised the numbers are so low for staff. Students are probably an undercount (as many parents likely didn’t know how to report), but presumably teachers and staff would have reported.
What is confusing here? Maybe I could have phrased it better, or maybe you're just objecting to MCPS's rationale? Bottom line is that the point of the 5% metric seems to have been to approximate spread and prevalence in the very local community. As in, it's a good way to measure whether there's a lot of COVID floating around among kids and staff in the neighborhood/community in general. This seems correct to me. If 5% of students/staff have tested positive for COVID, why would it not be true that there was high hyper-local community spread? High in this case meaning likely 10% or more.

It's not a perfect metric, but there are no perfect metrics. It does stand to reason that if your literal neighbors, the families your kids play with, etc. have COVID at a rate of at least 5%, and that rate used to be 1% or whatever, then you are witnessing rapidly increasing community spread. Yes, yes, we could get into the weeds about how many play dates happened during winter break or whatever, but it's a fine enough metric. That's because the entire point of it is to compare apples to apples as much as possible.

Before the break, they were measuring how many staff and students reported positive tests and were within a contagious period at any given time. They are simply comparing that number, which they have, to the same measurement now. They are saying that if the same metric goes from 1-2%, which is where my school was, to 5%, then they know they have a problem. The point is less that 5% is some magic number than that it is 3-5x what it used to be.

I'm not sure what other metric should be used. 5% "from spread within school" is much harder, if not impossible, to determine, and it doesn't necessarily mean anything-- the 5% part, anyway. Unless there's some scientific paper that says 5% itself is a tipping point for some reason.
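The comparison being described boils down to a few lines of arithmetic. Here's a minimal sketch; the school size, case counts, and threshold application are hypothetical illustrations, not actual MCPS data or the district's actual formula:

```python
# Sketch of the apples-to-apples comparison: same measurement
# (reported active cases among students/staff), before vs. after.
# All numbers below are made up for illustration.

def positivity_rate(active_reported_cases, total_people):
    """Share of students/staff reporting a positive test and still
    within a contagious window on a given day."""
    return active_reported_cases / total_people

# Hypothetical school of 500 students + staff.
baseline = positivity_rate(7, 500)   # ~1.4% on a typical pre-break day
current = positivity_rate(25, 500)   # 5% now

THRESHOLD = 0.05  # the stated 5% trigger

print(f"baseline: {baseline:.1%}, current: {current:.1%}")
print(f"increase: {current / baseline:.1f}x")
print(f"over threshold: {current >= THRESHOLD}")
```

The point the sketch makes explicit: the trigger fires not because 5% is scientifically special, but because the same metric, measured the same way, has jumped several-fold from its own baseline.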
If it was intended as a proxy for community infection rates, then it wouldn’t be tracked for individual schools. People don't work, eat, socialize, etc. within the boundary of their local elementary school. Whether it is a sensible measure or threshold is a separate issue.
No, not community infection rates, exactly. But hyperlocal community infection rates. Rates among people at the school. Of course people don't eat only in their neighborhoods and so on. But most people do 1) stick closER to home vs. other parts of the DMV or the world and 2) if these people close by have COVID now, it's more likely they will have already infected people (who have not tested positive yet) in their households or neighborhoods than in other places. Not exclusively likely, but more likely. So if you know 5% of kids/staff have it, you know a good chunk of them spread it to others who have tested or will soon test positive, and that those others are at least disproportionately likely to have a connection to the school.

Regardless-- again, all the schools have to go on from before is reported cases among staff and students. So they need a number to compare that to. All they can compare it to is... reported cases among staff and students (from previous months). They can't separate out some group that was only infected in school, because that is impossible to prove, and difficult to even give a range for. In fact, you never could have done that. It has to be all staff and student cases. That's apples to apples. So they can see it averaged 5 people on any given day from September-November, and now it's 25 people. Or whatever.

They could choose not to include anyone who reported over the break, with the rationale that they couldn't have yet spread it directly within the school. But again, you can't say for certain they didn't contract it from either the school or secondarily from the school, from someone who had been in the school, who then may have been a vector, and so on. COVID doesn't usually incubate a full 14 days, but it can. And it hasn't been 14 days since kids were in school. Regardless, they very clearly (IMO) stated that they were counting cases reported over break in the 5%. That really isn't in much question now, is it?
It's not an awesome metric, but it's not a terrible one, and I don't see a much better one that can be directly compared to the past. To measure only spread that is occurring in the schools post-break, you must, by definition, wait until it spreads in schools after the break. In theory, one could say this is fine-- it might not spread in schools after the break! But that really does defy all logic and everything we know about omicron at this point. The spread could be significant to disastrous, but it's not going to be mild or nonexistent.

Again, refreakinggardless, you can't prove that, if a kid tests positive this Thursday, they got it in school. You can't. How about if they test positive on Tuesday? Odds are actually good they didn't get it in school. But that's okay to count because...? There's only one real way to compare apples to apples here, and that's what they're doing. Even if the apples are crappy and incomplete. At least they're both apples.
Actually correcting myself to say that if a kid tests positive on Tuesday, there is a 99.99999% certainty they didn't get it at the school, at least not after reopening, and if they test positive Thursday, it's also pretty unlikely they got it at the school post-reopening.
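The timing argument above can be sketched out with dates. The reopening date and the minimum-incubation figure below are assumptions for illustration (incubation periods vary; two days is a rough lower bound, not a sourced number):

```python
# Sketch of the timing argument: a positive test can only possibly
# reflect post-reopening in-school spread if at least a minimal
# incubation period has elapsed since schools reopened.
# REOPEN date and MIN_INCUBATION are illustrative assumptions.
from datetime import date, timedelta

REOPEN = date(2022, 1, 3)           # hypothetical Monday reopening
MIN_INCUBATION = timedelta(days=2)  # rough lower bound, assumed

def could_be_in_school_spread(test_date):
    """True only if enough time has passed since reopening for an
    in-school infection to plausibly show up as a positive test."""
    return test_date - REOPEN >= MIN_INCUBATION

print(could_be_in_school_spread(date(2022, 1, 4)))  # Tuesday: False
print(could_be_in_school_spread(date(2022, 1, 6)))  # Thursday: True,
# though "possible" is still far from "likely" that early in the week
```

Note the asymmetry the sketch captures: a Tuesday positive essentially rules out post-reopening school spread, while a Thursday positive merely fails to rule it out.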