| Ok, someone in the know please help me interpret this (my child has not yet taken SOL tests in FCPS). One thing I noticed in the data is that in 2022-23, compared to 2018-19, the counts (# of test takers) are way down, and the drop varies by group. I don't know if this reflects actual demographic changes or what, but it's weird. For example, at Langley, the count of Asian test takers drops by around 12% from 2018-19 to 2022-23, but for White test takers it drops by 43%! No wonder their pass-advanced rates overall didn't drop too much at Langley! (As expected, Asians show higher pass-advanced rates in all the schools.) For TJ, the change in counts does not favor the school (unlike the other schools): the counts for math and science drop by 53% for Asians, whereas the counts for White test takers drop by 40%. If there is selective opting out of the SOL tests (or some other manipulation), there's really no way to interpret the changes (or rankings, for that matter). Why would there be many fewer test takers in 2022-23, and why would it vary by group (like White vs. Asian)? |
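The worry here can be made concrete with a small sketch. All numbers below are hypothetical except the rough drop percentages quoted above: if the higher-scoring subgroup shrinks less than the lower-scoring one, a school's overall pass-advanced rate can hold steady, or even tick up, even while every subgroup's rate falls.

```python
# Hypothetical mix-shift illustration: both subgroups' pass-advanced
# rates FALL, yet the school's overall rate barely moves because the
# higher-scoring group shrinks less. All figures are made up.

def overall_rate(groups):
    """groups: list of (test_takers, pass_advanced_rate) tuples."""
    passed = sum(n * r for n, r in groups)
    total = sum(n for n, _ in groups)
    return passed / total

# 2018-19: (count, pass-advanced rate) for two hypothetical subgroups
before = [(200, 0.60), (500, 0.25)]
# 2022-23: the higher-scoring group drops ~12%, the other ~43%
# (the rough drops quoted above), and BOTH groups' rates decline
after = [(176, 0.57), (285, 0.22)]

print(round(overall_rate(before), 3))  # 0.35
print(round(overall_rate(after), 3))   # 0.354 -- slightly HIGHER overall
```

So a flat overall number can hide declines in every group when participation shifts like this.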
Remember, SOL requirements changed. Years ago students took SOLs in Algebra, Geometry and Alg2. This means more tests were taken in high school and pass rates were higher. Now they take one and are done. The only students in my Alg 2 classes that take the SOL are 9th graders or students who just moved here from another state. This is why the overall math pass rate is down. Stronger students don’t continue to take them after passing 1 test. |
|
There are many nuances to this that most people don't know, which makes it hard to use this data as an indication of anything.
Like the poster above said, students don't take as many SOLs as they used to. Certain ones are tests most students take (Bio, Alg 1), while others are pretty much only taken by students who failed those two or have transferred in (Chemistry, Earth Science, Alg 2, etc.). Or they are taken only for federal participation and the student doesn't actually have to pass (Alg 2 or Geometry in some circumstances). Finally, substitute tests are being used much more often than they used to be, which is part of the reason there are fewer SOL test takers now than there used to be (e.g., using a PSAT subscore instead of an SOL). Once you know all the variables, the data doesn't say what you might think it does.
Thank you. I am relatively new to Fairfax County so I don't know all the changes. When did they occur? I was under the assumption that all students take the same tests annually and that this is a requirement. I thought part of the point was to know how schools are doing (accountability), but I don't see how that is possible if there's so much variability, in the ways you describe, in who actually takes the tests, which tests they take, and when.

This page seems to imply tests are federally required annually in math, science, and English, but I suppose it is outdated or otherwise poorly worded: https://www.fcps.edu/student-tests-and-assessments/student-assessment-details/sol-assessments

I see elsewhere that "Federal law requires schools to provide students one SOL test in reading, in mathematics, and in science during grades 9-12. These assessments provide information about student achievement to the parent/guardian, school, division, and state." (https://www.fcps.edu/node/43069)

About substitute tests, FCPS says "However, substitute tests do not fulfill federal participation requirements. Students will still be offered a SOL test when they enroll in a SOL-associated English, Biology, or Mathematics course."
I don't understand why more tests would mean higher pass rates. Do you mean because students were more used to being tested frequently and therefore more prepared? Also, a very naive question: when you look at math SOL scores, does this basically lump together the SOLs for Alg 1, Alg 2, and Geometry? |
|
Right, so for example.... a student gets a qualifying substitute test score on the Reading part of the PSAT... or they get a passing score on the IB Lang and Lit test... that takes care of their verified credit for graduation in Reading as long as they passed the class as well.
That student will still be required to take the Reading SOL under federal participation rules. But they don't have to pass it to meet federal participation; they just have to have taken it. Since this student will graduate anyway, there is no motivation or need to do well on that Reading SOL. They just sit for it, and the SOL pass rates look bad. That's just one example. Everything is highly individualized depending on the kid's situation.
|
Students used to need up to 9 SOLs; now they only need 5. And of that 5, only 1 math and 1 science... so large groups generally take the easiest ones, and the others mainly get taken by kids who struggle. And for some, if they passed a substitute test they don't take that SOL at all. So we're taking a bunch of the high achievers out of the pool that takes these tests.
It's very skewed and you can't make a sound judgment about a school based on it.
SOLs are not that difficult for the majority of students to pass in HS, considering they can even use Desmos now. If they take Alg 1 in 9th and pass it, they are done. Previously, a kid would take SOLs for all 3 subjects: Alg 1, Geometry, and Alg 2. Let's say they take Geometry as a 9th grader. That's the only SOL they take. Let's say they take Alg 2 as an 8th grader; then they substitute the PSAT in high school and never take a math SOL in high school. Then there are kids who struggle and take the Alg 1 SOL multiple times to pass. This happens because some have disabilities or are new to this country. Each fail is counted into the percentage. This is why the overall math percentage everywhere is lower. Certain schools have fewer students in this category. Yes, it's all counted as math.
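The retake effect described above is easy to see with made-up numbers: if every attempt counts in the denominator, each extra failed attempt by a struggling student drags the reported pass rate below the share of students who eventually pass.

```python
# Hypothetical sketch of the retake effect: every attempt counts,
# so retakes pull the reported pass rate down. Figures are made up.

# Suppose 90 students pass Algebra 1 on the first try, and 10 students
# each fail twice before passing on their third attempt.
first_try_passes = 90
strugglers = 10
attempts_per_struggler = 3          # 2 fails + 1 pass each

total_attempts = first_try_passes + strugglers * attempts_per_struggler
total_passes = first_try_passes + strugglers

# Reported rate over attempts vs. share of students who eventually pass
print(round(total_passes / total_attempts, 3))  # 0.833
print((total_passes) / (first_try_passes + strugglers))  # 1.0 of students pass
```

A school with more students in that retake category reports a lower math rate even if every single student ultimately passes.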
| A student's SOL requirements are based on when they entered 9th grade and when they entered Virginia public schools... whatever the rules were that year stay with you until graduation. Different years had different rules, which is why it's not the same for every student. Throw in the Covid rules for scores if they took the course during those years, and everyone has a very different situation for graduation/SOLs |
| This is all very informative, thank you. I had no idea. But I'm still wondering why there would be different rates of participation from pre- to post-pandemic for different racial groups. Because even if we assume the massive variation is as you describe (in who takes what and when), as long as such patterns don't change systematically by school, you can still get some sense of which school is doing better. Of course that's a big IF, and I'm curious how differences in patterns of test taking could affect how schools look when compared to one another. As noted earlier, I saw a pattern in some of the so-called top schools where the decline in test-taking among Asians from 2018-19 to 2022-23 was smaller than the decline among White students. Why would that be? It does seem to correlate with the change, or lack thereof, in test scores. |
You have to look closer and it’s not good. |
The decision to have students only take one EOC SOL in high school stemmed from the changes in accreditation that Virginia made in 2017. Students taking Algebra 1 in 7th grade were unaffected by this change because they take the Algebra 1 SOL in 7th grade, Geometry 8th grade, and Algebra 2 in 9th grade. Students taking Algebra 1 in 8th grade would take that SOL and then Geometry SOL in 9th grade and now stop. Students taking Algebra 1 in 9th grade would only take that SOL and then stop, assuming they passed. Thus, maybe the patterns you're noticing stem from differences in the demographic shares of accelerated math. |
Basically, for some schools there's a pattern where the # of Asian test takers declines much less than the # of White test takers, and this relates to SOL pass-advanced scores. Not so surprising given that we know the Asian students generally perform better, but what's still hard to understand is why you'd see this differential decline in # of test takers by demographic.

Langley in particular sees a small decline in Asian math SOL test takers from 2018-19 to 2022-23 (170 to 151) but a much larger one for White test takers (528 to 302), and this undoubtedly affects how Langley looks in terms of recovery from the pandemic (its pass-advanced rate is 31 in 2018-19 and 26 in 2022-23). Contrast with TJ, where math SOL test takers drop from 392 to 217 for Asians and from 145 to 104 for White students, and the pass-advanced rate went from 91 in 2018-19 to 53 in 2022-23. TJ looks worse than Langley, but this is no doubt in part due to changes in the demographic makeup of who is taking the tests.

As for why that makeup changed: maybe it can be explained, as you suggested, by students at some schools not needing to take the test, which makes the scores lower (at TJ?). Or it could be more sinister: if schools want their scores to look better, they might encourage students from specific groups to take the tests... but maybe that's unlikely? I'm not an insider, so I have no idea how plausible that is. Either way, when the underlying demographics of who is taking the tests change so dramatically, the scores are going to change a lot. I don't think that means the scores are useless, but there's certainly a lot of noise, and changes from pre- to post-pandemic are very difficult if not impossible to interpret. |
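For what it's worth, here is the arithmetic on the math-only counts quoted above (only the counts come from the post; the helper function is just a percent-change calculation, and note these math-only TJ figures differ from the combined math-and-science drops mentioned earlier in the thread):

```python
# Percent decline in math SOL test takers, 2018-19 to 2022-23,
# using the counts quoted in the post.

def pct_decline(before, after):
    """Percent drop from `before` to `after`, rounded to one decimal."""
    return round(100 * (before - after) / before, 1)

counts = {
    "Langley Asian": (170, 151),
    "Langley White": (528, 302),
    "TJ Asian":      (392, 217),
    "TJ White":      (145, 104),
}

for group, (before, after) in counts.items():
    print(f"{group}: {pct_decline(before, after)}% decline")
# Langley Asian: 11.2, Langley White: 42.8, TJ Asian: 44.6, TJ White: 28.3
```

So on these counts, Langley's White pool shrank almost four times as much as its Asian pool, while at TJ the Asian pool shrank more.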
| I think TJ students should be passing at 100%. I don't understand why they are not. Maybe a few fall short in English because of ESOL, but the idea of TJ was that they learn quickly and easily. |
There have also been changes in math course enrollment for entering TJ students. Previously, more than 90% of TJ students had taken Algebra 1 by 7th grade. Now, more TJ students have taken Algebra 1 in 8th grade so you would likely see some change in the number of students taking the various SOLs from that factor as well. |