Value of GPA vs. test scores

Anonymous
Anonymous wrote:Tons of rich kids cheated on the SAT in my HS, in particular rich, dumb athletes.

They even had a map of test sites where they could get it done. One guy on the football team had someone take his test in Hawaii. They even flew the person in.



I remember reading that Trump even hired a ringer to help him transfer into Wharton, since he wasn't able to score high enough on his own.
Anonymous
Anonymous wrote:
Anonymous wrote:I don’t so much have an axe to grind about standardized tests per se, but I think both the format and the arms-race mentality are problematic. The MCQ format is a terrible way to assess a student pedagogically. It benefits only the test administrators, because it’s fast and cheap to grade, but it’s susceptible to being gamed. A student can improve dramatically by getting better at test-taking strategies that have nothing to do with understanding the underlying content.

As for the second point, standardized tests are best used as one data point to make sure the applicant has the baseline knowledge and skills, not as a competition to get a perfect score. The average SAT score at Harvard in the early 1990s was UNDER 1400. Now people on this board scoff at scores like that. Scores are not linear: in reality there is a minuscule difference between a 1400 and a 1600. There is a bigger difference between 1200 and 1300 than there is between 1300 and 1600.


Simply untrue. If you won’t consider the ceiling effect, you’ll never understand why the difference between a 1400 and a 1600 is far from minuscule.



College is not an SAT academy training professional SAT athletes. Being better at the SAT at the extremes is not meaningful. If you’re looking for distinguishing academic prowess at the 99+ percentile, you need a different test than the one that determines readiness for NVCC.
Anonymous
Anonymous wrote:
Anonymous wrote:How do you explain students faking disabilities for extra time and doing better because of it? It’s not merely college aptitude. Everyone would do better with more time, especially on the ACT. Things are not standardized, unfortunately, and colleges don’t know who has had extra time. It tests parental aggressiveness and wealth: who can pay $7k for neurological testing from shady doctors.


Perhaps this is true to some degree, but how do you explain average standardized test scores remaining flat and even declining over time? And how do you explain that fact regarding the average test score in the context of students taking 3, 4, 5 and even more re-takes? I’m not going to drive the point further home by mentioning the widespread availability of test prep materials …

If everyone “would do better” with additional time, why are they actually doing worse - on average - with the substantial benefit of repeated exposure to the test content and unlimited exposure to test prep materials, whether via paid private coaching or access to resources like Khan Academy?

Yeah, complaints about testing invariably fall apart when you examine the actual facts. The “everyone would do better” claim is usually just a parent with an ax to grind because their kid didn’t do as well as a kid that they “think” had some form of time accommodation.


Average scores are going down (or up) because of the average student, not the elite student. Average scores say almost nothing about what’s happening in the top 3%; those scores contribute almost nothing to the average. It’s mathematically impossible for them to move it much: if everyone at 1400+ magically upgraded to 1600, 200 points times 3% = a 6-point increase in the average.
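A quick back-of-the-envelope check of the arithmetic above (the 3% share and the 200-point jump are the poster's hypothetical numbers, not real data):

```python
# If only the top 3% of test takers each gained 200 points,
# the overall average would move by share * gain.
top_share = 0.03    # hypothetical fraction of test takers at 1400+
score_gain = 200    # hypothetical jump from 1400 to 1600

average_increase = top_share * score_gain
print(average_increase)  # 6.0
```

So even a dramatic shift confined to the top of the distribution barely registers in the overall average.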
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I’ve always thought this. I believe some are bad test takers, but not in the numbers people claim. It’s hard to swallow when my B private school student got a 33 ACT with no prep but looks “bad” in a sea of 4.0s. Mine likes the test, and struggles more with regular class tests and memorization of material.

If the article is correct and A students are average and should expect mediocre test scores, your B-average student with high test scores looks like someone who happens to test well on standardized tests but isn’t actually a very good student. I’m not sure that’s the better problem to have.


No, it just means they’re lazy and wasting their potential. That’s all.


Exactly. There’s a negligible difference in intelligence versus a clear track record of keeping up with responsibilities. These are lifelong traits, and the guy who only shows up on test day just isn’t good for much. There used to be a tendency to coddle these kids, as if capturing that potential was possible and somehow more important than encouraging the students who work. That was misguided, and the pendulum has swung.


An obedient, hard-working generalist of modest intelligence will never achieve what a hard-working specialist genius who doesn’t care about HS liberal arts will achieve. Those students have different capabilities.
Anonymous
Anonymous wrote:I don’t so much have an axe to grind about standardized tests per se, but I think both the format and the arms-race mentality are problematic. The MCQ format is a terrible way to assess a student pedagogically. It benefits only the test administrators, because it’s fast and cheap to grade, but it’s susceptible to being gamed. A student can improve dramatically by getting better at test-taking strategies that have nothing to do with understanding the underlying content.

As for the second point, standardized tests are best used as one data point to make sure the applicant has the baseline knowledge and skills, not as a competition to get a perfect score. The average SAT score at Harvard in the early 1990s was UNDER 1400. Now people on this board scoff at scores like that. Scores are not linear: in reality there is a minuscule difference between a 1400 and a 1600. There is a bigger difference between 1200 and 1300 than there is between 1300 and 1600.


The SAT is substantially easier and scored more generously now than it was then. Also, Harvard students admitted on the “academic” profile are much stronger students now, due to growth in the overall applicant population, expansion to international applicants, and the shrinking size of the “academic” portion of the class.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:The SAT is geared towards *average* college bound students. It’s basic and pretty useless for assessing high fliers.


This is not true according to research from Harvard and Brown.

https://opportunityinsights.org/wp-content/uploads/2024/01/SAT_ACT_on_Grades.pdf

In fact, SAT scores seem even MORE important at selective colleges than at non-selective colleges, where GPA seems to matter more.


And what colleges would those be? Most top colleges admitted a significant percentage of students who applied via TO. Check out the common data sets; it’s eye-opening. Colleges WANT test optional so they have lots of flexibility in who they can admit!

We are all aware of the small handful of colleges that recently announced they’re returning to test required or test considered.

BUT most selective colleges remain test optional. I say selective, which is a broader group than Top 5 or Top 10 or whatever.


That’s 2021 thinking. Colleges realized TO was a disaster and are rolling back to test required.
Anonymous
Anonymous wrote:How do you explain students faking disabilities for extra time and doing better because of it? It’s not merely college aptitude. Everyone would do better with more time, especially on the ACT. Things are not standardized, unfortunately, and colleges don’t know who has had extra time. It tests parental aggressiveness and wealth: who can pay $7k for neurological testing from shady doctors.



It’s wrong to blame students and parents for a capricious system that changes time limits from stupid to reasonable on an individual basis.

Thankfully, both the SAT and the ACT are moving to new formats that are less a test of how the student handles extreme time pressure.

Nothing will fix the underlying problem: you can’t write one test (or, for the adaptive SAT, two tests) that accurately sorts students within each standard-deviation bucket. Different cohorts need different tests, to avoid overtraining to the idiosyncrasies of a specific test.

Anonymous
After the 'Test Optional' phase, the importance of the SAT score is going to increase significantly, starting with the 2025 and 2026 admissions cycles.
Anonymous
Some of what he says resonates, but he loses all credibility when he takes potshots at his test prep competitors. His bias is overt.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I’ve always thought this. I believe some are bad test takers, but not in the numbers people claim. It’s hard to swallow when my B private school student got a 33 ACT with no prep but looks “bad” in a sea of 4.0s. Mine likes the test, and struggles more with regular class tests and memorization of material.


Is there a chance your kid is a little bit ADHD?
I was great at tests but struggled with the daily grind of classwork.
I had undiagnosed ADHD. No one diagnoses it when the kid is doing above average.


Yes, diagnosed, but not until high school. She was a straight-A girl with great behavior before that and didn’t show typical signs until the workload became super intense. Does much better on projects than on tests, keeps up with homework thankfully, has trouble focusing in content areas they aren’t interested in. Excels in areas they enjoy; very creative. Goes to a rigorous private with no retakes or fluff to pad grades.


Well if you know what the problem is then you are probably also aware of the coping strategies.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:The SAT is geared towards *average* college bound students. It’s basic and pretty useless for assessing high fliers.


This is not true according to research from Harvard and Brown.

https://opportunityinsights.org/wp-content/uploads/2024/01/SAT_ACT_on_Grades.pdf

In fact, SAT scores seem even MORE important at selective colleges than at non-selective colleges, where GPA seems to matter more.


And what colleges would those be? Most top colleges admitted a significant percentage of students who applied via TO. Check out the common data sets; it’s eye-opening. Colleges WANT test optional so they have lots of flexibility in who they can admit!

We are all aware of the small handful of colleges that recently announced they’re returning to test required or test considered.

BUT most selective colleges remain test optional. I say selective, which is a broader group than Top 5 or Top 10 or whatever.


Well, Harvard and Brown to begin with.

But their research applies to pretty much all selective colleges.

The only place where GPA is more important than test scores is at non-selective colleges. Non-selective colleges account for over 90% of all colleges and universities.
Anonymous
Anonymous wrote:I don't know if I buy the argument about SAT scores being more predictive than GPA of success in college. There are lots of kids who take large numbers of APs and do very well on the tests but who do not test as well on the SAT/ACT. As far as grade inflation goes, there are lots of kids at my child's high school who receive Bs in the class but then get 4s/5s on the exams. Teachers want very high pass rates on the exam and often make the class harder than the test. I am not sure that is true in the majority of high schools though. I think the SAT and ACT can be predictive of success in the absence of better markers, like proven success in a strong college prep curriculum.


That's the great thing about peer-reviewed research: your gut instinct doesn't matter.
Anonymous
Yeah, well, no schools are "transcript optional," so GPA still carries a lot of weight in college admissions, even if you don't like it.
Anonymous
Anonymous wrote:
Anonymous wrote:No question grades are inflated, but colleges use them much more than test scores at the moment. Even the schools going back to testing seem to view testing as a way to validate grades rather than as a standalone variable.


This is the vibe I’m getting too


If everyone has a 4.0 then the test score isn't validating another metric, it is the metric.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I don’t so much have an axe to grind about standardized tests per se, but I think both the format and the arms-race mentality are problematic. The MCQ format is a terrible way to assess a student pedagogically. It benefits only the test administrators, because it’s fast and cheap to grade, but it’s susceptible to being gamed. A student can improve dramatically by getting better at test-taking strategies that have nothing to do with understanding the underlying content.

As for the second point, standardized tests are best used as one data point to make sure the applicant has the baseline knowledge and skills, not as a competition to get a perfect score. The average SAT score at Harvard in the early 1990s was UNDER 1400. Now people on this board scoff at scores like that. Scores are not linear: in reality there is a minuscule difference between a 1400 and a 1600. There is a bigger difference between 1200 and 1300 than there is between 1300 and 1600.


Simply untrue. If you won’t consider the ceiling effect, you’ll never understand why the difference between a 1400 and a 1600 is far from minuscule.



College is not an SAT academy training professional SAT athletes. Being better at the SAT at the extremes is not meaningful. If you’re looking for distinguishing academic prowess at the 99+ percentile, you need a different test than the one that determines readiness for NVCC.


The SAT measures more than how well you can take the SAT. It measures cognitive ability.
Don't believe the Princeton Review ad; they are trying to get you to spend a lot of money on your mid kid.