Rating TO student success

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


The Selingo article discussed the fact that schools that track have found that TO students are not performing as well. Of course, there is still very limited data, at least with respect to schools that went TO due to Covid.

I suspect that schools will react by making the standards for TO admission higher or letting fewer students in TO so they can more easily focus remedial efforts on them.


I don't think that's what the Selingo article said. From the article itself:


"For now, MIT remains in the minority in its claims about the predictive power of the SAT. In 2021, Wake Forest, which went test optional in 2008, released a longitudinal analysis that found that applicants who don’t submit scores — who are twice as likely to be low income, students of color, or the first in their family to go to college — have a lower GPA their first year at Wake Forest, but it narrows each subsequent year to a .03 difference by graduation with minimal difference in graduation rates. (Interestingly, students who withheld their scores even graduated at a slightly higher rate, at 90 percent, than those who sent scores, at 87 percent.) Studies of other colleges that went test optional before the pandemic have arrived at similar conclusions: After some time as an undergraduate, there isn’t much difference in the academic performance between students who submitted and those who didn’t."


Also directly from the article:

Meanwhile, other anecdotal results from the test-optional experiment are starting to trickle in. At one top-ranked liberal arts college, where 60 percent of the students who enrolled last year submitted scores, the admissions dean told me that the average first-year GPA for members of the freshman class that submitted scores was 3.57; for non-submitters it was 3.47. “Institutional research tells me the difference is statistically significant,” he said. Another admissions dean, from the selective private university weighing the “million-dollar question,” told me faculty members have informed him about students who have “a little less confidence” in the classroom. Since professors don’t know whether their students submitted scores, the admissions dean asks for names. He then looks them up. Most of the time, he said, the students didn’t submit scores. “The question is, if I’m coming in with a 1600 or a 1550 on the SAT, does that do something to my level of confidence in the classroom versus someone who just came in with grades?” this admissions dean wonders.


DP. The level of confidence wouldn't be contingent on taking the SAT. It's more about whether your grades were inflated or not. If you got straight As, but only achieved them with a lot of support, retaking tests, and grading on a curve, then the high grades that got you into college did you a disservice. But if you got straight As without much support or retaking, then chances are you'd be more confident in college.


I would think that knowing you have an 1100 in a school where the average student has a score above 1400 may lead to imposter syndrome. I assume that most non-submitters took the tests and didn't submit because they knew their scores were too low for admission, but maybe I'm wrong there.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


The Selingo article discussed the fact that schools that track have found that TO students are not performing as well. Of course, there is still very limited data, at least with respect to schools that went TO due to Covid.

I suspect that schools will react by making the standards for TO admission higher or letting fewer students in TO so they can more easily focus remedial efforts on them.


I don't think that's what the Selingo article said. From the article itself:


"For now, MIT remains in the minority in its claims about the predictive power of the SAT. In 2021, Wake Forest, which went test optional in 2008, released a longitudinal analysis that found that applicants who don’t submit scores — who are twice as likely to be low income, students of color, or the first in their family to go to college — have a lower GPA their first year at Wake Forest, but it narrows each subsequent year to a .03 difference by graduation with minimal difference in graduation rates. (Interestingly, students who withheld their scores even graduated at a slightly higher rate, at 90 percent, than those who sent scores, at 87 percent.) Studies of other colleges that went test optional before the pandemic have arrived at similar conclusions: After some time as an undergraduate, there isn’t much difference in the academic performance between students who submitted and those who didn’t."


Also directly from the article:

Meanwhile, other anecdotal results from the test-optional experiment are starting to trickle in. At one top-ranked liberal arts college, where 60 percent of the students who enrolled last year submitted scores, the admissions dean told me that the average first-year GPA for members of the freshman class that submitted scores was 3.57; for non-submitters it was 3.47. “Institutional research tells me the difference is statistically significant,” he said. Another admissions dean, from the selective private university weighing the “million-dollar question,” told me faculty members have informed him about students who have “a little less confidence” in the classroom. Since professors don’t know whether their students submitted scores, the admissions dean asks for names. He then looks them up. Most of the time, he said, the students didn’t submit scores. “The question is, if I’m coming in with a 1600 or a 1550 on the SAT, does that do something to my level of confidence in the classroom versus someone who just came in with grades?” this admissions dean wonders.


DP. The level of confidence wouldn't be contingent on taking the SAT. It's more about whether your grades were inflated or not. If you got straight As, but only achieved them with a lot of support, retaking tests, and grading on a curve, then the high grades that got you into college did you a disservice. But if you got straight As without much support or retaking, then chances are you'd be more confident in college.


I would think that knowing you have an 1100 in a school where the average student has a score above 1400 may lead to imposter syndrome. I assume that most non-submitters took the tests and didn't submit because they knew their scores were too low for admission, but maybe I'm wrong there.


You really think that after a kid gets in, they hand-wring over what other admits got? By the time they start school, their concerns are their roommates, parties, rushing, organizations, and clubs. Did you go to college yourself? In every college class, at every school, there will be slackers and star students, plus athletes and legacies who got low scores. Everything is a wash after they start.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


The Selingo article discussed the fact that schools that track have found that TO students are not performing as well. Of course, there is still very limited data, at least with respect to schools that went TO due to Covid.

I suspect that schools will react by making the standards for TO admission higher or letting fewer students in TO so they can more easily focus remedial efforts on them.


I don't think that's what the Selingo article said. From the article itself:


"For now, MIT remains in the minority in its claims about the predictive power of the SAT. In 2021, Wake Forest, which went test optional in 2008, released a longitudinal analysis that found that applicants who don’t submit scores — who are twice as likely to be low income, students of color, or the first in their family to go to college — have a lower GPA their first year at Wake Forest, but it narrows each subsequent year to a .03 difference by graduation with minimal difference in graduation rates. (Interestingly, students who withheld their scores even graduated at a slightly higher rate, at 90 percent, than those who sent scores, at 87 percent.) Studies of other colleges that went test optional before the pandemic have arrived at similar conclusions: After some time as an undergraduate, there isn’t much difference in the academic performance between students who submitted and those who didn’t."


Also directly from the article:

Meanwhile, other anecdotal results from the test-optional experiment are starting to trickle in. At one top-ranked liberal arts college, where 60 percent of the students who enrolled last year submitted scores, the admissions dean told me that the average first-year GPA for members of the freshman class that submitted scores was 3.57; for non-submitters it was 3.47. “Institutional research tells me the difference is statistically significant,” he said. Another admissions dean, from the selective private university weighing the “million-dollar question,” told me faculty members have informed him about students who have “a little less confidence” in the classroom. Since professors don’t know whether their students submitted scores, the admissions dean asks for names. He then looks them up. Most of the time, he said, the students didn’t submit scores. “The question is, if I’m coming in with a 1600 or a 1550 on the SAT, does that do something to my level of confidence in the classroom versus someone who just came in with grades?” this admissions dean wonders.


DP. The level of confidence wouldn't be contingent on taking the SAT. It's more about whether your grades were inflated or not. If you got straight As, but only achieved them with a lot of support, retaking tests, and grading on a curve, then the high grades that got you into college did you a disservice. But if you got straight As without much support or retaking, then chances are you'd be more confident in college.


I would think that knowing you have an 1100 in a school where the average student has a score above 1400 may lead to imposter syndrome. I assume that most non-submitters took the tests and didn't submit because they knew their scores were too low for admission, but maybe I'm wrong there.


You really think that after a kid gets in, they hand-wring over what other admits got? By the time they start school, their concerns are their roommates, parties, rushing, organizations, and clubs. Did you go to college yourself? In every college class, at every school, there will be slackers and star students, plus athletes and legacies who got low scores. Everything is a wash after they start.


Imposter syndrome at elite universities is a well-documented problem.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


The Selingo article discussed the fact that schools that track have found that TO students are not performing as well. Of course, there is still very limited data, at least with respect to schools that went TO due to Covid.

I suspect that schools will react by making the standards for TO admission higher or letting fewer students in TO so they can more easily focus remedial efforts on them.


I don't think that's what the Selingo article said. From the article itself:


"For now, MIT remains in the minority in its claims about the predictive power of the SAT. In 2021, Wake Forest, which went test optional in 2008, released a longitudinal analysis that found that applicants who don’t submit scores — who are twice as likely to be low income, students of color, or the first in their family to go to college — have a lower GPA their first year at Wake Forest, but it narrows each subsequent year to a .03 difference by graduation with minimal difference in graduation rates. (Interestingly, students who withheld their scores even graduated at a slightly higher rate, at 90 percent, than those who sent scores, at 87 percent.) Studies of other colleges that went test optional before the pandemic have arrived at similar conclusions: After some time as an undergraduate, there isn’t much difference in the academic performance between students who submitted and those who didn’t."


Also directly from the article:

Meanwhile, other anecdotal results from the test-optional experiment are starting to trickle in. At one top-ranked liberal arts college, where 60 percent of the students who enrolled last year submitted scores, the admissions dean told me that the average first-year GPA for members of the freshman class that submitted scores was 3.57; for non-submitters it was 3.47. “Institutional research tells me the difference is statistically significant,” he said. Another admissions dean, from the selective private university weighing the “million-dollar question,” told me faculty members have informed him about students who have “a little less confidence” in the classroom. Since professors don’t know whether their students submitted scores, the admissions dean asks for names. He then looks them up. Most of the time, he said, the students didn’t submit scores. “The question is, if I’m coming in with a 1600 or a 1550 on the SAT, does that do something to my level of confidence in the classroom versus someone who just came in with grades?” this admissions dean wonders.


DP. The level of confidence wouldn't be contingent on taking the SAT. It's more about whether your grades were inflated or not. If you got straight As, but only achieved them with a lot of support, retaking tests, and grading on a curve, then the high grades that got you into college did you a disservice. But if you got straight As without much support or retaking, then chances are you'd be more confident in college.


I would think that knowing you have an 1100 in a school where the average student has a score above 1400 may lead to imposter syndrome. I assume that most non-submitters took the tests and didn't submit because they knew their scores were too low for admission, but maybe I'm wrong there.


You really think that after a kid gets in, they hand-wring over what other admits got? By the time they start school, their concerns are their roommates, parties, rushing, organizations, and clubs. Did you go to college yourself? In every college class, at every school, there will be slackers and star students, plus athletes and legacies who got low scores. Everything is a wash after they start.


Imposter syndrome at elite universities is a well-documented problem.


But is it tied to test scores?
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.

It may not be a proxy for talent or success in life, but it correlates strongly with ability, intelligence, and, for those who study intensely to do well, determination.


No, you are wrong.


I have a bright, extremely dedicated student who is a not-so-great test taker. I can't imagine that her ability to take the SAT will affect her ability to have a successful life.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


The Selingo article discussed the fact that schools that track have found that TO students are not performing as well. Of course, there is still very limited data, at least with respect to schools that went TO due to Covid.

I suspect that schools will react by making the standards for TO admission higher or letting fewer students in TO so they can more easily focus remedial efforts on them.


I don't think that's what the Selingo article said. From the article itself:


"For now, MIT remains in the minority in its claims about the predictive power of the SAT. In 2021, Wake Forest, which went test optional in 2008, released a longitudinal analysis that found that applicants who don’t submit scores — who are twice as likely to be low income, students of color, or the first in their family to go to college — have a lower GPA their first year at Wake Forest, but it narrows each subsequent year to a .03 difference by graduation with minimal difference in graduation rates. (Interestingly, students who withheld their scores even graduated at a slightly higher rate, at 90 percent, than those who sent scores, at 87 percent.) Studies of other colleges that went test optional before the pandemic have arrived at similar conclusions: After some time as an undergraduate, there isn’t much difference in the academic performance between students who submitted and those who didn’t."


Also directly from the article:

Meanwhile, other anecdotal results from the test-optional experiment are starting to trickle in. At one top-ranked liberal arts college, where 60 percent of the students who enrolled last year submitted scores, the admissions dean told me that the average first-year GPA for members of the freshman class that submitted scores was 3.57; for non-submitters it was 3.47. “Institutional research tells me the difference is statistically significant,” he said. Another admissions dean, from the selective private university weighing the “million-dollar question,” told me faculty members have informed him about students who have “a little less confidence” in the classroom. Since professors don’t know whether their students submitted scores, the admissions dean asks for names. He then looks them up. Most of the time, he said, the students didn’t submit scores. “The question is, if I’m coming in with a 1600 or a 1550 on the SAT, does that do something to my level of confidence in the classroom versus someone who just came in with grades?” this admissions dean wonders.



So anecdotal evidence after a year of a national pandemic and hybrid/remote learning is the same as a longitudinal analysis from a top-20 school done prior to the pandemic?
Anonymous
If you submit your great SAT scores and don’t get in, the problem is not what someone else did or did not submit. Look inside for what’s wrong.
Anonymous
THIS!! And also they are learning that selective colleges now value more than just the kids who study and live their four years to get picked; they want the ones who live a balanced life, select a passion and go all in on it for four years, are kind, helpful, and resourceful, get good grades, and bring more to the classroom experience and dialogue. Not all selective schools want a classroom full of test takers, as if a bunch of the richest people now didn't drop out of college to pursue life in tech or finance. Also, there is value in lived experience that correlates well with becoming future problem solvers and community builders.


Except that there are tons of kids who study and aim to be selected but are also all the other things you mention AND score in the 99th percentile on the SAT or ACT. Those kids are great test takers and balanced, with ECs that represent their passions. Those kids are the ones who think TO isn’t fair, because they know friends who scored a 1300 after testing three times, have ridiculously high GPAs, did not put in as much effort with ECs, etc., but still sometimes get in. Those are the kids who never would have applied to the selective colleges before TO.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


The Selingo article discussed the fact that schools that track have found that TO students are not performing as well. Of course, there is still very limited data, at least with respect to schools that went TO due to Covid.

I suspect that schools will react by making the standards for TO admission higher or letting fewer students in TO so they can more easily focus remedial efforts on them.


I don't think that's what the Selingo article said. From the article itself:


"For now, MIT remains in the minority in its claims about the predictive power of the SAT. In 2021, Wake Forest, which went test optional in 2008, released a longitudinal analysis that found that applicants who don’t submit scores — who are twice as likely to be low income, students of color, or the first in their family to go to college — have a lower GPA their first year at Wake Forest, but it narrows each subsequent year to a .03 difference by graduation with minimal difference in graduation rates. (Interestingly, students who withheld their scores even graduated at a slightly higher rate, at 90 percent, than those who sent scores, at 87 percent.) Studies of other colleges that went test optional before the pandemic have arrived at similar conclusions: After some time as an undergraduate, there isn’t much difference in the academic performance between students who submitted and those who didn’t."


Also directly from the article:

Meanwhile, other anecdotal results from the test-optional experiment are starting to trickle in. At one top-ranked liberal arts college, where 60 percent of the students who enrolled last year submitted scores, the admissions dean told me that the average first-year GPA for members of the freshman class that submitted scores was 3.57; for non-submitters it was 3.47. “Institutional research tells me the difference is statistically significant,” he said. Another admissions dean, from the selective private university weighing the “million-dollar question,” told me faculty members have informed him about students who have “a little less confidence” in the classroom. Since professors don’t know whether their students submitted scores, the admissions dean asks for names. He then looks them up. Most of the time, he said, the students didn’t submit scores. “The question is, if I’m coming in with a 1600 or a 1550 on the SAT, does that do something to my level of confidence in the classroom versus someone who just came in with grades?” this admissions dean wonders.



So anecdotal evidence after a year of a national pandemic and hybrid/remote learning is the same as a longitudinal analysis from a top-20 school done prior to the pandemic?


Yes, because it supports what I know to be true.
Anonymous
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


OP here. There are lots of reports that TO kids struggle, but then the difference suddenly disappears. Maybe the kids miraculously make up for 12 years of shortfall in a semester or two, but it’s more likely other things are at work. For example, do these kids eventually choose easier majors, and/or ones they otherwise would not have preferred, just to get through school? Do the schools follow the GPAs of such kids and proactively reach out to them with aid? Are professors aware of kids with challenges, and do they provide more mentoring?

Look, if Harvard truly feels that a kid with a 1200 SAT can do the work as well as a kid with a 1550, why did they start requiring the tests in the first place? My guess is, they don’t.


Perhaps they think 1200 vs 1500 is significant but not 1400 vs 1550.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.

It may not be a proxy for talent or success in life, but it correlates strongly with ability, intelligence, and, for those who study intensely to do well, determination.


No, you are wrong.


I have a bright, extremely dedicated student who is a not-so-great test taker. I can't imagine that her ability to take the SAT will affect her ability to have a successful life.


That’s probably true, assuming she doesn’t have the MCAT and specialty boards or LSAT in her future.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


The Selingo article discussed the fact that schools that track have found that TO students are not performing as well. Of course, there is still very limited data, at least with respect to schools that went TO due to Covid.

I suspect that schools will react by making the standards for TO admission higher or letting fewer students in TO so they can more easily focus remedial efforts on them.


I don't think that's what the Selingo article said. From the article itself:


"For now, MIT remains in the minority in its claims about the predictive power of the SAT. In 2021, Wake Forest, which went test optional in 2008, released a longitudinal analysis that found that applicants who don’t submit scores — who are twice as likely to be low income, students of color, or the first in their family to go to college — have a lower GPA their first year at Wake Forest, but it narrows each subsequent year to a .03 difference by graduation with minimal difference in graduation rates. (Interestingly, students who withheld their scores even graduated at a slightly higher rate, at 90 percent, than those who sent scores, at 87 percent.) Studies of other colleges that went test optional before the pandemic have arrived at similar conclusions: After some time as an undergraduate, there isn’t much difference in the academic performance between students who submitted and those who didn’t."



Also directly from the article:

Meanwhile, other anecdotal results from the test-optional experiment are starting to trickle in. At one top-ranked liberal arts college, where 60 percent of the students who enrolled last year submitted scores, the admissions dean told me that the average first-year GPA for members of the freshman class that submitted scores was 3.57; for non-submitters it was 3.47. “Institutional research tells me the difference is statistically significant,” he said. Another admissions dean, from the selective private university weighing the “million-dollar question,” told me faculty members have informed him about students who have “a little less confidence” in the classroom. Since professors don’t know whether their students submitted scores, the admissions dean asks for names. He then looks them up. Most of the time, he said, the students didn’t submit scores. “The question is, if I’m coming in with a 1600 or a 1550 on the SAT, does that do something to my level of confidence in the classroom versus someone who just came in with grades?” this admissions dean wonders.



So anecdotal evidence after a year of a national pandemic and hybrid/remote learning is the same as a longitudinal analysis from a top-20 school done prior to the pandemic?


Yes, because it supports what I know to be true.


I hope you are being sarcastic, because what "you know to be true" can't hold water against a longitudinal analysis from a T20 running from before the pandemic to now. Well-researched, in-depth studies > school car-pickup-line gossip.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


The Selingo article discussed the fact that schools that track have found that TO students are not performing as well. Of course, there is still very limited data, at least with respect to schools that went TO due to Covid.

I suspect that schools will react by making the standards for TO admission higher or letting fewer students in TO so they can more easily focus remedial efforts on them.


I don't think that's what the Selingo article said. From the article itself:


"For now, MIT remains in the minority in its claims about the predictive power of the SAT. In 2021, Wake Forest, which went test optional in 2008, released a longitudinal analysis that found that applicants who don’t submit scores — who are twice as likely to be low income, students of color, or the first in their family to go to college — have a lower GPA their first year at Wake Forest, but it narrows each subsequent year to a .03 difference by graduation with minimal difference in graduation rates. (Interestingly, students who withheld their scores even graduated at a slightly higher rate, at 90 percent, than those who sent scores, at 87 percent.) Studies of other colleges that went test optional before the pandemic have arrived at similar conclusions: After some time as an undergraduate, there isn’t much difference in the academic performance between students who submitted and those who didn’t."


Also directly from the article:

Meanwhile, other anecdotal results from the test-optional experiment are starting to trickle in. At one top-ranked liberal arts college, where 60 percent of the students who enrolled last year submitted scores, the admissions dean told me that the average first-year GPA for members of the freshman class that submitted scores was 3.57; for non-submitters it was 3.47. “Institutional research tells me the difference is statistically significant,” he said. Another admissions dean, from the selective private university weighing the “million-dollar question,” told me faculty members have informed him about students who have “a little less confidence” in the classroom. Since professors don’t know whether their students submitted scores, the admissions dean asks for names. He then looks them up. Most of the time, he said, the students didn’t submit scores. “The question is, if I’m coming in with a 1600 or a 1550 on the SAT, does that do something to my level of confidence in the classroom versus someone who just came in with grades?” this admissions dean wonders.


DP. The level of confidence wouldn't be contingent on taking the SAT. It's more about whether your grades were inflated or not. If you got straight As, but only achieved them with a lot of support, retaking tests, and grading on a curve, then the high grades that got you into college did you a disservice. But if you got straight As without much support or retaking, then chances are you'd be more confident in college.


I would think that knowing you have an 1100 in a school where the average student has a score above 1400 may lead to imposter syndrome. I assume that most non-submitters took the tests and didn't submit because they knew their scores were too low for admission, but maybe I'm wrong there.


You didn't consider that many students are not submitting test scores because they scored 1300/1350 when the average score is 1400. In other words, TO is causing students to submit only scores above the 50th percentile, and at super-reach schools, at or above the 75th percentile.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


The Selingo article discussed the fact that schools that track have found that TO students are not performing as well. Of course, there is still very limited data, at least with respect to schools that went TO due to Covid.

I suspect that schools will react by making the standards for TO admission higher or letting fewer students in TO so they can more easily focus remedial efforts on them.


I don't think that's what the Selingo article said. From the article itself:


"For now, MIT remains in the minority in its claims about the predictive power of the SAT. In 2021, Wake Forest, which went test optional in 2008, released a longitudinal analysis that found that applicants who don’t submit scores — who are twice as likely to be low income, students of color, or the first in their family to go to college — have a lower GPA their first year at Wake Forest, but it narrows each subsequent year to a .03 difference by graduation with minimal difference in graduation rates. (Interestingly, students who withheld their scores even graduated at a slightly higher rate, at 90 percent, than those who sent scores, at 87 percent.) Studies of other colleges that went test optional before the pandemic have arrived at similar conclusions: After some time as an undergraduate, there isn’t much difference in the academic performance between students who submitted and those who didn’t."


Also directly from the article:

Meanwhile, other anecdotal results from the test-optional experiment are starting to trickle in. At one top-ranked liberal arts college, where 60 percent of the students who enrolled last year submitted scores, the admissions dean told me that the average first-year GPA for members of the freshman class that submitted scores was 3.57; for non-submitters it was 3.47. “Institutional research tells me the difference is statistically significant,” he said. Another admissions dean, from the selective private university weighing the “million-dollar question,” told me faculty members have informed him about students who have “a little less confidence” in the classroom. Since professors don’t know whether their students submitted scores, the admissions dean asks for names. He then looks them up. Most of the time, he said, the students didn’t submit scores. “The question is, if I’m coming in with a 1600 or a 1550 on the SAT, does that do something to my level of confidence in the classroom versus someone who just came in with grades?” this admissions dean wonders.


DP. The level of confidence wouldn't be contingent on taking the SAT; it's more about whether your grades were inflated. If you got straight As only with a lot of support, retaken tests, and generous curves, then the high grades that got you into college did you a disservice. But if you earned straight As without much support or retaking, chances are you'd be more confident in college.


I would think that knowing you have an 1100 at a school where the average student scores above 1400 could lead to imposter syndrome. I assume most non-submitters took the tests and withheld their scores because they knew they were too low for admission, but maybe I'm wrong there.


You didn't consider that many students aren't submitting test scores because they scored 1300/1350 when the average is 1400. In other words, TO is causing students to submit only scores above the 50th percentile, and at super-reach schools, at or above the 75th percentile.


Very true. It’s getting more and more confusing.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:See, the thing is, YOU have no idea how TO students do compared to the ones who submitted. It’s time to stop pretending the SAT/ACT is a proxy for innate ability or talent. They aren’t.


The Selingo article discussed the fact that schools that track have found that TO students are not performing as well. Of course, there is still very limited data, at least with respect to schools that went TO due to Covid.

I suspect that schools will react by making the standards for TO admission higher or letting fewer students in TO so they can more easily focus remedial efforts on them.


I don't think that's what the Selingo article said. From the article itself:


"For now, MIT remains in the minority in its claims about the predictive power of the SAT. In 2021, Wake Forest, which went test optional in 2008, released a longitudinal analysis that found that applicants who don’t submit scores — who are twice as likely to be low income, students of color, or the first in their family to go to college — have a lower GPA their first year at Wake Forest, but it narrows each subsequent year to a .03 difference by graduation with minimal difference in graduation rates. (Interestingly, students who withheld their scores even graduated at a slightly higher rate, at 90 percent, than those who sent scores, at 87 percent.) Studies of other colleges that went test optional before the pandemic have arrived at similar conclusions: After some time as an undergraduate, there isn’t much difference in the academic performance between students who submitted and those who didn’t."



Also directly from the article:

Meanwhile, other anecdotal results from the test-optional experiment are starting to trickle in. At one top-ranked liberal arts college, where 60 percent of the students who enrolled last year submitted scores, the admissions dean told me that the average first-year GPA for members of the freshman class that submitted scores was 3.57; for non-submitters it was 3.47. “Institutional research tells me the difference is statistically significant,” he said. Another admissions dean, from the selective private university weighing the “million-dollar question,” told me faculty members have informed him about students who have “a little less confidence” in the classroom. Since professors don’t know whether their students submitted scores, the admissions dean asks for names. He then looks them up. Most of the time, he said, the students didn’t submit scores. “The question is, if I’m coming in with a 1600 or a 1550 on the SAT, does that do something to my level of confidence in the classroom versus someone who just came in with grades?” this admissions dean wonders.



So anecdotal evidence after a year of a pandemic and hybrid/remote learning is the same as a longitudinal analysis from a top-20 school done prior to the pandemic?


Yes, because it supports what I know to be true.


I hope you are being sarcastic, because what "you know to be true" can't hold water against a longitudinal analysis from a T20 spanning from before the pandemic to now. Well-researched, in-depth studies > school car-pickup-line gossip.


Are you referring to the Wake Forest study? I don’t find that incredibly persuasive because it’s just one school.