When you say t50...

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:More educated parents and students aren't blindly following US News rankings any more. Since their last release was so heavily mocked, and with information about schools' outcomes, strength of majors, and students' academic achievement now widely available, US News plays a much smaller role than it did in the past.

Parents look at the cost of the school, its name brand, its SAT averages, its acceptance rate, and its outcomes a lot more than at some outdated magazine.



All of which are contained in....the USNews rankings! 🙂

It's a likely "first stop" for parents with other research options later. It is what it is.


No, it's missing many important factors and contains some insignificant ones, such as how many Pell grant students a school enrolls.
However, it's still a nice reference for an initial screening.

At the end of every year, we get the actual result of the students' collective decisions.
That result is reflected in the combination of admission rate, yield rate, cohort quality, retention rate, and graduation rate.


That's why, although USNWR removed factors like acceptance rate, parents and students still pay close attention to it and generally consider the more competitive schools to be good schools.



All of the top 30 have either low or extremely low acceptance rates.


No, schools like UF, UVA, UCSD, and UT have significantly higher acceptance rates than schools like Tufts, BU, Wake Forest, Northeastern, and BC, because students chose that way.



Acceptance rate can be manipulated (by inducing more applicants, most of them unqualified), so they correctly dropped it. It isn't the best indicator of the true selectivity and quality of the enrolled student body.


Hence I've said a thousand times: the combination of acceptance rate, yield rate, cohort quality, retention rate, and graduation rate.


But continue to include the thing that can be manipulated? And don't factor in what industry experts think about the quality of the education taking place?

The problem with what you propose is that it can reward schools where good students flock mostly because of non-academic factors like location or dorms.

USNWR isn't perfect, but it's the best option available, and it's not close.


USNWR significantly hurt its credibility when it started implicitly adjusting its rankings, first to get a couple of publics into the top 25, and then by adding social-justice factors like economic mobility and Pell grant metrics, again designed to bump publics. That said, the others are even worse.

But in the end it isn't that hard to rule out the rankings that are obviously ridiculous and to discount large moves that any thinking person can see were caused by these adjustments rather than by reality. Tufts and Middlebury didn't each really drop about 15 spots in 5 years, no public is really a T20, etc. And most of all, you can't be that granular in the first place.


There’s this weird narrative that USNWR went too far toward social justice, but the only 2025 metrics close to what you describe are the Pell grant graduation rate and graduation performance, which together come to just 11%. That’s it. Nationwide, one third of college students are Pell recipients. It’s kind of absurd not to care one bit if a third of a student body is performing more poorly than the rest because of economic factors. Not only does that bottom third count, but their being miserable would diminish the overall experience on campus for the other two thirds. I don’t agree with all of their ranks (far from it), but I don’t have to for them to be the best in the business. They just have to be better than the competition, which they are. I agree it’s odd when a school drops 15 spots over 5 years, but that happens more often, and to a far greater extent (I’ve seen over 100 spots!), in other rankings. I agree it’s common for people to take rank too literally, but those who do so ignore the publication’s own advice:

“Many other factors, including some that can't be measured, should figure into your decision… Study the data that accompanies the actual rankings. You should not use the rankings as the sole basis for deciding on one school over another.”


They dropped factors that actually matter as well. Average class size was dropped to help publics. Pell grant share has zero to do with educational quality, but the share of faculty with terminal degrees does; the former was added, the latter was dropped. It adds up to about 35% of the ranking, all told.


Not sure I follow, but only 11% of the current ranking weighting specifically pertains to Pell grant recipients, who make up a large portion (one third) of college students. That 11% involves graduation rates, which is certainly relevant to the quality of the educational experience for that third. If you are wondering why that third should get the extra emphasis, it’s because schools could otherwise engineer higher ranks by under-admitting from financially constrained families, who historically have lower graduation rates, often because of family hardship or the burden of holding down a job while in school, and not because of academic performance per se. Families are still free to rule out publics if they don’t want them, but it’s a positive that privates now have less of an artificial incentive, unintentionally created by USNWR itself in its older methodology, to under-admit the third of students who are most financially constrained. This was an example of USNWR listening to the universities themselves, which proposed the change so they wouldn’t be penalized for doing what they felt was proper. Incidentally, per my 2011 copy of their guide, Pell grant recipients were receiving some extra emphasis even back then; US News was less transparent about the exact amount, but it was under 7.5%.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
USNWR significantly hurt its credibility when it started implicitly adjusting its rankings, first to get a couple of publics into the top 25, and then by adding social-justice factors like economic mobility and Pell grant metrics, again designed to bump publics. That said, the others are even worse.

But in the end it isn't that hard to rule out the rankings that are obviously ridiculous and to discount large moves that any thinking person can see were caused by these adjustments rather than by reality. Tufts and Middlebury didn't each really drop about 15 spots in 5 years, no public is really a T20, etc. And most of all, you can't be that granular in the first place.


Except the top publics were already in the top 25 before these changes and some of them barely budged with the changes. But why deal in facts?


Because facts are friendly. I should have said top 20. No public ever cracked the top 20 prior to the changes. UCLA and UVA were the only others to ever enter the top 25. Happy to fix my small error because the argument is intact. The changes made do not reflect reality but rather were made to make a group of large universities in particular look better than they are. You sell more magazines to people interested in Michigan than Swarthmore.


Still wrong. Michigan and UVA were consistently in the top 25 in the 2000s, and UNC even had a few years there. UVA stayed there for most of the 2010s and Michigan returned there in 2019. UVA was inside the top 20 in the late 90s and UCLA was 19 in 2019.

See, the thing is, when you have a perspective of more than a couple of years, you realize these universities are ranked pretty close to where they’ve always been.
Anonymous
Anonymous wrote:
Not sure I follow, but only 11% of the current ranking weighting specifically pertains to Pell grant recipients, who make up a large portion (one third) of college students. That 11% involves graduation rates, which is certainly relevant to the quality of the educational experience for that third. If you are wondering why that third should get the extra emphasis, it’s because schools could otherwise engineer higher ranks by under-admitting from financially constrained families, who historically have lower graduation rates, often because of family hardship or the burden of holding down a job while in school, and not because of academic performance per se. Families are still free to rule out publics if they don’t want them, but it’s a positive that privates now have less of an artificial incentive, unintentionally created by USNWR itself in its older methodology, to under-admit the third of students who are most financially constrained. This was an example of USNWR listening to the universities themselves, which proposed the change so they wouldn’t be penalized for doing what they felt was proper. Incidentally, per my 2011 copy of their guide, Pell grant recipients were receiving some extra emphasis even back then; US News was less transparent about the exact amount, but it was under 7.5%.


+100
Anonymous
Anonymous wrote:
Not sure I follow, but only 11% of the current ranking weighting specifically pertains to Pell grant recipients, who make up a large portion (one third) of college students. That 11% involves graduation rates, which is certainly relevant to the quality of the educational experience for that third. If you are wondering why that third should get the extra emphasis, it’s because schools could otherwise engineer higher ranks by under-admitting from financially constrained families, who historically have lower graduation rates, often because of family hardship or the burden of holding down a job while in school, and not because of academic performance per se. Families are still free to rule out publics if they don’t want them, but it’s a positive that privates now have less of an artificial incentive, unintentionally created by USNWR itself in its older methodology, to under-admit the third of students who are most financially constrained. This was an example of USNWR listening to the universities themselves, which proposed the change so they wouldn’t be penalized for doing what they felt was proper. Incidentally, per my 2011 copy of their guide, Pell grant recipients were receiving some extra emphasis even back then; US News was less transparent about the exact amount, but it was under 7.5%.


+1. The irony of the people who fly off the handle about the Pell grant criteria is that these are the same people around here who value universities only for what they provide to undergrads and the undergrad experience (as opposed to research, grad school, etc.). And the Pell grant criteria are actually a measure of the support provided to undergrads and of their educational experience, since these are the kids with the least support outside of school. It also incentivizes schools to focus more on undergraduate education and services. Ya know, the whole thing these people keep claiming to care about.
Anonymous
Anonymous wrote:
There’s this weird narrative that USNWR went too far toward social justice, but the only 2025 metrics close to what you describe are the Pell grant graduation rate and graduation performance, which together come to just 11%. That’s it. Nationwide, one third of college students are Pell recipients. It’s kind of absurd not to care one bit if a third of a student body is performing more poorly than the rest because of economic factors. Not only does that bottom third count, but their being miserable would diminish the overall experience on campus for the other two thirds. I don’t agree with all of their ranks (far from it), but I don’t have to for them to be the best in the business. They just have to be better than the competition, which they are. I agree it’s odd when a school drops 15 spots over 5 years, but that happens more often, and to a far greater extent (I’ve seen over 100 spots!), in other rankings. I agree it’s common for people to take rank too literally, but those who do so ignore the publication’s own advice:

“Many other factors, including some that can't be measured, should figure into your decision… Study the data that accompanies the actual rankings. You should not use the rankings as the sole basis for deciding on one school over another.”


Pell grant recipients get an almost free college education.
It's the middle class that carries the huge burden, with limited choices.
Admissions standards should be equal for everyone; Pell share shouldn't be a factor in college rankings.

In fact, if they wanted to do things right, I'd rather they give negative points for having more rich students.



Anonymous
Anonymous wrote:
Anonymous wrote:
DP. It is fair to point out that US News does not match the behavior of the top students. Princeton is almost always #1 in US News but most people would rather go to Harvard, Stanford, or MIT. Hopkins is tied with Caltech in US News...but not in the real world. Etc.

Mismatches between US News and behavior happen because people disagree with the US News methodology.


I wouldn’t expect perfect adherence even if it were a perfect ranking (it is not), because there’s no shortage of poor decision-making even with good info, and, separately, there are lots of other reasonable factors that they themselves say should figure into the decision but can’t be ranked.


Every year we have the perfect ranking, which is the collective decisions of all the students who actually pay $$$.
That is the ground-truth ranking.
Anonymous
But continue to include the thing that can be manipulated? And not factor in what industry experts think about the quality of education taking place?

The problem with what you propose is that it can reward schools where good students flock mostly because of non-academic factors like location or dorms.

USNWR isn’t perfect, but it’s the best option available, and it’s not close.


USNWR significantly hurt its credibility when it started implicitly adjusting its rankings, first to get a couple of publics into the top 25 and then by adding social-justice factors like economic mobility and Pell grant metrics, again designed to bump publics. That said, the others are even worse.

But in the end it isn’t hard to rule out the rankings that are obviously ridiculous and to discount large moves that anyone paying attention can see were caused by these adjustments rather than by reality. Tufts and Middlebury didn’t each really drop about 15 spots in 5 years, no public is really a T20, etc. And most of all, you can’t be that granular in the first place.


There’s this weird narrative that USNWR went too far toward social justice, but the only 2025 metrics close to what you describe are Pell grant graduation rate and graduation performance, which together come to just 11%. That’s it. Nationwide, a third of college students are Pell recipients. It’s absurd not to care at all if a third of a student body is performing more poorly than the rest because of economic factors. Not only does that third count, but their being miserable would diminish the overall experience on campus for the other two-thirds. I don’t agree with all of their ranks (far from it), but I don’t have to for them to be the best in the business; they just have to be better than the competition, which they are. I agree it’s odd when a school drops 15 spots over 5 years, but that happens more often, and to a far greater extent (I’ve seen over 100 spots!), in other rankings. I agree it’s common for people to take rank too literally, but those who do so ignore the publication’s own advice:

“Many other factors, including some that can't be measured, should figure into your decision… Study the data that accompanies the actual rankings. You should not use the rankings as the sole basis for deciding on one school over another.”


They dropped factors that actually matter as well. Average class size was dropped to help publics. Pell grant share has nothing to do with educational quality, but the share of faculty with terminal degrees does; the former was added, the latter was dropped. All told, the changes add up to about 35% of the ranking.


Not sure I follow, but only 11% of the current ranking weight specifically pertains to Pell grant recipients, who make up a large portion (about a third) of college students. That 11% involves graduation rates, which are certainly relevant to the quality of the educational experience for that third. If you are wondering why that third gets the extra emphasis, it’s because schools could otherwise engineer higher ranks by under-admitting students from financially constrained families, who historically have lower grad rates, often because of family hardship or the burden of holding down a job while in school rather than academic performance per se. Families are still free to rule out publics if they don’t want them, but it’s a positive that privates now have less of the artificial incentive (unintentionally created by USNWR’s own older methodology) to under-admit the most financially constrained third. This was an example of USNWR listening to the universities themselves, who proposed the change so they wouldn’t be penalized for doing what they felt was proper. Incidentally, per my 2011 copy of their guide, Pell grant recipients were receiving some extra emphasis even back then; the exact weighting was less transparent, but it was under 7.5%.



Good points. But look deeper into how US News measures things. They do not take into account the very generous no-loan financial aid that high-endowment private universities often give their students. More often than not, these students don't need Pell Grants because the university has already covered everything. But those private universities got penalized by the updated US News algorithm. It actually incentivizes private universities to give LESS financial aid in order to push more of their students toward federal aid. No matter how you look at it, it's very clear that US News purposely changed its algorithm to boost public schools. Which, fine. It's their magazine. But there is a distinct difference between the pre-2023 rankings and today's.
Anonymous
+++ Agree, and I'm a former Pell grantee who got into an Ivy when the bar was not lowered for me, as were two of my three closest friends from our T5 law school. Different elite private undergrads with excellent financial aid changed our lives, and we did not carry the stigma of having gotten a boost because we were poor. We knew we belonged there and got in on smarts. The boost for FG/LI students has gotten way out of hand, and it is a detriment to the students. Mental health suffers when you get to the Ivy trying to go top-law with a 1300s score while all your peers have 1530+. Back then, FGLI students had the raw ability; we often came from lower-quality high school courses, but we could catch up just fine because our score profiles were essentially the same. Plenty of poor kids score extremely well. The 200-250 point boost is not needed. Same with other DEI initiatives, as well as test-optional policies across the board. They cause more problems for those they are trying to help; the academic probation and other data from top schools prove it.
Anonymous
This point and the preceding one demonstrate that the people wringing their hands about the Pell grant criteria don’t even understand what they measure. It isn’t the number of Pell grant recipients; it’s their graduation rate, both on its own and relative to non-Pell recipients. The only penalty is if the private schools suck at graduating these kids on time.
Anonymous
It's all about race. White men don't like the fact that the public colleges - most of which serve a more racially and socioeconomically diverse student population - are getting bumped up in the USNWR rankings. Some admit it, some don't.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:More educated parents and students aren't blindly following US News rankings any more. Since their last release was so heavily mocked and with availability of information about a school's outcomes and strength of majors and strength of students academic achievements, US News is a much more minor role player now than in the past.

Parents look at the cost of the school, the name brand of the school, the SAT averages of the school, the acceptance rate of the school and the outcomes of the school a lot more than some outdated magazine.



All of which are contained in....the USNews rankings! 🙂

It's a likely "first stop" for parents with other research options later. It is what it is.


No, it's missing many important factors and Contain some insignificant factors such as how many Pell grant students.
However it's still a nice reference for an initial screening.

At the end every year, we get the actual result of the collective decisions by the students.
The result is reflected in the combination of admission rate, yield rate, cohort quality, retention rate, and graduation rate.


That's why although USNWR removed factors like acceptance rate, parents and students pay good attention to it, and consider the competitive schools good schools in general.



All of the top 30 have either low or extremely low acceptance rates.


No, schools like UF, UVA, UCSD, UT have significantly higher acceptance rate than schools like Tufts, BU, Wake Forest, Northeastern, BC as students chose that way.



Acceptance rate can be manipulated (by inducing more applicants, most of the unqualified), so they correctly dropped it. It isn't the best indicator of true selectivity and quality of the enrolled student body.


Hence I said 1000 times the combination of acceptance rate, yield rate, cohort quality, retention rate, and graduation rate.


But continue to include the thing that can be manipulated? And don’t factor in what industry experts think about the quality of education taking place?

The problem with what you propose it can reward those schools where good students flock mostly because of non academic factors like location or dorms.

USNWR isn’t perfect but it’s the best option available and it’s not close.


USNWR significantly hurt their credibility when they started implicitly adjusting their rankings first to get a couple of Publics into the top 25 and then by adding social justice factors like economic mobility and Pell grant factors again designed to bump publics. That said the others are even worse.

But in the end it isn't that hard to line out ones that are obviously ridiculous and discount large moves that anyone thinking can see were caused by these adjustments rather than reality. Tufts and Middlebury didn't each drop about 15 spots in 5 years, no public is really a T20, etc. And most of all, you can't be that granular in the first place.


There’s this weird narrative that USNWR went too social justice, but the only 2025 metrics close to what you describe are the Pell grant graduation rate and graduation performance, which come to just 11%. That’s it. Nationwide, 1/3 of college students are Pell recipients. It’s kind of absurd not to care one bit if 1/3 of a student body is performing more poorly than the rest because of economic factors. Not only does the bottom 1/3 count, but their being miserable would diminish the overall experience on campus for the other 2/3. I don’t agree with all of their ranks (far from it), but I don’t have to for them to be the best in the business. They just have to be better than the competition, which they are. I agree it’s odd when a school drops 15 spots over 5 years, but that happens more often and to a far greater extent (I’ve seen over 100 spots!) in other rankings. I agree it’s common for people to take rank too literally, but those who do so ignore the publication’s own advice:

“Many other factors, including some that can't be measured, should figure into your decision… Study the data that accompanies the actual rankings. You should not use the rankings as the sole basis for deciding on one school over another.”


They dropped factors that actually matter as well. Average class size was dropped to help publics. Pell grant share has zero to do with educational quality, but faculty with terminal degrees does; one was added, one was dropped. It adds up to about 35% of the ranking all told.


Not sure I follow, but only 11% of the current ranking weighting specifically pertains to Pell grant recipients, who make up a large portion (1/3) of college students. That 11% involves graduation rates, which is certainly relevant to the quality of the educational experience for that third. If you are wondering why that third should get the extra emphasis, it’s because schools otherwise can engineer higher ranks by under-admitting from financially constrained families, who historically have lower grad rates, often because of family hardship or the burden of having to hold down a job while in school, and not academic performance per se. Families are still free to rule out publics if they don’t want them, but it’s a positive that privates now have less of an artificial and unintentionally created incentive (by USNWR itself, in its older methodology) to under-admit the 1/3 most financially constrained. This was an example of USNWR listening to the universities themselves, who proposed the change so they wouldn’t be penalized for doing what they felt was proper. Incidentally, per my 2011 copy of their guide, Pell grant recipients were receiving some extra emphasis even back then; they were less transparent about the exact amount, but it was under 7.5%.


+1. The irony of the people who fly off the handle about the Pell grant criteria is that these are the same people around here who only value universities based on what they provide to undergrads and the undergrad experience (as opposed to research, grad school, etc.). And the Pell grant criteria are actually a measure of the support provided to, and the educational experience of, undergrads, since these are the kids with the least support outside of school. And it incentivizes schools to focus more on undergraduate education and services. Ya know, the whole thing these people keep claiming to care about.


The Pell Grant data is strongly weighted by the percentage of students who are Pell Grant recipients. In many cases, there are schools with lower Pell Grant percentages that have better Pell Grant student outcomes, and other student outcomes (graduation rate, etc.), than schools that are now ranked above them. In other words, whether you are a Pell Grant student or not, those lower-ranked schools may be better for you based on many other metrics.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:More educated parents and students aren't blindly following US News rankings any more. Since their last release was so heavily mocked and with availability of information about a school's outcomes and strength of majors and strength of students academic achievements, US News is a much more minor role player now than in the past.

Parents look at a school's cost, name brand, SAT averages, acceptance rate, and outcomes a lot more than some outdated magazine.



All of which are contained in....the USNews rankings! 🙂

It's a likely "first stop" for parents with other research options later. It is what it is.


No, it's missing many important factors and contains some insignificant ones, such as how many Pell grant students a school has.
However, it's still a nice reference for an initial screening.

At the end of every year, we get the actual result of the students' collective decisions.
The result is reflected in the combination of admission rate, yield rate, cohort quality, retention rate, and graduation rate.


That's why, although USNWR removed factors like acceptance rate, parents and students still pay close attention to it and generally consider the competitive schools to be good schools.



All of the top 30 have either low or extremely low acceptance rates.


No, schools like UF, UVA, UCSD, and UT have significantly higher acceptance rates than schools like Tufts, BU, Wake Forest, Northeastern, and BC, because students chose that way.



Acceptance rate can be manipulated (by inducing more applicants, most of them unqualified), so they correctly dropped it. It isn't the best indicator of true selectivity and the quality of the enrolled student body.


Hence I've said 1,000 times: the combination of acceptance rate, yield rate, cohort quality, retention rate, and graduation rate.


But continue to include the thing that can be manipulated? And don’t factor in what industry experts think about the quality of education taking place?

The problem with what you propose is that it can reward schools where good students flock mostly because of non-academic factors like location or dorms.

USNWR isn’t perfect but it’s the best option available and it’s not close.


DP. It is fair to point out that US News does not match the behavior of the top students. Princeton is almost always #1 in US News but most people would rather go to Harvard, Stanford, or MIT. Hopkins is tied with Caltech in US News...but not in the real world. Etc.

Mismatches between US News and behavior happen because people disagree with the US News methodology.


Lots of kids would rather go to Princeton. And this is the problem with your whole “we get a real world result every year.” We actually don’t, because it’s messy, and it’s why you can’t answer anyone’s question about where schools fall relative to others and how your criteria should be applied.

Like 10 pages here of just absolute nonsense.


Some kids would rather go to Princeton but you are misinformed if you think people want to go there as much as Harvard, Stanford, or MIT. Look at the yields and acceptance rates.


DP. I think academia broadly agrees with USNWR on Princeton over Harvard for undergrad. Honestly I think if there were a secret vote amongst even Harvard’s own faculty Princeton might win.


If you polled academia, you would find that, outside of engineering and CS, they would put the top SLACs above everyone else for undergraduate education. They will tell you that the small class sizes and teaching model create better thinkers.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:More educated parents and students aren't blindly following US News rankings any more. Since their last release was so heavily mocked and with availability of information about a school's outcomes and strength of majors and strength of students academic achievements, US News is a much more minor role player now than in the past.

Parents look at a school's cost, name brand, SAT averages, acceptance rate, and outcomes a lot more than some outdated magazine.



All of which are contained in....the USNews rankings! 🙂

It's a likely "first stop" for parents with other research options later. It is what it is.


No, it's missing many important factors and contains some insignificant ones, such as how many Pell grant students a school has.
However, it's still a nice reference for an initial screening.

At the end of every year, we get the actual result of the students' collective decisions.
The result is reflected in the combination of admission rate, yield rate, cohort quality, retention rate, and graduation rate.


That's why, although USNWR removed factors like acceptance rate, parents and students still pay close attention to it and generally consider the competitive schools to be good schools.



All of the top 30 have either low or extremely low acceptance rates.


No, schools like UF, UVA, UCSD, and UT have significantly higher acceptance rates than schools like Tufts, BU, Wake Forest, Northeastern, and BC, because students chose that way.



Acceptance rate can be manipulated (by inducing more applicants, most of them unqualified), so they correctly dropped it. It isn't the best indicator of true selectivity and the quality of the enrolled student body.


Hence I've said 1,000 times: the combination of acceptance rate, yield rate, cohort quality, retention rate, and graduation rate.


But continue to include the thing that can be manipulated? And don’t factor in what industry experts think about the quality of education taking place?

The problem with what you propose is that it can reward schools where good students flock mostly because of non-academic factors like location or dorms.

USNWR isn’t perfect but it’s the best option available and it’s not close.


USNWR significantly hurt its credibility when it started implicitly adjusting its rankings, first to get a couple of publics into the top 25 and then by adding social-justice factors like economic mobility and Pell grant metrics, again designed to bump publics. That said, the others are even worse.

But in the end, it isn't that hard to cross out the ones that are obviously ridiculous and discount large moves that anyone thinking clearly can see were caused by these adjustments rather than reality. Tufts and Middlebury didn't each drop about 15 spots in 5 years, no public is really a T20, etc. And most of all, you can't be that granular in the first place.


There’s this weird narrative that USNWR went too social justice, but the only 2025 metrics close to what you describe are the Pell grant graduation rate and graduation performance, which come to just 11%. That’s it. Nationwide, 1/3 of college students are Pell recipients. It’s kind of absurd not to care one bit if 1/3 of a student body is performing more poorly than the rest because of economic factors. Not only does the bottom 1/3 count, but their being miserable would diminish the overall experience on campus for the other 2/3. I don’t agree with all of their ranks (far from it), but I don’t have to for them to be the best in the business. They just have to be better than the competition, which they are. I agree it’s odd when a school drops 15 spots over 5 years, but that happens more often and to a far greater extent (I’ve seen over 100 spots!) in other rankings. I agree it’s common for people to take rank too literally, but those who do so ignore the publication’s own advice:

“Many other factors, including some that can't be measured, should figure into your decision… Study the data that accompanies the actual rankings. You should not use the rankings as the sole basis for deciding on one school over another.”


They dropped factors that actually matter as well. Average class size was dropped to help publics. Pell grant share has zero to do with educational quality, but faculty with terminal degrees does; one was added, one was dropped. It adds up to about 35% of the ranking all told.


Not sure I follow, but only 11% of the current ranking weighting specifically pertains to Pell grant recipients, who make up a large portion (1/3) of college students. That 11% involves graduation rates, which is certainly relevant to the quality of the educational experience for that third. If you are wondering why that third should get the extra emphasis, it’s because schools otherwise can engineer higher ranks by under-admitting from financially constrained families, who historically have lower grad rates, often because of family hardship or the burden of having to hold down a job while in school, and not academic performance per se. Families are still free to rule out publics if they don’t want them, but it’s a positive that privates now have less of an artificial and unintentionally created incentive (by USNWR itself, in its older methodology) to under-admit the 1/3 most financially constrained. This was an example of USNWR listening to the universities themselves, who proposed the change so they wouldn’t be penalized for doing what they felt was proper. Incidentally, per my 2011 copy of their guide, Pell grant recipients were receiving some extra emphasis even back then; they were less transparent about the exact amount, but it was under 7.5%.



Good points. But look deeper into how US News measures things. They do not take into account the very generous non-loan financial aid that high-endowment private universities often give their students. More often than not, these students don't need to apply for Pell Grants because the university has already covered everything. But those private universities got penalized by the updated US News algorithm. It actually incentivizes private universities to give LESS financial aid in order to force more of their students toward federal financial aid. No matter how you look at it, it's very clear that US News very purposely changed their algorithm to boost public schools. Which, fine. It's their magazine. But there is a distinct difference between pre-2023 rankings and today.


They still have to apply for the Pell grant; none of these schools are giving that money back to the government.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:More educated parents and students aren't blindly following US News rankings any more. Since their last release was so heavily mocked and with availability of information about a school's outcomes and strength of majors and strength of students academic achievements, US News is a much more minor role player now than in the past.

Parents look at a school's cost, name brand, SAT averages, acceptance rate, and outcomes a lot more than some outdated magazine.



All of which are contained in....the USNews rankings! 🙂

It's a likely "first stop" for parents with other research options later. It is what it is.


No, it's missing many important factors and contains some insignificant ones, such as how many Pell grant students a school has.
However, it's still a nice reference for an initial screening.

At the end of every year, we get the actual result of the students' collective decisions.
The result is reflected in the combination of admission rate, yield rate, cohort quality, retention rate, and graduation rate.


That's why, although USNWR removed factors like acceptance rate, parents and students still pay close attention to it and generally consider the competitive schools to be good schools.



All of the top 30 have either low or extremely low acceptance rates.


No, schools like UF, UVA, UCSD, and UT have significantly higher acceptance rates than schools like Tufts, BU, Wake Forest, Northeastern, and BC, because students chose that way.



Acceptance rate can be manipulated (by inducing more applicants, most of them unqualified), so they correctly dropped it. It isn't the best indicator of true selectivity and the quality of the enrolled student body.


Hence I've said 1,000 times: the combination of acceptance rate, yield rate, cohort quality, retention rate, and graduation rate.


But continue to include the thing that can be manipulated? And don’t factor in what industry experts think about the quality of education taking place?

The problem with what you propose is that it can reward schools where good students flock mostly because of non-academic factors like location or dorms.

USNWR isn’t perfect but it’s the best option available and it’s not close.


USNWR significantly hurt its credibility when it started implicitly adjusting its rankings, first to get a couple of publics into the top 25 and then by adding social-justice factors like economic mobility and Pell grant metrics, again designed to bump publics. That said, the others are even worse.

But in the end, it isn't that hard to cross out the ones that are obviously ridiculous and discount large moves that anyone thinking clearly can see were caused by these adjustments rather than reality. Tufts and Middlebury didn't each drop about 15 spots in 5 years, no public is really a T20, etc. And most of all, you can't be that granular in the first place.


There’s this weird narrative that USNWR went too social justice, but the only 2025 metrics close to what you describe are the Pell grant graduation rate and graduation performance, which come to just 11%. That’s it. Nationwide, 1/3 of college students are Pell recipients. It’s kind of absurd not to care one bit if 1/3 of a student body is performing more poorly than the rest because of economic factors. Not only does the bottom 1/3 count, but their being miserable would diminish the overall experience on campus for the other 2/3. I don’t agree with all of their ranks (far from it), but I don’t have to for them to be the best in the business. They just have to be better than the competition, which they are. I agree it’s odd when a school drops 15 spots over 5 years, but that happens more often and to a far greater extent (I’ve seen over 100 spots!) in other rankings. I agree it’s common for people to take rank too literally, but those who do so ignore the publication’s own advice:

“Many other factors, including some that can't be measured, should figure into your decision… Study the data that accompanies the actual rankings. You should not use the rankings as the sole basis for deciding on one school over another.”


They dropped factors that actually matter as well. Average class size was dropped to help publics. Pell grant share has zero to do with educational quality, but faculty with terminal degrees does; one was added, one was dropped. It adds up to about 35% of the ranking all told.


Not sure I follow, but only 11% of the current ranking weighting specifically pertains to Pell grant recipients, who make up a large portion (1/3) of college students. That 11% involves graduation rates, which is certainly relevant to the quality of the educational experience for that third. If you are wondering why that third should get the extra emphasis, it’s because schools otherwise can engineer higher ranks by under-admitting from financially constrained families, who historically have lower grad rates, often because of family hardship or the burden of having to hold down a job while in school, and not academic performance per se. Families are still free to rule out publics if they don’t want them, but it’s a positive that privates now have less of an artificial and unintentionally created incentive (by USNWR itself, in its older methodology) to under-admit the 1/3 most financially constrained. This was an example of USNWR listening to the universities themselves, who proposed the change so they wouldn’t be penalized for doing what they felt was proper. Incidentally, per my 2011 copy of their guide, Pell grant recipients were receiving some extra emphasis even back then; they were less transparent about the exact amount, but it was under 7.5%.



Good points. But look deeper into how US News measures things. They do not take into account the very generous non-loan financial aid that high-endowment private universities often give their students. More often than not, these students don't need to apply for Pell Grants because the university has already covered everything. But those private universities got penalized by the updated US News algorithm. It actually incentivizes private universities to give LESS financial aid in order to force more of their students toward federal financial aid. No matter how you look at it, it's very clear that US News very purposely changed their algorithm to boost public schools. Which, fine. It's their magazine. But there is a distinct difference between pre-2023 rankings and today.


+++ Agree, and I am a former Pell grantee who got into an Ivy when the bar was not lowered for me, as were two of my three closest friends from our T5 law school. Different elite private undergrads with excellent financial aid changed our lives, and we did not have the stigma of getting a boost because we were poor. We knew we belonged there and got in on smarts. The boost for FG/LI has gotten way out of hand, and it is a detriment to the students. Mental health suffers when you get to the Ivy trying to go top-law with 1300s vs. all your peers with 1530+. Back then, FGLIs had the raw ability; we often had lower-quality high school courses, but we could catch up just fine because our score profiles were essentially the same. Plenty of poor kids score extremely well. The 200-250 point boost is not needed. Same with other DEI initiatives, as well as all of TO. It causes more problems for those it is trying to help. The academic probation and other data from top schools prove it.


Plenty of actual data is available that proves you wrong when it comes to rural and FGLI applicants. Not saying that I agree with the policies as written, but the evidence is there.

I was like you: rural FGLI, scored great, and did great. It was much simpler then; most people stayed regional, and no Common App meant you did limited applications. The EC stuff is out of hand because of all the apps, etc. It was easier for us.

When I applied to Cornell, acceptance rates were over 30%, and my stats were far enough above the averages that I knew I would get in, and I did. Today even the best candidate cannot be sure.
Anonymous
1. Take US News Top 50
2. Remove these 5: UC Davis, UC Irvine, UC Santa Barbara, Wisconsin, and Illinois
3. Insert these 5: BU, Northeastern, William and Mary, Wake Forest, Rochester
There’s your top 50…