Why the USNWR best college list is ridiculous

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
The most heavily weighted factor, "peer assessment," is 20%. Ha!


So? Colleges know a lot about each other. It’s a whole industry. People work at or attend multiple institutions over time, meeting others who have done the same. It’s not unlike people in whatever line of business you’re familiar with knowing more about their competition than the general public does. Peer review is integral to academia in general, and this peer assessment category, which only USNWR has meaningful data on, is a big part of why their rankings have fewer sanity-test failures than those from other publications (like WSJ’s, which puts UC Merced and San Jose State over Duke, Northwestern, UChicago, and Johns Hopkins). WSJ, Forbes, and even Niche over-rely on data from only a few years out of college in order to do something different. All rankings are flawed, but USNWR is the least flawed, the least volatile, and more influential than all the others combined.

This makes no sense to me.
No, Amherst knows very little about Pomona and Pomona little about Wellesley. It’s not like their offices are in communication with each other. Their professors may never even have heard of the other school, having attended universities rather than liberal arts colleges as undergraduates.

Many of my professors at Williams hadn’t heard of the school until they put in their job applications. These are people with PhDs from Harvard and UChicago.


I actually don't believe you. Maybe 30 years ago when there was no internet, but today all professors know of Pomona's rise on USNWR.


When I applied to college in 1989, Pomona was No. 4 on the USNWR liberal arts college list, so it’s not like this is a new thing. I think PP is just ignorant — many of my fellow students from Amherst are professors at USC, Stanford, Hopkins, and GW, as well as at Amherst itself. PhD programs are riddled with grads from Amherst, Williams, etc. — sending kids to academia or law school is what these schools do best. Most of these professors know each other, go to conferences with each other, and read/assign each other’s books, and many are trying to parlay their job at one school into a job at another.
Anonymous
Anonymous wrote:Haven't read the thread, but yes: USNWR is ridiculous. Not because of this year's criteria, or last year's, or any year before that.

But because the very idea of creating an ordinal ranking of something as complex and personal and multivariate as education is absurd -- especially given that each institution serves a huge and dynamic array of individuals, each of whom has distinct goals, interests, backgrounds, and approaches to learning.


+1

It is mainly for ego stroking and internet spats. The institutions really do care, though - too much. My DH is in higher-ed advancement and called to let me know the rankings were out; I thought he meant college football. Big whoop.
Anonymous
I would love a better ranking or assessment of engineering schools. I feel like for general liberal arts, or even something like pre-med, I have a general sense of where the schools stand. But when it comes to something like engineering, I have no idea once you get past the top few programs. Does anyone do a more credible ranking of the top engineering schools?
Anonymous
Anonymous wrote:I would love a better ranking or assessment of engineering schools. I feel like for general liberal arts, or even something like pre-med, I have a general sense of where the schools stand. But when it comes to something like engineering, I have no idea once you get past the top few programs. Does anyone do a more credible ranking of the top engineering schools?


USNWR does subject rankings. Here's engineering: https://www.usnews.com/best-colleges/rankings/engineering
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
The most heavily weighted factor, "peer assessment," is 20%. Ha!


So? Colleges know a lot about each other. It’s a whole industry. People work at or attend multiple institutions over time, meeting others who have done the same. It’s not unlike people in whatever line of business you’re familiar with knowing more about their competition than the general public does. Peer review is integral to academia in general, and this peer assessment category, which only USNWR has meaningful data on, is a big part of why their rankings have fewer sanity-test failures than those from other publications (like WSJ’s, which puts UC Merced and San Jose State over Duke, Northwestern, UChicago, and Johns Hopkins). WSJ, Forbes, and even Niche over-rely on data from only a few years out of college in order to do something different. All rankings are flawed, but USNWR is the least flawed, the least volatile, and more influential than all the others combined.

This makes no sense to me.
No, Amherst knows very little about Pomona and Pomona little about Wellesley. It’s not like their offices are in communication with each other. Their professors may never even have heard of the other school, having attended universities rather than liberal arts colleges as undergraduates.

Many of my professors at Williams hadn’t heard of the school until they put in their job applications. These are people with PhDs from Harvard and UChicago.


I actually don't believe you. Maybe 30 years ago when there was no internet, but today all professors know of Pomona's rise on USNWR.


When I applied to college in 1989, Pomona was No. 4 on the USNWR liberal arts college list, so it’s not like this is a new thing. I think PP is just ignorant — many of my fellow students from Amherst are professors at USC, Stanford, Hopkins, and GW, as well as at Amherst itself. PhD programs are riddled with grads from Amherst, Williams, etc. — sending kids to academia or law school is what these schools do best. Most of these professors know each other, go to conferences with each other, and read/assign each other’s books, and many are trying to parlay their job at one school into a job at another.



Pomona was listed in the top ten in 1989 but wasn't No. 4 until 2023.
Anonymous
Anonymous wrote:
Anonymous wrote:...Vanderbilt University was so thoroughly enraged by its five-spot dip in the U.S. News standings ...


Understandable, since some people put so much weight on them, but hopefully there will be a growing understanding that while there's clearly a real and significant difference between the educational environment at #10 and #110, there is effectively no difference between #100 and #110, or between #10 and #15.



I’m not even sure I’d agree with this. A difference in cachet? Yes. Educational environment? Who knows. It’s not impossible that 110 is better than 10 in some departments.
Anonymous
Anonymous wrote:it's still the gold standard, whether you like it or not.


lol, no one serious uses this list as anything but entertainment.
Anonymous
Anonymous wrote:Simple solution. Put selectivity (test scores and class rank of enrolled students) back into the equation after 2025 (since many test-optional schools are reverting by then). Those that remain test-optional will be dinged.


Those numbers are grossly manipulated by schools and do not show what you think they show.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:...Vanderbilt University was so thoroughly enraged by its five-spot dip in the U.S. News standings ...


Understandable, since some people put so much weight on them, but hopefully there will be a growing understanding that while there's clearly a real and significant difference between the educational environment at #10 and #110, there is effectively no difference between #100 and #110, or between #10 and #15.



I’m not even sure I’d agree with this. A difference in cachet? Yes. Educational environment? Who knows. It’s not impossible that 110 is better than 10 in some departments.


+1 I'm guessing PP simply doesn't know enough about schools throughout the ranking.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:it's still the gold standard, whether you like it or not.


For what? This list does not guarantee outcomes. I attended a shi----y big Midwestern university with no prestige. But you would not know it unless I told you. And there are many like me.


The problem is that the people who have verifiably the best financial outcomes all went to top schools (or dropped out of them). Gates, Zuckerberg, Bezos, Altman, Schwarzman, Griffin, et al.

Nobody ever says what their own outcome actually is, nor names the friend/cousin/friend's cousin who supposedly also went to some no-name shitty university and is now super-successful.

So, if all anyone will offer on DCUM is anecdotes without being willing to provide any specifics...you won't persuade anyone.


OK, but these are 1 out of millions. The odds of anyone ending up like them are supremely low. These people are also likely sociopaths.


I am not disputing they are 1 out of millions (though, I do dispute they are likely sociopaths).

My only point is that there are too many folks on DCUM who point to some supposed great outcome for themselves or someone they know, without indicating who that person is or what exactly their great outcome may be. Just "trust me, I or my best friend's third cousin is really successful even though they attended some no-name school".



You could, if you chose, list the giants who did not attend those schools. You chose not to.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
The most heavily weighted factor, "peer assessment," is 20%. Ha!


So? Colleges know a lot about each other. It’s a whole industry. People work at or attend multiple institutions over time, meeting others who have done the same. It’s not unlike people in whatever line of business you’re familiar with knowing more about their competition than the general public does. Peer review is integral to academia in general, and this peer assessment category, which only USNWR has meaningful data on, is a big part of why their rankings have fewer sanity-test failures than those from other publications (like WSJ’s, which puts UC Merced and San Jose State over Duke, Northwestern, UChicago, and Johns Hopkins). WSJ, Forbes, and even Niche over-rely on data from only a few years out of college in order to do something different. All rankings are flawed, but USNWR is the least flawed, the least volatile, and more influential than all the others combined.

This makes no sense to me.
No, Amherst knows very little about Pomona and Pomona little about Wellesley. It’s not like their offices are in communication with each other. Their professors may never even have heard of the other school, having attended universities rather than liberal arts colleges as undergraduates.

Many of my professors at Williams hadn’t heard of the school until they put in their job applications. These are people with PhDs from Harvard and UChicago.


I actually don't believe you. Maybe 30 years ago when there was no internet, but today all professors know of Pomona's rise on USNWR.


When I applied to college in 1989, Pomona was No. 4 on the USNWR liberal arts college list, so it’s not like this is a new thing. I think PP is just ignorant — many of my fellow students from Amherst are professors at USC, Stanford, Hopkins, and GW, as well as at Amherst itself. PhD programs are riddled with grads from Amherst, Williams, etc. — sending kids to academia or law school is what these schools do best. Most of these professors know each other, go to conferences with each other, and read/assign each other’s books, and many are trying to parlay their job at one school into a job at another.



Pomona was listed in the top ten in 1989 but wasn't No. 4 until 2023.


Pomona was ranked No. 4 in 1990 and 2021. Between those years, it was ranked between No. 5 and No. 8.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:it's still the gold standard, whether you like it or not.


Rankings are not useful and the USNWR one is particularly useless.

“Art & Science Group, a higher education consultancy, found that some 40 percent of students do not use rankings at all when they are picking colleges and that only 3 percent turn to them through the whole of their college searches.”


What on earth is “Art & Science Group”? All the parents at our kids’ school talk about rankings and most definitely use USNWR. You’re kidding yourself by pretending otherwise.
DP


DP: I have never heard a parent mention rankings when discussing a list of schools in real life. They talk about what their kid is looking for, chances of getting in, safeties they love and why. My kid ranked what he is looking for in a school, popped that into CollegeVine, and only three schools in the top 25 fit his list of wants; two he didn't like for other reasons, and the one left is, of course, a super reach, so he probably won't apply unless he doesn't get into his EDs. It would make no sense to apply to schools based on rank when he already knows it's not what he wants. Far more schools on the LAC list meet his wants, and he sorted those by other factors.
Anonymous
Anonymous wrote:Also, what does it all mean? It is mostly ego and...what?

Aside from some of the few places where students have access to amazing undergrad faculty (and that varies within schools by department), school is ultimately what you put into it.

I think having a ranking of tiers of schools would make more sense, without ranking the schools within each band/tier.


That's not much different and just as senseless. Unless a kid really doesn't care much about where she goes, there are too many personal preferences to factor in.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
The most heavily weighted factor, "peer assessment," is 20%. Ha!


So? Colleges know a lot about each other. It’s a whole industry. People work at or attend multiple institutions over time, meeting others who have done the same. It’s not unlike people in whatever line of business you’re familiar with knowing more about their competition than the general public does. Peer review is integral to academia in general, and this peer assessment category, which only USNWR has meaningful data on, is a big part of why their rankings have fewer sanity-test failures than those from other publications (like WSJ’s, which puts UC Merced and San Jose State over Duke, Northwestern, UChicago, and Johns Hopkins). WSJ, Forbes, and even Niche over-rely on data from only a few years out of college in order to do something different. All rankings are flawed, but USNWR is the least flawed, the least volatile, and more influential than all the others combined.

This makes no sense to me.
No, Amherst knows very little about Pomona and Pomona little about Wellesley. It’s not like their offices are in communication with each other. Their professors may never even have heard of the other school, having attended universities rather than liberal arts colleges as undergraduates.

Many of my professors at Williams hadn’t heard of the school until they put in their job applications. These are people with PhDs from Harvard and UChicago.


I actually don't believe you. Maybe 30 years ago when there was no internet, but today all professors know of Pomona's rise on USNWR.


When I applied to college in 1989, Pomona was No. 4 on the USNWR liberal arts college list, so it’s not like this is a new thing. I think PP is just ignorant — many of my fellow students from Amherst are professors at USC, Stanford, Hopkins, and GW, as well as at Amherst itself. PhD programs are riddled with grads from Amherst, Williams, etc. — sending kids to academia or law school is what these schools do best. Most of these professors know each other, go to conferences with each other, and read/assign each other’s books, and many are trying to parlay their job at one school into a job at another.



Pomona was listed in the top ten in 1989 but wasn't No. 4 until 2023.


Pomona was ranked No. 4 in 1990 and 2021. Between those years, it was ranked between No. 5 and No. 8.


A correction to my post: Pomona was also ranked No. 4 in 2016, 2014, 2013, 2012, and 2004.

Source: https://andyreiter.com/datasets/

Anonymous
Anonymous wrote:
Anonymous wrote:There is absolutely nothing merit-based (academics) in the rankings.

https://www.usnews.com/education/best-colleges/articles/how-us-news-calculated-the-rankings

For schools that require scores there was a 5% weight factor -- and for those without scores, 0%.

That was it. The other 95% was just pure bullsh*t.

The most heavily weighted factor, "peer assessment," is 20%. Ha!

Student-faculty ratio (prob one of the most important factors for undergrads)...a mere 3%

nothing on GPA/selectivity, etc.




The list became a joke over the years.


It's consistently accurate. Whether some people like it or not, it's the gold standard.