
There is absolutely nothing merit-based (academics) in the rankings.
https://www.usnews.com/education/best-college...culated-the-rankings For schools that require scores there was a 5% weight factor, and for those without scores, 0%. That was it. The other 95% was just pure Bullsh*t. The most heavily weighted factor, "peer assessment," at 20%? Ha! Student-faculty ratio (probably one of the most important factors for undergrads)? A mere 3%. Nothing on GPA, selectivity, etc. |
The list became a joke over the years. |
And 11% of the weight is Pell Grant data.
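For anyone curious how weights like these turn into a ranking, here is a minimal sketch of a weighted composite score. This is not USNWR's actual formula; the indicator names, the normalization to a 0-1 scale, and the example values are all illustrative assumptions.

```python
# Hypothetical per-indicator weights (fractions of the total score).
# These mirror the figures quoted in the thread, not a published methodology.
WEIGHTS = {
    "peer_assessment": 0.20,        # reputation survey
    "pell_graduation": 0.11,        # Pell Grant retention/graduation data
    "test_scores": 0.05,            # only counted for schools requiring scores
    "student_faculty_ratio": 0.03,  # assumed already normalized so lower ratio = higher score
    # ...remaining weight would be spread across other indicators
}

def composite_score(indicators: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of normalized indicator scores (each assumed to be in 0-1)."""
    return sum(weights[name] * indicators.get(name, 0.0) for name in weights)

# Example: a hypothetical school with a strong reputation but no test-score data.
school = {
    "peer_assessment": 0.9,
    "pell_graduation": 0.7,
    "test_scores": 0.0,             # test-optional: this 5% contributes nothing
    "student_faculty_ratio": 0.8,
}
print(round(composite_score(school, WEIGHTS), 3))  # 0.281
```

The point the sketch makes is the one argued above: with academic inputs weighted at a few percent (or zero for test-optional schools), the composite is dominated by survey and outcome measures.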
From today’s NYT article:

“Few franchises in American higher education are as contentious as the U.S. News rankings. Over the decades, their publisher has faced trouble with manipulated data, complaints about murky methodologies, accusations of revenge and the foundational question of whether it is appropriate to rank colleges. To U.S. News, which retired its print newsmagazine in 2010, the rankings are a bastion of its largely bygone influence. They are also a source of millions of dollars each year, as universities pay licensing fees to promote how they fared.”

“Bowing to critiques that its model did not adequately account for ideals that administrators often talk about, like promoting social mobility, U.S. News put more weight on retention and graduation rates for people with need-based Pell grants. The result was that some public universities surged in the rankings, with more than a dozen jumping 50 spots or more, while a handful of private schools saw their stature drop.”

“Separately, Vanderbilt University was so thoroughly enraged by its five-spot dip in the U.S. News standings last year that the school commissioned a review of the methodologies of five rankings services. The resulting report argued that those rankings were rife with flaws, including faulty data and subjectivity.” |
Yet, every college list still has the same schools ranked in the top 25, with the deck chairs shuffled a little bit.
Do you honestly think schools like Princeton, Harvard, Caltech, and MIT would fall to the bottom if you assigned higher weights to your preferred ranking criteria? |
It's still the gold standard, whether you like it or not. |
Back in 1983, the inaugural U.S. News & World Report rankings came out. They were relatively simple and focused primarily on national universities.
In the 1980s, the rankings relied on data such as faculty-to-student ratios, standardized test scores of admitted students, and financial resources per student to assess the quality of the institution. Back then, the methodology was relatively straightforward, especially compared with today's rankings. As the years went by, the rankings became more complicated and data-driven. Throughout the 1990s, U.S. News incorporated additional factors such as graduation rates, retention rates, the peer assessment survey, and alumni giving rates. They took the merit/selectivity out of it. |
Rankings are not useful and the USNWR one is particularly useless. “Art & Science Group, a higher education consultancy, found that some 40 percent of students do not use rankings at all when they are picking colleges and that only 3 percent turn to them through the whole of their college searches.” |
Tips given to colleges to move up the rankings---oh dear lord, Insta world:
CONTROL YOUR BRAND NARRATIVE WITH STRATEGIC CONTENT MARKETING. By creating compelling stories about your institution’s successes, particularly in areas valued by the rankings, you can influence perceptions and indirectly impact rankings. Make sure that all the success stories you have are out there and available to the U.S. News & World Report Best Colleges Rankings through a well-planned and well-executed content marketing strategy. |
Understandable, since some people put so much weight on them, but hopefully there will be a growing understanding that while there's clearly a real and significant difference between the educational environment at #10 and #110, there is effectively no difference between #100 and #110, or between #10 and #15. |
It was the internet age that really shook up the list, with certain schools actively running huge campaigns to climb the rankings: driving up application numbers, massive volumes of mailings/emails, etc. |
“Selectivity” is so game-able. |
You can't game having a class full of 1500-SAT kids vs. 1200-SAT kids.
BUT--we have TEST OPTIONAL schools (TO last year) in the top 10. Next cycle, all TO schools should get dinged. There should be no TO in the T1-T20. It's not really elite when you don't have academic standards for admission. |
What's most ridiculous is the premise that there can be any objective ranking of colleges.
It's like ranking the best places to live in America. Online dorks may argue about median earnings, crime rates, etc. But, at the end of the day, it really doesn't make one city any better for any individual than another. BTW, USNWR does just this. Are you dorks packing your bags for Naples, FL or Boise, ID? https://realestate.usnews.com/places/rankings/best-places-to-live |
Schools that kids don't want to attend after admission should also be considered. If a school has a very low yield rate, I'm not sure how it could rank high. It's not a first-choice destination, yet it ranks high. Usually desirability factors into a list of the best. But, yes, I realize schools play games around that: admitting kids they know will yield, or calling admits to confirm they'll accept so declines don't factor into the numbers.