WSJ Rankings 2025

Anonymous
If the colorful Top 20 rankings in various academic and non-academic areas that I keep seeing on social media are representative of how content creators now chase eyeballs, I guess that explains the WSJ 2025 rankings.

For me, the pursuit of an undergraduate education is a blend of the actual educational experience, establishing the foundation for a career path (for graduate school or the start of a career), and social / personal development. Of these, that third part of the equation is of utmost importance.

I already know that my children have the intellectual chops and personal motivations to seek out a successful, impactful career. What I hope for them to gain from an undergraduate education is to position themselves to help make the world a better place while having the best college experience possible. Treating college like a make-or-break pre-professional Hunger Games, where average pay rate after five or ten years (average?) is vitally important, really suggests that college is a job for your kid, and little more.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Only nine of the top 20 USNWR made top 20 on this list.

Princeton
Stanford
Yale
MIT
Harvard
Berkeley
Penn
Columbia
Notre Dame


OK, had a bit of time to cross-reference the USNWR top 50.

Those USNWR top 50 that made WSJ top 50:

Princeton
MIT
Harvard
Stanford
Yale
Penn
Cal Tech
Duke
Brown
Cornell
Columbia
Berkeley
Rice
Vanderbilt
Notre Dame
Lehigh
Virginia Tech

Michigan
Georgetown
Wash U
UVA
USC
UC San Diego
UC Davis
UT Austin
UC Irvine
Gtech

Those USNWR top 50 that didn't make the WSJ top 50, along with their WSJ rank:

Hopkins 92
Northwestern 62
Chicago 75
UCLA 79
Dartmouth 57
UNC 59
Emory 103
CMU 56
U Florida 83
NYU 273
UC Santa Barbara 179
Wisconsin 73
U Illinois 53
Boston College 100
Wake Forest 137
U Rochester 316
William and Mary 178
Brandeis 335
Case Western 224
Tulane 451
Northeastern 168
U Georgia 151
Ohio State 329


FIFY


Thanks. I did count and did not get fifty, but ran out of time and didn't want to go fishing for the missing ones. Glad you added them.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Only nine of the top 20 USNWR made top 20 on this list.

Princeton
Stanford
Yale
MIT
Harvard
Berkeley
Penn
Columbia
Notre Dame


OK, had a bit of time to cross-reference the USNWR top 50.

Those USNWR top 50 that made WSJ top 50:

Princeton
MIT
Harvard
Stanford
Yale
Penn
Cal Tech
Duke
Brown
Cornell
Columbia
Berkeley
Rice
Vanderbilt
Notre Dame
Michigan
Georgetown
Wash U
UVA
USC
UC San Diego
UC Davis
UT Austin
UC Irvine
Gtech

Those USNWR top 50 that didn't make the WSJ top 50, along with their WSJ rank:

Hopkins 92
Northwestern 62
Chicago 75
UCLA 79
Dartmouth 57
UNC 59
Emory 103
CMU 56
U Florida 83
NYU 273
UC Santa Barbara 179
Wisconsin 73
U Illinois 53
Boston College 100
Wake Forest 137
U Rochester 316
William and Mary 178
Brandeis 335
Case Western 224
Tulane 451
Northeastern 168
U Georgia 151
Ohio State 329




I’m a Wash. U. alum.

It’s fun to see Wash. U. do well on these lists, but any list where Case Western, Rochester, Tulane and Brandeis rank so much lower seems odd.

It’s also hard to trust a list that sets UC Merced over Wash. U. or Wash. U. over the University of Chicago.



Lehigh made both lists. USNEWS 47, WSJ 15


They didn’t include any of the LACs. Someone earlier posted that Claremont McKenna, Davidson, and Harvey Mudd are in the top 20 of both the WSJ and USNWR LAC lists. You could look up the lists and extend that analysis to the overlap of the top 50.


I am the one who posted the list. It is time-consuming. You are welcome to add the LACs to the list.
Anonymous
Anonymous wrote:
Anonymous wrote:I know this may sound crazy, but UC Merced is stealing applicants and students from Yale, Oxford and Cambridge. It is a force.

I know my neighbor, 1600 SAT, 4.0, math olympiads, gunning either for a hedge fund or biomedical research, whose dream school is UC Merced.



That does sound crazy. The student you are describing should be gunning for Stanford.


Never heard of the school until this year. Everything has become too preprofessional. My kid is at Harvard with the goal of getting a PhD in Math. I told her not to think about internships. Money follows passion.
Anonymous
Anonymous wrote:
Anonymous wrote:Here is what people should know related to the methodology. It is not a best college ranking or even an ROI ranking. It is a subjective return on expected earnings ranking. In the background, WSJ creates a score that effectively is subjective as it compares colleges to “peers” that have somewhat similar student intakes. So whatever objective raw score is factual, it is then turned subjective by this segmentation. Then all of the newly formed scores are compared against each other. I confirmed last year that this is what WSJ is actually doing. This is a flawed statistical method, and would not pass a first-year college stats class if it were submitted for a grade.

Another poster likened it to competing in a major golf tournament where someone can apply their handicap whereas a full pro could not. Or, for baseball fans, think of it as WSJ saying the best Texas League baseball team should rank higher than a middling Major League team because the former is first in its class. No one would accept this. And yet, that is what WSJ is doing here.

Last year when it introduced the new methodology, I actually had a series of back-and-forth conversations with the two authors. They explained their methodology twice to me. I asked questions to make sure I understood it (I work in applied mathematics) and then challenged this objective to subjective to scoring to ranking approach. I then asked them to show me where my conclusions about their methodology were wrong. I was willing to learn something. I never heard back from them. In other words, I truly believe they realized that an actual statistician could easily show the massive flaws in their methodology. For a so-called college ranking methodology to not even be good enough to pass muster in an intro college stats class is rather absurd.

It is a shame. Sticking the previously honorable WSJ brand to this untidy piece of work is a disservice to the reader and especially parents of students aiming to learn about colleges. The one thing the ranking is good for is how not to attempt statistical models and inference, and then try to cover your tracks.


**********

The authors obviously didn’t further engage with you because you’re an insufferable twit.

Which of your favored schools tanked in this ranking?


***********

To answer your question -- my "favored" school i.e. my alma mater is one of the top 20 schools in this table, as it was last year. Sorry my friend, your attempt to attack me is a failure. And my analysis remains correct though your opinions of me are not very mature. How old are you?
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:https://www.wsj.com/rankings/college-rankings/best-colleges-2025
https://www.wsj.com/rankings/college-rankings/best-colleges-2024

I have to say, I find it hard to believe in the credibility of a ranking that can have a school in the top 10 one year and outside the top 100 the next. Amherst College: #8 to #120.


I recovered my Wall Street Journal account and figured out why this list is so weird:

- It’s been a bad summer for web traffic and ad sales. A weird list that generates angry traffic beats a good list that gets fewer views.

- Roughly half of the results are based on a *student survey*, even when the team was creating statistics on factors that seem like they should be based on hard Common Data Set numbers or alumni surveys. Example: learning opportunities. The “character building” factor seems to be based on a proprietary set of survey questions that isn’t available online, or at least was hard to find online.

- Some of the more concrete numbers were calculated in unusual ways. Example: 94% of Washington University undergraduates get their bachelor’s degrees within six years, but it got a 71 on that indicator in the WSJ rankings. That’s because the team adjusted the graduation rate for the test scores of the freshmen and the percentage of freshmen with household income over $110,000 per year. So, due to the adjustments, that indicator punishes schools with a lot of high-income-family students with high test scores, even though going to a school with smart rich kids might generally be considered a good thing. MIT gets dinged hard on that factor for the same reason.


These descriptions continue to get crazier and crazier - is THIS how college rankings have always been created? If so, my god.


First, news websites need hits. No one is asking the reporters if these lists make any sense. Hits are God. No stupid lists means no reporters.

Second, I think one thing that happened here is that the WSJ team (which apparently may include one or more people in this thread) came up with the adjustments to try to make certain indicators (example: graduation rates) fairer than usual. But the problem is that the adjustment algorithms weren’t well-designed and are too hard on schools with rich, smart students.

Third, it’s hard to tell how the surveys worked, but the WSJ probably should have figured out a way to calibrate the data by comparing student positivity with some objective school statistic. Example: The WSJ shows Elon University having a higher learning opportunities score than my alma mater, Washington University. Both Elon and Wash. U. have higher learning opportunities scores than Yale, UC Berkeley and the University of Michigan.

I’m sure that many Elon students get a great education, but maybe they’re simply very positive people and Yale students are much more critical. Otherwise, it’s hard to see how Elon could beat Yale, Wash. U., Berkeley and Michigan on learning opportunities.

Maybe I missed something, but I think the income impact factor is another problem. I think the WSJ team adjusted that factor for incoming student test scores and the cost of living in the college’s state, but not for major mix or gender mix. One way to adjust for that would be to base the income factor solely on the income of math majors, to filter out any differences caused by what majors students pick. You figure, any school in the top 500 must have a few math majors, and that’s both a STEM topic and one of the liberal arts. So, both the engineering schools and the liberal arts colleges should have math major data.


Was Elon ranked higher than some Ivies? LOL. That is all you need to know about the list.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Serious question: does anyone believe, even for one second, that there are 270 schools better than NYU?

If not, then these rankings are bile and trash, assembled merely to get a Barnum-like reaction from the masses.


It's not that they are trash...it's that their ranking methodology is drastically different from other rankings', and they really don't give a good explanation of how they are different.

As an example, University of Delaware (#24) gets a graduation rate ranking of 96/100 while Duke (#45) gets a graduation rate ranking of 77/100.

On the surface, it implies that Delaware kids graduate in 4 years at a much higher rate than Duke kids...which most people think sounds crazy. Well, it is crazy. Duke's 4-year graduation rate is 95% while Delaware's is 73%. So, how does Delaware score a 96 while Duke scores a 77? In theory, it's because Duke should have the same graduation rate as Princeton (at 99%), because the kids have similar test scores and demographics. Yet, it doesn't...it's 4 percentage points lower.

On the flip side, the WSJ's statisticians believe Delaware's graduation rate should be much lower than 73%, based on the test scores and other demographics of its students. In fact, the actual rate is much higher than that expectation, so Delaware gets rewarded.

Again, these are useful rankings...but the methodology is quite confusing and not well explained.


If that kind of logic is central to these rankings, I feel confident in dismissing them entirely.


This is the exact logic applied to these rankings, and how the scores were calculated.
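For anyone who wants the mechanics spelled out, here is a minimal sketch of that actual-vs-expected scoring, using the Duke and Delaware graduation rates quoted above. The "expected" rates are hypothetical stand-ins, since WSJ doesn't publish its model's output:

```python
# Illustration of actual-vs-expected graduation scoring.
# Actual 4-year rates are from the post above; the "expected" rates are
# invented stand-ins for whatever WSJ's statistical model predicts.
schools = {
    #             (actual, expected)
    "Duke":       (0.95,   0.99),  # strong intake -> model expects ~Princeton-level rate
    "Delaware":   (0.73,   0.60),  # weaker intake -> model expects a much lower rate
}

def impact_score(actual, expected):
    """Score the school on how far it beats (or misses) its expectation."""
    return actual - expected

for name, (actual, expected) in schools.items():
    print(name, round(impact_score(actual, expected), 2))
# Duke comes out negative (-0.04) and Delaware positive (0.13), which is
# how a 73% school can outscore a 95% school on this one indicator.
```

The published indicator then blends this relative score with the raw graduation rate, at weights WSJ doesn't fully disclose, so the real numbers won't match this toy exactly.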
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Can someone explain how Claremont McKenna is so high and Pomona is so low? Are they really that different? I’ve read the whole thread and the methodology - and as a Colgate alum I’m happy with my college’s placement - but I still think this list is weird.



These schools are extremely different. Pomona is modeled as a West Coast Dartmouth; it produces a high percentage of graduates who go on to PhDs. CMC has preparing for the professions in its mission statement. The school frames the experience so the kids are focused on going for MBB/finance, and it does an amazing job of building everything around that objective. Many successful entrepreneurs value this, so the student body tends to be well connected, and the school incorporates those connections into its network as much as its alums. Ultimately, the outcomes are amazing, and the CMC network appreciates this, in turn setting national SLAC fundraising records (over $1bn raised in the last fundraising drive).

So if your methodology rewards ROI, it goes to CMC. If it rewards PhD production, it goes to Pomona.


Pomona scores very low on salary impact compared to CMC…however, salary impact only looks at kids getting jobs. There are no points deducted for kids who go to grad school.

This implies Pomona grads take very different jobs vs CMC grads.


Pomona has very academic kids. CMC has business and finance kids. Dartmouth is like those two schools put together.


What does that mean from a job perspective? Pomona kids that go into the workforce take jobs at NPOs or where?


Many Pomona kids go to grad school, medical school, or law school. The Pomona grads that go into the workforce go to different companies, not just NPOs. A CS major can go to a tech company, for example. I have heard that Swarthmore is very similar to Pomona and also like this.


So why is Swarthmore ranked 35 and Pomona 170?


Maybe because Swarthmore offers degrees in engineering and Pomona does not. Also, Swarthmore is geographically closer to banking and consulting centers, so maybe they have more kids going into those high-paying jobs.


What about Amherst and Williams? They’re even closer to NYC.


I don't know. Probably Amherst, Williams, and Swarthmore all do better in $$$ outcome than Pomona. The kids at Pomona are throwing a frisbee around in January while the kids at the other schools are snowed in and studying.



Or maybe the WSJ came up with a completely useless ranking that makes no sense whatsoever. It takes a lot of mental gymnastics to put UC Merced in the top 20. And Pomona at 170.


There is actually zero mental gymnastics. It’s all from mathematical formulas based on quantitative data.


However, the decision to apply an expected result (e.g. actual graduation rate vs 'expected' graduation rate) is a qualitative one, as are the weightings. This is what is meant by an objective (for the data) to subjective (for the scoring) back to objective (for the ranking) flaw in the methodology and inference.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I know this may sound crazy, but UC Merced is stealing applicants and students from Yale, Oxford and Cambridge. It is a force.

I know my neighbor, 1600 SAT, 4.0, math olympiads, gunning either for a hedge fund or biomedical research, whose dream school is UC Merced.



That does sound crazy. The student you are describing should be gunning for Stanford.


Never heard of the school until this year. Everything has become too preprofessional. My kid is at Harvard with the goal of getting a PhD in Math. I told her not to think about internships. Money follows passion.

ummm yea, tell that to every starving artist/musician/actor.

FWIW, my one kid is a dual math major, and another loves the theater.
Anonymous
Anonymous wrote:What's weird (to me) is that Claremont is so high and Pomona is so low (191). I get Harvey Mudd, since engineering and business schools always do well.

Are the schools really that dramatically different, considering they are all in the same consortium?


No, of course not. This is how we know the list is based on made up statistics.

I mean the single largest score under "student outcomes" was obtained this way - look at how many steps it takes to get the score, based on subjective "metrics."

Salary impact (33%): This measures the extent to which a college boosts its graduates’ salaries beyond what they would be expected to earn regardless of which college they attended. We used statistical modeling to estimate what we would expect the median earnings of a college’s graduates to be on the basis of the exam results of its students prior to attending the college and the cost of living in the state in which the college is based. We then scored the college on its performance against that estimate. These scores were then combined with scores for raw graduate salaries to factor in absolute performance alongside performance relative to our estimates. Our analysis for this metric used research on this topic by the policy-research think tank the Brookings Institution as a guide.
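Translating that paragraph into code makes the number of modeling steps concrete. Everything below is invented for illustration: the colleges, the numbers, and especially the fitted coefficients inside expected_earnings, which stand in for WSJ's unpublished model:

```python
# Hypothetical sketch of "salary impact": predict earnings from student
# intake and state cost of living, then score each college on its
# residual. All data and coefficients are made up for illustration.

COLLEGES = {
    #        (median SAT, state cost-of-living index, actual median earnings)
    "Alpha": (1500, 1.20, 95_000),
    "Beta":  (1200, 0.90, 72_000),
    "Gamma": (1450, 1.10, 78_000),
}

def expected_earnings(sat, col):
    """Stand-in for WSJ's statistical model (coefficients are invented)."""
    return 40_000 + 100 * (sat - 1000) + 20_000 * (col - 1.0)

def salary_impact(sat, col, actual):
    """Positive = graduates earn more than the model predicts."""
    return actual - expected_earnings(sat, col)

for name, (sat, col, actual) in COLLEGES.items():
    print(f"{name}: impact = {salary_impact(sat, col, actual):+,.0f}")
# Gamma has high raw earnings but a negative impact, because its strong
# intake raised the bar, while Beta posts the best impact with the
# lowest raw salary. The published score then blends this residual with
# raw salary, at weights WSJ doesn't disclose.
```

Every step after the raw data (the choice of predictors, the functional form, the blend weights) is a judgment call, which is the point the critics in this thread keep making.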

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Here is what people should know related to the methodology. It is not a best college ranking or even an ROI ranking. It is a subjective return on expected earnings ranking. In the background, WSJ creates a score that effectively is subjective as it compares colleges to “peers” that have somewhat similar student intakes. So whatever objective raw score is factual, it is then turned subjective by this segmentation. Then all of the newly formed scores are compared against each other. I confirmed last year that this is what WSJ is actually doing. This is a flawed statistical method, and would not pass a first-year college stats class if it were submitted for a grade.

Another poster likened it to competing in a major golf tournament where someone can apply their handicap whereas a full pro could not. Or, for baseball fans, think of it as WSJ saying the best Texas League baseball team should rank higher than a middling Major League team because the former is first in its class. No one would accept this. And yet, that is what WSJ is doing here.

Last year when it introduced the new methodology, I actually had a series of back-and-forth conversations with the two authors. They explained their methodology twice to me. I asked questions to make sure I understood it (I work in applied mathematics) and then challenged this objective to subjective to scoring to ranking approach. I then asked them to show me where my conclusions about their methodology were wrong. I was willing to learn something. I never heard back from them. In other words, I truly believe they realized that an actual statistician could easily show the massive flaws in their methodology. For a so-called college ranking methodology to not even be good enough to pass muster in an intro college stats class is rather absurd.

It is a shame. Sticking the previously honorable WSJ brand to this untidy piece of work is a disservice to the reader and especially parents of students aiming to learn about colleges. The one thing the ranking is good for is how not to attempt statistical models and inference, and then try to cover your tracks.


**********

The authors obviously didn’t further engage with you because you’re an insufferable twit.

Which of your favored schools tanked in this ranking?


***********

To answer your question -- my "favored" school i.e. my alma mater is one of the top 20 schools in this table, as it was last year. Sorry my friend, your attempt to attack me is a failure. And my analysis remains correct though your opinions of me are not very mature. How old are you?



Please stop engaging in this back and forth about alma maters or your kids' schools' rankings. This is going to continue forever. FWIW I agree with you that it's immature for the PP to accuse people of only disagreeing with the WSJ rankings because their school or their kids' schools didn't make the cut, but just ignore it.

I actually like the factors in the WSJ rankings. I can see how some people might prefer a USN style ranking and others might prefer a WSJ style ranking. However, the method is obviously flawed simply because of the vast differences between one year and the next, as pointed out by a PP. Perhaps if this list gets any traction, they will invest more in resources and establish a more robust methodology.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I know this may sound crazy, but UC Merced is stealing applicants and students from Yale, Oxford and Cambridge. It is a force.

I know my neighbor, 1600 SAT, 4.0, math olympiads, gunning either for a hedge fund or biomedical research, whose dream school is UC Merced.



That does sound crazy. The student you are describing should be gunning for Stanford.


Never heard of the school until this year. Everything has become too preprofessional. My kid is at Harvard with the goal of getting a PhD in Math. I told her not to think about internships. Money follows passion.

ummm yea, tell that to every starving artist/musician/actor.

FWIW, my one kid is a dual math major, and another loves the theater.



I had the same reaction. It is a luxury to be able to follow one's passion without thought to income. However, aside from the very top success stories, most of the musicians and artists I know who manage to live comfortably are only able to do so because they come from wealthy families or have a spouse with a high-earning career.

This makes me wonder... if the WSJ rankings are heavily skewed towards ROI, this will definitely ding schools that have more humanities and arts majors, and favor schools that have more kids go into finance. Anyone know if this is factored into the methods somehow?
Anonymous
Anonymous wrote:

There is actually zero mental gymnastics. It’s all from mathematical formulas based on quantitative data.


"Salary impact (33%): This measures the extent to which a college boosts its graduates’ salaries beyond what they would be expected to earn regardless of which college they attended. We used statistical modeling to estimate what we would expect the median earnings of a college’s graduates to be on the basis of the exam results of its students prior to attending the college and the cost of living in the state in which the college is based. We then scored the college on its performance against that estimate."

"Graduation rate impact (20%): This is a measure of a college’s performance in ensuring that its students graduate, beyond what would have been expected of the students regardless of which college they attended. We used statistical modeling to estimate what we would expect a college’s graduation rate to be on the basis of the exam results of its students prior to attending the college and the proportion of their students whose family income is $110,000 per year or higher. We then scored the college on its performance against that estimate. These scores were then combined with scores for raw graduation rates to factor in absolute performance alongside performance relative to our estimates."

The mental gymnastics are in the assumptions they put into the "statistical modeling". You can get a model to say whatever you want if you just adjust the assumptions, and the result will not be "hard and quantitative" no matter how much hard, quantitative data you feed into it.
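A toy example makes that last point concrete: feed identical "hard, quantitative" inputs into the same expected-vs-actual framework and the ranking flips depending on one modeling assumption, here, how steeply the expectation is adjusted for test scores. The schools and numbers are invented:

```python
# Toy sensitivity demo: the same quantitative inputs produce opposite
# rankings under two different (equally defensible-sounding) assumptions
# about how much test scores should raise the expected graduation rate.
# All numbers are invented.

SCHOOLS = {
    #        (median SAT, actual graduation rate)
    "Elite": (1520, 0.95),
    "State": (1180, 0.78),
}

def score(sat, actual, adjustment):
    """Expected rate rises with SAT; `adjustment` sets how steeply."""
    expected = 0.5 + adjustment * (sat - 1000) / 1000
    return actual - expected

for adjustment in (0.4, 0.9):
    ranked = sorted(SCHOOLS, key=lambda s: score(*SCHOOLS[s], adjustment),
                    reverse=True)
    print(f"adjustment={adjustment}: ranking = {ranked}")
# With the mild adjustment, "Elite" ranks first; with the steep one, the
# expectation outruns Elite's actual rate and "State" takes the top spot.
```

Neither adjustment value is "wrong" on its face, which is why the residual-based scores can't be defended as purely objective: the choice of adjustment is a judgment baked into the model.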

Anonymous
The fact that the whiners cannot explain away is that HPYSM did so well while so many other highbrow schools did not. If the methodology were a complete sham or just random, HPYSM would not constitute half of the top 10. Further, other top universities and LACs make the top 10, top 20, and top 50. Again, this is not random. The head-scratching and consternation is really about a set of “treasured” schools that didn’t perform well. Perhaps it would be better to understand why those didn’t perform well instead of assailing the methodology.

As for the methodology, it is not primarily a survey. That’s just a weak way of dismissing the results. What people seem to struggle with most is the comparison of student outcomes to expectations. Expectations account for two things: the quality of the student body and the regional cost of living. So, a Williams or Amherst faces more headwinds than Kenyon. Yet, there is no guarantee that Kenyon will punch above its weight. What’s eye-opening is that HPYSM have probably the highest expectations hurdle, and yet, they jumped it - big time! Claremont McKenna and Davidson also stand out here. Take note.
Anonymous
"What’s eye opening is that HPYSM have probably the highest expectations hurdle, and yet, they jumped it - big time!"

Schools that have their pick of the best students in the world, have huge prestige, enormous endowments, and powerful alumni networks "exceed expectations". Yay! (polite golf clap)