WSJ Rankings 2025

Anonymous
Anonymous wrote:I have a junior, so we haven't been through the college admissions process yet, but I find this list refreshing! It's time to look beyond the USNWR rankings, which have done nothing but cause endless stress for parents and students alike.

I'm sure you'd pick Towson over Duke or CSU Stanislaus over UCLA. Please do so.

I just can't take this list seriously.

- a double CSU grad, whose schools sit above Johns Hopkins, Cornell, and Duke on that list

Ridiculous list.
Anonymous
Anonymous wrote:
Anonymous wrote:Here is what people should know about the methodology. It is not a best-college ranking or even an ROI ranking. It is a subjective return-on-expected-earnings ranking. In the background, WSJ creates a score that is effectively subjective, because it compares colleges to “peers” with somewhat similar student intakes. So whatever the objective raw score is, it is then made subjective by this segmentation. Then all of the newly formed scores are compared against each other. I confirmed last year that this is what WSJ is actually doing. It is a flawed statistical method and would not pass a first-year college stats class if it were submitted for a grade.

Another poster likened it to competing in a major golf tournament where someone can apply their handicap whereas a full pro could not. Or, for baseball fans, think of it as WSJ saying the best Texas League baseball team should rank higher than a middling Major League team because the former is first in its class. No one would accept this. And yet, that is what WSJ is doing here.

Last year, when it introduced the new methodology, I had a series of back-and-forth conversations with the two authors. They explained their methodology to me twice. I asked questions to make sure I understood it (I work in applied mathematics) and then challenged this objective-to-subjective scoring-and-ranking approach. I then asked them to show me where my conclusions about their methodology were wrong. I was willing to learn something. I never heard back from them. In other words, I truly believe they realized that an actual statistician could easily show the massive flaws in their methodology. For a so-called college ranking methodology to not even be good enough to pass muster in an intro college stats class is rather absurd.

It is a shame. Attaching the previously honorable WSJ brand to this untidy piece of work is a disservice to readers, and especially to parents of students trying to learn about colleges. The one thing the ranking is good for is as a lesson in how not to attempt statistical modeling and inference, and then try to cover your tracks.


I was searching for the best analogy to understand these rankings...and this is it.

It's not saying that University of Delaware kids on average have better absolute outcomes than kids from Brown. It's saying they punch above their weight for what the school is. So the WSJ awards Delaware all kinds of additional points to reflect its particular circumstances compared to Brown, and then applies some multiplier to its average salaries to make up for those deficiencies.

It's definitely an interesting take on college rankings, but they still don't make their methodology easy to understand. Perhaps they could publish the raw data, indicate the multiplier they determined, and then show the adjusted rankings.

Agree. It's a list that shows which universities (as you say) punch above their status. It doesn't mean that Towson is actually better than Duke; it means that Towson grads do fairly well for graduates of a regional university like Towson, while Duke grads do about as expected for Duke, considering that most are UMC to begin with. There's not as much room for SES movement for Duke grads as there is for Towson grads.

It's a good list for those who don't have the means or stats to go to a standard T50.
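
To make the "punching above their status" adjustment concrete, here is a minimal sketch in Python of the kind of peer-adjusted scoring being described. The earnings figures and peer groupings are invented, and the peer-group average is only a stand-in for whatever intake-based expectation the WSJ actually computes; none of this is their published model.

```python
# Hypothetical sketch of the peer-adjusted ("handicap") scoring described
# above. Earnings figures and peer groupings are invented; this is not
# the WSJ's published model.
from collections import defaultdict

schools = {
    # name: (median_grad_earnings_usd, peer_group)
    "Duke":     (95_000, "highly_selective"),
    "Brown":    (90_000, "highly_selective"),
    "Delaware": (68_000, "regional"),
    "Towson":   (62_000, "regional"),
}

# Expected earnings: the average within each peer group, standing in for
# whatever intake-based expectation is actually used.
groups = defaultdict(list)
for earnings, peer in schools.values():
    groups[peer].append(earnings)
expected = {peer: sum(v) / len(v) for peer, v in groups.items()}

# Raw ranking sorts on absolute earnings; the adjusted ranking sorts on
# earnings relative to peer expectation (the "multiplier" being debated).
raw = sorted(schools, key=lambda s: -schools[s][0])
adjusted = sorted(schools, key=lambda s: -(schools[s][0] / expected[schools[s][1]]))

print("raw:     ", raw)       # ['Duke', 'Brown', 'Delaware', 'Towson']
print("adjusted:", adjusted)  # ['Delaware', 'Duke', 'Brown', 'Towson']
```

On these made-up numbers, Delaware jumps from third on raw earnings to first once each school is scored against its peer expectation, which is exactly the Towson-over-Duke effect people here are reacting to.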
Anonymous
Anonymous wrote:
Anonymous wrote:Here is what people should know about the methodology. It is not a best-college ranking or even an ROI ranking. It is a subjective return-on-expected-earnings ranking. In the background, WSJ creates a score that is effectively subjective, because it compares colleges to “peers” with somewhat similar student intakes. So whatever the objective raw score is, it is then made subjective by this segmentation. Then all of the newly formed scores are compared against each other. I confirmed last year that this is what WSJ is actually doing. It is a flawed statistical method and would not pass a first-year college stats class if it were submitted for a grade.

Another poster likened it to competing in a major golf tournament where someone can apply their handicap whereas a full pro could not. Or, for baseball fans, think of it as WSJ saying the best Texas League baseball team should rank higher than a middling Major League team because the former is first in its class. No one would accept this. And yet, that is what WSJ is doing here.

Last year, when it introduced the new methodology, I had a series of back-and-forth conversations with the two authors. They explained their methodology to me twice. I asked questions to make sure I understood it (I work in applied mathematics) and then challenged this objective-to-subjective scoring-and-ranking approach. I then asked them to show me where my conclusions about their methodology were wrong. I was willing to learn something. I never heard back from them. In other words, I truly believe they realized that an actual statistician could easily show the massive flaws in their methodology. For a so-called college ranking methodology to not even be good enough to pass muster in an intro college stats class is rather absurd.

It is a shame. Attaching the previously honorable WSJ brand to this untidy piece of work is a disservice to readers, and especially to parents of students trying to learn about colleges. The one thing the ranking is good for is as a lesson in how not to attempt statistical modeling and inference, and then try to cover your tracks.


I was searching for the best analogy to understand these rankings...and this is it.

It's not saying that University of Delaware kids on average have better absolute outcomes than kids from Brown. It's saying they punch above their weight for what the school is. So the WSJ awards Delaware all kinds of additional points to reflect its particular circumstances compared to Brown, and then applies some multiplier to its average salaries to make up for those deficiencies.

It's definitely an interesting take on college rankings, but they still don't make their methodology easy to understand. Perhaps they could publish the raw data, indicate the multiplier they determined, and then show the adjusted rankings.


I think that looking at the USNWR and WSJ rankings combined is helpful...especially when you notice a school ranked in the top 20 in both.
Anonymous
Anonymous wrote:
Anonymous wrote:Here is what people should know about the methodology. It is not a best-college ranking or even an ROI ranking. It is a subjective return-on-expected-earnings ranking. In the background, WSJ creates a score that is effectively subjective, because it compares colleges to “peers” with somewhat similar student intakes. So whatever the objective raw score is, it is then made subjective by this segmentation. Then all of the newly formed scores are compared against each other. I confirmed last year that this is what WSJ is actually doing. It is a flawed statistical method and would not pass a first-year college stats class if it were submitted for a grade.

Another poster likened it to competing in a major golf tournament where someone can apply their handicap whereas a full pro could not. Or, for baseball fans, think of it as WSJ saying the best Texas League baseball team should rank higher than a middling Major League team because the former is first in its class. No one would accept this. And yet, that is what WSJ is doing here.

Last year, when it introduced the new methodology, I had a series of back-and-forth conversations with the two authors. They explained their methodology to me twice. I asked questions to make sure I understood it (I work in applied mathematics) and then challenged this objective-to-subjective scoring-and-ranking approach. I then asked them to show me where my conclusions about their methodology were wrong. I was willing to learn something. I never heard back from them. In other words, I truly believe they realized that an actual statistician could easily show the massive flaws in their methodology. For a so-called college ranking methodology to not even be good enough to pass muster in an intro college stats class is rather absurd.

It is a shame. Attaching the previously honorable WSJ brand to this untidy piece of work is a disservice to readers, and especially to parents of students trying to learn about colleges. The one thing the ranking is good for is as a lesson in how not to attempt statistical modeling and inference, and then try to cover your tracks.


So, you’re saying that you proved that they were wrong, but they doubled down and used the same methodology this year? I don’t believe you.


+1000
The PP is trying very hard to convince himself (and others) that he, a random poster, knows better than the authors. It’s simultaneously amusing and sad.
Anonymous
Anonymous wrote:
Anonymous wrote:Here is what people should know about the methodology. It is not a best-college ranking or even an ROI ranking. It is a subjective return-on-expected-earnings ranking. In the background, WSJ creates a score that is effectively subjective, because it compares colleges to “peers” with somewhat similar student intakes. So whatever the objective raw score is, it is then made subjective by this segmentation. Then all of the newly formed scores are compared against each other. I confirmed last year that this is what WSJ is actually doing. It is a flawed statistical method and would not pass a first-year college stats class if it were submitted for a grade.

Another poster likened it to competing in a major golf tournament where someone can apply their handicap whereas a full pro could not. Or, for baseball fans, think of it as WSJ saying the best Texas League baseball team should rank higher than a middling Major League team because the former is first in its class. No one would accept this. And yet, that is what WSJ is doing here.

Last year, when it introduced the new methodology, I had a series of back-and-forth conversations with the two authors. They explained their methodology to me twice. I asked questions to make sure I understood it (I work in applied mathematics) and then challenged this objective-to-subjective scoring-and-ranking approach. I then asked them to show me where my conclusions about their methodology were wrong. I was willing to learn something. I never heard back from them. In other words, I truly believe they realized that an actual statistician could easily show the massive flaws in their methodology. For a so-called college ranking methodology to not even be good enough to pass muster in an intro college stats class is rather absurd.

It is a shame. Attaching the previously honorable WSJ brand to this untidy piece of work is a disservice to readers, and especially to parents of students trying to learn about colleges. The one thing the ranking is good for is as a lesson in how not to attempt statistical modeling and inference, and then try to cover your tracks.


After reading that, thousands of parents who had felt euphoric in seeing what they wanted to see in that piss-poor “ranking” are now on suicide watch.


You have that exactly backwards. It’s clear who’s triggered by this list - those who assume their favored school is entitled to a high spot in every ranking.
Anonymous
Anonymous wrote:You do not have to believe me. They may have made some minor tweaks; I have not spoken to them this year. But the vast majority of the methodology is clearly the same, and I reckon I am not the only person to have had conversations with them about it.

People double down on poorly thought-out and poorly executed processes all the time. CEOs do it, coaches do it, professors do it. It is a flaw of many humans. And if, when your work drew lots of criticism, you defended it with a “no, we are right” approach, that would not make you unique.


Oh, the irony!
DP
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Initially I was puzzled by some of the ratings in the various categories, until I realized that they are based on actual data versus expectations: if the data outperform the expectations, the school receives a higher rating. I was looking specifically at graduation rate and was initially surprised to see higher graduation ratings for some schools whose actual rates I knew were lower.

I actually like this balanced approach. I think looking at both this ranking and USNWR would be a good way to evaluate a school. I also like the New York Times tool, where you can set your own parameters. You can learn a lot from these three platforms.
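
For instance, a toy version of that data-versus-expectations idea might look like the following. All numbers are invented, and the least-squares fit on a single intake statistic is an assumption made for illustration, not the WSJ's actual expectation model.

```python
# Toy version of "rating = actual vs. expected" for graduation rates.
# All numbers invented; the expectation model (least-squares fit of
# grad rate on one intake stat) is an assumption, not WSJ's model.

data = [
    # (school, median_SAT, actual_grad_rate)
    ("Selective U", 1500, 0.95),
    ("Flagship St", 1300, 0.82),
    ("Regional St", 1100, 0.70),
    ("Commuter St", 1000, 0.68),
]

xs = [sat for _, sat, _ in data]
ys = [rate for _, _, rate in data]
x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)

# Ordinary least squares for y = slope * x + intercept.
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / sum(
    (x - x_mean) ** 2 for x in xs
)
intercept = y_mean - slope * x_mean

# Rating = actual minus expected: beat the fit, get a higher rating.
for name, sat, actual in data:
    expected = slope * sat + intercept
    print(f"{name}: expected {expected:.2f}, actual {actual:.2f}, "
          f"rating {actual - expected:+.3f}")
```

On these invented numbers, the school with the lowest raw graduation rate gets the highest rating because it beats its fitted expectation, which matches the surprise described above.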


No, it's like measuring your schwantz from the floor up. It does not give an accurate measurement of what it claims to. It's data, not information.


Sorry you are not happy with the results for your school, which sank in the rankings. Signed, mom of a kid whose school is in the top 20 for both USNWR and WSJ.


+ a million
The PP is no different than a spoiled kid throwing a tantrum. Absolutely insufferable.
Anonymous
Anonymous wrote:Duke at 45 should be enough to show how dumb this ranking is. This thread shouldn't be this long.


Then stop participating in it. Bye!
Anonymous
Jesus, the navel-gazing on this thread is so gross.

Newsflash: you can do anything from any school. No one cares once you're in college, so long as you do well and work hard. Hardly anyone has heard of my big, no-name midwestern U, and it's certainly not on any list at any meaningful rank. You know what? I've succeeded by every measure you people would use. And there are many people in my circle like me who didn't go to JHU, Duke, etc.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Initially I was puzzled by some of the ratings in the various categories, until I realized that they are based on actual data versus expectations: if the data outperform the expectations, the school receives a higher rating. I was looking specifically at graduation rate and was initially surprised to see higher graduation ratings for some schools whose actual rates I knew were lower.

I actually like this balanced approach. I think looking at both this ranking and USNWR would be a good way to evaluate a school. I also like the New York Times tool, where you can set your own parameters. You can learn a lot from these three platforms.


No, it's like measuring your schwantz from the floor up. It does not give an accurate measurement of what it claims to. It's data, not information.


Sorry you are not happy with the results for your school, which sank in the rankings. Signed, mom of a kid whose school is in the top 20 for both USNWR and WSJ.


USN rankings mean nothing, and these WSJ rankings mean less than nothing. Not one human being on the planet will make a college selection based on them. Insults and ad hominems indicate you know you have weak tea.

Signed, parent of one Ivy kid with an undergrad degree who is currently getting a grad degree at a different Ivy, and a second kid with a NESCAC degree. But bragging is for aholes.


DP. Speaking of aholes… it’s funny that you say something as patently false as “rankings mean nothing.” If they truly mean nothing, you wouldn’t be here writing screed after hysterical screed. Meanwhile, millions of families will continue to use rankings as part of their college selection process.

Do continue seething!
Anonymous
Anonymous wrote:Jesus, the navel-gazing on this thread is so gross.

Newsflash: you can do anything from any school. No one cares once you're in college, so long as you do well and work hard. Hardly anyone has heard of my big, no-name midwestern U, and it's certainly not on any list at any meaningful rank. You know what? I've succeeded by every measure you people would use. And there are many people in my circle like me who didn't go to JHU, Duke, etc.

I went to a no-name state U that's on the list, and I think I'm fairly successful (making six figures for the past 20 years), but going to a top school gives you an edge that going to a no-name doesn't. The number of grads from a T20 making six figures is far, far higher than from a T75 to T100.

Big companies that pay well don't recruit at the T75 to T100. I got to where I am with a combination of hard work, smarts, and luck: being in the right place at the right time.

That's not to say that I push my kids to go to a T20. One of my kids is at the state flagship with merit, and the other will probably go to a T75. They will do fine as long as they work hard, but the road to their destination may be longer than for those who get their start at a T20.

Going to a T20 is basically like being born with a silver spoon in your mouth (as many who go to elite, expensive colleges are), versus being born to a middle-class family. Both can reach millionaire status, but it will take the middle-class grad longer to get there.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Good for the WSJ. About time. Oh, and I'm loving the over-the-top wailing from the usual suspects!



That's sweet. Now find the Wall Street Journal reporters who went to UC-Merced, Cal State Stanislaus, Dalton State, Martin Luther College, Kettering University, Benedictine College, and - this one's not f#cking around - The Master's University.

Best and brightest, no doubt.

I genuinely don't understand why the editors of the WSJ let this silliness go. There is a huge opportunity to take down US News, and this is what the Murdochs came up with? Clearly not ready for prime time. Stupid list. Not interesting. And not even worth talking about. Massive fail.


And yet, here you are! Clearly very interesting and very much worth talking about. Cheers to the WSJ.



Because I would like to see some genuine competition for US News, and I was hopeful about the WSJ because it has the brand and the reach. Unfortunately, this is a very lame list that won't be taken seriously. There's a massive, very lucrative space here for a major publication, and yet none are reaching for it. Mystifying.


Princeton Review has the top 387 schools, in book format.
Anonymous
Anonymous wrote:
Anonymous wrote:Here is what people should know about the methodology. It is not a best-college ranking or even an ROI ranking. It is a subjective return-on-expected-earnings ranking. In the background, WSJ creates a score that is effectively subjective, because it compares colleges to “peers” with somewhat similar student intakes. So whatever the objective raw score is, it is then made subjective by this segmentation. Then all of the newly formed scores are compared against each other. I confirmed last year that this is what WSJ is actually doing. It is a flawed statistical method and would not pass a first-year college stats class if it were submitted for a grade.

Another poster likened it to competing in a major golf tournament where someone can apply their handicap whereas a full pro could not. Or, for baseball fans, think of it as WSJ saying the best Texas League baseball team should rank higher than a middling Major League team because the former is first in its class. No one would accept this. And yet, that is what WSJ is doing here.

Last year, when it introduced the new methodology, I had a series of back-and-forth conversations with the two authors. They explained their methodology to me twice. I asked questions to make sure I understood it (I work in applied mathematics) and then challenged this objective-to-subjective scoring-and-ranking approach. I then asked them to show me where my conclusions about their methodology were wrong. I was willing to learn something. I never heard back from them. In other words, I truly believe they realized that an actual statistician could easily show the massive flaws in their methodology. For a so-called college ranking methodology to not even be good enough to pass muster in an intro college stats class is rather absurd.

It is a shame. Attaching the previously honorable WSJ brand to this untidy piece of work is a disservice to readers, and especially to parents of students trying to learn about colleges. The one thing the ranking is good for is as a lesson in how not to attempt statistical modeling and inference, and then try to cover your tracks.


So, you’re saying that you proved that they were wrong, but they doubled down and used the same methodology this year? I don’t believe you.


Exactly. I wrote the WSJ last year because they were skipping regional schools like Providence College, St. Joe’s, etc. They’ve all shown up nicely on the list this year. Read the description of their methodology.
Anonymous
Anonymous wrote:
Anonymous wrote:Duke at 45 should be enough to show how dumb this ranking is. This thread shouldn't be this long.


Duke in the top ten shows how dumb USNWR rankings are. The student body is more impressive than the departmental rankings.


Exactly! The top schools are typically old-money schools where the offspring of the top 5% trade connections and the schools flatter their egos. The WSJ rankings are great for families who can't pay $400k for an undergrad degree.
Anonymous
Anonymous wrote:Jesus, the navel-gazing on this thread is so gross.

Newsflash: you can do anything from any school. No one cares once you're in college, so long as you do well and work hard. Hardly anyone has heard of my big, no-name midwestern U, and it's certainly not on any list at any meaningful rank. You know what? I've succeeded by every measure you people would use. And there are many people in my circle like me who didn't go to JHU, Duke, etc.


The only measure DCUM people use is the name of your college. If you can't figure even that out, your no-name school did not educate you properly.