Anonymous wrote:Anonymous wrote:I don't see UGA or U Miami? (both on my kid's list)
170, 90
Anonymous wrote:Anonymous wrote:Anonymous wrote:Ok, this ranking is absolutely crazy and the methodology is specious, so much so that I am surprised it got past the WSJ editors. The 70% “outcomes” weighting is largely driven by the calculated “value added” by the college. They do this by totally ignoring the caliber of the high-achieving, top-quality student coming in and instead estimating what that student “should” be making had he gone to another “comparable” institution, say Penn instead of Brown, versus what that student is actually making. This has the effect of geometrically magnifying the first-year salary differentials between schools in different selectivity tiers, and it rather bizarrely does not account for differences in major mix, i.e., it never compares engineering majors to engineering majors or classics majors to classics majors. The entire methodology fails Data Science 101. Our country is screwed if this passes for analysis/insight. I am absolutely for outcomes-based rankings, but ignoring absolute outcomes in favor of an incredibly flawed attempt to estimate the value added by the college versus some “expected value” had the student gone somewhere else is insane. In statistical terms, the error term is sky high here.
In short, your school did not receive a high ranking.
PP, actually I went to Columbia. I just hate sloppy work, and I will call it out when I see it.
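For anyone trying to picture what the quoted post means by “value added”: below is a rough sketch in Python with invented schools and salary figures. It is not the WSJ's actual formula, only the shape of a calculation in which a school is scored on actual earnings minus the earnings “predicted” for a comparable student body, with no adjustment for what its graduates majored in.

```python
# Toy illustration of a "value-added" style outcome score: actual median
# first-year earnings minus the earnings "predicted" for a comparable student
# body. The schools, numbers, and prediction are all invented; this is not
# the WSJ's model, only the shape of the calculation described in the post.

schools = {
    # school: (actual_median_salary, salary_predicted_from_"comparable"_peers)
    "School A": (78_000, 70_000),  # heavy engineering/business major mix
    "School B": (62_000, 70_000),  # heavy humanities major mix, similar selectivity
}

for name, (actual, predicted) in schools.items():
    value_added = actual - predicted
    print(f"{name}: actual {actual:,}, predicted {predicted:,}, value added {value_added:+,}")

# Because the prediction ignores each school's mix of majors, School A appears
# to "add" 8,000 and School B to "subtract" 8,000, even if an engineering grad
# (or a classics grad) would earn roughly the same salary at either school.
```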
Anonymous wrote:My DC is interested in W&M for a career in government service. I don’t see how a school should be higher ranked because its graduates prioritize money over service at age 22. Is that how the new rankings are being formulated?
Anonymous wrote:Anonymous wrote:Anonymous wrote:Anonymous wrote:Ok, this ranking is absolutely crazy and the methodology is specious, so much so that I am surprised it got past the WSJ editors. The 70% “outcomes” weighting is largely driven by the calculated “value added” by the college. They do this by totally ignoring the caliber of the high-achieving, top-quality student coming in and instead estimating what that student “should” be making had he gone to another “comparable” institution, say Penn instead of Brown, versus what that student is actually making. This has the effect of geometrically magnifying the first-year salary differentials between schools in different selectivity tiers, and it rather bizarrely does not account for differences in major mix, i.e., it never compares engineering majors to engineering majors or classics majors to classics majors. The entire methodology fails Data Science 101. Our country is screwed if this passes for analysis/insight. I am absolutely for outcomes-based rankings, but ignoring absolute outcomes in favor of an incredibly flawed attempt to estimate the value added by the college versus some “expected value” had the student gone somewhere else is insane. In statistical terms, the error term is sky high here.
In short, your school did not receive a high ranking.
That's obviously not at all what that post is saying (and I did not write it). If you want it short: an analysis of financial return on investment needs to be more granular in order to compare apples to apples quantitatively.
Yes, I think that we all understand. Was just trying to add a touch of humor to the thread.
Anonymous wrote:Anonymous wrote:Anonymous wrote:Ok, this ranking is absolutely crazy and the methodology is specious, so much so that I am surprised it got past the WSJ editors. The 70% “outcomes” weighting is largely driven by the calculated “value added” by the college. They do this by totally ignoring the caliber of the high-achieving, top-quality student coming in and instead estimating what that student “should” be making had he gone to another “comparable” institution, say Penn instead of Brown, versus what that student is actually making. This has the effect of geometrically magnifying the first-year salary differentials between schools in different selectivity tiers, and it rather bizarrely does not account for differences in major mix, i.e., it never compares engineering majors to engineering majors or classics majors to classics majors. The entire methodology fails Data Science 101. Our country is screwed if this passes for analysis/insight. I am absolutely for outcomes-based rankings, but ignoring absolute outcomes in favor of an incredibly flawed attempt to estimate the value added by the college versus some “expected value” had the student gone somewhere else is insane. In statistical terms, the error term is sky high here.
In short, your school did not receive a high ranking.
That's obviously not at all what that post is saying (and I did not write it). If you want it short: an analysis of financial return on investment needs to be more granular in order to compare apples to apples quantitatively.
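A quick illustration of the “more granular” / apples-to-apples point, again with made-up numbers: within each major the two hypothetical schools pay identically, yet the school-level averages diverge purely because of major mix.

```python
# Illustrative only: invented salary data showing why school-level medians can
# mislead when major mix differs. Within each major the two schools are
# identical; the overall gap comes entirely from how many students pick each major.

salaries = {
    # (school, major): (number_of_grads, median_salary)
    ("School A", "engineering"): (800, 80_000),
    ("School A", "classics"):    (200, 45_000),
    ("School B", "engineering"): (200, 80_000),
    ("School B", "classics"):    (800, 45_000),
}

for school in ("School A", "School B"):
    grads = [(n, s) for (sch, _), (n, s) in salaries.items() if sch == school]
    total = sum(n for n, _ in grads)
    weighted_avg = sum(n * s for n, s in grads) / total
    print(f"{school}: overall average ${weighted_avg:,.0f}")

# School A averages $73,000 vs. School B's $52,000, yet an engineering major
# (or a classics major) earns the same at either school -- the apples-to-apples
# comparison the post is asking for.
```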
Anonymous wrote:Anonymous wrote:Anonymous wrote:I don't see UGA or U Miami? (both on my kid's list)
170, 90
yikes, quite a drop from most (all?) other rankings we've seen.
Anonymous wrote:Anonymous wrote:Ok, this ranking is absolutely crazy and the methodology is specious, so much so that I am surprised it got past the WSJ editors. The 70% “outcomes” weighting is largely driven by the calculated “value added” by the college. They do this by totally ignoring the caliber of the high-achieving, top-quality student coming in and instead estimating what that student “should” be making had he gone to another “comparable” institution, say Penn instead of Brown, versus what that student is actually making. This has the effect of geometrically magnifying the first-year salary differentials between schools in different selectivity tiers, and it rather bizarrely does not account for differences in major mix, i.e., it never compares engineering majors to engineering majors or classics majors to classics majors. The entire methodology fails Data Science 101. Our country is screwed if this passes for analysis/insight. I am absolutely for outcomes-based rankings, but ignoring absolute outcomes in favor of an incredibly flawed attempt to estimate the value added by the college versus some “expected value” had the student gone somewhere else is insane. In statistical terms, the error term is sky high here.
In short, your school did not receive a high ranking.
Anonymous wrote:Anonymous wrote:I don't see UGA or U Miami? (both on my kid's list)
170, 90
Anonymous wrote:Anonymous wrote:Anonymous wrote:Ok, this ranking is absolutely crazy and the methodology is specious, so much so that I am surprised it got past the WSJ editors. The 70% “outcomes” weighting is largely driven by the calculated “value added” by the college. They do this by totally ignoring the caliber of the high-achieving, top-quality student coming in and instead estimating what that student “should” be making had he gone to another “comparable” institution, say Penn instead of Brown, versus what that student is actually making. This has the effect of geometrically magnifying the first-year salary differentials between schools in different selectivity tiers, and it rather bizarrely does not account for differences in major mix, i.e., it never compares engineering majors to engineering majors or classics majors to classics majors. The entire methodology fails Data Science 101. Our country is screwed if this passes for analysis/insight. I am absolutely for outcomes-based rankings, but ignoring absolute outcomes in favor of an incredibly flawed attempt to estimate the value added by the college versus some “expected value” had the student gone somewhere else is insane. In statistical terms, the error term is sky high here.
In short, your school did not receive a high ranking.
PP, actually I went to Columbia. I just hate sloppy work, and I will call it out when I see it.
Anonymous wrote:I don't see UGA or U Miami? (both on my kid's list)
Anonymous wrote:Anonymous wrote:Anonymous wrote:Anonymous wrote:Anonymous wrote:Anonymous wrote:Anonymous wrote:The list is absurd. Florida International is ranked 29 while Williams is 31.
Lehigh is 14, NJIT is 19, BYU is 20. Garbage in, garbage out.
Disagree.
Just suggests that you are unfamiliar with FIU, Lehigh, NJIT, & BYU job placement statistics.
Schools with engineering and business programs seem to do better here. Obviously an accounting major has better salary prospects than an anthropology major. This doesn’t make Babson (number 10) better than Bowdoin (89) in any reasonable sense.
It does if one's focus is ROI (return on investment).
Not really. Kids attend these schools with different motivations. It’s comparing apples and oranges.
Then let them research & create their own college rating & ranking system that would be useful to millions of readers.
USNWR, for all its flaws, is a lot more sensible when comparing schools. Statistical analysis detached from reason is potentially more harmful than helpful.
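Since ROI keeps coming up: here is a back-of-the-envelope payback-period sketch with hypothetical figures, not any publication's methodology. The point is only that an ROI framing depends heavily on the net cost and on the earnings premium one assumes.

```python
# Back-of-the-envelope ROI sketch with invented numbers. One common framing:
# how many years of the earnings premium are needed to recoup the net cost
# of the degree.

def payback_years(net_cost_total: float, grad_salary: float, baseline_salary: float) -> float:
    """Years of the earnings premium (grad salary minus a no-degree baseline)
    needed to pay back the total net cost of attendance."""
    premium = grad_salary - baseline_salary
    return net_cost_total / premium

# Hypothetical examples: a pre-professional program vs. a pricier liberal-arts program.
print(payback_years(net_cost_total=120_000, grad_salary=75_000, baseline_salary=40_000))  # ~3.4 years
print(payback_years(net_cost_total=240_000, grad_salary=55_000, baseline_salary=40_000))  # 16.0 years
```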
Anonymous wrote:I don't think DCUM takes regional bias into consideration. UMD is a good school, as we know, but it's a good school with a regional reputation. It's not well known outside this area. And so, when some publication ranks FL schools highly, people go nuts. But it may be that we just don't know enough.
Also, this list is trash. ALL lists are trash.