Stanford said it wouldn’t put out a press release about its admit rate when decisions were posted, though it said it would still report its data to the government. |
That's because Virginians, as a whole, are wealthier than Californians. Now UCLA and Berkeley must take 80% in state, reducing the out-of-state (OOS) figures. |
| I think Claremont McKenna College, which belongs to the same consortium as Pomona College, actually got caught lying about the stats it provided to USNews. Google it. It's a fact. |
|
If I wanted to achieve a sky-high retention rate, I'd just count my freshman retention in May of their freshman year by who registered for fall courses. That count would subtract far fewer kids than a fall count would, since most transferring/bleeding happens over the summer.
Or even if there were an industry/federal retention date you had to use in the fall (is there?), I'd pump up retention by making first-year courses easier, getting my advisors to put struggling kids in those remedial courses, pressuring adjuncts to give them decent grades, and, instead of kicking failing kids out, allowing them to take a leave from the college to keep them "technically" affiliated and thus out of the loss column. |
Nobody has clarified this supposed industry-standard way to calculate retention. I'd almost guarantee they only count students who notify them that they're withdrawing. I transferred out of a top-50 university (into a top-20), and a few years later I could still get on the top-50 school's portal and enroll in courses! You really think I was counted against their retention percentage? Doubt it. |
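The tricks described above are just denominator-and-timing games, and a few lines of arithmetic show how much they move the headline number. A minimal sketch, with every figure invented purely for illustration (no real school's data):

```python
# Hypothetical numbers, invented for illustration only.
entering_class = 1000      # first-time full-time freshmen in the fall cohort

# Students actually gone by the official fall census a year later.
left_over_summer = 60      # most attrition happens between spring and fall
left_during_year = 10

# Honest count at the fall census:
returned = entering_class - left_over_summer - left_during_year
honest_rate = returned / entering_class
print(f"fall-census retention: {honest_rate:.1%}")

# Count in May instead, before the summer bleed:
spring_returned = entering_class - left_during_year
spring_rate = spring_returned / entering_class
print(f"spring-count retention: {spring_rate:.1%}")

# Or keep the fall date but reclassify 25 silent non-returners
# as "on leave" so they stay technically enrolled:
on_leave = 25
massaged_rate = (returned + on_leave) / entering_class
print(f"'on leave' retention: {massaged_rate:.1%}")
```

With these made-up numbers, the same cohort reports 93.0%, 99.0%, or 95.5% depending purely on when you count and whom you reclassify, without a single student's outcome changing.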
|
It’s all a game.
Test optional? Only high test scores get submitted and reported, skyrocketing the averages. Spring freshman admits? First year or first semester abroad? Their stats won't be reported as freshman data. I could go on. But yes, it's a game, and some colleges play it better than others. And anyone who thinks an education at one top-25 school is vastly superior to another's based on rankings is delusional. |
Here are some typical tactics, by USNews criterion and weight:
Average six-year graduation rate (17.6%) and average first-year student retention rate (4.4%) -- inflate grades to make graduation easy; don't take on "risky" kids who might deserve a chance; find ways to lower the denominator; don't expel anyone.
Social mobility (5%: Pell Grant graduation rates 2.5%; Pell Grant graduation rates compared with all other students 2.5%) -- a new category, so give schools a couple of years to figure out how to spin it.
Graduation rate performance (8%) -- same playbook as graduation rate above: inflate grades, avoid "risky" kids, lower the denominator, don't expel.
Peer assessment survey (15%) -- you scratch my back, I'll scratch yours.
High school counselors' ratings (5%) -- market to them.
Class size index (8%) -- some change this year, but cap classes at the exact USNews cutoff points that earn points (e.g. 19 rather than 20, and 49 rather than 50); use independent study.
Faculty compensation (7%) -- use poorly paid adjuncts and teaching assistants to do the actual teaching, freeing highly paid tenure-track faculty to do the research that boosts reputation.
Percent of faculty with a terminal degree in their field (3%) -- require a terminal degree even when a better alternative exists.
Percent of faculty that is full time (1%) -- only 1%, so adjuncts aren't very damaging; full-time status doesn't mean the faculty actually teach (TAs).
Student-faculty ratio (1%) -- count all faculty in the calculation, including those who primarily do research or only teach graduate students.
Math and evidence-based reading and writing portions of the SAT and composite ACT scores (7.75%) -- put low scorers in spring admit; don't require tests for international students; go test optional (only high scorers will submit); use merit aid (even for wealthy families) to land high scorers.
High school class standing in top 10% (2.25%) -- most high schools don't provide rank anymore. For ones that do, favor applicants in the top 10% to boost the ranking and give the impression all are in that category.
Acceptance rate (0%) -- this has been dropped, but: 1) adopt the Common Application; 2) market to get anyone to apply, including in foreign countries and those who are not qualified; 3) count even unfinished applications; 4) drop essays or anything else that's difficult.
Financial resources per student (10%) -- count any resources allowed (there is significant latitude), including money actually spent on research (including medical), graduate program spending, etc.
Average alumni giving rate (5%) -- lower the denominator by not reaching out to alumni who aren't giving ("we lost contact"); push for any donation, however small. |
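The percentages listed above are just coefficients in a weighted sum, which is why nudging a high-weight input (graduation rate, peer assessment) pays off far more than polishing a 1% category. A minimal sketch using those weights, with the 0-100 category "scores" invented for illustration:

```python
# USNews-style composite as a plain weighted sum.
# Weights are the percentages quoted in the post above (they total 100);
# the 0-100 category scores below are invented for illustration.
weights = {
    "six_year_grad_rate": 17.6,
    "peer_assessment": 15.0,
    "financial_resources": 10.0,
    "grad_rate_performance": 8.0,
    "class_size_index": 8.0,
    "test_scores": 7.75,
    "faculty_compensation": 7.0,
    "social_mobility": 5.0,
    "counselor_ratings": 5.0,
    "alumni_giving": 5.0,
    "first_year_retention": 4.4,
    "terminal_degree": 3.0,
    "top_10_percent": 2.25,
    "full_time_faculty": 1.0,
    "student_faculty_ratio": 1.0,
}

def composite(scores: dict) -> float:
    """Weighted average of 0-100 category scores."""
    total_weight = sum(weights.values())
    return sum(weights[k] * scores[k] for k in weights) / total_weight

# A hypothetical school scoring a flat 80 in every category.
school = {k: 80.0 for k in weights}
base = composite(school)

# Improve one high-weight metric vs. one low-weight metric by 5 points.
school_a = dict(school, six_year_grad_rate=85.0)   # weight 17.6
school_b = dict(school, full_time_faculty=85.0)    # weight 1.0
print(round(composite(school_a) - base, 3))
print(round(composite(school_b) - base, 3))
```

The same five-point improvement moves the composite about 0.88 points through the graduation-rate channel but only 0.05 through the full-time-faculty channel, which is exactly why the gaming concentrates on the heavy categories.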
| It's called data/stat juking and it's done in literally EVERY industry -- from sales to accounting to law enforcement to higher ed. Sometimes it goes so far as being illegal. |
A cynical (but good) list, but you omit one of the biggest: let your big-shot faculty publish, do research, and become talking heads, because that influences reputation more than actually teaching does.
Hi. I work in higher education. Schools have to do a census in October to gather their data for the year. |
|
" No one thinks the colleges actually lie, but not listing a couple dozen students who didn't tell them they were withdrawing, so that a 96% retention rate becomes 97% and UVa is one point ahead of Michigan: yeah, that kind of stuff happens. It's no different from what you see in pretty much every ad on TV.
Stop...you are just making up stuff. UVa is unethical but Michigan is??? Come on... " That was not the poster's point. The point was that colleges, including BOTH UVa AND Michigan, build their reported statistics in ways that make them look good, just like TV ads do. The only reason they do this is that lots of crazy people somehow think going to a school ranked #25 is better than going to one ranked #35. They then spend ridiculous amounts of money (too often money they don't have) to obtain this imaginary advantage. |
Fair points, but I'll bet you think #25 is better than #75 or #125. I'd also argue that though the rankings might be based on questionable methods (okay, definitely questionable), reality often follows perception. When a school rises on the list, more competitive students notice it and apply; before you know it, the quality of the student body has changed. Look at Northeastern: kids who wouldn't have considered it 10-15 years ago are finding it impossible to get into now. Rankings did that, for better or worse (okay, worse -- but not for Northeastern!). For me it's a fun hobby, rooting for my alma mater and my kids' schools. And yes, if they plummeted I'd be a little worried, because the perception of brand value is a real thing, however dumb it might be. |
I love to give my brother hell about my college being ranked higher than his. We both know it doesn't mean much, but whatever. Not that different from rooting for their sports teams. But yes, W&L is objectively better than Randolph Macon. There, I said it.
|
| To answer the question as posed: because so many people feel compelled to spend exorbitant amounts of money on a service they don’t really understand or know how to value. If they’re pleased with the ranking, it’s external validation. If they aren’t, it’s easily dismissed. Win-win! |
+1. Agree re: education quality. Focusing on rankings and "top 25" and "top tier" etc. puts rank (decided by whom, on what basis, and why do we care?) ahead of actually finding the right school for the individual student. We started our college list by looking at which schools have the majors that interest DC and how those schools run those courses.