Pervasive Myths - set the record straight

Anonymous
Anonymous wrote:
Anonymous wrote:that northeastern gamed the system for rankings


But that’s not a myth. It’s actually true. Took about 2 decades but they did it.


You can call it gaming, but Northeastern looked at the formula and sought to improve where it could. The myth would be that no other schools did the same thing. They may not have been as effective, but many have. Schools took steps to lower admission rate when it was a factor (by inducing applications through push marketing); to shrink class sizes (through registration cutoffs); to flatter the student-faculty ratio (by counting faculty who are peripheral to undergraduate education); likewise with resources (counting resources in IPEDS, the government database used by USNWR, that are peripheral to undergraduate education); to improve alumni giving rate when it was a factor (by dropping alumni from the database to lower the denominator; I think Berkeley was caught doing this); to influence other voters as if it were Eurovision; and to admit students who don't count against their stats (foreign, spring admits, etc.). There is literally no factor in the USNWR formula that could not be "gamed" to some extent.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I was at a career day and a kid asked me how much rank of school matters.

I told him what I believe: that it matters a little bit, especially earlier in your career and all other things being equal, but it doesn’t matter as much as people think it does, and especially later in your career, no one even knows that I went to a top school.

Also, if a bright kid who is at a high ranked school can’t look me in the eye or socialize with me normally or express why they want this job over others, and I don’t wanna spend the summer with them, they’re not getting the internship.


It totally depends on the person’s major/field. In STEM it matters the least. In humanities/poli sci/history/psych/English, etc., it matters a lot.


Reputation matters, more or less depending on field, but I think the myth is that there is a firm, ordinal ranking like USNWR perpetuates (e.g. Princeton > Stanford because USNWR has them 1 and 4 respectively). I don't think it is that defined and rigid in the minds of recruiters and those in positions of influence.

In my field, high tech, school literally never comes up at this point in my career.


Reputation can also be highly regional and industry-specific. Like, Procter & Gamble loves University of Cincinnati students (they have a co-op relationship with the school). Or tech companies in the Bay Area recruit heavily from San Jose State.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:I was at a career day and a kid asked me how much rank of school matters.

I told him what I believe: that it matters a little bit, especially earlier in your career and all other things being equal, but it doesn’t matter as much as people think it does, and especially later in your career, no one even knows that I went to a top school.

Also, if a bright kid who is at a high ranked school can’t look me in the eye or socialize with me normally or express why they want this job over others, and I don’t wanna spend the summer with them, they’re not getting the internship.


It totally depends on the person’s major/field. In STEM it matters the least. In humanities/poli sci/history/psych/English, etc., it matters a lot.


Reputation matters, more or less depending on field, but I think the myth is that there is a firm, ordinal ranking like USNWR perpetuates (e.g. Princeton > Stanford because USNWR has them 1 and 4 respectively). I don't think it is that defined and rigid in the minds of recruiters and those in positions of influence.

In my field, high tech, school literally never comes up at this point in my career.


Reputation can also be highly regional and industry-specific. Like, Procter & Gamble loves University of Cincinnati students (they have a co-op relationship with the school). Or tech companies in the Bay Area recruit heavily from San Jose State.


Good point.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:that northeastern gamed the system for rankings


But that’s not a myth. It’s actually true. Took about 2 decades but they did it.


You can call it gaming, but Northeastern looked at the formula and sought to improve where it could. The myth would be that no other schools did the same thing. They may not have been as effective, but many have. Schools took steps to lower admission rate when it was a factor (by inducing applications through push marketing); to shrink class sizes (through registration cutoffs); to flatter the student-faculty ratio (by counting faculty who are peripheral to undergraduate education); likewise with resources (counting resources in IPEDS, the government database used by USNWR, that are peripheral to undergraduate education); to improve alumni giving rate when it was a factor (by dropping alumni from the database to lower the denominator; I think Berkeley was caught doing this); to influence other voters as if it were Eurovision; and to admit students who don't count against their stats (foreign, spring admits, etc.). There is literally no factor in the USNWR formula that could not be "gamed" to some extent.


+1.

What they did was brilliant. The fact that most people blindly follow "rankings" without understanding how they are generated is more troubling. Yes, NEU improved, but did it improve as much as the rankings seem to imply? I'll leave that up to you.

But go on the parents' pages and there are a lot of issues: the school grew faster than its infrastructure could support. It can be very challenging for kids to get even one co-op; I'd call a kid with a 3.8+ sending out 250+ resumes "trying extremely hard," and yet some end up with nothing. Housing is a nightmare, and getting the courses you need is difficult, because the undergraduate population has grown without the needed infrastructure. And advising is dismal in many instances, which often happens as a school grows. Yes, bad advising can happen at a 5-6K-undergrad school, but it happens far less often and is typically easier to rectify if a student makes an effort. That is not the case at NEU.

Anonymous
Anonymous wrote:Do people really believe kids at elites can’t look people in the eye and are socially inept? You’ve seen the hoops to getting accepted. The reality is nearly all are the full package.


Have you met some of the over-the-top STEM kids? No, they don't know how to socialize, and they are best working in the back cubicle coding for you. They will do an excellent job in that role.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:that northeastern gamed the system for rankings


But that’s not a myth. It’s actually true. Took about 2 decades but they did it.


You can call it gaming, but Northeastern looked at the formula and sought to improve where it could. The myth would be that no other schools did the same thing. They may not have been as effective, but many have. Schools took steps to lower admission rate when it was a factor (by inducing applications through push marketing); to shrink class sizes (through registration cutoffs); to flatter the student-faculty ratio (by counting faculty who are peripheral to undergraduate education); likewise with resources (counting resources in IPEDS, the government database used by USNWR, that are peripheral to undergraduate education); to improve alumni giving rate when it was a factor (by dropping alumni from the database to lower the denominator; I think Berkeley was caught doing this); to influence other voters as if it were Eurovision; and to admit students who don't count against their stats (foreign, spring admits, etc.). There is literally no factor in the USNWR formula that could not be "gamed" to some extent.


And this is part of the reason USNWR dropped some of these metrics; the myth is that it did so to “hurt” private schools. The idea that the Pell grant criteria exist just to help public schools is also a partial myth, because many T20s, especially the Ivies, actually do very well on those metrics.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:that northeastern gamed the system for rankings


But that’s not a myth. It’s actually true. Took about 2 decades but they did it.


You can call it gaming, but Northeastern looked at the formula and sought to improve where it could. The myth would be that no other schools did the same thing. They may not have been as effective, but many have. Schools took steps to lower admission rate when it was a factor (by inducing applications through push marketing); to shrink class sizes (through registration cutoffs); to flatter the student-faculty ratio (by counting faculty who are peripheral to undergraduate education); likewise with resources (counting resources in IPEDS, the government database used by USNWR, that are peripheral to undergraduate education); to improve alumni giving rate when it was a factor (by dropping alumni from the database to lower the denominator; I think Berkeley was caught doing this); to influence other voters as if it were Eurovision; and to admit students who don't count against their stats (foreign, spring admits, etc.). There is literally no factor in the USNWR formula that could not be "gamed" to some extent.


NEU calls it gaming: https://www.bostonmagazine.com/news/2014/08/26/how-northeastern-gamed-the-college-rankings/

Quoting NEU: "We did play other kinds of games,” he says. “You get credit for the number of classes you have under 20 [students], so we lowered our caps on a lot of our classes to 19 just to make sure.” From 1996 to the 2003 edition (released in 2002), Northeastern rose 20 spots. (The title of each U.S. News “Best Colleges” edition actually refers to the upcoming year.)"

Quoting NEU: "There was one thing, however, that U.S. News weighted heavily that could not be fixed with numbers or formulas: the peer assessment. This would require some old-fashioned glad-handing. Freeland guessed that if there were 100 or so universities ahead of NU and if three people at each school were filling out the assessments, he and his team would have to influence some 300 people. “We figured, ‘That’s a manageable number, so we’re just gonna try to get to every one of them,’” Freeland says. “Every trip I took, every city I went to, every conference I went to, I made a point of making contact with any president who was in that national ranking.” Meanwhile, he put less effort into assessing other schools."

Quoting the article: "...but he had “gamed” the system as far as he could on his own. To break into the top 100, he’d need more intel on the news magazine’s methodology. He would also need U.S. News’s complicity. “We were trying to move the needle,” Freeland says, “and we felt there were a couple of ways in which the formula was not fair to Northeastern.” And so it was in 2004 when Freeland, a 63-year-old with bushy gray eyebrows and slightly unkempt hair, stepped out of a taxi near the waterfront in Washington, DC’s fashionable Georgetown neighborhood. With his head down, his lips tightly pursed, he marched into the red-brick offices of U.S. News, determined to make the rankings wizard, data guru Robert Morse, his accomplice."

These are not things other schools do.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:^^^
Maybe it’s that other things may matter more than grades/scores once you hit a min baseline?

It’s why we hear of so many 4.0 UW / 35 ACT / 1550+ SAT kids being shut out of the top 25, whereas a kid with a 3.8 UW applying test optional is getting in.



ding ding ding ding, someone wins a prize!

There isn't one baseline. There are six rating levels: 1, 2, 3, 4, 5, and 6. The scores you need in the other categories for an offer depend on where you fall among these academic ratings. There's not one line.


Sort of. Less than a 3 in any category except athletics is generally a hard pass.


I saw this comment from a former AO on reddit and thought it was good:

https://www.reddit.com/r/ApplyingToCollege/comments/1jekub2/comment/miju4kv/?context=3

Athletes, the ones who are recruited, start the admissions process sometimes much earlier than everyone else. Some athletes, if a coach really wants them, will be recruited and admitted in their junior year sometimes. There are some recruited athletes that get in just about the time that other students start applying. Other students might not be recruited that early and end up in the same pool as everyone else. The answer to your question mostly depends on the coach, the sport, and how good the student is, and if they're going to be recruited or not. That's what determines the "boost."

The backdoor no one ever seems to mention are the corporations. The Harvard Corporation, formally known as President and Fellows of Harvard College, as an example, will have the chance to put forth some students they think are remarkable and deserve attention from admissions. Now, getting nominated means you need to somehow, organically and authentically, get to know them and have them vouch for you. It's not really a major way to get in, but it's the "backdoor" you're asking about.

This is above my pay grade tbh lol. Idk.

Contrary to popular belief, AOs don't all spend 15 min per app for everyone. We deny troves of students quickly, say roughly in 2-5 minutes. Every app does get reviewed by a human, but we have enough experience to know when a student stands out.

I personally don't think so. I've always been fond of community colleges and think that a student savvy enough and curious enough will do just fine at a CC and transferring. Now, if you ask me as a consultant, then I mean my answer changes in that it depends on the parents and students and what they want.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:that northeastern gamed the system for rankings


But that’s not a myth. It’s actually true. Took about 2 decades but they did it.


You can call it gaming, but Northeastern looked at the formula and sought to improve where it could. The myth would be that no other schools did the same thing. They may not have been as effective, but many have. Schools took steps to lower admission rate when it was a factor (by inducing applications through push marketing); to shrink class sizes (through registration cutoffs); to flatter the student-faculty ratio (by counting faculty who are peripheral to undergraduate education); likewise with resources (counting resources in IPEDS, the government database used by USNWR, that are peripheral to undergraduate education); to improve alumni giving rate when it was a factor (by dropping alumni from the database to lower the denominator; I think Berkeley was caught doing this); to influence other voters as if it were Eurovision; and to admit students who don't count against their stats (foreign, spring admits, etc.). There is literally no factor in the USNWR formula that could not be "gamed" to some extent.


NEU calls it gaming: https://www.bostonmagazine.com/news/2014/08/26/how-northeastern-gamed-the-college-rankings/

Quoting NEU: "We did play other kinds of games,” he says. “You get credit for the number of classes you have under 20 [students], so we lowered our caps on a lot of our classes to 19 just to make sure.” From 1996 to the 2003 edition (released in 2002), Northeastern rose 20 spots. (The title of each U.S. News “Best Colleges” edition actually refers to the upcoming year.)"

Quoting NEU: "There was one thing, however, that U.S. News weighted heavily that could not be fixed with numbers or formulas: the peer assessment. This would require some old-fashioned glad-handing. Freeland guessed that if there were 100 or so universities ahead of NU and if three people at each school were filling out the assessments, he and his team would have to influence some 300 people. “We figured, ‘That’s a manageable number, so we’re just gonna try to get to every one of them,’” Freeland says. “Every trip I took, every city I went to, every conference I went to, I made a point of making contact with any president who was in that national ranking.” Meanwhile, he put less effort into assessing other schools."

Quoting the article: "...but he had “gamed” the system as far as he could on his own. To break into the top 100, he’d need more intel on the news magazine’s methodology. He would also need U.S. News’s complicity. “We were trying to move the needle,” Freeland says, “and we felt there were a couple of ways in which the formula was not fair to Northeastern.” And so it was in 2004 when Freeland, a 63-year-old with bushy gray eyebrows and slightly unkempt hair, stepped out of a taxi near the waterfront in Washington, DC’s fashionable Georgetown neighborhood. With his head down, his lips tightly pursed, he marched into the red-brick offices of U.S. News, determined to make the rankings wizard, data guru Robert Morse, his accomplice."

These are not things other schools do.


They absolutely are things other schools do.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:that northeastern gamed the system for rankings


But that’s not a myth. It’s actually true. Took about 2 decades but they did it.


You can call it gaming, but Northeastern looked at the formula and sought to improve where it could. The myth would be that no other schools did the same thing. They may not have been as effective, but many have. Schools took steps to lower admission rate when it was a factor (by inducing applications through push marketing); to shrink class sizes (through registration cutoffs); to flatter the student-faculty ratio (by counting faculty who are peripheral to undergraduate education); likewise with resources (counting resources in IPEDS, the government database used by USNWR, that are peripheral to undergraduate education); to improve alumni giving rate when it was a factor (by dropping alumni from the database to lower the denominator; I think Berkeley was caught doing this); to influence other voters as if it were Eurovision; and to admit students who don't count against their stats (foreign, spring admits, etc.). There is literally no factor in the USNWR formula that could not be "gamed" to some extent.


NEU calls it gaming: https://www.bostonmagazine.com/news/2014/08/26/how-northeastern-gamed-the-college-rankings/

Quoting NEU: "We did play other kinds of games,” he says. “You get credit for the number of classes you have under 20 [students], so we lowered our caps on a lot of our classes to 19 just to make sure.” From 1996 to the 2003 edition (released in 2002), Northeastern rose 20 spots. (The title of each U.S. News “Best Colleges” edition actually refers to the upcoming year.)"

Quoting NEU: "There was one thing, however, that U.S. News weighted heavily that could not be fixed with numbers or formulas: the peer assessment. This would require some old-fashioned glad-handing. Freeland guessed that if there were 100 or so universities ahead of NU and if three people at each school were filling out the assessments, he and his team would have to influence some 300 people. “We figured, ‘That’s a manageable number, so we’re just gonna try to get to every one of them,’” Freeland says. “Every trip I took, every city I went to, every conference I went to, I made a point of making contact with any president who was in that national ranking.” Meanwhile, he put less effort into assessing other schools."

Quoting the article: "...but he had “gamed” the system as far as he could on his own. To break into the top 100, he’d need more intel on the news magazine’s methodology. He would also need U.S. News’s complicity. “We were trying to move the needle,” Freeland says, “and we felt there were a couple of ways in which the formula was not fair to Northeastern.” And so it was in 2004 when Freeland, a 63-year-old with bushy gray eyebrows and slightly unkempt hair, stepped out of a taxi near the waterfront in Washington, DC’s fashionable Georgetown neighborhood. With his head down, his lips tightly pursed, he marched into the red-brick offices of U.S. News, determined to make the rankings wizard, data guru Robert Morse, his accomplice."

These are not things other schools do.


They absolutely are things other schools do.
What other schools camped out to meet the US News rankings editor and waited and waited until he would talk to them?
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:that northeastern gamed the system for rankings


But that’s not a myth. It’s actually true. Took about 2 decades but they did it.


You can call it gaming, but Northeastern looked at the formula and sought to improve where it could. The myth would be that no other schools did the same thing. They may not have been as effective, but many have. Schools took steps to lower admission rate when it was a factor (by inducing applications through push marketing); to shrink class sizes (through registration cutoffs); to flatter the student-faculty ratio (by counting faculty who are peripheral to undergraduate education); likewise with resources (counting resources in IPEDS, the government database used by USNWR, that are peripheral to undergraduate education); to improve alumni giving rate when it was a factor (by dropping alumni from the database to lower the denominator; I think Berkeley was caught doing this); to influence other voters as if it were Eurovision; and to admit students who don't count against their stats (foreign, spring admits, etc.). There is literally no factor in the USNWR formula that could not be "gamed" to some extent.


And this is part of the reason USNWR dropped some of these metrics; the myth is that it did so to “hurt” private schools. The idea that the Pell grant criteria exist just to help public schools is also a partial myth, because many T20s, especially the Ivies, actually do very well on those metrics.


Pretty much all USNWR metrics can be gamed, manipulated, or otherwise optimized. SAT/ACT: have low scorers you want to admit apply test optional, or admit them outside the fall cohort. Top 10%: most high schools do not report class rank, so count those in the top 10% when available and make sure others you want to accept come from schools that do not report rank. Resources: count resources associated with graduate programs, like medical schools, in the totals (note how UCLA has had much higher resource scores than Berkeley for this reason). Acceptance rate (when it was considered): promote applications through push tactics, count incompletes, and simplify the application to make it easier to submit (Tulane, etc.). Class size: cut off enrollment just below USNWR thresholds. Student-faculty ratio: count faculty not really engaged with undergraduates (Penn, etc.). Alumni giving rate: reduce the denominator by dropping alumni the school says can't be reached (Berkeley, etc.).
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:that northeastern gamed the system for rankings


But that’s not a myth. It’s actually true. Took about 2 decades but they did it.


You can call it gaming, but Northeastern looked at the formula and sought to improve where it could. The myth would be that no other schools did the same thing. They may not have been as effective, but many have. Schools took steps to lower admission rate when it was a factor (by inducing applications through push marketing); to shrink class sizes (through registration cutoffs); to flatter the student-faculty ratio (by counting faculty who are peripheral to undergraduate education); likewise with resources (counting resources in IPEDS, the government database used by USNWR, that are peripheral to undergraduate education); to improve alumni giving rate when it was a factor (by dropping alumni from the database to lower the denominator; I think Berkeley was caught doing this); to influence other voters as if it were Eurovision; and to admit students who don't count against their stats (foreign, spring admits, etc.). There is literally no factor in the USNWR formula that could not be "gamed" to some extent.


NEU calls it gaming: https://www.bostonmagazine.com/news/2014/08/26/how-northeastern-gamed-the-college-rankings/

Quoting NEU: "We did play other kinds of games,” he says. “You get credit for the number of classes you have under 20 [students], so we lowered our caps on a lot of our classes to 19 just to make sure.” From 1996 to the 2003 edition (released in 2002), Northeastern rose 20 spots. (The title of each U.S. News “Best Colleges” edition actually refers to the upcoming year.)"

Quoting NEU: "There was one thing, however, that U.S. News weighted heavily that could not be fixed with numbers or formulas: the peer assessment. This would require some old-fashioned glad-handing. Freeland guessed that if there were 100 or so universities ahead of NU and if three people at each school were filling out the assessments, he and his team would have to influence some 300 people. “We figured, ‘That’s a manageable number, so we’re just gonna try to get to every one of them,’” Freeland says. “Every trip I took, every city I went to, every conference I went to, I made a point of making contact with any president who was in that national ranking.” Meanwhile, he put less effort into assessing other schools."

Quoting the article: "...but he had “gamed” the system as far as he could on his own. To break into the top 100, he’d need more intel on the news magazine’s methodology. He would also need U.S. News’s complicity. “We were trying to move the needle,” Freeland says, “and we felt there were a couple of ways in which the formula was not fair to Northeastern.” And so it was in 2004 when Freeland, a 63-year-old with bushy gray eyebrows and slightly unkempt hair, stepped out of a taxi near the waterfront in Washington, DC’s fashionable Georgetown neighborhood. With his head down, his lips tightly pursed, he marched into the red-brick offices of U.S. News, determined to make the rankings wizard, data guru Robert Morse, his accomplice."

These are not things other schools do.


They absolutely are things other schools do.
What other schools camped out to meet the US News rankings editor and waited and waited until he would talk to them?


Northeastern was an early mover, more effective, and went to greater lengths. But other schools do it.
Anonymous
Anonymous wrote:It is either:

- a well rounded, normal, happy kid in typical activities

Or

- a curated CV of an unhappy kid, assembled by mom and dad over many years


Or, the school I like has well-rounded, normal students. For the school I don't like, which may have similar stats, the students are nerdy, pointy, etc.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:that northeastern gamed the system for rankings


But that’s not a myth. It’s actually true. Took about 2 decades but they did it.


You can call it gaming, but Northeastern looked at the formula and sought to improve where it could. The myth would be that no other schools did the same thing. They may not have been as effective, but many have. Schools took steps to lower admission rate when it was a factor (by inducing applications through push marketing); to shrink class sizes (through registration cutoffs); to flatter the student-faculty ratio (by counting faculty who are peripheral to undergraduate education); likewise with resources (counting resources in IPEDS, the government database used by USNWR, that are peripheral to undergraduate education); to improve alumni giving rate when it was a factor (by dropping alumni from the database to lower the denominator; I think Berkeley was caught doing this); to influence other voters as if it were Eurovision; and to admit students who don't count against their stats (foreign, spring admits, etc.). There is literally no factor in the USNWR formula that could not be "gamed" to some extent.


And this is part of the reason USNWR dropped some of these metrics; the myth is that it did so to “hurt” private schools. The idea that the Pell grant criteria exist just to help public schools is also a partial myth, because many T20s, especially the Ivies, actually do very well on those metrics.


Pell grants, along with factoring in research output, greatly aided large publics when coupled with dropping alumni giving, student-faculty ratio, etc. I think USNWR believed it would broaden the appeal of the ranking by greatly expanding the enrollment of the schools in, say, the top 50.
Anonymous
Why is this still coming up? A bunch of clueless newbies on the board?
The old articles basically say Northeastern played by the rules, while many schools didn't just game but actually cheated. Emory, Berkeley, and Columbia are some examples. Nobody ever brings up the cheating when talking about those schools. Unlike those schools, Northeastern was never accused of cheating, yet haters keep bringing up 'gaming'.

Northeastern is probably also the school that has made the most improvement over the decades, and its major metrics now rival those of the very top schools, including retention rate, graduation rate, student profile, and outcomes. If this is the result of 'gaming', then schools, please do game, but don't cheat.


