Anonymous wrote: that Northeastern gamed the system for rankings
But that’s not a myth. It’s actually true. Took about 2 decades but they did it.
You can call it gaming, but Northeastern looked at the formula and sought to improve where it could. The myth would be that no other schools did the same thing. They may not have been as effective, but many tried. Schools took steps to lower their admission rate when it was a factor (by inducing applications through push marketing), to shrink reported class sizes (through registration cutoffs), and to improve the student-faculty ratio (by counting faculty who are peripheral to undergraduate education); the same goes for resources (counting resources in IPEDS, the government database used by USNWR, that are peripheral to undergraduate education). Some improved the alumni giving rate, when it was a factor, by dropping alumni from the database to shrink the denominator (I think Berkeley was caught doing this), lobbied other voters as if it were Eurovision, and admitted students who don't count against their stats (foreign, spring admits, etc.). There is literally no USNWR factor that could not be "gamed" to some extent.
And this is part of the reason USNWR dropped some of these metrics; the myth is that it did so to “hurt” private schools. The claim that the Pell Grant criteria exist just to help public schools is also a partial myth, because many T20s, especially the Ivies, actually do very well on those metrics.
Anonymous wrote: It is either:
- a well-rounded, normal, happy kid in typical activities
or
- a curated CV of an unhappy kid, implemented by mom and dad over many years
Anonymous wrote: What other schools camped out to meet the US News reporter and waited and waited until he would talk to them?
NEU calls it gaming: https://www.bostonmagazine.com/news/2014/08/26/how-northeastern-gamed-the-college-rankings/
Quoting the article: "We did play other kinds of games,” he says. “You get credit for the number of classes you have under 20 [students], so we lowered our caps on a lot of our classes to 19 just to make sure.” From 1996 to the 2003 edition (released in 2002), Northeastern rose 20 spots. (The title of each U.S. News “Best Colleges” edition actually refers to the upcoming year.)"
Quoting the article: "There was one thing, however, that U.S. News weighted heavily that could not be fixed with numbers or formulas: the peer assessment. This would require some old-fashioned glad-handing. Freeland guessed that if there were 100 or so universities ahead of NU and if three people at each school were filling out the assessments, he and his team would have to influence some 300 people. “We figured, ‘That’s a manageable number, so we’re just gonna try to get to every one of them,’” Freeland says. “Every trip I took, every city I went to, every conference I went to, I made a point of making contact with any president who was in that national ranking.” Meanwhile, he put less effort into assessing other schools."
Quoting the article: "but he had “gamed” the system as far as he could on his own. To break into the top 100, he’d need more intel on the news magazine’s methodology. He would also need U.S. News’s complicity. “We were trying to move the needle,” Freeland says, “and we felt there were a couple of ways in which the formula was not fair to Northeastern.” And so it was in 2004 when Freeland, a 63-year-old with bushy gray eyebrows and slightly unkempt hair, stepped out of a taxi near the waterfront in Washington, DC’s fashionable Georgetown neighborhood. With his head down, his lips tightly pursed, he marched into the red-brick offices of U.S. News, determined to make the rankings wizard, data guru Robert Morse, his accomplice."
These are not things other schools do.
They absolutely are things other schools do.
Anonymous wrote: ^^^
Maybe it’s that other things matter more than grades/scores once you hit a minimum baseline?
It’s why we hear of so many 4.0 UW / 35 ACT / 1550+ SAT kids being shut out of the top 25, whereas a kid with a 3.8 UW applying test-optional is getting in.
ding ding ding ding, someone wins a prize!
There isn't one baseline. There are six academic bands: 1, 2, 3, 4, 5, and 6. The scores you need in the other categories for an offer depend on where you fall relative to these academic bands; there is no single cutoff line.
Sort of. Less than a 3 in any category except athletics is generally a hard pass.
Anonymous wrote: Do people really believe kids at elites can’t look people in the eye and are socially inept? You see the hoops to getting accepted. The reality is that nearly all are the full package.
Anonymous wrote: I was at a career day and a kid asked me how much rank of school matters.
I told him what I believe: that it matters a little bit, especially earlier in your career and all other things being equal, but it doesn’t matter as much as people think it does, and especially later in your career no one even knows that I went to a top school.
Also, if a bright kid who is at a high ranked school can’t look me in the eye or socialize with me normally or express why they want this job over others, and I don’t wanna spend the summer with them, they’re not getting the internship.
It totally depends on the person’s major/field. In STEM it matters the least; in humanities/poli-sci/history/psych/English, etc., it matters a lot.
Reputation matters, more or less depending on field, but I think the myth is that there is a firm, ordinal ranking like USNWR perpetuates (e.g. Princeton > Stanford because USNWR has them 1 and 4 respectively). I don't think it is that defined and rigid in the minds of recruiters and those in positions of influence.
In my field, high tech, school literally never comes up at this point in my career.
Reputation can also be highly regional and industry-specific. Like, Procter & Gamble loves University of Cincinnati students (they have a relationship with them for co-ops). Or tech companies in the Bay Area recruit heavily from San Jose State.