Academic strength of Sidwell and Landon

Anonymous
Anonymous wrote:
Anonymous wrote:How can we know if the "cohort" really would be different? I mean this genuinely.

http://fwd4.me/Jes


This chart is useless without citation of sources for the data. Would you please provide your sources?
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:How can we know if the "cohort" really would be different? I mean this genuinely.

http://fwd4.me/Jes


This chart is useless without citation of sources for the data. Would you please provide your sources?


Also, each of the categories in the chart is based on one standardized-test score, hardly a strong basis for judging an individual student or the entire student body. It may be that some schools place more emphasis on test scores in their admissions process, ending up with a student body of high scorers, while other schools may give greater consideration to other factors such as artistic ability, leadership potential, and intellectual passion, which are arguably as important as test scores, or more so, in judging the quality of a student or student body.
Anonymous
Yeah, because whoever put it together hasn't done enough work already. And she owes you personally even though you can't be bothered to check her work yourself or to provide sources. You'll just condemn her work as useless while demanding she do even more. Somehow the "please" sounds insincere...
Anonymous
Anonymous wrote:Yeah, because whoever put it together hasn't done enough work already. And she owes you personally even though you can't be bothered to check her work yourself or to provide sources. You'll just condemn her work as useless while demanding she do even more. Somehow the "please" sounds insincere...


Don't be so gullible that you'll accept whatever statistics are fed to you without verification and citation of sources. Did you never take a basic course in statistics? It's not a question of debt and gratitude, as you seem to think, but a matter of intellectual rigor to require that statistical data have sources cited.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:How can we know if the "cohort" really would be different? I mean this genuinely.

http://fwd4.me/Jes

This chart is useless without citation of sources for the data. Would you please provide your sources?

Also, each of the categories in the chart is based on one standardized-test score ....

I think you're not reading the spreadsheet closely or correctly. The NMSF numbers are based on 10-15 years of data. The SAT scores, the college admission stats, and the Presidential Scholar stats all are based on several years of data. And the sources are cited. Try taking a closer look.

BTW, I am not logged in, but I am SAM2.
Anonymous
SAM2, are you the one who put together this spreadsheet? If so, thank you for all of your hard work.

Anonymous
Anonymous wrote:SAM2, are you the one who put together this spreadsheet? If so, thank you for all of your hard work.

I am. You're welcome. It's been an interesting project.
Anonymous
Anonymous wrote:
Anonymous wrote:Yeah, because whoever put it together hasn't done enough work already. And she owes you personally even though you can't be bothered to check her work yourself or to provide sources. You'll just condemn her work as useless while demanding she do even more. Somehow the "please" sounds insincere...


Don't be so gullible that you'll accept whatever statistics are fed to you without verification and citation of sources. Did you never take a basic course in statistics? It's not a question of debt and gratitude, as you seem to think, but a matter of intellectual rigor to require that statistical data have sources cited.


No, actually, it's a matter of documenting problems rather than casting aspersions. There's no "intellectual rigor" in dismissing work as useless because it doesn't include footnotes. "Yeah -- prove it!" isn't rigorous; it's just lazy and rude.

FWIW, for a variety of reasons, this isn't the data I'd want to use to compare schools -- e.g., I think NM commended is a more useful measure of cohort than NMSF, and that absolute numbers matter more than percentages -- but it clearly is the kind of data some people want. Having poked around trying to find some of this info previously, I know how hard it is to collect, and I've seen this poster ask for additional sources/cites to inform or correct her analysis. So if you want to be helpful, give her ways to fill in the gaps, or point out sources that suggest her data is flawed. And if you think it's a misconceived project, explain why. Or if you don't care enough to do either, just let it pass and don't respond. But your approach didn't contribute anything of value. It was just obnoxious.
Anonymous
Anonymous wrote:
Anonymous wrote:How can we know if the "cohort" really would be different? I mean this genuinely.

http://fwd4.me/Jes


This person is to be commended for working hard, but the methodology is out of whack. For example, a website that someone mentioned on another thread, www.matriculationstats.org, has far more accurate and useful data on college matriculation (provided by most of the private schools themselves!) than this poster's reliance on a WSJ survey that includes Pomona, but excludes Yale and Dartmouth. This is why previous posters troubled by a few select schools' reluctance to make matriculation data available have a valid point. Until the last few local schools make their matriculation data available, anyone can spin any statistics any way they care to in order to achieve a desired, non-objective result.
SAM2
Anonymous wrote: ... the methodology is out of whack. For example, a website that someone mentioned on another thread, www.matriculationstats.org, has far more accurate and useful data on college matriculation (provided by most of the private schools themselves!) than this poster's reliance on a WSJ survey that includes Pomona, but excludes Yale and Dartmouth. This is why previous posters troubled by a few select schools' reluctance to make matriculation data available have a valid point. Until the last few local schools make their matriculation data available, anyone can spin any statistics any way they care to in order to achieve a desired, non-objective result.

I absolutely agree that matriculationstats.org has great data presentation -- far better than what I put together, and covering many more schools around the country. But I'd like to clarify the slightly different approach I took, and perhaps correct a couple misunderstandings you might have.

First of all, you should understand that Matriculationstats is focused on evaluating NYC schools, while I am focused on DC schools, so our aims are slightly different. Because he is focused on NYC schools, it does not matter as much to him that there are gaps in his DC school list (e.g., Sidwell, GDS, Potomac, G'town Visi, etc.). For my approach, it was important to try to fill those gaps.

When I looked for ways to fill those gaps, the best data I could find comes from two Wall Street Journal studies, and one Worth Magazine study, all of which are commonly cited here on DCUM. Those studies obtained data on many of the DC schools that don't routinely report college matriculations (e.g., Sidwell, GDS, Potomac). However, each of those studies only focused on a handful of colleges, and only for a limited time period (1 year for WSJ, and 4 years for Worth). In making my own analysis, I wanted to make use of the deep research those studies did, but also extend the coverage of those existing studies. So I used the same methodology those studies did, and focused on the same colleges they did. If you want to criticize the use of colleges like Pomona, blame the WSJ, not me. (Also, you are incorrect to suggest that Yale and Dartmouth are excluded; data on those colleges is covered.)

Other than the WSJ/Worth data (which I use and he doesn't), I'm pretty sure most of the other data Matriculationstats and I use is exactly the same -- it all comes directly from the schools' own websites. If you're interested in further research, I cited links to most of the data sources in my spreadsheet. I think Matriculationstats also used some non-public data from certain NYC schools, which he obtained directly from those NYC schools or people associated with the schools. I was not interested in contacting DC schools to ask for private matriculation data, so all my analysis is based on publicly available data. If anyone wants to PM me with credible private data, I'd be happy to incorporate it.

In the end, I've blended the results of the three studies from WSJ and Worth, and added data from several DC schools. This means that the analysis for every school includes at least three years' worth of data for 3-14 different colleges, and some have up to 10 years of data included. It's all public data available to anyone, and I've "shown my work" by making the calculations and underlying numbers obvious, so with a little effort, anyone can check my work and evaluate the numbers. I have not selectively excluded any data, or "spun" the data in any direction. I'm usually a pretty skeptical critic of my own work, and I think this analysis is a pretty good rough representation of college matriculations for these schools. I completely agree this analysis is not perfect -- I wish I had 10 years of data on every school, so I could dissect and re-synthesize it in various ways. But I don't have that data, so I proceeded with the tools that presented themselves.

Thanks for all the kind words, and even for the constructive criticism. If anyone has good ideas on how to make this analysis stronger, or sources for additional data, please let me know.
Anonymous

To SAM2:

Thanks so much for sharing the tables. Very useful!
Amazing how much effort you have put into this, and how well you have done given the data constraints. Thank you again for sharing!

Regarding the results shown in your table, I find it interesting just how much better the magnet schools (and many of the publics) are doing -- on all the variables you show.

I am wondering, for a smart kid who would qualify, how much more competitive it is to get into the magnets than into the BIG privates, and how different the experience at a magnet is from the experience at a private. Any leads or earlier threads on this, by any chance?
SAM2
Anonymous wrote:Regarding the results shown in your table, I find it interesting just how much better the magnet schools (and many of the publics) are doing -- on all the variables you show. I am wondering, for a smart kid who would qualify, how much more competitive it is to get into the magnets than into the BIG privates, and how different the experience at a magnet is from the experience at a private. Any leads or earlier threads on this, by any chance?

Public magnets in DC and other places do put up some absolutely incredible numbers. I have no magic insight, but my best guess is that part of the answer for their success is that they are selecting applicants in the 8th grade, so they have a good "body of work" they can use to predict how students will do in 9th-12th. By contrast, many of the K-12 private schools only select about 20-30% of their high school classes from 8th grade applicants. The large majority of the high school class is made up of students who applied many years (perhaps even 10 years) earlier at younger grades. Predicting the future abilities of those young applicants many years in advance is probably very hard. Some might blossom into superstars, but others might not show much academically. I've been trying for a while now to think up some way to compare and contrast those multi-year applicant pools. I've got a couple ideas, but have had no time to try the math yet. If you have ideas for approaches, please let me know. (The one interesting exception I've seen is Hunter in NYC, which selects its classes at K, and then teaches them up through 12th.)

Also, I think most magnets select students largely on the basis of standardized test performance, so the skills that lead applicants to do well on the standardized admissions tests will readily lend themselves to high performance on other standardized tests like the PSAT and SAT, which in turn leads to strong numbers in the NMSF and Presidential Scholar analysis.

No matter how you slice it, top public magnets and top privates all seem to turn out incredibly smart graduates. I think trying to compare different schools in a ranking sense is really a futile effort. The best that you could possibly do is create some broad and vaguely differentiated tiers. There's also the very important point that your individual child (or mine) is going to develop best if she's matched carefully to the proper school, and not just forced into a top-performing school that does not mesh with her innate abilities.

I haven't studied the admissions percentages at public magnets, so I am not much help there. I've seen various private school admission rates that range from 1:7 to 1:15, but I have no idea how accurate those numbers are. I think I recall seeing a report from Thomas Jefferson that detailed the exact numbers for that school, and I think the ratio was something like 1:8 or 1:10. (If you're motivated to find that data, I think it was in the Blue Ribbon report that studied the lack of racial diversity at TJ.)

I hope this helps.
Anonymous
SAM2 wrote: ... When I looked for ways to fill those gaps, the best data I could find comes from two Wall Street Journal studies, and one Worth Magazine study ... In the end, I've blended the results of the three studies from WSJ and Worth, and added data from several DC schools. ... If anyone has good ideas on how to make this analysis stronger, or sources for additional data, please let me know.


My suggestion would be to blend in the hard matriculation data that the schools make available, to rely less on the two 1-year WSJ studies, and to rely more on the 4-year Worth study. The Worth Magazine survey has Harvard-Yale-Princeton as the top schools and covers four years, while the WSJ includes Pomona but not Amherst, and covers only one year. Locally, Worth has St. Albans in the top spot, NCS in second, Potomac School in third, Sidwell in fourth, Holton-Arms in fifth, GDS in sixth, Landon in seventh, and Maret in eighth, while your rankings differ somewhat. I would agree that matriculationstats is probably the gold standard, and its website includes St. Albans, NCS, Holton-Arms, Landon, Georgetown Prep, Maret, and Gonzaga, but not Potomac School, Sidwell, or GDS. The best local survey would include those last three schools (if their lists became available), along with maybe St. Stephen's/St. Agnes and a few others. It's puzzling how the top New York City schools (by matriculation rank: Collegiate, Trinity, Brearley, Chapin, and Spence) all make their matriculation data available, while a select few local schools choose to withhold that info. Oh well. You're still performing a useful service, trying to bring objectivity to a process that is fraught with political bias and other perils.
SAM2
Anonymous wrote:My suggestion would be to blend in the hard matriculation data that the schools make available ...

That's exactly what I did. For example, I used the 2004 WSJ number for NCS (22 students at 10 examined colleges in 2003), and then added to that the numbers NCS supplies for those exact same 10 colleges for the period 2005-2009 (97 students at 10 colleges for that period). I performed that same analysis for NCS for the colleges studied by the 2007 WSJ report (8 colleges), and for the 2002 Worth report (3 colleges). Then I averaged the resulting percentages to obtain a blended result. The final percentage is not really a reflection of what percentage of students actually attend all top colleges in the country -- that's more what Matriculationstats was trying to examine by making judgments as to which colleges are top colleges. Instead, my analysis looks at what percentage attended certain top colleges, and just extends the WSJ/Worth methodologies to several additional years where data is available.
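
For anyone who wants to check that arithmetic, here is a minimal sketch (in Python) of the blending calculation as described above. The placed-student counts (22 for 2003, 97 for 2005-2009) come straight from this post; the graduating-class sizes and the other two per-study percentages are hypothetical placeholders, since the thread does not state them.

def study_percentage(placed, graduates):
    # Pooled share of graduates who matriculated at the colleges a given
    # study examined, across all the years covered for that study.
    return sum(placed) / sum(graduates)

# 2004 WSJ methodology (10 colleges), extended with NCS's own reported
# numbers: 22 students in 2003, plus 97 students over 2005-2009.
# The assumed class size of 115/year is purely illustrative.
wsj_2004 = study_percentage(placed=[22, 97], graduates=[115, 5 * 115])

# The 2007 WSJ (8 colleges) and 2002 Worth (3 colleges) percentages would
# be computed the same way; these values are hypothetical placeholders.
wsj_2007 = 0.20
worth_2002 = 0.10

# Blended result: the simple average of the three per-study percentages.
blended = (wsj_2004 + wsj_2007 + worth_2002) / 3
print(f"Blended matriculation percentage: {blended:.1%}")

Note that averaging the per-study percentages, rather than pooling all the raw counts into one ratio, keeps each study's set of colleges on an equal footing, which matches the averaging step the post describes.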