I think their point was that early-age intelligence tests (like the WPPSI) are only moderately correlated with academic success when viewed over a long stretch of time (5-10 years). I think the studies they actually cite suggest a correlation of just below 50%, although they do not discuss the many studies showing correlations in the 60-80% range. The authors complain that some schools base admissions and giftedness decisions on a single early-age snapshot test and do not confirm the results with later tests. They seem to suggest that schools should retest students repeatedly over multiple years to gauge their development, and should add or remove students from gifted programs depending on the results.

I think they like claiming that the early-age tests are inaccurate, because it's a relatively controversial statement that fits with their overarching thesis of "everything you thought you knew about childhood development is wrong." But when I read closely, it seemed (to me at least) that they're not really saying those early tests are completely inaccurate. Instead, they're just saying it's much more accurate to use later-age tests and/or to re-test.

One of the claims that made me suspicious was the claim that none of the largest school systems (including Montgomery County) use re-tests to evaluate giftedness -- once a child tests into a gifted program, she's in forever. From what I know of MoCo schools, that's not accurate, since the kids must test into gifted programs at various grade levels (elementary vs. middle vs. high school), and are not given any "free pass" just because they tested into a gifted program at a younger age. But my knowledge of MoCo is limited, so maybe I'm wrong about that.
I've been searching for a while now for some data on the WPPSI score distribution, but no luck so far. If you find anything, please post.
Anonymous wrote:
Anonymous wrote:Very cool website. I'm impressed (and jealous) that you got your hands on so much matriculation data from so many schools.

Thanks for the compliment. Other than a very few schools (Bronx Science, for example, handed me the list when we toured last spring), all of the information was taken from each school's website. I would be fine with using information that was generally available (like information handed out to parents at a school) as long as it was clear to me that it was genuine. Like the NYC schools, some DC schools seem to make that information readily available and some do not. I was mildly surprised that St. Alban's doesn't make that information available, since it appears that they have the type of statistics that would make them look good.

I am one of the stat-heads on DCUM. It seems few of the DC schools make this info publicly available, even though they have impressive results. I am awed by what you put together. Great job.

http://www.stalbansschool.org/contentPage.aspx?pageId=42341&sectionId=961
http://www.dcurbanmom.com/jforum/posts/list/67710.page
I'm not a participant in the "math wars," but something I've never understood is why each side feels the need to insist that its preferred curriculum is strong and all other curricula are absolutely terrible. My suspicion would be that each of the various modern curricula has its own particular strengths and weaknesses. None will be perfect, but most will be pretty good. The real determinative factors will be whether a particular teacher is adept at teaching the chosen curriculum and whether particular students happen to do well with that curriculum. Some teachers and some students will do better with some curricula than with others.

Why does everyone seem to adopt the "my way and no other" attitude? Is that just a function of human nature? Or is there some reason people are especially passionate about math curricula?
Anonymous wrote:
I'm confused--why do you need OP to answer all those questions? Didn't OP provide raw data? Aren't you all with the questions just as able to analyze it? And just as able to fill in missing information if that's what you want to do? Why all the demands?

Yes, the poster did provide raw data and then proceeded with an analysis without describing the methods. What was done with missing data? Garbage in -- garbage out. The message this analysis sent made absolutely no credible sense.

PP, I did not exclude any data. I think you must not be reading my prior posts where I ask everyone to give me more data to fill those gaps. Here's one example:
SAM2 wrote:Why is data missing for some years? Did I purposefully select certain years? I did not purposefully select or deselect any years. I included all credible data I could find. If particular schools in the spreadsheet lack data for certain years, it's because I simply have not been able to find data for those years yet. Help me out by finding and posting data!

If you want more data in the spreadsheet, then spend your time finding some more data sources for me. I've already added some extra data that other people found and posted. Can you find some too?

I'm not sure what you don't understand about my calculations. As someone else helpfully pointed out, you can just click on various spreadsheet cells to see the exact formulas I used to average the data. The simple summary is that I divided the average number of NMSFs by the average class size to get a sense of what percentage of each school reaches NMSF.
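In case it's easier to see outside the spreadsheet, here is a minimal Python sketch of that same calculation (average NMSF count divided by average class size, as a percentage). Every number in it is a made-up placeholder for illustration, not a value from the spreadsheet.

```python
# Rough sketch of the spreadsheet calculation: average NMSFs divided by
# average senior class size, expressed as a percentage.
# All figures below are placeholders, not data from the actual spreadsheet.

nmsf_counts = [150, 160, 145]   # NMSFs for the years where data exists
class_sizes = [430, 440, 425]   # senior class sizes for those same years

avg_nmsf = sum(nmsf_counts) / len(nmsf_counts)
avg_class = sum(class_sizes) / len(class_sizes)

pct_nmsf = 100 * avg_nmsf / avg_class
print(f"Average NMSF share: {pct_nmsf:.1f}%")
```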

I'm also not sure what "message" you think I'm trying to send with these percentages. I've tried to make clear that I'm just offering up some data I pulled together, and I'm volunteering to supplement that data if people will find more sources. I don't have any ties to any of these high schools, so I don't care about individual schools. If I have any "message," it's that people should quit making bullshit claims about any school and instead back up their claims with verifiable data.

I hope this explanation helps you. I look forward to your contributions to fill any missing-data gaps.
Anonymous wrote:Ms spreadsheet: Give us the NMF percentages for Sidwell and STA over the last 5 years? Do you have this data?

I added columns that show NMSF % for the most recent 3-year, 6-year, and 9-year periods (in addition to the total % that was already there). I personally think limiting the calculations to more recent periods makes the results even less useful than they already were, because many schools have significant data gaps. I also think very few schools change materially over a 10-15 year period -- indeed, none of the final percentages shifts very significantly. Nevertheless, I'm in favor of more data rather than less. Enjoy.
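For anyone curious how the new period columns work, here is a short Python sketch of the same idea: the average-over-average ratio restricted to the most recent N years. All the data below is placeholder; note that years with no data simply get skipped, which is exactly why short windows can mislead for schools with gaps.

```python
# Sketch of the period columns: same avg-NMSF / avg-class-size ratio,
# restricted to the most recent N years. Placeholder data only.

data = {  # year -> (nmsf_count, class_size); None where nothing was found
    2010: (150, 430), 2009: (148, 435), 2008: None,
    2007: (155, 428), 2006: (160, 420), 2005: (140, 432),
}

def period_pct(data, n_years):
    # Take the n most recent years, then drop the ones with no data.
    recent = sorted(data.items(), reverse=True)[:n_years]
    rows = [v for _, v in recent if v is not None]
    if not rows:
        return None
    avg_nmsf = sum(nmsf for nmsf, _ in rows) / len(rows)
    avg_class = sum(size for _, size in rows) / len(rows)
    return 100 * avg_nmsf / avg_class

for n in (3, 6, 9):
    pct = period_pct(data, n)
    print(f"{n}-year window: {pct:.1f}%" if pct else f"{n}-year window: no data")
```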
Anonymous wrote:This discussion was fueled by the observation that the public TJ and the Blair Magnet Program had 4 fold the number of National Merit Finalists than STA and other private schools in this region -- approximating 40 percent of each class.

I'd like to avoid getting involved in any of this "TJ is better" squabble, but this sort of claim is precisely the reason I created the NMSF spreadsheet. Claiming that "approximately 40%" of each class at TJ is NMSF is simply false. Looking at the last 10 years of data, the figure is about 34%. Even if you look at only the last 5 years of data, it's still only 34%. You can manipulate the numbers to get close to 40% only if you take the absolute best year TJ ever had (167 NMSFs in 2006, before it slid down in recent years), and compare that high total to the absolute smallest class size (current seniors, as compared to much bigger junior and sophomore classes). And even then, the total is only about 38%.

Similarly, the claim that TJ & Blair have "4 fold the number of National Merit Finalists than STA and other private schools in this region" is misleading at best. On a pure numbers basis, TJ & Blair do have many more NMSFs (approx 153 and 44 per year), but they also each have over 1000 students. That's more students in each senior class than most private schools have in the entire high school.
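If it helps to see the arithmetic behind both points, here is a small Python sketch. The 167-in-2006 figure comes from the discussion above; the class sizes and the comparison schools are hypothetical placeholders, not real data for any particular school.

```python
# Illustration of the two points above. Only the 167-in-2006 count comes
# from this thread; all class sizes and school names are placeholders.

# 1) Cherry-picking: best-ever NMSF year divided by the smallest class size
best_year_nmsf = 167           # best year cited in the thread
smallest_class_size = 440      # placeholder for the smallest senior class
print(f"Cherry-picked share: {100 * best_year_nmsf / smallest_class_size:.0f}%")

# 2) Per-capita view: raw NMSF counts mean little without class size
schools = {
    # name: (avg NMSFs per year, senior class size) -- hypothetical values
    "Big Magnet": (153, 450),
    "Small Private": (30, 80),
}
for name, (nmsf, seniors) in schools.items():
    print(f"{name}: {nmsf} NMSFs/yr, about {100 * nmsf / seniors:.0f}% of the class")
```

The second loop is the whole point: a small school's raw count can be a fraction of a magnet's while its per-class share is comparable.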

None of this is meant to take away from TJ & Blair. Both are great programs. All the local privates are great schools too. But you should be able to argue that they're great programs without bullshitting the data.
Thanks for your interest, PP. I think the data you suggested adding is actually from Sept 1999 for the class of 2000, and is already in the spreadsheet. It's cited at row 51, I believe. I'm not certain I understand the logic of your point about tracking the numbers of old senior classes, but if you can dig up those numbers, maybe I can figure out a way to include them. On your point about manipulating data, I understand how statistical tools can be used to skew data, but that's not what I'm interested in doing here.

Thanks again. Please keep the data coming.
OP here. Many thanks for all the kind words. I hope the data will prove useful, despite its limits. And most importantly, many thanks to those who posted links leading to more data to fill some of the gaps -- that's the most helpful of all. I'll try to answer a few questions that have been asked, before I have to go to work.

What was my original plan? Why did I start collecting the data? Simply because I was frustrated a few months ago by some people who were making unsubstantiated claims about the NMSF numbers at various schools. I can't even remember now the exact context. At the time, I just quickly dug up a few NMSF lists to test whether the DCUM claims I was seeing rang true or not. After I found some good data, I decided to plug it into a spreadsheet rather than just discard it. Over time, I added to the data and it grew into a more extensive set. I did the work to compile the data because I was interested, so why not share with everyone?

Why did I look back 10-11 years? Why not longer? Why not shorter? No particular reason. I just started gathering data for various years. As the data extended beyond 10 years ago, I lost interest and figured no one else would care. If others want more data in the spreadsheet, just post the data with a substantiating link. I'm happy to add it. On the other side of the question, I can see the argument that more recent years might be more relevant, but there's just not that much data in recent years right now. Once people can point me to full data sets for recent years, then I'm happy to add percentage calculations for other date ranges. In actuality though, I don't think most schools really change that much over a short period like 10 years -- that's not even long enough for one child to make it all the way from K through 12.

Why is data missing for some years? Did I purposefully select certain years? I did not purposefully select or deselect any years. I included all credible data I could find. If particular schools in the spreadsheet lack data for certain years, it's because I simply have not been able to find data for those years yet. Help me out by finding and posting data!

What conclusions do I draw from these numbers? I'm not trying to suggest anything about any school by posting this data. I just offer it up so that other parents who care about such things can look at actual data, and not have to rely on rumors and other unsubstantiated claims about how different schools compare.

Enjoy the data, and please keep posting links to more! Thanks.