Prepping for the CogAT test: is it cheating?

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be retaken less than six months later, but that has only to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills like math and reading, which are much more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up by 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Careful research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html


Ah, okay. And yet, kids who take SAT prep classes usually get a 100-200+ bump. Just coincidence, I guess.


You are lying again. Kids don’t usually get a 100-200+ bump. Anecdotally, maybe. Statistically, the benefit from prepping is less than the standard deviation of the test. Since you probably don’t have a full grasp of these concepts, I’ll boil it down for you: prepping doesn’t help.
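
To put "less than the standard deviation" in perspective, here is a quick back-of-the-envelope sketch in Python. The mean (1050) and SD (200) are rough illustrative figures, not official College Board statistics.

```python
from statistics import NormalDist

# Illustrative figures only (assumptions, not official College Board numbers):
# model the composite SAT as normal with mean 1050 and SD 200.
sat = NormalDist(mu=1050, sigma=200)

coaching_gain = 20  # upper end of the 10-20 point estimate cited above

for score in (950, 1050, 1250):
    before = sat.cdf(score)
    after = sat.cdf(score + coaching_gain)
    print(f"{score} -> {score + coaching_gain}: "
          f"{before:.1%} -> {after:.1%} ({(after - before) * 100:.1f} pp)")

# A 20-point gain is 0.1 SD here, so the percentile barely moves;
# compare that with the 100-300 point jumps claimed by prep services.
```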
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Let me start by saying I'm not US-educated, so I have no hands-on experience with this education system.

For an outsider like me, tests look like tests; the CogAT in elementary school looks just like the SAT for college or the GRE for grad school.

Are you supposed to not study for those too?

Where's the guide that says which tests you may prep for and which you may not? Not joking, I just don't get the US education system.


There's no such thing. This topic keeps popping up at least once a year. Prep all you want for any test you want. The concept of "not allowed" was invented by some clever lawyer White mom (now stay at home) to dissuade the Asians from prepping. White moms don't want their precious offspring to take time off from ballet, soccer, or other such trivial pursuits to prep and compete against the Asians and Nigerians, hence the misinformation campaign. It's like 'Donald Trump won the 2020 election'.


For any exam that tests knowledge you should have accrued through education (SAT, SOL, etc.), prep and studying are expected and encouraged, whereas a test that measures aptitude, like the CogAT, is not supposed to be prepped for.

Do a thought exercise: if there were a way to prepare in advance for an IQ test, learning what to expect plus tips and tricks to improve your score, would it really reflect your ability?

Personally, I don't care if you prep or not as long as your kid can handle an AAP program on their own, without external pressure from you as a parent.


My kids are long done with AAP and I was just browsing these forums with nothing to do. Both did well on IQ tests, but we prepped anyway just to be sure. One is at a T20 college via TJ. The other is at base HS.

My 2c: every kid in FCPS should have access to AAP-level classes if they want them. It's a tragedy that we have to fight for a decent education. AAP is not that much more difficult, and every parent should try to get their kids into the program. Academically, its curriculum is better than that of most private schools in this area, especially in science and math. By getting into AAP you are also in a class where most of the kids/parents are academics-focused, and you avoid the bully types. If that's the environment you want, get in by all means. Don't worry about whether or not your kid can handle AAP. They will adapt.

Prep only helps the borderline kids. Smart kids will score high regardless of prep. Not-so-smart kids won't, regardless of prep. It's the borderline kids who may fall on the right side of the wall with prep. Go for it! There's a large DCUM population that believes prep is cheating unless they approve of it (at which point they call it enrichment). This is the same crowd that doesn't hesitate to spend thousands on doctors to get their kids extra time on tests, score that extra 200 points on the SAT (while openly promoting test-optional), and tutor them up the wazoo. Ignore that noise. Good luck!


Unfortunately, most kids prep, even smart kids, which makes it harder for smart kids who don't prep to test as gifted compared to the smart kids who do. It's just a matter of having a level playing field, and these tests fail to guarantee that.



If your kids are well beyond ES, it may not be as good as you remember. Now 20% of all kids get in, and not all are subject to the same cutoffs, due to more focus on local school norms, equity, etc. But the hardest thing for most kids in high-SES areas may well be getting in, hence the need to prep as a little extra insurance. 10-20 points may matter. Or not. Because the process is now shrouded in "holistic" mystery.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be retaken less than six months later, but that has only to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills like math and reading, which are much more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up by 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Careful research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT. Self-reporting is known to introduce many biases: often students who do well don't want to acknowledge they prepared, and students who didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once. Some may have enrolled in a course and attended once. Other students may have prepared for hours with their materials. Thus, a yes/no for preparation with no indication of usage is a poor proxy. 3) Studies look at the improvement from PSAT to SAT. But many students begin to prepare before the PSAT, so the test scores would miss the initial improvement coming into the PSAT.

Individual students are best placed to know if these courses work. They know their test scores before preparing and those afterward. Large-scale studies miss the details needed to make that assessment. The popularity of test preparation indicates that it is more useful than the article suggests. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.
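
As a side note, self-report noise does cut into these studies in a specific, quantifiable way. A toy simulation (all numbers invented for illustration) shows how misreported prep status attenuates the measured coaching effect toward zero:

```python
import random

random.seed(0)

TRUE_EFFECT = 30        # hypothetical true average coaching gain, in points
MISREPORT_RATE = 0.20   # assume 20% of students misreport their prep status
N = 100_000

scores_by_report = {True: [], False: []}
for _ in range(N):
    prepped = random.random() < 0.5
    # baseline score plus the true effect for students who actually prepped
    score = random.gauss(1050, 200) + (TRUE_EFFECT if prepped else 0)
    # the self-report flips with probability MISREPORT_RATE
    reported = prepped if random.random() > MISREPORT_RATE else not prepped
    scores_by_report[reported].append(score)

def mean(xs):
    return sum(xs) / len(xs)

measured = mean(scores_by_report[True]) - mean(scores_by_report[False])
print(f"true effect: {TRUE_EFFECT}, measured from self-reports: {measured:.1f}")
# Misclassification biases the estimate toward zero, so a self-report study
# tends to understate, not overstate, the real gain.
```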
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Let me start by saying I'm not US-educated, so I have no hands-on experience with this education system.

For an outsider like me, tests look like tests; the CogAT in elementary school looks just like the SAT for college or the GRE for grad school.

Are you supposed to not study for those too?

Where's the guide that says which tests you may prep for and which you may not? Not joking, I just don't get the US education system.


There's no such thing. This topic keeps popping up at least once a year. Prep all you want for any test you want. The concept of "not allowed" was invented by some clever lawyer White mom (now stay at home) to dissuade the Asians from prepping. White moms don't want their precious offspring to take time off from ballet, soccer, or other such trivial pursuits to prep and compete against the Asians and Nigerians, hence the misinformation campaign. It's like 'Donald Trump won the 2020 election'.


For any exam that tests knowledge you should have accrued through education (SAT, SOL, etc.), prep and studying are expected and encouraged, whereas a test that measures aptitude, like the CogAT, is not supposed to be prepped for.

Do a thought exercise: if there were a way to prepare in advance for an IQ test, learning what to expect plus tips and tricks to improve your score, would it really reflect your ability?

Personally, I don't care if you prep or not as long as your kid can handle an AAP program on their own, without external pressure from you as a parent.


My kids are long done with AAP and I was just browsing these forums with nothing to do. Both did well on IQ tests, but we prepped anyway just to be sure. One is at a T20 college via TJ. The other is at base HS.

My 2c: every kid in FCPS should have access to AAP-level classes if they want them. It's a tragedy that we have to fight for a decent education. AAP is not that much more difficult, and every parent should try to get their kids into the program. Academically, its curriculum is better than that of most private schools in this area, especially in science and math. By getting into AAP you are also in a class where most of the kids/parents are academics-focused, and you avoid the bully types. If that's the environment you want, get in by all means. Don't worry about whether or not your kid can handle AAP. They will adapt.

Prep only helps the borderline kids. Smart kids will score high regardless of prep. Not-so-smart kids won't, regardless of prep. It's the borderline kids who may fall on the right side of the wall with prep. Go for it! There's a large DCUM population that believes prep is cheating unless they approve of it (at which point they call it enrichment). This is the same crowd that doesn't hesitate to spend thousands on doctors to get their kids extra time on tests, score that extra 200 points on the SAT (while openly promoting test-optional), and tutor them up the wazoo. Ignore that noise. Good luck!


Unfortunately, most kids prep, even smart kids, which makes it harder for smart kids who don't prep to test as gifted compared to the smart kids who do. It's just a matter of having a level playing field, and these tests fail to guarantee that.



If your kids are well beyond ES, it may not be as good as you remember. Now 20% of all kids get in, and not all are subject to the same cutoffs, due to more focus on local school norms, equity, etc. But the hardest thing for most kids in high-SES areas may well be getting in, hence the need to prep as a little extra insurance. 10-20 points may matter. Or not. Because the process is now shrouded in "holistic" mystery.


The kids at higher-SES schools and centers need higher scores because of prep. Parents were already prepping their kids, which is why those schools’ scores were so much higher than other schools’. The big difference between schools where 20% of the kids score in the 140s and schools where 20% score in the 130s is the amount of prep. The schools with a lot of 130s tend to have fewer parents prepping. Once you add in workbooks and/or classes, you get your 140s, because prep tends to lead to a 10-point bump.

The schools with a lot of 130s tend to be middle-class and upper-middle-class families where parents have been reading to kids and doing things with them that reinforce reading, math, and curiosity. These are the kids who tend to show up to K knowing their letters, numbers, and sounds and being able to write or read a little bit. They do well on the NNAT and CogAT because they have been exposed to more.

The kids at the lower-SES schools are at a disadvantage because their parents are less likely to read to them or expose them to basic math. They tend to show up to K not knowing their letters, numbers, shapes, colors, or sounds. That shows up in their test scores as well as their classroom performance.

Being upset because the county understands that which kids count as advanced at each school will differ based on their backgrounds is a bit silly. At schools with a lot of 140s, kids who are not in the pool will still have peers in the gen-ed classroom. The kids with 130s at a Title 1 school will not have peers in their classroom. Those kids need AAP at their school because they don’t have a peer base, even if they would not qualify for AAP at the higher-SES school. Recognizing that schools in the county end up with different types of programs based on the kids at those schools is common sense. Your 135 kid at a school with a ton of 140s is going to be fine in gen ed.
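
To make the local-norms point concrete, here is a toy calculation with invented score distributions; the means and SDs are made up for illustration, not FCPS data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented CogAT-like score distributions for two hypothetical schools.
schools = {
    "high-SES school": rng.normal(118, 12, 500),  # heavier prep/enrichment
    "Title 1 school": rng.normal(102, 12, 500),
}

for name, scores in schools.items():
    cutoff = np.percentile(scores, 80)  # "top 20% of this school" local norm
    print(f"{name}: local top-20% cutoff ~ {cutoff:.0f}")

# Under local norms a 115 can be in pool at one school while a 125 misses
# at the other; a single county-wide cutoff would pool them differently.
```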
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be retaken less than six months later, but that has only to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills like math and reading, which are much more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up by 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Careful research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT. Self-reporting is known to introduce many biases: often students who do well don't want to acknowledge they prepared, and students who didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once. Some may have enrolled in a course and attended once. Other students may have prepared for hours with their materials. Thus, a yes/no for preparation with no indication of usage is a poor proxy. 3) Studies look at the improvement from PSAT to SAT. But many students begin to prepare before the PSAT, so the test scores would miss the initial improvement coming into the PSAT.

Individual students are best placed to know if these courses work. They know their test scores before preparing and those afterward. Large-scale studies miss the details needed to make that assessment. The popularity of test preparation indicates that it is more useful than the article suggests. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read the references carefully, or you don’t know what controlling for variables means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, likely far more prone to bias than anonymous self-reporting. The authors control for intensity of use and motivation using proxies such as intent to apply, requests for information from colleges, GPA, SES, etc. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT; they evaluated the effect of coaching on students who took or didn’t take the PSAT, i.e., they controlled for another variable.

But sure, let’s not believe the study done by a Stanford professor, an expert in educational testing, and let’s go by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy obsessing over who preps and whether it is cheating.
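
For anyone who wants to see what the paper's propensity score matching amounts to, here is a minimal sketch; the column names (coached, sat, gpa, ses, took_psat, college_intent) are hypothetical stand-ins, not the authors' actual variables or code.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def psm_effect(df: pd.DataFrame) -> float:
    """Estimate the coaching effect by nearest-neighbor propensity matching.

    Expects hypothetical columns: coached (0/1), sat (outcome score),
    and the covariates below. Illustrative only, not the authors' code.
    """
    covariates = ["gpa", "ses", "took_psat", "college_intent"]

    # 1) Propensity score: modeled probability of being coached.
    ps_model = LogisticRegression(max_iter=1000)
    ps_model.fit(df[covariates], df["coached"])
    df = df.assign(ps=ps_model.predict_proba(df[covariates])[:, 1])

    treated = df[df["coached"] == 1]
    control = df[df["coached"] == 0].reset_index(drop=True)

    # 2) Match each coached student to the uncoached student with the
    #    closest propensity score.
    gaps = [
        row["sat"] - control.loc[(control["ps"] - row["ps"]).abs().idxmin(), "sat"]
        for _, row in treated.iterrows()
    ]

    # 3) The mean treated-minus-matched-control gap estimates the effect.
    return float(np.mean(gaps))
```

Matching compares coached students only with observably similar uncoached students; the thread's real disagreement is whether the observables capture prep intensity.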
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be retaken less than six months later, but that has only to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills like math and reading, which are much more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up by 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Careful research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT. Self-reporting is known to introduce many biases: often students who do well don't want to acknowledge they prepared, and students who didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once. Some may have enrolled in a course and attended once. Other students may have prepared for hours with their materials. Thus, a yes/no for preparation with no indication of usage is a poor proxy. 3) Studies look at the improvement from PSAT to SAT. But many students begin to prepare before the PSAT, so the test scores would miss the initial improvement coming into the PSAT.

Individual students are best placed to know if these courses work. They know their test scores before preparing and those afterward. Large-scale studies miss the details needed to make that assessment. The popularity of test preparation indicates that it is more useful than the article suggests. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read the references carefully, or you don’t know what controlling for variables means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, likely far more prone to bias than anonymous self-reporting. The authors control for intensity of use and motivation using proxies such as intent to apply, requests for information from colleges, GPA, SES, etc. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT; they evaluated the effect of coaching on students who took or didn’t take the PSAT, i.e., they controlled for another variable.

But sure, let’s not believe the study done by a Stanford professor, an expert in educational testing, and let’s go by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy obsessing over who preps and whether it is cheating.

These studies are based on student self-reporting of test preparation, which introduces well-known biases. Generally, authors try to avoid self-reported data where possible. Here it is not possible, so they use the best data set they could access, the ELS. Starting from that point, the authors try to control for factors but can't capture nuances. They attempt to control for intensity of usage by using college intent, GPA, and SES. But students can look similar on paper on these criteria and still prepare very differently for standardized tests: some buy a book and never use it because they're too busy, while others prepare extensively. One student's intensity of preparation can also vary over time depending on their other activities, whether pre-PSAT, pre-SAT, from the first SAT sitting to the second, or otherwise. That is why it is unwise to take a generalized result from these large-scale studies and argue that it applies to all students. If a student is willing to put in long hours preparing (whether with a paid service or a free one like Khan), that is a critical factor.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be retaken less than six months later, but that has only to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills like math and reading, which are much more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up by 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Careful research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT. Self-reporting is known to introduce many biases: often students who do well don't want to acknowledge they prepared, and students who didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once. Some may have enrolled in a course and attended once. Other students may have prepared for hours with their materials. Thus, a yes/no for preparation with no indication of usage is a poor proxy. 3) Studies look at the improvement from PSAT to SAT. But many students begin to prepare before the PSAT, so the test scores would miss the initial improvement coming into the PSAT.

Individual students are best placed to know if these courses work. They know their test scores before preparing and those afterward. Large-scale studies miss the details needed to make that assessment. The popularity of test preparation indicates that it is more useful than the article suggests. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read the references carefully, or you don’t know what controlling for variables means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, likely far more prone to bias than anonymous self-reporting. The authors control for intensity of use and motivation using proxies such as intent to apply, requests for information from colleges, GPA, SES, etc. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT; they evaluated the effect of coaching on students who took or didn’t take the PSAT, i.e., they controlled for another variable.

But sure, let’s not believe the study done by a Stanford professor, an expert in educational testing, and let’s go by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy obsessing over who preps and whether it is cheating.

These studies are based on student self-reporting of test preparation, which introduces well-known biases. Generally, authors try to avoid self-reported data where possible. Here it is not possible, so they use the best data set they could access, the ELS. Starting from that point, the authors try to control for factors but can't capture nuances. They attempt to control for intensity of usage by using college intent, GPA, and SES. But students can look similar on paper on these criteria and still prepare very differently for standardized tests: some buy a book and never use it because they're too busy, while others prepare extensively. One student's intensity of preparation can also vary over time depending on their other activities, whether pre-PSAT, pre-SAT, from the first SAT sitting to the second, or otherwise. That is why it is unwise to take a generalized result from these large-scale studies and argue that it applies to all students. If a student is willing to put in long hours preparing (whether with a paid service or a free one like Khan), that is a critical factor.


Without even the most basic understanding of statistics and how testing works, you should refrain from participating in this conversation. If you don’t approve of self-reporting, how exactly do you propose collecting that data? Stealing personal information? Or should we skip parsing the data and doing the study altogether, since your auntie already told you prepping works well and you can easily improve your score by 300 points?

Basically you’re saying to disregard the statistical trends of thousands of students and look at the few outliers who spend long hours studying. Never mind that you’re then actually looking at the dose response, i.e., the more you study, the higher the score, because you’re actually learning the material that is tested.

You really can’t be any dumber than this!
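
The dose-response point can be made concrete with a toy simulation (all numbers fabricated): if gains scale with hours studied, a binary prepped/didn't-prep comparison mostly averages over light users.

```python
import numpy as np

rng = np.random.default_rng(42)

n = 5_000
hours = rng.exponential(10, n)             # most students prep only a little
gain = 1.5 * hours + rng.normal(0, 40, n)  # hypothetical: 1.5 points per hour

# Dose response: least-squares slope of score gain on hours studied.
slope = np.polyfit(hours, gain, 1)[0]
print(f"estimated points per hour: {slope:.2f}")

# The binary contrast most studies can actually measure:
prepped = hours > 1
print(f"mean gain, 'prepped' vs not: "
      f"{gain[prepped].mean():.1f} vs {gain[~prepped].mean():.1f}")
# With a long tail of light users, the yes/no contrast looks small even
# though heavy preppers (50+ hours) gain a lot, so both camps can see what
# they expect in the same data.
```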
Anonymous
Very soon studying will be considered cheating!
Anonymous
Anonymous wrote:Very soon studying will be considered cheating!


It’s not equitable to those who didn’t study.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be retaken less than six months later, but that has only to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills like math and reading, which are much more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up by 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Careful research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT. Self-reporting is known to introduce many biases: often students who do well don't want to acknowledge they prepared, and students who didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once. Some may have enrolled in a course and attended once. Other students may have prepared for hours with their materials. Thus, a yes/no for preparation with no indication of usage is a poor proxy. 3) Studies look at the improvement from PSAT to SAT. But many students begin to prepare before the PSAT, so the test scores would miss the initial improvement coming into the PSAT.

Individual students are best placed to know if these courses work. They know their test scores before preparing and those afterward. Large-scale studies miss the details needed to make that assessment. The popularity of test preparation indicates that it is more useful than the article suggests. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read the references carefully, or you don’t know what controlling for variables means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, likely far more prone to bias than anonymous self-reporting. The authors control for intensity of use and motivation using proxies such as intent to apply, requests for information from colleges, GPA, SES, etc. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT; they evaluated the effect of coaching on students who took or didn’t take the PSAT, i.e., they controlled for another variable.

But sure, let’s not believe the study done by a Stanford professor, an expert in educational testing, and let’s go by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy obsessing over who preps and whether it is cheating.

These studies are based on student self-reporting of test preparation, which introduces well-known biases. Generally, authors try to avoid self-reported data where possible. Here it is not possible, so they use the best data set they could access, the ELS. Starting from that point, the authors try to control for factors but can't capture nuances. They attempt to control for intensity of usage by using college intent, GPA, and SES. But students can look similar on paper on these criteria and still prepare very differently for standardized tests: some buy a book and never use it because they're too busy, while others prepare extensively. One student's intensity of preparation can also vary over time depending on their other activities, whether pre-PSAT, pre-SAT, from the first SAT sitting to the second, or otherwise. That is why it is unwise to take a generalized result from these large-scale studies and argue that it applies to all students. If a student is willing to put in long hours preparing (whether with a paid service or a free one like Khan), that is a critical factor.


Without even the most basic understanding of statistics and how testing works, you should refrain from participating in this conversation. If you don’t approve of self-reporting, how exactly do you propose collecting that data? Stealing personal information? Or should we skip parsing the data and doing the study altogether, since your auntie already told you prepping works well and you can easily improve your score by 300 points?

Basically you’re saying to disregard the statistical trends of thousands of students and look at the few outliers who spend long hours studying. Never mind that you’re then actually looking at the dose response, i.e., the more you study, the higher the score, because you’re actually learning the material that is tested.

You really can’t be any dumber than this!


The name-calling and thread-policing are funny, but that line is the funniest.

lol
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be retaken less than six months later, but that has only to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills like math and reading, which are much more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up by 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Careful research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT. Self-reporting is known to introduce many biases: often students who do well don't want to acknowledge they prepared, and students who didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once. Some may have enrolled in a course and attended once. Other students may have prepared for hours with their materials. Thus, a yes/no for preparation with no indication of usage is a poor proxy. 3) Studies look at the improvement from PSAT to SAT. But many students begin to prepare before the PSAT, so the test scores would miss the initial improvement coming into the PSAT.

Individual students are best placed to know if these courses work. They know their test scores before preparing and those afterward. Large-scale studies miss the details needed to make that assessment. The popularity of test preparation indicates that it is more useful than the article suggests. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read the references carefully, or you don’t know what controlling for variables means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, likely far more prone to bias than anonymous self-reporting. The authors control for intensity of use and motivation using proxies such as intent to apply, requests for information from colleges, GPA, SES, etc. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT; they evaluated the effect of coaching on students who took or didn’t take the PSAT, i.e., they controlled for another variable.

But sure, let’s not believe the study done by a Stanford professor, an expert in educational testing, and let’s go by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy obsessing over who preps and whether it is cheating.

These studies are based on student self-reporting of test preparation, which introduces well-known biases. Generally, authors try to avoid self-reported data where possible. Here it is not possible, so they use the best data set they could access, the ELS. Starting from that point, the authors try to control for factors but can't capture nuances. They attempt to control for intensity of usage by using college intent, GPA, and SES. But students can look similar on paper on these criteria and still prepare very differently for standardized tests: some buy a book and never use it because they're too busy, while others prepare extensively. One student's intensity of preparation can also vary over time depending on their other activities, whether pre-PSAT, pre-SAT, from the first SAT sitting to the second, or otherwise. That is why it is unwise to take a generalized result from these large-scale studies and argue that it applies to all students. If a student is willing to put in long hours preparing (whether with a paid service or a free one like Khan), that is a critical factor.


Without even the most basic understanding of statistics and how testing works, you should refrain from participating in this conversation. If you don’t approve of self-reporting, how exactly do you propose collecting that data? Stealing personal information? Or should we skip parsing the data and doing the study altogether, since your auntie already told you prepping works well and you can easily improve your score by 300 points?

Basically you’re saying to disregard the statistical trends of thousands of students and look at the few outliers who spend long hours studying. Never mind that you’re then actually looking at the dose response, i.e., the more you study, the higher the score, because you’re actually learning the material that is tested.

You really can’t be any dumber than this!


The name-calling and thread-policing are funny, but that line is the funniest.

lol


Thank you, thank you very much! Insulting dumb idiots on DCUM is my one guilty pleasure.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be retaken less than six months later, but that has only to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills like math and reading, which are much more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up by 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Careful research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT. Self-reporting is known to introduce many biases: often students who do well don't want to acknowledge they prepared, and students who didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once. Some may have enrolled in a course and attended once. Other students may have prepared for hours with their materials. Thus, a yes/no for preparation with no indication of usage is a poor proxy. 3) Studies look at the improvement from PSAT to SAT. But many students begin to prepare before the PSAT, so the test scores would miss the initial improvement coming into the PSAT.

Individual students are best placed to know if these courses work. They know their test scores before preparing and those afterward. Large-scale studies miss the details needed to make that assessment. The popularity of test preparation indicates that it is more useful than the article suggests. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read the references carefully, or you don’t know what controlling for variables means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, likely far more prone to bias than anonymous self-reporting. The authors control for intensity of use and motivation using proxies such as intent to apply, requests for information from colleges, GPA, SES, etc. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT; they evaluated the effect of coaching on students who took or didn’t take the PSAT, i.e., they controlled for another variable.

But sure, let’s not believe the study done by a Stanford professor, an expert in educational testing, and let’s go by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy obsessing over who preps and whether it is cheating.

These studies are based on student self-reporting of test preparation, which introduces well-known biases. Generally, authors try to avoid self-reported data where possible. Here it is not possible, so they use the best data set they could access, the ELS. Starting from that point, the authors try to control for factors but can't capture nuances. They attempt to control for intensity of usage by using college intent, GPA, and SES. But students can look similar on paper on these criteria and still prepare very differently for standardized tests: some buy a book and never use it because they're too busy, while others prepare extensively. One student's intensity of preparation can also vary over time depending on their other activities, whether pre-PSAT, pre-SAT, from the first SAT sitting to the second, or otherwise. That is why it is unwise to take a generalized result from these large-scale studies and argue that it applies to all students. If a student is willing to put in long hours preparing (whether with a paid service or a free one like Khan), that is a critical factor.


Without even the most basic understanding of statistics and how testing works, you should refrain from participating in this conversation. If you don’t approve of self-reporting, how exactly do you propose collecting that data? Stealing personal information? Or should we skip parsing the data and doing the study altogether, since your auntie already told you prepping works well and you can easily improve your score by 300 points?

Basically you’re saying to disregard the statistical trends of thousands of students and look at the few outliers who spend long hours studying. Never mind that you’re then actually looking at the dose response, i.e., the more you study, the higher the score, because you’re actually learning the material that is tested.

You really can’t be any dumber than this!


The name-calling and thread-policing are funny, but that line is the funniest.

lol


Thank you, thank you very much! Insulting dumb idiots on DCUM is my one guilty pleasure.


No, dear, I was laughing at you. Not with you.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be retaken less than six months later, but that has only to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills like math and reading, which are much more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up by 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Careful research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT. Self-reporting is known to introduce many biases: often students who do well don't want to acknowledge they prepared, and students who didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once. Some may have enrolled in a course and attended once. Other students may have prepared for hours with their materials. Thus, a yes/no for preparation with no indication of usage is a poor proxy. 3) Studies look at the improvement from PSAT to SAT. But many students begin to prepare before the PSAT, so the test scores would miss the initial improvement coming into the PSAT.

Individual students are best placed to know if these courses work. They know their test scores before preparing and those afterward. Large-scale studies miss the details needed to make that assessment. The popularity of test preparation indicates that it is more useful than the article suggests. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read the references carefully, or you don’t know what controlling for variables means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, likely far more prone to bias than anonymous self-reporting. The authors control for intensity of use and motivation using proxies such as intent to apply, requests for information from colleges, GPA, SES, etc. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT; they evaluated the effect of coaching on students who took or didn’t take the PSAT, i.e., they controlled for another variable.

But sure, let’s not believe the study done by a Stanford professor, an expert in educational testing, and let’s go by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy obsessing over who preps and whether it is cheating.

These studies are based on student self-reporting of test preparation, which introduces well-known biases. Generally, authors try to avoid self-reported data where possible. Here it is not possible, so they use the best data set they could access, the ELS. Starting from that point, the authors try to control for factors but can't capture nuances. They attempt to control for intensity of usage by using college intent, GPA, and SES. But students can look similar on paper on these criteria and still prepare very differently for standardized tests: some buy a book and never use it because they're too busy, while others prepare extensively. One student's intensity of preparation can also vary over time depending on their other activities, whether pre-PSAT, pre-SAT, from the first SAT sitting to the second, or otherwise. That is why it is unwise to take a generalized result from these large-scale studies and argue that it applies to all students. If a student is willing to put in long hours preparing (whether with a paid service or a free one like Khan), that is a critical factor.


Without even the most basic understanding of statistics and how testing works, you should refrain from participating in this conversation. If you don’t approve of self-reporting, how exactly do you propose collecting that data? Stealing personal information? Or should we skip parsing the data and doing the study altogether, since your auntie already told you prepping works well and you can easily improve your score by 300 points?

Basically you’re saying to disregard the statistical trends of thousands of students and look at the few outliers who spend long hours studying. Never mind that you’re then actually looking at the dose response, i.e., the more you study, the higher the score, because you’re actually learning the material that is tested.

You really can’t be any dumber than this!

Study results are only as good as the data used. These studies use self-reported, yes/no answers as to whether students prepared for the SAT with books, courses, etc. There is no objective confirmation that these responses are accurate, and no objective or subjective data on time spent preparing. View the study results accordingly.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be retaken less than six months later, but that has only to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills like math and reading, which are much more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up by 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Careful research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT, and self-reporting is known to introduce many biases: students who do well often don't want to acknowledge that they prepared, while students who didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once; some may have enrolled in a course and attended once; others may have prepared for hours with their materials. A yes/no answer on preparation, with no indication of usage, is thus a poor proxy. 3) The studies look at the improvement from PSAT to SAT, but many students begin preparing before the PSAT, so the scores miss the initial improvement coming into the PSAT.

Individual students are best placed to know whether these courses work: they know their scores before preparing and their scores afterward. Large-scale studies miss the details needed to make that assessment. The sheer popularity of test preparation suggests it is more useful than the article implies. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read the references carefully, or you don’t know what controlling for variables means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, and likely far more prone to bias than anonymous self-reporting. The authors control for intensity of use and motivation using proxies such as intent to apply to college, requests for information from colleges, GPA, SES, and so on. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT; they evaluated the effect of coaching separately for students who did and didn’t take the PSAT, i.e., they controlled for yet another variable.

But sure, let’s not believe a study done by a Stanford professor who is an expert in educational testing, and let’s go instead by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy as you like obsessing over who preps and whether it is cheating.

These studies are based on students' self-reporting of test preparation, which introduces well-known biases. Generally, authors try to avoid self-reported data where possible. Here that is not possible, so they use the best data set they could access, the ELS (Education Longitudinal Study). From that starting point, the authors try to control for confounding factors, but they can't capture every nuance. They attempt to control for intensity of usage by using college intent, GPA, and SES, but students can look similar on paper by these criteria and still prepare very differently for standardized tests: some buy a book and never use it because they're too busy, while others prepare extensively. One student's intensity of preparation can also vary over time depending on their other activities, whether pre-PSAT, pre-SAT, or from a first SAT sitting to a second. That is why it is unwise to take a generalized result from these large-scale studies and argue that it applies to all students. If a student is willing to put in long hours preparing (whether with a paid service or a free one like Khan Academy), that is a critical factor.


Without even the most basic understanding of statistics and how testing works, you should refrain from participating in this conversation. If you don't approve of self-reporting, how exactly do you propose collecting the data? Stealing personal information? Or should we just skip the data and the study altogether, since your auntie already told you that prepping works well and you can easily improve your score by 300 points?

Basically you're saying to disregard the statistical trends from thousands of students and look instead at the few outliers who spend long hours studying. Never mind that at that point you're just looking at a dose response: the more you study, the higher the score, because you're actually learning the material being tested.

You really can’t be any dumber than this!


The name-calling and thread-policing are funny, but that line is the funniest.

lol


Thank you, thank you very much! Insulting dumb idiots on DCUM is my one guilty pleasure.


No, dear, I was laughing at you. Not with you.


Your comebacks are not as witty as you imagine them.
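
For anyone curious what the "propensity score matching" in the paper linked above actually involves, here is a minimal sketch on simulated data. The covariates, coefficients, and sample size are invented for illustration; this is not the Domingue & Briggs code and not the ELS data:

```python
# Minimal propensity-score-matching sketch (invented numbers; not the
# Domingue & Briggs analysis or the ELS data). Idea: model each student's
# probability of being coached from background covariates, match coached
# students to similar uncoached ones, and compare scores within pairs.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5_000

gpa = rng.normal(3.0, 0.5, n)  # covariates that drive both coaching
ses = rng.normal(0.0, 1.0, n)  # uptake and SAT scores
coached = rng.random(n) < 1 / (1 + np.exp(-(gpa - 3.0 + 0.8 * ses)))

true_effect = 20.0  # assumed coaching effect, in points
score = 500 + 80 * gpa + 40 * ses + true_effect * coached + rng.normal(0, 60, n)

# 1) Propensity model: P(coached | covariates).
X = np.column_stack([gpa, ses])
ps = LogisticRegression().fit(X, coached).predict_proba(X)[:, 1]

# 2) Match each coached student to the uncoached student with the
#    nearest propensity score (matching with replacement).
treated, control = np.where(coached)[0], np.where(~coached)[0]
matches = control[np.abs(ps[treated][:, None] - ps[control][None, :]).argmin(axis=1)]

# 3) The average within-pair difference estimates the coaching effect.
print(f"naive difference: {score[coached].mean() - score[~coached].mean():.1f} pts")
print(f"matched estimate: {(score[treated] - score[matches]).mean():.1f} pts")
```

The naive group difference comes out inflated because coached students start out stronger on the covariates; the matched estimate lands near the assumed 20 points. That is the sense in which the paper "controls for variables" rather than just comparing raw averages.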
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

Scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for exactly this purpose.

The test shouldn’t be taken less than six months apart, but that has to do only with the question bank: there’s a limited number of questions, and questions may repeat.

There are countless studies showing that prepping for the SAT and other tests doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and that it’s better to develop actual skills, like math and reading, that are far more useful in a child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told that with some work you can easily bring it up 300.


You need to understand the difference between fact and advertising; otherwise, I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Carefully done research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT, and self-reporting is known to introduce many biases: students who do well often don't want to acknowledge that they prepared, while students who didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once; some may have enrolled in a course and attended once; others may have prepared for hours with their materials. A yes/no answer on preparation, with no indication of usage, is thus a poor proxy. 3) The studies look at the improvement from PSAT to SAT, but many students begin preparing before the PSAT, so the scores miss the initial improvement coming into the PSAT.

Individual students are best placed to know whether these courses work: they know their scores before preparing and their scores afterward. Large-scale studies miss the details needed to make that assessment. The sheer popularity of test preparation suggests it is more useful than the article implies. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read the references carefully, or you don’t know what controlling for variables means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, and likely far more prone to bias than anonymous self-reporting. The authors control for intensity of use and motivation using proxies such as intent to apply to college, requests for information from colleges, GPA, SES, and so on. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT; they evaluated the effect of coaching separately for students who did and didn’t take the PSAT, i.e., they controlled for yet another variable.

But sure, let’s not believe a study done by a Stanford professor who is an expert in educational testing, and let’s go instead by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy as you like obsessing over who preps and whether it is cheating.

These studies are based on students' self-reporting of test preparation, which introduces well-known biases. Generally, authors try to avoid self-reported data where possible. Here that is not possible, so they use the best data set they could access, the ELS (Education Longitudinal Study). From that starting point, the authors try to control for confounding factors, but they can't capture every nuance. They attempt to control for intensity of usage by using college intent, GPA, and SES, but students can look similar on paper by these criteria and still prepare very differently for standardized tests: some buy a book and never use it because they're too busy, while others prepare extensively. One student's intensity of preparation can also vary over time depending on their other activities, whether pre-PSAT, pre-SAT, or from a first SAT sitting to a second. That is why it is unwise to take a generalized result from these large-scale studies and argue that it applies to all students. If a student is willing to put in long hours preparing (whether with a paid service or a free one like Khan Academy), that is a critical factor.


Without even the most basic understanding of statistics and how testing works, you should refrain from participating in this conversation. If you don't approve of self-reporting, how exactly do you propose collecting the data? Stealing personal information? Or should we just skip the data and the study altogether, since your auntie already told you that prepping works well and you can easily improve your score by 300 points?

Basically you're saying to disregard the statistical trends from thousands of students and look instead at the few outliers who spend long hours studying. Never mind that at that point you're just looking at a dose response: the more you study, the higher the score, because you're actually learning the material being tested.

You really can’t be any dumber than this!

Study results are only as good as the data used. These studies rely on self-reported yes/no answers as to whether students prepared for the SAT with books, courses, etc. There is no objective confirmation that those responses are accurate, and no data, objective or subjective, on time spent preparing. View the study results accordingly.


It’s truly amazing how complete ignoramuses like yourself decide they know better than experts in the field who account for more than 30 different variables tracking math credits, hours of homework per week, AP coursework, parent involvement in SAT prep and college applications, and so on. The paper even surveys five other studies, on different populations, with different analysis methods, and the score improvement due to prepping is less than 20 points on math and even less on verbal. That’s less than 5% of the test range and below the margin of error of the test.

But, yeah, the DCUM lady, with her vast expertise, decides this self-reported data is not reliable and instead relies on self-reported data from friends and family that supposedly shows score improvements of 100-200 or even 300 points. The contradiction is completely lost on her, but no surprise there!

To wrap it up: kids who scored higher than your child didn’t cheat; they simply worked harder and have more aptitude and talent. Just accept this, and don’t try to smear other kids’ accomplishments with silly accusations of cheating.
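
For what it's worth, the "5% of the test range" point is simple arithmetic, assuming the current 200-800 scale per SAT section:

```python
# Back-of-the-envelope for the "less than 5% of the test range" claim:
# each SAT section is scored on a 200-800 scale, a 600-point spread.
section_low, section_high = 200, 800
coaching_bump = 20  # points, per the studies cited upthread

share = coaching_bump / (section_high - section_low)
print(f"{coaching_bump} pts is {share:.1%} of the {section_high - section_low}-pt section scale")
# -> 20 pts is 3.3% of the 600-pt section scale
```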