prepping for cogat test .. is it cheating?

Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down that it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

The scores rise marginally on retakes, but only because of familiarity with the test. For most tests taken at school there’s a mock-up session for this purpose.

The test shouldn’t be retaken less than 6 months apart, but that only has to do with the bank of questions: there’s a limited number, and questions may repeat.

There are countless studies on the SAT and other tests showing that prepping doesn’t help much. But to each his own: prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and I think it’s better to develop actual skills like math and reading that are much more useful in the child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told with some work you can easily bring up 300.


You need to understand the difference between fact and advertising, otherwise I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Carefully done research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP. The studies cited in the article are based on limited information, and their findings should not be assumed to hold for all students. Study limitations: 1) Students self-report whether they prepared for the SAT. Self-reporting is known to introduce many biases. Often students that do well don't want to acknowledge they prepared, and students that didn't do well may want to show that they tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once. Some may have enrolled in a course and used it once. Other students may have prepared for hours with their materials. Thus, a yes/no for preparation with no indication of usage is a poor proxy. 3) Studies look at the improvement from PSAT to SAT. But many students begin to prepare before the PSAT, so the test scores would miss the initial improvement coming into the PSAT.

Individual students are best placed to know if these courses work. They know their test scores before preparing and those afterward. Large-scale studies miss the finer details needed to make that assessment. The popularity of test preparation indicates that it is more useful than the article suggests. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read the references carefully, or you don’t know what controlling for variables means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, likely way more prone to bias than anonymous self-reporting. The authors control for intensity of use or motivation using proxies such as intent to apply to college and requesting information from colleges, GPA, SES, etc. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT, but evaluated the effect of coaching on students that took or didn’t take the PSAT, i.e., they controlled for another variable.

But sure, let’s not believe the study done by a Stanford professor who is an expert in educational testing, and let’s go by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy obsessing over who preps and whether it is cheating.

These studies are based on student self-reporting of test preparation, which introduces well-known biases. Generally, authors try to avoid self-reported data where possible. Here it is not possible, so they use the best data set they could access, the ELS (Education Longitudinal Study). Starting from that point, the authors try to control for factors but can't capture nuances. They attempt to control for intensity of usage by using college intent, GPAs, and SES. But students can look similar on paper for these criteria and still prepare very differently for standardized tests - some buy a book and never use it because they're too busy, others prepare extensively. One student's intensity of preparation can also vary over time depending on their other activities, whether pre-PSAT, pre-SAT, from the first SAT take to the second, or otherwise. That is why it is unwise to take a generalized result from these large-scale studies and argue that it applies to all students. If a student is willing to put in long hours preparing (whether with a paid service or a free one like Khan), that is a critical factor.


Without even the most basic understanding of statistics and how testing works, you should refrain from participating in this conversation. If you don’t approve of self-reporting, how exactly do you propose collecting that data - stealing personal information? Or should we just not parse through the data and do the study at all, since your aunty already told you prepping works well and you can easily improve your score by 300 points?

Basically you’re saying to disregard the statistical trends of thousands of students and look at the few outliers that spend long hours studying - never mind that you’re actually looking at the dose response, i.e., the more you study, the higher the score, because you’re actually learning the material that is tested.

You really can’t be any dumber than this!

Study results are only as good as the data used. These studies are using self-reported, yes/no answers as to whether students prepared for the SAT with books, courses, etc. There is no objective confirmation as to whether these responses are accurate and no objective or subjective data provided on time spent on preparation. View the study results accordingly.


It’s truly amazing how complete ignoramuses like yourself decide that they know better than experts in the field, who account for more than 30 different variables tracking math credits, hours of homework per week, AP coursework, parent involvement in SAT prep and college applications, etc. The paper even surveys five other studies, on different populations, with different analysis methods, and the score improvement due to prepping is less than 20 points on math, even less on verbal. That’s smaller than 5% of the test range and below the margin of error of the test.

But, yeah, DCUM lady with her vast expertise decides this self-reported data is not reliable, and instead relies on self-reported data from friends and family that for sure shows score improvements of 100-200, even 300 points. The contradiction is completely lost on her, but no surprise here!

To wrap it up, kids scoring higher than your child didn’t cheat; they simply work harder and have more aptitude and talent. Just accept this and don’t try to smear other kids’ accomplishments with silly accusations of cheating.


Um, are you mixing up the SAT and Cogat? You're sounding a little confused now.
Anonymous
Don't hate the players, hate the game.
Anonymous
Anonymous wrote:
Um, are you mixing up the SAT and Cogat? You're sounding a little confused now.


Cogat is supposed to be even less susceptible to prepping than the SAT, since it measures reasoning abilities instead of achievement. In reality, nobody cares about the Cogat enough to study score increases from prepping.
Anonymous
Anonymous wrote:Don't hate the players, hate the game.


But that would mean her child is not “gifted”. What a tragedy! Instead her child is gifted, and the others are cheating.
Anonymous
Anonymous wrote:
Cogat is supposed to be even less susceptible to prepping than the SAT, since it measures reasoning abilities instead of achievement. In reality, nobody cares about the Cogat enough to study score increases from prepping.


There aren't studies with "statistics," but the Cogat is well known to be highly susceptible to prepping. The test designer has put forward that scores are totally unreliable after a small amount of prepping.

This really isn't a new question.
Anonymous
Anonymous wrote:
There aren't studies with "statistics," but the Cogat is well known to be highly susceptible to prepping. The test designer has put forward that scores are totally unreliable after a small amount of prepping.

This really isn't a new question.


Can you please provide proof?
Anonymous
Anonymous wrote:
Can you please provide proof?


+1

I’d also like to see proof that scores are totally unreliable after a small amount of prepping.
Anonymous
Anonymous wrote:
There aren't studies with "statistics," but the Cogat is well known to be highly susceptible to prepping. The test designer has put forward that scores are totally unreliable after a small amount of prepping.

This really isn't a new question.


Well.. fire that moron and hire someone else more competent!
Anonymous
Anonymous wrote:
Can you please provide proof?


There’s an entire industry of classes, books, practice tests, tutors etc. designed to raise scores by… prepping. Why the heck would “everyone” prep if it did nothing to the score? Let’s use some common sense here.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:It’s funny how many people swear up and down hat it’s cheating, but nobody can provide a source.


+1. It’s just a way to rationalize why their kid didn’t do better.

The scores raise marginally on retakes but just because of familiarity with the test. For most tests taken at school there’s a mock up session for this purpose.

The test shouldn’t be taken less than 6 months apart but that only has to do with the bank of questions, there’s a limited number and questions may repeat.

There are countless studies on SAT and other tests that prepping doesn’t help much. But to each his own, prep if you think your child needs it, don’t if you feel it’s not appropriate. My view is that it’s a waste of time past 2-3 familiarization sessions, and I think it’s better to develop actual skills like math and reading that are much more useful in the child’s academic career.


I know. The service we hired would only guarantee a 200-point improvement, but I'm told with some work you can easily bring up 300.


You need to understand the difference between fact and advertising, otherwise I have some enlargement pills to sell you. They are guaranteed to work!


I know my kid's SAT score went up 300 after Princeton Review. The prep really made a difference.


You are lying.

Carefully done research has shown that the increase in SAT scores through coaching is 10-20 points. Here is a review with scientific papers as references.

https://slate.com/technology/2019/04/sat-prep-courses-do-they-work-bias.html

DP The studies cited in the article are based on limited information and their findings should not be assumed for all students. Study limitations: 1) Students self-report whether they prepared for the SAT. Self-reporting is known to introduce many biases. Often students that do well don't want to acknowledge they prepared and students that didn't do well may want to show they that tried. 2) There is no way to know the intensity of use. Some students may have purchased a book but opened it once. Some may have enrolled in a course and used it once. Other students may have prepared for hours with their materials. Thus, a yes/no for preparation with no indication of usage is a poor proxy. 3) Studies look at the improvement from PSAT to SAT. But many students begin to prepare before the PSAT so the test scores would miss the initial improvement coming into the PSAT.

Individual students are best placed to know if these courses work. They know their test scores before preparing and those afterward. Large scale studies miss all the sub details needed to make that assessment. The popularity of test preparation indicates that it is more useful than what the article suggests. Families listen to friends' experiences; if they're shelling out money, it's because they have heard friends report that it works.


I don’t think you read carefully the references, or you don’t know what control for variables methodology means.

This reference is quite compelling:

https://www.researchgate.net/profile/Benjamin-Domingue/publication/228337033_Using_Linear_Regression_and_Propensity_Score_Matching_to_Estimate_the_Effect_of_Coaching_on_the_SAT/links/5486ec840cf268d28f06a133/Using-Linear-Regression-and-Propensity-Score-Matching-to-Estimate-the-Effect-of-Coaching-on-the-SAT.pdf?origin=publication_detail


For your own information, asking friends and acquaintances is also a form of self-reporting, likely far more prone to bias than anonymous self-reporting. The authors control for intensity of use or motivation using proxies such as intent to apply, requesting information from colleges, GPA, SES, etc. Contrary to what you claim, they didn’t look at improvement from PSAT to SAT, but evaluated the effect of coaching on students who did or didn’t take the PSAT, i.e., they controlled for another variable.

But sure, let’s not believe the study done by a Stanford professor, an expert in educational testing, and let’s go by what auntie told you about the prep class her daughter took. Feel free to waste as much time and energy obsessing over who preps and whether it is cheating.
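For readers who haven’t seen the "control for variables" methodology in practice, here is a small, self-contained propensity-score-matching sketch in Python on fabricated data. It is not the paper’s code; the covariates, effect sizes, and selection model are invented purely to show why a naive coached-vs-uncoached comparison overstates the effect when coached students differ to begin with.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Toy propensity-score-matching sketch on fabricated data (not the paper's code).
rng = np.random.default_rng(1)
n = 5_000

# Invented covariates standing in for GPA, SES, college intent, etc.
gpa = rng.normal(3.0, 0.5, n)
ses = rng.normal(0.0, 1.0, n)
intent = (rng.random(n) < 0.6).astype(float)

# Self-selection: higher-SES, college-intent students are likelier to buy coaching.
p_coach = 1.0 / (1.0 + np.exp(-(-1.0 + 0.8 * ses + 1.0 * intent)))
coached = rng.random(n) < p_coach

# "True" score model with a deliberately small 15-point coaching effect.
score = 500 + 80 * gpa + 30 * ses + 15 * coached + rng.normal(0, 50, n)

# Naive comparison is inflated by who chooses coaching, not by what coaching does.
print(f"naive gap: {score[coached].mean() - score[~coached].mean():.0f} points")

# Step 1: estimate each student's propensity to be coached from observed covariates.
X = np.column_stack([gpa, ses, intent])
ps = LogisticRegression().fit(X, coached).predict_proba(X)[:, 1]

# Step 2: match each coached student to the uncoached student with the closest
# propensity score and compare outcomes within the matched pairs.
nn = NearestNeighbors(n_neighbors=1).fit(ps[~coached].reshape(-1, 1))
_, idx = nn.kneighbors(ps[coached].reshape(-1, 1))
att = (score[coached] - score[~coached][idx.ravel()]).mean()
print(f"matched estimate of the coaching effect: {att:.0f} points")

In this toy example the matched estimate lands close to the built-in 15-point effect, while the naive gap is substantially larger; that is the general point the cited paper makes with real ELS data and far more covariates.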

These studies are based on student self-reporting of test preparation, which introduces well-known biases. Generally, authors try to avoid self-reported data where possible. Here it is not possible, so they use the best data set they could access, the ELS (Education Longitudinal Study). Starting from that point, the authors try to control for factors but can't capture nuances. They attempt to control for intensity of usage by using college intent, GPAs, and SES. But students can look similar on paper by these criteria and still prepare very differently for standardized tests - some buy a book and never use it because they're too busy, others prepare extensively. One student's intensity of preparation can also vary over time depending on their other activities, whether pre-PSAT, pre-SAT, or between a first and second SAT attempt. That is why it is unwise to take a generalized result from these large-scale studies and argue that it applies to all students. If a student is willing to put in long hours preparing (whether with a paid service or a free one like Khan), that is a critical factor.


If you lack even the most basic understanding of statistics and of how testing works, you should refrain from participating in this conversation. If you don’t approve of self-reporting, how exactly do you propose collecting that data, by stealing personal information? Or should we skip the study altogether, since your aunty already told you prepping works well and you can easily improve your score by 300 points?

Basically you’re saying to disregard the statistical trends of thousands of students and look at the few outliers who spend long hours studying, never mind that what you’d be looking at is just a dose response, i.e., the more you study, the higher the score, because you’re actually learning the material that is tested.

You really can’t be any dumber than this!

Study results are only as good as the data used. These studies are using self-reported, yes/no answers as to whether students prepared for the SAT with books, courses, etc. There is no objective confirmation as to whether these responses are accurate and no objective or subjective data provided on time spent on preparation. View the study results accordingly.


It’s truly amazing how complete ignoramuses like yourself decide that they know better than experts in the field who account for more than 30 different variables tracking math credits, hours of homework per week, AP coursework, parent involvement in SAT prep and college applications, and so on. The paper even surveys five other studies, on different populations, with different analysis methods, and the score improvement due to prepping is less than 20 points on math, even less on verbal. That’s smaller than 5% of the test range and below the margin of error of the test.

But, yeah, the DCUM lady, with her vast expertise, decides this self-reported data is not reliable, and instead relies on self-reported data from friends and family that for sure shows score improvements of 100-200, even 300 points. The contradiction is completely lost on her, but no surprise here!

To wrap it up, kids scoring higher than your child didn’t cheat; they simply work harder and have more aptitude and talent. Just accept this and don’t try to smear other kids’ accomplishments with silly accusations of cheating.


Um, are you mixing up the SAT and Cogat? You're sounding a little confused now.


Cogat is supposed to be even less susceptible to prepping than SAT since it measures reasoning abilities instead of achievement. In reality nobody cares about Cogat enough to study the increase in score through prepping.


There aren't studies with "statistics" but the Cogat is well known to be highly susceptible to prepping. The test designer has put forward that scores are totally unreliable after a small amount of prepping.

This really isn't a new question.


Can you please provide proof?


There’s an entire industry of classes, books, practice tests, tutors etc. designed to raise scores by… prepping. Why the heck would “everyone” prep if it did nothing to the score? Let’s use some common sense here.


You need to substantiate your claim that the test maker says its own test is flawed.

There are way too many things on the market, pink slime being one of them. People buy it, so companies sell it. Just because people believe in something doesn’t mean it’s true.
Anonymous
Anonymous wrote:
Anonymous wrote:Don't hate the players, hate the game.


But that would mean her child is not “gifted”. What a tragedy! Instead her child is gifted, and the others are cheating.

AAP is not a gifted program.
When a child actively prepares for tests, including COGAT, NNAT, SOL, etc., it demonstrates their appreciation for and interest in learning. In this regard, they are likely to adapt well to the AAP program. While it is not a very good program for truly gifted children, it does provide an excellent opportunity for students who are motivated to delve deeper into their studies compared to their peers in general education.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Don't hate the players, hate the game.


But that would mean her child is not “gifted”. What a tragedy! Instead her child is gifted, and the others are cheating.

AAP is not a gifted program.
When a child actively prepares for tests, including COGAT, NNAT, SOL, etc., it demonstrates their appreciation for and interest in learning. In this regard, they are likely to adapt well to the AAP program. While it is not a very good program for truly gifted children, it does provide an excellent opportunity for students who are motivated to delve deeper into their studies compared to their peers in general education.


The Commonwealth of Virginia requires a gifted program in its schools. AAP is the gifted program for FCPS. And it's a good program for my "actually" gifted kid as well as my "only 120s IQ" kid.

YMMV
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Don't hate the players, hate the game.


But that would mean her child is not “gifted”. What a tragedy! Instead her child is gifted, and the others are cheating.

AAP is not a gifted program.
When a child actively prepares for tests, including COGAT, NNAT, SOL, etc., it demonstrates their appreciation for and interest in learning. In this regard, they are likely to adapt well to the AAP program. While it is not a very good program for truly gifted children, it does provide an excellent opportunity for students who are motivated to delve deeper into their studies compared to their peers in general education.


The Commonwealth of Virginia requires a gifted program in its schools. AAP is the gifted program for FCPS. And it's a good program for my "actually" gifted kid as well as my "only 120s IQ" kid.

YMMV


AAP is NOT a gifted program, but FCPS fulfills the gifted mandate through the AAP program.

Truly gifted children aren’t being served.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:Don't hate the players, hate the game.


But that would mean her child is not “gifted”. What a tragedy! Instead her child is gifted, and the others are cheating.

AAP is not a gifted program.
When a child actively prepares for tests, including COGAT, NNAT, SOL, etc., it demonstrates their appreciation for and interest in learning. In this regard, they are likely to adapt well to the AAP program. While it is not a very good program for truly gifted children, it does provide an excellent opportunity for students who are motivated to delve deeper into their studies compared to their peers in general education.


The Commonwealth of Virginia requires a gifted program in its schools. AAP is the gifted program for FCPS. And it's a good program for my "actually" gifted kid as well as my "only 120s IQ" kid.

YMMV


AAP is NOT a gifted program, but FCPS fulfills the gifted mandate through the AAP program.

Truly gifted children aren’t being served.


The implication, of course, is that your child is one of “those” gifted kids who should be served by the public school system but is not, because of all the undeserving kids that make it through.

Are you proposing something here? That those kids should be kicked out, and maybe the program restricted to the top 0.1%? The top 0.01%? At that point only your kid would qualify for the program. Maybe you should homeschool.
Anonymous
Anonymous wrote:
Anonymous wrote:Unless someone grew up with no moral compass whatsoever, people know what cheating is and they know when they are engaged in activities which are not strictly ethical.


Is that the best argument the prepping-is-cheating folks have: you know it when you see it?! If you're going to call us cheaters, at least provide an argument.

Our students study for curricular tests in math, science, English, history, etc. We rehearse before delivering a speech. Athletes warm up before a game, practice extensively, and receive coaching. The same goes for musicians. Are those sorts of prepping also forms of cheating?

As others have noted, FCPS teachers themselves help students prep for these tests. Why is that particular amount of prep OK, but any additional prep a form of cheating? And why doesn't FCPS mention this prepping-is-cheating policy on their webpage?
https://www.fcps.edu/node/39761


It’s always interesting to see the lengths people will go to to justify cheating. Do you also think it’s okay to cheat on your taxes? Because probably lots of other people are doing it, too, right?

I hope at some point in your life, you figure out how to know dishonesty when you see it, without needing someone to provide you with an argument. Adults should be able to figure out right from wrong: that’s part of being an adult.