Anonymous wrote:Anonymous wrote:The author's argument in that blog is awful. Yes, all standardized assessments have their limitations, which is why a *good* teacher will not just look at the test results as the sole measure of a student's abilities, but rather will use it as a tool to help identify weaknesses. The time it would take for a teacher to assess every single student individually to get that same data would be exponentially greater than the time spent taking the test, meaning the teacher would have less time to actually teach. I am so over this fantasy world where we pretend teachers have endless time and resources to give every child a one-on-one experience tailored exactly to their needs and skill sets.
Also note that the blog is a marketing tool for the teacher's own book on math instruction, so she has a bit of an interest in criticizing current systems.
I've only heard teachers say that iReady gave them the same information they already had about students. IOW, it has no value.
Anonymous wrote:Anonymous wrote:Anonymous wrote:This brochure says the iReady report comes back with information for each state standard or Common Core standard. Is the problem that the state standards aren't specific enough? The Common Core standards are very specific.
https://www.tsc.k12.in.us/uploaded/Technology/Tech_Training_Docs/Textbook_Adoption/iReady-User-Guide_(1).pdf
Her blog says this; however, iReady training says teachers can access more. Can anyone confirm exactly what teachers can see?
"The teacher can never see the questions the child answered correctly or incorrectly, nor can she even access a description of the kinds of questions the child answered correctly or incorrectly. The most a teacher will ever know is that a child scored poorly, for example, in number and operations. Folks, that is a giant category, and far too broad to be actionable."
We also wouldn't be having this conversation if teachers who teach ESL and special ed students directed their concerns specifically at ESL and special ed students instead of general ed. Instead, this teacher decides to rail against a test by calling it dangerous and can't even show a problem that is genuinely confusing. The work general ed students get in school is much more difficult than this equation, as it should be. The question is straightforward. This teacher deals with a population that is struggling for reasons other than understanding subtraction.
Anonymous wrote:Anonymous wrote:
To me, and I do not have incredibly strong feelings on the test or any ulterior motive, I think you underscore what some are saying is the problem. A child who got this wrong because they picked an equation from the correct fact family, which is the strategy they used to reach the correct answer, looks exactly the same in the results as a child who got it flat-out wrong. That seems to mean that children of very different abilities or deficiencies get lumped together. Hence the data is of questionable use and accuracy.
I think you are confusing a child using a strategy to get an answer with a child identifying a fact family. The answer was 4, and I believe only one of the equations had 4 as its answer. It wasn't as if several of the equations led to an answer of 4.
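For what it's worth, the fact-family point is easy to illustrate with made-up numbers. The actual test item isn't quoted anywhere in this thread, so the story and answer choices below are hypothetical:

```python
# Hypothetical reconstruction of the kind of item being discussed; the real
# iReady question isn't shown in the thread, so the numbers, the story, and
# the answer choices here are illustrative assumptions only.
total, flew_away = 9, 5       # "There were 9 birds; 5 flew away. How many are left?"
answer = total - flew_away    # 4

# Answer choices: (equation text, directly models the story?, arithmetically true?)
choices = [
    ("9 - 5 = 4", True,  9 - 5 == 4),    # direct translation of the story
    ("5 + 4 = 9", False, 5 + 4 == 9),    # same fact family, still true
    ("9 - 4 = 5", False, 9 - 4 == 5),    # same fact family, still true
    ("9 + 5 = 14", False, 9 + 5 == 14),  # true, but models a different story
]

# Every equation above is true, so a child who only checks "is this equation
# true?" (or "is this in the 9/5/4 fact family?") can't distinguish the
# intended answer; only the first choice translates the story as told.
direct = [eq for eq, is_direct, _ in choices if is_direct]
print(direct)  # ['9 - 5 = 4']
```

The point either way: all of the fact-family equations are true, so an item like this tests translating the story into an equation, not just the arithmetic.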
Anonymous wrote:Anonymous wrote:Anonymous wrote:
Did you even read the article? Please stop talking about things you know nothing about. Sure, you can solve a subtraction problem, and you went to school. But that does not mean you understand the pedagogy of mathematics. Teaching kids the "trick" or the algorithm is not the answer. Kids have to have an understanding of the concept. I'm sure you think using flashcards is a brilliant idea. Unfortunately, that just won't work with many kids.
The issue with the iReady is that the teacher never knows which problems the kid got wrong. All the report says is that the child is low in "Number Sense" or "Operations." Those are HUGE topics and leave the teacher wasting time trying to really get at where the breakdown in problem solving lies. This assessment is a waste of time, and schools that are using the computer interventions that can come with it are doing a huge disservice to children. FCPS can do better.
NP, but as a math person, I strongly disagree with you. There was nothing at all wrong with the problem as written, and only one equation correctly solved the question as asked. Two other equations were part of the same fact family and could be utilized to find the correct answer, but only one equation was a direct translation of the problem. Translating a word problem directly into a math equation is an important skill for kids to have. I'm sure there were other problems in which kids were asked to select all of the equations in the same fact family.
You keep insisting that iReady isn't accurate, but on what basis? The linked article was just speaking about special ed kids, who really ought to be receiving appropriate accommodations, and kids who gave up and filled in whatever. That's been happening since the dawn of standardized tests. Back in the day, I remember at least a few kids filling out the bubble forms for the ITBS tests by making pretty pictures with the bubbles rather than trying to do the test. What's your angle with this? Are you the teacher who said that it doesn't tell you anything you didn't already know? If so, it might be useless, but it's apparently accurate. Are you a parent? If you're a parent, are you upset that your kids' scores were lower than you expected?
There seems to be a lot of vitriol and histrionics from a few people over what is basically just another standardized test. I only have two data points, but my anecdata shows that the test is pretty accurate.
To me, and I do not have incredibly strong feelings on the test or any ulterior motive, I think you underscore what some are saying is the problem. A child who got this wrong because they picked an equation from the correct fact family, which is the strategy they used to reach the correct answer, looks exactly the same in the results as a child who got it flat-out wrong. That seems to mean that children of very different abilities or deficiencies get lumped together. Hence the data is of questionable use and accuracy.
Anonymous wrote:Anonymous wrote:Anonymous wrote:Anonymous wrote:
Did you even read the article? Please stop talking about things you know nothing about. Sure, you can solve a subtraction problem, and you went to school. But that does not mean you understand the pedagogy of mathematics. Teaching kids the "trick" or the algorithm is not the answer. Kids have to have an understanding of the concept. I'm sure you think using flashcards is a brilliant idea. Unfortunately, that just won't work with many kids.
The issue with the iReady is that the teacher never knows which problems the kid got wrong. All the report says is that the child is low in "Number Sense" or "Operations." Those are HUGE topics and leave the teacher wasting time trying to really get at where the breakdown in problem solving lies. This assessment is a waste of time, and schools that are using the computer interventions that can come with it are doing a huge disservice to children. FCPS can do better.
NP, but as a math person, I strongly disagree with you. There was nothing at all wrong with the problem as written, and only one equation correctly solved the question as asked. Two other equations were part of the same fact family and could be utilized to find the correct answer, but only one equation was a direct translation of the problem. Translating a word problem directly into a math equation is an important skill for kids to have. I'm sure there were other problems in which kids were asked to select all of the equations in the same fact family.
You keep insisting that iReady isn't accurate, but on what basis? The linked article was just speaking about special ed kids, who really ought to be receiving appropriate accommodations, and kids who gave up and filled in whatever. That's been happening since the dawn of standardized tests. Back in the day, I remember at least a few kids filling out the bubble forms for the ITBS tests by making pretty pictures with the bubbles rather than trying to do the test. What's your angle with this? Are you the teacher who said that it doesn't tell you anything you didn't already know? If so, it might be useless, but it's apparently accurate. Are you a parent? If you're a parent, are you upset that your kids' scores were lower than you expected?
There seems to be a lot of vitriol and histrionics from a few people over what is basically just another standardized test. I only have two data points, but my anecdata shows that the test is pretty accurate.
To me, and I do not have incredibly strong feelings on the test or any ulterior motive, I think you underscore what some are saying is the problem. A child who got this wrong because they picked an equation from the correct fact family, which is the strategy they used to reach the correct answer, looks exactly the same in the results as a child who got it flat-out wrong. That seems to mean that children of very different abilities or deficiencies get lumped together. Hence the data is of questionable use and accuracy.
True, but there is not just one problem that determines whether the child meets the subtraction standard. Either way, this child needs instruction on how to translate a word problem into an equation.
Anonymous wrote:Anonymous wrote:
Did you even read the article? Please stop talking about things you know nothing about. Sure, you can solve a subtraction problem, and you went to school. But that does not mean you understand the pedagogy of mathematics. Teaching kids the "trick" or the algorithm is not the answer. Kids have to have an understanding of the concept. I'm sure you think using flashcards is a brilliant idea. Unfortunately, that just won't work with many kids.
The issue with the iReady is that the teacher never knows which problems the kid got wrong. All the report says is that the child is low in "Number Sense" or "Operations." Those are HUGE topics and leave the teacher wasting time trying to really get at where the breakdown in problem solving lies. This assessment is a waste of time, and schools that are using the computer interventions that can come with it are doing a huge disservice to children. FCPS can do better.
NP, but as a math person, I strongly disagree with you. There was nothing at all wrong with the problem as written, and only one equation correctly solved the question as asked. Two other equations were part of the same fact family and could be utilized to find the correct answer, but only one equation was a direct translation of the problem. Translating a word problem directly into a math equation is an important skill for kids to have. I'm sure there were other problems in which kids were asked to select all of the equations in the same fact family.
You keep insisting that iReady isn't accurate, but on what basis? The linked article was just speaking about special ed kids, who really ought to be receiving appropriate accommodations, and kids who gave up and filled in whatever. That's been happening since the dawn of standardized tests. Back in the day, I remember at least a few kids filling out the bubble forms for the ITBS tests by making pretty pictures with the bubbles rather than trying to do the test. What's your angle with this? Are you the teacher who said that it doesn't tell you anything you didn't already know? If so, it might be useless, but it's apparently accurate. Are you a parent? If you're a parent, are you upset that your kids' scores were lower than you expected?
There seems to be a lot of vitriol and histrionics from a few people over what is basically just another standardized test. I only have two data points, but my anecdata shows that the test is pretty accurate.