Why iReady is dangerous

Anonymous
Anonymous wrote:
Anonymous wrote:

Did you even read the article? Please stop talking about things you know nothing about. Sure, you can solve a subtraction problem, and you went to school. But that does not mean you understand the pedagogy of mathematics. Teaching kids the "trick" or the algorithm is not the answer. Kids have to have an understanding of the concept. I'm sure you think using flashcards is a brilliant idea. Unfortunately, that just won't work with many kids.

The issue with the iReady is that the teacher never knows which problems the kid got wrong. All the report says is that the child is low in "Number Sense" or "Operations." Those are HUGE topics and leave the teacher wasting time trying to really get at where the breakdown in problem solving lies. This assessment is a waste of time, and schools that are using the computer interventions that can come with it are doing a huge disservice to children. FCPS can do better.


NP, but as a math person, I strongly disagree with you. There was nothing at all wrong with the problem as written, and only one equation correctly solved the question as asked. Two other equations were part of the same fact family and could be utilized to find the correct answer, but only one equation was a direct translation of the problem. Translating a word problem directly into a math equation is an important skill for kids to have. I'm sure there were other problems in which kids were asked to select all of the equations in the same fact family.

You keep insisting that iready isn't accurate, but on what basis? The linked article was just speaking about special ed kids, who really ought to be receiving appropriate accommodations, and kids who gave up and filled in whatever. That's been happening since the dawn of standardized tests. Back in the day, I remember at least a few kids filling out the bubble forms for the ITBS tests by making pretty pictures with the bubbles rather than trying to do the test. What's your angle with this? Are you the teacher who said that it doesn't tell you anything you didn't already know? If so, it might be useless, but it's apparently accurate. Are you a parent? If you're a parent, are you upset that your kids' scores were lower than you expected?

There seems to be a lot of vitriol and histrionics from a few people over what is basically just another standardized test. I only have two data points, but my anecdata shows that the test is pretty accurate.


To be clear, I do not have strong feelings about the test or any ulterior motive, but I think you underscore what some are saying is the problem. A child marked wrong for picking an equation from the correct fact family (the very strategy they used to reach the correct answer) looks exactly the same in the results as a child who got it flat-out wrong. That seems to mean that children of very different abilities or deficiencies get lumped together, so the data is of questionable use and accuracy.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:

Did you even read the article? Please stop talking about things you know nothing about. Sure, you can solve a subtraction problem, and you went to school. But that does not mean you understand the pedagogy of mathematics. Teaching kids the "trick" or the algorithm is not the answer. Kids have to have an understanding of the concept. I'm sure you think using flashcards is a brilliant idea. Unfortunately, that just won't work with many kids.

The issue with the iReady is that the teacher never knows which problems the kid got wrong. All the report says is that the child is low in "Number Sense" or "Operations." Those are HUGE topics and leave the teacher wasting time trying to really get at where the breakdown in problem solving lies. This assessment is a waste of time, and schools that are using the computer interventions that can come with it are doing a huge disservice to children. FCPS can do better.


NP, but as a math person, I strongly disagree with you. There was nothing at all wrong with the problem as written, and only one equation correctly solved the question as asked. Two other equations were part of the same fact family and could be utilized to find the correct answer, but only one equation was a direct translation of the problem. Translating a word problem directly into a math equation is an important skill for kids to have. I'm sure there were other problems in which kids were asked to select all of the equations in the same fact family.

You keep insisting that iready isn't accurate, but on what basis? The linked article was just speaking about special ed kids, who really ought to be receiving appropriate accommodations, and kids who gave up and filled in whatever. That's been happening since the dawn of standardized tests. Back in the day, I remember at least a few kids filling out the bubble forms for the ITBS tests by making pretty pictures with the bubbles rather than trying to do the test. What's your angle with this? Are you the teacher who said that it doesn't tell you anything you didn't already know? If so, it might be useless, but it's apparently accurate. Are you a parent? If you're a parent, are you upset that your kids' scores were lower than you expected?

There seems to be a lot of vitriol and histrionics from a few people over what is basically just another standardized test. I only have two data points, but my anecdata shows that the test is pretty accurate.


To be clear, I do not have strong feelings about the test or any ulterior motive, but I think you underscore what some are saying is the problem. A child marked wrong for picking an equation from the correct fact family (the very strategy they used to reach the correct answer) looks exactly the same in the results as a child who got it flat-out wrong. That seems to mean that children of very different abilities or deficiencies get lumped together, so the data is of questionable use and accuracy.


True, but there is not just one problem to determine whether the child meets the standard for subtraction. Either way, this child needs instruction on how to translate a word problem into an equation.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:

Did you even read the article? Please stop talking about things you know nothing about. Sure, you can solve a subtraction problem, and you went to school. But that does not mean you understand the pedagogy of mathematics. Teaching kids the "trick" or the algorithm is not the answer. Kids have to have an understanding of the concept. I'm sure you think using flashcards is a brilliant idea. Unfortunately, that just won't work with many kids.

The issue with the iReady is that the teacher never knows which problems the kid got wrong. All the report says is that the child is low in "Number Sense" or "Operations." Those are HUGE topics and leave the teacher wasting time trying to really get at where the breakdown in problem solving lies. This assessment is a waste of time, and schools that are using the computer interventions that can come with it are doing a huge disservice to children. FCPS can do better.


NP, but as a math person, I strongly disagree with you. There was nothing at all wrong with the problem as written, and only one equation correctly solved the question as asked. Two other equations were part of the same fact family and could be utilized to find the correct answer, but only one equation was a direct translation of the problem. Translating a word problem directly into a math equation is an important skill for kids to have. I'm sure there were other problems in which kids were asked to select all of the equations in the same fact family.

You keep insisting that iready isn't accurate, but on what basis? The linked article was just speaking about special ed kids, who really ought to be receiving appropriate accommodations, and kids who gave up and filled in whatever. That's been happening since the dawn of standardized tests. Back in the day, I remember at least a few kids filling out the bubble forms for the ITBS tests by making pretty pictures with the bubbles rather than trying to do the test. What's your angle with this? Are you the teacher who said that it doesn't tell you anything you didn't already know? If so, it might be useless, but it's apparently accurate. Are you a parent? If you're a parent, are you upset that your kids' scores were lower than you expected?

There seems to be a lot of vitriol and histrionics from a few people over what is basically just another standardized test. I only have two data points, but my anecdata shows that the test is pretty accurate.


To be clear, I do not have strong feelings about the test or any ulterior motive, but I think you underscore what some are saying is the problem. A child marked wrong for picking an equation from the correct fact family (the very strategy they used to reach the correct answer) looks exactly the same in the results as a child who got it flat-out wrong. That seems to mean that children of very different abilities or deficiencies get lumped together, so the data is of questionable use and accuracy.


True, but there is not just one problem to determine whether the child meets the standard for subtraction. Either way, this child needs instruction on how to translate a word problem into an equation.


Who knows? No one ever sees which problems a child answers correctly or incorrectly. We are supposed to rely entirely upon iready's judgment.

I'm not sure it's dangerous. But it seems no better than all the other standardized tests, and possibly worse. It also seems to test math in a strange way, which hardly seems useful in a test meant to measure growth or serve as a screener.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:

Did you even read the article? Please stop talking about things you know nothing about. Sure, you can solve a subtraction problem, and you went to school. But that does not mean you understand the pedagogy of mathematics. Teaching kids the "trick" or the algorithm is not the answer. Kids have to have an understanding of the concept. I'm sure you think using flashcards is a brilliant idea. Unfortunately, that just won't work with many kids.

The issue with the iReady is that the teacher never knows which problems the kid got wrong. All the report says is that the child is low in "Number Sense" or "Operations." Those are HUGE topics and leave the teacher wasting time trying to really get at where the breakdown in problem solving lies. This assessment is a waste of time, and schools that are using the computer interventions that can come with it are doing a huge disservice to children. FCPS can do better.


NP, but as a math person, I strongly disagree with you. There was nothing at all wrong with the problem as written, and only one equation correctly solved the question as asked. Two other equations were part of the same fact family and could be utilized to find the correct answer, but only one equation was a direct translation of the problem. Translating a word problem directly into a math equation is an important skill for kids to have. I'm sure there were other problems in which kids were asked to select all of the equations in the same fact family.

You keep insisting that iready isn't accurate, but on what basis? The linked article was just speaking about special ed kids, who really ought to be receiving appropriate accommodations, and kids who gave up and filled in whatever. That's been happening since the dawn of standardized tests. Back in the day, I remember at least a few kids filling out the bubble forms for the ITBS tests by making pretty pictures with the bubbles rather than trying to do the test. What's your angle with this? Are you the teacher who said that it doesn't tell you anything you didn't already know? If so, it might be useless, but it's apparently accurate. Are you a parent? If you're a parent, are you upset that your kids' scores were lower than you expected?

There seems to be a lot of vitriol and histrionics from a few people over what is basically just another standardized test. I only have two data points, but my anecdata shows that the test is pretty accurate.


To be clear, I do not have strong feelings about the test or any ulterior motive, but I think you underscore what some are saying is the problem. A child marked wrong for picking an equation from the correct fact family (the very strategy they used to reach the correct answer) looks exactly the same in the results as a child who got it flat-out wrong. That seems to mean that children of very different abilities or deficiencies get lumped together, so the data is of questionable use and accuracy.


I think you are confusing a child using a strategy to get an answer with a child identifying a fact family. The answer was 4, and I believe there was only one equation whose answer was 4. It wasn't as though several equations led to an answer of 4.
Anonymous
Anonymous wrote:This brochure says the iReady report comes back with information for each state standard or Common Core standard. Is the problem that the state standards aren't specific enough? The Common Core standards are very specific.

https://www.tsc.k12.in.us/uploaded/Technology/Tech_Training_Docs/Textbook_Adoption/iReady-User-Guide_(1).pdf


Her blog says this; however, iReady training says teachers can access more. Can anyone confirm exactly what teachers can see?
"The teacher can never see the questions the child answered correctly or incorrectly, nor can she even access a description of the kinds of questions the child answered correctly or incorrectly. The most a teacher will ever know is that a child scored poorly, for example, in number and operations. Folks, that is a giant category, and far too broad to be actionable."
Anonymous
Anonymous wrote:
Anonymous wrote:

To be clear, I do not have strong feelings about the test or any ulterior motive, but I think you underscore what some are saying is the problem. A child marked wrong for picking an equation from the correct fact family (the very strategy they used to reach the correct answer) looks exactly the same in the results as a child who got it flat-out wrong. That seems to mean that children of very different abilities or deficiencies get lumped together, so the data is of questionable use and accuracy.


I think you are confusing a child using a strategy to get an answer with a child identifying a fact family. The answer was 4, and I believe there was only one equation whose answer was 4. It wasn't as though several equations led to an answer of 4.


No, I’m not confused about that, though I can see how a child taking the test might be. I could suggest that you are confused about “the answer was 4,” given that 4 was not an answer choice. We would not necessarily be having this conversation if the choices were: A) 7 B) 4 C) 2 D) 11.
Anonymous
Anonymous wrote:
Anonymous wrote:This brochure says the iReady report comes back with information for each state standard or Common Core standard. Is the problem that the state standards aren't specific enough? The Common Core standards are very specific.

https://www.tsc.k12.in.us/uploaded/Technology/Tech_Training_Docs/Textbook_Adoption/iReady-User-Guide_(1).pdf


Her blog says this; however, iReady training says teachers can access more. Can anyone confirm exactly what teachers can see?
"The teacher can never see the questions the child answered correctly or incorrectly, nor can she even access a description of the kinds of questions the child answered correctly or incorrectly. The most a teacher will ever know is that a child scored poorly, for example, in number and operations. Folks, that is a giant category, and far too broad to be actionable."


We also wouldn't be having this conversation if teachers who teach ESL and special ed students directed their concerns specifically at ESL and special ed students instead of general ed. Instead, this teacher rails against the test by calling it dangerous and can't even show a problem with much genuine confusion in it. The work general ed students get in school is much more difficult than this equation, as it should be. The question is straightforward. This teacher deals with a population that is struggling for reasons other than not understanding subtraction.
Anonymous
Anonymous wrote:
Anonymous wrote:
Anonymous wrote:This brochure says the iReady report comes back with information for each state standard or Common Core standard. Is the problem that the state standards aren't specific enough? The Common Core standards are very specific.

https://www.tsc.k12.in.us/uploaded/Technology/Tech_Training_Docs/Textbook_Adoption/iReady-User-Guide_(1).pdf


Her blog says this; however, iReady training says teachers can access more. Can anyone confirm exactly what teachers can see?
"The teacher can never see the questions the child answered correctly or incorrectly, nor can she even access a description of the kinds of questions the child answered correctly or incorrectly. The most a teacher will ever know is that a child scored poorly, for example, in number and operations. Folks, that is a giant category, and far too broad to be actionable."


We also wouldn't be having this conversation if teachers who teach ESL and special ed students directed their concerns specifically at ESL and special ed students instead of general ed. Instead, this teacher rails against the test by calling it dangerous and can't even show a problem with much genuine confusion in it. The work general ed students get in school is much more difficult than this equation, as it should be. The question is straightforward. This teacher deals with a population that is struggling for reasons other than not understanding subtraction.


It helps identify general ed students who have weaknesses and hopefully will lead to earlier identification of students who need special education services but are not currently receiving any.
Anonymous
Anonymous wrote:
To be clear, I do not have strong feelings about the test or any ulterior motive, but I think you underscore what some are saying is the problem. A child marked wrong for picking an equation from the correct fact family (the very strategy they used to reach the correct answer) looks exactly the same in the results as a child who got it flat-out wrong. That seems to mean that children of very different abilities or deficiencies get lumped together, so the data is of questionable use and accuracy.


My kids received problems like that all the time on regular FCPS worksheets and tests in first and second grade. If they had chosen a member of the same fact family instead of the one that directly translates the word problem, it would have been marked wrong. I doubt many kids are confused by this, since they talk about it quite a bit in first and second grade. There's nothing weird at all about the problem.

It also should be noted that the test is adaptive, and getting a problem wrong doesn't penalize you in the way you seem to think. If the kid gets one wrong, an easier question is offered. If the kid gets that one right, a harder problem is offered, and so on. At the end of the test, the score is the "level" at which the kid is getting about half of the problems correct. Unless a kid is repeatedly making mistakes, misunderstanding one problem here or there shouldn't make a difference in the final score. Both of my kids tend to make careless mistakes. They still both tested about where I would have expected them to be.
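The staircase behavior described above (miss a problem, get an easier one; answer correctly, get a harder one, until you hover near 50% correct) can be sketched in a few lines. This is a hypothetical toy model for illustration only; iReady's actual adaptive algorithm is proprietary, and the function name, step size, and simulated-student formula here are all made up:

```python
import random

def run_adaptive_test(true_level, num_items=30, step=5, seed=0):
    """Toy staircase: raise the difficulty after a correct answer,
    lower it after a wrong one. The difficulty drifts toward the
    level where the student answers about half the items correctly.
    (Illustration only; not iReady's real, proprietary algorithm.)"""
    rng = random.Random(seed)
    difficulty = 50.0  # arbitrary starting difficulty
    history = []
    for _ in range(num_items):
        # Simulated student: easier items are more likely to be correct.
        p_correct = 1.0 / (1.0 + 2.0 ** ((difficulty - true_level) / 10.0))
        correct = rng.random() < p_correct
        history.append(difficulty)
        difficulty += step if correct else -step
    # Estimate the level as the average difficulty near the end,
    # once the staircase is hovering around the 50% point.
    return sum(history[-10:]) / 10.0
```

In this toy model, a student whose true level is above the starting difficulty climbs quickly, and an occasional careless miss just triggers one easier item and a quick recovery, so one-off mistakes barely move the final estimate.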

I'm not passionately pro-iReady. I do think FCPS needed some sort of achievement and growth metric that is a little less subject to teacher bias or incompetence (like the DRA). iReady seems neither better nor worse than anything else out there. If the people arguing against iReady are just fundamentally opposed to standardized testing, that's a completely different argument.
Anonymous
The author's argument in that blog is awful. Yes, all standardized assessments have their limitations, which is why a *good* teacher will not just look at the test results as the sole measure of a student's abilities, but rather will use it as a tool to help identify weaknesses. The time it would take for a teacher to assess every single student individually to get that same data would be exponentially greater than the time spent taking the test, meaning the teacher would have less time to actually teach. I am so over this fantasy world where we pretend teachers have endless time and resources to give every child a one-on-one experience tailored exactly to their needs and skill sets.

Also note that the blog is a marketing tool for the teacher's own book on math instruction, so she has a bit of an interest in criticizing current systems.
Anonymous
Anonymous wrote:The author's argument in that blog is awful. Yes, all standardized assessments have their limitations, which is why a *good* teacher will not just look at the test results as the sole measure of a student's abilities, but rather will use it as a tool to help identify weaknesses. The time it would take for a teacher to assess every single student individually to get that same data would be exponentially greater than the time spent taking the test, meaning the teacher would have less time to actually teach. I am so over this fantasy world where we pretend teachers have endless time and resources to give every child a one-on-one experience tailored exactly to their needs and skill sets.

Also note that the blog is a marketing tool for the teacher's own book on math instruction, so she has a bit of an interest in criticizing current systems.


I've only heard teachers say that iReady gave them the same information they already had about students. IOW, it has no value.
Anonymous
Anonymous wrote:
Anonymous wrote:The author's argument in that blog is awful. Yes, all standardized assessments have their limitations, which is why a *good* teacher will not just look at the test results as the sole measure of a student's abilities, but rather will use it as a tool to help identify weaknesses. The time it would take for a teacher to assess every single student individually to get that same data would be exponentially greater than the time spent taking the test, meaning the teacher would have less time to actually teach. I am so over this fantasy world where we pretend teachers have endless time and resources to give every child a one-on-one experience tailored exactly to their needs and skill sets.

Also note that the blog is a marketing tool for the teacher's own book on math instruction, so she has a bit of an interest in criticizing current systems.


I've only heard teachers say that iReady gave them the same information they already had about students. IOW, it has no value.


Many teachers say this because they don't want to be measured themselves. There's an inherent reason for them not to like standardized tests, especially now that we've stopped putting the onus on children and parents to work toward improvement and instead blame the teachers. PP is right. They don't like being put in the position of delivering bad news to parents or being blamed for children's lack of knowledge. Their internal grade as a teacher is rarely reflective of the grades on these tests, but I think there is still an inclination for teachers to be against standardized testing that measures them against their peers. A screener early in the year, though, would be better received than an end-of-year assessment. The iReady test is a tool that takes less time than a teacher assessing each child individually. One of the biggest complaints from teachers is time. There isn't time to meet with the highest or lowest reading group, differentiate math lessons, provide new and exciting lesson plans, provide writing feedback, prepare IEP plans, and on and on. We read the complaints here over and over. Tests have been given for centuries because they are effective teaching and assessment tools and take less time than a teacher assessing each child one by one.
Anonymous
She's a Title 1 Math Resource Teacher at Belvedere Elementary. She's not a classroom teacher.
Anonymous
Anonymous wrote:
Anonymous wrote:The author's argument in that blog is awful. Yes, all standardized assessments have their limitations, which is why a *good* teacher will not just look at the test results as the sole measure of a student's abilities, but rather will use it as a tool to help identify weaknesses. The time it would take for a teacher to assess every single student individually to get that same data would be exponentially greater than the time spent taking the test, meaning the teacher would have less time to actually teach. I am so over this fantasy world where we pretend teachers have endless time and resources to give every child a one-on-one experience tailored exactly to their needs and skill sets.

Also note that the blog is a marketing tool for the teacher's own book on math instruction, so she has a bit of an interest in criticizing current systems.


I've only heard teachers say that iReady gave them the same information they already had about students. IOW, it has no value.


If they already know it all, then they shouldn't be concerned about the limitations of the data it provides, which is what the author was so concerned about.
Anonymous
I’m a 3rd grade teacher. iReady took away hours of instructional time three times this year. After my students spent two weeks taking SOL tests, they had to sit and take the iReady, which was brutal for them. Walking around and looking at the questions, I saw many that are not covered in our curriculum. The test also gets harder as students answer correctly, so I had average math students working on three-digit long division problems, which created a lot of frustration. The score reports are very vague; I learned nothing new about my students from them. When my students were writing letters of advice for next year's class, they said that the SOLs are nothing and that the iReady is more stressful.