"Teacher of the Year" quits over Common Core tests

Anonymous
No, that's not how the test-results part of the performance evaluation works. Nobody is proposing a simple "high test scores = good teacher, lower test scores = bad teacher" performance evaluation method.



How does it work?
Anonymous
Anonymous wrote:
No, that's not how the test-results part of the performance evaluation works. Nobody is proposing a simple "high test scores = good teacher, lower test scores = bad teacher" performance evaluation method.


But you do agree that they are proposing to use test results as part of the performance evaluation.
Anonymous
Anonymous wrote:
No, that's not how the test-results part of the performance evaluation works. Nobody is proposing a simple "high test scores = good teacher, lower test scores = bad teacher" performance evaluation method.


But you do agree that they are proposing to use test results as part of the performance evaluation.


They, who?

Yes, there are proposals to use test results as part of the performance evaluation. As far as I know, they are all based on some kind of value-added model -- for example, how much the class's test scores went up from last year to this year.
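Just to make the idea concrete, here is a purely illustrative sketch of the simplest possible version of that (all numbers and names made up; real value-added models are far more elaborate statistical models):

```python
# Toy illustration of the simplest value-added idea: compare a class's
# average score gain to the average gain district-wide. All numbers and
# names here are hypothetical.

def value_added(last_year, this_year, district_avg_gain):
    """Average class gain minus the district-wide average gain."""
    gains = [t - l for l, t in zip(last_year, this_year)]
    class_avg_gain = sum(gains) / len(gains)
    return class_avg_gain - district_avg_gain

# A class that gained 8 points on average, in a district averaging a
# 5-point gain, gets a value-added score of 3.
print(value_added([60, 70, 80], [70, 78, 86], 5.0))  # 3.0
```

Actual models used in evaluations also try to control for student demographics, prior achievement, and measurement error, which is where the disagreements start.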
Anonymous
Anonymous wrote:
They, who?


Race to the Top (a federal program) requires teacher evaluations be tied to student achievement (testing).
Anonymous

Value added would work if they don't include students who failed the year before (because they would not be up to the past level and would be taking the next level's test). If they gave the failing students the same test from the year before, that might be fair (because they might have improved at the previous level). However, if the student was more than a level behind, they would have to go back to a previous test that could reflect their growth.

I'm sure this would all get ferreted out.
Anonymous
Anonymous wrote:
Value added would work if they don't include students who failed the year before (because they would not be up to the past level and would be taking the next level's test). If they gave the failing students the same test from the year before, that might be fair (because they might have improved at the previous level). However, if the student was more than a level behind, they would have to go back to a previous test that could reflect their growth.

I'm sure this would all get ferreted out.


Value-added doesn't work because there are far more factors than the model can control for.
Anonymous
Anonymous wrote:
Value-added doesn't work because there are far more factors than the model can control for.


You mean that this wasn't thought through very well before they made it a requirement for Race to the Top?

There's some kind of a pattern here.
Anonymous

So glad they stopped funding for Race to the Top.
Anonymous

Anonymous wrote:
Value added would work if they don't include students who failed the year before (because they would not be up to the past level and would be taking the next level's test). If they gave the failing students the same test from the year before, that might be fair (because they might have improved at the previous level). However, if the student was more than a level behind, they would have to go back to a previous test that could reflect their growth.

I'm sure this would all get ferreted out.


You've never taught disadvantaged kids, have you?




Anonymous
Anonymous wrote:
Value-added doesn't work because there are far more factors than the model can control for.


You mean that this wasn't thought through very well before they made it a requirement for Race to the Top?

There's some kind of a pattern here.


There are also studies that show that value-added models do work. Here is the American Statistical Association position about them:

https://www.amstat.org/policy/pdfs/ASA_VAM_Statement.pdf

In any case, school systems that didn't like the idea were free not to apply for Race to the Top grants. MCPS didn't.
Anonymous

Anonymous wrote:
There are also studies that show that value-added models do work. Here is the American Statistical Association position about them:

https://www.amstat.org/policy/pdfs/ASA_VAM_Statement.pdf

In any case, school systems that didn't like the idea were free not to apply for Race to the Top grants. MCPS didn't.


Lots of caveats in this statement. I wouldn't call it an endorsement of testing and evaluation of teachers.
Anonymous
From your article that shows that VAM (value added models) DO work:

These studies, however, have taken place in districts in which VAMs are used for low-stakes purposes. The models fit under these circumstances do not necessarily predict the relationship between VAM scores and student test score gains that would result if VAMs were implemented for high-stakes purposes such as awarding tenure, making salary decisions, or dismissing teachers.

The quality of education is not one event but a system of many interacting components. The impact of high-stakes uses of VAMs on the education system depends not only on the statistical properties of the VAM results but on their deployment in the system, especially with regard to how various types of evidence contribute to an overall evaluation and to consequences for teachers.

It is unknown how full implementation of an accountability system incorporating test-based indicators, such as those derived from VAMs, will affect the actions and dispositions of teachers, principals and other educators. Perceptions of transparency, fairness and credibility will be crucial in determining the degree of success of the system as a whole in achieving its goals of improving the quality of teaching. Given the unpredictability of such complex interacting forces, it is difficult to anticipate how the education system as a whole will be affected and how the educator labor market will respond. We know from experience with other quality improvement undertakings that changes in evaluation strategy have unintended consequences. A decision to use VAMs for teacher evaluations might change the way the tests are viewed and lead to changes in the school environment. For example, more classroom time might be spent on test preparation and on specific content from the test at the exclusion of content that may lead to better long-term learning gains or motivation for students. Certain schools may be hard to staff if there is a perception that it is harder for teachers to achieve good VAM scores when working in them. Overreliance on VAM scores may foster a competitive environment, discouraging collaboration and efforts to improve the educational system as a whole.
Anonymous

Sorry, that didn't copy right. The quotes are from the article cited above (the one supposedly showing that VAMs do work):

https://www.amstat.org/policy/pdfs/ASA_VAM_Statement.pdf


Hardly a ringing endorsement of VAMs.
Anonymous


More from the article:

• Estimates from VAMs should always be accompanied by measures of precision and a discussion of the assumptions and possible limitations of the model. These limitations are particularly relevant if VAMs are used for high-stakes purposes.
  o VAMs are generally based on standardized test scores, and do not directly measure potential teacher contributions toward other student outcomes.
  o VAMs typically measure correlation, not causation: Effects – positive or negative – attributed to a teacher may actually be caused by other factors that are not captured in the model.
  o Under some conditions, VAM scores and rankings can change substantially when a different model or test is used, and a thorough analysis should be undertaken to evaluate the sensitivity of estimates to different models.
• VAMs should be viewed within the context of quality improvement, which distinguishes aspects of quality that can be attributed to the system from those that can be attributed to individual teachers, teacher preparation programs, or schools. Most VAM studies find that teachers account for about 1% to 14% of the variability in test scores, and that the majority of opportunities for quality improvement are found in the system-level conditions. Ranking teachers by their VAM scores can have unintended consequences that reduce quality.

Anonymous
Anonymous wrote:
In any case, school systems that didn't like the idea were free not to apply for Race to the Top grants. MCPS didn't.


I can see why they didn't.