As states work to implement new evaluation systems, the tenuous, if not ambiguous, relationship between value-added (VA) scores and principals' observations will need to be clarified and better understood. New research by Douglas Harris, William Ingle and Stacey Rutledge provides some insight, asking: How consistent are principals' impressions of their teachers with those teachers' value-added data?
Researchers interviewed elementary and secondary principals in one Florida school district and asked them first to rate a sample of teachers on a number of different factors and second to give each teacher an overall score. These ratings were then compared with those teachers' value-added data.
The researchers found a positive but quite weak correlation between principals' overall ratings and their teachers' value-added scores. They also found that some principals perceived teachers with high VA scores as working too often in isolation and participating too infrequently in school extracurricular activities; as a result, these principals often gave such teachers lower overall ratings. Notably, principals were not aware of teachers' VA scores at the time they rated them.
At least there was a strong, statistically significant correlation where it would be most expected: when principals were asked to judge only teachers' "teaching ability."