Existential Takes on First Year Induction and Professional Development


While states and school districts scramble to find funding to boost induction and professional development opportunities, two new studies funded by the Institute of Education Sciences give pause.

One study of no fewer than 17 school districts asks whether districts should be spending a lot of money on induction. The researchers compared two highly respected, well-delivered teacher induction models (those of the New Teacher Center and the Educational Testing Service) with the standard district fare, delivered at a fraction of the price. After one year of study, they could find no differences in any of the outcomes we all care about: all three models produced the same attrition rate, and the teachers' students made equivalent progress as measured by standardized test scores.

The findings are sorely discouraging, but far too preliminary to alter practice. The study examined the two programs during their first year of implementation in the districts (plagued with all of the usual problems associated with startups), pitting them against the districts' own well-established models. Second-year results are forthcoming.

The second study reaches similarly discouraging conclusions about the impact of professional development on teachers' effectiveness. A well-respected program designed to give teachers more knowledge about reading instruction, LETRS, was introduced in six districts that had already been using a scientifically based reading program. Teachers who participated in the LETRS training learned more about the science of reading, but did not produce higher student achievement scores than their peers who did not receive the training. Even when some teachers were given 60 hours of coaching, no differences were found.

The findings contrast with a number of reading studies showing that more knowledgeable teachers are more effective. It may again be that the study tried to short-circuit a process that needs more than a single school year to take off. One might also argue that the training provided was not particularly intensive, and that the coaches received no more training in reading than the teachers they were asked to coach. Still, we would have guessed that some measurable difference might show up.