edTPA: Slow this train down


Two weeks ago, in the Chronicle of Higher Education, Linda Darling-Hammond delivered a strong pitch for the "edTPA," the next generation Teacher Performance Assessment (TPA), an instrument that is used to judge a teacher candidate's teaching skills. With its origins in the "PACT," a TPA that was developed by Darling-Hammond and colleagues at Stanford, the edTPA has been gaining steam nationwide as a requirement for program completion.

What exactly is a TPA? Basically, it's an exercise in which the teacher candidate plans and delivers a few lessons, submitting planning and assessment materials as well as videotapes for review. California's teacher preparation programs pioneered three different TPAs beginning in 2002, and the state made completing one of them a licensing requirement in 2008. Now the edTPA is being rolled out nationwide, with some 7,000 teacher candidates currently involved. There's also been a major push by AACTE, the membership organization for teacher prep, to get states to adopt the edTPA.

TPAs hold tremendous promise in that they can serve as a central organizing principle for preparation programs, which tend to suffer from a lack of curricular continuity. But the edTPA isn't being pushed for this reason... at least that's not how the arguments are being framed. No -- the primary argument is that the edTPA would be a much better tool for teacher prep program accountability than measures that look directly at teacher effectiveness using stats on student performance (a case in point). Darling-Hammond, AACTE and others in the field explicitly argue that if states adopt the edTPA as a new licensing screen, they will not only succeed in keeping bad apples out of teaching, but also will be able to distinguish good teacher prep programs from bad ones.

And therein lies the rub.

What may be a very good culminating exercise for any program to administer is not necessarily a sufficiently valid and reliable measure of either an individual teacher or the quality of a program. For example, the edTPA allows candidates to choose the lessons they will deliver, rehearse as many times as they wish, and edit the videotape of their teaching. If a prospective elementary teacher chooses to teach a lesson on parallel lines rather than on equivalent fractions (because she really dislikes fractions), and even then edits out an instructional faux pas, is the resulting lesson a valid assessment of her overall teaching skills?

In fact, because California programs are not required to report aggregate pass rates to the state (either how many pass on the first try or how many ultimately pass after repeated attempts), there is no information available as to whether their expensive TPA exercises ($200-400 per candidate) represent any kind of screen for unqualified candidates. That in itself is a very odd decision for a state to make about a licensing test. Further, we've found indications from a few California institutions that the pass rate is indeed quite high (98 percent), which, if correct, is too high to justify the time and resources dedicated to administration.

It's not that we haven't been down this road before. If pass rates on the edTPA are in the stratosphere, they will mirror results of the "Praxis III," an expensive performance assessment for new teachers that several states adopted in the last decade after teacher educators similarly promised it would serve as an effective screening tool and eliminate the need for other types of accountability measures. For better or worse, the Praxis III is now moribund.

The best way to answer questions on the potential use of the edTPA for accountability purposes is with hard evidence regarding the relationship of scores to student performance, produced from large-scale studies published in peer-reviewed journals. Unfortunately, there is no such evidence to date. Among the five research papers found on the edTPA website, there is only one that goes beyond a very small "preliminary" study done several years ago. Not only is this latest study conducted by the assessment's own developers and unpublished, it is not the large-scale study one might expect for a national initiative of this magnitude. Thousands of California teachers have received their licenses after passing the edTPA's precursor. Grist for the research mill certainly would seem available.

More and more public and private endeavors are striving to use "big data." Unfortunately, "no data" is too often the byword in teacher preparation. We're ready to join the edTPA bandwagon when it's clear that: 1) the passing score bears some relationship to teacher effectiveness, and 2) there is no potential for candidates to achieve that score by gaming the assessment through excessive coaching, videotape editing and the like. Until then, we're inclined to regard TPAs as what they demonstrably are: decent culminating projects.