A second study out of Florida has failed to find any value in pre-service teacher preparation -- or in teacher smarts, for that matter.
The study by Matthew Chingos and Paul Peterson didn't produce quite the puzzling results of an earlier Douglas Harris and Tim Sass study (which found, among other things, that high school math teachers' effectiveness is inversely related to their math SAT scores). But if we're not scratching our heads, we're at least scratching our chins.
Given decades of research pointing to the increased effectiveness of teachers with greater academic talent, we at least would not have predicted that teachers from St. Petersburg College, the least selective of the 11 Florida institutions studied, would hold their own against teachers from the most selective of the 11, the University of Florida. But they did. Then again, the University of Florida's education school is not all that selective (having only recently raised its requirement for teacher candidates to a combined math and verbal SAT score of 1010, still at the low end of what could be considered respectable).
In the dynamic field of economic research on teacher performance, this study is a way station, not a final destination. Consider, for example, Tennessee's value-added analysis discussed in last month's TQB, which found Teach For America corps members and Vanderbilt University graduates (whose average combined math and verbal SAT score of 1304 probably puts them just slightly below TFAers in mental firepower) to be the most effective teachers in the state. Those results make it hard to believe that basic smarts aren't a necessary (if not sufficient) factor in teacher effectiveness. Perhaps somewhere between "selectivity" as defined by the University of Florida and "selectivity" as defined by Vanderbilt lies a sweet spot that teacher preparation admissions must tap into.