Drowning in data from the OECD


Though I've never considered myself an isolationist, I am increasingly questioning the value of looking beyond American borders for insights into education, at least regarding teacher quality.

The study out last week from the OECD, Effective Teacher Policies: Insights from PISA, technically falls right into my sweet spot: it sets out to surface the definitive set of teacher policies that make up the 'secret sauce' of school success. The premise is that by digging into the policies shared by nations whose students do well on the PISA, the study can hand every nation the roadmap it needs to similarly succeed.

If there's a roadmap to be had, it's unreadable through the coffee stains, not to mention a bit sticky from too many peanut butter and jelly sandwiches being passed around the car.

Of the 35 nations participating in the PISA, the OECD identifies 19 as "high performing" because their students' scores are "above average." As the US is rarely anything but average, it is not among the 19. The OECD has taken an inclusive approach, one that lets it deal with that pesky problem of too many Asian countries at the top (which would have given us Westerners permission to dismiss any lessons as irrelevant). Unfortunately, it also means that nations like Slovenia and the United Kingdom, both barely above average, are granted the same consideration as genuinely high-performing nations like Singapore and Taipei.

It's clear that there were more differences than commonalities among them, but three teacher policies do surface as practiced by most or all of the 19 above-average nations: mandatory clinical practice for teacher candidates before they start teaching; 'bespoke' professional development opportunities (which simply means PD tailored to or by individual schools); and teacher evaluation systems that are either mandated by law or simply ingrained in the practice of schools. It's not clear how common these same three policies are among the 16 straggler nations; that would have been worth noting explicitly.

The OECD covers itself with caveats and limitations, so my observations here should not be construed as charging it with spreading false news, not by any stretch. Instead, my beef is with the utility of the comparisons among these countries, given the extensive distilling the OECD must do to analyze and present thousands of data points, stripping the data of their context and complexity.

For example, we learn that Japan requires only 20 days of student teaching. Horrifyingly short is our initial reaction, but we are then told not to make anything of it because the country has a strong induction program to compensate. Or we are told that Singapore has no entry exams for getting into teaching, but cautioned not to make too much of that, as it generally admits into teacher preparation only students who are in the top third of their classes. These are two instances where the nuance is explained. In most cases, no explanation is given, and even if one were, would anyone read it?

It simply is not possible to do justice to the facts on the ground in each of these nations without repeatedly misleading readers. The misleading may not be intentional, but the result is the same.

The source of the OECD's data is part of the problem. It relies on survey data collected from the teachers and principals who work in the schools where 15-year-old students take the PISA. This churns up a whole host of data that we know just ain't right. Surveyed principals from the United States claimed that 100 percent of their teachers were formally evaluated, making the US tops among nations in its approach to evaluation. Ninety-one percent of US principals report that their science teachers all majored in science, quite a bit higher than data from NCES, which puts the figure at 84 percent.

Official government sources that might produce more reliable data can't be used. That's because different countries collect data differently, or more often not at all, so the OECD administers a common survey as the only viable alternative. What that produces, however, are results that conflict with other trusted sources, consistently painting a rosier picture, at least for the US. About 160 US principals answered a survey question about the average student-teacher ratio here, reporting an average of 14.2 for secondary schools, even though the official US figure for the same year is projected at 15.6. And since that official figure includes elementary grades, which have smaller classes, the true secondary-only ratio is likely higher still.

Are data with so many conditions and caveats, and so little nuance, better than no data at all? Arguably, but by no means definitively.

The OECD states that its "goal is not to develop a blueprint for teacher policies, but rather to illustrate the existing evidence and gaps, and thereby contribute to the ongoing debate about effective teacher policies." I know no more what to make of that statement than I know what to do with much of the data.

What's the answer to this problem? Not, as my opening suggests, to look only within our own borders. But I do think there is a better way: narrow the scope of these international studies significantly. Identify a small number of questions that need answering and answer them in their entirety, nuance and all. For example, what do these countries really do by way of clinical preparation of their teacher candidates? How do they evaluate teachers? Otherwise, the OECD's ambitions, while alluring, sink under their own weight.