U.S. Education Deserves the Same Statistical Sophistication as Baseball and Elections

(Co-authored with Richard Rothstein)

For better or worse, we're a nation that's coming to respect statistics. Billy Beane convinced us that better statistics could beat bigger payrolls in sports. Nate Silver helped humble Karl Rove's money machine with better statistics. So maybe it's time to take a more careful look at the international test statistics judging how good our nation's schools are.

Just last month, when the new Trends in International Mathematics and Science Study (TIMSS) scores were released, Education Secretary Arne Duncan called them "unacceptable," saying they "underscore the urgency of accelerating achievement... and the need to close large and persistent achievement gaps."

It was no different two years ago when the Program for International Student Assessment (PISA) released its latest scores. Secretary Duncan said they showed American students "napping at the wheel while emerging competitors prepare their students for economic leadership... As disturbing as these national trends are for America, enormous achievement gaps among black and Hispanic students portend even more trouble for the U.S. in the years ahead."

This all sounds like America's schools are pretty awful and not making much progress. But here is where a more careful look at statistics suggests a very different story. In a report we just completed, What Do International Tests Really Show about American Student Performance, we show that the conclusions Duncan and other pundits draw from international test results are oversimplified, often exaggerated, and misleading. They ignore the complexity of testing and may lead educational policymakers to pursue inappropriate and even harmful reforms.

Since the last PISA release in 2010, we have been digging deeper into its database, as well as into the older databases for TIMSS and for both versions of our domestic National Assessment of Educational Progress (NAEP). We used this rich information to disaggregate scores by students' socioeconomic characteristics, school composition, and other informative criteria.

Analyzing these richer statistics, we found that average U.S. reading and math scores on the PISA test are low partly because a much higher share of U.S. students comes from disadvantaged social class groups, whose performance is relatively low in every country.

Two reasonable adjustments to the latest U.S. PISA scores -- one assuming that U.S. students had the same social class composition as Canada, Finland, and Korea, and the other correcting for a flaw in the way low social class U.S. students were sampled -- would lift U.S. rankings from 14th to fourth in reading and from 25th to 10th in math.

Our point is the same as Nate Silver's: just as predicted voting outcomes depend in part on who is expected to vote, school outcomes depend in part on which students take the test. Nations with more lower social class students will have lower overall scores, because these students don't perform as well academically, even in good schools in the highest scoring countries. Policymakers should understand how our lower and higher social class students perform compared to similar students in other countries before recommending sweeping school reforms.
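To make that kind of adjustment concrete, here is a minimal sketch of the reweighting idea: hold each social class group's average score fixed and recompute the national average using a different social class mix. The group shares, scores, and the comparison mix below are hypothetical placeholders for illustration only, not the actual figures or the precise method used in our report.

```python
# Minimal sketch of a composition adjustment (reweighting group averages).
# All numbers are hypothetical placeholders, not PISA data.

# Mean scores by social class group (hypothetical)
us_group_means = {"low": 440, "middle": 500, "high": 560}

# Share of test takers in each group (hypothetical)
us_shares = {"low": 0.40, "middle": 0.40, "high": 0.20}
comparison_shares = {"low": 0.25, "middle": 0.45, "high": 0.30}  # a top-scoring-country-like mix


def weighted_average(group_means, shares):
    """Overall score as a share-weighted average of group means."""
    return sum(group_means[g] * shares[g] for g in group_means)


unadjusted = weighted_average(us_group_means, us_shares)
adjusted = weighted_average(us_group_means, comparison_shares)  # same group performance, different mix

print(f"Unadjusted average: {unadjusted:.0f}")          # 488 with these placeholder numbers
print(f"Composition-adjusted average: {adjusted:.0f}")  # 503 with these placeholder numbers
```

With the group-level performance held constant, the overall average rises simply because the student mix changes -- which is the sense in which social class composition, not school quality alone, drives the headline rankings.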

So when we compare reading and math results on one of these tests -- the PISA -- for U.S. high, middle, and low social class students with similar social class students in major European countries such as France, Germany, and the U.K., as well as in top-scoring Canada, Finland, and Korea, we get some revealing results:

• In the PISA reading test, advantaged students in the U.S. perform as well as students in all of these comparison countries, including the top-scoring ones, and disadvantaged students in the U.S. perform better than disadvantaged students in France, Germany, and the U.K.

• The reading and math achievement of lower social class U.S. students improved substantially from 2000 to 2009, while the achievement of similarly disadvantaged students declined in all of these other countries except Germany.

• Where we, as a nation, do less well on the PISA is in math. But even in math, our disadvantaged students do as well as or better than disadvantaged students in France, Germany, and the U.K., and, as mentioned, our disadvantaged students have made greater progress in math over the past ten years than disadvantaged students in the top-scoring countries.

• Most of our policymakers claim that the achievement gap between advantaged and disadvantaged students in the U.S. is much larger than in other countries. Our data show that claim is misleading. The gap is smaller in the U.S. than in similar post-industrial countries. It is larger than the gap in the top-scoring countries, but even then, many of the differences are small.

• In the PISA math test, the U.S. achievement gap is relatively small because the scores of both disadvantaged and advantaged students are relatively low compared to similar students in the top-scoring countries. So that's not good. But, please, no more claims that our disadvantaged students are farther behind our advantaged students than they are in other countries.

• Finland has been touted as a model the U.S. should copy. We find that both advantaged and disadvantaged Finnish students score much higher than U.S. students. But the scores of lower social class Finnish students have fallen substantially on the PISA since 2000, while the scores of low social class U.S. students have risen. Doesn't it make sense for U.S. education reformers to be cautious about copying a system where achievement for the most vulnerable groups is falling?

Another caution that emerges from our comparisons of U.S. student performance on the PISA and TIMSS tests is to avoid basing education policy on the results of only one test. Average U.S. math scores have been rising steadily on the TIMSS and on our own National Assessment of Educational Progress (NAEP). Remember, math is not our strong suit internationally. But this is where TIMSS and NAEP show steady progress. And, as on the PISA, the largest U.S. gains in math on the TIMSS are for our most disadvantaged students.

There is no doubt that U.S. education can be improved. But we need to make education policy with a sophisticated use of statistics, including statistics showing that our schools might actually be doing better with the students we have than other countries' schools do with the students they have.

