The release of the US News & World Report college rankings is as good an excuse as any to talk about the sheer ridiculousness of organizing complex institutions into rank order and pert decimal scores.
The criticisms of the index itself are nicely summarized in this Atlantic article, but for those who don't have time for the full Gladwell, they are basically of two kinds:
- The ranking is flawed. The methodology constantly changes, schools juke their stats, it's based on bullshit surveys that only measure the school's established reputation, etc. The data might be good enough to distinguish Harvard University from Southern Methodist Tech, but there is too much noise to say that Yale is better than Princeton or that Oregon State is better than Penn State.
- Most of the information we use to determine the quality of education isn't readily measurable. How good are the teachers? Do students get enough personal attention? Is the campus social life welcoming or cliquey? If you list everything that made your college experience positive or negative, you won't find it in the number tables of these rankings.
I used to see the same impulse at the human rights NGO where I worked. Sometimes companies would ask us for big packages of countries -- 10, 20, 50 at a time.
"Can't you just give us a ranking?" they would ask. "Tell us which country is the worst of that list, and we won't make shoes there."
Or, they would suggest, better yet, give us an index. If you tell us that Bolivia scores 8.2 out of 10 and Iran scores 8.3 out of 10, we can make our decision on quantitative data rather than just putting our finger in the air.
"But that is putting your finger in the air," I would tell them. "It's just us doing it instead of you. Bolivia and Iran, their politics, their demographics, their economics, they look nothing like each other. Two numbers isn't going to make that go away."
This is one of my beefs with the Failed States Index, the Economic Freedom Index, the Human Development Index, the Corruption Perceptions Index, and the dozens of other indices that purport to rank countries according to some difficult-to-measure variable.
Countries have all the same problems as colleges. The data out there aren't strong enough to justify precise determinations, only broad tranches. Yes, Norway is less corrupt than Angola, but I don't need an index to tell me that.
But is Norway less corrupt than Luxembourg? More developed than Switzerland? Slightly more failed than Finland? Given the limited data and even more limited number of indicators these indices use, the answers to those questions are a re-statement of your methodology, not a useful analysis of the conditions in those two countries.
I used to try to tell the companies this, that any attempt to rank countries according to their ability to prevent corporate human rights violations would be like trying to rank kittens according to cuteness. After you separate them into the already-obvious tranches, it's just a judgement call, preferences disguised as data.
"But it would be so much easier for me if you could do that anyway." Only one corporate person ever actually said it this directly, but afterward I started hearing it, in subtler ways, from the others.
Eventually I realized that the only reason the companies pushed so hard, why they insisted so strongly on rankings and scores over information and analysis, was that it made it not their problem anymore. They weren't qualified to pull 50 "good" countries from 100 uncategorized ones, so they used us to push the responsibility away. "It's not me saying Bolivia is an 8.2," they could tell their boss. "A human rights NGO said it was." Making shoes there is NGO-approved.
I don't know if high schoolers use college rankings to decide where they should get educated. And I don't know if multinational corporations use country indices to decide where they should make shoes. I just hope that in both cases, they know that most of what they're seeing is either totally obvious or entirely unsubstantiated.