Beware of College Rankings Without Context

When the U.S. Department of Education released its first College Scorecard in September, the media narrative was that college rankings would become more reliable because they could draw on large amounts of federally verified data. Before publishing the website, the department released a large dataset reporting on college attendees--but only those who received federal financial aid. The dataset showed financial and performance outcomes at the nation's colleges and universities for students six and 10 years after enrollment.

By the time the website was completed, the visible data reflected only the 10-year outcomes. The six-year data doesn't appear in the site's main dashboards and isn't easily accessible to the public through the Scorecard website. That creates confusion for visitors, since many of the stories and rankings built on the data use both the 10-year and six-year datasets, or the six-year dataset alone.

The Scorecard has made it easier to produce rankings, but those rankings are not necessarily better information than what was previously available. Media organizations often filter the data through lenses specific to their audiences, but they fail to explain why the data they are using matter, or what filters they applied to arrive at their final subset.

Defining 'upward mobility'
Days after the scorecard was released, National Public Radio's "Planet Money" put together a collection of rankings, including a list of schools that emphasize upward mobility. The list looked at the percentage of students receiving Pell Grants, which go to low-income students; the net price of college for families making less than $48,000; the number of first-generation college students at an institution; the four-year graduation rate; and median income 10 years after enrolling, among other things.
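
Any list like this rests on a pile of judgment calls. The sketch below shows, in rough terms, how such a composite ranking might be assembled; it is not Planet Money's actual formula, and the schools, values, and column names are hypothetical stand-ins for the Scorecard fields:

```python
import pandas as pd

# Illustrative data only -- hypothetical schools, not Scorecard records.
schools = pd.DataFrame({
    "name":          ["College A", "College B", "College C"],
    "pct_pell":      [0.15, 0.45, 0.30],    # share of students on Pell Grants
    "net_price_low": [4000, 9000, 6500],    # net price, families under $48,000
    "grad_rate":     [0.96, 0.60, 0.75],    # four-year graduation rate
    "earn_10yr":     [89000, 41000, 52000], # median income 10 years after entry
})

metrics = ["pct_pell", "net_price_low", "grad_rate", "earn_10yr"]
z = (schools[metrics] - schools[metrics].mean()) / schools[metrics].std()
z["net_price_low"] *= -1  # a lower net price is better, so flip its sign

# Equal weights are themselves an editorial choice: a different weighting
# would produce a different "best" list.
schools["score"] = z.mean(axis=1)
print(schools.sort_values("score", ascending=False)[["name", "score"]])
```

Every step embeds an editorial decision--which metrics to include, how to normalize them, how to weight them--and changing any of them changes who comes out on top.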

The top three schools out of 50 were Harvard University, the Massachusetts Institute of Technology, and Stanford University. All three had low net prices for low-income families, graduation rates above 90 percent, and median graduate incomes of more than $80,000. All of those statistics are impressive, but what does that really tell us? Is anyone surprised that graduates of elite colleges and universities earn large salaries?

Which numbers matter most?
At the Chronicle of Higher Education, Andy Thomason put together five rankings using the Scorecard data. He noted that there are many caveats, including that the income data cover only students who received federal financial aid. Still, he created a list of the lowest median incomes among colleges whose students scored an average of at least 1400 on the SAT--a very specific ranking indeed.

"Low" is a relative term. The federal Scorecard salary data is a point-in-time capture of 2011 incomes, and it shows that the median income for college graduates ages 25-34 at that time was $46,570. All but one school in the SAT-filtered pool had median earnings higher than that, and the single college that didn't had median earnings of $46,100, a mere $470 below the benchmark.

We also need to consider whether median salary should be the final metric of college value in the first place. Our students need to be able to earn a living, of course, but they also need to find fulfilling careers. Should rankings penalize a school that graduates more teachers and social workers than lawyers, engineers, and doctors?

Factors such as salary, graduation rate, debt at graduation, and student loan default rate all matter, but so do harder-to-measure factors such as career satisfaction. Any ranking that doesn't consider the student experience and the kinds of careers students find after graduation is missing a big part of the picture.

From the questionable to the sensationalized
While some rankings were simply unhelpful, others, unfortunately, were sensationalized.

StartClass.com ran a list in November that purported to show colleges whose alumni make less than high school graduates. The site listed the 25 schools with the lowest median earnings for students six years after enrolling, many with median earnings below $30,000, the median salary for high school graduates ages 25 to 34.

The biggest problem with the StartClass comparison is that its two numbers don't describe comparable groups. The $30,000 figure the site uses for the median salary of high school graduates is calculated for people seven to 16 years past high school graduation (aged approximately 25-34). The federal Scorecard figure is point-in-time data for students six years after they enrolled in college, whether or not they graduated (aged approximately 24-25).

Their ranking, therefore, compares the salaries of those who have been active in the workforce for between seven and 16 years against those who have been in the workforce for, at most, two years.

It's even more egregious when you consider that the StartClass site had the appropriate data--median salary 10 years after enrollment--but failed to use it when deciding which colleges' students "earned less than high school graduates." If it had, the list would have been only one item long, because only one school in the dataset had students earning less than $30,000 10 years after enrolling.
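
For readers who want to run the fairer comparison themselves, here is a minimal sketch, assuming the Scorecard's downloadable CSV (the file name below is a placeholder) and its MD_EARN_WNE_P10 and INSTNM columns for 10-year median earnings and institution name:

```python
import pandas as pd

# Placeholder file name for the Scorecard's public CSV release.
df = pd.read_csv("college_scorecard.csv", low_memory=False)

# Suppressed values (e.g., "PrivacySuppressed") become NaN and drop out.
earnings = pd.to_numeric(df["MD_EARN_WNE_P10"], errors="coerce")

# $30,000 is the median for high school graduates aged 25-34, so the
# 10-year column is the closer cohort match -- not the six-year one.
HS_MEDIAN = 30000
below = df.loc[earnings < HS_MEDIAN, ["INSTNM", "MD_EARN_WNE_P10"]]
print(below)
```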

College rankings will continue to be controversial, and no matter how objective they claim to be, they are the product of imperfect measurements and decisions about the importance of one factor over another. But when rankings offer no context for the data they use, they can be intentionally or unintentionally misleading.

We welcome accountability for graduate outcomes, and we recognize the value of a scorecard that gives students the tools to make an informed decision about their college journey. However, until scorecards can take into account aspects of a college education that can't be quantified by income figures--campus experience, career fulfillment, and contribution to society--they will not present an accurate comparison.
