Universities Look For New Ways To Rank Themselves

So Many College Rankings, Which One Do You Choose?

This article comes to us courtesy of the Hechinger Report, where it was originally published.

He may be the leader of the free world, but when President Barack Obama proposed that the government grade universities based on their cost and success rates, a lot of other people were ahead of him.

At a time when students and their families are demanding to know what they’re getting for their mounting investments in higher education, several foundations and research centers are already working on new ways to show them.

Even some universities and colleges themselves — reasoning that it’s better to come up with their own ratings than have them imposed by someone else — are quietly working on new ways to gauge what graduates learn and earn, though many remain reluctant so far to make the results public.

“One thing everyone seems to agree on is that we should have a good way for people to choose where to go to college,” said Zakiya Smith, strategy director at the Lumina Foundation, which is offering $10,000 in a crowd-sourced competition to find the best way to make an existing U.S. Department of Education website, the College Scorecard, more user-friendly.

Obama has proposed that the government publicly rate colleges and universities by 2015 based on such things as average student debt, graduation rates, and graduates’ earnings.

“The answers will help parents and students figure out how much value a college truly offers,” the president said in a speech at the University at Buffalo.

That’s information consumers increasingly want. In a survey released in January by Hart Public Opinion Research, 84 percent of respondents supported the idea of making colleges disclose information about graduation, job-placement, and loan-repayment rates.

“People are looking at, ‘Where do we get the biggest bang for our buck,’” said Terrell Halaska, a partner at the higher-education consulting firm HCM Strategists. “They’re desperately looking for high-quality consumer information. They don’t know where to turn. There are 1,000 different ranking systems out there.”

For their part, universities have responded with official skepticism to the idea that the government should add yet another one. But some are privately working on their own ratings systems.

With money from the Bill & Melinda Gates Foundation, 18 higher-education institutions have been at work on something called the Voluntary Institutional Metrics Project, coordinated by HCM, which proposes to provide college-by-college comparisons of cost, dropout and graduation rates, post-graduate employment, student debt and loan defaults, and how much people learn. (Gates and Lumina are among the funders of The Hechinger Report, which produced this story.)

It’s that last category that has proven trickiest. After two years, the group still hasn’t figured out how to measure what is, after all, the principal purpose of universities and colleges: whether the people who go to them actually learn anything, and, if so, how much.

The many existing privately produced rankings, including the dominant U.S. News & World Report annual “Best Colleges” guide, have historically rewarded universities based on the quality of the students who select them, and what those students know when they arrive on campus — their SAT scores, class rank, and grade-point averages — rather than what they learn once they get there.

U.S. News has been gradually shifting toward incorporating in its rankings such “outputs” as graduation rates, the publisher says.
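To make the input-versus-output distinction concrete, here is a minimal sketch in Python of how a weighted-sum composite ranking behaves. The school names, indicator values, and weights are all invented for illustration; no real publisher's methodology, which is typically proprietary and far more elaborate, is implied.

```python
# Hypothetical illustration: how indicator weights reorder a ranking.
# School names, numbers, and weights are invented for this sketch and
# do not reflect any real publisher's methodology.

schools = {
    "College A": {"sat": 1450, "grad_rate": 0.78},  # strong "inputs"
    "College B": {"sat": 1250, "grad_rate": 0.91},  # strong "outputs"
    "College C": {"sat": 1350, "grad_rate": 0.85},
}

def normalize(values):
    """Rescale raw indicator values to the 0-1 range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def rank(weights):
    """Rank schools by a weighted sum of normalized indicators."""
    names = list(schools)
    sat = normalize([schools[n]["sat"] for n in names])
    grad = normalize([schools[n]["grad_rate"] for n in names])
    scores = {n: weights["sat"] * s + weights["grad_rate"] * g
              for n, s, g in zip(names, sat, grad)}
    return sorted(scores, key=scores.get, reverse=True)

# Weighting inputs heavily rewards who enrolls; weighting outputs
# heavily rewards what happens to students afterward.
print(rank({"sat": 0.8, "grad_rate": 0.2}))  # College A comes out first
print(rank({"sat": 0.2, "grad_rate": 0.8}))  # College B comes out first
```

Shifting weight from an input like SAT scores to an output like graduation rates is enough to reorder the schools, which is why the choice of indicators matters as much as the underlying data.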

Still, the most popular rankings “have been almost completely silent on teaching and learning,” said Alexander McCormick, director of the National Survey of Student Engagement, or NSSE — yet another attempt by universities themselves to measure their effectiveness. And that, he said, is “like rating the success of hospitals by looking only at the health of their patients when they arrive.”

NSSE, which is based at the Indiana University School of Education, seeks to change that calculation. Each spring, it surveys freshmen and seniors at as many as 770 participating universities and colleges about their classroom experiences, how much they interact with faculty and classmates, whether their courses were challenging, and how much they think they’ve learned.

But the project also spotlights a big problem with potentially valuable ratings collected by the institutions themselves: The schools are often unwilling to make them public.

“This tells you something about the sensitivity that exists right now about comparisons of institutions,” McCormick said. “A lot of institutional leaders essentially said, ‘If this is going to be public, we’re not going to do it.’ ”

So while it was conceived in 2000 with great fanfare as a rival to the U.S. News rankings, NSSE remains obscure and largely inaccessible. The results are given back to the participating institutions, and while a few schools make some of them public, others don’t, thwarting side-by-side comparisons.

There are other drawbacks to letting universities rate themselves. One is that the information is self-reported, and not independently verified, potentially inviting manipulation of the figures. In the last two years, seven universities and colleges have admitted falsifying information sent to the Department of Education, their own accrediting agencies, and U.S. News: Bucknell, Claremont McKenna, Emory, George Washington, Tulane’s business school, and the law schools at the University of Illinois and Villanova.

Also, surveys like the one used by NSSE depend on students to participate, and to answer questions honestly. Last year, fewer than one-third of students responded to the NSSE survey.

“We depend on them to provide candid answers,” McCormick said. “But students aren’t stupid. It wouldn’t take long for them to figure out, ‘The way I fill out this survey will affect where my institution comes out in the pecking order.’ ”

Student surveys are nonetheless a major part of another planned ranking of universities called U-Multirank, a project of the European Union.

Recognizing that it’s not always possible to compare very different institutions — as universities themselves often argue — U-Multirank will measure specific departments, ranking, for example, various engineering and physics programs.

Of the more than 650 universities that have signed on, 13 are American; the first rankings are due out at the beginning of next year.

“It doesn’t make sense to rank universities only on the level of the university as a whole,” said Frank Ziegele, managing director of Germany’s Centre for Higher Education and one of the coordinators of the project. “The existing rankings focus on a very narrow range of indicators, such as reputation and research, but they’re perceived as being comprehensive.”

Ziegele said his project will use statistical methods to weed out dishonest answers from students on surveys about their educations.

“You could never prevent completely that there are some answers that are biased, but as soon as we have doubts we would choose to leave out that indicator for that university,” he said.
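The article does not say which statistical methods U-Multirank will use. One common screening technique for this kind of problem is a robust outlier test; below is a minimal Python sketch using the median-absolute-deviation “modified z-score,” with all university names and scores invented for illustration.

```python
# Hypothetical sketch of one standard way to screen survey data for
# implausible response patterns: a robust "modified z-score"
# (Iglewicz & Hoaglin) built on the median absolute deviation.
# U-Multirank's actual method is not described in the article;
# all names and numbers here are invented.
from statistics import median

# Invented per-university mean satisfaction scores (1-5 scale).
uni_means = {
    "Uni A": 3.90, "Uni B": 4.00, "Uni C": 3.80,
    "Uni D": 4.10, "Uni E": 3.95,
    "Uni F": 4.98,  # implausibly high: candidate for exclusion
}

values = list(uni_means.values())
med = median(values)
mad = median(abs(v - med) for v in values)  # median absolute deviation

# The 0.6745 factor makes the score comparable to an ordinary z-score
# for normal data; |score| > 3.5 is a conventional outlier cutoff.
for uni, v in uni_means.items():
    score = 0.6745 * (v - med) / mad
    verdict = "drop this indicator" if abs(score) > 3.5 else "keep"
    print(f"{uni}: modified z = {score:+.2f} -> {verdict}")
```

A median-based test has the property Ziegele alludes to: a single implausible batch of answers cannot drag the baseline toward itself, so it stands out clearly and the affected indicator can simply be left out for that university.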

The League of European Research Universities, which includes Oxford and Cambridge, is already refusing to take part, as are some other institutions. Many of those already do well in existing global rankings, including the ones produced by Times Higher Education magazine, the publishing company QS Quacquarelli Symonds, and the Shanghai World University Rankings.

For all of this activity, there’s evidence that students and their families don’t rely on rankings as much as university administrators seem to fear. Rankings placed a mediocre 12th on a list of 23 reasons for selecting a college that students gave in an annual survey by the Higher Education Research Institute at the University of California, Los Angeles.

Still, said Smith, of the Lumina Foundation, “People appreciate information. When you buy a car, a lot of things may come into consideration, but you still want to know what the gas mileage is. And you have the right to know.”
