Fifteen years in the making, the National Research Council's ratings of doctoral programs are -- according to Jeremiah P. Ostriker, chairman of the project committee -- "a little unsatisfactory."
Inside Higher Ed reports that the rankings are the culmination of a five-year process and more than $4 million spent on research -- and that they mark the first such publication since 1995. The NRC looked at 5,000 programs in 62 disciplines across a total of 212 universities. The data are analyzed and reported according to two different sets of ratings. In the "S" method, faculty members in each of the 62 fields were asked what they look for in a successful program, and the surveyed programs were rated against those criteria. In the "R" method, faculty members were asked to list their favorite programs, and the criteria for excellence were inferred from those preferences.
In addition to offering two different sets of ratings, the NRC states only a confidence interval for each program's rank. Rather than, for instance, stating that a certain program ranks 15th among its peers, the NRC report might say that it ranks between 7th and 29th, according to the "R" ratings, with a 90 percent level of confidence.
Needless to say, many are skeptical about the usefulness of the ratings. The rankings have been met with concern not only because they may be outdated (the information was gathered between 2005 and 2006, and was supposed to have been published in 2007) and may employ incorrect information (Washington University has already challenged the accuracy of its listings), but also because they are too confusing to be of use.
Nevertheless, academics are excited by the potential uses of the data collected by the NRC. According to the Chronicle of Higher Education, the quarter of a million data points collected may become useful later on. The Chronicle reports that Debra W. Stewart, president of the Council of Graduate Schools, said on Monday:
There's going to be a short-term response and a long-term response to this report. The long-term response will be the important one. I think that the framework of this report will help support an ethos of continuous improvement.
Inside Higher Ed compiled a table of the possible top three programs, according to each ranking method, in each discipline. Below, check out some of the programs that ranked consistently well across the board.
Do you think these results were worth a 15-year wait? Do you understand the rankings? Let us know below.