College officers understand that not everyone can visit every campus; between plane fares, rental cars, and hotel stays, the trips get costly. Students can visit colleges in their own area to get a feel for campus life and for what they want in a school.
At the Rochester Institute of Technology, where I serve as President, we survey all graduates 90 days after graduation to find out how many of them ha...
It is sadly ironic that growing acceptance of the notion that an education is and should be a private good arises, in large part, from the dramatic disinvestment in public higher education.
I'm not sure that alternative diets with some salient feature to define them need to compete with one another. A second and related concern is that this kind of exercise may tend to foster a preoccupation with labels, rather than compositional details.
It's tough to get rich saving the world, and graduates committed to social justice have a hard time competing in earned income with their counterparts who study, say, engineering, math and computer science.
Just two of the 14 kids who applied to Dartmouth from Jeff's high school last year were accepted. So why is Dartmouth beckoning him with bi-weekly emails and come-hither glossy fliers?
It's that time again, when colleges and universities report their admissions statistics, graduation rates and other numbers, and groups compile and analyze those statistics to rank our nation's top higher education institutions.
Insignificant changes in the rankings among elite institutions often make headline news in the national media. This fosters false perceptions among parents and policymakers alike about what the rankings actually measure, and further contributes to the growing income disparity between the wealthy and those of modest means.
What I recommend to students is to create their own ranking. They should start by writing down their top priorities for a college, such as major, location, size, internships, the ability to participate in sports, international study, and so on.
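A do-it-yourself ranking like this amounts to a simple weighted score: rate each school against your own criteria and sort by the total. The sketch below illustrates the idea; the school names, criteria, and weights are all hypothetical examples, not real data.

```python
# Personal college ranking as a weighted score -- your criteria, your weights.
# All names and numbers below are made-up illustrations.

# Your priorities, weighted by how much each matters to you (weights sum to 1).
weights = {
    "has_major": 0.30,
    "location": 0.20,
    "size": 0.15,
    "internships": 0.20,
    "study_abroad": 0.15,
}

# Rate each school 0-10 on every criterion after your research and visits.
schools = {
    "College A": {"has_major": 10, "location": 6, "size": 8,
                  "internships": 9, "study_abroad": 5},
    "College B": {"has_major": 8, "location": 9, "size": 5,
                  "internships": 6, "study_abroad": 10},
}

def personal_score(ratings):
    """Weighted sum of your own ratings -- your ranking, not a magazine's."""
    return sum(weights[c] * ratings[c] for c in weights)

# Sort schools by your score, highest first.
ranking = sorted(schools, key=lambda s: personal_score(schools[s]), reverse=True)
for school in ranking:
    print(f"{school}: {personal_score(schools[school]):.2f}")
```

The point of the exercise is that changing the weights reorders the list: a student who cares most about study abroad gets a different "number one" than one who cares most about internships, which is exactly why a single published ranking can't answer the question for everyone.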
As an information, data, and statistics junkie, I rarely find statistics per se to be deceptive -- incorrectly applied, yes, but deceptive, no. What can be deceptive in statistics, however, are the underlying assumptions and the lack of a full description of the data.
The graduation rate calculation needs to account for students who eventually are able to complete their degrees, against steep odds, while preserving the incentive for full-time students to complete their work in a timely fashion.
Stick to the rankings? Which rankings: Those that measure the quality of teaching? The quality of research? The best program in your intended major? The most accessible professors?
When did we lose our high hopes for personal transformation in higher learning? What cynic convinced us that the idealism of a life spent working in the public interest is worth much less than a life spent making money in furtherance of corporate interests?
After we graduate, what truly matters is how well the school prepared us for the real world. Did we leave campus mentally, physically and financially ready for what's to come? Did our schools help us become successful young professionals?
Is Norway less corrupt than Luxembourg? More developed than Switzerland? Slightly more failed than Finland? Given the limited data and even more limited number of indicators these indices use, the answers to those questions are a restatement of your methodology, not a useful analysis of the conditions in those countries.
Schools track how different types of students perform and the IRS and Social Security Administration track incomes. Those datasets just aren't talking to each other.