This post was published on the now-closed HuffPost Contributor platform. Contributors control their own work and posted freely to our site.

I went to law school but never practiced law in a law firm. I took jobs in public interest work that paid significantly less than my peers were earning in the first years after law school. Soon I moved into a different line of work entirely, but even today, because of my personal choices, I earn a lot less than other college presidents, and I do not regret that for one nanosecond.

I was a Political Science major in college, but never went into politics (in the traditional sense, at least!).

Should my personal choices be held against my law school, Georgetown? My undergraduate alma mater, Trinity?

Actually, my choices should redound to their great credit since both taught me that there's more to life than making money, that the ultimate reward of a great education is found in the lives I can help and even change through my work.

But nowhere in the recently released White House "college scorecard" can I find a section about a school's worth in inspiring careers in public service, working for social justice, living simply, not exalting money as the only benchmark worth talking about. Instead, the scorecard equates excellence with salaries, a not-so-subtle suggestion that schools whose graduates earn less than others must not be very good, which ignores a whole host of other educational values.

Where's the checkbox for the fact that I am immensely satisfied with my career choices, life's work and personal fulfillment? Don't look; it's not there.

The White House says that the scorecard is a way to pick a college, that it "...makes it easier for you to search for a college that is a good fit for you." The scorecard does no such thing. It simply displays five data points that are all about money. Nowhere does the scorecard address actual learning or other important "fit" characteristics of the total collegiate experience.

The scorecard is the latest example of how, when it comes to education policy at all levels, Washington foments the tyranny of data. Such reductionism trumps institutional mission, sound faculty judgment about student learning, and assessments of student outcomes that do not always fit into some wonk's tidy scorecard. Many educational outcomes defy quantification because great teaching and learning, done well, enlarge intellect beyond the rote regurgitation of facts on bubble sheets, shape perspectives on the human condition, illuminate ethics and values, and promote the research and exploration that discover new knowledge that can actually change the way people live, communicate and thrive.

Mission used to count for a lot in choosing a college. A small, private, woman-centered university like Trinity could not be more different from a large public university like George Mason -- yet the scorecard treats us as virtually interchangeable. Our students, however, choose us precisely because of that difference -- some love the big campus, some want small classes, some desire lots of football games, some prefer a school that has a more artistic bent. Even among institutions with apparently similar missions, the differences can be quite stark: Trinity is quite different from Catholic University even as VCU is different from UVA. The glory of American higher education used to be its great diversity of institutional type -- recognizing that students are all very different, and having choice among different genres of higher education is a great advantage in this country -- but in the hands of federal regulators, we are all whirled together in the data blender to come out as some kind of gray tasteless soup.

The "college scorecard" is certainly well-intended. Who can object to an effort to provide data comparing colleges and universities in one easy-to-read format? The intention is not the problem, but the product is ridiculous in the data it chooses to present, duplicative of other websites that offer more and better information, and misleading for student consumers who should never choose a college solely on the basis of four or five data points.

The scorecard presents several data points that, despite their apparent utility, are largely unhelpful to many students. Let's consider some of them:

Net Price and Borrowing Costs: Though this is probably the most useful data on the scorecard, it is still quite deficient in that it masks the actual bottom-line expenses that students might pay after all financial aid is calculated. A student with great need will pay much less; a student with little need will pay more. Residence on campus changes the calculation dramatically, as do additional fees for specialized courses and programs. No student could tell by looking at the number how much he or she will really pay. The only way to get the actual number is to work with the financial aid offices at the schools the student applies to in order to get realistic estimates of likely expenses. Moreover, the net price is completely neutral on the question of the quality and value of the education a student will receive at the college. See this excellent essay by Alison Byerly at insidehighered.com on how to interpret "Affordability and Value."

Graduation Rate: It is simply outrageous that the U.S. Department of Education continues to use a data point that it acknowledges is flawed and does not really measure what the name claims -- the rate is a statistic of brand loyalty, counting how many "first time full-time" students stay at the institution where they start and graduate within six years. Students who transfer and complete elsewhere are treated as drop-outs. About half of U.S. undergraduates transfer, so the data is truly misleading. Schools that enroll large numbers of non-traditional students are profoundly hurt by this statistic, since those students often take courses in many places and take longer to finish -- and yet those schools may well be the best places for such students. USDE should stop using this factoid until it redresses the terrible problems with the data. Instead, how about using degree attainment rates -- how many students who enroll eventually complete degrees? That's different from the "graduation rate."

Default Rate: It's not clear what utility this factoid has for prospective students, since it reveals the behavior of past graduates, and the rates for the recession years are much higher than the norm. Moreover, graduate and professional students, who can borrow significantly more money, are also included in the rate. It blends too much together while not being particularly predictive of the collegiate experience for prospective undergraduates.

Average Earnings of Graduates: USDE has not yet populated this part with any data, but the note says it will do so soon -- and yet, to my knowledge, the Department of Education has not even asked institutions to provide the data, and it's unclear how we would get the data other than voluntary surveys of our graduates. More to the point, the presentation of this factoid on the scorecard suggests that USDE has bought into the view that the only thing that counts is money -- so a school that graduates a lot of social workers, teachers and clinical psychologists is not going to score well on this point, while those places that have a lot of engineers and computer scientists will look great. (And, by the way, that will have the salutary effect of proving a point about pay equity since the lower paying professions tend to have more women.) While not on the form -- yet -- there's also regulatory murmuring about gathering additional employment data in relation to student majors -- heaven help the English, History and French Departments when the feds find out that most of their majors probably are not working in literature, research or romance. (I know several French majors who turned out to be very fine technology executives.)

Sure, we should tell our prospective students what our graduates do -- and so we do. The idea that this information is "secret" is preposterous, and reveals that whoever designed the scorecard clearly has not gone on an admissions tour recently -- or perhaps had a single bad experience that's now exalted to federal policy (not an unusual result from the types of regulations we've seen recently). Touting the good jobs of graduates is a time-honored admissions practice. Just look at the viewbooks and testimonials on websites. We have millions of satisfied customers, despite the negative pall of the federal insinuation that perhaps nobody really gets work after graduation.

Nowhere on the scorecard is a student advised to visit the college or university, talk to current students, think about the alignment of the student's own interests and abilities with what the school offers academically, socially and spiritually.

The scorecard is agnostic on such important topics for undergraduates as the ready availability of services for students with disabilities, or the climate for women on campus, or the availability of residence hall spaces where students can live together while learning a new language.

Rather than exalting money as the only benchmark worth considering, a truly helpful college scorecard would ask the student to measure his or her prospective choices according to questions like these:

  • How often will I get to speak to a real professor, not just a teaching assistant?
  • Will I experience a good deal of diversity among students, faculty and staff, or is everyone the same?
  • If I'm not a Division I athlete, will I still get a chance to play a sport that I love for recreation?
  • Can I feel comfortable on the campus even though I might be different from other students?
  • Will the faculty be able to work with my learning differences in ways that will help me to master collegiate level study and research?
  • Do women have opportunities to hold campus leadership positions in this institution?
  • Will I have an opportunity to have a few great internships so I can figure out what kind of career field I might enjoy?
  • What are the campus health services like, and will they be able to support my needs?
  • Do I like the people I meet when I visit?
  • Will I be safe on this campus?
  • Is there a clear value center in the educational program at this university?

Yes, those are many questions, perhaps too many for the White House website. But those are the kinds of questions prospective students must probe in making any college choice.

Choosing a college is not like buying a toaster -- or buying a car online, as I heard former Secretary of Education Margaret Spellings say to a group of college presidents one day. Choosing a college is one of the most profoundly serious life choices a person can make, because the right college or university can begin a process of personal change and growth that lasts a lifetime.

There's no real data point for that. But it's the kind of "return on investment" over the years that makes so many alumnae and alumni so rabid about their alma maters.

It's fine for President Obama to say that colleges and universities must return quality and value for the investment of tuition dollars, and in fact, most of us do far better at that than any federally created scorecard will ever be able to demonstrate.

But rather than reducing the value of higher education to a few disconnected factoids, the president should urge students and families to be better consumers by visiting colleges, asking more probing questions, and making choices that are right for the student from an educational perspective -- not because the football team won a bowl game or somebody famous went there.

Choosing a college is not easy; the administration should stop implying that it should be.
