I Learned Nothing in College


Ok, ok, maybe that's an exaggeration. I learned some things in college. I learned that I can't stomach cheap beer; I'd rather play drinking games with a box of wine. I learned what kind of guys I should never date again. I learned how to live with kids my own age, and, quite frankly, that I do best living alone. Most importantly, I met two English professors who had a huge impact on my life, who taught me not just how to write but also how to believe in myself. Meeting them made my college years worth it.

For the most part, as I reflect on my undergraduate experience, I feel that it was a colossal waste of time. My memories are a vague blur of hanging out with friends and cramming for exams, then forgetting all the information on my way out the door.

Is college worth it? In terms of getting a job, it's all about who you know and getting lucky. College is important because you need that little slip of paper that says you graduated to qualify for a job. But no one cares where you went to school unless it's an Ivy League school or a well-known local university, like USC in Los Angeles. When I interned at NBC, the head recruiter told us that she never looked at GPAs or read cover letters. Yikes.

Now of course, if you want to be a doctor or lawyer or accountant or some other trade-specific professional, you need a specialized education. But most of that is learned in graduate programs. In my experience, as someone who works in media, I never needed any training. I needed internship and writing experience. Another example: My best friend works in finance. She just passed her Series 7 exam, and she was an acting major in college. Another friend was an econ major in college and now works as a yoga instructor. It doesn't add up.

Apparently, "back in the day," school was about learning. A friend of my parents told me he used to be so busy with his academics that he didn't have time to think about socializing. Wow!

The problem isn't just that college is a waste of time; it's that I think college is actually harmful for the students who attend. It's a breeding ground for a toxic, misogynistic, elitist social culture. A place where self-esteem goes to die. Think Greek parties with themes like "CEOs and Office Hoes." Think date-rape cover-ups and roofies. Is it any surprise that young men and women go off to school and fall into deep depressions or attempt suicide? (According to the National Institute of Mental Health, 30 percent of college students report feeling so depressed that they have trouble functioning, and suicide is the third leading cause of death for ages 15 to 24.)

I wish I could go back to high school and shake my 17-year-old self, who was dying to get into Dartmouth, and just tell her: none of this matters! I'd tell her to pick somewhere fun in a big city where she could meet cool friends and have a vibrant social life, instead of being stuck on some knoll with nothing to do but drink.

My attitude may seem cynical or lackadaisical, but I honestly am worried for college students. I believe the college experience is like a riptide. Instead of leading youth towards academic fulfillment or a higher sense of purpose and self, the college experience is leading them away from themselves. These kids are vulnerable. They are young, away from home, full of dreams and hormones. I'm worried they're going to drown.
