After I graduated from college three months ago, I was eager to land my first real job. My goal was to be a software developer. I started programming in high school, but I didn't consider making the "official" switch to computer science until it was too late to change my major. That was no big deal to me; I was certain I could do the work. Many of my potential employers, however, couldn't get past the B.A. in Economics on my resume. They pictured me working with Excel, not writing code.
I knew there were lots of companies hiring developers, but if I wanted a career in technology, I had to rethink my whole strategy. Fortunately, I managed not only to get my foot in the door but also to make (what feels like) a grand entrance into the world of technology. I joined HuffPost's tech team two months ago and I've been doing work that would make any parent proud. Since then, many people have asked me how I made the switch from "business guy" to "tech guy."
This is an important topic, not only to me but also to the nation as a whole. The United States is simultaneously experiencing a job crisis and a skills gap. There are currently three million unfilled jobs in America, yet millions of people are unemployed. So, if you are looking to break into the world of technology, it is both possible and exciting. Our own HuffPost CTO John Pavley is a self-taught hacker. If we can do it, so can you.
1. Pick your battle
The most important thing you can do to enter any new industry is to think like an economist. Ideally, you want to position yourself in an area of high demand that is currently experiencing a shortage of developers. There, employers will be more likely to overlook your liberal arts degree if you have the right skills. This mostly applies to your first job as a developer. Afterwards, employers will care less about your background and more about your professional experience.
In my case, I focused on iOS, the platform that powers Apple's mobile devices. I chose iOS mainly because I'm a hopeless Apple fanboy. However, I like to think that there were also some non-emotional reasons for choosing iOS. For example, as everyone already knows, anything mobile is on fire. The adoption of smartphones has been 10 times faster than the adoption of personal computers in the '80s. As for the iOS versus Android debate, the latest figures indicate that most companies are spending their limited resources getting on iOS before Android.
Apple's mobile platform passes with flying colors on the demand side, but what about the supply side? Is there a shortage of iOS developers? Yes there is. Everyone and their mother wants to be on your mobile device, but there aren't enough mobile engineers to meet this growing demand. There are also significant barriers to entry for iOS developers. On the technical side, iOS uses a programming language called Objective-C that is used almost exclusively on Apple's platforms. Anyone who wants to learn my trade will have to pick up a largely non-transferable language. On top of the "language barrier," there are also significant monetary barriers for iOS developers. There is a $100 yearly developer fee, payable to Apple, if you want to publish iOS apps. In addition, you can only develop apps for iOS on a Mac, which can easily set you back well over a thousand dollars. These monetary barriers keep a lot of talented people out of iOS, especially abroad.
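To give you a taste of that "language barrier," here is a minimal, illustrative "hello world" in Objective-C. It isn't from any real app; it just shows the square-bracket message-sending syntax and @-prefixed literals that trip up developers coming from C-style languages:

```objectivec
// A complete "hello world" in Objective-C. The square-bracket
// message-sending syntax and the @-prefixed string literals are
// both unfamiliar to developers used to C-style languages.
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSString *name = @"world";
        // [receiver message:argument] instead of receiver.method(argument)
        NSString *greeting = [NSString stringWithFormat:@"Hello, %@!", name];
        NSLog(@"%@", greeting);
    }
    return 0;
}
```

Once the bracket syntax clicks, the rest of the language is closer to familiar C than it first appears.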
2. Acquire the skills you need
All this talk about supply and demand does not mean you can sit back and relax once you choose a technology relevant to today's market. The economics just makes potential employers more willing to give you a chance. You still have to work as hard as, and maybe even harder than, everyone else.
I group the required skills you need to acquire into two categories. First, there are the technical skills that you are hired for, be it Java, PHP, Python, you name it. Second, there are the core principles of computer science that all good technologists need to know. These core principles include topics such as data structures, algorithms, software architecture, computer architecture, and so on.
You are not going to use most of this knowledge on a daily basis, but omitting this foundation would be like trying to become a surgeon without knowing basic biochemistry. Sure, you can still cut people up and do just fine. In the long run, however, you can only achieve excellence by knowing what is possible and impossible in your field, knowing what has been done before and understanding why things are the way they are.
If you think that's a lot to learn, you are right. In terms of picking up the practical skills, there is a sea of resources at your fingertips. So many, in fact, that you may feel overwhelmed. There are free tutorials and video classes for most popular languages under the sun. For example, I used a combination of Stanford's iOS programming course, Lynda.com's iOS course and the examples on Apple's website. Also, don't forget to get help from human beings. The Internet is great, but sometimes the fastest way to learn is from real people.
Learning the core principles is a bit trickier. College is usually not good for practical skills, but it is great for the big ideas and the core foundation. If you are still in school, you can get 80 percent of the required understanding with two or three college-level computer science classes. If you are out of school, you can take advantage of the many free resources at your disposal. MIT and Stanford have most of their courses online. If you need more structure and motivation, enroll in a class at your local college or university. It will cost money, but the investment is worth it.
3. Accumulate demonstrable experience
Even if you pick the right technology and learn everything you need to learn, hiring you still represents a huge risk. If you have no relevant experience, how does the employer know you are as good as you say you are? On one hand, you could be a perfect fit and eventually become a great engineer. On the other hand, you could fail miserably and they would have to find your replacement. Hiring is very costly for companies so you have to mitigate the risk you bring to the equation.
The only way you can prove that you really do possess the skills they need is by accumulating demonstrable experience. The key word here is demonstrable. I could have spent six months poring over every iOS book ever published but at the end of the day I would have had nothing to show for it.
The best way to accumulate demonstrable experience is with a professional portfolio. If you haven't already, head over to your favorite domain registrar and register yourname.com (e.g. pietrorea.com for me) or the closest variation you can find. This will be where you showcase your work. You can also add an "About" page where you discuss your background. Also make sure to create a GitHub account, which lets you publish your code publicly so others can see it.
As you learn your chosen technology, make sure you spend time developing toy applications that you can include in your personal portfolio. If you want to be a web developer, create a couple of nifty websites. If you want to go into iOS like I did, develop a couple of simple apps and submit them to the App Store. In my case, I developed an invitations-based iOS app with a classmate and created a YouTube video about it.
At first it may seem as if all the tech jobs are reserved exclusively for computer science majors. The truth is that a degree in a technical field is just a piece of paper. Technology moves so fast that most of what is used in the real world is not taught in classrooms anyway. In the tech world, eagerness to learn trumps formal training. With the right skills and a little bit of luck, you too can be a developer.