In his State of the Union address, President Obama proposed that Congress increase the minimum wage to $9.00 per hour.
Almost immediately, a chorus of opposition grounded in neoclassical economics emerged, arguing that such a change would kill job creation. As former Bush Administration economist Greg Mankiw notes, "there is 79 percent agreement among his peers that a minimum wage increases unemployment among young and unskilled workers." But to be clear, what Mankiw really means is 79 percent agreement among neoclassical economists.
The neoclassical economic argument against the minimum wage is grounded in the view that if a worker and an employer agree on a wage, then that wage must be welfare-maximizing for both of them and, by definition, for society. The only thing a government-regulated price for labor can do is distort labor markets and lead to less, not more, economic welfare.
In fact, a higher minimum wage would spur economic growth, while also increasing economic fairness.
First, here's why the president's proposal won't kill jobs. The argument among neoclassical economists over the minimum wage is too narrowly focused on microeconomics. In other words, they examine the impact of higher minimum wages on the individual firms and workers subject to them, as opposed to the economy as a whole. While a study by Neumark, Salas, and Wascher found that a higher minimum wage did result in fewer jobs in the affected firms, others by Card and Krueger and by Dube, Lester, and Reich found that higher wage requirements had no negative impact on employment. But regardless of what these studies conclude, they have at least two limitations. First, they look only at first-order effects; in other words, whether a higher minimum wage leads an employer to hire fewer workers. But this is not the right way to look at it. To assess employment impacts accurately, we need to look at second-order effects as well. An employer facing higher labor costs may or may not employ fewer workers, but either way the workers who remain employed spend more money (since their wages are higher), and that increased spending creates other jobs.
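The second-order logic above can be sketched with a toy calculation. Every number below is hypothetical, chosen only to illustrate how the spending channel can offset first-order job losses; it is not a model of the actual U.S. labor market.

```python
# Toy illustration with hypothetical numbers: a higher minimum wage may
# cut some jobs directly (first-order effect) while the extra spending by
# workers who keep their jobs supports other jobs (second-order effect).

workers = 1000               # workers at the old minimum wage (hypothetical)
old_wage, new_wage = 7.25, 9.00
hours_per_year = 2000

jobs_lost = 20               # assumed first-order employment loss
extra_spending = (workers - jobs_lost) * (new_wage - old_wage) * hours_per_year

# Assume each job elsewhere in the economy is supported by roughly
# $40,000 of annual consumer spending (purely illustrative).
jobs_supported = extra_spending / 40_000

net_effect = jobs_supported - jobs_lost
print(f"extra annual spending: ${extra_spending:,.0f}")
print(f"net employment effect: {net_effect:+.2f} jobs")
```

With these made-up parameters the spending channel more than offsets the assumed first-order losses; different assumptions would of course yield different results, which is precisely why the empirical studies disagree.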
Second, and more importantly, job creation is really determined in the realm of macroeconomics. So when opponents of a modest increase in the minimum wage say something like "A fundamental law of economics -- the law of demand -- states that when the price of anything (including labor) increases, the quantity demanded will decrease," they are thinking of products and services, not labor. Labor demand is fundamentally determined at the macroeconomic level. For the sake of argument, let's stipulate that the proposed minimum wage would increase prices for some employers, reduce the demand for their goods or services, and therefore reduce their hiring. Those workers would then be unemployed, raising the overall unemployment rate. In response, the Fed would reduce interest rates in the face of higher unemployment and keep them low until we reached full employment again. That so many neoclassical economists refuse to acknowledge this suggests that they have fallen for the "lump of labor" fallacy (the notion that the amount of work available to workers is fixed), a fallacy they are usually so quick to dismiss. This is why in past periods when the minimum wage was higher (in constant-dollar terms), unemployment was not. For example, in 1978, when the minimum wage was $10.21 per hour in 2012 dollars, unemployment was in fact 2 percentage points lower than today.
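The constant-dollar comparison rests on a simple price-index adjustment. Here is a minimal sketch of that arithmetic; the wage and index values are illustrative placeholders, not official BLS figures, so substitute the actual CPI-U series to reproduce the article's numbers.

```python
# Restating a nominal historical wage in constant (e.g., 2012) dollars:
#   real_wage = nominal_wage * (price_index_target / price_index_base)
# The wage and index values below are hypothetical illustrations.

def to_constant_dollars(nominal_wage, index_base, index_target):
    """Restate a nominal wage in target-year dollars."""
    return nominal_wage * (index_target / index_base)

# Hypothetical: a $2.90 nominal wage with a price index that rose
# from 100.0 to 352.0 between the base year and the target year.
real_wage = round(to_constant_dollars(2.90, 100.0, 352.0), 2)
print(real_wage)  # → 10.21
```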
One benefit of an increased minimum wage is that many low-wage workers earn a bit more income. Given the stagnation of wages at the lower end of the labor market over the last couple of decades, this is clearly a good thing. But to be clear, it isn't economic growth. A higher minimum wage could, however, draw more workers, especially male workers, back into the labor market by making work more remunerative. Furthermore, labor-saving technology is a key driver of labor productivity, and firms' adoption of it is driven in part by the cost of labor and in part by the cost of technology. When the price of labor is high, the return on investing in labor-saving technology is higher. This is often referred to as the "Webb effect": the theory that a higher wage floor leads to higher levels of efficiency.
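The ROI point can be made concrete with a break-even sketch, with all figures hypothetical: the payback period of a labor-saving machine shrinks as the wage of the labor it displaces rises.

```python
# Payback period of a machine that displaces paid labor hours:
#   years = machine_cost / (hourly_wage * hours_displaced_per_year)
# All numbers are hypothetical illustrations, not data.

def payback_years(machine_cost, hourly_wage, hours_per_year):
    """Years for wage savings to repay the machine's up-front cost."""
    return machine_cost / (hourly_wage * hours_per_year)

machine_cost = 50_000   # assumed cost of self-service equipment
hours = 2_000           # assumed annual labor hours displaced

low = payback_years(machine_cost, 7.25, hours)    # U.S.-level minimum wage
high = payback_years(machine_cost, 14.31, hours)  # Australia-level wage
print(round(low, 2), round(high, 2))  # → 3.45 1.75
```

Under these assumptions the same machine pays for itself roughly twice as fast at the higher wage, which is the mechanism behind the Webb effect described above.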
However, many U.S. organizations have been slow to adopt labor-saving technology when they otherwise might have because of the low cost of unskilled labor in the United States. Compared with other advanced nations, low-skilled labor is particularly cheap here. In 2009 the minimum wage reached $7.25 per hour, the final step of the first increase in over a decade, but this is still far below the levels in other developed countries. For example, in 2009 the minimum wage in the United Kingdom was $8.00 per hour (and $10.90 in London), $11.60 in Ireland, $11.75 in France, and $14.31 in Australia. In these countries, investing in labor-saving technology makes more economic sense. Not surprisingly, then, countries with higher wages are generally more likely to adopt labor-saving technology, such as self-service technology. Indeed, one study on the effects of the minimum wage on part-time employment concludes that "if the federal government raises the minimum wage, employers in some sectors may expedite the adoption of automated equipment and new technology to increase labor productivity."
Of course, some will argue at this point that labor-saving technology will also lead to fewer jobs and that the minimum wage is therefore a job killer. But this notion is even more fallacious than the claim, dealt with above, that a higher minimum wage reduces employment. If opponents of the minimum wage oppose it because it will lead to automation, then they need to get out there on the barricades and fight against the robots.