The Delaware River, which winds south from New York on its way to Delaware Bay, forms a natural boundary between New Jersey and Pennsylvania. Were it not for the divide formed by the river, those living on its east and west banks would be largely indistinguishable, even down to their accents. Except, perhaps, for loyalty to the Flyers or the Devils.
But in 1992, the boundary provided the basis for a seminal study on the effect of rising minimum wage rates on employment.
In a paper published in 1993, Princeton economists David Card and Alan Krueger (now chairman of the Council of Economic Advisers) tracked the impact of a 19 percent increase in New Jersey's minimum wage, comparing employment in the state to employment across the river in Pennsylvania, where the minimum wage remained constant at the pre-increase rate of $4.25 an hour.
It was a perfect natural experiment. The populations on either side of the Delaware were in almost all respects identical; the border between them was not a barrier to either commerce or employment. Workers from Pennsylvania were free to seek jobs in New Jersey. Business owners in New Jersey were not constrained from relocating to Pennsylvania. Customers, of course, were free to cross the bridge in search of the best deals.
What the authors found was that "contrary to the central prediction of a textbook model of the minimum wage, but consistent with a growing number of studies based on cross-sectional time-series comparisons of affected and unaffected markets and employers, we find no evidence that the rise in New Jersey's minimum wage reduced employment at fast-food restaurants in the state." In fact, they found that employment increased after the minimum wage rose. It's a finding that has been borne out again and again.
The debate over the minimum wage has been sparked - once more - by President Obama's call in the State of the Union address for an increase in the wage, over time, to $9 an hour. Opponents of an increase - of any increase, and in many cases of a mandated minimum wage at all - generally fall back on the alleged "job killer" argument: that businesses compelled to pay a higher wage will hire fewer workers, cut payrolls or, worse, be forced to shut down altogether. As anecdotal arguments go, those are pretty scary. But smart policy shouldn't be driven by scary stories - it should be based on facts.
There will always be anecdotal evidence that some employers suffer under increased costs associated with a marginal rise in the minimum wage (and a marginal increase is all anyone serious has ever proposed, so there's no need to raise the straw man of a $100 minimum hourly wage, or even of $20). But studies conducted over the last 30 years that have followed employment trends over time have consistently shown that such increases, on the whole, do not have a negative impact either on employment or on the viability of businesses.
In fact, there are studies that show rising minimum wage rates are a driver of both education and innovation; that the increases encourage employers to seek technological advances that allow them to reduce the number of low-wage workers they employ and encourage high school students to extend their education in order to qualify for higher-skill jobs.
There are limits, of course. Demand for the least-skilled workers to fill the least-desirable, unskilled, non-union jobs will persist. And there will always be unskilled workers for whom such jobs are the only entry point into the workforce.
The problem is that, as many have pointed out, including most recently John Cassidy at The New Yorker, the current minimum wage of $7.25 an hour is simply not enough to live on. Someone earning the federal minimum wage grosses less than $15,000 a year. That may work for a single person in Alabama, where a one-bedroom apartment can be had for $540 a month, but housing (and all other) costs are significantly higher in the large urban areas on the coasts where most of the population lives.
And while the federal minimum wage has been increased in dollar terms over the last several decades, when you account for inflation you find that the $1.60-an-hour minimum wage of 1968 actually had more purchasing power than the current rate does. Increases in the minimum wage have simply not kept up with increases in the cost of living.
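The inflation adjustment behind that claim is straightforward. A minimal sketch, using approximate annual-average CPI-U figures (the exact values depend on the series and month chosen, so the result should be read as a rough estimate):

```python
# Approximate BLS CPI-U annual averages (assumptions for illustration).
CPI_1968 = 34.8
CPI_2013 = 233.0

def in_2013_dollars(amount_1968):
    """Convert a 1968 dollar amount to 2013 dollars via the CPI ratio."""
    return amount_1968 * CPI_2013 / CPI_1968

min_wage_1968 = 1.60
adjusted = in_2013_dollars(min_wage_1968)
print(round(adjusted, 2))  # roughly 10.71
```

Under these assumed CPI figures, the 1968 minimum wage works out to roughly $10.70 in current dollars, well above today's $7.25.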
So what to do about the working poor?
A vast body of empirical research shows that the intended results of modest increases clearly outweigh any negative unintended consequences.
There are consequences to poverty, and it's worth examining their costs to society when more than 14 percent of the population, nearly 45 million Americans, fall below the poverty line; when the gross wages of a minimum-wage worker in Los Angeles, where the state's minimum wage is already $8 an hour, amount to just 75 percent of the average annual rent of an apartment in the county.
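A quick check of what that Los Angeles comparison implies, assuming full-time hours (40 hours a week, 52 weeks a year) - the hours and the implied rent are illustrative assumptions, not figures from the studies cited:

```python
CA_MIN_WAGE = 8.00      # California minimum wage cited in the text
HOURS_PER_YEAR = 2080   # assumed full-time: 40 hours x 52 weeks

gross_wages = CA_MIN_WAGE * HOURS_PER_YEAR        # $16,640 per year
# If gross wages are 75 percent of average annual rent,
# the implied average rent is:
implied_annual_rent = gross_wages / 0.75          # about $22,187
implied_monthly_rent = implied_annual_rent / 12   # about $1,849
print(round(gross_wages), round(implied_monthly_rent))
```

In other words, a full-time minimum-wage worker in the county would gross about $16,640 a year against an implied average rent of roughly $1,850 a month.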
Poverty and poor health, for instance, are intertwined.
Wealthier, better-educated people live longer; they are, in general, better equipped to access the health care system. And there is a correlation between poverty and risky behaviors associated with poor health outcomes: smoking, drinking, obesity and lack of exercise. That's not anecdotal; it's based on reams of research and is the reality for millions of Americans.
These behaviors come with broad costs to society: the poor rely on government for health care in vastly greater numbers than the "wealthy", those earning $50,000 or more annually.
Does increasing the minimum wage eliminate these costs? No, but it does mitigate them to some degree. Will some businesses that have to bear the nominal increase in costs associated with an increase in the minimum wage suffer? At the margins, yes.
But in terms of the broader benefit to society, occasional, incremental increases to the base wage, which assure that the working poor have a way to keep from slipping even further behind, are in society's long-term interest.