Our country faces no more important task right now than putting millions of unemployed people back to work. We know that many won't get there without intensive job training and placement programs tied to employer demand. What is far less clear is how to do that most effectively.
The United States spends billions of dollars every year on workforce training programs funded by 12 federal agencies, 50 states and at least 240 large foundations. One of the challenges to understanding what works best is that most of the programs use different data and different benchmarks for success.
Many don't even agree on what constitutes a job. Does starting a temporary job count as a job placement? Nearly two-thirds of job training programs say it does, according to a national survey whose initial results were released today by Public/Private Ventures (P/PV). About 40 percent want to see a minimum number of hours worked or a minimum wage level before a job is considered a job.
Job retention is even trickier to define. Nearly a quarter of the 214 programs surveyed as part of P/PV's Benchmarking Project check with participants 90 days after a job starts to see if they're working at that time (the "snapshot" approach). If so, they've hit the three-month retention mark considered crucial for success in some federal programs. Half the programs surveyed instead count continuous employment with any employer over the 90-day period. And another quarter look for continuous employment with the same employer in that time frame. Not surprisingly, the highest retention rates were among the programs taking the 90-day snapshot.
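To see why the snapshot approach produces the highest rates, consider a small sketch. The data and the function names below are purely illustrative, not P/PV's actual records or methodology; they simply apply the three definitions described above to the same hypothetical participants.

```python
# Hypothetical illustration: the same employment records scored under the
# three 90-day retention definitions described above.

# Each record maps a day (1-90) to the employer worked for that day,
# or None for a day without work.

def snapshot(record):
    """Employed on day 90, regardless of any gaps (the 'snapshot' approach)."""
    return record.get(90) is not None

def continuous_any(record):
    """Employed every day of the 90-day period, with any employer."""
    return all(record.get(d) is not None for d in range(1, 91))

def continuous_same(record):
    """Employed every day of the 90-day period by the same employer."""
    employers = {record.get(d) for d in range(1, 91)}
    return len(employers) == 1 and None not in employers

# Three hypothetical participants:
steady   = {d: "A" for d in range(1, 91)}                        # one job, no gaps
switched = {d: ("A" if d <= 40 else "B") for d in range(1, 91)}  # changed employers
gap      = {d: ("A" if d <= 30 or d >= 60 else None) for d in range(1, 91)}  # a gap

people = [steady, switched, gap]
for rule in (snapshot, continuous_any, continuous_same):
    rate = sum(rule(p) for p in people) / len(people)
    print(rule.__name__, round(rate, 2))
```

All three participants pass the snapshot test, only two survive the continuous-employment test, and only one stays with the same employer — so a program's choice of definition alone can move its reported retention rate substantially.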
Another critical question is who gets counted in the placement rates: everyone who enrolls in a training program or everyone who finishes it? Then there's the question of who gets chosen for the program in the first place. Some select job seekers with relatively high skill levels, while others concentrate on harder-to-place applicants, including former inmates or high-school dropouts. Guess which programs have the highest placement rates?
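The denominator question is simple arithmetic, but it matters. With hypothetical numbers (these are not survey figures), the same program can report very different placement rates:

```python
# Illustrative only: one program, two defensible placement rates,
# depending on who is counted in the denominator.
enrolled  = 100  # everyone who signed up for training
completed = 60   # everyone who finished the program
placed    = 45   # completers who found jobs

rate_of_enrollees  = placed / enrolled    # counts everyone who enrolled
rate_of_completers = placed / completed   # counts only those who finished

print(f"{rate_of_enrollees:.0%} of enrollees vs. {rate_of_completers:.0%} of completers")
```

Here the same 45 placements read as a 45 percent success rate one way and a 75 percent success rate the other — without any difference in what the program actually accomplished.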
But how a program defines "success" is rarely a matter of choice: definitions are often dictated by whoever funds the work. To further complicate matters, most workforce development programs rely on more than one source of funding.
That invariably means different sets of metrics and reports for the same services. Right now, the Labor Department asks for different data than the Health and Human Services Department, which asks for different data than the Department of Education. Their databases don't talk to each other, and programs that submit information generally can't pull it back from the databases in ways that could inform and improve their services.
Add to that the myriad state, local and philanthropic groups that support workforce development and often ask for still different data. The picture that emerges is of harried staff members keeping different sets of books for different backers. What we still lack, however, is a clear view of which approaches are most likely to help people secure and keep jobs with the wages and benefits that would support a family.
So what can be done? The first step is to move toward more consistent definitions. P/PV is already working with private foundations that support programs in New York City to help align their performance standards; funders in Baltimore have undertaken similar efforts. It's time for federal, state and local agencies to do the same.
The technology needs to follow, so that training programs can not only share data easily with their funders, but also get the information back with some value added: summary tables, year-to-year comparisons, and trends for demographic groups or targeted industries.
Finally, public and private programs need to continue evaluating what constitutes "good performance." What definitions of job placement and retention indicate actual success for job seekers? What strategies work best for the hard-to-place populations? How can we improve what we do?
With support from the Annie E. Casey Foundation, P/PV is continuing research that began in 2004 and working to expand the pool of programs it surveys to better understand the answers to these questions. The Benchmarking Project has also created a learning community to help boost performance across the workforce field, using both the data collected and the experiences of participating organizations to discern effective program strategies.
Some commentators have questioned whether workforce training really works given the pervasive joblessness our country now faces. The answer is not to give up on these efforts, but rather to find a way to quantify their successes and ensure that programs across the spectrum adopt a consistent set of benchmarks for returning Americans to work.