After nearly 35 years with basically no information on just about any chemical, is it possible to have too much "minimal" information? Sounds hard to imagine, but some apparently think so.
Here's the issue. Under the newly released House draft of a chemical regulation reform bill, EPA would identify the 300 chemicals of greatest concern, the so-called priority list. Industry would then have 18 months to submit a "minimum data set" (basically a rundown of what's already known about each chemical) for every compound on that list. This doesn't seem to be a particularly contentious proposal.
The House discussion draft then requires EPA to complete its regulatory review of those 300 compounds within two years. This is where it gets a little tricky.
The unspoken but obvious fact is that EPA will not get through all 300 regulatory assessments in two years. It might not get through 300 assessments in ten years if industry throws up enough obstacles. (That's not necessarily a disaster, as long as the 300 are the compounds that really present the greatest public health risk, a subject we'll get to later.)
But what about the thousands of chemicals not on the priority list, to which millions of people are presumably exposed and about which we will still know little or nothing? Shouldn't we have at least a minimum amount of data on these as well? Or must the public and companies seeking safer chemicals remain in the dark while industry manages to slow EPA's review of the priority chemicals to a crawl?
The House draft avoids that problem by requiring chemical manufacturers to submit a minimum data set on all 83,000 registered chemicals even as EPA slogs through the work of rendering final safety decisions on the top 300. The timetable is to have this minimum data made public on all chemicals within five years of the bill's enactment. The proposal also allows "tiering," or ranking, of chemicals into categories of greater or lesser concern, so that EPA could minimize how much information is required for a minimum data set where that makes sense.
This is a brilliant piece of legislative jujitsu that takes the looming nightmare of industry delaying tactics and turns it into a major opportunity. With at least a minimum amount of data available on all chemicals, independent scientists, businesses that make things with chemicals, and the public will be able to make informed choices about what to study, what to put into products, and with a little luck, what they want to buy.
And what should be in a minimum data set? We'd say at least the following:
All of the data on each chemical from:
* The chemical dossiers prepared under the European chemical regulatory program REACH (Registration, Evaluation, Authorisation and Restriction of Chemicals)
* The EPA's voluntary High Production Volume (HPV) challenge program, which was last year renamed ChAMP (Chemical Assessment and Management Program)
* Industry's files -- and we mean everything
* The peer-reviewed scientific literature pertaining to any aspect of a chemical
* Other government agencies, such as the Food and Drug Administration or the Centers for Disease Control and Prevention
* The National Children's Study
* EPA's ToxCast and other high-throughput screening batteries
Add to that any additional data the EPA feels it needs to make a safety determination on the chemical.
This raises the next key question:
How can EPA publish a list of 300 priority chemicals without first having the minimum data set in hand? We'll talk about that when we get to prioritization (section 6 of the draft), but suffice it to say that ensuring a minimum set of data exists before priorities are set, not just after, is essential to the success of reform.