Before you buy a car, you check its safety, reliability and performance ratings. Before you buy a house, you hire an experienced inspector to look for hidden problems. Before you expose your children to toxic chemicals? Not so much.
Most used-car dealers have more reliable information about their offerings than the EPA has about almost any chemical in commerce.
When it comes to protecting us from toxic substances, current law has produced an information wasteland -- where meaningful science on chemical risks is virtually nowhere to be found.
Rewriting the Toxic Substances Control Act (TSCA) to fix this problem is a cornerstone of any meaningful reform. So it's not surprising that the House Energy and Commerce Committee made this issue the focus of the first of a series of meetings it's hosting to review the "discussion draft" of a reform bill it released last week (April 14). It's called the Toxic Chemicals Safety Act of 2010.
Lack of data must never again be an obstacle to protecting public health. To achieve that goal, EPA must have absolute and unencumbered authority to ask for any study that it needs to better understand the risks of any chemical.
The House draft recommends this and several other critical improvements. Most notably, it gives EPA the option to request data through an agency order instead of a formal rulemaking. Under current law, it takes two to 10 years to complete a rule requiring even a single test on a chemical. With thousands of chemicals each needing scores of tests to assess their risks, it's no surprise that the EPA long ago stopped trying to obtain health and safety data through this process.
The draft law would also prescribe test methods before studies are done, curtailing the insidious industry tactic of delaying regulations by testing repeatedly for the same effect using slightly different methods. All health and safety test results would be available to the public via the Internet while protecting legitimate trade secrets.
But most important, the draft would require EPA to develop a minimum data set that must include information on hazard, exposure and use. It then mandates that the data for top priority chemicals be submitted to the EPA within three years of the bill's passage. The proposal also would allow EPA to tier and categorize the more than 80,000 chemicals in its inventory, presumably based on the rational notion that some compounds present greater risks and would require a different set of data to justify their use.
Yet even with these core reforms, many difficult questions remain:
* Should some minimum data set apply to all chemicals?
* Should the new law require a specific menu of information for every chemical -- a "minimum mandatory data set"?
* If so, what should be in it and who should generate it?
* What data will the EPA need to take action against a chemical to protect public health?
* Should data be generated and made public for the thousands of chemicals now on the EPA's priority list, even if the EPA cannot use it anytime soon?
After 30 years of forced ignorance and its consequences, it's tempting to take the punitive approach and argue for sweeping data mandates. But generating mountains of data is not the goal. The goal is to ensure that the EPA has the tools and the mandate to take quick action against high-risk chemicals while steadily working through the staggering list of compounds it needs to assess. That requires a core minimum data set, but also the flexibility and authority to go light on some chemicals and drill down hard on others.
Requiring data generation is not a burden on industry; it is the minimum duty a company must fulfill if it wants to put a chemical in commerce. A new law must be designed to enable a steady flow of informed decisions that eliminate the most worrisome exposures as soon as possible. Good decisions won't happen without good data to support them.
My next blog will explore the specifics of data requirements under a new TSCA.
Follow Richard Wiles on Twitter: www.twitter.com/AlexEWG