The EPA says Corexit 9500A, the chemical dispersant used to break up the oil from the Gulf of Mexico spill, is indeed toxic, but no more so than the oil itself. It's not something you'd put in the water if you didn't have to, but by breaking up the thick, gooey oil, the dispersant reduces the physical mechanism by which the oil suffocates or harms some wildlife. So, as is the case with so many environmental threats that trigger a knee-jerk "AAAIIIGGGH!", this one involves tradeoffs, and is more complicated than it seems at first, or than many environmentalists would like to admit.
The list of such examples, of course, is long. In fact, rare is the environmental bogeyman that is purely a threat.
- DDT harms reproduction in some bird species, but it is an effective agent against the mosquitoes that carry malaria.
- Nuclear radiation can cause cancer, but the tradeoff of avoiding the risk of nuclear power is the harm that comes from burning fossil fuels.
- Maybe the best example of all is mercury. Eating too much of some species of seafood raises exposure to mercury, which at high levels can impair the healthy neural development of a fetus. But the fatty acids you get by eating those fish are great for... ready... the healthy neural development of a fetus.
So how are we to make sense of these risks? And why does the scary side always seem to grab our attention more than the benefit side? Wouldn't weighing the tradeoffs be a smarter way to make a fully informed, healthy decision?
Well, sure. But that's not how our risk perception system evolved. It's designed to keep us alive, not get straight A's in school. It developed to be on the lookout for danger, not benefit. When something potentially perilous comes along, we instinctively assess it for the harm it might do, not the gain. And the system is set on a hair trigger that sounds the alarm instantly if there even might be a threat. Subconsciously, before our thinking brain has even gotten the raw data to thoughtfully analyze the tradeoffs and complexities, which takes time, the animal-instinct, self-preservation parts of the brain do a quick initial scan of the information, and if there is even a hint of peril... "AAAAIIIGGGGH."
The problem is, the perils of the modern information/technology age are more complicated than the simple dangers our risk perception system evolved to cope with. We didn't have to think about tradeoffs when the wolf was howling, or the bad guys with clubs were attacking, or it got dark. It's not so easy to figure out how to deal with climate change, or nanotechnology, or genetically modified food. There are certainly lots of 'cons' involved in the products and processes of modern life, but there are often pros too, and if we don't consider the whole picture, the ways we choose to protect ourselves may make us feel good, but leave us at greater peril.
That moderate, reasoned approach is easier proposed than accomplished, however, because as thoughtful as we humans like to think we are, risk perception is not a simple matter of factual analysis. It's not just what we think, but also how we feel, and we know from decades of research into the psychology of risk that not only do we tend to overfocus on the negative, but also:
- Human-made risks innately feel scarier than natural ones.
- Risks produced by industries we rightly don't trust feel scarier than they might actually be, just because of where they come from.
- Whole classes of substances and categories of things can be stigmatized as dangerous when only some of them are.
- Risks associated with a single catastrophic event feel scarier than risks that might be much bigger but which are chronic and spread out in space and time.
- Risks getting a lot of attention feel scarier than bigger ones lurking in the background.