Russian Forest Fires Stoke Nuclear Fears. Beware Fear Itself!

Greenpeace claims that some of the forest fires in Russia are burning in areas contaminated in 1986 with radioactivity from the Chernobyl nuclear power plant disaster. Should people in that region be alarmed? It depends on lots of details -- how much radioactive material was taken up by the trees and bushes that are burning, the energy and radiological stability of those particles, the dose of particles people would actually be exposed to after dilution in the atmosphere -- but probably not, as Andy Revkin notes in his Dot Earth blog post, Chernobyl, Fires and Radiation.

But will we sound the alarm? Oh, you bet. And not just in Russia. Radiation!!!!!! AAAIIIGGH!!! Chernobyl!!!! AAAAIIIGGGHH! What's that you say? The details matter? Yeah, right. Whatever. CHERNOBYL!!!! RADIATION!!!!! ALERT! ALERT! ALERT!

Don't blame Greenpeace for this instinctive reaction to risk. And don't blame the press. Blame our instincts. YOUR instincts. Greenpeace and the news media only sound the alarm when they know we're ready to listen. And because of one critical aspect of the instinctive way human risk perception works, we are instantly ready to hear about and fear certain dangers, and the details be damned.

Stigma is what the academics call this effect. Certain products or processes or companies or places become so identified as dangerous that anything to do with them sets off alarms in our risk response system, and the facts that follow don't matter. In a 1986 study of public opinion in Phoenix toward the proposed Yucca Mountain, Nevada, nuclear waste disposal site, subjects were asked to say the first thing that came to mind when they heard "underground nuclear waste storage." The top five categories were Dangerous (including words like "danger," "hazardous," "toxic," "harmful," "disaster"), Death ("sickness," "death," "dying," "destruction"), Negative ("wrong," "bad," "terrible," "horrible"), Pollution ("contamination," "spills," "leakage"), and War ("bombs," "nuclear war," "holocaust"). Do you think the details mattered -- the nuclear physics of the waste, or the impregnability of the waste containers, or the geologic stability of the disposal site? Nope. Radiation = Danger. End of story.

Chemicals. Offshore drilling. Genetically modified food. They have all been stigmatized, and they automatically set off fear in a risk perception system on a hair trigger, ready to detect anything that might even hint at danger. These are the learned versions of the things we're probably born afraid of, like snakes or spiders or the dark, things that set off instinctive fears. We don't have to think about those things. We KNOW they're dangerous.

Only, that's as untrue of radiation and genetically modified food as it is of snakes and spiders. The details do matter. They matter if we want to make the healthiest choices for ourselves and our families. And they really matter if we want to make wise policy choices for society. What kind of energy mix do we end up with when we automatically assume radiation = danger? We end up with policy that favors electricity generation from fossil fuels, the pollution from which is FAR more dangerous than even the worst industrial releases of nuclear radiation: fine particles alone kill tens of thousands of people a year, to say nothing of what CO2 is doing to the climate of the earth, while the World Health Organization estimates the lifetime cancer death toll from Chernobyl will be about 4,000.

But to get to the details about a risk, and judge it wisely, we have to get past the effect of stigma and the predominantly instinctive, emotional, affective way we very quickly assess whether we might be in danger, giving less careful consideration to the facts (or none at all). Our risk perception system has done a fabulous job getting the species this far, but with its emphasis more on how the facts feel than on what the facts actually say, it's not the brightest bulb for illuminating the complex risks and tradeoffs of our modern technological/information age.

In that sense, the alarms being sounded about forest fires in areas contaminated by Chernobyl fallout are actually a good thing, even though they play on our stigmatized fears of anything nuclear. Our fear of that risk may be overblown, but the dangers of how we go about perceiving and responding to risk in the first place are quite real. We can be too afraid of some things and not afraid enough of others, and while our perceptions might feel safe, they could end up making things worse. Our rational, reasoning minds have figured this out. Now we have to hope our cognitive firepower can use the facts about how we perceive and respond to risk, so we can do a better job of it.
