Anticipating Disaster

07/16/2012 09:55 am ET | Updated Sep 15, 2012
  • Daniel Little, professor of philosophy and chancellor of the University of Michigan-Dearborn

When people think about the value that universities bring to our society through research, the examples that come to mind generally fall in the areas of engineering, medicine, and new products. What we don't think about as much is the role that university-based research plays in helping all of us -- citizens and policy makers alike -- to anticipate and plan for future risky developments in our society. And yet this is one of the roles for which the social and behavioral sciences are best prepared. Disasters like the earthquake-tsunami-nuclear meltdown catastrophe experienced by Japan last year illustrate clearly how important this kind of analysis is in today's world.

A very good example of this other kind of research is the work that Yale sociologist Charles Perrow has done on technology accidents. His recent book, The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters, should be "must" reading for our planners and policy makers. Perrow demonstrates dozens of alarming ways in which we have built a technology-dependent social system that embeds the possibility of major failures with catastrophic consequences. "Disasters expose our social structure and culture more sharply than other important events.... Two of the major themes in this work are the inevitable failure of organizations, public and private, to protect us from disasters and the increasing concentration of targets that make the disasters more consequential" (kindle location 103).

Perrow has studied the phenomenon of "normal accidents" for a long time. The idea derives from the observation that complex technologies require complex, tightly coupled social organizations to operate them, and those organizations are vulnerable to compound failures like the Challenger disaster. Perrow extends this idea to the major infrastructures -- energy, transportation, habitation patterns -- that sustain our daily lives. And he highlights the fault lines through which these major infrastructure systems can fail.

Perrow argues that our greatest vulnerabilities derive from the confluence of different kinds of concentration: concentration of energy (LNG containers, nuclear power plants), concentration of population (densely populated cities intermingled with highly dangerous technologies), and concentration of economic and political power (so creators of hazards can fend off regulation and planning). 

A significant source of our country's inability to appropriately anticipate and mitigate the effects of large failures is the concentration of political and economic power to which Perrow refers. Speaking of the chemical and electrical industries, Perrow writes: "We shall also see how the regulatory agency bows to industry pressure and cuts back on inspections... The chemical industry's record on self-regulation is not good. By claiming that they meet their trade association standards of low pollution, chemical companies escape federal inspection and pollute more... Industry self-regulation again encourages cheating on safety and reliability" (kl 224). Perrow finds this pattern -- an industry's economic interests defeating efforts at effective regulation -- to be ubiquitous across a range of industries and sectors. His analysis of land development in flood zones and hurricane alleys comes to the same conclusion:

We need a more centralized regulatory system. Local initiative is simply not reliable in the case of mitigation. Localities are reluctant to enforce state standards, national standards are few, and enforcement is lax. (kl 681)

His central recommendation is this: reduce the size of potential sites of failure, whether failure comes through accident or deliberate attack. Decentralization is a plus when it comes to infrastructure accidents. Here is how Perrow puts the point:

I have argued for decentralization, but it has to be accompanied with a reduction in what economists call (unfortunately) "information asymmetry," where one party knows more than the others. What we need is a representative governance mechanism that ensures full disclosure and standard-setting and coordinating mechanisms. (kl 649)

The Next Catastrophe is a very important piece of rigorous social research with major implications for policy and planning. We would like to think that basic safety and crisis-aversion policies are well developed in our society. But sadly, that seems not to be the case. Alerting us to realistic scenarios that can evolve from our current practices is an important function of academic social and behavioral research. And the terrible consequences of the Fukushima disaster for several generations of Japanese citizens are a sobering reminder of how high the stakes really are.