05/18/2010 06:47 pm ET Updated May 25, 2011

Planning for the Unimaginable

The Gulf Coast oil spill seems on the verge of devastating the ecology and economy of a multi-state area, yet a month ago, who would have imagined this could happen? While many have fought coastal drilling and warned of its environmental dangers, the odds-on betting would have been for a tanker spill like the Exxon Valdez or damage to an oil rig, not a gaping hole a mile below the surface that no one knows how to cap.

Indeed, the most devastating events in our national life over the past decade have all struck us by surprise. Who imagined a 9/11? A post-war insurgency in Iraq that lasted twenty times longer than the war itself? A total collapse of preparedness and recovery efforts when a hurricane slammed into New Orleans? The financial meltdown of 2008?

All of these events seemed unimaginable when they happened. Indeed, the 9/11 Commission cited the government for its lack of imagination in conceiving of planes being used as flying bombs. Yet each of these surprises seemed strangely predictable after the fact. The 9/11 Commission showed that the "dots" were there but went unconnected (young Middle Eastern men taking flying lessons but leaving before they learned how to land, for example). Analysis of the post-invasion policies of the Bush Administration showed that we could have anticipated the collapse of civil government and the rise of sectarian violence. Investigation of the mistakes that led to the Katrina disaster identified multiple failures of planning, coordination, and oversight as well as multiple warnings of what would happen should New Orleans suffer a direct hit. Hearings on the financial collapse pointed to many warning signs during the housing bubble that was engineered, even if inadvertently, by both government action and Wall Street risk taking.

Nassim Nicholas Taleb -- scholar, writer, statistician, risk engineer and even a former Wall Street trader -- has a label for these unpredictable, high impact, yet understandable-in-hindsight events. He calls them Black Swans. The bad news in his 2007 book, The Black Swan, is that we will never predict such events, even when they are positive. The good news is that we can still "plan" for them by building robustness into our thinking and systems.

There are no doubt many reasons why we get surprised. We are, at best, biased and imperfect decision makers, no matter how rational we think we are. We often lack historical precedents that would even suggest such surprises are possible. We can only handle so much cognitive complexity before we get hopelessly lost in all the possibilities. We make faulty assumptions and fail to question assumptions that might be suspect. We are subject to the "first law of management" -- a line of thinking will continue in motion in a straight line until some unknown force requires it to change. The old saw that "if you always do what you've always done, you'll always get what you've always got" has a corollary: "if you always think what you've always thought, you'll always do what you've always done."

If we are doomed to be surprised, we are not doomed to fail in our response. Several courses of action seem prudent. First, we need to institutionalize thinking about the unimaginable. While Taleb may be right that we will get surprised, we can still improve our ability to forecast areas and types of events that might surprise us. Experts in many fields, futurists, risk traders, and even science fiction writers help us see possibilities.

Second, we need a way to listen to those who are thinking against the grain. Short sellers on Wall Street were "forecasting" the housing bust before it hit the rest of us, but nobody paid attention. We can find ways to institutionalize forced attention to contrarian signals, even if it means setting up organizations, with access at the highest levels of government, whose sole mission is to challenge public and private sector organizations that don't want to be challenged. Unfortunately, most out-of-the-box thinking now gets channeled into institutions that have a vested interest in ignoring it. The warnings about Katrina and Iraq, for example, were expressed mostly within the organizations that were least likely to listen. Majority thinking seldom relishes minority challenge. The Minerals Management Service is responsible for both regulating the off-shore oil industry and advancing oil exploration. Expecting it to rein in the very industry it is charged with promoting is like asking your prom date to tell you if you look ugly.

Third, we need to realign incentives to discourage unchallenged risk taking and head-in-the-sand thinking. As long as leaders in any sector can take chances (of commission or omission) whose costs, if they are wrong, get covered by someone else, we almost beg to be surprised. For the private sector, this means requiring stronger capital pools to hedge against unforeseen events. We are likely to see BP's promise to pay the costs of the oil spill become more selective as the bill grows. If BP were required to set aside a percentage of profits in the equivalent of a "rainy day" fund, and if each time it were cited for violating required safety precautions the percentage increased, it might put more intellectual (as well as fiscal) resources aside for Black Swans. For government, realigning incentives means budgeting for catastrophic failure before it happens, building a reserve pool (trust fund) before the fact, rather than adding to the national debt by paying for it after being surprised.

Fourth, we need stronger penalties for those who consciously endanger the public safety, economy and environment -- who enhance the possibility of Black Swans by their negligence. Financial penalties are not enough. As Taleb says, capitalism is about rewards and punishments, not just rewards. Until corporate leaders face the prospect of criminal prosecution (rather than just public shame and a golden parachute), they may not wake up -- and wake their organizations up -- to the responsibility to avert the unimaginable.

Finally, we can improve our preparation for Black Swans with a good dose of humility. Drilling offshore for decades and expecting never to have a catastrophic failure is hubris, as is invading a country whose culture and politics we don't understand and expecting the mission to be accomplished in a matter of weeks. Humility will not prevent the unpredictable, but it may prepare us psychologically to face and plan for it.