Imagine a parallel world in which no supplier of computer operating systems had offered an upgraded product since, say, 1990. Software developers would still be coming up with ideas, but they would have to work on machines running Windows 3.0 or Mac System 6. It is fair to assume that people living in this world would be missing out on many things we take for granted. And that their computers would crash more often.
Yet, something analogous might be happening in our own world. Many of the systems on which the global economy relies are abuzz with 21st century energy and creativity, but governed -- and constrained -- by 20th century norms and institutions.
This insight emerged from the World Economic Forum's annual survey on the question "What risks should the world's leaders be addressing over the next 10 years?" As explored in the Forum's Global Risks 2012 report, published on Wednesday, one theme that came up again and again was poorly designed regulations having unforeseen consequences.
This is not to say that our community of experts thinks regulations are, in general, a bad thing -- far from it. True, overly strict regulation can stifle an industry, denying us the potential benefits of the activity being regulated. But inadequate regulations can rebound in unanticipated ways, with the same ultimate effect.
An example? The meltdown at Japan's Fukushima nuclear power plant, after the March 2011 tsunami, could have been prevented by safeguards defined with more imagination and forethought. The lack of such safeguards led to a disaster that reverberated around the world, fueling public worry about nuclear power and prompting German Chancellor Angela Merkel to announce the phase-out of her country's nuclear power plants.
In the complex, interdependent systems on which the world depends, causal effects are nonlinear and virtually impossible to predict. Who could have guessed that a tsunami in Asia would rewrite energy policy in Europe? The global financial system is another prime example of a complex system, as we discovered when mortgage-backed securities originating in the US crippled banks around the world.
Discussions of such failures typically center on "regulation", a term that is politically loaded in many countries. We prefer to talk about "safeguards", to promote discussion from a systems perspective. And when defining safeguards for a complex system, we must strike a delicate balance. On the one hand, safeguards must not be so restrictive that they prevent innovation from bringing benefits. On the other hand, they must not be so loose that they allow a significant chance of catastrophe.
In other words, we should seek in our systemic safeguards what we seek in our computer operating systems -- an environment that unleashes innovation and liberates individuals' creativity, but which minimizes the chance of us inadvertently wiping our hard drives.
While the financial crisis of 2007-2008 is the most high-profile recent case of "this program has stopped responding", the systems on which wealth increasingly depends are many and varied. Consider fast-changing emerging sciences such as nanotechnology and DNA synthesis. How should we define safeguards for them? Too onerous, and researchers might fail to make discoveries that could transform our world. Too hands-off, and a disaster -- say, the release of toxic nanoparticles -- could wreak havoc. This, in turn, would trigger a public backlash prompting more onerous regulations, returning us to the first problem.
From fiscal imbalances to the management of land and waterways to climate change and greenhouse gas emissions, comparable issues emerge.
How, then, do we define safeguards for complex systems? There is no silver bullet, but we can suggest some heuristics. We should make more effort to understand who bears risks and who reaps benefits, and try to align incentives. We should pay more attention to understanding the cognitive biases of leaders and populations -- the quirks and foibles of the human mind, which are known to influence decision-making processes.
We must find ways to avoid a stifling global regulatory monoculture, without encouraging regulatory arbitrage. This implies embracing trial-and-error, accepting that in complex systems we will inevitably get things wrong.
Finally, and above all, we need to take the task of defining systemic safeguards much more seriously. The people who define them need access to the best brains in the industry, the ability to closely monitor in real time the direction in which innovations are moving, and the means to craft flexible safeguards that can be tightened or adapted in response to emerging risks and opportunities.
If only it were as easy as doing a Windows Update.
Lee Howell is Managing Director at the World Economic Forum. He is responsible for the Global Risks 2012 publication.