In the new NYT Sunday mag, Roger Lowenstein offers a late-Nineties-style paean to risk: he thinks we should take more risks, at least with our money. Some would say we're already taking unwise ones. There's good evidence that the last two decades have seen a massive transfer of risk away from private corporations and onto individuals and families.
Lowenstein's particular argument has to do with hedge funds and derivatives, a field I'm quite unqualified to judge. He does, though, make a revealing error of fact—revealing not just about how he thinks but about how politicians, citizens, voters (does that mean you?) often think, and the perils therein.
He says that we (Americans; American institutions) often prepare for disasters that never happen, and that preparedness often doesn't pay. After all, as Philip Larkin wrote, "Most things may never happen." Lowenstein gives examples: "A few years ago," he writes, "the chief claim on the public tranquillity was the fear of 'deflation,' meaning that the price of just about everything would fall. Before that, it was fear of 'Y2K.' Neither transpired."
Well, midnight on New Year's Day of the year 2000 did not, in fact, convince most computer networks that the year had changed to 1900 (or to 00), thus fouling up everything computerized and time-sensitive from Sydney to Sacramento and bringing the world's trains, airplanes, power plants, and stock markets to a screeching halt.
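For readers who never met the bug itself: the whole problem was that old systems stored years as two digits, so "00" was indistinguishable from 1900. Here's a minimal sketch in Python (the real culprits were mostly COBOL systems, and the function names here are invented for illustration), showing the naive expansion that failed and "windowing," one of the actual techniques fixers used:

```python
from datetime import date

def two_digit_year(d: date) -> int:
    # Legacy storage: keep only the last two digits of the year
    return d.year % 100

def naive_expand(yy: int) -> int:
    # The Y2K bug in miniature: assume every two-digit year means 19xx
    return 1900 + yy

def windowed_expand(yy: int, pivot: int = 70) -> int:
    # A common repair ("windowing"): two-digit years below the pivot
    # are read as 20xx, the rest as 19xx
    return 2000 + yy if yy < pivot else 1900 + yy

yy = two_digit_year(date(2000, 1, 1))
print(naive_expand(yy))     # 1900 -- the rollover failure
print(windowed_expand(yy))  # 2000 -- correct under the fix
```

Windowing only pushed the ambiguity out a few decades (a pivot of 70 breaks again in 2070), which is part of why the repair work was so labor-intensive: every date field had to be found and judged.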
But why not? Because local, state and federal governments, universities, hospitals, for-profit corporations, and everybuggy else with a mainframe or a network figured out, just in time, that there was a problem, and hired small armies of programmers to correct it. American Radio Works has the story. Managers worked to convince their superiors (and others) that the Y2K problem, while not the Johannine apocalypse, wasn't a hoax. Y2K fixers put out APBs for programmers who could work with the old-school computer languages involved. The feds put an expert, John Koskinen, in charge, and federal agencies, industry giants, and other affected parties actually listened.
Computer expert David Eddy, who may have coined the term Y2K, told American Radio Works: "I'd love to do a poll…and eliminate anybody that actually worked on year 2000 work and just talk to, what I would call civilians, and if you ask them, I bet you hard money that most civilians would say, 'Oh, Y2K, whole thing was a hoax. Bodies didn't fall from the sky at the stroke of midnight, I knew the thing was a hoax.' But the reason nothing bad happened was that so many people put so much hard work into it." Koskinen: "The only way to be a hero… would be for half the world to stop and then somehow get it started again which was not one of our goals. Like a lot of things in government, if it works well nobody cares much." Eddy adds that with the disaster averted, he can't even put his Y2K work on his resume.
Plenty of big risks—some new, some not—remain in the post-Y2K (not to mention post-9/11) world. Some come from our rapidly changing global climate. Some come from flu bugs. One may be a volcano in the Canary Islands. Another comes from endocrine disruptors, which have also inspired decent science fiction.
The lesson isn't that we should all buy bottled water, guns and jars of peanut butter bigger than our cats, but that a complex society faces real problems, some of which are best dealt with in advance, and many of which require that elected officials and others who hold tons of power listen to experts and take their advice seriously. That isn't happening in Washington, not by a long shot, and voters aren't interested (yet) in making it happen. Some experts are wrong, and some are blowhards. Some risks are worth taking, and some disasters are so unlikely they may not be worth preventing. But if we keep ignoring the people who know what they're talking about, we'll be taking more risks than writers like Lowenstein ever dreamed.