To preserve a livable climate, we need technology deployment. That's what drives innovation, as Bill Gates himself used to argue.
So I listened to Gates' TED speech a few hours after he gave it in Long Beach, CA. Let's just call that an IT miracle.
The good news is that at least 80% of it was easy to swallow (unlike his recent blog post on energy).
The speech was more like a miraculous ice cream cone: 80% homemade chocolate-chocolate chip ice cream, with only the remaining 20% inedible.
Since TED is all hush-hush, most people get only the snippets the media shares, such as HuffPost's headline: "Bill Gates' TED Speech 2010: 'We Need Energy Miracles'." Mongabay.com reported:
Gates said the world needs to reduce carbon emissions to zero by 2050 and suggested researchers spent the next 20 years developing new technologies and the follow 20 years implementing them. [sic]
Yes, Bill Gates keeps diminishing the value of aggressive action now, which is just plain suicidal. We need both massive technology deployment now and much more innovation. But the former is the sine qua non for having any chance to preserve a livable climate. Ironically, the former is also the key to the latter, something Gates himself used to argue. Strangely, Gates strongly praises Gore's book even though its main thrust is directly at odds with Gates'.
This post will take the good and the bad in turn.
Let's start with the homemade chocolate-chocolate chip ice cream.
First, Gates has finally gone on the record as to how serious a threat is posed by global warming and unrestricted emissions of greenhouse gases. He warns it could lead to starvation around the planet. He notes:
Now the exact amount of how you map from a certain increase in CO2 to what temperature will be and where the positive feedbacks are, there's some uncertainty there -- but not very much. And there's certainly uncertainty about how bad the effects will be, but they will be extremely bad.
Gates is unequivocal on the science: "CO2 is warming the planet." He understands that we have to get near zero emissions by mid-century, especially the rich countries. He talked to the "top scientists" and learned "until we get near to zero, the temperature will continue to rise."
He recognizes "the IPCC is not necessarily the worst case" in terms of impacts -- though by now that conclusion deserves a "Duh" (see Intro to global warming impacts: Hell and High Water). For the plausible worst-case, see UK Met Office: Catastrophic climate change, 13-18°F over most of U.S. and 27°F in the Arctic, could happen in 50 years, but "we do have time to stop it if we cut greenhouse gas emissions soon."
In the TED speech, he didn't attack efficiency, renewables, and immediate action with a string of dubious or illogical claims, as he recently did (see "Bill Gates disses energy efficiency, renewables, and near-term climate action while embracing the magical thinking of Bjorn Lomborg (and George Bush)"). Woo hoo!
Indeed, he notes that "we do need a market incentive" -- a price for carbon either in the form of "cap-and-trade" or an "energy tax."
He further asserts we can achieve a factor of 3 to 6 in efficiency gain across the board. Here is where he dives into the inedible stuff.
He fails to spell out just how aggressive we must be in technology deployment to achieve that efficiency gain. After all, we now have the ability to dramatically increase the efficiency of almost every major human enterprise -- cost-effectively. We don't need energy miracles; we need to address market and regulatory barriers.
He also correctly asserts that even if we do all of that efficiency, we can't possibly solve the climate problem without multiple, massively scaled carbon-free energy sources. He identifies the five most likely candidates for massive scaling as carbon capture and storage, nuclear power, wind, and solar (both PV and solar thermal). But he spends most of his time talking about nuclear and raising questions about renewables (transmission and storage), while pushing the notion that "We Need Energy Miracles."
By miracles, he says, he doesn't mean things that are "impossible." The "microprocessor" and the "personal computer" are the "miracles" he means. As we'll see, the PC in particular doesn't match his (new) theory of how you get mass deployment of low cost innovative technology.
He doesn't dis action now, but says that action now is "equally or maybe less important" than accelerating the pace of innovation breakthroughs.
And yes, when asked about the timescale issue, he does say "we need 20 years to invent and 20 years to deploy" his energy miracles.
Bizarrely, he says "a lot of great books have been written about" this subject and "I'll be sending you" the new book by Gore, Our Choice. But had he read the book -- or even picked it up -- then he would have noticed that it is almost directly at odds with his argument. Right there on the back jacket next to Gore's picture is an excerpt from the introduction by Gore beginning:
It is now abundantly clear that we have at our fingertips all of the tools we need to solve the climate crisis. The only missing ingredient is collective will...
Our Choice gathers in one place all of the most effective solutions that are available now and that, together, will solve this crisis.
If he could wish for anything in the world, Gates said he would not pick the next 50 years' worth of presidents or wish for a miracle vaccine.
He would choose energy that is half as expensive as coal and doesn't warm the planet.
I was acting assistant secretary (and principal deputy assistant secretary) of energy for energy efficiency and renewable energy from 1995 to 1998, helping to run the billion-dollar federal office in charge of research, development, demonstration, and deployment of most low-carbon technologies, including three of Gates' would-be miracles. For much of that time I was in charge of technology and market analysis for the office. Since then, I have written a number of books on low-carbon technology development and deployment.
So I have thought a lot about whether Gates is right that we need multiple "energy miracles" developed through a $10 billion-a-year government R&D effort to stabilize at 350 to 450 ppm.
Put more quantitatively, the question is -- What are the chances that multiple (4 to 8+) carbon-free technologies that do not exist today can each deliver the equivalent of 350 Gigawatts baseload power (~2.8 billion Megawatt-hours a year) and/or 160 billion gallons of gasoline cost-effectively by 2050? [Note -- that is about half of a stabilization wedge.] For the record, the U.S. consumed about 3.7 billion MW-hrs in 2005 and about 140 billion gallons of motor gasoline.
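A quick back-of-the-envelope check of that wedge arithmetic (the 90% capacity factor below is my own assumption, chosen to make "baseload" concrete; it is not a figure from the post):

```python
# Rough check of the "half wedge" arithmetic: does 350 GW of baseload
# capacity deliver roughly 2.8 billion MWh a year?
HOURS_PER_YEAR = 8760
CAPACITY_GW = 350
CAPACITY_FACTOR = 0.9   # assumed for illustration; "baseload" plants run most hours

# Annual generation in MWh (GW -> MW is a factor of 1000)
annual_mwh = CAPACITY_GW * 1000 * HOURS_PER_YEAR * CAPACITY_FACTOR
print(f"{annual_mwh / 1e9:.2f} billion MWh/yr")

# Compare with the ~3.7 billion MWh the U.S. consumed in 2005
share = annual_mwh / 3.7e9
print(f"~{share:.0%} of U.S. 2005 electricity consumption")
```

The result, about 2.76 billion MWh, matches the ~2.8 billion MWh figure in the text -- each "half wedge" is roughly three-quarters of total U.S. 2005 electricity use.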
Put that way, the answer to the question is painfully obvious: "two chances -- slim and none." Indeed, I have repeatedly challenged readers and listeners over the years to name even a single technology breakthrough with such an impact in the past three decades, after the huge surge in energy funding that followed the energy shocks of the 1970s. Nobody has ever named a single one that has even come close.
Yet somehow the government is not just going to invent one TILT (Terrific Imaginary Low-carbon Technology) in the next few years, we are going to invent several TILTs comparable to the microprocessor. Seriously. Hot fusion? No. Cold fusion? As if. Space solar power? Come on, how could that ever compete with solar baseload (aka CSP)? Hydrogen? It ain't even an energy source, and after billions of dollars of public and private research in the past 15 years -- including several years running of being the single biggest focus of the DOE office on climate solutions I once ran -- it still actually has no chance whatsoever of delivering a major cost-effective climate solution by midcentury, if ever (see "California Hydrogen Highway R.I.P.").
I don't know why the energy miracle crowd can't see the obvious -- so I will elaborate here. I will also discuss a major study that explains why deployment programs are so much more important than R&D at this point. Let's keep this simple.
I have discussed most of this at length in previous posts (listed below), so I won't repeat all the arguments here. Let me just focus on a few key points. A critical historical fact was explained by Royal Dutch/Shell, in their 2001 scenarios for how energy use is likely to evolve over the next five decades (even with a carbon constraint):
"Typically it has taken 25 years after commercial introduction for a primary energy form to obtain a 1 percent share of the global market."
This tells you two important things. First, new breakthrough energy technologies simply don't enter the market fast enough to have a big impact in the time frame we care about. We are trying to get 5% to 10% shares -- or more -- of the global market for energy, which means massive deployment by 2050 (if not sooner).
Second, if you are in the kind of hurry we are all in, then you are going to have to take unusual measures to deploy technologies far more aggressively than has ever occurred historically. That is, speeding up the deployment side is much more important than generating new technologies. Why? Virtually every supply technology in history has a steadily declining cost curve, whereby greater volume leads to lower cost in a predictable fashion because of economies of scale and the manufacturing learning curve.
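That predictable volume-to-cost relationship can be sketched with the standard one-factor experience curve, in which each doubling of cumulative volume multiplies unit cost by a fixed "progress ratio." The $5/W starting cost and 80% progress ratio below are illustrative assumptions, not figures from this post:

```python
import math

def unit_cost(cum_volume, c0, x0, progress_ratio=0.8):
    """One-factor experience curve: each doubling of cumulative volume
    multiplies unit cost by the progress ratio (0.8 means a 20% drop)."""
    b = -math.log2(progress_ratio)          # learning exponent
    return c0 * (cum_volume / x0) ** (-b)

# Illustrative numbers: $5/W after the first 1 GW of cumulative
# production, 80% progress ratio. Each doubling cuts cost by 20%.
for doublings in range(5):
    x = 2 ** doublings                      # cumulative GW
    print(f"{x:2d} GW -> ${unit_cost(x, 5.0, 1.0):.2f}/W")
```

Under these assumptions, four doublings (1 GW to 16 GW) take the unit cost from $5.00/W down to about $2.05/W -- which is why early, sustained deployment matters so much more than the date of first invention.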
Why deployment now completely trumps research
How do we achieve rapid innovation in existing technologies, as Gates suggests he wants?
A major 2000 report by the International Energy Agency, Experience Curves for Energy Technology Policy has a whole bunch of experience curves for various energy technologies. Let me quote some key passages:
Wind power is an example of a technology which relies on technical components that have reached maturity in other technological fields... Experience curves for the total process of producing electricity from wind are considerably steeper than for wind turbines. Such experience curves reflect the learning in choosing sites for wind power, tailoring the turbines to the site, maintenance, power management, etc, which all are new activities.
Existing data show that experience curves provide a rational and systematic methodology to describe the historical development and performance of technologies...
The experience curve shows the investment necessary to make a technology, such as PV, competitive, but it does not forecast when the technology will break-even. The time of break-even depends on deployment rates, which the decision-maker can influence through policy. With historical annual growth rates of 15%, photovoltaic modules will reach break-even point around the year 2025. Doubling the rate of growth will move the break-even point 10 years ahead to 2015.
Investments will be needed for the ride down the experience curve, that is for the learning efforts which will bring prices to the break-even point. An indicator for the resources required for learning is the difference between actual price and break-even price, i.e., the additional costs for the technology compared with the cost of the same service from technologies which the market presently considers cost-efficient. We will refer to these additional costs as learning investments, which means that they are investments in learning to make the technology cost-efficient, after which they will be recovered as the technology continues to improve.
... for major technologies such as photovoltaics, wind power, biomass, or heat pumps, resources provided through the market dominate the learning investments. Government deployment programmes may still be needed to stimulate these investments. The government expenditures for these programmes will be included in the learning investments.
We are really in a race to get technologies into the learning curve phase: "The experience effect leads to a competition between technologies to take advantage of opportunities for learning provided by the market. To exploit the opportunity, the emerging and still too expensive technology also has to compete for learning investments."
In short, you need to get from first demonstration to commercial introduction as quickly as possible, so you can take advantage of the learning curve before your competition does. Again, that's why, if we want mass deployment by 2050, we are mostly stuck with what we have today or very soon will have. Some breakthrough TILT in the year 2025 will find it exceedingly difficult to compete with technologies like CSP or wind that have had decades of such learning.
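The IEA finding quoted earlier -- that doubling the deployment growth rate pulls PV's break-even point forward by roughly a decade -- follows from simple compound-growth arithmetic. A minimal sketch, assuming break-even requires roughly 32x today's cumulative volume (an illustrative figure of mine, not the IEA's):

```python
import math

def years_to_reach(multiple, growth):
    """Years for cumulative volume to grow by `multiple`, if it compounds
    at `growth` per year (a simplification of the IEA's deployment math)."""
    return math.log(multiple) / math.log(1 + growth)

# Assumed target: break-even at ~32x today's cumulative PV volume.
TARGET_MULTIPLE = 32
for g in (0.15, 0.30):
    print(f"{g:.0%} annual growth: ~{years_to_reach(TARGET_MULTIPLE, g):.0f} years to break-even")
```

With these assumptions, 15% annual growth reaches the target in about 25 years while 30% growth gets there in about 13 -- the same roughly-a-decade acceleration the IEA projected (2025 versus 2015).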
And that is why the analogy of a massive government Apollo program or Manhattan project is so flawed. Those programs were to create unique non-commercial products for a specialized customer with an unlimited budget. Throwing money at the problem was an obvious approach. To save a livable climate we need to create mass-market commercial products for lots of different customers who have limited budgets. That requires a completely different strategy.
The vast majority -- if not all -- of the wedge-sized solutions for 2050 will come from technologies that are now commercial or very soon will be. And federal policy must be designed with that understanding in mind. The IEA report concluded:
A general message to policy makers comes from the basic philosophy of the experience curve. Learning requires continuous action, and future opportunities are therefore strongly coupled to present activities. If we want cost-efficient, CO2-mitigation technologies available during the first decades of the new century, these technologies must be given the opportunity to learn in the current marketplace. Deferring decisions on deployment will risk lock-out of these technologies, i.e., lack of opportunities to learn will foreclose these options making them unavailable to the energy system....
... the low-cost path to CO2-stabilisation requires large investments in technology learning over the next decades. The learning investments are provided through market deployment of technologies not yet commercial, in order to reduce the cost of these technologies and make them competitive with conventional fossil-fuel technologies. Governments can use several policy instruments to ensure that market actors make the large-scale learning investments in environment-friendly technologies. Measures to encourage niche markets for new technologies are one of the most efficient ways for governments to provide learning opportunities. The learning investments are recovered as the new technologies mature, illustrating the long-range financing component of cost-efficient policies to reduce CO2 emissions. The time horizon for learning stretches over several decades, which require long-term, stable policies for energy technology.
Is Gates a hypocrite?
After Gates put out his first piece dissing energy efficiency and action, I wrote a very critical analysis. Afterwards, a couple of technologists wrote to point out how hypocritical Gates was to push innovation-through-big-government-R&D, given that he has long been touting innovation-through-deployment for his own industry.
As recently as two (!) years ago in a Carnegie Mellon speech, Gates argued:
But Paul Allen and I thought, okay, we'll do software. We'll build a platform, and encourage other people to write software. Now, there was an assumption there that we could get millions of machines out, because, after all, if you want to make it economic to spend tens of millions developing software, and sell it for $100 or so, you've really got to get that base out there.
But because we made that bet, and we got that going, it became a virtuous cycle. That is, as more machines would sell, it created the market for a broader range of software, and that further drove the market for the machines, and in fact that volume allowed the price of the machine to come down. And that's why from 1975 onward, that personal computer market actually not only became significant, it actually become the center of the entire computer industry.
The large machines we use today, and the big server farms, or corporate data servers, these are all based on the Windows PC architecture which, because of its volume, has come down in price, and improved in performance very, very dramatically. And so we have a large software industry.
One technologist (who wants to remain anonymous) wrote:
The man built his career on shipping "what we have now" and then improving it, using programmers paid out of the revenues gained from shipping not-quite-yet-ready product. Not one cent of Big Government R&D Breakthrough Command Economy directly flowed to Microsoft. To be fair, big government R&D did lead to things like the integrated circuit and the Internet, both of which had something to do with enabling Bill's fortune. His business strategy for his entire life was antithetical to the Lomborg nonsense "don't do anything until the Big Research Lab In The Sky Makes It Perfect."
The time to act -- to deploy -- is now.