
1
Introduction

Models are fundamental tools for estimating the costs and the effectiveness of different policies for reducing greenhouse gas (GHG) emissions. The wide array of models for performing such analysis differ in the level of technological detail, treatment of technological progress, spatial and sector details, and representation of the interactions between the energy sector and the overall economy and environment. These differences affect model results, including cost estimates. More fundamentally, these models differ as to how they represent basic processes that have a large impact on policy analysis—such as technological learning and cost reductions that come through increasing production volumes—or how they represent baseline conditions. Critical to the development of the federal climate change research and development (R&D) portfolio are reliable estimates of the costs and other potential impacts on the U.S. economy of various strategies for reducing and mitigating greenhouse gas emissions. Thus, at the request of the U.S. Department of Energy (DOE), the National Research Council (NRC) organized a workshop to consider some of these types of modeling issues.

A planning committee was appointed by the NRC to organize the workshop and moderate discussions. John Weyant (Stanford University), Marilyn Brown (Georgia Institute of Technology), William Nordhaus (Yale University), Karen Palmer (Resources for the Future), Rich Richels (Electric Power Research Institute), and Steve Smith (Pacific Northwest National Laboratory) worked with NRC staff to organize the 2-day event in Washington, D.C. The planning committee structured the workshop as four major sessions that addressed specific issues of interest to the modeling and policy communities: (1) Uses and Abuses of Bottom-Up Marginal Abatement Supply Curves; (2) Uses and Abuses of Learning, Experience, Knowledge Curves; (3) Offsets—What’s Assumed, What Is Known/Not Known, and What Difference They Make; and (4) Story lines, Scenarios, and the Limits of Long-Term Socio-Techno-Economic Forecasting.

The workshop opened with introductory remarks and an overview from John Weyant, the chair of the NRC planning committee and director of Stanford’s Energy Modeling Forum. Richard Duke, the Department of Energy’s deputy assistant secretary for climate change policy, and Richard Newell, administrator of the Energy Information Administration (EIA), provided the perspective of the sponsoring agency (DOE) and the EIA, respectively, on the topics of this workshop.

John Weyant opened with a reminder that this was the second NRC workshop sponsored by the DOE’s Office of Policy and International Affairs on the modeling of greenhouse gas mitigation. The previous such workshop took place on October 2-3, 2008, and a summary of that workshop was released in 2009 (NRC, 2009). The goal of the earlier workshop was to cover a broad range of issues associated with making greenhouse gas mitigation cost projections and, specifically, to identify gaps in the underlying economic research and modeling. The current workshop, as Weyant described it, aimed to focus on a limited number of key analytic challenges that emerged from the first workshop. Weyant pointed out the extensive ties to the first workshop: the planning group chair for that event was Richard Newell, one of the introductory keynote speakers for the second workshop. Marilyn Brown, John Weyant, and William Nordhaus also served on the planning committee for, or as a speaker at, each workshop.

Richard Duke followed Weyant with a discussion of the motivation for the present workshop. After underscoring how much Secretary Steven Chu had hoped to deliver the welcoming remarks himself, Duke provided some thoughts on the agenda from the perspective of someone with experience with both abatement supply curves and learning curves, as well as someone involved in climate policy at DOE. He noted that, when attempting to model the long-term energy system transformations necessary to address climate change, it is important to try to capture speculative technology changes, yet this is very difficult to do. He mentioned the potential for insights through marginal abatement supply curves, but also noted that these curves contain hidden assumptions that are fundamental to their construction. He noted the importance of the offsets and story line issues discussed in the final session. Duke finished with a description of some recent legislative and international initiatives to address climate change, including Secretary Chu’s international outreach activities.

Richard Newell followed with remarks intended to set the stage for the rest of the workshop. Newell noted that he was the chair of the planning committee that put together the first workshop in this series.
Newell also noted that the EIA’s analyses and forecasts are independent of DOE and that his views should not be construed as representing those of DOE or the Administration. He began his talk by framing two major considerations in the economic modeling of greenhouse gas mitigation. The first is establishing a baseline picture of what the future may look like without any particular greenhouse gas policy. Newell pointed out that the baseline provides a counterfactual description of the future in the absence of some policy, but that the baseline itself is subject to considerable economic, technological, and policy uncertainty. The baseline is not nearly as pure as is often imagined in textbooks and includes a significant number of technology, economic, and policy assumptions. Second, in estimating the nature of a future with greenhouse gas policies, the interest of policymakers extends beyond the allowance prices for carbon, impacts on gross domestic product, or the total cost of the policy, to much more detailed impacts as well, such as the production and consumption of specific fuels, the level of deployment of specific technologies, emission levels, and other sectoral and regional impacts. Additionally, he noted that, although modelers want to understand the effect of policy relative to the baseline, it is important to remember that many people in the world do not think in those terms. They are interested instead, for example, in what the trajectory of natural gas prices and use will be with climate policy, not in how the trajectory of both changes as one moves from the baseline to the policy case. Newell cautioned that these kinds of demands emerging from the policy process need to be kept in mind when models are being developed. Modelers need to be conscious that, just because certain categories of results are desired, it does not necessarily mean that such results can always be provided.
Newell then went on to provide some thoughts on the four topics of the workshop and how they relate to baseline energy-economic modeling as well as policy analysis against the baseline. First, with bottom-up marginal abatement supply curves, Newell reminded the workshop audience of the long-running debate attempting to reconcile the large technical potential for reduction of energy use and emissions through energy efficiency with the relatively low acceptance of these technologies in the marketplace. There is an ongoing discourse about the extent to which this lack of acceptance of energy-efficient technologies is explainable by real-world costs and benefits, or whether it is attributable to market imperfections owing to principal-agent problems or imperfect information. There is also the possibility of inconsistent behavior on the part of households and firms, namely that they do not minimize costs as often as is assumed in economic models.

With regard to learning curves, Newell noted that there is a strong empirical observation of technical learning, as indicated by the relationship between cumulative production experience and manufacturing cost reductions. This relationship is a key feature of the process of technological change that comes up in almost every conversation with industry representatives, and thus appeared to Newell, as to most people, to be a real phenomenon. One of the modeling issues associated with learning curves is the potential for double counting, for example, including cost reductions associated with cumulative production experience and with increasing R&D expenditures separately in a model. Another learning curve issue is the selective incorporation of learning, including learning-related cost reductions for some technologies but not others.

On the third topic, the role of offsets in greenhouse gas modeling, the word Newell used to characterize the issue was “huge.” Newell used the example of EIA’s analysis of H.R. 2454 (the American Clean Energy and Security Act of 2009, or simply the Waxman-Markey bill), passed by the U.S. House of Representatives in the summer of 2009. In that analysis, offsets constitute up to 78 percent of cumulative abatement through 2030. If one limits offsets, the allowance price increases by more than 60 percent, all else held constant. Offsets were one of two key sensitivities that EIA found in its analysis (the other was the cost and availability of options for generating electricity with low or no greenhouse gas emissions).

Finally, with regard to the issue of story lines, Newell noted that model projections are not meant to be an exact prediction of the future, but rather a representation (a story line) of a plausible energy future given current technological, demographic, and economic trends and what is assumed about current laws, regulations, and consumer behavior. These assumptions and projections, though, are highly uncertain, given that they are subject to many events that cannot be foreseen, such as energy supply disruptions, policy changes, and technological breakthroughs. The differences between various story lines can often be more useful to examine than the results of any individual policy case. But there is often considerable debate about even the direction of the effect of an individual factor, such as whether a particular policy initiative or behavioral trend will be a positive or a negative, impose a net cost or a benefit, lead to an increase or a decrease in emissions, or result in increased or decreased use of a particular technology.
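The points about bottom-up marginal abatement supply curves can be made concrete with a minimal construction: abatement measures are ranked by cost per ton and accumulated into a step curve. The measures, costs, and potentials below are hypothetical, invented purely for illustration; real curves embed exactly the kinds of hidden baseline and adoption assumptions discussed in the workshop.

```python
# Hypothetical abatement measures: (name, cost in $/tCO2, potential in MtCO2/yr).
# Negative costs represent efficiency measures that appear profitable on paper,
# the contested "below-zero" portion of bottom-up curves.
measures = [
    ("building retrofits", -20.0, 150.0),
    ("industrial efficiency", -5.0, 100.0),
    ("wind power", 25.0, 300.0),
    ("carbon capture", 60.0, 200.0),
]

def marginal_abatement_curve(measures):
    """Sort measures by unit cost and return (cumulative abatement, cost) steps."""
    steps, cumulative = [], 0.0
    for name, cost, potential in sorted(measures, key=lambda m: m[1]):
        cumulative += potential
        steps.append((cumulative, cost))
    return steps

def marginal_cost(measures, target_abatement):
    """Cost of the most expensive measure needed to reach an abatement target."""
    for cumulative, cost in marginal_abatement_curve(measures):
        if cumulative >= target_abatement:
            return cost
    raise ValueError("target exceeds total abatement potential")

print(marginal_cost(measures, 400.0))  # reaching 400 MtCO2/yr requires the wind step -> 25.0
```

Everything here hinges on the assumed costs, potentials, and implicit baseline behavior; changing any of those reshapes the curve, which is the sense in which such curves carry hidden assumptions.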
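The learning-curve relationship Newell described is commonly modeled as a power law in cumulative production, often called an experience curve, in which each doubling of cumulative output reduces unit cost by a fixed fraction. The sketch below is illustrative only and is not drawn from the workshop; the 20 percent learning rate and the starting cost and volume are hypothetical assumptions.

```python
import math

def experience_curve_cost(cumulative_units, initial_cost, initial_units, learning_rate):
    """Unit cost after cumulative production, under a power-law experience curve.

    Each doubling of cumulative production cuts unit cost by `learning_rate`
    (0.20 means a 20% reduction per doubling, a hypothetical value).
    """
    # Power-law exponent: b = log2(1 - learning_rate), so cost scales as x**b.
    b = math.log2(1.0 - learning_rate)
    return initial_cost * (cumulative_units / initial_units) ** b

# Hypothetical technology: $1,000/unit at 1,000 cumulative units, 20% learning rate.
c0, x0, lr = 1000.0, 1000.0, 0.20
print(experience_curve_cost(2000.0, c0, x0, lr))  # one doubling: about 800
print(experience_curve_cost(4000.0, c0, x0, lr))  # two doublings: about 640
```

The double-counting issue Newell raised arises when a model applies a reduction like this and separately credits R&D-driven cost declines for the same technology, attributing the same cost improvement to two causes.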