CONDUCT OF THE STUDY

The committee began its work in June 2000. As envisioned by the statement of task, the committee first developed an analytic framework for assessing benefits. The committee reviewed a number of reports (see Appendix C) prepared by others over the years evaluating DOE’s R&D program. Unlike most of those earlier evaluations, the charge for this project focuses attention on assessing the actual outcomes of DOE’s energy R&D programs. The committee therefore elected to take a case-study, data-intensive approach, recognizing that time and resource constraints would prevent it from resolving every analytic issue and closing all the gaps in data that ideally would be needed to implement the analytic framework.

Because of these constraints, the committee identified a representative sample of programs and projects as a basis for arriving at overall findings and recommendations. As outlined in the discussion of the task statement in Chapter 1, this selection was designed both to identify lessons learned from the range of programs conducted by DOE and to evaluate the utility of the analytic framework in a diversity of circumstances.

The committee then asked the Office of Fossil Energy and the Office of Energy Efficiency and Renewable Energy at DOE to provide information required by the framework, and to do so following the detailed procedures specified in Appendix D. Both the framework and the procedures are essential parts of the methodology developed by the committee. Both offices supplied a great deal of statistical and analytic information in response to the committee’s request. Much of the data provided had to be developed specifically for this study. Because the programs changed over time, the task of documenting programs as far back as 1978 was at times extremely challenging.

Each of the 39 case studies was assigned to a committee member for analysis. With the help of an independent consultant, committee members assessed the DOE submissions for quality and conformance to the analytic methods prescribed by the committee. Considerable iteration and correction took place in this process to ensure that the committee’s procedures were followed. As the study proceeded, the framework was refined. The cooperation of DOE staff in this process was exemplary, and it is gratefully acknowledged.

The committee met as a whole and in subgroups to ensure that the analytic process was being applied consistently across all of the case studies. In addition, considerable attention was paid to the use of common assumptions, designed to promote comparability of results across case studies as well as conservatism in the valuing of benefits. One such assumption is embodied in the 5-year rule, which assumes that the technology would have entered the market 5 years later without government involvement. For example, if a technology entered the market with DOE involvement in 1992, the 5-year rule assumes it would have reached the market in 1997 without a government program. Another assumption is the 2005 rule, under which the committee counted benefits only for technology installed in the market by 2005, assessing those benefits over the technology’s useful economic life. The year 2005 was used because the committee was reasonably sure of economic and other conditions up to that time and did not want to project further because of the uncertainties involved.
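The interplay of these two conventions can be illustrated with a minimal sketch. The sketch below is purely illustrative and is not part of the committee’s formal procedures: the function names, the example installation data, and the simplified treatment of the acceleration window as the years credited to DOE are assumptions made for exposition only.

```python
# Illustrative sketch of the two accounting conventions (assumptions, not DOE data).

ACCELERATION_YEARS = 5   # 5-year rule: DOE assumed to advance market entry by 5 years
CUTOFF_YEAR = 2005       # 2005 rule: count only technology installed by 2005

def attributable_benefit_years(entry_year_with_doe: int) -> range:
    """Years in which benefits are credited to the DOE program under the
    5-year rule: the window between actual market entry and the assumed
    entry date (5 years later) absent government involvement."""
    counterfactual_entry = entry_year_with_doe + ACCELERATION_YEARS
    return range(entry_year_with_doe, counterfactual_entry)

def counted_installations(installations: dict[int, float]) -> dict[int, float]:
    """Apply the 2005 rule: only units installed in or before 2005 are
    credited, although their benefits accrue over their useful economic life."""
    return {year: qty for year, qty in installations.items() if year <= CUTOFF_YEAR}

# Worked example from the text: entry with DOE in 1992, assumed entry
# without DOE in 1997, so 1992-1996 are the years of accelerated benefit.
print(list(attributable_benefit_years(1992)))   # [1992, 1993, 1994, 1995, 1996]
print(counted_installations({2003: 10.0, 2005: 25.0, 2007: 40.0}))  # drops 2007
```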

As part of its deliberations, the committee invited members of government, industry, and public interest groups to comment on the goals, performance, and effectiveness of the relevant DOE research and development programs over the period of interest. Appendix B lists the formal comments received during the course of the project. In analyzing the case studies, the committee also directly contacted other representatives of industries that participated with DOE in the case study programs to secure their views on the value of the research and DOE’s role in it.

In these ways, the committee attempted to be conservative in the judgments it drew from the available data. While much more can and should be done to refine the methodology launched with this study, the committee believes the methodology has advanced far enough to support with confidence the findings and recommendations included in this report.

ASSESSMENT OF THE METHODOLOGY

The committee considers the analytic methodology described in this chapter to be useful as an internally consistent and comprehensive framework for objectively comparing the benefits and costs of energy R&D across programs and technologies. This opinion is based on the actual application of the methodology in the 39 case studies of diverse technologies. In the course of this experience, however, the committee identified a number of lessons bearing on the methodology’s implications and future utility.

To provide perspective on the more detailed analyses that follow, as well as to suggest directions for improvement, several of the lessons learned are discussed here:

  • Specifying categories of benefits by means of systematic analysis is a useful discipline. In particular, benefit evaluation must take care to give adequate weight to benefits other than realized economic benefits (the upper left corner cell of the matrix). Quantifying realized economic benefits is usually easier than quantifying the kinds of benefits that fit in the eight other cells, and the temptation is great to focus on these easily quantified benefits. But, as the committee has noted, environmental and security benefits, while harder to value in dollar terms, are equally important objectives of public funding. Similarly, creating options in the face of future oil price changes and acquiring knowledge that can be


