and Loeb identified breakthrough Air Force capabilities spawned by basic research.6 Coffey and coauthors describe the methods and challenges involved in tracing selected technologies from basic research through their significant exploitation.7

The Department of Energy (DOE) conducted a similar retrospective analysis, with the assistance of the NRC, examining the impacts of R&D programs executed by the DOE laboratories from 1978 to 2000 on energy-producing and energy-using industries. The report summarizing the findings makes the case, in economic terms, for a return on investment that by itself could justify funding the research, while recognizing that societal impacts are far more difficult to measure and are not readily quantifiable.8

A notable example of after-the-fact impact analysis is the series of economic impact analyses performed by NIST. Stimulated in part by concerns of many in Congress that a program such as the NIST Advanced Technology Program (ATP) was inappropriate for government and that the R&D carried out under its aegis would be better left to industry, NIST launched the ATP in the late 1980s with a coordinated plan to measure economic impact. Using outside expertise and a well-articulated process, NIST applied this impact analysis to all funded activities and made the resulting record available to all interested parties. A summary of the first 10 years of ATP funding, published in 2003, describes the methodologies used and the results of their application across many industrial sectors. NIST has since applied such organized studies to many of its laboratory-based efforts, and the historical record of accomplishment provides a strong justification for future work in the areas whose impacts have been assessed.9

SUMMARY OF FINDINGS

Measuring the impact of R&D activities is the most subjective aspect of assessment and is ill suited to quantitative measures. Transitions are measurable at every level of R&D (see Figure 2-1). The impact of research can be measured only after the fact. Near-term impacts, including many transitions during R&D phases, require looking back at the recent past but can be monitored for most of the R&D effort. Long-term impacts require deeper historical probes and are more likely to be assessed for only a few notable examples. In both cases, organized processes for gathering and analyzing the data require management attention and designated leadership. Developing the data and presenting them in a form useful to the intended audience is a job for professionals. The R&D organization will benefit from having an appropriately supported historian and internal reporting requirements to ensure the utility of the process.

____________________________

6 W. Berry and C. Loeb, 2007. Breakthrough Air Force Capabilities Spawned by Basic Research. National Defense University, Washington, D.C.

7 T. Coffey, J. Dahlburg, and E. Zimet, 2005. The S&T Innovation Conundrum. National Defense University, Washington, D.C.

8 National Research Council, 2001. Energy Research at DOE: Was It Worth It? Energy Efficiency and Fossil Energy Research 1978 to 2000. National Academy Press, Washington, D.C.

9 R. Ruegg and I. Feller, 2003. Evaluating Public R&D Investment: Models, Methods, and Findings from ATP's First Decade. National Institute of Standards and Technology, Gaithersburg, Md.


