Panel II: ATP's Assessment Program

INTRODUCTION

David Goldston

Office of Congressman Sherwood Boehlert

David Goldston introduced Rosalie Ruegg, Director of the Economic Assessment Office of the ATP, who would provide an overview of the ATP assessment program, and Irwin Feller, who would comment on the utility of economic assessment in federal programs such as the ATP.

DELIVERING PUBLIC BENEFITS WITH PRIVATE-SECTOR EFFICIENCY THROUGH THE ATP

Rosalie Ruegg 6

The Advanced Technology Program

In her opening remarks, Ms. Ruegg emphasized that the ATP program is led by private industry and that the cost of its awards is matched by direct industry contributions. “By insisting on cost sharing, we keep the program anchored in the market economy, focused on efficiency and the bottom line. At the same time, the program's selection criteria ensure the funding of highly enabling technologies.”

For a more detailed discussion of the ATP's economic assessment methods, she recommended the earlier National Research Council (NRC) report summarizing the STEP Board's first meeting on the ATP.



6 Rosalie Ruegg retired from the Advanced Technology Program in April 2000.

Continuing, she pointed out that the program and its evaluation continue to evolve. The 1990s were a decade of innovation both in ATP operations and in program evaluation. Program funding expanded early in the decade, under both the Bush and Clinton administrations—albeit much more rapidly under the latter. Funding then entered a period of uncertainty and decline. 7

Large and Small Business, Universities, and Laboratories

From 1990 through 1999, the ATP co-funded 468 projects, with 1,067 participants and another 1,027 subcontractors. More than half of the projects are led by small businesses. More than 145 universities and more than 20 national laboratories have participated. 8 Funding of these projects by ATP and industry has totaled about $3 billion, with each contributing about half. These projects have seeded innovations that are leading to broad benefits for the nation.

Direct and Indirect Paths to Impact

ATP cost sharing affects the economy through new technical capabilities that enable new and better ways of doing things, generating productivity gains, new business opportunities, employment benefits, solutions to a wide variety of problems, and, more generally, increases in the nation's standard of living and quality of life. These contributions are achieved by both direct and indirect paths.

Direct Returns

The direct path is particularly significant because it is the path along which the ATP can specifically encourage U.S. businesses to accelerate the development and use of new technologies. It includes both private returns to companies directly involved in the ATP-funded projects (productivity gains, new business opportunities, and so on) and the benefits to their customers of better products and lower costs. The customers typically realize benefits in excess of what they must pay, and this uncompensated benefit from publicly funded R&D is known as a “market spillover.” Private returns and market spillovers constitute part of the social return of the technology developed by a project.

Indirect Returns

The indirect path tends to be slower and less amenable to planning, but is no less important. It involves the take-up of the knowledge generated by a project by others outside the project who have not directly contributed to the investment cost. Even if a project's participating companies fail to commercialize their project's technology—even if David Morgenthaler's jockey falls off the horse, we still have the horse—indirect impacts may nevertheless be realized as the knowledge is acquired and exploited by others. All of the indirect impacts can be considered spillovers from the original R&D.

7 See the Introduction for ATP appropriations.

8 Through the end of 2000, the ATP funded 522 projects with 1,162 participants and an approximately equal number of subcontractors. Through that time, 176 universities had participated.

The complete social return of an ATP project is the net result of the combination of direct- and indirect-path effects—private returns to the company from the project, market-spillover benefits to that company's customers, and a variety of indirect benefits to other companies and to their customers in turn. Assessing these impacts is a challenging task indeed.

Examples of Direct Benefits

The distinction between direct and indirect benefits of ATP projects can be appreciated by considering two examples of completed projects that have been assessed by the ATP. 9

Illinois Superconductor Corporation. To illustrate the kinds of impacts that occur along the direct path, consider the Illinois Superconductor Corporation of Mt. Prospect, Illinois, founded in 1990 with eight employees. It was awarded a $1.98 million ATP grant in 1992 (matching the grant with its own $1.56 million) to develop a novel high-temperature superconducting thick-film materials technology. The company's target application was improving signal transmission in cellular telephone networks. The three-year project was successful, and the resulting technology reduces the number of towers needed to cover a given area by 40 percent. The company has made an initial public offering of its shares, built a production plant, and is producing products based on the ATP work. By 1997, when the ATP reviewed the project, the company had 75 employees. Cell phone users see lower costs, reflecting the lower costs to the phone companies of these higher-powered but sparser networks of towers.

Indirect benefits from the wider circulation of knowledge about the project are hard to measure, although the aesthetic benefits of fewer towers are certainly persuasive to many property owners. One measurable indicator is the number of patents that cite earlier patents developed in the course of ATP projects. In the Illinois Superconductor case, we have identified several such patents already.

Aastrom Biosciences. Aastrom Biosciences, Inc., was a start-up in 1991, when it was funded by the ATP to develop its stem cell expansion technology. This is a new approach to bone marrow transplantation for cancer treatment, which may also have other significant medical applications. The ATP provided $1.2 million, and the company another $1.5 million, to fund the project.

9 William F. Long, Advanced Technology Program: Performance of Completed Projects (Status Report Number 1), NIST Special Publication 950-1, March 1999.

The new technology—which allows a small sample of stem cells to be removed from a donor and then “grown” in a tabletop apparatus—represents a dramatic improvement over the best alternatives. It reduces the number of visits a donor must make by 75 percent. For medical staff, the benefit is the simplicity of the process, which reduces training requirements. For patients, the process is less painful, with fewer side effects and better medical outcomes. The cost per patient treated is lower as well.

The technology is now in clinical trials. According to recent news, two terminal cancer patients who needed bone marrow transplantation were unable to find a suitable donor. Tiny samples of matching stem cells, however, were found in an umbilical cord blood bank, and the new technology allowed the small samples to be expanded enough to enable treatment of these patients, who otherwise could not have been treated.

In other cases ATP-funded companies have developed their technologies, received patents, published results, and then opted out of their businesses. In other words, there is no impact on the direct path. But the projects may yet have indirect impacts, because others have shown interest in those patents and publications, or researchers involved in the projects have taken their knowledge and skills elsewhere.

The ATP's Multi-component Assessment Program

Evaluation is complex because the paths that the technologies travel are complex. The evaluation tools that we use represent reasonable attempts to get at both direct and indirect impacts. Our current approaches include

  • statistical profiling of applicants, projects, participants, and technologies;

  • progress tracking of all projects and participants (through a business reporting system and other surveys);

  • status reports for all completed projects;

  • detailed microeconomic case studies of selected projects and programs;

  • econometric and statistical studies of innovation, productivity, and portfolio impacts;

  • limited use of macroeconomic analysis for selected projects and special issue studies; and

  • development and testing of new assessment models and tools.

Positive Results Overall

These evaluation studies are designed, conducted, and managed by a team of economists on the staff of the ATP's Economic Assessment Office. They are aided in this work by experts from the National Bureau of Economic Research (NBER), a number of university centers with relevant specialties, and consulting universities, institutions, and individuals.

In closing, I would like to review the results of our assessments of the first 50 ATP projects that were completed. A positive feature of the ATP is that we are willing to look at failures as well as successes. Our data shows that

  • 72 percent completed all of their research;

  • 52 percent published technical results;

  • 54 percent were awarded patents;

  • 16 percent received prestigious awards from organizations outside the ATP;

  • 60 percent had incorporated their technologies in products that were on the market;

  • 80 percent either had products on the market or expected them shortly;

  • 90 percent had identifiable outputs of either knowledge (representing the indirect path) or products (the direct path);

  • 52 percent had both outputs of knowledge and products; and

  • about 25 percent were judged by ATP to have strong outlooks—with potential economic impacts of billions of dollars each. About an equal number were considered to have weak outlooks, and the remaining half were assigned medium outlooks.

Terminated Projects

One of the features of the ATP that makes it stand out among federal programs is that it terminates projects that are not working. We have carefully tabulated the reasons for terminating 40 of the 468 projects funded, as of April 2000. In 44 percent of these cases the companies or joint ventures asked that the projects be terminated because they had changed their strategic goals or their organizational structures, or because the markets or other factors had changed. Four of them (10 percent) cited financial distress; in other words, they ran out of ATP funds before they met their goals. Another 13 percent of the terminations involved joint ventures that failed to reach agreement, and so did not start their projects. Another fifth of the terminations were caused by lack of technical progress. About 5 percent were due to early success. And 8 percent were canceled by the ATP because they no longer met the ATP criteria (of high risk, for example).
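These percentages can be reconciled with the underlying project counts. The sketch below is illustrative only: the counts are approximations implied by the stated shares of the 40 terminated projects, and the category labels are shorthand rather than ATP's official terminology.

```python
# Illustrative reconstruction of the termination statistics cited above.
# Counts are approximations implied by the stated percentages of the 40
# terminated projects (out of 468 funded, as of April 2000); the labels
# are shorthand, not official ATP categories.

terminated_total = 40
funded_total = 468

reasons = {
    "requested by the company or joint venture": 18,  # ~44 percent
    "financial distress": 4,                           # 10 percent (stated as four projects)
    "joint venture failed to reach agreement": 5,      # ~13 percent
    "lack of technical progress": 8,                   # ~20 percent ("another fifth")
    "early success": 2,                                # ~5 percent
    "no longer met ATP criteria": 3,                   # ~8 percent
}

# The assumed counts must account for every terminated project.
assert sum(reasons.values()) == terminated_total

for reason, count in reasons.items():
    share = 100 * count / terminated_total
    print(f"{reason}: {count} projects ({share:.0f}% of terminations)")

overall_rate = 100 * terminated_total / funded_total
print(f"Overall termination rate: {overall_rate:.1f}% of funded projects")
```

The overall rate computed here, roughly 8.5 percent of funded projects, matches the termination rate Mr. Turner cites later in the discussion.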

Early Program Results

Thus far, the ATP program shows some encouraging results that are worth reviewing here.

Leapfrog Technologies. First, ATP projects have developed many “leapfrog” technologies: either brand new solutions (37 percent of the identified applications) or dramatic improvements in cost or performance (63 percent).

Platform Technologies. The projects have yielded many rich technology platforms with multiple uses. Most projects have produced multiple applications of their technology, and many of the technologies have won highly sought-after prizes.

Increased Collaboration. The emphasis on collaboration among companies, universities, and non-profit organizations has been strong. Of the 468 projects, 157 (more than one-third) have involved joint ventures. Most of the single applicants have taken advantage of alliances and subcontractors. The ATP involves a rich and productive mixture of large and small companies, universities, national labs, and others.

Accelerated R&D. High-risk R&D has been accelerated by this program; 86 percent of the projects are said to be farther ahead in the R&D cycle than they would be if they had not been funded.

The estimated public benefits of several projects alone exceed the total costs of the ATP program. This estimate is inherently conservative, since it measures only a part of the ATP portfolio of projects and only their directly measurable impacts.

Distinguishing Features of ATP

In concluding her presentation, Ms. Ruegg outlined the characteristics of the ATP that distinguish it from other public and private technology programs:

Innovation with National Benefits

  • emphasis on innovation for broad national economic benefit;

  • focus on enabling technologies with high spillover potential; and

  • goal of overcoming difficult research challenges.

Industry Leadership and Competitive Review

  • industry leadership in planning and implementing projects;

  • project selection based on technical and economic merit; and

  • project selection rigorously competitive, based on peer review.

Collaboration, Follow-through, and Sunset Provisions

  • encouragement of company-university-laboratory collaboration;

  • positioned after basic science and before product development;

  • requirement that projects have well-defined goals, and sunset provisions;

  • requirement that selection boards demonstrate the project's need for ATP funding;

  • coordination with other public and private funding sources; and

  • U.S. companies planning and organizing for technology applications.

Evaluation

The importance the ATP places on the evaluation of impacts and potential impacts at every stage of the process is worth stressing. The ATP's Business Reporting System tracks progress during and after the performance of each project, assessing the project's goals and expected commercial advantage, its strategies for commercialization, and the collaborative activities and experiences of its members. Each project is also evaluated in terms of the effect of ATP on the project's timing, scale, scope, risk level, ability to do long-term R&D, and ability to attract private investment dollars. After the project is finished, the ATP Assessment Office follows up with studies of progress in commercialization and knowledge dissemination, and of the identities of customers and competitors.

PERSPECTIVES ON PROGRAM EVALUATION AT THE ADVANCED TECHNOLOGY PROGRAM

Irwin Feller

Pennsylvania State University

Economists, according to Dr. Feller, are said to have an irrational passion for dispassionate reality. This leads them by training and socialization to endorse evaluation and assessment. But a dispassionately rational stance also leads one to dispassionately scrutinize the impact of dispassionate rationality on public policy making, which has often been characterized by passionate irrationality.

Today's comments focus on three aspects of the evaluation of the ATP:

  • technical aspects of the evaluation process itself;

  • use of the findings of that evaluation within the ATP; and

  • use of the findings of evaluation outside of the ATP.

The Evaluation Process

As an economist with an interest in the economics of R&D and in technology transfer in the private and public sectors, I have been involved in recent years in various program evaluations of federal and state technology development programs.

A Favorable Comparison

In the context of the national science and technology initiatives of the past twenty years—including the Bayh-Dole University and Small Business Patent Procedures Act of 1980; the creation of cooperative research and development agreements (CRADAs) under the Federal Technology Transfer Act of 1986; various forms of federal and state cooperative university-industry R&D programs; and the Small Business Innovation Research (SBIR) program under the Small Business Innovation Development Act of 1982—the ATP's commitment to assessment and its technical approach to it compare quite favorably.

The ATP's assessment activities

  • began early;

  • draw on the work of leading scholars in the field;

  • address substantive and complex conceptual and empirical issues;

  • use multiple methodologies;

  • address both obvious and subtle dimensions of impact; and

  • are widely disseminated for external review.

Advancing the Art of Assessment

In keeping with its original objectives—funding what we once called “precompetitive” or “generic” technologies and now call “platform” or “enabling” technologies—the ATP has gone beyond other programs' efforts to measure direct benefits by also trying to measure indirect or “spillover” benefits. Measuring these impacts is a difficult, if not heroic, task. The ATP's assessment techniques are at the state of the art and in many ways have advanced it.

The ATP's Assessment Program

A Credible Case that the Program Works

What are the impacts of evaluation on the ATP's procedures? The assessment program has provided a credible case that the program works—that it is proceeding according to design and producing measurable economic and technical benefits. The question immediately arises, however, whether the products of the ATP Assessment Office have had impact on the program's operation or on the world outside the ATP—Congress, for example.

How Effectively Does the ATP Use the Results of Evaluation?

Evaluators generally use the terms formative and summative to describe evaluations of a program; the former is directed at improving a program's performance, and the latter at assessing its overall results. Summative evaluations generally are for experimental programs, where decisions must be made about expanding, continuing, or terminating a program. To quote Adam Jaffe, “We know enough about spillover prediction and measurement to improve the ATP's project selection and evaluation of outcomes using more systematic and explicit treatment.” 10

As an outside observer, I do not know, however, if the ATP incorporates findings from its assessments in program activities. I do know, from my work with other programs and agencies, that the record of use is mixed. Sometimes the information is ignored. Sometimes it is rejected for good reason. Sometimes it is ignored or rejected for other than good reason. It would be very useful to ask the researchers who have evaluated the ATP how effectively their work has been used.

Public Dissemination of Findings

One of the desirable traits of the ATP assessment program is that its findings have been widely published. This public airing of findings is in contrast to many other programs which may use evaluations for internal purposes but do not release the results. These internal assessments produce a fugitive literature at best, with no true accountability. Again, the ATP is performing admirably in this respect.

What Are the Impacts of the ATP's Evaluation on Outside Decision Makers?

Tolstoy once observed, “Doing good will not make you happy but doing bad will surely make you unhappy.” An evaluator's corollary is “A good evaluation showing bad results will surely kill a program, but an evaluation that shows good results may not save the program.” One might add that a good evaluation may not kill a bad program.

In discussing the impacts of assessments on the political environment for the ATP, I draw on the work of Paul Hallacher at Penn State. 11 He has contrasted the political histories and current status of the Manufacturing Extension Partnership (MEP) program and the ATP. Each began about the same time, not from grassroots movements but from policy networks. MEP has since developed a strong political base. While its funding may be questioned in Congress, it has become an accepted part of the federal agenda; the ATP has not.

Why is This?

What are the differences between the programs that have led to these different outcomes? One major factor is that MEP has involved state matching funds and therefore greater buy-in from the states. It also has a broader geographical spread. MEP has reliable support in most but not all state capitals. It also partakes of the cachet of small- and medium-sized business and the strong political support this generates. The ATP, by way of contrast, still confronts charges of “corporate welfare.”

10 Adam B. Jaffe, “The importance of ‘spillovers' in the policy mission of the Advanced Technology Program,” Journal of Technology Transfer, 23(2):11-19, 1997.

11 P. M. Hallacher, Effects of Policy Subsystem Structure on Policymaking: The Case of the Advanced Technology Program and the Manufacturing Extension Partnership, University Park, PA: The Pennsylvania State University, 2000.

Institutionalizing the Program: Sprints vs. Marathons

We have heard much about horse races this morning. I'm uncomfortable with these analogies (having never sat on a horse until quite recently). Instead, having done some running, I would liken the ATP to a middle-distance runner. Evaluations, though, are often like sprints, done under a tight schedule to meet some congressional or other funding deadline.

But the race is really a marathon. The winner is determined by who tires first—the program's opponents or its proponents. The challenge is to convert ATP—documented impact by documented impact—into a credible, institutionalized part of the federal science and technology apparatus and slowly defuse the ideological objections of its opponents.

The ATP assessment program is a model for other U.S. technology programs. As a dispassionate rationalist, I would like to believe that over time, U.S. policymakers would find these assessments persuasive.

DISCUSSANTS

Mr. Goldston remarked that he was neither a horseman nor a runner but was willing to carry forward the sporting analogies: The congressional approach is to view the race as both a marathon and a relay race—and one in which no one knows what happened in the previous lap. He then introduced Dr. Nicholas Vonortas of George Washington University and Jim Turner of the minority staff of the House Science Committee.

Nicholas Vonortas

George Washington University

In opening his remarks, Dr. Vonortas asked whether “we might have a business model in this room,” asking next, “Why are such meetings not being broadcast to the real decision makers in Congress?” He noted that he is one of a small group of economists who, as undergraduates in the mid-1970s, became interested in technological change. At that time, we suddenly developed an urgent need for new (energy-related) technologies. The available literature was of two kinds. First was mainstream economics, with a few people paying attention to technology. Second was the literature in development economics, fairly disassociated from mainstream economics, dealing with very important questions related to technology and economic development. The business literature was still disconnected from economics proper and looked down upon by economists.

Then in the 1980s the United States learned that it had a major problem of “competitiveness.” With relatively few exceptions, economists in industrialized countries had just started paying attention to questions of technology, shifting comparative advantage, and competitiveness. Their understanding of the field was still very rudimentary. Policy makers relied on this not-very-well-developed literature for guidance on setting up programs such as the ATP. 12 This movement produced a series of significant new laws, such as the National Cooperative Research Act of 1984, which eased antitrust review of cooperative research by reducing civil penalties and raising the standard of proof.

To me the associated policy deliberation sounded simplistic. American policy makers said, “The Japanese are doing it, so we'll do it, too.” Some economists were vaguely aware that there was something called “spillovers.”

In 1990, I landed a few blocks away from the Academy, at George Washington University, in the Department of Economics and in a graduate program on science and technology policy. That was the first year of ATP funding. Since then both the ATP and I have grown together. The ATP has been bold in its assessment program and has allowed the two literatures of economics and business to merge. Economists now understand a little better the direct and indirect paths that lead to innovations.

Policy makers have also grown more sophisticated—learning, for example, about complex and simple technologies. This progress is important, but it is not enough. As a result of ATP and other work, we know that innovation involves much more than technology. Because of the globalization of our world, we will need more intelligent policy to support high-technology industry.

European policy makers are perhaps more comfortable with programs like the ATP. Part of this comfort reflects the fact that European policy makers never relied on the defense spin-off model to the extent Americans did. Also, the European states have a long tradition of support for national champions and “strategic sectors,” though definitions of these sectors have changed over time. From an academic perspective, part of this relative comfort can be traced to different academic traditions. Overseas, studies of the economic impact of technology are placed under the heading of “socioeconomic research.” Here in the States we treat it as much more straightforward economic research, which obviously places certain limitations on the arguments one can use. Needless to say, we are also aware that quality of life is what is really at stake.

12 A summary of these programs can be found in the Introduction to this volume.

James Turner

House Science Committee

Mr. Turner opened his remarks with a point of comparison. The ATP, compared with other federal technology programs, is unquestionably far ahead of the pack in its use of program evaluation. In fact, other programs are beginning to apply some of the ATP techniques. At the same time, the program remains in an unstable budgetary position, and always seems to be threatened with extinction. The many GAO reports on the ATP have been very influential in this debate. 13 The purported, but often unexamined, virtues of small business have also held sway over many of those in Congress who authorize funds for ATP. This has led to some of the tinkering with the program's funding, structure, and goals, creating the uncertainty about which IBM's Kathleen Kingscott and others have complained today.

The first hearing in the series that led to the ATP was held on April 28, 1987. The hearings were remarkably bipartisan, largely, perhaps, because of the sense of a common adversary in Japan. Sherwood Fawcett, CEO of Battelle Memorial Institute, appeared, explaining how xerography (based on a 1938 process patented by Battelle) almost died over the decades before it was successfully commercialized. The committee discussed the need for patient capital in such cases.

Rosalie Ruegg's presentation was heartening to him for many reasons. One of them was the fact that she stressed the cancellation of projects that do not work out. If a program makes multi-year funding awards, it should not let projects that are clearly failures go on simply because they were once funded. Turner said that he did not know whether 40 projects terminated out of 468 is the right number, but he believed that 30 would be too few.

Turner was also struck by Dr. Feller's comparison of the political context of MEP and ATP. Both programs were founded under the same statute, the 1988 Omnibus Trade and Competitiveness Act. MEP was designed to have a strong base in each of the 50 states, and therefore quickly became politically invulnerable. The ATP has never enjoyed that deep and broad support, perhaps because the program, despite its merits, has not had the resources to aid companies in a large number of Congressional districts.

QUESTIONS FROM THE AUDIENCE

Lewis Branscomb endorsed Dr. Feller's maxim, “A good evaluation showing bad results will surely kill a program, but an evaluation that shows good results may not save the program.” Conversely, he added, if a program is politically strong, it does not matter whether you do the evaluations at all. This suggests, he said, that evaluation is largely irrelevant to the political support a program may have. We tend to think of evaluation as measuring program performance (the functional goals specified in the law). The federal agencies that are charged with carrying out the law tend to watch the legislative process, and then—assuming that the political phase is over—carry on with administering the program. Political evaluation as well as economic evaluation is needed. Few federal agencies practice intellectually sophisticated political evaluation.

13 See Box F, “GAO Reviews of the ATP,” in the Introduction.

The Political Impact of Project Termination

These issues go right to the heart of the assessment process, and we should continue discussing them, said Dr. Wessner. What, he asked, is the political impact when a program does what is almost never done, that is, admit failure and identify the projects that do not work?

David Goldston agreed that ATP is nearly unique in this respect. He found this a positive feature of the program, so long as the failure rate is substantially below 100 percent. On the other hand, he was disturbed by ATP's data, presented by Rosalie Ruegg, suggesting that 13 percent of the terminations involved joint ventures that failed to reach agreement and did not start their projects. The key point to remember is that failures are inevitable, and we should be aware of that likelihood and shut projects down when necessary.

The other fundamental point, Mr. Goldston observed, is whether the projects would have been done anyway without federal money, and more generally whether there are other more productive uses of federal money. These questions have been addressed by the ATP's assessment program, but they remain at the crux of one of the difficult—perhaps insoluble—political questions about the ATP. 14

High Risk Means Some Failure

If ATP had a 100-percent success rate, said Jim Turner, it would be evidence that the program is not selecting risky enough technology projects. One of the rationales for the ATP is that it is a high-risk program, which tackles projects that would not be done by private investors. The termination rate (about 8.5 percent of projects) is evidence that substantial risks are being undertaken. The statistic that should give you pause, if you worry that the private sector would undertake these projects anyway, is the two projects that were terminated because they succeeded ahead of schedule. That would raise the question of whether those projects were risky enough, and whether ATP is skimming the cream of investments that would be more appropriately made by the private sector.

14 Dr. Feldman's research addresses this point and provides an empirical answer in the negative: she finds that the projects generally would not have been carried out in the absence of ATP funding. See her paper in this volume.

Incorporating Evaluation Results

Rosalie Ruegg rose to clarify several points, in particular the observations about the project termination statistics:

  • the two projects terminated because of early success were cases in which the teams found better ways to reach the projects' goals; and

  • ATP funds were never released to the two joint ventures that were terminated because they failed to reach agreement.

Taking up Dr. Feller's question about whether the ATP uses its evaluation results to refine the program's selection criteria and its other operations, Ms. Ruegg said that the ATP devotes a great deal of attention to feeding its evaluation results back into the selection process. Three economists from the evaluation staff, for example, provided training to the FY2000 selection boards, drawing specifically on evaluation results for their material. As more evaluation results are accumulated, this interaction becomes richer and its impact on the selection process greater.

Even project failures can have successful elements, she stressed, because they result in additions to the knowledge base. The “indirect path” may mean that other companies carry certain aspects of a project forward into the market. Many of the terminated projects, for example, had patenting activity before they ended, and evaluators can often discern signs that the knowledge in those patents is being exploited by others.

Jim Turner observed that another indirect impact of the program is its benefit to NIST. The ATP helps keep the agency on the cutting edge of technology and reinforces its ability to perform its mission.
