
1
Introduction

1.1
SBIR—PROGRAM CREATION AND ASSESSMENT

Created in 1982 by the Small Business Innovation Development Act, the Small Business Innovation Research (SBIR) program was designed to stimulate technological innovation among small private-sector businesses while providing the government cost-effective new technical and scientific solutions to challenging mission problems. SBIR was also designed to help stimulate the U.S. economy by encouraging small businesses to market innovative technologies in the private sector.1

1 The SBIR legislation drew from a growing body of evidence, starting in the late 1970s and accelerating in the 1980s, which indicated that small businesses were assuming an increasingly important role in both innovation and job creation. David L. Birch, “Who Creates Jobs?” The Public Interest 65:3-14, 1981. This evidence gained new credibility with the empirical analysis by Zoltan Acs and David Audretsch of the U.S. Small Business Innovation Data Base, which confirmed the increased importance of small firms in generating technological innovations and their growing contribution to the U.S. economy. See Zoltan Acs and David Audretsch, Innovation and Small Firms, Cambridge, MA: MIT Press, 1990. For the importance of small businesses to job creation, see also Steven J. Davis, John Haltiwanger, and Scott Schuh, “Small Business and Job Creation: Dissecting the Myth and Reassessing the Facts,” Business Economics 29(3):113-122, 1994. More recently, a report by the Organisation for Economic Co-operation and Development (OECD) notes that small and medium-sized enterprises are attracting the attention of policy makers, not least because they are seen as major sources of economic vitality, flexibility, and employment. Small business is especially important as a source of new employment, accounting for a disproportionate share of job creation. See Organisation for Economic Co-operation and Development, Small Business Job Creation and Growth: Facts, Obstacles, and Best Practices, Paris: Organisation for Economic Co-operation and Development, 1997.

As the SBIR program approached its twentieth year of existence, the U.S. Congress requested that the National Research Council (NRC) of the National Academies conduct a “comprehensive study of how the SBIR program has stimulated technological innovation and used small businesses to meet federal research and development needs,” and make recommendations on improvements to the program.2 Mandated as part of SBIR’s renewal in December 2000, the NRC study has assessed the SBIR program as administered at the five federal agencies that together make up 96 percent of SBIR program expenditures. The agencies are, in decreasing order of program size: the Department of Defense (DoD), the National Institutes of Health (NIH), the National Aeronautics and Space Administration (NASA), the Department of Energy (DoE), and the National Science Foundation (NSF). The SBIR program at DoD is the largest of the five; at $943 million in 2005, it accounts for over half of all SBIR funding.

The NRC Committee assessing the SBIR program was not asked to consider whether SBIR should exist—Congress has affirmatively decided this question on three occasions.3 Rather, the Committee was charged with providing an evidence-based assessment of the program’s operations, achievements, and challenges, as well as recommendations to improve the program’s effectiveness.

1.2
SBIR PROGRAM STRUCTURE

Eleven federal agencies are currently required to set aside 2.5 percent of their extramural research and development budgets exclusively for SBIR contracts. Each year these agencies identify various R&D topics, representing scientific and technical problems requiring innovative solutions, for pursuit by small businesses under the SBIR program. These topics are bundled together into individual agency “solicitations”—publicly announced requests for SBIR proposals from interested and qualifying small businesses. A small business can identify an appropriate topic it wants to pursue from these solicitations and, in response, propose a project for an SBIR grant, a process now greatly facilitated by the Internet. The required format for submitting a proposal differs by agency. Proposal selection also varies, though peer review of proposals on a competitive basis by experts in the field is typical. Each agency then selects the proposals that best meet its selection criteria and awards contracts or grants to the proposing small businesses. Since the SBIR program’s inception at DoD, all SBIR awards have been contracts awarded on a competitive basis.
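The set-aside itself is straightforward arithmetic. The short sketch below is purely illustrative: the agency budget is hypothetical, the 40/60 split between phases is an assumption made only for this example, and the award amounts are the legislated Phase I and Phase II figures described later in this chapter.

    # Illustrative sketch only: the budget is hypothetical, and real SBIR
    # budgeting involves more than this simple percentage calculation.
    SET_ASIDE_RATE = 0.025          # 2.5 percent of the extramural R&D budget
    PHASE_I_AWARD = 100_000         # legislated Phase I amount
    PHASE_II_AWARD = 750_000        # legislated Phase II amount

    def sbir_pool(extramural_rd_budget: float) -> float:
        """Minimum SBIR set-aside implied by an extramural R&D budget."""
        return extramural_rd_budget * SET_ASIDE_RATE

    budget = 10_000_000_000         # a hypothetical $10 billion budget
    pool = sbir_pool(budget)        # -> $250 million

    # Assume, for illustration only, that 40 percent of the pool funds
    # Phase I awards and 60 percent funds Phase II awards.
    phase_i_awards = int(pool * 0.40 / PHASE_I_AWARD)    # 1,000 awards
    phase_ii_awards = int(pool * 0.60 / PHASE_II_AWARD)  # 200 awards
    print(f"Pool: ${pool:,.0f}; Phase I awards: {phase_i_awards}; "
          f"Phase II awards: {phase_ii_awards}")

Actual proportions and award sizes vary by agency and by year; the sketch only conveys the scale of activity implied by the 2.5 percent requirement.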

As conceived in the 1982 Act, SBIR’s grant-making process is structured in three phases at all agencies:

  • Phase I grants essentially fund feasibility studies in which award winners undertake a limited amount of research aimed at establishing an idea’s scientific and commercial promise. Today, the legislation anticipates Phase I grants as high as $100,000.4

  • Phase II grants are larger—the legislated amount is $750,000—and fund more extensive R&D to further develop the scientific and commercial promise of research ideas.

  • Phase III. During this phase, companies do not receive additional funding from the SBIR program. Instead, grant recipients are expected to obtain additional funds from a procurement program (if available) at the agency that made the award, from private investors, or from other sources of capital. The objective of this phase is to move the technology from the prototype stage to the marketplace.

2 See Public Law 106-554, Appendix I—H.R. 5667, Section 108.

3 These are the 1982 Small Business Innovation Development Act and the subsequent multiyear reauthorizations of the SBIR program in 1992 and 2000.

The Phase III Challenge

Obtaining Phase III support is often the most difficult challenge for new firms to overcome. In practice, agencies have developed different approaches to facilitate SBIR grantees’ transition to commercial viability; not least among them are additional SBIR grants.5 The multiple approaches taken to address the Phase III challenge are described in Chapter 5. The Department of Defense has shown considerable initiative in its efforts to enhance commercialization and capture returns for the program. Unlike some major agency participants in the program (e.g., NIH and NSF), DoD seeks to acquire and use many of the technologies and products developed through the SBIR program.

Previous NRC research has shown that firms have different objectives in applying to the program. Some want to demonstrate the potential of promising research but may not seek to commercialize it themselves. Others seek to fulfill agency research requirements more cost-effectively through the SBIR program than through the traditional procurement process. Still others seek a certification of quality (and the investments that can come from such recognition) as they push science-based products towards commercialization.6

4 With the agreement of the Small Business Administration, which plays an oversight role for the program, this amount can be substantially higher in certain circumstances and is also often lower, especially with smaller SBIR programs, e.g., EPA or the Department of Agriculture.

5 The Phase III challenge was explored at a conference convened at the National Academies on June 14, 2005. The proceedings of this conference are reported in National Research Council, SBIR and the Phase III Challenge of Commercialization, Charles W. Wessner, ed., Washington, DC: The National Academies Press, 2007.

6 See Reid Cramer, “Patterns of Firm Participation in the Small Business Innovation Research Program in Southwestern and Mountain States,” in National Research Council, The Small Business Innovation Research Program: An Assessment of the Department of Defense Fast Track Initiative, Charles W. Wessner, ed., Washington, DC: National Academy Press, 2000.


1.3
SBIR REAUTHORIZATIONS

The SBIR program approached reauthorization in 1992 amidst continued concerns about the U.S. economy’s capacity to commercialize inventions. Finding that “U.S. technological performance is challenged less in the creation of new technologies than in their commercialization and adoption,” the National Academy of Sciences at the time recommended an increase in SBIR funding as a means to improve the economy’s ability to adopt and commercialize new technologies.7

Following this report, the Small Business Research and Development Enhancement Act of 1992 (P.L. 102-564), which reauthorized the SBIR program until September 30, 2000, doubled the set-aside rate to 2.5 percent.8 This increase in the percentage of R&D funds allocated to the program was accompanied by a stronger emphasis on the commercialization of SBIR-funded technologies.9 Legislative language explicitly highlighted commercial potential as a criterion for awarding SBIR grants. For Phase I awards, Congress directed program administrators to assess whether projects have “commercial potential,” in addition to scientific and technical merit, when evaluating SBIR applications.

The 1992 legislation mandated that program administrators consider the existence of second-phase funding commitments from the private sector or other non-SBIR sources when judging Phase II applications. Evidence of third-phase follow-on commitments, along with other indicators of commercial potential, was also to be sought. Moreover, the 1992 reauthorization directed that a small business’s record of commercialization be taken into account when evaluating its Phase II application.10

The Small Business Reauthorization Act of 2000 (P.L. 106-554) extended SBIR until September 30, 2008. It called for a two-phase assessment by the National Research Council of the broader impacts of the program.11 The goals of the SBIR program, as set out in the 1982 legislation, are: “(1) to stimulate technological innovation; (2) to use small business to meet federal research and development needs; (3) to foster and encourage participation by minority and disadvantaged persons in technological innovation; and (4) to increase private sector commercialization of innovations derived from federal research and development.”

7 See National Research Council, The Government Role in Civilian Technology: Building a New Alliance, Washington, DC: National Academy Press, 1992, p. 29.

8 For fiscal year 2003, this resulted in a program budget of approximately $1.6 billion across all federal agencies, with the Department of Defense having the largest SBIR program at $834 million, followed by the National Institutes of Health (NIH) at $525 million. The DoD SBIR program is made up of 10 participating components: Army, Navy, Air Force, Missile Defense Agency (MDA), Defense Advanced Research Projects Agency (DARPA), Chemical Biological Defense (CBD), Special Operations Command (SOCOM), Defense Threat Reduction Agency (DTRA), National Imagery and Mapping Agency (NIMA), and the Office of the Secretary of Defense (OSD). NIH counts 23 separate institutes and agencies making SBIR awards, many with multiple programs.

9 See Robert Archibald and David Finifter, “Evaluation of the Department of Defense Small Business Innovation Research Program and the Fast Track Initiative: A Balanced Approach,” in National Research Council, The Small Business Innovation Research Program: An Assessment of the Department of Defense Fast Track Initiative, op. cit., pp. 211-250.

10 A GAO report had found that agencies had not adopted a uniform method for weighing commercial potential in SBIR applications. See U.S. General Accounting Office, Federal Research: Evaluations of Small Business Innovation Research Can Be Strengthened, GAO/RCED-99-114, Washington, DC: U.S. General Accounting Office, 1999.

1.4
STRUCTURE OF THE NRC STUDY

This NRC assessment of SBIR has been conducted in several ways. In an exceptional step, at the request of the agencies, a formal research methodology was developed by the NRC. This methodology was then reviewed and approved by an independent National Academies panel of experts.12 As the research began, information about the program was also gathered through interviews with SBIR program administrators and during two major conferences at which SBIR officials were invited to describe program operations, challenges, and accomplishments.13 These conferences highlighted important differences in the goals and practices of the SBIR program at each agency. They also explored the challenges inherent in assessing such a diverse range of program objectives and practices, and the limits of using common metrics across agencies with significantly different missions and objectives.

Implementing the approved research methodology, the NRC Committee deployed multiple survey instruments, and its researchers conducted a large number of case studies that captured a wide range of SBIR firms. The Committee then evaluated the results and developed both agency-specific and overall findings and recommendations for improving the effectiveness of the SBIR program at each agency. This report includes a complete assessment of the operations and achievements of the SBIR program at DoD and makes recommendations as to how it might be further improved.

11 The current assessment is congruent with the Government Performance and Results Act (GPRA) of 1993: <http://govinfo.library.unt.edu/npr/library/misc/s20.html>. As characterized by the GAO, GPRA seeks to shift the focus of government decision making and accountability away from a preoccupation with the activities that are undertaken—such as grants dispensed or inspections made—to a focus on the results of those activities. See <http://www.gao.gov/new.items/gpra/gpra.htm>.

12 The SBIR methodology report is available on the Web. National Research Council, An Assessment of the Small Business Innovation Research Program—Project Methodology, Washington, DC: The National Academies Press, 2004, accessed at <http://books.nap.edu/catalog.php?record_id=11097#toc>.

13 The opening conference, on October 24, 2002, examined the program’s diversity and assessment challenges. For a published report of this conference, see National Research Council, SBIR: Program Diversity and Assessment Challenges, Charles W. Wessner, ed., Washington, DC: The National Academies Press, 2004. The second conference, held on March 28, 2003, was titled “Identifying Best Practice.” It provided a forum for the SBIR Program Managers from each of the five agencies in the study’s purview to describe their administrative innovations and best practices.


1.5
SBIR ASSESSMENT CHALLENGES

Program Diversity and Flexibility

At its outset, the NRC’s SBIR study identified a series of assessment challenges that must be addressed. As the October 2002 conference made clear, the administrative flexibility found in the SBIR program makes cross-agency assessment difficult. Although each agency’s SBIR program shares the common three-phase structure, the SBIR concept is interpreted uniquely at each agency. At DoD, the program is spread across the three services and seven agencies with widely different missions, ranging from missile defense to Navy submarines to Army support for special forces to the special needs of DARPA.

This flexibility is a positive attribute in that it permits each agency to adapt its SBIR program to the agency’s particular mission, scale, and working culture. For example, NSF operates its SBIR program differently from DoD because “research” is often coupled with procurement of goods and services at DoD but normally not at NSF. Programmatic diversity means that each agency’s SBIR activities must be understood in terms of that agency’s mission and operating procedures. While commendable in itself, this diversity of objectives, procedures, mechanisms, and management makes an assessment of the program as a whole more challenging.

Nonlinearity of Innovation

A second challenge concerns the linear process of commercialization implied by the design of SBIR’s three-phase structure.14 In the linear model, illustrated in Figure 1-1, innovation begins with basic research supplying a steady stream of new ideas. From among these ideas, those that show technical feasibility become innovations. Such innovations, when further developed by firms, can become marketable products that drive economic growth.

As NSF’s Joseph Bordogna observed at the launch conference, innovation almost never takes place through a protracted linear progression from research to development to market. Research and development drives technological innovation, which, in turn, opens up new frontiers in R&D. True innovation, Bordogna noted, can spur the search for new knowledge and create the context in which the next generation of research identifies new frontiers. This nonlinearity, illustrated in Figure 1-2, underscores the challenge of assessing the impact of the SBIR program’s individual awards. Inputs do not match up with outputs according to a simple function.15

FIGURE 1-1 The Linear Model of Innovation.

FIGURE 1-2 A Feedback Model of Innovation.

14 This nonlinear perception was underscored by Duncan Moore: “Innovation does not follow a linear model. It stops and starts.” See the National Research Council, SBIR: Program Diversity and Assessment Challenges, op. cit.

Measurement Challenges

A third assessment challenge relates to the measurement of outputs and outcomes. Program realities can and often do complicate the task of data gathering. In some cases, for example, SBIR recipients receive a Phase I award from one agency and a Phase II award from another. In other cases, multiple SBIR awards may have been used to help a particular technology become sufficiently mature to reach the market. Also complicating matters is the possibility that, for any particular grantee, an SBIR award may be only one among several federal and nonfederal sources of funding. Causality can thus be difficult, if not impossible, to establish.

15 For a view in which pure research and applied research are treated as independent dimensions rather than as the extremes of a single linear dichotomy, see Donald E. Stokes, Pasteur’s Quadrant: Basic Science and Technological Innovation, Washington, DC: Brookings Institution Press, 1997.

The task of measuring outcomes is also made harder because companies that have garnered SBIR awards can merge, fail, or change their names before a product reaches the market. In addition, principal investigators or other key individuals can change firms, carrying their knowledge of an SBIR project with them. A technology developed using SBIR funds may thus eventually achieve commercial success at an entirely different company than the one that received the initial SBIR award.

Gauging Commercial Success

Complications plague even the apparently straightforward task of assessing commercial success. For example, research enabled by a particular SBIR award may take on commercial relevance in new, unanticipated contexts. At the launch conference, Duncan Moore, former Associate Director of Technology at the White House Office of Science and Technology Policy (OSTP), cited the case of SBIR-funded research in gradient index optics that was initially considered a commercial failure when an anticipated market for its application did not emerge. Years later, however, products derived from the research turned out to be a major commercial success.16 Today’s apparent dead end can sometimes lead to a major achievement tomorrow, while other projects are, indeed, dead ends. Yet even technological dead ends have their value, especially when they can be identified for the low cost associated with an SBIR award.

Gauging commercialization is also difficult when the product in question is destined for public procurement. The challenge is to develop a satisfactory measure of how useful an SBIR-funded innovation has been to an agency mission. A related challenge is determining how central (or even useful) SBIR awards have proved to be in developing a particular technology or product. Often, multiple SBIR awards and other funding sources contribute to the development of a product or process for DoD. In some cases, the Phase I award can meet the agency’s need—completing the research with no further action required. In other cases, Phase II awards, supplemental funding, and substantial management and financial resources are required for “success.”

Measurement challenges are substantial. For example, one way of measuring commercialization success is to count product sales. Another is to focus on the products developed using SBIR funds that are procured by DoD. In practice, however, large procurements from major suppliers are typically easier to track than products from small suppliers such as SBIR firms. In other cases, successful Phase II awards are just that—they meet the agency need and no further commercialization takes place. In still other cases, substantial commercialization occurs, and then ceases as a promising firm or technology is acquired by a defense supplier.

16 Duncan Moore, “Turning Failure into Success,” in National Research Council, SBIR: Program Diversity and Assessment Challenges, op. cit., p. 94.

Moreover, successful development of a technology or product does not always translate into successful “uptake” by the procuring agency. Often, the absence of procurement may have little to do with the product’s quality or the potential contribution of SBIR. Small companies, especially new entrants to the program, entail greater risk for program officers. Perceived uncertainties about reliability, timeliness of supply, and risks of program delays all militate against acquisition of successful technologies from new, unproven firms.

Understanding and Anticipating Failure

Understanding failure is equally challenging. By its very nature, an early-stage program such as SBIR should anticipate a significant failure rate. The causes of failure are many. The most straightforward, of course, is technical failure, in which the research objectives of the award are not achieved. In some cases, a project can be a technical success but a commercial failure. This can occur when a procuring agency changes its mission objectives and hence its procurement priorities. NASA’s new Mars Mission is one example of a mission shift that may result in the cancellation of programs involving SBIR awards to make room for new agency priorities. Cancelled weapons system programs at the Department of Defense can have similar effects.

Technologies procured through SBIR may also fail in the transition to acquisition. Some technology developments by small businesses do not survive the long lead times created by complex testing and certification procedures required by the Department of Defense. Indeed, small firms encounter considerable difficulty in surmounting the long lead times, high costs, and complex regulations that characterize defense acquisition. In addition to complex federal acquisition procedures, there are strong disincentives, noted above, for high-profile projects to adopt untried technologies. Technology transfer in commercial markets can be equally difficult. A failure to transfer to commercial markets can occur even when a technology is technically successful if the market is smaller than anticipated, competing technologies emerge or are more competitive than expected, or the product is not adequately marketed. Understanding and accepting the varied sources of project failure in the high-risk, high-reward environment of cutting-edge R&D is a challenge for analysts and policy makers alike.

Evaluating SBIR: “Compared to What?”

This raises the question of the standard against which SBIR programs should be evaluated. An assessment of SBIR must take into account the expected distribution of successes and failures in early-stage finance. As a point of comparison, Gail Cassell, Vice President for Scientific Affairs at Eli Lilly, has noted that only one in ten innovative products in the biotechnology industry will turn out to be a commercial success.17 Similarly, venture capital funds often achieve considerable commercial success on only two or three out of twenty or more investments.18
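These benchmark figures can be turned into a rough expectation for a portfolio of awards. The sketch below is a purely hypothetical illustration, not an analysis of SBIR data: it assumes a portfolio of 100 awards and a 1-in-10 commercial success rate of the kind Cassell describes, and simply shows how few outright commercial successes such a portfolio should be expected to produce.

    # Hypothetical illustration: portfolio size and success probability are
    # assumptions made for this sketch, not SBIR program statistics.
    from math import comb

    N_AWARDS = 100      # hypothetical portfolio of awards
    P_SUCCESS = 0.10    # roughly the 1-in-10 rate cited for biotechnology

    def prob_at_most(k: int, n: int, p: float) -> float:
        """Binomial probability of at most k successes in n independent awards."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

    expected = N_AWARDS * P_SUCCESS
    print(f"Expected commercial successes: {expected:.0f} of {N_AWARDS}")
    print(f"Chance of 5 or fewer successes: {prob_at_most(5, N_AWARDS, P_SUCCESS):.1%}")
    # A handful of large successes can still carry the portfolio if returns
    # are as skewed as the venture capital returns described in the
    # accompanying footnote.

The point is calibration: with a realistic hit rate, a small number of commercial successes does not by itself indicate that a portfolio of awards is underperforming.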

In short, commercial success tends to be concentrated. Yet, commercial success is not the only metric of the program. At the Defense Department, SBIR can and does provide a variety of valuable services and products that do not achieve widespread commercial success, even if they do have sales or licensing revenue.

In setting metrics for SBIR projects, therefore, it is important to have a realistic expectation of the success rate for competitive awards to small firms investing in promising but unproven technologies. Similarly, it is important to have some understanding of what can reasonably be expected—that is, what constitutes “success” for an SBIR award, and of the constraints and opportunities successful SBIR awardees face in bringing new products to market. This is especially relevant in a constrained, regulation-driven market such as defense procurement. From the management perspective, the rate of success also raises the question of appropriate expectations and desired levels of risk taking. A portfolio that always succeeds would not be pushing the technology envelope. A very high rate of “success” would thus, paradoxically, suggest an inappropriate use of the program. Even when technical success is achieved, as noted above, it does not automatically translate into commercial success, for a variety of reasons related to the defense mission and to procurement procedures. Understanding the nature of success and the appropriate benchmarks for a program with this focus is therefore important to understanding the SBIR program and the approach of this study.

17 Gail Cassell, “Setting Realistic Expectations for Success,” in National Research Council, SBIR: Program Diversity and Assessment Challenges, op. cit., p. 86.

18 SBIR awards often occur earlier in the technology development cycle than where venture funds normally invest. Nonetheless, returns on venture funding tend to show the same high skew that characterizes commercial returns on SBIR awards. See John H. Cochrane, “The Risk and Return of Venture Capital,” Journal of Financial Economics 75(1):3-52, 2005. Drawing on the VentureOne database, Cochrane plots a histogram of net venture capital returns on investments that “shows an extraordinary skewness of returns. Most returns are modest, but there is a long right tail of extraordinary good returns. Fifteen percent of the firms that go public or are acquired give a return greater than 1,000 percent! It is also interesting how many modest returns there are. About 15 percent of returns are less than 0, and 35 percent are less than 100 percent. An IPO or acquisition is not a guarantee of a huge return. In fact, the modal or ‘most probable’ outcome is about a 25 percent return.” See also Paul A. Gompers and Josh Lerner, “Risk and Reward in Private Equity Investments: The Challenge of Performance Assessment,” Journal of Private Equity 1 (Winter 1997):5-12. Steven D. Carden and Olive Darragh, “A Halo for Angel Investors,” The McKinsey Quarterly 1, 2004, also show a similar skew in the distribution of returns for venture capital portfolios.


1.6
SBIR ASSESSMENT RESULTS

Drawing on interviews, multiple survey instruments, and case studies, and overcoming many of the research challenges identified above, the NRC Committee has developed a number of findings and practical recommendations for improving the effectiveness of the SBIR program at the Department of Defense.

The Committee found that the SBIR program at DoD is, in general, meeting the legislative and mission-related objectives of the program. The program is contributing directly to enhanced capabilities for the Department of Defense and the needs of those charged with defending the country.

Further, the Committee found that the DoD program also provides substantial benefits for small business participants in terms of market access, funding, and recognition. The program supports a diverse array of small businesses contributing to the vitality of the defense industrial base while providing greater competition and new options and opportunities for DoD managers. In addition, the Committee noted that the DoD SBIR program is generating significant intellectual capital, contributing to new scientific and technological knowledge, and generating numerous publications and patents.

The Committee’s recommended improvements to the program have been designed to enable DoD’s SBIR managers to address the program’s congressional goals more efficiently and effectively. These include further work to improve the Phase III transition by (among other approaches) changing the incentives faced by program managers so that they are motivated to make better use of the SBIR program. The Committee also recommends that additional funding be provided for program management and assessment in order to encourage and support the development of an innovative and results-oriented SBIR program. The Committee’s complete findings and recommendations are listed in Chapter 2.

Chapter 3 provides a comprehensive overview of the distribution of SBIR awards by DoD, establishing a basis (as drawn out in Chapter 4) for understanding program outcomes. Chapter 5 describes the Phase III challenge of commercialization at DoD. Chapter 6 describes the diversity of management structures, as well as current practices and recent reforms, found among the different services and agencies that fund SBIR programs at DoD. Together, these chapters provide the most detailed and comprehensive picture to date of the SBIR program at the Department of Defense.
