
8
Program Management

The discussion of program management includes developing topics, planning and implementing the grant cycle, conducting outreach to stimulate grant-worthy proposals, selecting proposals for grants, encouraging firms to take the next steps needed to develop and commercialize their technologies, bridging funding gaps, implementing reporting by grantees, and evaluating program results. The treatment of program management also covers the degree of program flexibility, the size of grants, online capabilities, and administration of the program.

8.1
TOPIC DEVELOPMENT AND SELECTION

8.1.1
Topics

As the Small Business Innovation Research (SBIR) program evolved from control by the National Science Foundation’s (NSF’s) research divisions to centralized management in the mid-to-late 1990s, solicitation topics were reduced in number and oriented less toward scientific disciplines and more toward broad technology areas that would better mesh with business sectors. A purpose was to orient the topics in a way that would increase private-sector commercialization of the innovations derived from the grants. “The NSF SBIR/STTR [Small Business Technology Transfer] program aligned the solicitation topics with external investment and market opportunities and simultaneously preserved the science and engineering alliances with the NSF directorates.”1

1

Office of Industrial Innovation (OII), Strategic Plan (draft, June 2, 2005), p. 13.


In 2005, the program identified the following seven broad topic areas:

Biotechnology (BT)

Electronics Technology (EL)

Information Based Technology (IT)

Advanced Materials and Manufacturing (AM)

Chemical Based Technology (CT)

Security Based Technology (ST)

Manufacturing Innovation (MI)

When a solicitation is held in a topic area, it is fleshed out with subtopics. The subtopics add specificity to the solicitation. (For additional discussion, see Chapter 4.)

For example, Table 8.1-1 shows the 2004 “Advanced Materials” (AM) second- and third-level subtopics. Note that the third level serves to eliminate, as well as to define, areas of inquiry. AM included “Manufacturing” and “Chemical Processes,” but these are shown here only to the second level.

In its first solicitation, the Security Technologies (ST) topic area was defined as cross-disciplinary, and proposals submitted under this topic had to represent the convergence of at least two of the following three technologies: nanotechnology, biotechnology, and information technology (both hardware and software). Proposals also had to “be responsive to a subtopic within the solicitation”; the subtopics, effective April 2004, are listed in Table 8.1-2.

8.1.2
Sources for Topic Ideas

According to NSF SBIR program management:

… topics are rooted in the agency’s vision and strategic goals. In particular, SBIR and STTR are uniquely positioned to emphasize NSF vision of innovation. Since NSF is not the final customer for the SBIR/STTR grantees, it is imperative that our grantees are positioned to tap into private sector capital, which is essential for commercializing the technology developed under the SBIR grant. Therefore, NSF topics reflect the market opportunity and are aligned with the broad investment business. At the same time the topics also resonate with the science and engineering disciplines that NSF supports within its Directorates and Divisions. (SOURCE: “NSF SBIR Response to NRC Questions,” January 2004)

This study found no formal process for soliciting outside input in the generation of NSF’s topic ideas—such as a white-paper process used to develop thematic ideas in concert with industry. Rather, topic and subtopic ideas were said to come from NSF’s program managers as they interact with industry and others at conferences and workshops, or from an approach devised by a program manager and approved by the SBIR director.


TABLE 8.1-1 Advanced Materials, Second- and Third-Level Topics—Example 1 from the 2004 Solicitation

Advanced Materials

  1. Environmentally benign technology
     • Improved techniques for recycling
     • Processing of recycled materials
     • Pollution prevention/avoidance processes
  2. High temperature materials
     • Metal, ceramic, and composite materials developed for high temperature applications (e.g., improved turbine blade materials/processing)
  3. Structural materials
     • Improved strength, toughness, fracture resistance, etc., materials
     • Processing and material improvements
  4. Corrosion-resistant coatings
     • Surface coatings and modifications which lead to improved corrosion resistance
     • Improvements in materials for corrosion resistance
  5. Tribological and wear-resistant coatings
     • Surface coatings and modifications which lead to improved wear resistance and/or reduced friction
     • Material improvements in tribology/wear
  6. Engineered materials
     • Improved processing and/or materials with engineering applications other than those listed above (No nanotechnology, biotechnology, or electronic materials)
     • Smart materials
     • Shape memory alloys
  7. Surface modification and thin film technology
     • Process improvements for modifying surfaces and applying thin films
     • Material improvements related to process modification that are not related to corrosion or tribology

Manufacturing (also developed to a third level—but only shown to level 2)

  1. Polymer processing and rheology
  2. Casting/molding processes
  3. Machining and material removal processes
  4. Deformation processes
  5. Powder material processing
  6. Composite manufacturing processes
  7. Additive manufacturing
  8. Manufacturing process control
  9. Machine design
  10. Joining and assembly processes
  11. Nontraditional material removal processes
  12. Manufacturing systems

Chemical Processes (also developed to a third level—but only shown to level 2)

  1. Separations applications
  2. Novel catalytic systems
  3. Photochemical or electrochemical applications
  4. Fluid flow applications
  5. Combustion-related processes
  6. Applications of plasma technology
  7. Thermal energy applications
  8. Reactor engineering applications
  9. Chemical technology

NOTE: The first-level topic is Advanced Materials; the second-level topics are those numbered; the third-level topics are those preceded by bullets.


TABLE 8.1-2 Security Technologies, with Subtopics Listed—Example 2 from the 2004 Solicitation

1. Prevention

   1. Tools and systems for smart buildings/structures for
      • Public resources (energy, water, air) monitoring and control
      • Human resources coordination during emergency situations
   2. Networked sensors and tools to provide real-time information on structural integrity
   3. Systems beyond optical recognition (finger, facial, or retinal) that provide quick (under two minutes), unambiguous identity authentication

2. Detection

   1. Terahertz sources and detectors
   2. Systems utilizing hardened RFID and other modalities for secure supply chain management, traceability, and counterfeit detection
   3. Multiscale integration tools including new-generation packaging to enable nano-micro-meso system integration
   4. Compact, cost-effective, environmentally friendly and long-lived power supplies (e.g., for widely disbursed wireless sensor networks)
      • Biomimetic
      • Energy-scavenging systems
      • Photovoltaic systems
      • Acoustic-voltaic systems
      • Other systems that provide energy densities exceeding 1000 Wh/kg
   5. Proteomic-based biometric systems (NOTE: NOT PCR-based)

3. Treatment

   1. Site-specific wireless and wireline data/information systems to empower responders and emergency managers
   2. Systems and tools for wide-area rapid treatment dissemination (including agricultural applications)
   3. Systems with information management capability for rapid susceptibility

4. Remediation

   1. Systems and approaches for chemical (including industrial), biological, or radiological event remediation
      • Homes
      • Workplaces
      • Reservoirs
   2. Stand-alone, single-use, widely dispersible sensors or detectors for environmental monitoring (NOTE: NOT lab-on-a-chip systems)
   3. Environmentally friendly agent-specific widely dispersible decontamination media
      • Organic-based
      • Inorganic-based

5. Attribution

   1. Taggants and anticounterfeit/product authentication systems with unique spectral and other signatures
   2. Field-deployable front-end sample preparation systems to extend the reach of laboratory-based analytic equipment


As an example of the latter, an NSF SBIR program manager promoted revision of the manufacturing topic. According to the program manager,2 the manufacturing topic area had, over time, become “stale” and “not enough connected to the real world of manufacturing, such that introducing high tech alone would not help manufacturing’s competitive problem.” The manufacturing subtopics were described as holdovers from the time when the SBIR had operated as a decentralized program with close affiliation to its research divisions and with an academic orientation to the topics.

To revitalize the manufacturing topic area, the program manager put together a panel, using volunteers from the topic area’s review panels. As a result, “the topics became more open and more targeted to issues of established manufacturing. In addition, the experts sent a recommendation for more reviewers for the manufacturing area.”3

Topics are posted as part of a solicitation announcement. During the time between the initial solicitation announcement and the proposal due date, it appears that modifications may be made at the subtopic level if developments or ideas from the outside suggest to the program manager(s) that modifications are warranted. These modifications reportedly are handled by addendums to the solicitation list. Letters may be sent out to companies to alert them to the additional ideas. Supplemental funds may be provided to support newly identified areas of special interest.

Through outreach activities of program managers and postings on the program’s Web site, further information on acceptable subtopics is conveyed to the public. For example, feedback was given to potential proposers on the acceptability of homeland security as a research topic area prior to issuing the topic area and after release of the solicitation. Outreach activities by program managers can further delineate topic areas after release of the solicitation.

NSF SBIR program management provided the following statement regarding topic modification after announcement of a solicitation:

NSF SBIR/STTR topics are not modified or changed once the solicitation is announced and published on the NSF SBIR Web site. The published solicitation includes submission instructions and proposal submission deadlines. However, NSF SBIR program management will make modifications to the topics areas prior to the solicitation announcement. NSF SBIR program management will stay aligned with NSF strategic goals, make changes based on current technological trends, and give careful consideration to the market and investment community. (SOURCE: “NSF SBIR Response to NRC Questions,” January 2004)

Presumably, the apparent discrepancy between what this inquiry found and the NSF management statement lies in the distinction between topics and subtopics. Furthermore, it should be noted that what may appear at first glance to

2

Based on interview with NSF Program Manager Cheryl Albus on January 7, 2004.

3

Ibid.


represent a change at the subtopic level may not always actually change the acceptability of a given research theme. For example, next-generation vehicles were spelled out in one solicitation’s subtopic list, but not in the next. Yet, next-generation vehicles reportedly remained an acceptable topic of research proposed under other specified subtopics.

The program may need more transparency in defining its topics and more communication with its partnering communities. A useful model is provided by the Advanced Technology Program (ATP). From 1994 through 1998, the ATP used a white-paper process whereby any organization or individual could propose a topic area for a competition. The purpose was to ensure maintenance of a bottom-up approach that would result in selection of topics in touch with industry and the marketplace. The program published guidelines for preparing and submitting the white papers. White papers were grouped by common themes, and when ideas gained momentum, public workshops would be held to assess the level of interest, to further develop the ideas, and, if merited, to prepare a topic description that would be widely circulated for comments. A review board periodically considered emergent ideas for topics, helped establish priority among competition topics, and recommended to the director those that might form a subprogram for a series of funding competitions. In this way, the bottom-up selection process was maintained as the program experimented with topic specification. At the same time, in parallel to the “topic competitions,” the ATP held an “open competition” as part of each solicitation to provide an open door to all topics.

8.1.3
Agency-Driven versus Investigator-Driven Approach to Topics

In most cases, NSF’s topic specification leaves open the approaches and techniques that an investigator can take to respond to the particular problem or opportunity in the topic. At the same time, the NSF’s SBIR program information in the past stated that the applicant must propose within the announced topic areas or the proposal would be rejected, and it currently states that the proposal may not be considered if it is not responsive to the program announcement/solicitation.4

While none of the companies interviewed in the case studies complained that the NSF overspecified topics, they did comment that the NSF SBIR program’s broad definition of topic areas is unique compared with other agency programs. However, firms did note the length of time they had to wait until their topic areas came up for solicitation. According to one program official, the time to wait for a topic to be repeated has gone up, to as long as eighteen months.5 Also, the larger eligible small firms were concerned that the NSF’s SBIR program limits the number of proposals a firm can submit in response to a given solicitation to four.

4

Reasons for returns are listed under “Frequently Asked Questions,” found at <http://www.nsf.gov/funding/preparing/faq/faq_r.jsp?org+IIP#returns>.

5

Telephone interview with Joseph Hennessey, NSF, March 3, 2006.


TABLE 8.1-3 Criteria Used to Guide “Topic” Development

Criteria Used to Guide Development of Topicsa            Weight (%)

Cutting edge of the field                                    45
Likely commercial technologies                               45
Other                                                        10

aThe response suggested that “topic” was interpreted broadly to include subtopics.

SOURCE: NRC Program Manager Survey.

8.1.4
Topic Decision Making

The broad topic areas that were in effect several years ago were reportedly developed by the SBIR program management staff acting as a working group.6 The subtopic areas change, on average, about 35 percent each year.7 The decisions are made by the program manager(s) responsible for the solicitations, with informal input from industry and with concurrence from other members of the program management staff who edit and adjust the topics/subtopics according to the criteria assigned the weights shown in Table 8.1-3.8 The program director makes the final topic selection.9

The Office of Industrial Innovation’s (OII’s) 2005 Strategic Plan noted the “increased awareness of and the necessity for the SBIR/STTR program to be aligned with national needs.”10 It set forth an action plan to: (a) identify technologies with external investment/market focus; (b) exploit emerging discoveries from NSF-supported science, math, and engineering disciplines as subtopics; and (c) respond to national priorities set forth by the administration and other emerging or pressing societal needs. The perspective expressed in the 2005 Strategic Plan is that most of the currently identified major topics will stay relatively constant and that changes will likely be made in response to emerging national needs.

The NSF has also made decisions about topics it will not accept. Guidelines state that it will return proposals if they propose research in the following areas: (1) weapons research; (2) biomedical research (except bioengineering research); (3) any topics that fall in the area of classified research; and (4) any topics for which the primary purpose is demonstration, technical assistance, literature survey, or market research.

6

Based on interview with NSF Program Manager Cheryl Albus on January 7, 2004.

7

NRC Program Manager Survey completed by Joseph Hennessey, NSF. Note that the survey referred only to topics, not subtopics, but the nature of the responses suggested a broader interpretation than the major topics only.

8

Ibid.

9

Ibid.

10

Office of Industrial Innovation, Strategic Plan, p. 13.


Applicants are instructed to be responsive to NSF topics if they wish their proposals to be considered. They are instructed to designate “one, and only one, of the topics, keeping in mind that a firm cannot submit more than 4 proposals per solicitation (which includes the parent company and any of its subsidiaries). The topic name and the appropriate subtopic letter MUST be identified on the cover sheet.”

In connection with its 2003 solicitation, the NSF provided the following instructions for classifying proposals by topic, noting that the NSF had made substantive revisions to subtopics under its then-four technology areas:

NSF has established a cascading decision-making procedure in selecting the fit of each sub-topic under the four broad solicitation topics. The hierarchy for the fit of sub-topic starts at the top with BT [Biotechnology], followed by EL [Electronics], followed by IT [Information Technology], and finally AM [Advanced Materials]. The following are presented as illustrative examples. If the research is biology-based, it is BT. If the research is electronics or information or materials-based for applications in biotechnology such as devices for medical or bioinformatics or biocompatible materials, it should be submitted to the BT topic. If the research is electronics or photonics or magnetism-based, it is EL. Most instrumentation outside the BT application area fits into EL. If research is information or materials-based, such as embedded software or nano carbon tubes for use as semiconductors for electronic applications, it should be submitted to the EL topic. If the research is computer science or cognitive science-based, it is IT. If the research is modeling and simulation of engineering applications with software as the resultant commercial product, it is IT. If the research is on structural materials or chemical processes, it is AM. If research is on mechanical parts or manufacturing processes, it is AM. These examples are not meant as a comprehensive list of research opportunities but to assist in finding the proper fit for research ideas under the four NSF solicitation topics.


(SOURCE: NSF Web site, <http://www.eng.nsf/gov/sbirspecs>, as of November 2003.)
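
The cascading rule in this guidance amounts to a precedence check: a proposal is tested against BT first, then EL, then IT, and finally AM, and it is assigned to the first topic that fits. The sketch below illustrates that logic only; the classify_topic function and its keyword sets are illustrative assumptions, not NSF definitions or NSF tooling.

```python
# Minimal sketch of the cascading topic-fit rule quoted above.
# The keyword sets are illustrative assumptions, not NSF definitions; NSF
# applied this hierarchy through proposer and reviewer judgment, not software.

# Precedence order from the 2003 instructions: BT, then EL, then IT, then AM.
CASCADE = [
    ("BT", {"biology", "biotechnology", "medical device", "bioinformatics",
            "biocompatible material"}),
    ("EL", {"electronics", "photonics", "magnetism", "instrumentation",
            "semiconductor", "embedded software"}),
    ("IT", {"computer science", "cognitive science", "modeling and simulation",
            "software product"}),
    ("AM", {"structural material", "chemical process", "mechanical part",
            "manufacturing process"}),
]


def classify_topic(descriptors):
    """Return the first topic in the cascade that matches any descriptor."""
    descriptors = set(descriptors)
    for topic, keywords in CASCADE:
        if descriptors & keywords:
            return topic
    return "unclassified"  # the proposer would pick the closest remaining fit


# Example from the instructions: nano carbon tubes used as semiconductors for
# electronic applications go to EL rather than AM, because EL precedes AM.
print(classify_topic({"semiconductor", "structural material"}))  # -> EL
```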

The NSF SBIR FY2006 Phase I solicitation lists four topics: Advanced Materials (AM), Information Technology (IT), Manufacturing Innovation (MI), and Emerging Opportunities (EO). Each topic is further developed under “Research Topics.” Additional information on recent developments regarding topics lists is provided in the overview of Section 4.4.3 of this report.

Funding is not apportioned among the topics/subtopics in an attempt to achieve equality among them. Rather, the strategy is to fund the projects with the most merit. Sometimes topics/subtopics are narrowed or eliminated to reduce the number of applications.11 It was not determined when this narrowing or elimination of topics/subtopics occurs or what its effect is on applicants—both issues of possible concern.

11

NRC Program Manager Survey, op. cit.


In conclusion, the NSF’s SBIR program headquarters decides the topic areas for the program and the subtopics for a given competition with the assistance of program managers and, sometimes, other program-manager-devised sources, such as convened panels.12 The topics are fixed once the solicitation is issued, but subtopics appear to evolve in response to ongoing developments.

8.2
OUTREACH

The NSF has an active outreach program and has sponsored the SBIR spring and fall national conferences since the inception of the program in 1982. These conferences provide information on upcoming competitions, include workshops that offer training, and give grantees and prospective applicants opportunities to meet face-to-face with program managers and with each other. In addition to the prime audience of small businesses, these conferences are aimed at sales and marketing professionals, university researchers interested in business, scientists, prospective partners, and others.

Notifications of upcoming conferences and meetings are easy to find through an online search, which yields notices of meetings on several Web sites. For example, one site sponsored by the NSF gives substantial information about the SBIR and STTR programs.13 It lists national and regional conferences and events, details upcoming solicitations, provides links to federal agencies, provides a guide to state resources, announces partnering opportunities, and enables companies and others to join an email list to receive notices of NSF-sponsored national SBIR/STTR conferences.

The Small Business Administration (SBA) provides outreach for the program as a whole.14 State SBA-related sites, as well as other federal agency sites, are also rich sources of SBIR/STTR information for prospective applicants. The Pacific Northwest National Laboratory (PNNL) operates an agency-wide SBIR/STTR online alert service.15

Privately operated sites also provide SBIR information. The SBIR Resource Center™ operates one such site, which purports to provide up-to-date information covering ten SBIR/STTR agencies in one place.16 Another site claims to be “the most comprehensive and easy to use SBIR information site.”17

12

Based on a telephone interview with NSF Program Manager Rosemarie Wesson on December 1, 2003, and a face-to-face interview with NSF Program Manager Cheryl Albus on January 7, 2004.

13

See <http://www.sbirworld.com>.

14

See <http://www.sba.gov/sbir>.

15

The PNNL-operated online alerting service may be found at <http://www.pnl.gov/edo/sbir.stm>.

16

For more about the SBIR Resource Center, see <http://www.win-sbir.com>.

17

See the Web site for “SBIR Gateway” at <http://www.zyn.com/sbir>.


8.2.1
Agency Outreach Objectives

A review of NSF-supported Web sites targeted at potential applicants suggests that the NSF has the objective of reaching areas that have submitted large numbers of applications as well as those that have not. Searches of NSF’s outreach offerings identified many varied outreach opportunities. For example, a 2004 search of the NSF-supported SBIRworld.com revealed a national NSF SBIR outreach conference scheduled in Atlanta, Georgia, and a recently completed national conference in Boise, Idaho. Both Georgia and Idaho are states that supply a relatively low number of SBIR proposals. The search revealed a workshop, “How to Prepare Winning Proposals for SBIR and STTR,” scheduled in Livermore, California—California being the state with the highest number of SBIR applications and grants. The search showed several regional meetings—including one in North Carolina, a relatively low-application state, and one in Ohio, a relatively high-application state. Similarly, a 2005 search of the same NSF-sponsored Web site showed multiple workshops aimed at prospective applicants in diverse parts of the country. It included an “SBIR Grant Writing Workshop” to be held at Florida State University and an “SBIR/STTR Phase I Proposal Preparation Workshop” to be held at the Moore School of Business, part of the University of South Carolina. Florida is a mid-tier state in terms of numbers of applications, and South Carolina has fewer SBIR applications than average. Thus, the outreach activities of the NSF are not limited to a single geographical region or to either low- or high-application states.18

Through its association with the “Experimental Program to Stimulate Competitive Research and Institutional Development” (EPSCoR), however, the NSF’s SBIR program, like the other agency SBIR programs, has a special tie to low-application states. The EPSCoR program aims to increase the ability of states that receive a low proportion of federal research funds to become more successful in attracting such funds. It often partners with other state programs that have similar goals. For example, in conjunction with EPSCoR, Nevada’s Small Business Development Center and the Nevada Commission on Economic Development joined forces to increase the number of SBIR grants going to firms in the state. They tried to raise funds for state-based “Phase 0” competitions and to provide assistance to companies developing SBIR proposals.19 The NSF’s SBIR program receives supplemental funding from EPSCoR that allows it to participate in SBIR events in EPSCoR states. Each year, the OII schedules one of its national conferences in an EPSCoR state and one in a non-EPSCoR state.20 The OII works jointly with EPSCoR to develop a competitive research infrastructure within the

18

Not all the state and regional events would be NSF sponsored. In some cases, when resources permit, NSF staff members participate in state SBIR-related workshops and meetings.

19

This outreach activity in the State of Nevada is described at <http://www.nevada.edu/epsor/sbir.html>.

20

Telephone interview with Joseph Hennessey, NSF, October 18, 2005.


TABLE 8.2-1 NSF SBIR Outreach Activities and Their Relative Importance

Type of Outreach Activity                            Importance as a Share of NSF’s
                                                     Overall Outreach Program (%)

SBIR National Conferences                             50
State conferences                                     15
NSF National Agency SBIR Meeting                      10
Other agency conferences and outreach meetings        10
The SWIFT Bus Tour                                    10
Academic conferences                                   5
Total                                                100

SOURCE: NRC Program Manager Survey.

EPSCoR states and territories. According to program administrators, “Special consideration is given by the SBIR/STTR program regarding funding decisions for proposals received from EPSCoR states and territories in collaboration with the EPSCoR program.”21

8.2.2
Outreach Programs

Table 8.2-1 lists the major types of outreach activities undertaken by NSF SBIR staff and indicates the relative importance program managers place on each activity.

According to the NRC Program Manager Survey, the NSF views the SBIR national conferences as the premier outreach activity because they draw the largest number of applicants and are the most cost-effective. Next in importance are the state conferences. The NSF assigns its own national conference a weight of 10 out of 100, indicating that it is of equal importance to other agencies’ conferences and outreach meetings and to “the SWIFT Bus Tour”—a bus tour of program managers from several federal agencies who publicize SBIR grant opportunities by periodically traveling together to regional state-sponsored small business conferences and meetings. Academic conferences are considered by OII to be the least important of the outreach activities.

The SBIR office often engages in partnering to provide outreach services. Table 8.2-2 lists the types of organizations that partner with the NSF for outreach. As noted previously, the NSF also partners online with various sources to make its outreach activities known.

Assistance programs that help companies prepare their SBIR proposals are offered by universities, state agencies, regional associations, and mentor companies. While these programs are generally not sponsored or run by the NSF, the

21

“NSF SBIR Response to NRC Questions,” January 2004.


TABLE 8.2-2 Partnering to Provide Outreach Services

Partners of NSF to Provide Outreach Services

Business organizations

State and other nonfederal government agencies

Academic units

Private firms

SOURCE: NRC Program Manager Survey.

program often contributes its staff to serve as speakers.22 In that sense, the events comprise a component of the NSF outreach program.

The NRC Phase II Survey provided information on the frequency with which respondents received assistance from these organizations in preparing their Phase II proposals. Most of the respondents (91 percent) received no assistance in preparing the Phase II proposals that led to the referenced grant; only 9 percent did. All of those who received assistance found it useful: Most (62 percent) found it “very useful,” and the rest (38 percent) found it “somewhat useful.” Those who did receive assistance received it from universities (5 percent), from state agencies (3 percent), and to a lesser extent from mentor companies (1 percent). None of the survey respondents received assistance from regional associations.

The case studies also provided examples of companies applying for SBIR grants with the help of these assistance programs. For example, MicroStrain tapped Vermont’s EPSCoR Phase 0 grants to leverage its ability to gain federal SBIR grants. T/J Technologies learned how the SBIR program worked from MERRA, a Michigan-based organization aimed at boosting the state’s technology businesses. Even after MERRA was dissolved, T/J received assistance from its former staff members to obtain additional federal research funds.

It should also be noted that NSF SBIR program managers provide one-on-one counseling to individual potential applicants. Metrics for this activity are given in Section 8.2.3.

8.2.3
Agency Outreach Benchmarks and Metrics

The NRC Program Manager Survey elicited the following metrics for the NSF outreach:

  • An estimated 20 percent of a program manager’s work time is spent on outreach activities.

  • The NSF SBIR program manager attends 8 conferences on average each year, spending an average of 20 total days in attendance.

22

Telephone interview with Joseph Hennessey, NSF, October 18, 2005.

  • The NSF SBIR program manager spends an average of about four hours per week providing one-on-one counseling to individual potential applicants (not counting existing grantees).

The NSF has achieved nationwide coverage in terms of the geographical dispersion of Phase I and Phase II applications. The program has also received applications from Puerto Rico and the U.S. Virgin Islands. By the start of 2005, Phase IIB had attracted applications from 36 states.

8.3
GRANT SELECTION

8.3.1
Description of Selection Processes for Phase I, Phase II, and Phase IIB Grants

At its Phase I and Phase II stages, the NSF’s SBIR program uses a peer review process to identify proposals for potential selection. Individual reviewers rate proposals and then, meeting as a panel, provide a consensus funding recommendation that goes to the relevant program managers, who may accept or override the recommendation. The program managers then make recommendations to NSF headquarters as to whether to fund each proposal. These recommendations may be accepted or overturned on a variety of grounds. Examples of grounds for overturning a program manager’s recommendation are that the firm had essentially the same proposal funded in another program, that there was insufficient funding available for the proposal, or that one or more “Additional Factors” were triggered, such as the firm having received “an excessive number of grants.”

The selection of proposals for grants at each phase centers on the application of the same two formal selection criteria. However, the detailed guidance on considerations to be taken into account in applying these two criteria differs for each phase. Guidance from the NSF indicates that while proposals must address both merit review criteria, reviewers are asked to address only those considerations that are relevant to the proposal being considered and for which they are qualified to make judgments. Furthermore, the considerations are termed “suggestions and not all will apply to any given proposal.” The application of “additional factors” may also enter the selection process, as discussed further in Section 8.3.3.

The two formal merit review criteria used for both Phase I and II proposals, including Phase IIB proposals, are the following:24

23

These examples are based on an interview with NSF Program Manager Rosemarie Wesson on December 1, 2003, and a face-to-face interview with NSF Program Manager Cheryl Albus on January 7, 2004. The examples are consistent with later responses of OII staff to the study’s Program Manager Survey, which identified additional criteria that are sometimes applied in making grants. The use of the additional criteria was confirmed by discussions with OII management on August 17, 2005.

24

The National Science Board approved revised criteria for evaluating proposals at its meeting on March 28, 1997 (NSB 97-72).

  1. What is the intellectual merit of the proposed activity? (This criterion addresses the overall quality of the proposed activity to advance science and engineering through research and education.)

Underlying considerations in applying this criterion are the following:

  • Is the proposed plan a sound approach for establishing technical and commercial feasibility?

  • To what extent does the proposal suggest and explore unique or ingenious concepts or applications?

  • How well qualified is the team (the principal investigator, other key staff, consultants, and subgrantees) to conduct the proposed activity?

  • Is there sufficient access to resources (materials and supplies, analytical services, equipment, facilities, etc.)?

  • Does the proposal reflect the state-of-the-art in the major research activities proposed? (Are advancements in state-of-the-art likely?)

  • Added to the foregoing for Phase II proposals: As a result of Phase I, did the firm succeed in providing a solid foundation for the proposed Phase II activity?

  2. What are the broader impacts of the proposed activity? (This criterion addresses the overall impact of the proposed activity.)

Underlying considerations in applying this criterion are the following:

  • What may be the commercial and societal benefits of the proposed activity?

  • Does the proposal lead to enabling technologies (instrumentation, software, etc.) for further discoveries?

  • Does the outcome of the proposed activity lead to a marketable product or process?

  • Evaluate the competitive advantage of this technology versus alternate technologies that can meet the same market needs.

  • How well is the proposed activity positioned to attract further funding from non-SBIR sources once the SBIR project ends?

  • Can the product or process developed in the project advance NSF goals in research and education?

  • Does the proposed activity broaden the participation of underrepresented groups (e.g. gender, ethnicity, disability, geography, etc.)?

  • Has the proposing firm successfully commercialized SBIR/STTR–supported technology where prior grants have been made?

Prior to listing the two main criteria above, the NSF Web site states, “Other factors that may enter into consideration include the following: the balance among NSF programs; past commercialization efforts by the firm where previous grants exist; excessive concentration of grants in one firm or with one principal investigator; participation by woman-owned and socially and economically disadvantaged small business concerns; distribution of grants across the States; importance to science or society; and critical technology areas.”

After listing the two merit criteria, the NSF lists another set of “additional factors” to be addressed in proposals and taken into consideration by reviewers: “Integration of Research and Education,” and “Integrating Diversity into NSF Programs, Projects, and Activities.”

The NSF statement of selection criteria next indicates another criterion for Phase II proposals: “Review of the Proposal’s Commercialization Plan.” Considerations in this review include the following:

  • Market Need, Expected Outcomes, and Impact: Does the company present a compelling value proposition for the Phase II Project? Does the discussion of need demonstrate that there is market-pull and breadth of potential commercial impact for the innovation? In addition, does the proposer make a solid case that there are potential societal, educational, and scientific benefits of this project? Does the noncommercial impact add to the overall significance of work being proposed?

  • The Company: Does the company have focused objectives and the appropriate core competencies? Does the company have the appropriate resources to perform the tasks being proposed and to take the project through to commercialization? If the company has several years of experience, has it experienced growth? Does the company have a good record of commercializing prior SBIR/STTR projects or other research? Does it appear that the company can grow/maintain itself as a sustainable business entity?

  • The Market, Customer, and Competition: Does the PI/company understand the market in which the product will be introduced? Is the customer adequately and correctly described? Are the benefits to the customer and the hurdles to acceptance of the innovation adequately described? Does the PI/company know and understand the competitive environment? How would you rate the proposer’s ability to execute a marketing and sales program to bring the technology successfully to market in view of this competition (or competitive environment)? What are the strengths and weaknesses of the company’s marketing and sales strategy?

  • Intellectual Property (IP): Is intellectual property addressed and are there plans for sufficient protection to get the product to market and attain at least a temporal competitive advantage? What is the company’s prior record in this area? Please comment on the company’s strategy to build a sustainable business through protection of intellectual property.

  • The Financing: Has the company properly estimated the amount of funding needed in Phase III? Does the company have a high probability of securing this funding? Has the PI/company identified specific companies for financial commitments, prototype purchase, and/or will they fund themselves? If there are no “hard” commitments for funding (that is, letters of interest or intent), does the company have a solid road map for pursuing the funding needed to commercialize?

  • Revenue Stream: Are the plans for generating a revenue stream adequately described? Are the revenue projections and the assumptions behind the revenue projections realistic? Is the revenue stream sustainable? Will it lead to robust company growth or at least sustain the product (and/or the service) through its life cycle?

While Phase I proposals are evaluated on both of the main criteria, the focus has been on technical feasibility. Approximately 25 percent of reviewers have both technical and business backgrounds, such as a CEO of a past grant recipient company. At its meeting in 2004, the Committee of Visitors (COV) recommended that more consideration be given to commercial potential in evaluating Phase I proposals and that the review panels for Phase I proposals have more well-qualified representatives from the business sector.

At the Phase II level, selection criteria since 1992 have focused on commercial potential and commercialization planning, in addition to the research proposed. In 2004 the COV recommended that Phase II reviewers give more attention to the societal impact considerations in the “broader impacts” criterion, implying that the current focus is too narrow. Thus, changes in Phase II would also result from OII’s implementation of COV-recommended changes in considering the second merit selection criterion, “Broader Impacts.”

The Phase IIB selection process has always focused on the ability of applicants to secure third-party financing. There is no indication of change in this practice. However, there has been a recent change in the selection process for “supersized” Phase IIB grants. Now the selection process requires oral presentations by applicants and their third-party financiers as part of the grant-decision process. According to a company participant who had gone through the selection process for a supersized Phase IIB grant, the focus of the oral presentation was on the business case of the company and its financing.25 NSF management also reportedly would like to increase its due diligence by funding program manager staff visits to companies, particularly those who are applying for the supersized Phase IIB funding.

25

See the case study of Language Weaver in Appendix D.


The review process concludes with a debriefing of unsuccessful applicants and notification of winners. The debriefing provides the following materials electronically to the principal investigator and the company officer/organization representative: (a) verbatim copies of reviews, excluding the names of reviewers; (b) summaries of review panel deliberations, if any; (c) a description of the process by which the proposal was reviewed; and (d) the context of the decisions (such as the number of proposals and grant recommendations, and information about budget availability).

8.3.2
Peer Review Panels—Membership, Selection, and Qualifications

As part of a proposal, SBIR applicants are asked to provide a list of prospective reviewers whom they regard as experts in the relevant field(s). Applicants are also asked to provide a list of individuals whom they do not wish to be considered as reviewers for their proposal.

The program materials state that “special efforts are made to recruit reviewers from nonacademic institutions, minority-serving institutions, or adjacent disciplines to that principally addressed in the proposal.” Yet 60 percent of the reviewers continue to come from academia; 20 percent are industry scientists; 15 percent are other industry personnel; and 5 percent come from other sources.26 Warnings and requests are made to all prospective reviewers about the need to avoid potential conflicts of interest. Per policy, the NSF will not use a reviewer in the peer review process if that individual has any affiliation with a company that has a proposal under review for funding in that funding cycle. This includes employees of the company or consultants to that company. The NSF has used past and potentially future applicants as reviewers if it determines that no conflict of interest exists.

According to discussions with several program managers, it is each NSF program manager’s responsibility to find reviewers for his or her topic areas.27 Reportedly, they give more attention to applicants’ designations of “who should not review their proposal” than to designations of “who should review it.”28 The program managers devise various “schemes for building their reviewer pools,” in addition to obtaining suggestions from applicants. One example of how program managers build reviewer pools is to send letters to deans of major university departments asking them to recommend that their new faculty members participate as reviewers in the SBIR program.29

According to program officials, “there are no restrictions prohibiting submitters in a current solicitation, or past or future applicants to serve as NSF

26

NRC Program Manager Survey.

27

Based on an interview with NSF Program Manager Rosemarie Wesson on December 1, 2003, and a face-to-face interview with NSF Program Manager Cheryl Albus on January 7, 2004.

28

Interview with Cheryl Albus, NSF Program Manager, January 7, 2004.

29

Ibid.


reviewers.”30 A case against the practice of using current proposers as reviewers, even if they do not review proposals that directly compete with their own, is that when grant dollars are limited, a decision to fund one proposal may mean insufficient funding for other proposals. A potential real or apparent conflict of interest could arise when a current applicant reviews the proposals of others in the same budgetary cycle—a problem that would not be eliminated by requiring all reviewers to sign a conflict of interest (COI) form.31

All NSF SBIR reviewers are provided with instructions and guidance regarding the SBIR peer review process and are compensated for their time. NSF policy regarding the NSF review process and compensation can be found in the NSF Proposal and Grant Manual.32

Proposals are divided into appropriate technical topics, and review panels are assembled. Each panel at the Phase I stage typically deals with eight to ten proposals. Each reviewer is provided his or her proposals electronically 30 days prior to the panel meeting to prepare individual reviews, which are then submitted electronically to the NSF. The panel typically meets as a group to discuss each proposal with the program manager. The perceived strengths and weaknesses of each proposal are then presented in a panel summary.33

For Phase I proposals a minimum of three and a maximum of six reviewers are used. The average is four reviewers.34 A typical panel makeup for Phase I reviews is three technical reviewers and one reviewer with a combined technical and business background. However, as has been noted previously, the SBIR COV concluded in 2004 that inadequate consideration is being given to commercial potential in evaluating Phase I proposals. The COV has recommended that more well-qualified representatives from the business sector be used to staff business panels for Phase I review.

Phase II proposals typically receive a minimum of three technical and three business reviews. Technical and business reviewers external to the NSF’s SBIR program office perform the review. At Phase II, specific arrangements for proposal reviewers vary depending on the program manager. Some program managers, for example, mail their proposals to technical reviewers and convene a panel for business reviews. Some convene a combined panel of technical and business reviewers. Within this variability, however, the business reviewers are reportedly always convened as a panel.35 The business reviewers include

30

“NSF SBIR Response to NRC Questions,” op. cit.

31

The conflict of interest form is based on NSF Manual 14, NSF Conflicts of Interest and Standards of Ethical Conduct. The form may be found online at <http://www.eng.nsf.gov/sbir/COI_Form.doc>.

32

NSF SBIR guidance for reviewers is provided online at <http://www.eng.nsf.gov/sbir/peer_review.htm>. The NSF Proposal and Grant Manual is provided online at <http://www.inside.nsf.gov/pubs/2002/pam/pamdec02.6html>. See also “NSF Response to NRC Questions,” op. cit.

33

NRC Program Manager Survey, op. cit., and “NSF SBIR Response to NRC Questions,” op. cit.

34

Ibid.

35

Ibid.


entrepreneurs, business school professors, professional investors, corporate managers and investors, and others with business and financial experience.

Reviewers at the Phase IIB review come from within the SBIR program office. They are SBIR program managers and other program administrators. These program managers and administrators typically have technical and business qualifications. However, it should also be noted that at this phase, primary reliance is placed on the ability of applicants to obtain third-party financing to signal that a proposal should be funded. Discussions with program officials suggest that few, if any, applicants who meet the third-party financing requirement are not approved. In fact, the practice is that would-be applicants who do not meet the financing requirement withdraw their applications; they are not rejected and are not recorded in the database as failed proposals. Thus, at this phase, internal program reviewers are asked primarily to exercise their judgment to decide if third-party financing requirements are adequately met.

It should be noted that the COV’s 2004 recommendation that reviewers give more attention to societal impact considerations has implications for the future makeup of “business reviewers.” The experience of the ATP, which has a similar criterion for broad benefits potential (interpreted as such), was that entrepreneurs and other business experts tend to be strong in assessing potential for commercialization but are not necessarily strong in assessing potential for widely dispersed spillover benefits to society. To increase the likelihood that factors contributing to spillover effects are taken into account, the ATP engaged economists as reviewers, together with business reviewers, and provided briefings on what constituted broad societal benefits to help reviewers better understand this selection criterion.

8.3.3
Transparency of Selection Process

Although the selection process at first glance appears relatively straightforward, it appears less transparent on further examination. In some instances, the NSF will employ “additional factors” as described earlier. As indicated in Section 8.3.1, these “additional factors” that may enter into consideration during proposal selection include the following: the balance among NSF programs; past commercialization efforts by the firm; excessive concentration of grants in one firm or with one principal investigator; participation by woman-owned and socially and economically disadvantaged small business concerns; distribution of grants across the states; importance to science or society; and critical technology areas. However, there is no further explanation of how, or when, these factors might be applied.

The case studies found that several of the companies had come to the conclusion that the NSF’s SBIR program selection process was unfair. Their concern arose from the reportedly uneven application of the factor “excessive concentration of grants in one firm.” According to the case study interviewees, companies


are unable to tell if and when this factor may be applied by NSF, presumably to reduce multiple awards to companies. The interviewees said they are not told in advance if they have an excessive concentration of grants, or what might constitute an excessive concentration. They observed that other companies that have received more grants than they have continued to receive grants at a time when the relevant case study firms were told they had had enough. Thus, they say they proposed to the NSF’s SBIR program believing they were eligible, and their proposals received favorable reviews, but they were then turned down based on the NSF’s application of the “excessive concentration” factor. The companies say that if they had known they were considered by NSF to have received too many grants, they would have avoided the costs of applying.36

Inspection of the program data does not reveal if and how the program is using these other factors. For example, given that woman- and minority-owned companies have a lower approval rate than other companies, one might conclude either that their proposals are of poorer quality or that the “additional factor” in this case is being applied in reverse. Of course, it is also possible that the approval rate would be even lower without application of “the factor.” Or, perhaps, this additional factor is not being applied. The point is that the selection process lacks transparency when it comes to application of “additional factors.”

As long as the main selection criteria are applied without triggering “additional factors,” the case study companies appeared to believe the selection process to be “fair.”

8.3.4
Scoring Procedures

Reviewers are instructed to score all proposals against the technical merit and broader impact criteria. The NSF does not provide a weighting system for reviewers to use in scoring proposals against the criteria. The reviewers do not assign a numerical score to proposals and do not score the individual components of proposals, such as qualifications of the principal investigator, adequacy of facilities, qualifications of other staff, commercial potential, etc. Rather, they give the proposals an overall rating of “Excellent,” “Very Good,” “Good,” “Fair,” or “Poor.”37 Each reviewer submits a summary rating and an accompanying narrative.

The reviewers are then typically convened as a panel to make a consensus recommendation, based on the merit of the proposal, as to whether it should be recommended for funding. The panel gives each proposal one of three designations: (1) “Highly Recommended,” (2) “Recommended if Funding Permits,” and (3) “Do Not Consider for Funding.” Based on a recommendation by its advisory committee, the NSF SBIR program added a category to the Phase II review panel process called “Fund with Revision,” in addition to “Highly Recommend” and “Do Not Fund.” A typical proposal that receives a recommendation of “Fund with Revision” is one that is generally meritorious but is missing some key information in the technical and/or commercialization plan that, if available, might result in a “Highly Recommend” classification. The program manager contacts the Phase II proposer, typically provides copies of the individual reviews, and asks the proposer to address the issues within a short period of time. This is not a resubmission of the proposal but a clarification of issues. If the program manager is satisfied with the response, he or she can recommend the proposal for an award.

36

See the case studies in Appendix D for MER Corp. and NVE Corporation.

37

NRC Program Manager Survey, op. cit.

Indeed, all input from the individual reviewers and the panel of reviewers is advisory to the program manager. The program manager acts on the advice of the panel, taking into account the other proposals under consideration and presumably the “additional factors” which govern selection. The SBIR program manager—who is also the topic manager—makes a final recommendation. The final grant decision is made by the NSF Grants Office.

8.3.5
Role of Program Manager

The program manager plays a key role in grant selection. The program manager’s responsibilities include contributing to topic development, providing elements of program outreach, selecting proposal reviewers, and making recommendations for funding decisions—taking into account the advice of reviewers. If the 2004 recommendations of the COV are followed, the program manager may also make site visits to grantees in the future.

8.3.6
Resubmission Procedures and Outcomes

The NSF’s SBIR program has a limited resubmission policy, which operates at the discretion of OII. It affects Phase I, II, and IIB applications differently.

Phase I proposals that are declined by the NSF can be resubmitted to a new solicitation, but declined Phase II proposals cannot be resubmitted, even by invitation. There is no appeal or reconsideration of a Phase I proposal that has been declined. A company whose Phase II proposal has been declined can request a reconsideration of the decline decision through a standard NSF process. This is not a rebuttal of reviews or a rereview of the proposal. The reconsideration process is carried out by an independent group (outside of the SBIR program) to determine whether the correct procedures and policies were followed in the review and decision process.

For selected Phase II proposals that are considered contenders, given improvements in their commercialization plans, applicants may be given two to five days to resubmit. They are provided the review panel’s comments on the proposal’s commercialization plan. An invitation to resubmit must be received by an applicant from the review panel chair; otherwise, Phase II proposals that have been declined are not eligible for resubmission. About a third of invited resubmissions reportedly receive Phase II funding.38

There is no resubmission for a Phase IIB application. Phase II grantees have a single attempt to gain a Phase IIB grant. Once that window passes, grantees are not permitted to reapply.39

As noted by several of the case study companies, there is no appeals process. Any invitation to resubmit must come from the panel chair; an appeal from an applicant to revise and resubmit is not permitted.

Several of the case study companies indicated frustration that they were not allowed to revise and resubmit under two conditions: (1) when the company judged from the reviewer comments that reviewer objections would be quick and easy to remedy, or (2) when the company deduced from split reviewer opinions and poorly explained rationales for the rejection that a reviewer had failed to understand the proposal and that the fault likely lay with the reviewer rather than with the proposal. That there is a basis for the expressed frustration of these companies is suggested by observations in the 2004 COV report. Commenting that there is room for improvement in the feedback given to small businesses from the review process, the COV observed that when there is “wider variation in individual reviews,” better documentation of the basis for a consensus decision is needed. The case study results also suggest the importance of providing applicants with clear communication about a funding decision based on conflicting reviews. If a fault is found with a proposal that appears easily remedied, it is equally important either that the submitter be allowed to make quick changes and resubmit or that a debriefing make clear why the decision went against the applicant and why resubmission is not an option.40

8.4
TRAINING (AFTER SUCCESSFUL APPLICATION)

Small technology-driven companies seeking to commercialize a product, service, or process may need special assistance in any of the following areas: (1) identifying potential partners and investors; (2) understanding negotiation processes with a variety of other businesses; (3) valuing technologies for negotiating licensing, partnering, spin-offs, and sales of technology; (4) protecting intellectual property; (5) changing or augmenting internal cultures to reflect the importance of commercialization as well as research; (6) making arrangements for tests, trials, certifications, and demonstrations required for commercial success; (7) enhancing abilities to present business opportunities to potential investors; and (8) generally shaping strategic choices and developing more detailed business plans.

38

Robert-Allen Baker, “Commercialization Support at NSF,” undated draft, p. 11.

39

Based on program information posted at the NSF Web site and an interview with Joseph Hennessey, NSF, on March 3, 2006.

40

In discussions about the absence of an appeals process, NSF SBIR program managers made a compelling argument that they would be unable to hold to their target selection schedule and also accommodate an appeals process, but that they could accommodate resubmission by invitation.

8.4.1
Training Programs for Agency Phase I and Phase II Grantees

Within the constraints imposed by legislation and feasibility, the NSF’s SBIR program provides training support to small businesses. Specifically, the training is intended to foster commercialization by building grantees’ business acumen. The commercialization assistance offered by the NSF in Phase I aims at assisting the companies “to attend to issues related to commercialization using an educational model.”41 The assistance, while administered in Phase I, has centered on enabling companies to develop a commercialization plan as required for Phase II. The NSF reportedly adopted this strategy because it believed that grantees would benefit from having business coaching and a commercialization plan whether or not they proceeded to Phase II.

In 2001 the NSF’s SBIR program began the Commercialization Planning Program (CP2), a commercialization assistance program run by Dawnbreaker, Inc. In a recent round of training, Dawnbreaker assisted 76 companies in developing commercialization strategies for their NSF-funded technologies.42

The training, which entails intervention at the Phase I stage, is aimed at assisting participating firms to begin (or continue) the development of a commercialization strategy for the NSF-funded technology. The Dawnbreaker training takes participating firms through preliminary strategic planning, interacting with customers, sizing markets, and examining the strengths and weaknesses of competitors. Based on their analyses, participants are asked to refine their plans for marketing, distribution, and financing. The process yields a Commercialization Plan, approximately 15 pages long, that conforms to NSF Phase II solicitation guidelines.

Until late in 2002, Dawnbreaker alone provided the program under contract with the NSF. That year, the contract was renegotiated and split with another vendor, Foresight Science & Technology, Inc. The training program provided by the Foresight Science & Technology Training Group uses online training lessons and tutorials, a commercialization/marketing plan template, and supplemental resources, including lessons on deal making and regulations.43

41

Dawnbreaker, Inc., SBIR: The Phase III Challenge: Commercialization Assistance Programs (1990–2005), July 15, 2005, p. 15.

42

More about the NSF’s Commercialization Planning Program ( CP2) may be found at the trainer’s Web site, <http://www.dawnbreaker.com/gov/nsf.html>.

43

Both of the Phase I commercialization assistance contractors meet with their assigned clients during the Phase I Grantees Workshop. Additional information about Foresight Science & Technology and its training offerings may be found at the company’s Web site, <http://www.seeport.com/training/>.

Both Dawnbreaker and Foresight maintain Web sites that offer password-protected entry for SBIR companies.

Dawnbreaker’s CP2 has its kickoff meeting at a Grantees Workshop, a two-day event which all NSF Phase I grantees are required to attend. All Phase I grantees must include in their Phase I budget the costs of attending the workshop and an explicit statement acknowledging that attendance at the Grantees Workshop is required.44

The Grantees Workshop is held at the NSF halfway through the six-month grant period. During the conference, the contractor (Dawnbreaker) meets with each grantee company’s principal investigator in a one-on-one meeting that lasts an hour. Thereafter, the grantee companies can continue in the CP2 and work with the contractor to develop a commercialization plan if they wish.

In 2003, the NSF modified the guidelines for the CP2. The modification required more fully developed financial projections as a part of a company’s business plan. To meet this requirement the training was simplified in order to focus on developing financial projections.45 In keeping with this increased focus on financial projections, the two companies providing commercialization assistance have increased their attention to providing market research information to participating companies.46

The focus until 2005 had been on providing assistance in Phase I. A new NSF initiative in 2005 supported participation of a group of Phase II NSF grant recipients in an “Opportunity Forum™” that features networking with, and presentations of business opportunities to, potential investors. To this end, the NSF formed a partnership with the Department of Energy (DoE) that allowed the NSF to “piggyback” on a DoE-arranged Opportunity Forum™ run by Dawnbreaker.47,48 The NSF identified about 20 Phase II companies with topics of interest to DoE, and approximately 12 to 14 of these participated in the forum.49

Participation of companies in the forum generally requires special coaching to assist them in meeting the informational needs of an audience composed of potential partners and investors. The training associated with the forum typically includes a kickoff meeting, an advanced workshop, and the forum itself. The tools developed by companies include a PowerPoint presentation that emphasizes business opportunities rather than technical details.

The NSF’s Matchmaker program also attempts to help grantees find suitable partners for commercialization. It encourages grantees to obtain additional sources of funding by matching interested Phase II grantees with prospective partners and investors from the venture-capital, angel-investor, and strategic-partner communities.50 As of late 2005, the activity had not met program expectations regarding business participation rates, and the OII was looking for ways to increase participation.51

44

NRC Program Manager Survey.

45

Dawnbreaker, SBIR: The Phase III Challenge, op. cit., p. 15.

46

Ibid., p. 18.

47

DoE, the agency with the earliest Opportunity Forum, has reportedly long opened its Opportunity Forum to partnering with SBIR programs run by other agencies. See also Dawnbreaker, SBIR: The Phase III Challenge, op. cit., p. 5.

48

DoE Opportunity Forum brochure, 2005, shows joint participation in the forum of both NSF and DoE SBIR Phase II grantees.

49

Telephone interview with Joseph Hennessey, NSF, on October 18, 2005.

Because companies often receive SBIR grants from multiple agencies that offer commercialization assistance, some NSF grantees have received commercialization assistance from other agencies’ programs prior to, during, or after the CP2, and generally through the same contractors.52

8.4.2
Rating the Effectiveness of Various Training Efforts

In addition to providing training for the NSF’s SBIR program, Dawnbreaker has provided training for participation of Phase II companies in Opportunity Forums for other agencies, including DoE, EPA, the National Institutes of Health (NIH), and the ATP of the National Institute of Standards and Technology (NIST). Although it is still too early to find metrics for the effectiveness of the training for the NSF, Dawnbreaker has compiled metrics for these other programs that may serve as benchmarks for evaluating the effectiveness of its Phase II training for the NSF.

The primary metrics have been the amount of funding received by Opportunity Forum participants within 18 months of the forum, and the percentage of participants who received funding within 18 months. The highest recorded success rates for participation in an Opportunity Forum to date have been for the NIST-ATP program, for which 70 percent of the 20 participating companies reportedly obtained a total of $60 million in investor funding within 18 months of the forum.53 The benchmark performances by SBIR companies range from 40 percent to 68 percent of just over 30 participants receiving close to a total of $40 million within 18 months. No control group was used against which to compare these metrics.

The NRC Program Manager Survey results provide a comparative rating by NSF program managers of the usefulness of the various support functions provided to grantees.54 Table 8.4-1 shows these ratings.

50

SBIR Phase II grantees and interested prospective partners and investors are encouraged to sign up for the Matchmaker program by sending an email to <SBIRMatch@nsf.gov>. See also “NSF SBIR Response to NRC Questions,” op. cit.

51

“NSF SBIR Response to NRC Questions,” op. cit.

52

An example of a company that has completed Dawnbreaker commercialization programs offered by the NSF and by other agencies is Materials and Electrochemical Research (MER) Corp., which is among the case study companies given in Appendix D. MER participated in DoE’s commercialization assistance programs in 1989, 1994, and 1995, and in the NSF CP2 in 2001 and 2003. Ibid., p. 9.

53

Dawnbreaker, SBIR: The Phase III Challenge, op. cit., p. 18.

54

As was noted earlier, rather than have all its program managers complete this survey (as did the other agencies), the NSF responded by having its senior advisor, Joseph Hennessey, complete the survey on behalf of all of its program managers.

TABLE 8.4-1 NSF Rating of Support Functions by Their Usefulness

Type of support function                                        Usefulness Rating (1 = highest)
Business plan development                                       1
Commercialization planning assistance to Phase I grantees       2
Partnering                                                      3
Matchmaking with VC and other funders                           4
Information and planning                                        4
Government contracting guide                                    5

SOURCE: NRC Program Manager Survey.

The most useful training functions were reported to be “business plan development” and “commercialization planning assistance to Phase I grantees.”

8.4.3
Take-up Rates and Projections

All NSF Phase I grantees attend the Grantees Workshop. Thereafter, participation declines. Between 40 percent and 80 percent of eligible companies each year since the program began have reportedly continued to participate in the NSF’s CP2 after the Grantees Workshop.55

8.4.4
Constraints on Commercialization Assistance Training

Training for grantees is constrained and guided by the 1992 SBIR reauthorization legislation. The legislation allowed agencies to spend no more than $4,000 per company for commercialization assistance during Phase I.56 This amount has remained unchanged since 1992, and, hence, has declined in constant dollars. It appears not to be based on a realistic assessment of the actual cost of providing such services.

The 1992 legislation also makes it possible for individual companies (authorized by an agency) to spend up to $4,000 of their Phase II grant for such services.57 However, “the cost to a vendor of negotiating individual contracts and nondisclosure agreements with Phase II companies makes … this method of working untenable…. experience has demonstrated that Phase II services offered in a programmatic fashion and paid for by one or more agencies are … preferable.”58 Again, the amount set has declined in constant dollars.

55

The rates are based on attrition rates after the mandatory Grantees Workshop, which reportedly ranged from 20 percent to 60 percent in recent years. See Robert-Allen Baker, Commercialization Support at NSF, op cit., Section 5.2.2, p. 5.

56

Dawnbreaker, SBIR: The Phase III Challenge, op. cit., p. 2.

57

Ibid.

58

Ibid.

The bottom line appears to be that it has been easier under the existing legislation to provide individual company business planning under Phase I than Phase II, and that, at best, the ability to do so has been eroding over time.

8.5
THE NSF PHASE IIB PROGRAM

8.5.1
Description

Introduced as a pilot program in 1998, the NSF SBIR Phase IIB program is intended to help bridge the gap between technology research and commercialization by providing an incentive for SBIR grantees to seek partnerships with investors and to continue their research while securing the support of third-party investors. The Phase IIB option extends “the R&D efforts beyond the current Phase II grant to meet the product/process/software requirements of a third-party investor to accelerate the Phase II project to the commercialization stage and/or enhance the overall strength of the commercial potential of the Phase II project.”59 The NSF’s SBIR program essentially holds back what could otherwise be part of its Phase II funding and grants it as a supplement to those Phase II grantees who show evidence of commercialization potential, as indicated by their ability to attract third-party funding. The Phase IIB grant is provided as a partial match to the amount of third-party funding.

Prior to November 1, 2003, the maximum amount of a Phase IIB grant was $250,000. This supplement extended the Phase II grant for one year, and the combined initial Phase II and supplemental IIB grants typically would not exceed three years in duration.

After November 1, 2003, the maximum Phase IIB supplement was increased to $500,000. For a Phase IIB supplement in excess of $250,000, the initial Phase II grant could be extended for two years, with the combined initial Phase II and supplemental IIB grants not exceeding four years in duration. Thus, the total cumulative grant for the Phase II and Phase IIB supplement increased from $750,000 to $1 million in 2003.

To be eligible to apply for a Phase IIB grant, a company must have completed one year of work on the initial Phase II grant (or receive special permission from the NSF SBIR program officer). The company must also meet the requirements for third-party funding, with a third-party investor providing at least $100,000. Furthermore, to be eligible, the applicant must apply for the Phase IIB grant during the original performance period of the relevant Phase II grant. The NSF’s SBIR program office announces the deadlines for submission of Phase IIB proposals. (For example, a choice of two deadlines was recently offered: March 1 or September 1.)

59

The source is the NSF SBIR Web site, which provides information on the Phase IIB option, http://www.nsf.gov/eng/sbir/phase_IIB.jsp#ELIGIBILITY.

As in the case of a Phase II grant, there are two proposal review criteria.

  1. What is the intellectual merit of the proposed activity?

    This criterion is supported by the following additional questions:

    • Will the completion of the proposed activity lead to a solid foundation of the scientific and engineering knowledge and understanding base?

    • Has the firm progressed satisfactorily in the Phase II activity to justify a Phase IIB activity?

    • Is the proposed plan a sound approach for establishing technical feasibility that could lead to commercialization?

  2. What are the broader impacts of the proposed activity?

    This second criterion is supported by the following additional questions:

    • Does the commercialization plan summary in the proposed activity show a clear path to commercial and societal benefits?

    • Does the proposed activity reflect changes to the Phase II commercialization plan that further improve the chances of converting the research into societal benefits?

    • What are the expectations of the third party, and how effective will the third-party-funded activity be in leading to commercial and societal benefit?

    • What are the competitive advantages of the subject technology?

8.5.2
Use of Matching Funds

The Phase IIB grant requires third-party funding twice that of the NSF grant. To put it another way, the NSF matches third-party investment with $0.50 on the dollar. The minimum size of a Phase IIB grant is $50,000, requiring $100,000 in third-party funding. The maximum Phase IIB grant is $500,000, requiring $1 million in third-party funding. The additional federal funds are to be used only for advancing the research of the project. The third-party investor funds can be used for research or for business-related efforts in order to accelerate the innovation to commercialization, including market research, advertising, patent applications, and refining business plans. The method by which the investor will provide the funding to the company must be identified. The third-party funding can be cash, liquid assets, or tangible financial instruments but not in-kind or other “intangible assets.” Loans and investments with contingency clauses are not acceptable. Self-funding does not qualify for the Phase IIB option.

For Phase IIB grants up to and including $250,000, the third-party funding may consist of other government funding, such as other federal funding and state and local funding, as well as private-investor funding. A Phase IIB grant in excess of $250,000 is considered “supersized,” or a “Phase IIB+ grant,” and
the third-party funding for the amount in excess of $250,000 must come from one or more nongovernmental private investors. For example, a Phase IIB grant of $400,000 would require third-party funding of $800,000. Of this amount, as much as $500,000 could include funding from other government sources (i.e., the third-party funding against the first $250,000 of the Phase IIB grant), and the remaining $300,000 must be from private investors (i.e., the third-party funding against the additional $150,000 of the Phase IIB grant).
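
To make the matching arithmetic above concrete, the following minimal sketch (in Python, purely illustrative; the function name and structure are not part of any NSF system) computes the total third-party funding a requested Phase IIB supplement would require and how that match splits between sources that may include government funding and sources that must be private investors.

```python
def phase_iib_match(requested_supplement):
    """Illustrative sketch of the Phase IIB matching rules described above.

    The NSF matches third-party investment at $0.50 on the dollar, so the
    required third-party funding is twice the requested supplement.  The match
    against any portion of the supplement above $250,000 (the "supersized"
    portion) must come from private investors; the match against the first
    $250,000 may include other government sources.
    """
    if not 50_000 <= requested_supplement <= 500_000:
        raise ValueError("Phase IIB supplements range from $50,000 to $500,000")

    required_match = 2 * requested_supplement               # $2 of third-party funds per $1 of NSF funds
    supersized_portion = max(0, requested_supplement - 250_000)
    private_only_match = 2 * supersized_portion              # must come from private investors
    flexible_match = required_match - private_only_match     # may include government sources

    return required_match, flexible_match, private_only_match


# Reproduces the example in the text: a $400,000 Phase IIB grant requires
# $800,000 in third-party funding, of which up to $500,000 may come from
# government sources and at least $300,000 must come from private investors.
print(phase_iib_match(400_000))  # (800000, 500000, 300000)
```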

To meet third-party funding requirements, NSF emphasizes “money in the bank.” Vaguely worded letters of commitment are not adequate. Beyond actually having the third-party funding in hand, the only commitments that are acceptable are those that are “date certain,” that is, commitments that define a series of payments that are scheduled to be made by the third-party investor on specified dates, with evidence provided that the grant recipient actually receives payment as promised.

8.5.3
Application and Selection Procedures

About 30 percent of Phase II grantees have applied for Phase IIB grants. Approximately 80 percent of these applicants have been successful in receiving a Phase IIB grant. As indicated earlier, the 20 percent of applicants that do not receive grants are not officially recorded as failed applications. Rather, applicants who cannot successfully find third-party financing withdraw their application, and those who do find the financing receive the Phase IIB grant. It does not appear that any applicants who met the third-party investment requirement had been turned down.

Applicants must submit Phase IIB proposals using the NSF FastLane system, and they must follow other proposal preparation and submittal directives posted at the NSF SBIR website. The additional work proposed must expand on the technical work being performed in the present Phase II project and must fall within the scope of that project. (See Section 8.11.1 for more on FastLane.)

All Phase IIB proposals are reviewed in-house by NSF; there is no review for Phase IIB proposals external to the NSF’s SBIR program office. Each proposal is reviewed based on the criteria given in Section 8.5.1. They are identical to the criteria applied to Phase I and Phase II proposals except that they are fleshed out to relate specifically to Phase IIB.

Additionally, if the requested amount exceeds $250,000, a representative from the SBIR grantee company and one from the private-sector third-party contributor are expected to make a presentation to a panel made up of NSF SBIR program officers.60

60

One of the case study companies (Language Weaver) made the point that there might be issues a grantee would not wish to discuss in front of an investor (third-party funding source) and that the presentation process should take this into account. The ATP handled a similar issue by allowing potential partners and investors to give portions of their presentations separately.

Further, if the requested amount exceeds $250,000, the final grant recommendation may be subject to SBA approval. (Supplements equal to or less than $250,000 require no presentation and no SBA approval.)

According to both NSF program officers and Phase IIB applicants who were interviewed in the course of the case studies, proposal selection for the Phase IIB option heavily emphasizes commercial potential as signaled by third-party funding. Essentially, rather than allocate all of its Phase II funding based on the reviews conducted just after Phase I, the NSF’s SBIR program imposes a second, market-based test. As was shown in Chapter 4, the effect is to distribute the money differently than if all of it had been awarded based on the initial Phase II selection process.

Within 60 days of the submission deadline for Phase IIB proposals, the applicant is notified of the results. If a proposal is recommended for grant, the company must submit proof of the bank transaction showing that the third-party has exercised its commitment or the date-certain agreement for transfers of third-party funding.

8.5.4
Role of Program Manager in Phase IIB

NSF SBIR program managers play a key role in the selection of Phase IIB grants. A minimum of two program managers review each Phase IIB proposal and make a recommendation, either to fund or not to fund, to the SBIR office director. A program manager oversees each Phase IIB grant throughout its life and reviews all documentation, including all financial documentation, for compliance with program requirements. If funding were made available for site visits, this would provide an additional role for the program manager.61

8.6
THE GRANT CYCLE AND FUNDING GAPS

8.6.1
Phase I to Phase II Gap

Table 8.6-1 shows the NSF’s SBIR program grant cycle over a ten-year period—from 2002 through 2012—for four topic areas and for Phase I, Phase II, and Phase IIB grants. According to program staff, the program intends to follow this same basic time line for the foreseeable future.62

The ten-year cycle covers solicitations, proposal submittal, review, award of Phase I, Phase II, and Phase IIB grants, and post-grant commercialization reports. The illustrative cycle provided by the NSF’s SBIR program covers separate solicitations for two topic sets, each set consisting of two major technology topics.

61

According to discussions with program staff, program managers sometimes visit grantee companies when they are traveling in the area on other business that has financial support and have the opportunity to “piggy-back” a company visit. However, such visits are not systematic.

62

Discussions in 2004 with Ritchie Coryell, NSF SBIR Program Staff. (Note that Mr. Coryell has since retired.)

TABLE 8.6-1 NSF SBIR Significant Dates for a Recent Grant Cycle
(Topics 1 & 2 = Advanced Materials and Information Technology; Topics 3 & 4 = Electronics and Biotechnology)

SBIR Phase            Event                                          Relevant Dates, Topics 1 & 2      Relevant Dates, Topics 3 & 4

Phases I, II & IIB    Topic formulation                              January 2002
                      Solicitation topics posted on Web              March 1, 2002

Phase I               Solicitation opens                             May 1, 2002
                      Proposals due                                  June 12, 2002                     January 22, 2003
                      Review panels                                  August–September 2002             March–April 2003
                      Grants & declines                              December 2002                     June 2003
                      Grant start dates                              January 1, 2003                   July 10, 2003
                      Grantees workshops                             April 7–9, 2003                   ~October 2003
                      Final reports                                  July 15, 2003                     January 15, 2004

Phase II              Proposal due dates                             July 29, 2003                     ~January 2004
                      Review panels                                  August–October 2003               March–May 2004
                      Grants & declines                              December 2003                     June–July 2004
                      Grant start dates                              January–March 2004                June–November 2004
                      Grantees workshops                             ~January 2004, 2005 & 2006

Phase IIB (a)         Proposal due date                              March 1, 2005                     November 1, 2005

Phases II & IIB       Final reports                                  ~2007                             ~2007
                      Postgrant annual commercialization reports     2008–2012

(a) Does not reflect the schedule for the “supersized” Phase IIB grants.

SOURCE: Ritchie Coryell, NSF.

Each of the Phase I solicitations extends over a cycle of about 19 months, covering the period from the opening of the solicitation to the final Phase I reports for the second topic group.

For each topic set, approximately a year elapsed from the Phase I proposal due date to the submission of the project’s final report. About half the elapsed time occurred between the proposal due date and the project start date, and about half occurred between the project start date and the due date for the final reports. Thus, processing the proposal took approximately the same amount of time as carrying out the project. For example, proposals for the Advanced Materials and Information Technology topic set were due on June 12, 2002; the projects started on January 1, 2003; and the final reports were due in July 2003.

Interwoven with the Advanced Materials and Information Technology set of topics was the time line for the other topic set, Electronics and Biotechnology. For this second set, the proposals were due in late January 2003, about seven months after the proposals for the first set. The second set of projects started in July 2003, close to six months after the proposals were due, and their final reports were due in mid-January 2004. Again, it takes about the same amount of time to process the Phase I proposals and identify the grants as it takes the companies to complete the Phase I research and prepare a final project report—approximately six months for each activity.

The third section of the table pertains to Phase II grants. The proposals for the follow-on Phase II projects are due about the time the Phase I projects end. The Phase II projects start five to ten months after the Phase II proposal due dates and should be completed within two years—unless a follow-on Phase IIB is received.

Phase IIB proposals are due approximately a year after Phase II projects start. For Phase IIB grants not in excess of $250,000, the projects are due to be completed and the final reports filed about a year after the Phase II proposals would have been due. For grants in excess of $250,000, another year is added to the completion time, so the projects should be completed within four years.

From an analysis of the schedule depicted in Table 8.6-1, the main persisting funding gap occurs between completion of the Phase I grant and the start of the Phase II follow-on grant. According to the grant schedule, this period is approximately six to ten months. Any unscheduled delays will make the gap even longer for individual companies. A delay may be critical for very small companies. Yet, without provision for bridge funding of some sort, a gap may be unavoidable because time is required for the program to review Phase II proposals and decide which Phase I projects are worthy of continuation. With a growing number of proposals and administrative resource constraints, reducing the gap by tightening the schedule appears problematic. At the same time, as explained below, firms can hold the gap to a minimum by adhering to the program’s reporting and application schedule.

Eighty-one percent of NRC Phase II Survey respondents reported they had experienced a funding gap between the end of Phase I and the start of Phase II; 19 percent said they had not. For those reporting that they experienced a gap between the end of Phase I and the start of Phase II, the average gap was 8 months. For 5 percent of the respondents, the gap was 2 years or more.

The funding gap had troubling implications for many respondents. As a result of the funding gap between Phase I and Phase II, 28 percent of NRC Phase II Survey respondents stopped work on the project. Thirty-seven percent continued work but at a reduced pace during the gap, and 1 percent ceased all operations until funding resumed. Five percent of the respondents said they experienced a funding gap but were unaffected by it. A small percentage reported receiving bridge funding. As the case studies revealed, companies that had built up a revenue stream in advance were able to continue work during the funding gap by self-funding research.

According to program officials, companies have an opportunity to cut the funding gap themselves by taking the following steps: (a) promptly completing and filing their reports for Phase I; (b) promptly applying for Phase II at the first of two opportunities; and (c) taking steps in advance to ensure that there will not be problems with their financial audits that would delay the Phase II award.63

The program’s flexible approach to preaward work and its payment plan may also serve to abate funding gaps. Phase I grant recipients are allowed to begin preaward spending up to ninety days before the effective date of the grant. On the effective date, the grant recipient receives two-thirds of the money, i.e., $66,666 (assuming the current maximum Phase I grant), and the remaining third, i.e., $33,334, at the project’s end upon submission of a final report. Recently, the program has begun providing what it calls a Phase IB option to bridge the gap in funding between Phase I and Phase II. The Phase IB option provides additional funds to Phase I grantees that obtain third-party investment to support their projects, thereby extending the R&D beyond the initial Phase I grant for six months.64
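
As a minimal illustration of the Phase I payment plan just described (assuming the $100,000 maximum Phase I grant implied by the figures above; the function is a hypothetical sketch, not NSF terminology):

```python
def phase_i_disbursements(grant_amount=100_000):
    """Sketch of the Phase I payment plan described above: two-thirds of the
    grant on the effective date and the remaining third upon submission of
    the final report.  Rounding the up-front payment down reproduces the
    $66,666 / $33,334 split cited in the text for a $100,000 grant."""
    upfront = grant_amount * 2 // 3             # two-thirds, rounded down to the dollar
    on_final_report = grant_amount - upfront    # remainder held until the final report
    return upfront, on_final_report


print(phase_i_disbursements())  # (66666, 33334)
```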

A Phase II grant recipient is also allowed to begin preaward spending. This is at the company’s own risk, however, because any grant is conditioned on the successful completion of a financial audit. To speed financial audits, the program has hired CPA firms. Phase II recipients receive a lump sum up front, with the remainder spread out over the work period. However, the program holds final payment until the final report is submitted.

Given the time required for proposal solicitation and project selection, the program’s limited administration resources, and its lack of procurement possibilities, it appears difficult—if not impossible—for this program to avoid altogether a funding gap between Phase I and Phase II. Hence, it seems particularly important that the program staff ensure that grantees understand the various steps they themselves can take to minimize the length of the funding gap.

8.6.2
Other Funding Sources

The NRC Phase II Survey asked respondents whether matching funds or other types of cost sharing were received for the Phase II proposals. Approximately one-third of the survey projects received such funding and two-thirds did not. Table 8.6-2 shows the sources of the third-party funding for the 48 projects that did receive it.

63

Telephone interview with Joseph Hennessey, NSF, October 18, 2005.

64

See NSF SBIR program information at <http://www.nsf.gov/eng/iip/sbir/phaseibinfo.pdf>.

TABLE 8.6-2 Third-Party Sources of Matching and Co-investment Funding in Phase II

Source of Matching Funds                                          Percentage of Projects with Matching
                                                                  Funds Obtained from Each Source (%)
Another company provided funding                                  58
Our own company provided funding (includes borrowed funds)        38
An angel or other private investment source provided funding      21
A federal agency provided non-SBIR funds                           4
Venture capital provided funding                                   2

NOTE: The percentages are computed for the approximately one-third of NRC Phase II Survey projects that reported third-party sources of matching and co-investment funding.

SOURCE: NRC Phase II Survey.

The major source was “another company,” followed by the company itself, and then by “an angel or other private investor.” Note that few of the projects received funding from venture capitalists. Clearly, respondents interpreted this question as extending beyond third-party financing for Phase IIB because self-funding is not eligible for this purpose. The other sources included are eligible for meeting third-party financing requirements.

NRC Phase II Survey respondents were also asked whether any additional developmental funding had been received or invested in the referenced project. Nearly two-thirds said they did receive such funding and slightly more than one-third said they did not. As shown in Table 8.6-3, the major source of the additional funding was “non-SBIR federal funds,” followed by “other private equity.” Venture capitalists provided approximately 6 percent of the reported additional funds. On average, each referenced project in the survey received over half a million dollars in additional funding for further development.

The case studies (see Appendix D) provide examples of companies that avoided the funding gap after Phase I by attracting funding from other sources. Some leveraged SBIR grants to obtain the larger ATP awards then available. Some had developed products and services for sale. Some obtained SBIR grants from other agencies. Some had formed partnerships that provided various forms of financial support, including licensing revenue.

While research results provide insights about funding in the Phase II period, neither the NRC Phase II Survey results nor the case studies show quantitatively how adequate the additional funding is in meeting financial requirements for development and commercialization. Frequent comments by case study interviewees about the large requirements for capital and the difficulty in obtaining it suggest that they experienced a financing shortfall in most cases.

TABLE 8.6-3 Funding Sources and Average Amounts of Additional Developmental Funding for the Referenced Projects

Funding Source                                            Average Amount of Developmental
                                                          Funding per Survey Project ($)
Non-SBIR federal funds                                    248,077
Private investment:
  (1) U.S. venture capital                                 39,450
  (2) Foreign investment                                   19,290
  (3) Other private equity                                196,141
  (4) Other domestic private company                       57,925
Other sources:
  (1) State or local governments                           19,938
  (2) College or universities                                 617
Not previously reported:
  (1) Your own company (including borrowed money)           54,617
  (2) Personal funds                                        22,154
Total average additional funding for development,
  all sources, per referenced project                     658,214

NOTE: The average amount of developmental funding was computed by dividing the total dollar amounts reported for each funding source by the 162 respondents in the survey, not just the number of respondents reporting additional development funding.

SOURCE: NRC Phase II Survey.
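
The averaging convention stated in the note can be made explicit with a small sketch (the amounts below are invented placeholders for demonstration only, not survey data): source totals are divided by all respondents, not just by those who reported additional funding.

```python
# Illustrative sketch of the averaging convention in the note to Table 8.6-3:
# the per-project average divides the total reported for a funding source by
# ALL respondents, not only by those who reported that source.
# The amounts below are invented placeholders, not survey data.
reported_amounts = [0, 0, 250_000, 0, 150_000]   # most respondents report nothing for a given source

avg_over_all_respondents = sum(reported_amounts) / len(reported_amounts)                  # convention used in Table 8.6-3
avg_over_reporting_firms = sum(reported_amounts) / sum(a > 0 for a in reported_amounts)   # alternative NOT used

print(avg_over_all_respondents, avg_over_reporting_firms)  # 80000.0 200000.0
```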

Because the NSF generally does not procure the technologies it funds to serve its mission, there is essentially no support from the NSF to grantees in the post–Phase II period from agency purchases of grantee goods and services. As was shown earlier, however, there is evidence that other federal agencies procure the results of some NSF-funded innovations, and these procurements may help reduce a financing shortfall.

8.6.3
Bridge Funding Programs (After Phase II)

Other than the Phase IIB funding program described in detail in Section 8.5, the NSF’s SBIR program does not operate bridge funding programs.

Program officials created the Matchmaker program in 2002 to help grantees deal with the sometimes lengthy process of securing third-party funding and commercial partners after Phase II. Aimed at helping grantees find funding from the private sector, Matchmaker is an online database where prospective private-sector investors and partners, as well as SBIR firms looking for investors and partners, can register and connect with one another. SBIR program managers provide a brokerage function by helping to match registered qualified prospective investors/partners with Phase II grantees that fit the profile of interest. Although the Matchmaker service reportedly has not yet been as effective in speeding the process of finding third-party funding as had been hoped, it offers potential as a mechanism for reducing the funding gap at both the Phase II–Phase IIB stage and the post–Phase IIB stage.

8.7
REPORTING REQUIREMENTS

8.7.1
Reports Submitted to the Agency by SBIR Winners

For Phase I grants, the deliverable at the end of the grant period is a technical report that summarizes the experimental and theoretical accomplishments of the project. Phase I grant recipients must submit a Phase I Final Report within 15 days after the expiration of the grant. This report must be submitted using the report template provided by the FastLane system prior to submitting a Phase II proposal. According to instructions provided to applicants, failure to provide the final technical report will delay NSF review and processing of pending proposals submitted by the principal investigator. Principal investigators are reminded in NSF instructions to examine the formats of the required reports in advance to assure the availability of required data.

Phase II grantees are required to provide three kinds of reports: (1) interim reports of progress, (2) a final project report, and (3) annual commercialization reports for 5 years following completion of the Phase II grant. The interim reports are due at the end of 6, 12, and 18 months, unless the grantee receives a supplementary Phase IIB grant, in which case additional interim reports at the end of 24 and 30 months are added (and more interim reports are added at 6-month intervals, as needed, to cover the duration of a supersized Phase IIB grant). These reports also are submitted using the FastLane system.
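
The reporting schedule described above follows a simple rule; the sketch below (illustrative only, with an assumed function name and a month-based timeline) generates the interim, final, and commercialization report milestones for a grant of a given duration.

```python
def phase_ii_report_schedule(duration_months=24):
    """Illustrative sketch of the Phase II reporting rules described above.

    Interim reports fall at 6-month intervals before the end of the grant
    (6, 12, and 18 months for a standard 24-month Phase II; 24 and 30 months
    are added when a Phase IIB supplement extends the grant), a final report
    is due at the end, and annual commercialization reports follow for five
    years after completion."""
    interim = list(range(6, duration_months, 6))
    final = duration_months
    commercialization = [duration_months + 12 * year for year in range(1, 6)]
    return interim, final, commercialization


print(phase_ii_report_schedule(24))  # ([6, 12, 18], 24, [36, 48, 60, 72, 84])
print(phase_ii_report_schedule(36))  # with a Phase IIB supplement: ([6, 12, 18, 24, 30], 36, ...)
```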

If a Phase IIB supplement is added, a final report is due at the end of Phase IIB. The final report is to include an account of cumulative project milestones, documentation of technical accomplishments, and a commercialization section. A final report following completion of a Phase IIB grant must include a combined Phase II and IIB final report and a commercialization report inclusive of an updated commercialization plan. When the grant closes, regardless of whether it is a Phase II or a Phase IIB grant, the grantee company is responsible for continuing to report annually on commercialization for five years, as noted earlier. A warning is issued to companies stating that failure to submit required reports may deter their selection by the program for future grants.65 (See Appendix F for the format of the annual report.)

A new reporting requirement in the postcompletion period has recently been added. NSF SBIR program staff reported that there are to be commercialization reports on the third, fifth, and eighth anniversaries after the start of a Phase II grant. A program manager is charged with conducting telephone interviews with eligible companies that have completed Phase II projects.

65

Yet, as noted earlier, there has been a substantial lack of compliance by grantees.

The information is being collected monthly using a survey template. The companies are provided the survey questions in advance. This new reporting requirement was launched in July 2005.

The first new monthly report (“SBIR Metrics”), submitted by the contractor for July 2005, provided results for 24 of the 30 companies that had third-, fifth-, or eighth-year anniversaries in that month. Simple metrics on “successes” are embedded in a one-page textual report.66

8.7.2
Report Utilization and Utility to the Agency

Reportedly, compliance by grantees in filing postcompletion reports has been sketchy. Furthermore, no evidence was found that program management has used data from the annual postcompletion reports to provide performance metrics or otherwise assist in program evaluation.67

As described above, the NSF’s SBIR program office has taken a step to increase company compliance in reporting postproject data by implementing a new postcompletion telephone interview. The program office reportedly plans to use the results of the telephone survey to gain insight into commercialization progress. However, without a more standardized reporting format that systematically organizes and reports the data in ways that allow trends to be established and easily detected, this new effort seems likely to yield limited results.

8.8
EVALUATION AND ASSESSMENT

8.8.1
Annual and Intermittent Agency Evaluation of Its SBIR Program

NSF SBIR program managers routinely monitor the technical outcomes of Phase I projects using the Phase I final reports. If a Phase II proposal is submitted, other reviewers will also assess the Phase I technical outcomes.68 Routine monitoring of commercial data contained in proposals and final project reports is also done at the Phase II and Phase IIB stages by program managers on an individual basis.

Although a variety of data that might serve as outcome metrics is contained in individual reports and in occasional internal studies, the program does not systematically compile these data and does not use aggregate results as outcome indicator metrics.69

66

Dr. George Vermont, an NSF SBIR program manager, conducted the telephone survey of postcompletion companies and provided this study a copy of the first monthly report on “SBIR Metrics.”

67

Conclusions based on discussions with SBIR program officials at the NSF, August 17, 2005.

68

NRC Program Manager Survey, op cit.

The NSF’s SBIR program office informally collects anecdotal information from grantees and unsuccessful applicants to assess their satisfaction with the program. The results provide guidance to the program office in considering how to improve the program.70

No existing evaluation studies of the NSF’s SBIR program, either conducted or commissioned by the agency, were found in the open published literature.71 However, as discussed from the standpoint of commercial results in Chapter 5, several ad hoc unpublished studies of the program were made available by OII for this NRC study. Two of these were carried out internally by SBIR program managers, and the other (in draft) was commissioned as a contractor study. Of the two internal studies, one was conducted in 2004 and the other in 1996. The contractor report, incomplete and undated, was reportedly intended as a continuation of the 1996 internally conducted study. Additional evaluation activities include the production of “success stories,” or “nuggets,” which are regularly used. A Committee of Visitors (COV) provides an expert review of NSF’s SBIR/STTR program every three years. Each of these efforts is discussed in more detail below.

Once every three years since 1998, the Committee of Visitors has been convened to review NSF’s SBIR program. The COV uses a peer review, or expert review, methodology. The COV is asked to assess all program elements and to recommend ways to improve the program. It is a significant ongoing assessment activity that prompts documented program changes.

In its most recent review (2004), the COV reviewed the NSF’s SBIR/STTR program for the period from 2001 through 2003. It focused primarily on (a) processes and management, and (b) outputs and outcomes, but also examined several other topics, including the positioning of the SBIR program within the NSF, the value placed by the agency on the program, and the development and maintenance of supporting databases.72

As a resource for assessing the program’s processes, the COV reviewed 123 proposal jackets from among the 5,814 proposals processed by the NSF over the three years. The nonrandom selection, which aimed at including proposals across the three years, geographic regions, and underrepresented groups, yielded 78 Phase I proposals, 36 Phase II proposals, and 9 Phase IIB proposals.

69

For example, the 1996 Tibbetts Study included a table with outcome data for 50 companies. The response to the NRC Program Manager Survey was that the NSF routinely collects outcome indicator data, listed by type, but follow-up revealed that the data in question remain disaggregated, are found in different forms in individual proposals and project reports, and are not systematically collected and compiled by the program office for use as indicator data. Telephone interview with Joseph Hennessey, NSF, October 18, 2005.

70

Ibid.

71

The NSF at large has commissioned evaluation studies carried out by external evaluators that are available in the open literature, but these studies were not of the SBIR program.

72

Descriptive information about the COV is available at the NSF’s online account of COV reports and annual updates at <http://www.nsf.gov/eng/general/cov>.

TABLE 8.8-1 Criteria for Judging Success of the NSF’s SBIR Program

Criteria for Judging Success                                  Used by the NSF’s Program    Not Used by the NSF’s Program
Efficient program management (i.e., grants made on time)      X
Commercial outcomes                                           X
Outcomes that support specific agency missions                                             X
Customer (grantee) satisfaction                               X
Technical contributions to the state of the art               X

SOURCE: NRC Program Manager Survey.

8.8.2
Operational Benchmarks for the NSF’s SBIR Program

Table 8.8-1 shows which of the listed criteria OII uses to judge the success of its program. The one criterion listed in the table that is not used as a success measure is “Outcomes that support specific agency missions.”

Operational benchmarks used by OII that were identified by the study pertain to the length of time it takes to process proposals. One stated benchmark is the goal of holding the elapsed time from the Phase I proposal due date to notification of Phase I winners to six months. Another stated benchmark is to complete all Phase I panels within three and a half months. Regarding Phase II proposal processing, a goal is to complete proposal review in no more than four months and to notify winners within six months. The latter goal, however, is conditional on company financial audits being completed without problems. At last report from the NSF, about half the Phase II proposals are processed within six months and half are taking longer than six months due to audit problems.

8.8.3
Evaluators (Internal and External)

NSF SBIR staff members have been the main performers of program-sponsored grantee surveys. By discipline, training, and experience, these staff members are not primarily evaluators; rather, they have assumed evaluative activities in addition to their usual program manager duties.

Only one instance was found of the program’s use of an outside contractor to conduct an evaluation. The firm hired, Dawnbreaker, Inc., is largely known for its expertise in providing business assistance, not evaluation. However, Dawnbreaker does regularly conduct follow-up assessments of the success of the firms it coaches in attracting investment funding.

The writers of nuggets are principally professional writers, not evaluators. They can skillfully present an accomplishment by an innovative company to a varied audience that includes nontechnical people, but their focus is not on presenting a thorough assessment of benefits and costs over time.

The COV members are external experts selected with an eye to diversity. In 2004 the eight members included four from SBIR firms, three from academia, one from the investment sector, and one from state government. Six were male and two were female; one was African American, one was Hispanic, and one was Native American. Two of the COV members were also members of the SBIR/STTR Advisory Committee.73

73 This profile of COV members was excerpted from a memo—dated June 23, 2005—reporting on the diversity, independence, balance, and resolution of conflicts among the SBIR/STTR program COV members. Available online at <http://www.nsf.gov/od/oia/activities/cov/eng/2004/SBIRcov.pdf>.

8.8.4
Annual Evaluation and Assessment Budget

According to the OII at the NSF, only limited funds are regularly made available for program evaluation. For example, it was reported that during the period FY2000–2002, on average, less than 1 percent of the NSF SBIR budget was spent each year on evaluation and assessment.74

74 NRC Program Manager Survey, op. cit.

8.9
FLEXIBILITY

8.9.1
Program Manager Discretion

NSF SBIR program managers appear to have considerable discretion and flexibility, along with considerable responsibility. With the program director’s concurrence, they appear to have substantial latitude in establishing topic and subtopic areas, in arranging for proposal reviews within their areas of responsibility, and in making recommendations as to what will be funded at each stage. The program managers also appear to have considerable autonomy in running meetings within their topic areas and in holding meetings with grantees.

The program managers have primary responsibility, without the advice of peer reviewers, for deciding when supplemental funding is warranted under the provisions of Phase IIB grants. In practice, however, the requirement for third-party funding appears to be the governing factor.

Interviewees for the case studies frequently spoke positively about the fact that NSF program managers are empowered and flexible. Many specifically commented that the NSF SBIR program manager system is a program strength that should be maintained.

8.9.2
Program Manager Funding Discretion

Because of the centralized nature of the NSF’s SBIR program and the lack of subagency divisions often found in other agency SBIR programs, there is no issue of shifting funding from one subagency to another. There is, however, the issue of shifting funding across topic areas, particularly given that two topic areas have solicitations for proposals within the same time frame. Program guidelines point to flexibility in funding decisions but do not directly address whether funding is shifted across topic areas to achieve “balancing,” and, if so, how.

8.9.3
Program Manager Perceptions of Constraints

Informal discussions with program managers revealed no complaints about a lack of discretion and flexibility, with one exception: the constraint on their ability to conduct site visits to their grant recipients. This constraint stems from a lack of designated travel funds for such visits rather than from any restriction on the program managers’ authority to conduct site visits were funding provided.

8.10
SIZE—FUNDING AMOUNTS AND SOURCES

This section explores the size of the funding available to small businesses through the NSF’s SBIR program and other sources. It first revisits the formal and effective limits on the size and duration of Phase I, Phase II, and Phase IIB grants available from the NSF’s SBIR program. Then, for a sample of Phase II projects, it examines the additional funding that grantees obtained before and after the surveyed grant, and it investigates the sources of those funds.

8.10.1
Formal and Effective Limits on Size and Duration of Grants

The program imposes formal and effective limits on the size and duration of individual Phase I, Phase II, and Phase IIB grants, characterized as follows:

  • Phase I: Funding for feasibility research
    • Six-month research period
    • A no-cost time extension—typically three months—can be granted75
    • Up to $100,000 of SBIR funding
  • Phase II: Initial Phase II funding for research toward developing a prototype
    • Twenty-four-month research period
    • A no-cost time extension—in six-month increments for up to two years—can be granted76
    • Up to $500,000 of SBIR funding
  • Phase IIB: Supplemental Phase II funding awarded as a match against outside investment funding
    • A Phase IIB grant has a minimum size of $50,000 and a maximum size of $500,000
    • Twelve-month research period added to the Phase II period if the supplement does not exceed $250,000
    • Twenty-four-month research period added to the Phase II period if the supplement exceeds $250,000
    • From roughly 1998 to November 1, 2003, up to $250,000 of Phase IIB funding per grant was available
    • As of November 1, 2003, funding over $250,000 and up to $500,000 is available per “supersized” Phase IIB grant
    • NSF will match up to 50 percent of third-party investment funding received (the matching arithmetic is sketched just after this list)
    • To receive the minimum amount of $50,000 from the NSF, the applicant must show a minimum of $100,000 in third-party funding
    • To receive the maximum amount of $500,000 from the NSF, the applicant must show a minimum of $1,000,000 in third-party funding
    • For grant amounts in excess of $250,000, the third-party funding must come from nongovernmental private-sector sources
    • Phase IIB helps bridge the gap between research and commercialization
    • A one-time opportunity
  • Phase III: A non-SBIR funding phase during which companies are encouraged to develop commercial products, processes, and services and take them to market. This phase has:
    • No time limits
    • No SBIR funding provided
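To make the Phase IIB matching arithmetic concrete, the following short Python sketch computes the maximum NSF match implied by the bullets above (a 50 percent match, a $50,000 floor, and a $500,000 cap). It is illustrative only: the function name and structure are ours, the dollar thresholds are taken from the list above, and program rules that are not purely arithmetic, such as the private-sector-source requirement above $250,000 and the one-time nature of the supplement, are noted only in comments.

    def nsf_phase_iib_match(third_party_investment):
        """Illustrative sketch of the Phase IIB matching arithmetic described above.

        NSF matches up to 50 percent of qualifying third-party investment,
        subject to a $50,000 minimum and a $500,000 maximum supplement.
        Not modeled: the rule that amounts above $250,000 must come from
        nongovernmental private-sector sources, and the one-time limit.
        """
        match = 0.5 * third_party_investment
        if match < 50_000:
            return 0          # below the $50,000 floor, no Phase IIB supplement
        return min(match, 500_000)

    # Examples consistent with the bullets above:
    # nsf_phase_iib_match(100_000)   -> 50_000   (minimum supplement)
    # nsf_phase_iib_match(1_000_000) -> 500_000  (maximum supplement)
    # nsf_phase_iib_match(2_000_000) -> 500_000  (capped at the maximum)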

The total amount of funding set aside for SBIR is a function of the legislated SBIR rate, currently 2.5 percent, and the size of the agency’s R&D budget to which the rate is applied. Thus, the level of SBIR funding may rise or fall over time. It appears that the NSF uses all of its available SBIR funding each year.

75 Additional no-cost time extensions may be granted, but if they go past the second open date for a company to submit its Phase II proposal, the company will miss out on the Phase II opportunity. Interview with Joseph Hennessey, NSF, October 18, 2005.

76 Ibid.


TABLE 8.10-1 Phase I Grants, 1992–2005

Fiscal Year   Number of Grants   Average ($)   Total ($)
1992          208                49,755        10,349,115
1993          256                49,727        12,730,184
1994          309                64,103        19,807,945
1995          301                64,571        19,436,011
1996          252                74,283        18,719,287
1997          261                74,262        19,382,371
1998          215                98,371        21,149,814
1999          236                98,749        23,304,757
2000          233                99,405        23,161,372
2001          219                99,353        21,758,218
2002          286                99,162        28,360,340
2003          437                99,275        43,383,103
2004          244                100,000       25,117,992
2005          149                98,575        14,687,703

SOURCE: NSF SBIR program data.

8.10.2
Distribution of Funding to Phase I and Phase II Grants within the Specified Limits

The distribution of NSF SBIR funds between Phase I and Phase II grants reflects the amount of funding available and a time-lag effect of the previous year’s allocation decisions: a year with a large rise in the allocation to Phase I grants tends to be followed the next year by a rise in the allocation to Phase II grants. Discussions with NSF SBIR program officials did not reveal a formulaic approach for allocating available funding among the different grant categories. However, the solicitations mention an approximate amount of funding available, indicating advance planning for the funding allocation.


Actual Size and Duration of Phase I Grants. The average size of NSF Phase I grants, stated in current dollars, increased in a series of steps: from nearly $50,000 in 1992 and 1993, to approximately $64,000 in 1994 and 1995, to approximately $74,000 in 1996 and 1997, and thereafter to an average of nearly $100,000, the current maximum Phase I grant amount.

Total annual Phase I funding rose stepwise over much of the fourteen-year period. In 1992 and 1993, annual Phase I funding ranged from $10 million to $12 million. From 1994 to 1997, Phase I funding totaled roughly $19 million annually. From 1998 to 2001, it ranged from $21 million to $23 million annually. Then, in 2002, the total jumped to $28 million, followed by an even more dramatic jump to $43 million in 2003. In 2004 total Phase I funding dropped back to $25 million, and it fell further in 2005 to $14.7 million. Table 8.10-1 shows the annual number, average size, and total amount of Phase I grants from 1992 to 2005.

The mean annual duration of Phase I grants was essentially half a year throughout the ten-year period, with only minor fluctuations. A mean duration of half a year would be expected, since this is a standard feature of Phase I grants for the NSF’s SBIR program, as established by the SBA, which has oversight responsibility for the program.

Actual Size and Duration of Phase II Grants. The average size of a Phase II grant grew from $246,000 in 1992 to over $300,000 by 1998, to over $400,000 by 2000, and to nearly $500,000 from 2001 to 2005. Effective November 1, 2003, the SBA increased the limit for Phase II (including the combination of Phase II/IIB grants) from $750,000 to $1,000,000. The NSF, however, has chosen to limit the funding of its Phase II grants to $500,000 and to make the remaining allowable $500,000 available as a supplement to the initial amount through follow-on Phase IIB grants.

Like its total Phase I funding, NSF total annual Phase II funding rose over the thirteen-year period, though more irregularly. Total Phase II funding was at the $14-million level in 1992 and 1993. It dropped in 1994, rose back to the $13-million level in 1995, and jumped to nearly $27 million in 1996. Between 1997 and 2003, total annual Phase II funding fluctuated between $33 million and $43 million. In 2004, Phase II funding surged to $65 million as Phase I funding fell back to its 2002 level. Table 8.10-2 shows the annual number, average size, and total amount of Phase II grants from 1992 to 2005.

TABLE 8.10-2 Phase II Grants, 1992–2005

Fiscal Year   Number of Grants   Average ($)   Total ($)
1992          57                 246,373       14,043,289
1993          52                 269,668       14,022,757
1994          21                 279,983       5,879,644
1995          48                 277,136       13,302,542
1996          90                 297,584       26,782,533
1997          122                289,390       35,305,525
1998          117                344,958       40,360,087
1999          89                 394,854       35,141,968
2000          95                 400,827       38,078,524
2001          91                 475,018       43,226,632
2002          67                 495,645       33,208,238
2003          77                 498,505       38,384,851
2004          131                498,254       65,278,995
2005          132                499,715       65,962,445

SOURCE: NSF SBIR program data.


TABLE 8.10-3 Phase IIB Grants, 1998–2005

Fiscal Year   Number of Grants   Average ($)   Total ($)
1998          4                  99,986        399,944
1999          21                 95,170        1,998,574
2000          9                  184,466       1,660,191
2001          14                 307,334       4,302,682
2002          39                 245,861       9,588,580
2003          24                 237,096       5,690,294
2004          22                 273,883       6,025,436
2005          28                 330,731       9,260,464

NOTE: Effective November 1, 2003, the limit on Phase IIB grants was increased from $250,000 to $500,000, yet the average shown for 2001 exceeds $250,000. This apparent discrepancy is explained by the fact that if the Phase II grant were below the allowable size, the program would sometimes increase the amount of the Phase IIB grant to bring the combined Phase II/Phase IIB amount up to the allowable limit—in this case, $750,000. The record keeping for this practice would cause the Phase IIB award amount to appear to exceed the limit. Telephone interview with Joseph Hennessey, NSF, October 18, 2005. Dollar amounts are in current dollars.

SOURCE: NSF SBIR program data.
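The bookkeeping described in the note can be illustrated with hypothetical figures (the dollar amounts below are ours, not taken from program records): a Phase II grant awarded below the NSF's $500,000 ceiling could be topped up through Phase IIB so that the combined award reached the $750,000 limit then in effect, which makes the recorded Phase IIB amount exceed $250,000.

    # Hypothetical illustration of the pre-November 2003 top-up practice
    # described in the note to Table 8.10-3 (figures are invented).
    COMBINED_PHASE_II_IIB_LIMIT = 750_000   # limit in effect before November 1, 2003
    phase_ii_award = 420_000                # a Phase II grant below the $500,000 ceiling
    phase_iib_top_up = COMBINED_PHASE_II_IIB_LIMIT - phase_ii_award
    print(phase_iib_top_up)                 # 330000 -- recorded as a Phase IIB award above $250,000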

According to the SBA, Phase II grants may last “as many as two years.” The annual mean duration of initial Phase II grants over the ten-year period ranged from two to two and a half years. The average likely exceeds the official limit because of accounting closeout issues or requests for extensions.


Actual Size and Duration of Phase IIB Grants. In 1998 and 1999 the average Phase IIB grant ranged between $92,000 and $100,000. The average jumped to $184,000 in 2000. Thereafter, the average Phase IIB grant fluctuated between $234,000 and $294,000 each year.

Total Phase IIB grants rose dramatically from $399,900 in 1998, to nearly $2 million in 1999, which was followed by a drop to $1.7 million in 2000. The total jumped to over $4.3 million the next year, and in 2002 it more than doubled to $9.6 million, the highest level of the entire period. In 2003 and 2004 the total was close to $6 million, and it rose to $9.3 million in 2005. Table 8.10-3 gives the number, average size, and total amount of Phase IIB grants.

According to NSF program officials, they decided to increase the amount of funding available through Phase IIB, rather than increase the amounts of Phase I or Phase II grants, in an attempt to leverage the government’s investment more effectively. Program officials thought that offering additional funding through Phase IIB grants would accelerate development and commercialization of technology by encouraging Phase II grantees to seek third-party funding.

Program officials pointed out that changes in the average size of Phase IIB grants in response to the move to provide supersized grants depend on how successful grantees are in securing third-party investments. “We do not anticipate a significant increase in the amount of total funding, because the grantee will be required to secure $1 million in outside investment in order for NSF to match with an additional $500,000. For instance, we received 15 Phase IIB proposals in November [2004], and only three were able to secure $1 million in outside commitments.”77

Total Amount of Funding Provided by the NSF. Table 8.10-4 shows the total funding the NSF’s SBIR program provided to small businesses each year from 1992 to 2005. Total funding began at $24.4 million, rose fairly steadily to $96.4 million in 2004, and fell back to $89.9 million in 2005.

TABLE 8.10-4 Total Funding Provided to Grantees by the NSF’s SBIR Program, 1992–2005

Fiscal Year   Total Grants of All Types   Total Grant Outlays ($)
1992          265                         24,392,404
1993          308                         26,752,941
1994          330                         25,687,589
1995          349                         32,738,553
1996          342                         45,501,820
1997          383                         54,687,896
1998          336                         61,909,845
1999          346                         60,445,299
2000          337                         62,900,087
2001          324                         69,287,532
2002          392                         71,157,158
2003          538                         87,458,248
2004          397                         96,422,423
2005          309                         89,910,612

SOURCE: NSF SBIR program data.

77 NSF SBIR Response to NRC Questions, op. cit.

8.11
ONLINE CAPABILITIES AND PLANS

8.11.1
FastLane System

Grant application and processing at the NSF is now entirely electronic via the NSF’s online FastLane system. FastLane is used for proposal submittal, panel review, proposal management, tracking payments, checking proposal status, making travel arrangements, submitting reports, and other purposes. According to program administrators, as a result of FastLane, “Doing business with NSF is simpler, faster, more accurate, and less expensive.”78

Proposers are required to prepare and submit all proposals through the FastLane system. To facilitate proposal preparation, FastLane has smart forms capability that pulls in all individual and organizational information available in the NSF database to minimize the amount of information that must be typed in by the user. Detailed instructions for proposal preparation and submission via FastLane are available at <http://www.fastlane.nsf.gov/a1/newstan.htm>.

Principal investigators are also required to use the NSF electronic project reporting system, available through FastLane, for preparation and submission of project reports. This system facilitates electronic submission and updating of project reports, including information on project participants (individual and organizational), activities and findings, publications, and other specific products and contributions.

According to program administrators, the adoption of Web-based electronic business approaches is consistent with the NSF’s broader commitment to e-business. At the time of the study, the NSF was the only agency to receive the highest status rating (green) in two of the government-wide President’s Management Agenda initiatives. During 2002 the NSF became the first federal agency to receive the top rating for the e-government initiative.79

8.11.2
Barriers to Online Capabilities and Plans

No barriers obstructing implementation of electronic filing and processing of proposals were found. The provision of online capabilities appears to have been successfully achieved by the NSF’s SBIR program for all aspects of grant application, processing, and reporting.

8.12
ADMINISTRATIVE RESOURCES

8.12.1
Funding of Program Administration

According to NSF SBIR program officials, the program’s dedicated annual administrative budget was $2 million in a recent year; program officer salaries and certain other administrative support are funded separately and do not come out of this amount. Items covered by the $2 million include contracts for support to supplement NSF personnel, the costs of running review panels, contractor costs to organize and run national conferences or operate outreach Web sites, and other items.80

78 NSF SBIR Response to NRC Questions, op. cit.

79 Ibid.

80 Note that the NSF SBIR staff originally reported its 2004 administrative budget at $3 million. Ibid. This figure was later changed to $2 million. Telephone interview with Joseph Hennessey, NSF, October 18, 2005.


The COV in 2004 commented broadly on the inadequacy of the resources available for managing and administering the program, noting that as the program’s workload has increased, pressures on resources have intensified.

8.12.2
Administration Budget as a Percentage of Agency SBIR Funding

The administrative budget was 2 percent of agency SBIR funding in 2004 ($2 million out of a total SBIR funding of $96.4 million). Because the administrative budget has been relatively stable over several years, while the SBIR funding has increased, the administrative budget as a percentage of the agency’s SBIR funding has dropped. For example, the administrative budget was 2.8 percent of total funding in 2002 and 2.3 percent in 2003. Because the administrative budget is not free to vary with changes in the SBIR funding, program managers have to do more with less as the program grows. If NSF budgets increase in coming years, the pinch in administrative funding can be expected to tighten further unless a change is made.
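As a rough check on the percentages cited above, the following sketch recomputes them from the total grant outlays in Table 8.10-4, assuming the same $2 million administrative budget in each year; that assumption is ours, though it is consistent with the statement that the administrative budget has been relatively stable.

    # Recompute the administrative-budget percentages cited above, assuming a
    # flat $2 million administrative budget and total outlays from Table 8.10-4.
    admin_budget = 2_000_000
    total_outlays = {2002: 71_157_158, 2003: 87_458_248, 2004: 96_422_423}
    for year, total in total_outlays.items():
        print(year, f"{100 * admin_budget / total:.1f}%")
    # Output: 2002 2.8%, 2003 2.3%, 2004 2.1% (the text rounds 2004 to 2 percent).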

8.12.3
Evaluation and Assessment Funding

As noted in Section 8.8.4, program officials have estimated that the program’s annual expenditures on evaluation and assessment in recent years total less than 1 percent of the NSF SBIR program budget.81 However, no actual dollar value was placed on the amount spent, so it might be anywhere between $0 and nearly $100,000 per year. Because externally conducted evaluation would have to come out of the small administrative budget, it is not surprising that the program generally has not commissioned such evaluations. Because in-house staff salaries do not come out of this budget, it is likewise not surprising that staff members have generally carried out the evaluations that have been done.

81 NRC Program Manager Survey, op. cit.

8.13
BEST PRACTICES AND PROGRAM EVOLUTION

8.13.1
Adoption of Best Practices from Other Agencies

There is evidence that the NSF’s SBIR program has adopted at least three, and possibly four, best practices from other agencies. The following examples are illustrative rather than exhaustive.

A notable example of how the NSF’s SBIR program adopted a “best practice” from another agency was the introduction of the NSF Phase IIB program in 1998. According to an NSF staff member, in the mid-1990s the Department of Defense (DoD) introduced “Fast Track” as an incentive for partnering between the small business and investment communities. The DoD’s Fast Track required third-party funding at the front end as a prerequisite for a faster transition from Phase I to Phase II. The DoD program manager would automatically raise the ranking of Fast Track proposals to the top of the funding category. Recognizing the value of attracting investors early on, the OII decided to adapt the third-party financing requirements of Fast Track to the NSF’s SBIR/STTR program. Though derived from the DoD’s Fast Track, the NSF put its own stamp on its Phase IIB program, placing the third-party financing initiative after Phase II and using it to move grantees closer to commercialization.

A second practice borrowed from another agency is the NSF’s Commercialization Assistance Program (CP2). The DoE’s SBIR program initiated a commercialization assistance program to help its grantees early on. The NSF program later saw the value of such assistance and adapted the concept for its own grantees, recognizing that many of them tend to be strong in science but less so in business.

A third practice, very recently borrowed by the NSF from the DoE and Navy SBIR programs, is to give grantees the opportunity to network with and present to potential investors at organized events. In 2005 the NSF began to sponsor the participation of some of its own grantees in a DoE-sponsored “Opportunity Forum.”

A fourth practice, the NSF’s Matchmaker service, may have been modeled on ATP’s R&D Alliance Network, which was established in the mid-1990s. It is not clear that NSF actually modeled its service on ATP’s; however, the two services are similar, and ATP’s predated NSF’s.

8.13.2
Evolution of the NSF’s Program During the NRC Study

It has often been noted that the effects of program evaluation and assessment typically are felt before a study is completed and recommendations are made. As an outside body examines a program, the subject program almost immediately intensifies its self-examination, which leads to internally generated changes. This phenomenon may be at work in the present situation, as the rate of evolution in the NSF’s SBIR program appears to have intensified during the NRC study. Of course, during this same period, the program has also been responding to recommendations of its COV.

An example of a program improvement that occurred during the course of the study is improved data compilation and analysis. At the beginning of this study, requests for program data were met with delays, delivery of partial and incorrect data, and a lack of evidence that the program administration routinely and systematically used program data to manage the program. During the course of the study, the program appeared to make progress in its data management and analysis and became more responsive to the study’s data requests. Furthermore, as evidenced by the program’s recent Strategic Report, the program is making use of data to profile and enhance understanding of program developments and trends.


OII Operational Plans for Improvements. OII included plans for additional program improvements in its recent Strategic Plan. In part, these plans anticipate and potentially address some of the issues identified in this study.82 Here we provide a brief overview of these plans as they relate to those issues.

OII’s Strategic Plan addressed four major goals: (1) to identify, nurture, and lead the small business community to technological innovation arising from the frontiers of academic research; (2) to improve the commercial success of small high-technology businesses; (3) to grow the small business community as a major employer of U.S. scientists and engineers; and (4) to deliver the highest value to the nation’s small technology business community. For each major goal, a set of objectives is given, and an operational plan lists specific tasks for accomplishing those objectives. Here we briefly identify a selection of the objectives and their operational plans.

  1. OII Objective: Enhance training and assistance to small business grantees for the commercialization of SBIR/STTR grants

    Operational Plan:

    1. Offer additional contract assistance to Phase II grantees

      • Develop plans to work with incubators, business schools, and other resources

      • Concentrate on specific technology incubators, for example, Biotech incubators in Maryland

      • Provide innovation management courses to grantees

    2. Revise Phase I requirements (to include more “meat” up front)

      • Bring business reviewers into the Phase I process

    3. Provide training for international patent strategies

  2. OII Objective: Create small business partnerships with investors and corporate partners and provide incentives to accelerate commercialization

    Operational Plan:

    1. Bring investors and corporate partners to grantee conferences and workshops

    2. Use business school MBAs for business assistance

    3. Conduct workshops on “narrow” topics for partnering

    4. Proactively channel NSF grantees to other agencies as potential subcontractors to primes

  3. OII Objective: Judiciously select SBIR/STTR solicitation topics

    Operational Plan:

    1. Continuously review, refine, rejuvenate, and revise the investment business focused topics of Electronics Technology, Biotechnology, and Information-Based Technology

    2. Continuously review, refine, rejuvenate, and revise the industrial market-driven topics of Advanced Materials and Manufacturing and Chemical-Based Technology

    3. Be flexible and nimble enough to issue solicitations on short notice for technologies that respond to national needs

  4. OII Objective: Encourage entrepreneurship by underrepresented groups

    Operational Plan:

    1. Expand the SBIR/STTR program beyond the newly initiated partnering with CREST (predominantly minority academic research institutions)

    2. Increase subcontractor efforts to expose underrepresented small businesses to all business resources, including National Outreach Conferences

    3. Seek ways to increase underrepresented participation in SBIR/STTR

    4. Target underrepresented community pockets

    5. Target both the physically disabled community and technologies for that community

  5. OII Objective: Grant all Phase I and Phase II grants within six (6) months of the solicitation deadline

    Operational Plan:

    The plan includes a group of tasks that center on organizing to handle a high volume of proposals with limited staff and on the need to employ contractors and convene panels of reviewers.

  6. OII Objective: Implement a robust grants management program

  7. OII Objective: Redefine outreach

  8. OII Objective: Conduct an in-depth outcomes assessment

    Operational Plan:

    Develop a plan to define, measure, and test SBIR/STTR program metrics

82 According to the preface in the Strategic Plan, the motivating force driving the OII Strategic Plan is the Engineering Directorate’s strategic directions to “Strengthen Technological Innovation” in response to the National Innovation Initiative Report “Innovate America.”


The objectives in OII’s Strategic Plan point to OII’s efforts to improve the program on a continuing basis. Moreover, the program’s history is replete with changes intended to make it better.

8.14
CONCLUSION

This report is one element of the study that Congress requested of the SBIR program as part of the 2000 program reauthorization.83 It focuses on the smallest, and the oldest, of the “big-five” federal SBIR programs addressed by the NRC study, namely, the NSF’s SBIR program. Drawing on the results of surveys, case studies, data and document analyses, and interviews with program staff and officials, the report has examined how the NSF’s SBIR program is meeting its four mandated purposes. In the aggregate, the report provides an account of a program that is well managed and is delivering results. At the same time, it identifies challenges and recommends operational improvements to strengthen the SBIR program at the National Science Foundation.

83 SBIR Reauthorization Act of 2000 (H.R. 5667, Section 108).
