An Assessment of the SBIR Program at the Department of Energy

8 Program Management

8.1 SBIR IN THE DEPARTMENT OF ENERGY1

The Department of Energy (DoE) SBIR program is located administratively within its Office of Science (SC). With a budget of $3.5 billion, SC is the department’s largest science-funding component. The programmatic emphasis within the Office of Science is on basic energy sciences, biological and environmental research, fusion energy sciences, high energy and nuclear physics, and computational science. Within the federal government, the Office of Science is the largest sponsor of materials and chemical sciences research. In addition to SC, six other technical programs at DoE receive services from the SBIR office: Fossil Energy; Energy Efficiency and Renewable Energy; Electricity Delivery and Energy Reliability; Nuclear Energy; Environmental Management; and Nuclear Nonproliferation. SC is the largest DoE program participating in SBIR (receiving 64 percent of SBIR funds in 2002). As noted above, most nuclear weapons programs are excluded from SBIR.

At the outset, there were competing interests as to which DoE office should house the management of the SBIR program. Ultimately, Alvin Trivelpiece, then the Director of the Office of Energy Research (OER, the forerunner of the Office of Science, SC), persuaded the Secretary of Energy to assign SBIR responsibilities to OER. Two major factors contributed to this decision: (1) approximately two-thirds of DoE’s SBIR funds would come from OER (because most of DoE’s

1 Information in this section is drawn from discussions with current and former DoE SBIR staff and other DoE personnel involved in DoE SBIR, and from the presentation by Milton Johnson (DoE Office of Science Deputy Director for Operations) at the October 24, 2002, National Research Council workshop, The Small Business Innovation Research Program: Measuring Outcomes, Washington, D.C.
Defense Programs area was exempted by law from SBIR), and (2) OER’s basic research culture was thought to be more aligned with the perceived spirit of SBIR research—namely, high risk/high reward. Trivelpiece designated Ryszard Gajewski to serve as the SBIR Program Manager in addition to his responsibilities as director of OER’s Advanced Energy Projects (AEP) Division. Gajewski formed an SBIR Advisory Panel to develop processes for implementing the legislation and to oversee the conduct of the program at DoE.2 The panel provided advice to its chairman, the SBIR Program Manager, who was able to exert considerable influence. Many of its early decisions became policies that are still in effect today. Box 8-1 summarizes major events in the evolution of the program.

Today, the DoE SBIR Office is led by the SBIR Program Manager, who is appointed by the Director of the Office of Science. The management of the DoE SBIR program is centralized along some dimensions and decentralized along others.

Centralization. Following guidance provided by the SBIR Program Manager, all participating technical programs adhere to a common schedule involving one competition per year, observe common procedures for the receipt and evaluation of grant applications, and follow the same scoring guidelines. The processing of proposals (officially referred to as “grant applications” at DoE), including management of the review and selection processes, is administered by the SBIR office within the Office of Science.

Decentralization. The technical programs are responsible for generating technical topics for the annual solicitations, selecting reviewers for each proposal, scoring the proposals, and recommending proposals for funding.
These procedures, which are overseen by the SBIR Program Manager, are performed by approximately 70 Technical Topic Managers (TTMs) and Technical Project Monitors (TPMs) located within the technical programs that participate in SBIR. Each participating DoE program proposes topics to be included in the annual solicitation, and the amount of funding a program contributes directly determines the number of topics it receives. By design, the share of topics for each program is about equal to its share of the overall DoE extramural research budget. The SBIR Program Manager assigns each program a proportional allotment of technical topics for the annual SBIR solicitation, as well as an allotment of both Phase I and Phase II awards,

2 The SBIR Advisory Panel was composed of one representative from each technical program that would contribute to the SBIR set-aside, plus representatives from general counsel, procurement, and budget. Typical representatives were working-level scientific program managers.
Box 8-1 Major Events in the Evolution of the DoE SBIR Program

1983—DoE becomes the first agency to issue an SBIR solicitation with explicitly specified research topic areas. The first solicitation had 25 topics and attracted 1,734 proposals. However, the distribution was far from uniform: one topic received 342 proposals, while others received fewer than 10. DoE reacted by increasing its emphasis on topic editing, attempting to narrow some topics and broaden others.

1986—Samuel Barish appointed SBIR Program Manager. From the beginning, Dr. Gajewski had been performing two functions (he was also AEP director) and delegated most day-to-day operations to Dr. Barish. This appointment largely formalized the status quo.

1988—Commercialization Assistance Program (CAP) begins. DoE was the first agency to initiate a program to assist awardees with the commercialization of SBIR technology. Five applicants responded to a solicitation, and the competition was won by Dawnbreaker, a small company in Rochester, NY. Dawnbreaker went on to provide these services to DoE SBIR awardees over the next 17 years, winning several more DoE competitions. Other agencies have since adopted such programs for their own SBIR awardees. DoE believes that its CAP helped drive the change in the law (P.L. 102-564) that allowed agencies to use funds from the SBIR set-aside for such purposes.

1993—STTR program implemented with first solicitation. With far fewer funds in STTR, DoE limited the number of technical topics, implementing a topic rotation among its technical programs.

1994—Criteria for commercial potential added to evaluation of Phase II proposals. In order to increase the emphasis on commercialization, P.L. 102-564 required agencies to consider four additional criteria in the evaluation of Phase II proposals. DoE formally added these criteria to its scoring system.

1995—SBIR Process Improvement Team recommends significant changes.
An SBIR Process Improvement Team (PIT) was established in response to two primary driving forces: (1) the by-now entrenched tension between the SBIR office and DoE’s technical programs, and (2) the Office of Science initiative on Total Quality Management, pushed by its Director, Martha Krebs. The PIT, composed of five TTMs/TPMs and chaired by Robert Berger, SBIR Program Officer, was charged with developing strategies for streamlining the program and reducing tensions.

based on that program’s “contribution” to the SBIR budget.3 Once the allocation of technical topics is made by the SBIR staff, the technical program managers are free to assign topic responsibility among their own staff. Reasons for such

3 The allocation of SBIR funds in proportion to R&D “contributions” is an arrangement devised in recent years that has successfully countered previous “gaming” by technical topic managers (TTMs) and technical project monitors (TPMs) in the scoring of SBIR proposals.
1996—Robert Berger appointed SBIR Program Manager. Dr. Berger implemented many of the changes recommended by the PIT, on which he had served. First, measures were taken to reduce the workload: the Phase II early-decision program was terminated, topics were further narrowed to reduce the number of proposals, some rules were relaxed, and paperwork was reduced. A new scoring system was instituted to give primary authority for award selection to the technical programs; although the SBIR Program Manager still maintained oversight on scoring, the Application Score Verification Panel (ASVP) was dropped, and a formal process was implemented to allow the technical programs to appeal SBIR office decisions to a neutral third party. Finally, a policy was implemented to attempt to provide all technical programs with a fair return on their SBIR investment.

1999—DoE issues a single solicitation for both SBIR and STTR. To reduce the workload further, a single solicitation was issued for both SBIR and STTR, and the two programs were put on the same evaluation schedule. To increase their chances of winning an award, small businesses that partnered with a research institution were allowed to apply to both programs with a single proposal by checking a box on the cover sheet. Consistent with this change, the durations of SBIR and STTR Phase I projects were made equal, at nine months each. With equal Phase I durations, the department can evaluate the Phase I and Phase II awards for both programs simultaneously, with significant gains in efficiency. (For Phase II, DoE uses the 2-year duration prescribed by the SBA.)

2000—Congress mandates SBIR study by the National Academy of Sciences. In response to this mandate, DoE negotiated a contract for the study with the NAS and began to respond to NAS inquiries with respect to the study.
2005—Technology Niche Assessment (TNA), another assistance program, is added. In addition to the Commercialization Assistance Program offered by Dawnbreaker, DoE issued a new solicitation for an innovative assistance program to help SBIR firms better commercialize their technology. Foresight, with its TNA program, won the solicitation. Dawnbreaker and Foresight now both offer their programs to SBIR firms, and both are small businesses.

reassignment of responsibility and the manner in which it is carried out are described in section 5.4.2.

8.2 RESOURCES FOR PROGRAM ADMINISTRATION

Public Law 102-564 prohibits the federal agencies from using any of their SBIR budgets to fund the administrative costs of the program. As a result, DoE
regards the administrative expenses as an additional, though unspecified, tax. As host of the SBIR office, DoE’s Office of Science is responsible for the direct costs of administering the program: salaries for the federal employees, support-services contracts, the costs of developing and maintaining the electronic grant-management system, and so on. Because SC must use its own funds to administer the SBIR program, and because of the historical resistance to SBIR at DoE, there has been a tendency for SC to reduce the resources for administering the program—ultimately to the point where 1.5 full-time DoE employees, assisted by five contractors, were managing the administration (though not the review) of approximately 1,500 applications annually (both Phase I and Phase II). In March 2005, DoE SBIR reported a staff of three federal employees and six contractors. The staff still reported substantial resource constraints, including spending 20-25 percent of their time on redundant paperwork issues.

To cope with these resource constraints, the SBIR staff has placed a high premium on sticking to a clearly mapped-out schedule and meeting established deadlines, practices that are in turn appreciated by TTMs and TPMs. Management results over time—on-time service, the cooperation of the technical program staff, acceptable outcomes—might seem to indicate that the program receives approximately the appropriate level of agency funding, even though the level of administrative support per dollar of funds awarded, or per application processed, is lower at DoE than at the other study agencies. Low funding levels for administration mean, however, that the DoE SBIR staff devotes nearly 100 percent of its time to managing the processes for generating technical topics and for receiving, evaluating, and selecting grant applications.
This leaves little time for activities such as outreach, measuring Phase III activity, encouraging Phase III activity (both within and outside the department, including the national laboratories), internal evaluation, strategic planning, and documenting successes. Reallocating resources toward SBIR administration in an agency whose culture has long favored more traditional research performers (national laboratories, universities) will be a challenge.

8.3 TOPIC GENERATION

Since at least 1992, SBIR topic development at DoE has proceeded as follows. In the spring, approximately six months prior to publication of the program solicitations in the fall, the DoE SBIR Program Manager sends a “call for topics” memo to all of the department’s SBIR portfolio managers (the designated representatives responsible for managing the SBIR portfolio within each of DoE’s technical program areas), with copies to all DoE personnel who managed technical topics in the prior year. Accompanying the memo is the National Critical Technologies List,4 as well as guidelines for topic development.5 The SBIR portfolio managers develop their own specific procedures for generating the technical topics.

Once proposed topics are received from the portfolio managers, the DoE SBIR Program Manager arranges for a technical edit of the topic statements into a common format, beginning with an articulation of the problem to be addressed, often followed by a description of some technical approaches considered to be of interest. Prior to publication of the topic, its author provides the SBIR Program Manager with at least five bibliographic references (sometimes including Web sites) to be published with the topic description. In FY2003, this process resulted in 47 technical topics within 11 program areas.

All technical topics in the SBIR solicitation are constructed to support the overall mission of the technical program areas that provide the topics. Therefore, any proposal deemed responsive to its topic (in the first-step technical review described below), and subsequently selected for award, would by definition be using DoE SBIR funds to support the DoE mission. One hundred percent of DoE SBIR awards satisfy this condition.

DoE explicitly uses narrowly defined technical topics to keep the number of applications at what the agency views as a manageable level. Staff noted in interviews that increases in the number of applications may strain the resources available to provide a thorough evaluation. This is an important point. By defining topics narrowly—and by specifying in some cases the technical means to be used to address a topic—DoE is explicitly limiting the number of likely respondents, and is de facto excluding firms that might have innovative approaches or may recognize problems that DoE has not yet become concerned about.
Narrowing the topics is therefore a trade-off: it reduces the agency’s workload, but also its opportunity. A few of the DoE case study companies indicated a preference for broader topics; others commented that tight topics gave the impression that a topic may have been designed to give a particular company an advantage in the selection competition.

8.4 AWARD SELECTION

8.4.1 First-step Technical Review

The technical review of SBIR proposals takes place in a two-step process. In the first step, conducted by the Technical Topic Managers (TTMs), a proposal can be declined for any one of five reasons:

4 As conveyed to DoE by the Small Business Administration.

5 For example, Executive Order 13329, which encourages innovation in manufacturing. Access at <http://www.whitehouse.gov/news/releases/2004/02/20040224-6.html>.
1. It is deemed unresponsive to the solicitation topic and subtopic;

2. It is not for research or for research and development;

3. Not enough information is provided to properly evaluate the proposal;

4. It unduly duplicates other work; or

5. Its scientific and technical quality is deemed significantly lower than that of other proposals in the competition.

The last of these reasons requires the TTM to score the proposal against the evaluation criteria. The objective of the first step of the review process is to reduce workload and to avoid having reviewers waste time on nonresponsive or poor proposals.

8.4.2 Initial Review Approaches

Until 1995, each Phase I proposal was assessed according to five criteria:

1. The scientific/technical quality of the proposed research;

2. The proposal’s degree of innovation;

3. Staff qualifications and the availability of adequate facilities or instrumentation;

4. The anticipated benefits, technical and/or economic, of the proposed research (Phase I and Phase II), with special emphasis on the attraction of further funding; and

5. The extent to which the Phase I award could prove the feasibility of the concept.

For each criterion, every application received a score from zero (unsatisfactory) to four (excellent). The TTMs (or, in Phase II, the TPMs, Technical Project Monitors) would receive comments from expert technical reviewers, usually from outside DoE, and would then score the proposal based on those comments.

This approach resulted in tensions between the SBIR office and the technical programs. TTMs and TPMs raised multiple concerns, largely focused on the workload. Early SBIR solicitations attracted at least twice as many proposals, per dollar awarded, as other DoE programs. Also, the need to maintain program uniformity, ensure that each applicant was treated fairly, guarantee the confidentiality of proprietary information, and conform to SBA rules added extra steps and paperwork.
Finally, DoE’s commitment to on-time service6 exacerbated demands on the TTMs and TPMs.

6 From the very beginning, the DoE SBIR program emphasized the need to meet its deadlines. The reasons for this were twofold: (1) the deadlines were mandated in the SBIR Policy Directive, issued by the Small Business Administration to guide agency operations, as stipulated in P.L. 97-219 (although most other agencies did not meet the SBA guidelines); and (2) even more importantly, DoE believed that a timely response to its applicants was only fair—after all, most had put a lot of time into their proposals, did not deserve to be left in limbo, and were anxious to begin work.
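As a purely illustrative sketch of the pre-1995 scheme described above, the five criteria and the zero-to-four scale can be modeled in a few lines. All names here are hypothetical, and combining the criterion scores by simple averaging is an assumption made for illustration; the chapter does not specify how DoE aggregated them.

```python
# Hypothetical model of the pre-1995 five-criterion scoring scheme.
# Each criterion is scored 0 (unsatisfactory) to 4 (excellent);
# averaging the scores is an illustrative assumption, not documented DoE practice.

CRITERIA = (
    "scientific/technical quality",
    "degree of innovation",
    "staff and facilities",
    "anticipated benefits",
    "Phase I feasibility",
)

def score_application(scores: dict) -> float:
    """Validate one 0-4 score per criterion and return a simple average."""
    if set(scores) != set(CRITERIA):
        raise ValueError("every criterion must be scored exactly once")
    for criterion, value in scores.items():
        if not 0 <= value <= 4:
            raise ValueError(f"{criterion}: {value} is outside the 0-4 scale")
    return sum(scores.values()) / len(CRITERIA)

# A proposal rated "very good" (3) on every criterion except innovation (4):
example = dict.fromkeys(CRITERIA, 3)
example["degree of innovation"] = 4
print(score_application(example))  # 3.2
```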
In short, technical program personnel believed that they were being asked to do an exorbitant amount of work for a small percentage of their budget, with the research performed by a new type of performer with whom they had little experience and about whose capabilities they had some doubt. Technical program managers were also concerned that they were not getting a reasonable return on their SBIR investment.

8.4.3 1995 Process Revisions

In 1995, a Process Improvement Team was established to resolve these tensions (see Box 8-1 for the program chronology). The team recommended changes to the grant selection process, which have evolved into the procedures used today. The first-step review process and the selection of reviewers remained the same; the major changes concerned scoring and award selection.

First, the scoring system was made less rigid and more subjective. The number of criteria was reduced from five to three:

1. Quality of the scientific/technical approach;

2. Ability to carry out the project; and

3. Impact.

Second, the primary authority for award selection was shifted to the technical programs. Each program’s SBIR portfolio manager is responsible for developing a process for determining which grant applications should be selected for award. While the SBIR Program Manager continues to oversee scoring, this process is now less formal. Within the SBIR office, funding candidates are reviewed to ensure that their scores can be justified by the comments of the reviewers. If a discrepancy occurs, the SBIR Program Manager contacts the TTM to resolve the scoring. If the discrepancy cannot be resolved, the proposal is referred to an Adjudicating Official, appointed by the Director of the Office of Science. The new grant selection process has worked well from this perspective: each year, only one or two proposals are referred to the Adjudicating Official.
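The score-verification step just described (office check, TTM reconciliation, referral to an Adjudicating Official) is, at bottom, a three-outcome decision rule. A minimal sketch, with hypothetical names and the inputs reduced to booleans:

```python
# Hypothetical sketch of the post-1995 score-verification flow: the SBIR
# office checks whether a candidate's score is justified by the reviewers'
# comments; discrepancies go to the TTM, and unresolved discrepancies are
# referred to an Adjudicating Official.

def verify_candidate(score_justified: bool, ttm_resolves: bool) -> str:
    """Return the disposition of one funding candidate's score review."""
    if score_justified:
        return "approved"
    if ttm_resolves:
        return "resolved by TTM"
    return "referred to Adjudicating Official"

print(verify_candidate(False, False))  # referred to Adjudicating Official
```

The chapter notes that the final branch is rare in practice: only one or two proposals per year reach the Adjudicating Official.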
8.4.4 Fairness of Competition

The interviewed firms that commented on the fairness of competition (Eltron, NanoScience, NexTech) believed that the award system was fair. They appeared to understand that subjectivity plays a role in the selection of proposals for award, but they regarded this subjectivity as a general weakness of the peer review system rather than a specific weakness of the SBIR award process. One firm (Eltron) perceived an unevenness of competition between applicants that were academic offshoots and those that were independent private companies. In a university setting, a grantee may have the advantage of hiring
post-docs for little money or of having access to university equipment at no extra charge. It was suggested that decision makers should take this potential disparity into consideration.

8.5 OUTREACH

Among the consequences of limited administrative funding is the program’s constrained ability to conduct outreach to the small business community and to others. Because the staff is so small and the demands of the evaluation processes are so large, DoE staff decline most speaking invitations. DoE does participate in the SBIR National Conferences, which have been held semiannually and are sponsored by DoD and NSF. However, DoE has generally avoided state and local conferences because: (1) its staff was extremely small and busy with its own SBIR processes; (2) the National Conferences served the same purpose; and (3) too often, the audience was either too small or inappropriate (i.e., companies with no research capabilities) to justify the time and expense.

DoE does take steps to make small businesses aware of upcoming SBIR solicitations. Historically, DoE posted its solicitation in the Commerce Business Daily and mailed out twenty thousand copies each year to ensure that a broad community of small businesses was aware of the DoE SBIR program.

The NRC SBIR Phase II Survey sought information from firms regarding the extent to which they sought outside assistance in preparing their proposals, whether from state agencies, mentor companies, regional associations, or universities. About 86 percent of respondents reported receiving no assistance in proposal preparation. A few firms received assistance from state agencies or from mentor companies; universities provided assistance to 8 percent of the respondents. Of those that received assistance, 60 percent found it very useful, 35 percent found it somewhat useful, and 5 percent did not find it useful.
8.6 THE APPLICATION AND AWARD PROCESS: AWARDEE COMMENTS

Half of the case study companies (Atlantia, Eltron, IPIX, NanoScience, Pearson) expressed overall satisfaction with the application process. Two of the three interviewees (IPIX, NanoScience) that commented on the feedback provided to applicants indicated that the comments were informative and straightforward; the third company (PPL) did not believe the feedback was particularly useful.

There were a number of recommendations with respect to the solicitations and the submission of proposals. Two case study companies (Airak, NexTech) recommended that all agencies issue their solicitations at least twice per year, as DoD and NIH now do. One of these interviewees (Airak) recommended a combined SBIR application process for all agencies, in order to save applicants time and effort.
One company indicated that the amount of work necessary to submit proposals outweighed the resulting funding, and another (Airak) was ambivalent about whether the preparation of SBIR proposals was an effective use of resources. However, four interviewees (Atlantia, Eltron, IPIX, NanoScience) believed that the benefits more than make up for the cost and time spent. One of these companies (NanoScience) stated that the learning process involved in putting together a strong application could be beneficial, sometimes more so than any eventual reward.

Regarding topic specification, several interviewees (IPIX, NanoScience, PPL) were ambivalent about the need for broader topics; but where preferences were expressed (Atlantia, IPIX, PPL), they favored broader topics in order to increase transparency, ease topic selection for applying firms, and allow easier entry by smaller participants (IPIX). One case study company (Atlantia) “almost did not do a proposal” because its research did not appear to fit a published solicitation topic; yet the work led to great success. Three interviewees (IPIX, NexTech, PPL) suspected that, occasionally, very tight topic specifications could imply a lock on the grant by a particular applicant whose orientation and capabilities are better reflected in the specifications.

8.7 MANAGING INFORMATION ON AWARDS

8.7.1 Reporting Requirements

DoE collects three types of information from SBIR participants. First are the SBIR proposals themselves: data on all proposals for the last 20 years (successful and not) are maintained in a FoxPro database. Second are the traditional progress and final reports of the research project itself (final reports only for Phase I), which are delivered to both the TPM and the contracting officer.
Final reports for Phase I projects that do not receive Phase II funding, as well as Phase II final reports, are delivered to the TPM and the department’s Office of Scientific and Technical Information. The firms’ reports are held in confidentiality for four years after completion of the SBIR effort, as required by P.L. 102-564.7 Third, as a condition of their grants, Phase II awardees are required to report Phase III information for three years after their SBIR award, and voluntarily thereafter; however, not all awardees actually do so.

8.7.2 Freedom of Information Act

One of the main purposes of the SBIR program, the commercialization of SBIR-developed technology, would be defeated if the results of the research were

7 This law, P.L. 102-564, does not state explicitly when this 4-year period begins.
made public. Therefore, the SBIR office has worked with DoE’s Freedom of Information Office to identify legal exemptions that can be used to protect the companies’ proprietary information, both in their proposals and in their reports.

8.8 PROGRAM STRUCTURE

8.8.1 Differences Between Agencies

Most of the case study companies indicated an awareness of the differences among agencies in their administration of the program, but none indicated a problem with the fact that such differences exist. In fact, one company (Creare) called these differences “a strength” of the program. Two companies (Eltron, NexTech) understood that the administrative differences were intended to serve the agencies’ differing missions and goals; for example, NSF procedures are consistent with its pursuit of more basic and long-term research leading to commercial products, while DoD seeks solution-oriented proposals with more immediate military applications. Some specific differences between agencies are identified within the categories that follow.

8.8.2 Award Limits

Three interviewees (Creare, IPIX, NanoScience) believed that the size of the awards is about right, although one of them (IPIX) preferred a slight increase for Phase II. In general, these companies prefer the current situation to an environment with fewer but larger grants. One interviewee (IPIX) commented that the current funding and award ratio allows a wide variety of topics to be presented by the agencies while also encouraging firms to keep applying for SBIR grants; a large increase in funding might lead to complacency on the part of applying firms, as they would be able to stretch an award over a longer period of time. Two others (Eltron, NexTech) advocated an increase in the award limits to keep up with inflation.
The final interviewee that commented on the size of awards (Pearson) recommended a large increase in the Phase II limit, to $2 million.

8.8.3 Time Frames

The case study companies commented on three types of time frame: (1) the time it takes to be paid following the submission of an invoice, (2) the time between Phase I and Phase II, and (3) the time for the entire process—from application submission to the end of Phase II—to be completed.

The first time frame received the most attention. Although one company (IPIX) said that funding delays, if any, have been inconsequential, most of the others recognized differences among the agencies. DoE was considered to
be the most prompt agency, with payments occurring fairly rapidly (Eltron). One interviewee (Atlantia) said that after it received its SBIR awards, DoE sent checks for the amount of the awards and required little oversight. While switching to electronic filing and payments has sped up the process at some agencies, others, such as DoD, were reported to take longer to make payments.

The other time frames were each cited once by the interviewees. One case study company (IPIX) said that the gap between Phase I and Phase II requires attention, as it is difficult for some firms to retain employees or facilities during the intervening time. Another (Airak) recommended seeking ways to speed up the total award time frame—four years from Phase I application to Phase II completion—including an option that would allow grantees to work more quickly, in order to aid future commercialization efforts.

8.8.4 Gaps Between SBIR Phase I and Phase II Funding

Another agency-wide issue for the SBIR program concerns the gap in funding between Phase I and Phase II. According to the NRC Phase II Survey, 36 percent of the responding projects from DoE experienced no funding gap between the end of Phase I and the beginning of the Phase II project targeted in the survey. For the remaining firms, the funding gap affected work on the project: 56 percent stopped work on the project during the gap, and another 33 percent continued to work during the gap but at a reduced pace. Three percent of firms reported ceasing all operations during the gap. Only 3 percent of the respondents received bridge funding of some kind during the gap. The average gap, as reported by 95 respondents, was 5 months; 3 percent of respondents reported a gap of one year or more. The department’s target for the Phase II gap remains three months.
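The survey percentages above use two different bases: the 36 percent figure is a share of all responding projects, while the stopped/slowed/ceased figures appear to be shares of the remaining, gap-affected projects. A short calculation, under that assumption, puts them on a common base:

```python
# Convert the NRC Phase II Survey gap figures to shares of all DoE
# respondents, assuming the 56/33/3 percent outcomes are measured over
# the gap-affected projects only (the 64 percent reporting a gap).

no_gap = 0.36                # share of projects reporting no funding gap
gap_affected = 1 - no_gap    # remaining 64 percent experienced a gap

within_gap = {
    "stopped work": 0.56,
    "continued at reduced pace": 0.33,
    "ceased all operations": 0.03,
}

for outcome, share in within_gap.items():
    print(f"{outcome}: {share * gap_affected:.0%} of all respondents")
```

Under this reading, roughly a third of all DoE respondents stopped work entirely during the gap, and about a fifth continued at a reduced pace.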
During the early years of the program, DoE did have in place a gap-closing mechanism that allowed Phase I awardees to apply early for Phase II, effectively cutting the gap between phases. However, this approach was eliminated because the additional round of proposal reviews and scoring was seen as an added burden for the SBIR office as well as for the TTMs and TPMs in the technical program offices.

8.9 PARTICIPATION OF DOE NATIONAL LABORATORIES IN SBIR

8.9.1 Overview of DoE National Laboratories

Among the federal agencies that participate in the SBIR program, the Department of Energy is unique with respect to its government-owned, contractor-operated national laboratories (GOCOs). The uniqueness has to do with their large number and breadth of interests. While some other agencies have GOCOs (e.g., NASA's Jet Propulsion Laboratory and NSF's National Center for Atmospheric Research), they are relatively few in number. For the most part, the other SBIR agencies have government-owned, government-operated laboratories (GOGOs), for example, the Department of Commerce's National Institute of Standards and Technology (NIST) or the Navy's Naval Research Laboratory (NRL). (DoE also has a GOGO, its National Energy Technology Laboratory.) GOCOs are not staffed by government employees, and, consequently, the work that they do is not being performed by the government. Therefore, GOCOs, including the DoE national laboratories, are eligible to serve as subcontractors in SBIR, since their participation does not violate the prohibition against SBIR dollars going back to the government. The major national laboratories within DoE can first be distinguished by the DoE programs they serve. The majority are either Science laboratories (i.e., "owned" by DoE's Office of Science (SC)) or Defense laboratories (i.e., "owned" by DoE's National Nuclear Security Administration, which administers the defense programs that are exempt from the SBIR set-asides). The Science laboratories can be multiprogram laboratories (serving a variety of technical areas within SC) or single-program laboratories. The multiprogram Science laboratories and all three Defense laboratories are very large, with annual budgets on the order of hundreds of millions of dollars; some exceed $1 billion. Budgets for the SC single-program laboratories are typically in the tens of millions, up to about $100 million. Together, these laboratories represent a major national resource. The national laboratories also represent a resource for small businesses participating in SBIR. For the most part, their facilities and expertise cannot be duplicated elsewhere; in fact, the national laboratories are not allowed to compete with the private sector.
As with universities, when a national laboratory is identified as a subcontractor on an SBIR proposal, the credibility of the research team is enhanced. One might therefore expect a large number of small business/national laboratory partnerships to exist. However, this is not the case. Typically, fewer than 10 percent of DoE SBIR projects involve such partnerships; many more projects have subcontracts with universities.

8.9.2 Why SBIR Collaborations Are Not More Frequent

The reasons for the infrequent use of the resources of the national laboratories may include:
• A lack of outreach on the part of most national laboratories;
• Some trepidation among small companies in dealing with such large organizations, particularly when it comes to having to negotiate cooperative research and development agreements (CRADAs) or related vehicles;
• The fact that some of the other agencies do not allow their SBIR awardees to partner with DoE national laboratories; and
• The recent change in the SBA policy directive requiring small businesses to apply for a waiver in order to partner with a DoE national laboratory.8

The national laboratories would appear to have an incentive to reach out to small businesses to encourage partnerships on SBIR projects.9 After all, a significant part of the SBIR set-aside represents money that would otherwise have passed from DoE to the laboratories, and collaboration on these projects would represent one means of recouping a portion of these funds. In addition, some laboratories have technology transfer goals that could be satisfied by these collaborations. Also, some laboratories, desiring to be good neighbors, seek to support the small businesses in their communities. However, the laboratories have varied widely in their outreach efforts, ranging from aggressive to uninterested. For reasons that are not entirely clear, however, the national laboratories are treated differently in the two policy directives, for SBIR and for STTR, promulgated by the Small Business Administration. For STTR (not covered by this NRC study), the enabling legislation (P.L. 102-564), along with the STTR Policy Directive, specifically allows Federally Funded Research and Development Centers (FFRDCs), a category identified by the National Science Foundation that includes the DoE national laboratories, to serve as the research institution (a required subcontractor on all STTR projects). For SBIR, however, the SBA Policy Directive imposes an additional requirement on small businesses that wish to partner with a national laboratory: the small business must obtain a waiver from the SBA by certifying that the work to be performed by the laboratory is important to the project and cannot be obtained elsewhere. The DoE SBIR office assists its applicants in navigating this waiver request process, and, to this point, no request has been denied by the SBA.
8.10 DEVELOPMENTS IN PROGRAM ADMINISTRATION SINCE 2003

8.10.1 Online Capabilities and Plans

The technology for administering grant applications at DoE was paper-based throughout the focal period of this study (1992-2003). In fall 2004, subsequent to the first meeting with the Academies' study team, DoE initiated a pilot effort

8 At DoE, the granting of such waivers is routine. SBIR program staff report that the waiver process has been made minimally burdensome to small businesses. Nonetheless, for firms that may be unaware of the strong likelihood of being granted such a waiver, this requirement does create an additional barrier to collaboration with the national laboratories.

9 Of course, the laboratories must adhere to fairness-of-opportunity constraints, which, for example, would restrict a laboratory from soliciting a particular company as a partner for an SBIR project; however, laboratories may alert broad numbers of small businesses to their expertise with respect to technical topics in agency solicitations, or broadly invite small businesses to a laboratory open house.
to explore the possibility of transitioning to an electronic format for applications. Since 2005, all SBIR applications have been handled electronically. The new procedure imposed significant start-up costs, both in setting up the new technology and in the time required to process and manage applications. These costs are expected to diminish as the new system becomes more familiar.

8.10.2 Program Manager Given More Control

In a recent change, DoE SBIR program managers have been given more direct control over SBIR awards. With this control, the program managers have taken more ownership and appear to be more engaged with the companies, with positive results for everyone involved. Two specific areas of this change are the program managers' decisions on technical assistance and their sign-off on the second year of Phase II awards.

Technical assistance. The program managers now choose which of their award companies will be offered technical assistance support. Unfortunately, there were, and still remain, insufficient resources to fund free technical assistance for every award company. Until recently, support was awarded on a first-come, first-served basis. With the recent change, all companies now have the opportunity to be selected by the program manager, whose judgment can be based on criteria such as the company's timeline and needs.

Year two sign-off. DoE SBIR program managers now sign off on the second year of Phase II awards. In the past, program managers had no control over funding once the award was granted; it was hard to terminate an award even if no progress had been made. DoE expects that nearly all projects will receive the year two sign-off, but this additional capability gives the program manager more authority and more accountability.
8.10.3 Phase II Supplemental Awards

In 2005, DoE began to offer firms completing Phase II awards the option of applying for a $250,000 supplemental award, subject to review. The intention was to provide the highest-performing firms with additional resources, as needed, to advance development of the technology in question. Complications in the administration of this supplemental program put it on hold for some time. DoE SBIR staff reintroduced the option for funded firms in 2007, however, and it is now standard practice.
8.11 ACTIONS TAKEN BY DOE SBIR PROGRAM TO ENCOURAGE COMMERCIALIZATION

DoE has taken four distinct actions to encourage, promote, and track the commercialization of SBIR technology:
• Mandating evidence of commercialization within the Phase II evaluation criteria;
• Providing commercialization assistance services to SBIR awardees;
• Collecting Phase III data from Phase II awardees; and
• Publicly recognizing success.

8.11.1 Evidence of Commercialization Included in Phase II Criteria

Public Law 102-564, which in 1992 both reauthorized the SBIR program and created the STTR program, also modified the definition of Phase II.10 Essentially, this provision requires agencies to consider commercial potential, according to four indicators, in the evaluation of Phase II proposals, but it does not say how this should be done. At DoE, this requirement was implemented formally in the scoring of SBIR proposals. Three indicators were added to the criterion on Impact. These three indicators count as one-half of the Impact criterion, or one-sixth of a proposal's total score (the Impact criterion itself accounting for one-third of the total). They are evaluated by the SBIR office, based on information that the small businesses are encouraged to include in their Phase II proposals. DoE determined that the fourth indicator of commercial potential was already being addressed in the Impact criterion and was already being covered by reviewers in their comments on this criterion. Therefore, the fourth indicator was considered to be included within the other half of the score for the Impact criterion. Altogether, then, the four indicators count for somewhat more than one-sixth of the total score for a Phase II proposal.

8.11.2 Commercialization Assistance Services for SBIR Awardees

To aid Phase II awardees that seek to speed the commercialization of their SBIR technology, DoE has sponsored a Commercialization Assistance Program (CAP).
The CAP provides, on a voluntary basis, individual assistance in developing business plans and in preparing presentations to potential investment sponsors. The CAP is operated by a contractor, Dawnbreaker, Inc., a private firm based in Rochester, NY, which has repeatedly won competitions to provide this service. The CAP is an intensive experience for the small businesses that choose to participate in it. It begins with a kick-off meeting at DoE, where potential small business participants are introduced to the program. At that time, they also meet one-on-one with Dawnbreaker staff to discuss their goals and technologies. Over the next several months, the companies, coached by Dawnbreaker staff, perform market research and prepare an iterative series of business plans. Because of the heavy workload (and for other reasons, e.g., illness), approximately 50 percent of the small businesses drop out during this period; however, this attrition is built into the design of the program. The CAP culminates in a Commercialization Opportunity Forum, in which the companies that survive the business planning process present their business opportunities to a group of potential partners or investors, typically representatives of large corporations or venture capital firms. Before the forum takes place, the companies are coached and rehearsed in the art of making effective presentations. To make the forum a more attractive event for the potential partners and investors (i.e., to provide more business opportunities), the DoE SBIR program often partners with other agencies or other DoE offices. These partners make their own contractual arrangements with Dawnbreaker for the training of their own small business participants. For the FY2006 forum, DoE SBIR agreed to partner with the National Science Foundation's SBIR program and DoE's Office of Industrial Technologies.

10 Public Law 102-564 states that "a second phase, to further develop proposals which meet particular program needs, in which awards shall be made based on the scientific and technical merit and feasibility of the proposals, as evidenced by the first phase, considering, among other things, the proposal's commercial potential, as evidenced by—(i) the small business concern's record of successfully commercializing SBIR or other research; (ii) the existence of second phase funding commitments from private sector or non-SBIR funding sources; (iii) the existence of third phase, follow-on commitments for the subject of the research; and (iv) the presence of other indicators of the commercial potential of the idea…."
These partnerships will add another 20 and 10 SBIR Phase II projects, respectively, to the 30 projects coming from the DoE SBIR program, for a total of 60 business opportunities to be presented at the forum.11 Dawnbreaker, the DoE CAP vendor since 1989, tracks commercialization by polling each CAP participant at 6-, 12-, and 18-month intervals following the Commercialization Opportunity Forum, with emphasis on those companies that made a presentation at the forum. Dawnbreaker reported that half of the companies that completed the CAP program have already received in excess of $400 million for commercialization of their SBIR research. Dawnbreaker also solicits detailed feedback on CAP participation from clients through an evaluation template. The CAP was formally launched in 1989 by Program Manager Samuel Barish with a $50,000 contribution spread over 11 DoE technical programs.12 At the present time, the CAP, along with the other commercialization assistance services described in this report, is funded from the 2.5 percent SBIR set-aside,

11 Interview with Larry James, May 5, 2005.

12 Interview with Sam Barish, DoE Director of Technology Research Division, May 6, 2005.
in accordance with the SBA's 2002 Final Policy Directive to finance commercialization assistance, and supported by a 1997 finding by DoE legal counsel.13 P.L. 102-564 permits the use of $4,000 per Phase I project for commercialization assistance. In addition to the CAP, DoE now offers a number of other commercialization assistance services, which are summarized, along with the CAP, in Table 8-1.

TABLE 8-1 2005 DoE SBIR Initiatives in Support of Commercialization

                   CAP                Trailblazer          Technology Niche     Virtual Deal
                                                           Assessment (TNA)     Simulator™ (VDS)
Contractor         Dawnbreaker, Inc.  Foresight Science    Foresight Science    Foresight Science
                                      and Technology, Inc. and Technology, Inc. and Technology
Start Date         January 2005       January 2005         January 2005         January 2005
Completion Date    December 2007      December 2007        December 2007        December 2007
Eligibility        SBIR II only       SBIR I only          SBIR I or II         SBIR I or II

SOURCE: Department of Energy SBIR program publications and Web site.

DoE's 2005-2007 SBIR commercialization assistance "menu," which offers services to companies in SBIR Phases I and II, is funded at $2 million over three years. With a FY2005 SBIR budget of $102 million, DoE expects that approximately 180 SBIR winners from the 2004 pool of 425 Phase I and Phase II recipients will participate in one or more of these components during 2005. (DoE expects some overlap, since many SBIR firms have multiple awards.) The agency's goal is successful commercialization of SBIR technology through licensing, subsidiary spin-off, or "otherwise providing a path for the innovation to have the program manager's intended impact."14 One of these services, Technology Niche Assessment (TNA), has been available to DoE SBIR awardees for the past 6 years. In this service, which requires much less time commitment on the part of the awardees, the companies initially describe their technology to a second DoE contractor, Foresight Science and Technology.
Foresight then performs a search and analysis to identify private sector contacts that may have an interest as potential investors. Although the small businesses that have received this service have expressed general satisfaction, the SBIR office has not received any evidence that any small business received subsequent funding as a result of this service. However, TNAs performed during Phase I projects do allow companies to present more realistic commercialization plans in subsequent Phase II proposals. 13 Interview with former DoE SBIR Program Manager Bob Berger, March 4, 2005. 14 Interview with Larry James, December 29, 2004.
Over the history of the DoE SBIR program, participation in commercialization assistance services was open to all SBIR Phase II winners on a self-selected basis, and attrition (i.e., dropping out of the Dawnbreaker CAP) was the decision of the SBIR winner. With the addition of two new services on a pilot basis in 2005 (Trailblazer and Virtual Deal Simulator), DoE SBIR introduced a new participation model, which includes Phase I participants as well. The agency cites the importance of acknowledging commercialization tasks during Phase I. Although all Phase I and Phase II winners are encouraged to take advantage of one or more of the four services described, entry is determined primarily by personnel in the DoE technical programs that support the SBIR program. The principal selection criterion is "technological innovation that has a high probability of success, and would have high program impact if successful." SBIR winners must be nominated by their project monitor and must have won at least one DoE SBIR Phase I grant. The agency anticipates 160 SBIR participants across the four commercialization assistance services during 2005. Most recently, the DoE SBIR program has sought the involvement of TTMs and TPMs in nominating for commercialization assistance those firms whose technologies appear to have particularly strong market potential. The management of each commercialization assistance service is performed by its contractor, with regular reports on SBIR client participation and finances made to the agency SBIR Program Manager. Included among the management responsibilities is tracking the commercialization progress of the SBIR participants. Each vendor reports on its SBIR clients' commercialization progress for a period of 2 years after completion of the Phase II grant.
DoE SBIR uses both contractors' tracking results in biannual evaluations, but the data are not made publicly available; both vendors cite client confidentiality as the reason for not releasing commercialization figures for individual SBIR clients.

8.11.3 Collecting Phase III Data

From the outset, the DoE SBIR office made a concerted effort to assess SBIR commercialization outcomes.15 All DoE SBIR Phase II awardees are required to report on their Phase III activity as a condition of their grant. Consequently, DoE has, on a nearly annual basis, collected Phase III data from its SBIR awardees.

15 Outcomes are the link between broadly defined goals and the details of program management. Outcomes can be framed along three dimensions:
• Agency perspective: Did the program satisfy the objectives of the technical program managers? At the limit, were technical program managers ever willing to put "their own" money into the program?
• For the firms: Did the SBIR program help funded firms achieve goals including sustained employment for technical personnel, growth, and successful commercialization of new technologies?
• For society: Did SBIR funds create technologies that would not have been created otherwise? Did those technologies in turn motivate economic activity and improve human welfare?
The department learned, through trial and error, what it believes is the most effective way to ask awardees about Phase III. The key, it found, was to avoid asking about individual projects. A project-by-project approach was tempting because: (1) the agencies award and track individual projects, and (2) the GAO, in its earlier studies of SBIR, also used this approach. However, the project-by-project approach yielded inaccurate results, because small businesses do not track their success in this way. Small companies tend to track their success based on the products or services derived from SBIR projects. Therefore, DoE learned to ask companies to: (1) first list all products and services that were derived from their DoE SBIR projects; (2) report on sales and/or Phase III investment (including post-SBIR funding for further development) related to those products and services; and (3) then identify which Phase II projects contributed to the development of the products and services. The DoE SBIR office asks its Phase II awardees to report on two types of revenue: (1) sales and (2) follow-on investment. Both categories are further broken down into federal and nonfederal sources. The second category, follow-on investment, can include funding for activities that are directly related to commercialization (marketing, setting up a production facility, etc.) or funding for further development of the technology. DoE considers all of these categories to be examples of "Phase III" funding. Of the 787 companies that received 1,731 Phase II awards through 2002, 609 responded to the DoE Phase II survey. Their responses are detailed in Table 8-2.
TABLE 8-2 DoE Sales and Development Outcomes

Sales              Federal          Nonfederal       Total
Companies          136              246              269
  > $850,000       42               90               109
Amounts ($)        232,996,373      1,384,990,095    1,617,986,468
  > $850,000       215,563,248      1,352,142,325

Development        Internal         Nonfederal       Fed Non-SBIR     Total
Companies          346              246              160              419
  > $850,000       53               67               68               146
Amounts ($)        246,885,194      631,295,607      455,757,462      1,333,938,263
  > $850,000       187,257,499      588,425,378      427,494,830

SOURCE: Department of Energy.
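The dollar figures in Table 8-2 can be checked for internal consistency, and the share of companies above the $850,000 sales threshold recomputed from the table; a minimal sketch (the variable names are ours; every number is taken from the table and the surrounding text):

```python
# Consistency check of Table 8-2. All figures come from the table;
# the variable names are illustrative, not DoE's.

respondents = 609  # companies responding to the DoE Phase II survey

# Sales amounts, in dollars
sales_fed = 232_996_373
sales_nonfed = 1_384_990_095
sales_total = 1_617_986_468

# Federal and nonfederal sales amounts sum exactly to the reported total
assert sales_fed + sales_nonfed == sales_total

# Development (follow-on investment) amounts, in dollars
dev_internal = 246_885_194
dev_nonfed = 631_295_607
dev_fed_non_sbir = 455_757_462
dev_total = 1_333_938_263
assert dev_internal + dev_nonfed + dev_fed_non_sbir == dev_total

# 109 companies reported sales above the $850,000 threshold (roughly the
# combined public funding of one Phase I plus one Phase II award)
print(f"{109 / respondents:.0%}")  # about 18 percent, as stated in the text
```

Note that the company counts, unlike the dollar amounts, need not sum across columns, presumably because a single company can report revenue from more than one source.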
Forty-eight percent of the respondents reported some sales, with 18 percent reporting sales of over $850,000 (a threshold value chosen as the total public funds awarded in a Phase I and a Phase II, combined).16 Although many companies received Phase III funding, little or none of this funding came from DoE itself. A longtime DoE SBIR program manager could not recall a single instance in which DoE has funded Phase III with non-SBIR funds as a follow-on to the research performed in Phases I and II.17 DoE SBIR staff report that, during the interval covered by the study, many DoE Technical Project Monitors (TPMs) were unaware that SBIR technologies can be procured through a noncompetitive process in Phase III. One reason for this information gap was that no systematic process existed for continually disseminating this information to the DoE technical programs. On about 10-15 occasions, the aforementioned program manager provided DoE technical program managers with documentation showing that the law and the SBA Policy Directive allowed, and even encouraged, agencies to make Phase III awards to the SBIR company on a sole-source basis. However, he was not aware that such follow-on funding ever resulted from these efforts. (Phase III funding from DoE can occur without the knowledge of the SBIR office; for example, Phase III funding from national laboratories or DoE offices can be administered directly by the technical programs, without informing the SBIR office.)

8.11.4 Recognizing Success

DoE recognizes the success of SBIR firms in a number of ways. The agency maintains and updates a broad selection of commercially successful SBIR-funded technologies on its SBIR Web site (<http://www.science.DoE.gov/sbir>). The Web site also identifies DoE SBIR awardees who have received the prestigious R&D 100 Award.

16 No data exist (self-reported or otherwise) that would indicate the extent to which SBIR funds were critical in the development of the products in question. Furthermore, it is possible that firms have an incentive to inflate these figures, and no obvious incentive to underreport "success" in commercialization in a survey of this variety. It is also possible, however, that the lack of perfect institutional memory in small, dynamic firms could lead to systematic underreporting.

17 The norm in the fossil energy and environmental offices is to work with very large companies, mostly on a cost-sharing basis. SBIR projects involving the development of instrumentation may be an exception. One company reported a number of instances in which the DoE Office of Health and Environmental Research (OHER, now the Office of Biological and Environmental Research) provided Phase III funding for the use of instruments developed with SBIR funds in environmental fields.
16 No data exist (self-reported or other) that would inform us regarding the extent to which SBIR funds were critical in the development of the products in question. Furthermore, it is possible that firms have an incentive to inflate these figures, and no obvious incentive to underreport “success” in commercialization in a survey of this variety. Although It is also possible that the lack of perfect institutional memory in small, dynamic firms could lead to systematic underreporting. 17 The norm in the fossil energy and environmental offices is to work with very large companies, mostly on a cost-sharing basis. SBIR projects involving the development of instrumentation may an exception. One company reported a number of instances in which DoE Office of Health and Environmental Research (OHER, now the Office of Biological and Environment Research) provided Phase III funding for the use of instruments developed with SBIR funds in environmental fields.