
Appendix A

Overview of Methodological Approaches,
Data Sources, and Survey Tools

This report on the Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs at the Department of Energy (DoE) is part of a series of reports on SBIR and STTR at the National Institutes of Health (NIH), the Department of Defense (DoD), NASA, and the National Science Foundation (NSF). Collectively, they complement an earlier assessment of the SBIR program by the National Academies of Sciences, Engineering, and Medicine, completed in 2009.1

The first-round assessment of SBIR, conducted under a separate ad hoc committee, resulted in a series of reports released from 2004 to 2009, including a framework methodology on which the current methodology builds.2 As in the first-round study, the objective of this second-round study is “not to consider if SBIR should exist or not”—Congress has already decided affirmatively on this question, most recently in the 2011 reauthorization of the program.3 Rather, we are charged with “providing assessment-based findings of the benefits and costs of SBIR . . . to improve public understanding of the program, as well as recommendations to improve the program’s effectiveness.” As with the first-round study, this study “will not seek to compare the value of one area with other areas; this task is the prerogative of the Congress and the Administration acting through the agencies. Instead, the study is concerned with the effective review of each area.”

___________________

1 Effective July 1, 2015, the institution is called the National Academies of Sciences, Engineering, and Medicine. References in this report to the National Research Council or NRC are used in an historic context identifying programs prior to July 1, 2015.

2 National Research Council, An Assessment of the Small Business Innovation Research Program: Project Methodology, Washington, DC: The National Academies Press, 2004.

3 National Defense Authorization Act for Fiscal Year 2012 (NDAA), H.R. 1540, Title LI.


These areas refer to the four legislative objectives of the SBIR program:4

  • Expand the U.S. technical knowledge base
  • Support agency missions
  • Improve the participation of women and minorities
  • Commercialize government-funded research

The parallel language for STTR from the SBA’s STTR Policy Directive is as follows:

(c) The statutory purpose of the STTR Program is to stimulate a partnership of ideas and technologies between innovative small business concerns (SBCs) and Research Institutions through Federally-funded research or research and development (R/R&D). By providing awards to SBCs for cooperative R/R&D efforts with Research Institutions, the STTR Program assists the small business and research communities by commercializing innovative technologies.5

On the basis of highly competitive solicitations, the SBIR/STTR programs provide modest initial funding for selected Phase I projects (up to $150,000) for feasibility testing, and further Phase II funding (up to $1 million) for about one-half of Phase I projects.
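As a rough illustration of this funding structure, the sketch below combines the award ceilings and the roughly one-half advance rate just cited. The figures are program ceilings, not typical award sizes, and the calculation is ours, not the report’s.

```python
# Illustrative arithmetic only, using the ceilings cited above.
PHASE_I_CEILING = 150_000      # up to $150,000 per Phase I award
PHASE_II_CEILING = 1_000_000   # up to $1 million per Phase II award
ADVANCE_RATE = 0.5             # "about one-half" of Phase I projects advance

# Maximum combined funding for a project that completes both phases.
max_per_project = PHASE_I_CEILING + PHASE_II_CEILING
print(f"${max_per_project:,}")        # $1,150,000

# Ceiling-level expected outlay per Phase I start, if half advance.
expected_per_start = PHASE_I_CEILING + ADVANCE_RATE * PHASE_II_CEILING
print(f"${expected_per_start:,.0f}")  # $650,000
```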

From a methodology perspective, assessing these programs presents formidable challenges. Among the more difficult are the following:

  • Lack of data. Outcome tracking varies widely across agencies, and no agency has implemented a fully effective tracking system. Nor have agencies made successful systematic efforts to collect feedback from awardees.
  • Intervening variables. Analysis of small businesses suggests that they are often very path dependent and, hence, can be deflected from a given development path by a wide range of positive and negative variables. A single breakthrough contract—or technical delay—can make or break a company.
  • Lags. Not only do outcomes lag awards by a number of years, but the lag itself is highly variable. Some companies have sales within 6 months of award conclusion; others take decades. In addition, the biggest impacts often take many years to peak even after products have reached markets.

___________________

4 The most current description of these legislative objectives is in the Policy Guidance provided by the Small Business Administration (SBA) to the agencies. SBA Section 1.(c) SBIR Policy Directive, October 18, 2012, p. 3.

5 Small Business Administration, Office of Investment and Innovation, “Small Business Technology Transfer (STTR) Program—Policy Guidance,” updated February 24, 2014.

ESTABLISHING A METHODOLOGY

The methodology utilized in this study of the SBIR/STTR programs builds on the methodology established by the committee that completed the first-round study of the SBIR program.

Publication of the 2004 Methodology

The committee that undertook the first-round study and the agencies under study formally acknowledged the difficulties involved in assessing SBIR programs. Accordingly, that study began with the development of a formal methodology volume, which was published in 2004 after completing the standard National Academies peer-review process.6

The established methodology stressed the importance of adopting a broad range of tools, building on prior work in this area. The committee concluded that appropriate methodological approaches

build from the precedents established in several key studies already undertaken to evaluate various aspects of the SBIR. These studies have been successful because they identified the need for utilizing not just a single methodological approach, but rather a broad spectrum of approaches, in order to evaluate the SBIR from a number of different perspectives and criteria.

This diversity and flexibility in methodological approach are particularly appropriate given the heterogeneity of goals and procedures across the five agencies involved in the evaluation. Consequently, this document suggests a broad framework for methodological approaches that can serve to guide the research team when evaluating each particular agency in terms of the four criteria stated above. [Table APP A-1] illustrates some key assessment parameters and related measures to be considered in this study.7

___________________

6 National Research Council, An Assessment of the Small Business Innovation Research Program: Project Methodology, Washington, DC: The National Academies Press, 2004, p. 2.

7 Ibid.

TABLE APP A-1 Overview of Approach to SBIR Program Assessment

Quality of Research
  Questions: How does the quality of SBIR-funded research compare with that of other government-funded R&D?
  Measures: Peer-review scores; publication counts; citation analysis
  Tools: Case studies; agency program studies; study of repeat winners; bibliometric analysis
  Key research challenges: Difficulty of measuring quality and of identifying a proper reference group

Commercialization of SBIR-Funded Research/Economic and Non-Economic Benefits
  Questions: What is the overall economic impact of SBIR-funded research? What fraction of that impact is attributable to SBIR funding?
  Measures: Sales; follow-up funding; progress; initial public offering
  Tools: Phase II surveys; program manager surveys; case studies; study of repeat winners
  Key research challenges: Skew of returns; significant interagency and inter-industry differences

Small Business Innovation/Growth
  Questions: How to broaden participation and replenish contractors? What is the link between SBIR and state/regional programs?
  Measures: Patent counts and other intellectual property; employment growth; number of new technology firms
  Tools: Phase I and Phase II surveys; case studies; study of repeat winners; bibliometric analysis
  Key research challenges: Measures of actual success and failure at the project and firm levels; relationship of federal and state programs in this context

Use of Small Businesses to Advance Agency Missions
  Questions: How to increase agency uptake while continuing to support high-risk research?
  Measures: Agency procurement of products resulting from SBIR work
  Tools: Program manager surveys; case studies; agency program studies; study of repeat winners
  Key research challenges: Major interagency differences in use of SBIR to meet agency missions

NOTE: Supplementary tools may be developed and used as needed. The committee notes that while sales are a legitimate indicator of progress toward commercialization, they are not by themselves reliable evidence that commercial success has occurred.

SOURCE: National Research Council, An Assessment of the Small Business Innovation Research Program: Project Methodology, Washington, DC: The National Academies Press, 2004, Table 1, p. 3.


TOOLS UTILIZED IN THE CURRENT STUDY

Quantitative and qualitative tools utilized in the current study of the SBIR/STTR programs include the following:

  • Case studies. The committee commissioned in-depth case studies of 12 SBIR and STTR recipients at DoE. These companies are geographically and demographically diverse, are funded by several different DoE programs, and are at different stages of the company life cycle.
  • Workshops. The committee convened a number of workshops to allow stakeholders, agency staff, and academic experts to provide unique insights into the program’s operations, as well as to identify questions that need to be addressed.
  • Analysis of agency data. A range of datasets covering various aspects of agency SBIR/STTR activities was sought from DoE and other sources. The committee analyzed the data received and included them as appropriate.
  • Survey of award recipients. All PIs who received a Phase II SBIR or STTR award from DoE between FY 2001 and FY 2010 were surveyed by a contractor for the National Academies. Details are discussed below.
  • Open-ended responses from SBIR/STTR recipients. For the first time, the committee solicited textual responses, drawing more than 200 observations from DoE SBIR/STTR respondents (respondents were asked to describe in their own words significant long-term impacts of the SBIR/STTR programs on their company).
  • Agency interviews. Agency staff were consulted on the operation of the SBIR/STTR programs, and most were helpful in providing information both about the programs and about the challenges they faced.
  • Literature review. Since the start of our research in this area, a number of papers have been published addressing various aspects of the SBIR program. In addition, other organizations, such as the Government Accountability Office (GAO), have reviewed particular parts of the SBIR program. These works are referenced in the course of this analysis.

Taken together with our committee deliberations and the expertise brought to bear by individual committee members, these tools provide the primary inputs into the analysis.

We would stress that, for the first-round study and for our current study, multiple research methodologies feed into every finding and recommendation. No findings or recommendations rest solely on data and analysis from the survey; conversely, data from the survey are used to support analysis throughout the report.


COMMERCIALIZATION METRICS AND DATA COLLECTION

Congressional discussions of the SBIR program in the context of the 2011 reauthorization reflected strong interest in the commercialization of technologies funded through SBIR. This enhanced focus is understandable: the investment made should be reflected in outcomes approved by Congress.

However, no simple definition of “commercialization” exists.8 Broadly speaking, in the context of the program it means funding for technology development beyond that provided under Phase II SBIR funding. At DoE, most commercialization occurs outside the agency, mostly in the private sector (as survey results indicate).

In the 2009 report on the DoE SBIR program,9 the committee charged with that assessment held that a binary metric of commercialization was insufficient. It noted that the scale of commercialization is also important and that there are other important milestones, both before and after the first dollar in sales, that should be included in an appropriate approach to measuring commercialization. The committee carrying out the current study further notes that while sales are a legitimate indicator of progress toward commercialization, they are not by themselves reliable evidence that commercial success has occurred.

Challenges in Tracking Commercialization

Despite substantial efforts at DoE, described below, significant challenges remain in tracking commercialization outcomes for the DoE SBIR/STTR programs. These include the following:

  • Data limitations.
  • Linear linkages. Tracking efforts usually seek to link a specific project to a specific outcome. Separating the contributions of one project is difficult for many companies, given that multiple projects typically contribute to both anticipated and unanticipated outcomes.
  • Lags in commercialization. Data from the extensive DoD commercialization database suggest that most projects take at least 2 years to reach the market after the end of the Phase II award and do not generate peak revenue until several years after that. Efforts to measure program productivity must therefore account for these significant lags (see the sketch below this list).
  • Attribution problems. Commercialization is often the result of several awards, not just one, as well as other factors, so attributing company-level success to specific awards is challenging at best.
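To make the lag point concrete, here is a minimal, hypothetical sketch of lag-aware measurement: it keeps only awards old enough for commercialization to be observable, assuming the roughly 2-year market lag cited above. The threshold and field names are our assumptions, not part of the report’s method.

```python
# Hypothetical illustration of lag-aware outcome measurement.
# Assumption: a project needs ~2 years after Phase II ends to reach
# market (per the DoD data cited above); field names are invented.
MARKET_LAG_YEARS = 2

def mature_awards(awards, survey_year):
    """Keep only awards old enough for commercialization to be observable."""
    return [a for a in awards
            if survey_year - a["phase2_end_year"] >= MARKET_LAG_YEARS]

awards = [{"id": 1, "phase2_end_year": 2010},
          {"id": 2, "phase2_end_year": 2013}]
print(mature_awards(awards, survey_year=2014))  # only award 1 qualifies
```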

___________________

8 See Chapter 5 (Quantitative Outcomes) for related analysis of commercialization in the SBIR program.

9 National Research Council, An Assessment of the SBIR Program at the Department of Energy, Washington, DC: The National Academies Press, 2009.


Why New Data Sources Are Needed

Congress often seeks evidence about the effectiveness of programs or indeed about whether they work at all. This interest has in the past helped to drive the development of tools such as the Company Commercialization Record database at DoD. However, in the long term the importance of tracking lies in its use to support program management. By carefully analyzing outcomes and associated program variables, program managers should be able to manage more successfully.

We have seen significant limitations in all of the available data sources. Unfortunately, DoE declined to share its tracking data on privacy grounds, so we are unable to draw conclusions about the quality or extent of DoE data collection.

BEYOND COMMERCIALIZATION METRICS

Although Congressional interest has focused primarily on commercialization in recent years, there are four congressionally mandated objectives for the SBIR program, and commercialization is only one of them. STTR adds further objectives beyond commercialization. DoE’s data collection tools focus almost exclusively on commercialization; they appear to have limited capability for collecting data about the other three SBIR program objectives described in the introduction to this appendix.

OVERVIEW OF THE SURVEY

Our analysis of the SBIR and STTR programs at DoE makes extensive use of case studies, interviews, and other qualitative methods of assessment. These sources remain important components of our overall methodology, and Chapter 7 (Insights) is devoted to lessons drawn from case studies and other qualitative sources. But qualitative assessment alone is insufficient.

The Role of the Survey

The survey offers several significant advantages over other data sources, as it

  • covers all kinds of commercialization inside and outside of DoE;
  • provides a rich source of textual information in response to open-ended questions;
  • probes more deeply into company demographics and agency processes;
  • addresses principal investigators (PIs), not just company business officials;
  • allows comparisons with previous data-collection exercises; and
  • addresses other Congressional objectives for the program beyond commercialization.

At the same time, however, we are fully cognizant of the limitations of this type of observational survey research. To address these issues while retaining the utility and explanatory power of survey-based methodology, this report contextualizes the data by comparing results to those from the survey conducted as part of the first-round assessment of the SBIR program (referred to below as the “2005 Survey”10). This report also adds transparency by publishing the number of responses for each question and each subgroup, thus allowing readers to draw their own conclusions about the utility of the data.

We contracted with Grunwald Associates LLC to administer a survey to DoE award recipients. This 2014 Survey is based closely on the 2005 Survey but is adapted to lessons learned and includes some important changes discussed in detail below. A methodology subgroup of the committee was charged with reviewing the survey and the reported results for best practice and accuracy. The 2014 Survey was carried out simultaneously with a survey focused on the SBIR program at NIH and followed a 2011 survey of awardees at NASA, NSF, and DoD.11

The primary objectives of the 2011 and 2014 surveys were as follows:

  • Provide an update of data collected in the National Academies survey completed in 2005, maximizing the opportunity to identify trends within the program;
  • Probe more deeply into program processes, with the help of expanded feedback from participants and better understanding of program demographics;
  • Improve the utility of the survey by including a comparison group;
  • Survey STTR awardees for the first time; and
  • Reduce costs and shrink the time required by combining three 2005 survey questionnaires—for the company, Phase I, and Phase II awards—into a single questionnaire.

Box A-1 identifies multiple sources of bias in survey response.

___________________

10 The survey conducted as part of the current, second-round assessment of the SBIR program is referred to below as the “2014 Survey” or simply the “survey.” In general, throughout the report, any survey references are understood to be to the 2014 Survey unless specifically noted otherwise.

11 Delays at NIH and DoE in contracting with the National Academies, combined with the need to complete work contracted with DoD, NSF, and NASA, led the committee to proceed with the survey at three agencies only.


Survey Characteristics

In order to ensure maximum comparability for a time series analysis, the survey for the current assessment was based as closely as possible on previous surveys, including the 2005 Survey and the 1992 GAO survey.

Given the limited population of Phase II awards, the starting point for consideration was to deploy one questionnaire per Phase II award. However, we were also aware that the survey imposes burdens on respondents; because it is detailed and hence time-consuming, it would not be appropriate to overburden potential recipients, some of whom were responsible for many awards over the years.

An additional consideration was that this survey was intended to add detail on program operations, beyond the original primary focus on program outcomes. Agency clients were especially interested in probing operations more deeply. We decided that it would be more useful and effective to administer the survey to PIs—the lead researcher on each project—rather than to the registered company point of contact (POC), who in many cases would be an administrator rather than a researcher.

The survey was therefore designed to collect the maximum amount of relevant data, consistent with our commitment to minimize the burden on individual respondents and to maintain maximum continuity between surveys. Survey questionnaires were to be sent to the PIs of all projects that met the selection characteristics, with a maximum of two questionnaires per PI. (The selection procedure is described in the section “Initial Filters for Potential Recipients.”)

Based on reviewer feedback about the previous round of assessments, we also attempted to develop comparison groups that would provide the basis for further statistical analysis. This effort was eventually abandoned (see the section “Effort at Comparison Group Analysis”).

Key similarities and differences between the 2005 and 2014 Surveys are captured in Table A-2.

The 2014 Survey included awards made from FY 2001 to FY 2010 inclusive. This end date allowed for completion of Phase II awards (which nominally fund 2 years of research) and provided a further 2 years for commercialization. This time frame was consistent with the 2005 Survey, which covered awards from FY 1992 to FY 2001 inclusive, and with a previous GAO study, published in 1992, which surveyed awards made through 1987.

The 10-year time frame was intended to reduce the difficulty of generating information about older awards: some companies and PIs may no longer be in place, and memories fade over time. Reaching back to awards made before FY 2001 would therefore have generated few additional responses.


TABLE A-2 Similarities and Differences: 2005 and 2014 Surveys

Item                                                2005 Survey   2014 Survey
Respondent selection
  Focus on Phase II winners                         Yes           Yes
  Inclusion of Phase I winners                      Yes           Yes
  All qualifying awards                             No            Yes
  Respondent = Principal Investigator (PI)          No            Yes
  Respondent = Point of Contact (POC)               Yes           No
  Maximum number of questionnaires                  <20           2
Distribution
  Mail                                              Yes           No
  Email                                             Yes           Yes
  Telephone follow-up                               Yes           Yes
Questionnaire
  Company demographics                              Identical     Identical
  Commercialization outcomes                        Identical     Identical
  IP outcomes                                       Identical     Identical
  Women and minority participation                  Yes           Yes
  Additional detail on minorities                   No            Yes
  Additional detail on PIs                          No            Yes
  New section on agency staff                       No            Yes
  New section on company recommendations for SBIR   No            Yes
  New section capturing open-ended responses        No            Yes

Determining the Survey Population

Following the precedent set by both the original GAO study and the first-round study of the SBIR program, we differentiated between the total population of awards, the preliminary survey target population of awards, and the effective population of awards for this study.

Two survey response rates were calculated. The first uses the effective survey population of awards as the denominator, and the second uses the preliminary population of awards as the denominator.

From Total Population of Awards to Effective Population

Upon acquisition of data for the 2014 Survey from the sponsoring agencies (NIH and DoE), covering record-level lists of awards and recipients, initial and secondary filters were applied to reach the preliminary survey population and, ultimately, the effective survey population. These steps are described below.

Initial Filters for Potential Recipients: Identifying the Preliminary Survey Population

From this initial list, determining the preliminary survey population required the following steps:

  • elimination of records that did not fit the protocol agreed upon by the committee—namely, a maximum of two questionnaires per PI (in cases where PIs received more than two awards, the awards were selected by agency [DoE and then NIH in that order], then by year [oldest], and finally by random number); and
  • elimination of records for which there were significant missing data—in particular, where emails and/or contact telephone numbers were absent.

This process of excluding awards, either because they did not fit the protocol agreed upon by the committee or because the agencies did not provide sufficient or current contact information, reduced the total award list provided by DoE from an initial 1,325 awards to a preliminary survey population of 1,077 Phase II SBIR and STTR awards.
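As a minimal sketch of how the two-questionnaires-per-PI rule described above might be implemented (the field names, data layout, and use of Python are our illustrative assumptions; the report does not publish its selection code):

```python
import random

# Hypothetical sketch of the selection rule described above: at most two
# awards per PI, ordered by agency (DoE before NIH), then oldest year,
# then a random tiebreaker. Field names are invented for illustration.
AGENCY_ORDER = {"DoE": 0, "NIH": 1}

def select_awards(awards_by_pi, seed=0):
    rng = random.Random(seed)
    selected = []
    for pi, awards in awards_by_pi.items():
        ranked = sorted(awards, key=lambda a: (AGENCY_ORDER[a["agency"]],
                                               a["year"],
                                               rng.random()))
        selected.extend(ranked[:2])  # keep at most two awards per PI
    return selected

awards_by_pi = {"pi_1": [{"agency": "NIH", "year": 2004},
                         {"agency": "DoE", "year": 2008},
                         {"agency": "DoE", "year": 2003}]}
print(select_awards(awards_by_pi))  # keeps the two DoE awards (2003, 2008)
```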

Secondary Filters to Identify Recipients with Active Contact Information: Identifying the Effective Population

This preliminary population still included many awards for which the PI contact information appeared complete but for which the PIs were no longer associated with the contact information provided and hence were effectively unreachable. This is not surprising, given the considerable turnover both in the existence of small businesses and in their personnel, and given that the survey reached back 13 years to awards made in FY 2001. PIs may have left the company, the company may have ceased to exist or been acquired, or telephone and email contacts may have changed. Consequently, two further filters were used to identify the effective survey population.

  1. PI contacts were eliminated—and hence the awards assigned to those PI contacts were eliminated—for which the email address bounced twice. Because the survey was delivered via email, the absence of a working email address disqualified the recipient PI and associated awards. This eliminated approximately 30 percent of the preliminary population (320 awards).
  2. Efforts were made to determine whether non-bouncing emails were in fact still operative. Email addresses that did not officially “bounce” (i.e., return to sender) may nonetheless be inactive: some email systems are configured to delete unrecognized email without sending a reply, and in other cases email addresses are inactive but not deleted. A non-bouncing email address therefore did not equal a contactable PI. To identify non-contactable PIs, we undertook an extensive telephone survey, calling every PI with an award in the preliminary survey population at DoE who did not respond to the first round of questionnaire deployment. On the basis of responses to the telephone survey, we ascertained that a further 263 PIs could not be contacted even though their email addresses did not formally bounce.

Based on these criteria, there was little variation between agencies or between programs in the quality of the contact lists provided.12

Following the application of these secondary filters, the effective population of DoE Phase II awardees was 494.
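The arithmetic of these two filters, using the figures reported above, is a simple check (ours, not code from the study):

```python
# Check of the filtering arithmetic reported above.
preliminary = 1_077   # preliminary survey population of awards
bounced     = 320     # awards dropped after two email bounces
no_contact  = 263     # awards dropped after the telephone follow-up

effective = preliminary - bounced - no_contact
print(effective)  # 494, the effective population of DoE Phase II awards
```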

Deployment

The survey opened on December 3, 2014, and was deployed by email, with voice follow-up support. Up to four emails were sent to the PIs for the effective population of awards (emails were discontinued once responses were received or it was determined that the PI was non-contactable). In addition, two voice mails were delivered to non-responding PIs of awards in the effective population, between the second and third and between the third and fourth rounds of email. In total, up to six efforts were made to reach each PI who was sent an award questionnaire.

After members of the data subgroup of the committee determined that additional efforts to acquire new responses were not likely to be cost effective, the survey was closed on April 7, 2015. The survey was therefore open for a total of 18 weeks.
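The deployment protocol just described amounts to at most six contact attempts per PI. The sketch below is a hedged reconstruction of the interleaving from the prose alone; the exact timing between rounds is not specified in the report.

```python
# Up to six contact attempts per PI, per the deployment protocol above:
# four email rounds, with voice mails between rounds 2-3 and 3-4.
attempts = ["email 1", "email 2", "voice mail 1",
            "email 3", "voice mail 2", "email 4"]

# Attempts stop early on response or on a determination that the PI
# is non-contactable.
for i, attempt in enumerate(attempts, start=1):
    print(f"attempt {i}: {attempt}")
```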

Response Rates

Standard procedures were followed to conduct the survey. These data-collection procedures were designed to increase response to the extent possible within the constraints of a voluntary survey and the survey budget. The population surveyed is a difficult one to contact and obtain responses from, as evidence from the literature shows.13 Under these circumstances, the inability to contact and obtain responses always raises questions about potential bias of the estimates, questions that cannot be resolved without substantial extra efforts requiring resources beyond those available. (See Box A-1 for a discussion of potential sources of bias.)

___________________

12 The share of preliminary contacts that turned out to be not contactable was higher for this survey than for the 2005 Survey. We believe this is primarily because the company points of contact (POCs) to whom the 2005 Survey was sent, often senior company executives, have less churn than do principal investigators (PIs).

The lack of detailed applications data from the agency, beyond the name and address of the company, makes it impossible to estimate the possible impact of nonresponse bias. We therefore have no evidence either that nonresponse bias exists or that it does not. For the areas where Academies surveys have overlapped with other data sources (notably DoD’s mandatory Company Commercialization Record database), results from the survey and from the DoD data are similar. Table A-3 shows the response rates at DoE, based on both the preliminary study population and the effective study population after all adjustments.

The 2014 Survey primarily reached companies that were still in business: overall, 97 percent of PIs responding for an award in the effective population indicated that their companies were still in business.14

Effort at Comparison Group Analysis

Several readers of the first-round reports on the SBIR program suggested inclusion of comparison groups in the analysis. There is no simple way to acquire a comparison group for Phase II SBIR or STTR awardees, especially at the agency level. These are technology-based companies at an early stage of development that have demonstrated the capacity to undertake challenging technical research and have provided evidence that they are potentially successful commercializers.

TABLE A-3 2014 Survey Response Rates at DoE

Overall Population of Awards (all awards made)                  1,325
Preliminary Population of Awards                                1,077
Awards for which the PIs Were Not Contactable
  No Email                                                        320
  No Phone Contact                                                263
Effective Population of Awards                                    494
Number of Awards for which Responses Were Received                269
Response Rate: Percentage of Effective Population of Awards      54.5
Response Rate: Percentage of Preliminary Population of Awards    25.0

SOURCE: 2014 Survey.
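The two response rates in Table A-3 follow directly from the counts above; a quick check (ours, not code from the study):

```python
# Check of the two response rates reported in Table A-3.
responses   = 269
effective   = 494
preliminary = 1_077

print(f"{100 * responses / effective:.1f}%")    # 54.5% of effective population
print(f"{100 * responses / preliminary:.1f}%")  # 25.0% of preliminary population
```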

___________________

13 Many surveys of entrepreneurial firms have low response rates. For example, Aldrich and Baker (1997) found that nearly one-third of surveys of entrepreneurial firms (whose results were reported in the academic literature) had response rates below 25 percent. See H. E. Aldrich and T. Baker, “Blinded by the Cites? Has There Been Progress in Entrepreneurship Research?” pp. 377-400 in D. L. Sexton and R. W. Smilor (eds.), Entrepreneurship 2000, Chicago: Upstart Publishing Company, 1997.

14 2014 Survey, Question 4A.

Suggested Citation:"Appendix A: Overview of Methodological Approaches, Data Sources, and Survey Tools." National Academies of Sciences, Engineering, and Medicine. 2016. SBIR/STTR at the Department of Energy. Washington, DC: The National Academies Press. doi: 10.17226/23406.
×

Given that the operations of the SBIR/STTR programs are defined in legislation and limited by the Policy Guidance provided by SBA, randomly assigned control groups were not a possible alternative.

As part of our 2011 Survey of DoD, NSF, and NASA SBIR and STTR award recipients, we made efforts to identify a pool of SBIR-like companies by contacting the most likely sources (Dun & Bradstreet and Hoovers), but these efforts were not successful because sufficiently detailed and structured information about companies was not available.

In response, we sought to develop a comparison group from among Phase I awardees that had not received a Phase II award from the three agencies surveyed in 2011 during the award period covered by that survey (FY 1999-2008). After considerable review, however, we concluded that the Phase I-only group was also not appropriate for use as a statistical comparison group.
