Appendix A

Overview of Methodological Approaches, Data Sources, and Survey Tools

This report on the Small Business Innovation Research (SBIR) program at the National Aeronautics and Space Administration (NASA) is a part of a series of reports on SBIR at the National Institutes of Health (NIH), Department of Defense (DoD), Department of Energy (DoE), and National Science Foundation (NSF). Collectively, they represent a second-round assessment of the program by the National Academies of Sciences, Engineering, and Medicine.1

The first-round assessment, conducted under a separate ad hoc committee, resulted in a series of reports released from 2004 to 2009, including a framework methodology for that study and on which the current methodology builds.2 Thus, as in the first-round study, the objective of this second-round study is “not to consider if SBIR should exist or not”—Congress has already decided affirmatively on this question, most recently in the 2011 reauthorization of the program.3 Rather, we are charged with “providing assessment‐based findings of the benefits and costs of SBIR . . . to improve public understanding of the program, as well as recommendations to improve the program’s effectiveness.” As with the first-round, this study “will not seek to compare the value of one area with other areas; this task is the prerogative of the Congress and the Administration acting through the agencies. Instead, the study is concerned with the effective review of each area.”

__________________

1 Effective July 1, 2015, the institution is called the National Academies of Sciences, Engineering, and Medicine. References in this report to the National Research Council or NRC are used in an historic context identifying programs prior to July 1.

2 National Research Council, An Assessment of the Small Business Innovation Research Program: Project Methodology, Washington, DC: The National Academies Press, 2004.

3 National Defense Authorization Act of 2012 (NDAA), H.R. 1540, Title LI.


These areas refer to the four legislative objectives of the SBIR program:4

  • Expand the U.S. technical knowledge base
  • Support agency missions
  • Improve the participation of women and minorities
  • Commercialize government-funded research

The SBIR program, on the basis of highly competitive solicitations, provides modest initial funding for selected Phase I projects (up to $150,000) for feasibility testing, and further Phase II funding (up to $1 million) for about one-half of Phase I projects.

From a methodology perspective, assessing this program presents formidable challenges. Among the more difficult are the following:

  • Lack of data. Tracking of outcomes varies widely across agencies, and no agency has implemented a fully effective tracking system. Nor are there successful systematic efforts by agencies to collect feedback from awardees.
  • Intervening variables. Analysis of small businesses suggests that they are often very path dependent and, hence, can be deflected from a given development path by a wide range of positive and negative variables. A single breakthrough contract—or technical delay—can make or break a company.
  • Lags. Not only do outcomes lag awards by a number of years, but also the lag itself is highly variable. Some companies commercialize within 6 months of award conclusion; others take decades. In addition, often the biggest impacts take many years to peak even after products have reached markets.

ESTABLISHING A METHODOLOGY

The methodology utilized in this second-round study of the SBIR program builds on the methodology established by the committee that completed the first-round study.

Publication of the 2004 Methodology

The committee that undertook the first-round study and the agencies under study formally acknowledged the difficulties involved in assessing SBIR programs. Accordingly, that study began with development of the formal

__________________

4 The most current description of these legislative objectives is in the Policy Guidance provided by the Small Business Administration (SBA) to the agencies. SBA Section 1.(c) SBIR Policy Directive, October 18, 2012, p. 3.


volume on methodology, which was published in 2004 after completing the standard Academies peer-review process.5

The established methodology stressed the importance of adopting a broad range of tools, building on prior work in this area. The committee concluded that appropriate methodological approaches

“…build from the precedents established in several key studies already undertaken to evaluate various aspects of the SBIR. These studies have been successful because they identified the need for utilizing not just a single methodological approach, but rather a broad spectrum of approaches, in order to evaluate the SBIR from a number of different perspectives and criteria.

This diversity and flexibility in methodological approach are particularly appropriate given the heterogeneity of goals and procedures across the five agencies involved in the evaluation. Consequently, this document suggests a broad framework for methodological approaches that can serve to guide the research team when evaluating each particular agency in terms of the four criteria stated above. [Table APP A-1] illustrates some key assessment parameters and related measures to be considered in this study.”6

TOOLS UTILIZED IN THE CURRENT SBIR STUDY

Quantitative and qualitative tools being utilized in the current study of the SBIR program include the following:

  • Case studies. The committee commissioned in-depth case studies of 11 SBIR recipients at NASA. These companies are geographically diverse, demographically diverse, funded by several different NASA Research Centers and Mission Directorates, and are at different stages of the company life cycle.
  • Workshops. The committee convened a number of workshops to allow stakeholders, agency staff, and academic experts to provide unique insights into the program’s operations, as well as to identify questions that need to be addressed.
  • Analysis of agency data. A range of datasets covering various aspects of agency SBIR activities were sought from NASA and other sources. The committee analyzed the data that were received and has included them as appropriate.

__________________

5 National Research Council, An Assessment of the Small Business Innovation Research Program: Project Methodology, 2.

6 Ibid.


TABLE APP A-1 Overview of Approach to SBIR Program Assessment

Quality of Research
  Questions: How does the quality of SBIR-funded research compare with that of other gov't-funded R&D?
  Measures: Peer-review scores; publication counts; citation analysis.
  Tools: Case studies; agency program studies; study of repeat winners; bibliometric analysis.
  Key research challenges: Difficulty of measuring quality and of identifying a proper reference group.

Commercialization of SBIR-Funded Research / Economic and Non-economic Benefits
  Questions: What is the overall economic impact of SBIR-funded research? What fraction of that impact is attributable to SBIR funding?
  Measures: Sales; follow-up funding; progress; initial public offering.
  Tools: Phase II surveys; program manager surveys; case studies; study of repeat winners.
  Key research challenges: Skew of returns; significant interagency and inter-industry differences.

Small Business Innovation/Growth
  Questions: How to broaden participation and replenish contractors? What is the link between SBIR and state/regional programs?
  Measures: Patent counts and other intellectual property; employment growth; number of new technology firms.
  Tools: Phase I and Phase II surveys; case studies; study of repeat winners; bibliometric analysis.
  Key research challenges: Measures of actual success and failure at the project and firm levels; relationship of federal and state programs in this context.

Use of Small Businesses to Advance Agency Missions
  Questions: How to increase agency uptake while continuing to support high-risk research?
  Measures: Agency procurement of products resulting from SBIR work.
  Tools: Program manager surveys; case studies; agency program studies; study of repeat winners.
  Key research challenges: Major interagency differences in use of SBIR to meet agency missions.

NOTE: Supplementary tools may be developed and used as needed.

SOURCE: National Research Council, An Assessment of the Small Business Innovation Research Program: Project Methodology, Washington, DC: The National Academies Press, 2004, Table 1, p. 3.

  • Open-ended responses from SBIR recipients. For the first time, the committee solicited textual responses in the context of the 2011 Survey, drawing more than 150 observations from respondents on the NASA SBIR program (respondents were asked to describe in their own words significant long-term impacts of the SBIR program on their company).
  • Agency interviews. Agency staff were consulted on the operation of the SBIR program, and most were helpful in providing information both about the program and about the challenges that they faced.
  • Literature review. Since the start of our research in this area, a number of papers have been published addressing various aspects of the SBIR program. In addition, other organizations, such as the Government Accountability Office (GAO), have reviewed particular parts of the program. These works are referenced in the course of this analysis.

Taken together with our committee deliberations and the expertise brought to bear by individual committee members, these tools provide the primary inputs into the analysis.

We would stress that, for the first-round study and for our current study, multiple research methodologies feed into every finding and recommendation. No findings or recommendations rest solely on data and analysis from the survey; conversely, data from the survey are used to support analysis throughout the report.

COMMERCIALIZATION METRICS AND DATA COLLECTION

Congressional discussions of the SBIR program in the context of the 2011 reauthorization reflected strong interest in the commercialization of technologies funded through SBIR. This enhanced focus is understandable: the investment made should be reflected in outcomes approved by Congress.

However, no simple definition of “commercialization” exists.7 Broadly speaking, in the context of the program it means funding for technology development beyond that provided under Phase II SBIR funding. Given the diversity of Mission Directorates and Centers at NASA, it is not surprising that there is considerable variation in the definition of commercialization and in the collection of data that can be used for assessment and measurement. Possible meanings and elements include the following (a sketch encoding this taxonomy follows the list):

  • issuance of a certified Phase III contract by NASA directly to the small firm;

__________________

7 See Chapter 5 (Quantitative Outcomes) for related analysis of commercialization in the SBIR program.

  • adoption of a technology by a NASA program;
  • utilization of a technology in space activities;
  • licensing of technologies to prime contractors (primes) and other parties serving NASA;
  • sale of products and services to primes for use on NASA systems (this may or may not include sale of data rights); and
  • any sale of goods or services derived from SBIR-funded technologies, to NASA or to other purchasers, including the U.S. private sector, other U.S.-based government agencies, and foreign buyers.
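To make the breadth of this working definition concrete, the list above can be encoded as a simple taxonomy. The sketch below is purely illustrative: the category names and the tallying helper are our own and do not correspond to any NASA or SBA data schema.

```python
from collections import Counter
from enum import Enum, auto

class CommercializationType(Enum):
    """Illustrative taxonomy of the commercialization categories listed above."""
    PHASE_III_CONTRACT = auto()     # certified Phase III contract issued by NASA
    NASA_PROGRAM_ADOPTION = auto()  # technology adopted by a NASA program
    SPACE_UTILIZATION = auto()      # technology utilized in space activities
    PRIME_LICENSING = auto()        # licensed to primes or other parties serving NASA
    PRIME_SALES = auto()            # products/services sold to primes for NASA systems
    OTHER_SALES = auto()            # any other sale derived from SBIR-funded technology

def tally(events: list[CommercializationType]) -> Counter:
    """Count commercialization events by category (hypothetical input records)."""
    return Counter(events)
```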

Challenges in Tracking Commercialization

The challenges involved in accurately tracking commercialization are formidable in scale and complexity, so it is useful to break the tracking issue into three broad components:

  • within NASA;
  • in the NASA primes and other companies serving NASA; and
  • all remaining commercialization.

Tracking Phase III Commercialization within NASA: FPDS

The primary mechanism for tracking commercialization within NASA should be the Federal Procurement Data System (FPDS), which records all agency procurement and should therefore include information on all Phase III contracts. However, as at DoD, and perhaps to an even greater extent, FPDS does not capture NASA Phase III contracts effectively. A recent GAO study concluded that

“…comprehensive and reliable technology transition data for SBIR projects are not collected. Transition data systems used by DOD provide some transition information but have significant gaps in coverage and data reliability concerns. The military departments have additional measures through which they have identified a number of successful technology transitions, but these efforts capture a limited amount of transition results.”8

__________________

8 U.S. Government Accountability Office, Small Business Innovation Research: DOD’s Program Supports Weapon Systems, but Lacks Comprehensive Data on Technology Transition Outcomes, GAO-14-96, December 20, 2013.


For FPDS to become a successful tracking tool, contracts that are in effect Phase III contracts must be marked as such. This, however, requires training for contracting officers across the agency, many of whom will have had no experience with SBIR. In addition, because designation as a Phase III contract carries with it significant data rights for the small business, contracting officers who do not want to share those rights may have an incentive to avoid the designation, according to company interviewees.

Although the Navy SBIR program has made a concerted effort by devoting dedicated staff time to improving the quantity and quality of Phase III contracts captured by FPDS, this effort is an outlier, and at NASA no such effort has been undertaken. As a result, we have no data showing the extent to which NASA Phase III contracts are so designated in FPDS. It does not appear that NASA utilizes FPDS data for any program management functions.
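To illustrate the kind of check that reliable designations would enable, the sketch below scans a hypothetical flat-file export of FPDS records and tallies NASA actions flagged as SBIR Phase III. The file layout and the column names (agency, description) are assumptions for the example, not the actual FPDS schema; and, for the reasons given above, any count obtained this way is only a floor.

```python
import csv

def count_phase_iii(fpds_export_path: str) -> tuple[int, int]:
    """Tally NASA contract actions whose description mentions SBIR Phase III.

    Assumes a hypothetical CSV export with 'agency' and 'description'
    columns; unmarked Phase III actions are invisible to such a scan,
    so the flagged count is a lower bound at best.
    """
    total = flagged = 0
    with open(fpds_export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("agency", "").upper() != "NASA":
                continue
            total += 1
            desc = row.get("description", "").upper()
            if "SBIR" in desc and "PHASE III" in desc:
                flagged += 1
    return flagged, total
```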

Tracking Commercialization Through NASA Primes

Activities resulting in commercialization on behalf of NASA present a further layer of complexity. Because these activities take the form of private contracts between a prime and the subcontractor (the SBIR awardee), NASA does not collect detailed data as part of the contracting process. Typically, data about the SBIR heritage of a subcontract are not collected by the prime and thus are not passed on to NASA for review.

Tracking Commercialization Outcomes Outside NASA and the NASA Primes

The contracting process sheds no light on activities outside NASA. Instead, all of the SBIR agencies must rely on reports from companies, provided either directly to the agency or through surveys conducted on the agency’s behalf. DoD (and, in the future, NIH) utilizes the former; NSF and DoE utilize the latter. NASA has recently settled on the former, with the addition of tracking modules to the Electronic Handbook (EHB) discussed in Chapter 3 (Initiatives).

Why New Data Sources Are Needed

Congress often seeks evidence about the effectiveness of programs or indeed about whether they work at all. This interest has in the past helped to drive the development of tools such as the Company Commercialization Record database at DoD. However, in the long term the importance of tracking lies in its use to support program management. By carefully analyzing outcomes and associated program variables, program managers should be able to manage more successfully.


We have seen significant limitations to all of the available data sources. FPDS captures a limited dataset, and even that is not accurate, especially with regard to Phase III designations. Data from the primes are often not reported directly. Although self-reporting through the EHB is growing, it is far from comprehensive at NASA.

BEYOND COMMERCIALIZATION METRICS

Although Congressional interest has focused primarily on commercialization in recent years, there are four congressionally mandated objectives for the SBIR program, and commercialization is only one of them. The data collection tools described above focus almost exclusively on that objective; in general, they have no capability for collecting data about the other three program objectives described in the introduction to this appendix. Some data from NASA’s Electronic Handbook were ultimately made available for this study, but they were too incomplete to be utilized; the EHB nonetheless holds substantial promise for eventually helping NASA to address concerns about data collection.

OVERVIEW OF THE SURVEY

Our analysis of the SBIR program at NASA makes extensive use of case studies, interviews, and other qualitative methods of assessment. These sources remain important components of our overall methodology, and Chapter 7 (Insights) is devoted to lessons drawn from case studies and other qualitative sources. But qualitative assessment alone is insufficient.

The Role of the Survey

The survey offers several significant advantages over other data sources, as follows:

  • covers all kinds of commercialization inside and outside of NASA;
  • provides a rich source of textual information in response to open-ended questions;
  • probes more deeply into company demographics and agency processes;
  • addresses principal investigators (PIs), not just company business officials;
  • allows comparisons with previous data-collection exercises; and
  • addresses other Congressional objectives for the program beyond commercialization.

At the same time, however, we are fully cognizant of the limitations of this type of observational survey research in this case. To address these issues


while retaining the utility and indeed explanatory power of survey-based methodology, this report contextualizes the data by comparing results to those from the survey conducted as part of the first-round assessment of the SBIR program (referred to below as the “2005 Survey”9). This report also adds transparency by publishing the number of responses for each question and indeed each subgroup, thus allowing readers to draw their own conclusions about the utility of the data.

We contracted with Grunwald Associates LLC to administer a survey to NASA award recipients. This survey is based closely on the 2005 Survey but is also adapted to lessons learned and includes some important changes discussed in detail below. A methodology subgroup of the committee was charged with reviewing the survey and the reported results for best practice and accuracy. The survey was carried out simultaneously with surveys focused on the SBIR programs at NSF and DoD.10

The primary objectives of the 2011 survey were as follows:

  • Provide an update of data collected in the Academies survey completed in 2005, maximizing the opportunity to identify trends within the program;
  • Probe more deeply into program processes, with the help of expanded feedback from participants and better understanding of program demographics;
  • Improve the utility of the survey by including a comparison group; and
  • Reduce costs and shrink the time required by combining three 2005 survey questionnaires—for the company, Phase I, and Phase II awards—into a single questionnaire.

Box A-1 identifies multiple sources of bias in survey response.

Survey Characteristics

In order to ensure maximum comparability for a time series analysis, the survey for the current assessment was based as closely as possible on previous surveys, including the 2005 Survey and the 1992 GAO survey.

Given the limited population of Phase II awards, the starting point for consideration was to deploy one questionnaire per Phase II award. However, we

__________________

9 The survey conducted as part of the current, second-round assessment of the SBIR program is referred to below as the “2011 Survey” or simply the “survey.” In general, throughout the report, any survey references are understood to be to the 2011 Survey unless specifically noted otherwise.

10 Delays at NIH and DoE in contracting with the Academies, combined with the need to complete work contracted with DoD, NSF, and NASA, led the committee to proceed with the survey at three agencies only.


were also aware that the survey imposes burdens on respondents. Given the detailed and hence time-consuming nature of the survey, it would not be appropriate to over-burden potential recipients, some of whom were responsible for many awards over the years.

An additional consideration was that this survey was intended to add detail on program operations, rather than focusing primarily on program outcomes as the 2005 Survey did. Agency clients were especially interested in probing operations more deeply. We decided that it would be more useful and effective to administer the survey to PIs—the lead researcher on each project—rather than to the registered company point of contact (POC), who in many cases would be an administrator rather than a researcher.

The survey was therefore designed to collect the maximum amount of relevant data, consistent with our commitment to minimize the burden on


individual respondents and to maintain maximum continuity between surveys. Survey questionnaires were to be sent to PIs of all projects that met selection characteristics, with a maximum of two questionnaires per PI. (The selection procedure is described below in “Initial Filters for Potential Recipients”.)

Based on reviewer feedback about the previous round of assessments, we also attempted to develop comparison groups that would provide the basis for further statistical analysis. This effort was eventually abandoned (see comparison group analysis section below).

Key similarities and differences between the 2005 and 2011 Surveys are captured in Table A-2.

The 2011 Survey included awards made from FY1998 to FY2007 inclusive. This end date allowed for completion of Phase II awards (which nominally fund 2 years of research) and provided a further 2 years for commercialization. This time frame was consistent with the 2005 Survey, which surveyed awards from FY1992 to FY2001 inclusive. It was also consistent with a previous GAO study, published in 1992, which surveyed awards made through 1987.

The aim of setting the overall time frame at 10 years was to reduce the difficulty of generating information about older awards: some companies and PIs may no longer be in place, and memories fade over time. Even so, reaching back to awards made in FY1998, while ensuring comparability with earlier surveys, generated relatively few results from the oldest awards.

Determining the Survey Population

Following the precedent set by both the original GAO study and the first-round study of the SBIR program, we differentiated between the total population of awards, the preliminary survey target population of awards, and the effective population of awards for this study.

Two survey response rates were calculated. The first uses the effective survey population of awards as the denominator, and the second uses the preliminary population of awards as the denominator.
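Stated as formulas (notation ours): with $R$ the number of awards for which responses were received, $N_{\mathrm{eff}}$ the effective population of awards, and $N_{\mathrm{prelim}}$ the preliminary population of awards,

$$r_{\mathrm{eff}} = \frac{R}{N_{\mathrm{eff}}}, \qquad r_{\mathrm{prelim}} = \frac{R}{N_{\mathrm{prelim}}}.$$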

From Total Population of Awards to Effective Population

Upon acquisition of data from the sponsoring agencies (DoD, NSF, and NASA) covering record-level lists of awards and recipients, initial and secondary filters were applied to reach the preliminary survey population and ultimately the effective survey population. These steps are described below.

Initial Filters for Potential Recipients: Identifying the Preliminary Survey Population

From this initial list, determining the preliminary survey population required the following steps (listed after Table A-2):


TABLE A-2 Similarities and Differences: 2005 and 2011 Surveys

Item                                               2005 Survey   2011 Survey

Respondent selection
  Focus on Phase II winners                        Yes           Yes
  Inclusion of Phase I winners                     Yes           Yes
  All qualifying awards                            No            Yes
  Respondent = PI                                  No            Yes
  Respondent = POC                                 Yes           No
  Max number of questionnaires                     <20           2

Distribution
  Mail                                             Yes           No
  Email                                            Yes           Yes
  Telephone follow-up                              Yes           Yes

Questionnaire
  Company demographics                             Identical     Identical
  Commercialization outcomes                       Identical     Identical
  IP outcomes                                      Identical     Identical
  Women and minority participation                 Yes           Yes
  Additional detail on minorities                  No            Yes
  Additional detail on PIs                         No            Yes
  New section on agency staff                      No            Yes
  New section on company recommendations for SBIR  No            Yes
  New section capturing open-ended responses       No            Yes
  • elimination of records that did not fit the protocol agreed upon by the committee—namely, a maximum of two questionnaires per PI (in cases where PIs received more than two awards, the awards were selected by agency [NASA, NSF, DoD, in that order], then by year [oldest], and finally by random number; this selection rule is sketched after the list); and
  • elimination of records for which there were significant missing data—in particular, where emails and/or contact telephone numbers were absent.
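A minimal sketch of the selection rule just described, assuming each award record is a small dict with 'pi', 'agency', and 'year' fields (our layout, not the agencies'); records lacking usable email or telephone contacts are assumed to have been dropped already. The sketch also applies the FY1998-FY2007 award window described above.

```python
import random
from collections import defaultdict

# Tie-break order specified by the committee's protocol:
# agency (NASA, NSF, DoD, in that order), then oldest year, then random.
AGENCY_RANK = {"NASA": 0, "NSF": 1, "DoD": 2}

def select_awards(awards: list[dict], max_per_pi: int = 2) -> list[dict]:
    """Apply the two-questionnaires-per-PI cap over FY1998-FY2007 awards."""
    in_window = [a for a in awards if 1998 <= a["year"] <= 2007]

    by_pi: dict[str, list[dict]] = defaultdict(list)
    for award in in_window:
        by_pi[award["pi"]].append(award)

    selected = []
    for pi_awards in by_pi.values():
        # Sort each PI's awards by the protocol's tie-break order and keep two.
        pi_awards.sort(key=lambda a: (AGENCY_RANK.get(a["agency"], len(AGENCY_RANK)),
                                      a["year"],
                                      random.random()))
        selected.extend(pi_awards[:max_per_pi])
    return selected
```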

This process of excluding awards, either because they did not fit the protocol agreed upon by the committee or because the agencies did not provide sufficient or current contact information, reduced the total award list provided by the three agencies to a preliminary survey population of


approximately 15,000 awards. From this, the preliminary survey population of Phase II SBIR and STTR awards for NASA was 1,131 awards.

Secondary Filters to Identify Recipients with Active Contact Information: Identifying the Effective Population

This preliminary population still included many awards for which the PI contact information appeared complete but for which the PIs were no longer associated with the contact information provided, and hence were effectively unreachable. This is not surprising, given the considerable turnover both in small businesses themselves and in their personnel, and given that the survey reaches back 13 years to awards made in FY1998. PIs may have left the company, the company may have ceased to exist or been acquired, or telephone and email contacts may have changed, for example. Consequently, two further filters were utilized to help identify the effective survey population.

  • First, PI contacts were eliminated—and hence the awards assigned to those PI contacts were eliminated—for which the email address bounced twice. Because the survey was delivered via email, the absence of a working email address disqualified the recipient PI and associated awards. This eliminated approximately 30 percent of the preliminary population (340 awards).
  • Second, efforts were made to determine whether non-bouncing emails were in fact still operative. Email addresses that did not officially “bounce” (i.e., return to sender) may nonetheless be inactive: some email systems are configured to delete unrecognized email without sending a reply, and in other cases addresses are inactive but not deleted. A non-bouncing email address therefore did not equal a contactable PI. In order to identify non-contactable PIs, we undertook an extensive telephone survey, calling every PI with an award in the preliminary survey population at NASA who did not respond to the first round of questionnaire deployment. On the basis of responses to the telephone survey, we ascertained that the PIs for a further 27 percent of the preliminary population’s awards were in fact not contactable even though their email addresses did not bounce.

There was little variation between agencies or between programs in the quality of the lists provided by the agencies, based on these criteria.11

__________________

11 The share of preliminary contacts that turned out to be not contactable was higher for this survey than for the 2005 Survey. We believe this is primarily because the company points of contact (POCs) to whom the 2005 Survey was sent, often senior company executives, have less churn than do principal investigators (PIs).


Following the application of these secondary filters, the effective population of NASA Phase II awardees was 490.

Deployment

The survey opened on October 4, 2011, and was deployed by email, with voice follow-up support. Up to four emails were sent to the PIs for the effective population of awards (emails were discontinued once responses were received or it was determined that the PI was non-contactable). In addition, two voice mails were delivered to non-responding PIs of awards in the effective population, between the second and third and between the third and fourth rounds of email. In total, up to six efforts were made to reach each PI who was sent an award questionnaire.
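A sketch of this contact cadence, with the ordering of attempts encoded explicitly; the callables are assumed hooks into whatever platform delivers the emails and voicemails, and the predicate names are ours.

```python
from typing import Callable

def contact_pi(pi: str,
               send_email: Callable[[str], None],
               leave_voicemail: Callable[[str], None],
               resolved: Callable[[str], bool]) -> None:
    """Run up to six contact attempts per PI: four emails and two voicemails.

    Voicemails fall between email rounds 2-3 and 3-4, as described above.
    'resolved' is an assumed predicate that becomes true once the PI has
    responded or has been determined to be non-contactable.
    """
    attempts = [send_email, send_email, leave_voicemail,
                send_email, leave_voicemail, send_email]
    for attempt in attempts:
        if resolved(pi):
            return
        attempt(pi)
```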

After members of the data subgroup of the committee determined that additional efforts to acquire new responses were not likely to be cost effective, the survey was closed on December 19, 2011. The survey was therefore open for a total of 11 weeks.

Response Rates

Standard procedures were followed to conduct the survey. These data collection procedures were designed to increase response to the extent possible within the constraints of a voluntary survey and the survey budget. The population surveyed is a difficult one to contact and to obtain responses from, as evidence from the literature shows.12 Under these circumstances, the inability to contact and obtain responses always raises questions about potential bias in the estimates, questions that cannot be resolved without substantial extra efforts requiring resources beyond those available. (See Box A-1 for a discussion of potential sources of bias.)

The lack of detailed applications data from the agency makes it impossible to estimate the possible impact of non-response bias. We therefore have no evidence either that non-response bias exists or that it does not. For the areas where the survey overlaps with other data sources (notably DoD’s mandatory Company Commercialization Record [CCR] database), results from the survey and the DoD data are similar.

Table A-3 shows the response rates at NASA, based on both the preliminary study population and the effective study population after all adjustments.

__________________

12 Many surveys of entrepreneurial firms have low response rates. For example, Aldrich and Baker (1997) found that nearly one-third of surveys of entrepreneurial firms (whose results were reported in the academic literature) had response rates below 25 percent. See H. E. Aldrich and T. Baker, “Blinded by the Cites? Has There Been Progress in Entrepreneurship Research?” pp. 377-400 in D. L. Sexton and R. W. Smilor (eds.), Entrepreneurship 2000, Chicago: Upstart Publishing Company, 1997.


TABLE A-3 2011 Survey Response Rates at NASA

Preliminary Population of Awards                               1,131
Awards for which the PIs Were Not Contactable                   -641
Effective Population of Awards                                   490
Number of Awards for which Responses Were Received               179
Responses as Percentage of Effective Population of Awards       36.5
Responses as Percentage of Preliminary Population of Awards     15.8

SOURCE: 2011 Survey.
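As a quick arithmetic check on Table A-3, a short sketch reproducing both response rates from the reported counts (variable names ours):

```python
preliminary = 1131      # preliminary population of awards (Table A-3)
not_contactable = 641   # ~340 bounced twice plus ~301 found unreachable by telephone
effective = preliminary - not_contactable   # 490
responses = 179         # awards for which responses were received

print(f"Effective response rate:   {responses / effective:.1%}")     # 36.5%
print(f"Preliminary response rate: {responses / preliminary:.1%}")   # 15.8%
```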

The survey primarily reached companies that were still in business: overall, 97 percent of PIs responding for an award in the effective population indicated that the companies were still in business.13

Effort at Comparison Group Analysis

Several readers of the first-round reports on the SBIR program suggested inclusion of comparison groups in the analysis. There is, however, no simple way to acquire a comparison group for Phase II SBIR awardees. These are technology-based companies at an early stage of development that have demonstrated the capacity to undertake challenging technical research and have provided evidence that they are potentially successful commercializers. Given that the operations of the SBIR program are defined in legislation and limited by the Policy Guidance provided by SBA, randomly assigned control groups were not a possible alternative.

Efforts to identify a pool of SBIR-like companies were made by contacting the most likely sources (Dun & Bradstreet and Hoovers), but these efforts were not successful, as insufficiently detailed and structured information about companies was available.

In response, we sought to develop a comparison group from among Phase I awardees that had not received a Phase II award from the three surveyed agencies (DoD, NSF, and NASA) during the award period covered by the survey (1999-2008). After considerable review, however, we concluded that the Phase I-only group was also not appropriate for use as a statistical comparison group.

__________________

13 2011 Survey, Question 4A.
