
Revisiting the Department of Defense SBIR Fast Track Initiative (2009)

Chapter: Appendix A: Survey Methodology and Administration

Suggested Citation:"Appendix A: Survey Methodology and Administration." National Research Council. 2009. Revisiting the Department of Defense SBIR Fast Track Initiative. Washington, DC: The National Academies Press. doi: 10.17226/12600.


Appendix A

Survey Methodology and Administration

This appendix provides a description of the survey, including how the survey was administered.

A. SAMPLE SELECTION

The selection of the survey sample involved determining which Phase II awards and firms should be surveyed. The projects selected for this sample included DoD Fast Track award winners, Phase II Enhancement awards, and other DoD awards (neither Fast Track nor Phase II Enhancement) selected as a control group.1

The prior GAO study of SBIR commercialization in 1991 showed that it often takes several years after completion of Phase II before significant sales are achieved.2 The 1991 survey questionnaire was sent to all Phase II awardees from the first four years—1984 through 1987—in which the agencies made Phase II awards. The 2000 Fast Track study by the NRC did not have the luxury of allowing the surveyed Fast Track awards four years to commercialize. Since there was no Fast Track prior to 1997, and the study had to be completed in 1999, Phase II awards made in 1997 were surveyed in 1999.3

1 In selecting the Phase II awards for a control group survey, random samples were drawn from each award year from 1997 to 2002, ensuring that the overall average age was comparable to that of the Fast Track sample and that the number per service was comparable to the number per service from the combination of the Fast Track and Phase II Enhancement samples. Distribution of the control group was also comparable to the distribution of Fast Track and Phase II Enhancement by state. In aggregate, 392 control projects were selected. A larger sample of control awards was selected on the assumption that these awards might have a lower response rate.
2 U.S. General Accounting Office, Small Business Innovation Research Shows Success But Can Be Strengthened, GAO/RCED-92-37, Washington, DC: U.S. Government Printing Office, March 1992.
3 Although average sales per Fast Track Phase II award in the 1999 survey were more than double those of the average control group Phase II award, Fast Track average sales were less than one third of the sales reported by GAO per Phase II award, which had had two to five more years to commercialize.

For the current study effort, the GAO survey methodology (which is also being used in the ongoing larger NRC study of SBIR at five agencies) could be applied.4 Thus, the sample for the 2006 survey included all Fast Track Phase II awards made from program inception through 2002.5 All 250 Phase II Fast Tracks awarded through 2002 were surveyed.

Although Phase II Enhancement was announced in a 1999 solicitation, DoD components began making Enhancement awards in 1999 as modifications to Phase II contracts that had been awarded in 1997. Hence, for both programs the initial Phase II awards were made in 1997. In 1997, three times as many proposals received Fast Track Phase II contracts as proposals that followed the standard Phase II award process but subsequently received a Phase II Enhancement. The growth in the number of Phase II Enhancements and the decline in the number of Fast Track awards was such that the number of proposals awarded Phase II in 2002 that subsequently received Phase II Enhancement awards exceeded the number receiving Fast Track awards in 2002 by a factor of three. Fast Track and Phase II Enhancement are not mutually exclusive: 24 of the 250 Fast Track awards made by 2002 also received Phase II Enhancement awards. The sample of 219 Phase II Enhancement awards surveyed was selected to be comparable to the Fast Track awards.6

In selecting the Phase II awards for a control group survey, random samples were drawn from each award year from 1997 to 2002. This ensured that the overall average age was comparable to that of the Fast Track sample and that the number per service was comparable to the number per service from the combination of the Fast Track and Phase II Enhancement samples. Distribution of the control group was also comparable to the distribution of Fast Track and Phase II Enhancement by state. In aggregate, 392 control projects were selected. A larger sample of control awards was selected on the assumption that these awards might have a lower response rate.

4 For a review of the methodology, see National Research Council, An Assessment of the Small Business Innovation Research Program—Project Methodology, Washington, DC: The National Academies Press, 2004.
5 Two Phase II awards from the 1996 solicitations occurred in 1996; the remainder of the awards from those solicitations were not made until 1997.
6 A survey of 100 percent of the 384 Phase II awards through 2002 that received Enhancement awards would have been disproportionately skewed toward the Phase II award years 2001 and 2002, resulting in the average Phase II Enhancement award surveyed having more than a year less time to commercialize than the average Fast Track award. At the time of the survey, the average age for sampled awards was: Fast Track, 7.2 years; Phase II Enhancement, 6.9 years; and control group, 7.2 years.

B. ADMINISTRATION OF THE SURVEY

The questionnaire used in the 1999 National Research Council assessment of SBIR at the Department of Defense, SBIR: An Assessment of the Department of Defense Fast Track Initiative, evolved from the earlier GAO survey. Both surveys asked questions about the firm and questions about the specific Phase II award. In the 2008 NRC SBIR study, the NRC selected questions that were a further evolution of the questions used in surveys for the 2000 NRC Fast Track report. Eighty percent of the questions from the earlier NRC study were incorporated, and 24 new questions were added to capture both the commercial and the non-commercial aspects of SBIR, including knowledge-base impacts, and to gain insight into the impacts of program management. However, the NRC recognized that many firms would be surveyed about multiple awards. Rather than asking questions about the firm on each Phase II survey, the 2005 questionnaire was divided into a firm survey and a separate Phase II award survey. This same format was applied for the 2006 study. Four additional questions dealing with Fast Track and Phase II Enhancement were added to both the firm and Phase II surveys.7

The section above on sample selection described how the Phase II awards to be surveyed were selected. Once an award was selected, the firm responsible for that award was added to the firm sample. Some firms had more than one award selected. Surveys (one per firm and one per sampled award) were emailed to the 601 firms conducting the 807 projects (control group plus study sample) on April 19, 2006. Subsequent emails were sent to an additional 30 firms (each with a single Phase II award).8 The characteristics of the firms in the sample are described below. This sample was used to mail out the survey and as a basis for selecting firms to be subsequently interviewed.
Award information, including addresses, principal investigators (PIs), and phone numbers (all of the characteristics used in matching), as well as other database information such as award amounts, dates, contract numbers, and scheduled durations, was provided to the investigators to assist in selecting firms for interviews. Information was also provided to enable a survey of the government technical points of contact.

7 See survey questions in Appendix B (Firm) and Appendix C (Phase II Award).
8 Thus the total sample was 837 awards to 631 firms.

Advantages and Disadvantages of On-line Surveys

The surveys were administered online, using a web server. The formatting, encoding, and administration of the survey were subcontracted to BRTRC, Inc., of Fairfax, VA. There are many potential advantages to online surveys, including cost, speed, and flexibility in questions. Because response rates become apparent as an online survey progresses, they indicate where follow-up with non-respondents is needed. Hyperlinks provide amplifying information, and built-in quality checks control the internal consistency of the responses. Finally, online surveys allow dynamic branching of question sets, with some respondents answering selected subsets of questions but not others, depending on prior responses.

The web surveys for the NRC's 2008 SBIR reports—administered in 2005—also made clear that there are disadvantages to attempting online surveys in an era of viruses, worms, spam blockers, and phishing. Survey recipients are increasingly suspicious of unsolicited email that requires interaction and that requests detailed information. Despite these disadvantages, which limit response rates, the existence of the encoded 2005 survey, the responses to that survey, and the already established web site procedures for survey administration together made the use of a similar online survey the most cost-effective approach for administering—in 2006—the DoD Phase II Enhancement survey.

Since many of the firms and some of the Phase II projects had responded to the survey administered in 2005, the research team wanted to avoid asking firms for answers already provided. Surveys to such firms were linked to their prior answers. For example, historical information from the prior survey, such as the year the firm was founded, was not displayed again. When the response to a previously answered question could have changed, such as the number of SBIR Phase II awards the firm had received from the federal government, the question was displayed with the prior answer filled in. The firm could then accept the earlier answer or change it.

The conduct of an online survey required knowing the email address of the correct official. An SBIR point of contact (POC) and email address were available for every firm that had submitted for a DoD SBIR since 1999. However, only limited email addresses were available for the remainder of the firms, and firms update their information only when they submit new proposals to DoD. Firms frequently move as they grow or shrink; new POCs are added; and email systems are often changed. The decision to use an online survey meant that the first step of survey distribution was an outreach effort to establish contact with the firms.

Establishment of Contact

If point of contact and email information was not available, or if the available information had failed to work in the 2005 study, a search was conducted to acquire new information. Contact was attempted by calling the agency-provided phone number for the firm, then by using the Central Contractor Registration database, Business.com (powered by Google), and Switchboard.com. When an apparent match was found, the firm was called to verify that it was in fact the firm that had completed the SBIR. Often firms had phone numbers that seemed correct, but no one answered and calls were not returned.

At the conclusion of this effort, no email address could be determined for 24 firms.

To enhance cooperation with the survey further, an advance letter from the NRC study director, Dr. Charles W. Wessner, was sent to each of the selected firms three weeks prior to the survey. The letter described the purpose and importance of the study and requested cooperation in completing the survey. As expected from the earlier studies, a number of advance letters (8 percent) were returned as undeliverable. On the return of these undelivered letters, the firm was looked up in Internet yellow pages, the Central Contractor Registration database, Business.com, and Switchboard.com to try to find correct address information. If a new address and phone number were found, the firm was contacted to verify that it was indeed the correct firm and, where possible, to obtain a point of contact (POC) to whom the survey could be addressed. Attempts were also made to contact the PI listed in the DoD awards database. Once a POC was identified, the email address list for the survey was updated. For POCs identified after distribution of the survey, a survey request was emailed to the POC.

On April 19, 2006, the survey was announced by email to the previously identified points of contact. Ninety-four of the 607 emails could not be delivered. These "bounced" emails led to a new search effort, which ultimately updated 34 email addresses.

High Response Rates

By November 1, 2006, seven months into the survey, 240 responses had been received. Eighty-four firms, responsible for 120 sampled projects, could not be contacted because of incorrect or missing email addresses. Six of the firms were known to have been acquired, and two were known to be out of business. Using the same methodology the GAO had used in 1992, undeliverables and out-of-business firms were eliminated prior to determining the response rate. Although 837 projects were surveyed, 120 were eliminated as described. This left 717 projects, of which 240 responded, a 33 percent response rate. Similarly, the 232 firm surveys completed represented a 42 percent response rate from firms. Considering the length of the survey and its voluntary nature, this rate was relatively high and reflects the interest of the participants in the SBIR program. The sample groupings and their address and response data are shown in Tables App-A-1 and App-A-2.

C. FAST TRACK REVISITED, AND INITIAL EVALUATION OF PHASE II ENHANCEMENT

We now turn to a description of the characteristics of the survey sample groups and to copies of the announcement letters. Slightly different announcement letters were sent to firms that had responded to the 2005 study than to those that had not.
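The response-rate arithmetic described above, in which undeliverables and out-of-business firms are removed from the denominator per the GAO's 1992 methodology, can be sketched as follows. The figures come from the text; the function itself is illustrative, not part of the study's tooling.

```python
def response_rate(surveyed: int, eliminated: int, completed: int) -> float:
    """Percent responding, after removing undeliverable and
    out-of-business cases from the denominator (GAO-style)."""
    eligible = surveyed - eliminated
    return 100.0 * completed / eligible

# Project surveys: 837 surveyed, 120 eliminated, 240 completed.
print(round(response_rate(837, 120, 240)))  # 33, matching the text

# Firm surveys: 631 firms, 84 unreachable, 232 completed.
print(round(response_rate(631, 84, 232)))   # 42, matching the text
```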

Phase II Award Sample

All but two of the Phase II awards sampled were awarded from 1997 to 2002. Fast Track began with the first solicitation of 1996; thus, given the time needed to select, award, and execute the Phase I and to select and award the Phase II, most of the earliest Fast Track Phase II awards were made in 1997. Two of these initial Fast Track Phase II awards were conferred in 1996. Any Phase II awarded after 2002 was considered to lack sufficient time to have commercialized by the release of the survey in the spring of 2006.

Fast Track

All 250 Phase II Fast Tracks awarded through 2002 were surveyed.

Phase II Enhancement

In 1997, three times as many proposals received Fast Track Phase II contracts as proposals that followed the standard Phase II award process but subsequently received a Phase II Enhancement. The growth in the number of Phase II Enhancements and the decline in the number of Fast Track awards was such that the number of proposals awarded Phase II in 2002 that subsequently received Phase II Enhancement awards exceeded the number receiving Fast Track awards in 2002 by a factor of three. Sampling 100 percent of the Phase II Enhancements awarded their Phase II from 1997 to 2002 would have resulted in the average age of the Phase II Enhancement sample being over a year younger than that of the average Fast Track award. This reduction in time to commercialize would have distorted the results. Consequently, 100 percent of the early Phase II Enhancements and a random sample of the later years were selected. The sample consisted of 219 Phase II Enhancement awards (including 24 that were also Fast Track) awarded their Phase II from 1997 to 2002.

Control Group

The control group sample was randomly selected to approximate the distribution of the other two samples. All 392 Phase II awards were conferred from 1997 to 2002.
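The sampling rule just described, taking every early Phase II Enhancement and a random sample from later award years so that the sample's average age stays comparable to the Fast Track sample's, can be sketched as below. The award counts, year cutoff, and per-year sample size here are illustrative, not the study's actual distribution.

```python
import random
from statistics import mean

def stratified_sample(awards_by_year, take_all_through, n_per_later_year):
    """100 percent of awards through `take_all_through`; a simple random
    sample of `n_per_later_year` awards from each later year."""
    chosen = []
    for year in sorted(awards_by_year):
        pool = awards_by_year[year]
        if year <= take_all_through:
            chosen.extend((year, a) for a in pool)
        else:
            k = min(n_per_later_year, len(pool))
            chosen.extend((year, a) for a in random.sample(pool, k))
    return chosen

# Hypothetical pools of 40 awards per award year, 1997-2002.
pools = {y: list(range(40)) for y in range(1997, 2003)}
sample = stratified_sample(pools, take_all_through=1999, n_per_later_year=25)

# Average award age at the 2006 survey; with fixed per-year counts this
# value does not depend on which later-year awards were drawn.
avg_age = mean(2006 - year for year, _ in sample)
print(len(sample), round(avg_age, 1))  # 195 awards, average age 6.8 years
```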
Average Time Since Award

At the time of the survey, the average age for sampled awards was: Fast Track, 7.2 years; Phase II Enhancement, 6.9 years; and control group, 7.2 years.

Firm Characteristics

The selection of firms was determined by the selection of Phase II awards. Every firm that had a Phase II award in one or more of the award sample groups was put in the firm sample. Although most firms had only a single award sampled, the 837 awards sampled resulted in only 631 firms.
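Because firm selection is driven entirely by award selection, building the firm sample (837 sampled awards resolving to 631 firms, one firm survey each) amounts to de-duplicating the sampled awards' firm identifiers. A minimal sketch, with invented firm names:

```python
# Each sampled award record carries the firm that won it (invented data).
sampled_awards = [
    {"award": "FT-1997-001", "firm": "Acme Research"},
    {"award": "EN-1999-014", "firm": "Acme Research"},   # second award, same firm
    {"award": "CG-2001-203", "firm": "Beta Photonics"},
]

# One firm survey per distinct firm, however many of its awards were sampled.
firm_sample = sorted({rec["firm"] for rec in sampled_awards})
print(len(sampled_awards), "awards ->", len(firm_sample), "firm surveys")  # 3 awards -> 2 firm surveys
```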

Firms could not be categorized as Fast Track firms, Phase II Enhancement firms, or control group firms, since a firm could also have awards in one of the other categories.9 Overall characteristics of the 631 sampled firms are shown in Table App-A-3.

D. UNDERSTANDING SURVEY RESPONSE RATES

Response rates can serve as a valuable statistic for judging the quality of surveys.10 A small survey response may limit the statistical power and credibility of surveyed data. However, low response rates do not necessarily imply bias, and there appears to be no commonly used standard for an "acceptable" level of survey response. Survey response rates, meanwhile, continue a decades-long decline, a development of growing concern among survey experts.

Declining Response Rates

Survey response rates have long been declining in the United States.11 The overall response rates for web surveys are now typically less than 30 percent.12 Response rates for web or paper surveys sent by email reflect this declining trend. One analysis of response rates to 31 email surveys conducted between 1986 and 1999 reported a mean response rate of 36.83 percent, but the subset of surveys conducted in the 1998 to 1999 period reported a mean response rate of 31 percent.13 According to a 2006 study, mean response rates for surveys of executives have declined, with an overall response rate of 32 percent.14

Survey fatigue may account for some of the decline in survey response rates. Given their relative speed, low cost, and ease of administration, the number of emailed web surveys has risen in the United States. This increase in surveying has, however, led to a rising refusal rate among survey recipients.15

9 Some sampled firms had awards sampled in more than one category. Many firms sampled for awards in one category had awards that were not sampled in other categories, e.g., a sampled Fast Track award and an unsampled Phase II Enhancement in the same time frame.
10 P. P. Biemer and L. E. Lyberg, Introduction to Survey Quality, New York: John Wiley and Sons, 2003.
11 Edith de Leeuw and Wim de Heer, "Trends in Household Survey Nonresponse: A Longitudinal and International Comparison," in Robert M. Groves, Don A. Dillman, John L. Eltinge, and Roderick J. A. Little, eds., Survey Nonresponse, New York: Wiley, 2002, pp. 41-54.
12 Michael D. Kaplowitz, Timothy D. Hadlock, and Ralph Levine, "A Comparison of Web and Mail Survey Response Rates," Public Opinion Quarterly, 68(1):94-101, 2004.
13 Kim Sheehan, "E-mail Survey Response Rates: A Review," Journal of Computer-Mediated Communication, 6(2), 2001.
14 Cynthia S. Cycyota and David A. Harrison, "What (Not) to Expect When Surveying Executives: A Meta-Analysis of Top Manager Response," Organizational Research Methods, 9:133-160, 2006.
15 Kim Sheehan, "E-mail Survey Response Rates: A Review," op. cit.

The growth of malicious Internet viruses has also led to the widespread use of filtering software that deletes unsolicited emails, lowering survey response rates.16

Gauging the Quality of Response Rates

Low survey responses may compromise the sample size, the statistical power, and the credibility of the data, and may compromise the ability to generalize from the collected data.17 However, low response rates do not necessarily suggest bias, because the respondents' characteristics may still be representative of the population from which the sample was drawn.18 Gauging the quality of response rates thus depends on evaluating how well the analysis characterizes the non-responders and the extent to which the non-response is linked to the information sought in the survey.

No Formal Minimum Threshold

Reflecting this disjuncture, there appears to be no standard for a minimally acceptable response rate. A survey of leading academic journals by the American Association for Public Opinion Research found no consensus on a cutoff threshold, with several journal editors noting that they often judge the validity of a survey's sample size on a case-by-case basis.19

Improving Response Rates

Response rates can be improved by pre-notification letters from reputable organizations, by keeping surveys short, and by sending follow-up reminders. The salience of the surveyed issue to the persons being surveyed is also a factor in improving response rates.20

16 Ibid.
17 S. Rogelberg, C. Spitzmüller, I. Little, and S. Reeve, "Understanding Response Behavior to an Online Special Topics Organizational Satisfaction Survey," Personnel Psychology, 59:903-923, 2006.
18 D. Dillman, Mail and Internet Surveys: The Tailored Design Method, 2nd Edition, Toronto, Ontario: John Wiley and Sons, Inc., 2000.
19 Timothy Johnson and Linda Owens, "Survey Response Rate Reporting in the Professional Literature," paper presented at the 58th Annual Meeting of the American Association for Public Opinion Research, Nashville, TN, May 2003.
20 Kim Sheehan, "E-mail Survey Response Rates: A Review," op. cit.

SURVEY ANNOUNCEMENT LETTER

Director, Technology, Innovation, and Entrepreneurship
500 Fifth Street, NW
Washington, DC 20001
Phone: 202 334 3801  Fax: 202 334 1813
Email: cwessner@nas.edu

16 March 2006

CEO or President
[Firm]
[Address]
[City], [State] [Zip]

Dear Sir or Madam,

I am writing to request your assistance with a study being carried out by the National Academy of Sciences at the request of the Congress to evaluate ways in which the Small Business Innovation Research (SBIR) program could be improved.

To carry out the study, the National Academies appointed a distinguished Steering Committee headed by the Honorable Dr. Jacques S. Gansler, former Under Secretary of Defense for Acquisition, Technology and Logistics (for additional information on the study, please see <http://www7.nationalacademies.org/sbir>). As part of its analysis, the Committee commissioned a major survey of awardees that was conducted in 2005. Twelve hundred firms completed surveys in 2005.

I am asking now for your assistance in evaluating two important DoD SBIR initiatives—Fast Track and Phase II Enhancement. A new survey will include all Phase II awards made from 1997 to 2002 that were either Fast Track or Phase II Enhancement, as well as a control group consisting of an equal number of standard Phase II awards. The surveys will consist of a firm survey and an award survey for each Phase II in the sample. The surveys, which will take about 30-40 minutes to complete, will be conducted online using the same proven site and procedure used in the successful 2005 survey.

The DoD SBIR submission site has identified your firm's POC as [POCName] at [POCemail]. The purpose of this letter is to request your support of this important study. If your POC information has changed, we ask that you have the appropriate person update their contact information at <https://www.dodsbir.net/submission/SignIn.asp> or email the correct POC name/email to jcahill@brtrc.com.

An appropriate POC would often be the CEO, CTO, business manager, or principal investigator. In any case, he/she should be knowledgeable about the products of your SBIR, the efforts to commercialize those products, and your firm's experiences and opinions concerning Fast Track and Phase II Enhancement. The survey will be distributed to your POC by email in the near future.

Please note that survey responses will be confidential. The results are to be aggregated with those of other firms for survey analysis. Your response is, however, very important for the integrity and completeness of the study.

Let me thank you in advance. Your input will be a significant contribution to our understanding and to our recommendations to the Congress for improvements to the program. If you have any questions, please do not hesitate to contact me. Thank you for your cooperation.

Sincerely yours,

Charles W. Wessner, Ph.D.
Director
Technology, Innovation, and Entrepreneurship

SURVEY ANNOUNCEMENT LETTER TO RESPONDENTS TO 2005 SURVEY

Director, Technology, Innovation, and Entrepreneurship
500 Fifth Street, NW
Washington, DC 20001
Phone: 202 334 3801  Fax: 202 334 1813
Email: cwessner@nas.edu

16 March 2006

CEO or President
[Firm]
[Address]
[City], [State] [Zip]

Dear Sir or Madam,

I am writing again to request your further assistance with the study being carried out by the National Academy of Sciences at the request of the Congress to evaluate ways in which the Small Business Innovation Research (SBIR) program could be improved. This request should take only around ten minutes to fulfill.

To carry out the study, the National Academies appointed a distinguished Steering Committee headed by the Honorable Dr. Jacques S. Gansler, former Under Secretary of Defense for Acquisition, Technology and Logistics (for additional information on the study, please see <http://www7.nationalacademies.org/sbir>). As part of its analysis, the Committee commissioned a major survey of awardees that was conducted in 2005. Thank you for the effort that your firm spent on completing those surveys.

I am asking now for your assistance in evaluating two important DoD SBIR initiatives—Fast Track and Phase II Enhancement. The new survey will include all Phase II awards made from 1997 to 2002 that were either Fast Track or Phase II Enhancement, as well as a control group consisting of an equal number of standard Phase II awards. The surveys will consist of a firm survey as well as an award survey for each Phase II in the sample. The surveys will again be conducted online.

Surveys for firms and awards that completed a survey in 2005 will not repeat previously answered questions unless the answer may have changed with the passage of time. Any repeated question will have your 2005 answer filled in; your firm may overwrite it if the answer has changed. As noted, the new questions on each survey should take less than 10 minutes to complete. Surveys on awards not sampled in the 2005 survey will take about 40 minutes.

The POC and email address that DoD has identified for your firm is [POCName] at [POCemail]. The purpose of this letter is to request your support of this important study. If your POC information has changed, please have the appropriate person update their contact information at <https://www.dodsbir.net/submission/SignIn.asp> or email the correct POC name/email to jcahill@brtrc.com.

An appropriate POC would often be the CEO, CTO, business manager, or principal investigator. In any case, he/she should be knowledgeable about the products of your SBIR, the efforts to commercialize those products, and your firm's experiences and opinions concerning Fast Track and Phase II Enhancement. The survey will be distributed to your POC by email in the near future.

Please note that survey responses will be confidential. The results are to be aggregated with those of other firms for survey analysis. Your response is, however, very important for the integrity and completeness of the study.

Let me thank you in advance. Your input will be a significant contribution to our understanding and to our recommendations to the Congress for improvements to the program. If you have any questions, please do not hesitate to contact me. Thank you for your cooperation.

Sincerely yours,

Charles W. Wessner, Ph.D.
Director
Technology, Innovation, and Entrepreneurship

TABLE App-A-1 Surveyed Phase II Awards

Award Category                              Contacted   Could Not Contact   Completed Survey   Overall Response   Contacted Response
                                             (Number)            (Number)           (Number)           Rate (%)             Rate (%)
Fast Track                                        156                  61                 50                 23                   32
Phase II Enhancement                              198                  12                 69                 33                   35
Both Fast Track and Phase II Enhancement           32                   2                 14                 41                   44
Control Group                                     331                  45                107                 28                   32

TABLE App-A-2 Firm Contact Status

Email Status                   Number of Firms   Percent of Firms
Correct Email                              513                 81
No Email                                    24                  4
Bounced Email                               60                 10
Updated with Correct Email                  34                  5
Totals                                     631                100
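The two rates in Table App-A-1 follow directly from the counts: the contacted response rate divides completed surveys by firms actually contacted, while the overall rate also counts those that could not be contacted. Recomputing from the table's figures as a consistency check:

```python
# Counts from Table App-A-1: (contacted, could-not-contact, completed).
rows = {
    "Fast Track":                          (156, 61, 50),
    "Phase II Enhancement":                (198, 12, 69),
    "Both Fast Track and Enhancement":     (32, 2, 14),
    "Control Group":                       (331, 45, 107),
}

for category, (contacted, no_contact, completed) in rows.items():
    overall = round(100 * completed / (contacted + no_contact))
    contact_rate = round(100 * completed / contacted)
    print(f"{category}: overall {overall}%, contacted {contact_rate}%")
# Reproduces the table's 23/32, 33/35, 41/44, and 28/32 percent figures.
```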

TABLE App-A-3 Overall Characteristics of the 631 Sampled Firms

Year Founded          Percent of Firms
Prior to 1983                       19
1983 to 1992                        36
1993 to 1996                        24
1997 to 1999                        17
After 1999                           4

Number of Phase II Awards   Percent of Firms
>100                                       1
51 to 100                                  3
26 to 50                                   9
11 to 25                                  13
6 to 10                                   22
2 to 5                                    37
1                                         15

Current Number of Employees   Percent of Firms
>200                                         5
51 to 200                                   15
21 to 50                                    17
6 to 20                                     35
0 to 5                                      28

Firm Location   Percent of Firms
CA                            21
MA                            10
VA                             9
OH                             6
PA                             5
NY                             5
MD                             5
CO                             4
NJ                             4
TX                             3
FL                             3
All Others                    25

