in the random and firm samples described above, only 95 Phase IIs were added in this fashion.

  • Coding. The project database tracks the sample or samples to which each surveyed project belongs. For example, a randomly sampled project from a firm with only two awards could also be a top performer; its response could then be analyzed as part of the random sample for the program, the random sample for the awarding agency, the top-performer sample, and the sample of single or double winners. The database also allows the responses to be examined against an array of potential explanatory and demographic variables.
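The multiple-membership coding described above can be pictured as tagging each response with every analytic sample it belongs to. The following is a minimal sketch, not the study's actual database schema; all names (`ProjectResponse`, the tag strings) are hypothetical:

```python
# Hypothetical sketch of multiple-sample coding (not the study's actual
# database schema): each surveyed project carries a set of sample tags,
# so one response can be counted in several analytic samples at once.
from dataclasses import dataclass, field

@dataclass
class ProjectResponse:
    project_id: str
    agency: str
    sample_tags: set = field(default_factory=set)

# A randomly drawn project from a two-award firm that is also a top performer:
resp = ProjectResponse("P-001", "DoD")
resp.sample_tags |= {"random_program", "random_agency_DoD",
                     "top_performer", "single_or_double_winner"}

def in_sample(responses, tag):
    """Select every response belonging to a given analytic sample."""
    return [r for r in responses if tag in r.sample_tags]

# The same response surfaces under each of its samples.
print(len(in_sample([resp], "top_performer")))   # -> 1
print(len(in_sample([resp], "random_program")))  # -> 1
```

The point of the design is that a single response is stored once but queried many ways, rather than being duplicated into each sample.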

  • Total number of surveys. The approach described above generated a sample of 6,410 project surveys and 4,085 firm surveys—an average of 1.6 project surveys per firm. Each firm receiving at least one project survey also received a firm survey. Although this approach sampled more than 57 percent of the awards, multiple-award winners were, on average, asked to respond to surveys covering about 20 percent of their projects.
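The reported per-firm average follows directly from the two totals above; a quick arithmetic check, using only the figures reported here:

```python
# Consistency check on the sampling figures reported above:
# 6,410 project surveys distributed across 4,085 surveyed firms.
project_surveys = 6410
firms_surveyed = 4085  # each of these firms also received one firm survey

avg_per_firm = project_surveys / firms_surveyed
print(round(avg_per_firm, 1))  # -> 1.6
```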

Administration of the Survey

The questionnaire drew extensively from the one used in the 1999 National Research Council assessment of SBIR at the Department of Defense, The Small Business Innovation Research Program: An Assessment of the Department of Defense Fast Track Initiative.1 That questionnaire in turn built upon the questionnaire for the 1991 GAO SBIR study. Twenty-four of the 29 questions from the earlier NRC study were incorporated. The researchers added 24 new questions intended to capture both the commercial and noncommercial aspects of SBIR, including impacts on the knowledge base, and to gain insight into the impacts of program management. Potential questions were discussed with each agency, and agency input was considered. In deciding which questions to include, the research team also considered which issues were better examined through the case studies and other research methodologies. Many of the resulting 33 Phase II Award Survey questions and 15 Firm Survey questions had multiple parts.

The surveys were administered online, using a Web server. The formatting, encoding, and administration of the survey were subcontracted to BRTRC, Inc., of Fairfax, VA.

There are many advantages to online surveys, including cost, speed, and possibly response rates. Response rates become clear quickly and can rapidly indicate where follow-up with nonrespondents is needed. Hyperlinks provide amplifying information, and built-in quality checks control the internal consistency of the responses. Finally, online surveys allow dynamic branching of question sets,


1 National Research Council, The Small Business Innovation Research Program: An Assessment of the Department of Defense Fast Track Initiative, Charles W. Wessner, ed., Washington, DC: National Academy Press, 2000.

Copyright © National Academy of Sciences. All rights reserved.