7. Methodology Development: Primary Research
Pages 26-33

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 26...
... The approach adopted is to select the methodological elements best suited to complement and supplement existing information. The study objectives will be realized using the most efficient combination of methods.59 These include analyzing existing studies and databases, interviewing program officials, surveying various program and technical managers and project participants, carrying out case studies, using control groups and counterfactual approaches to isolate the effects of the SBIR program, and applying other methods such as econometric, sociometric, and bibliometric analysis.
From page 27...
... Program manager survey: The program manager survey will focus on strategic management issues and on managers' views of the program. It will be designed to capture senior agency views on the operation of the SBIR program, focusing on concerns such as funding amounts and flexibility, outreach, topic development, top-level agency support for SBIR, and evaluation strategies.
From page 28...
... The DoD commercialization database contains information on approximately 75 percent of DoD Phase II awards from 1992 to 2000, 67 percent of NASA and DoE awards, 54 percent of NSF awards, and 16 percent of NIH/HHS awards. DoE has provided commercialization data by product; these data cannot be directly associated with individual projects, as doing so could lead to double counting of awards to firms.
From page 29...
... Stewardship: Substantial time and effort will be devoted to following up the survey with phone calls to non-respondents and to those who provide incomplete information. While the NRC study expects a significant response rate, based on the same techniques that have proved successful in the past, it is inherently difficult to predict the precise response rate in advance.
From page 30...
... [Figure: survey roadmap diagram with yes/no branches. The machine-read text on this page is garbled; legible fragments include "7. Non-SBIR Comparison," "Outcomes," and "8. Roadmap."]
From page 31...
... , and such surveys can now be created at minimal cost using third-party services. Response rates become clear fairly quickly and can rapidly indicate where follow-up with non-respondents is needed.
From page 32...
... While Wallsten's paper has the virtue of being one of the first attempts to assess the impact of SBIR, Josh Lerner questions whether a regression framework assessing the marginal impact of public funding on private research spending is the most appropriate tool for evaluating public efforts to assist small high-technology firms. He points out that "it may well be rational for a firm not to increase its rate of spending, but rather to use the funds to prolong the time before it needs to seek additional capital." Lerner suggests that "to interpret such a short run reduction in other research spending as a negative signal is very problematic." See Lerner, "Public Venture Capital: Rationales and Evaluation," in The Small Business Innovation Research Program: Challenges and Opportunities, op.
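
To make the point at issue concrete, the following is a minimal sketch, not drawn from the report, of the kind of regression framework being debated here: a firm's private R&D spending regressed on the public (SBIR) funding it received, with a control for firm size. The variable names, synthetic data, and use of statsmodels are illustrative assumptions only.

import numpy as np
import statsmodels.api as sm

# Synthetic firm-level data; the coefficient of interest is on sbir_award.
rng = np.random.default_rng(0)
n = 500
firm_size = rng.lognormal(mean=3.0, sigma=1.0, size=n)                      # control variable
sbir_award = rng.binomial(1, 0.3, size=n) * rng.uniform(0.1, 1.0, size=n)   # public funding received
private_rd = 0.2 * firm_size + 0.0 * sbir_award + rng.normal(0.0, 1.0, size=n)

# OLS of private R&D spending on SBIR funding, controlling for firm size.
X = sm.add_constant(np.column_stack([sbir_award, firm_size]))
result = sm.OLS(private_rd, X).fit()
print(result.params)   # [constant, sbir_award, firm_size]

# In this framework a near-zero coefficient on sbir_award is read as public money
# not raising private R&D at the margin -- the short-run signal that Lerner argues
# can be misleading for capital-constrained small firms.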
From page 33...
... Additionality tests can be strengthened by using statistical tools and econometric techniques to help rule out other causes. The comparison with what program participants would have done without the program is usually ascertained through interviews or surveys, using what are called "counterfactual questions." Counterfactual questions, for example, have been used in a variety of ATP surveys.73 They have also been used in ATP case studies to help estimate project impacts.74 Use of a control group entails comparing a program group with a comparable group that did not participate in the program.
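
As an illustration of the control-group comparison described above, here is a minimal sketch under stated assumptions: synthetic outcome data, an arbitrary outcome measure, and SciPy's Welch t-test. It is not the study's actual estimator or data.

import numpy as np
from scipy import stats

# Synthetic outcomes for SBIR awardees and a matched comparison group;
# both the data and the outcome measure are illustrative assumptions.
rng = np.random.default_rng(1)
program_group = rng.lognormal(mean=1.2, sigma=0.8, size=200)   # participants
control_group = rng.lognormal(mean=1.0, sigma=0.8, size=200)   # non-participants

# Simplest additionality estimate: the difference in mean outcomes,
# with a Welch t-test to gauge whether the gap could be due to chance.
difference = program_group.mean() - control_group.mean()
t_stat, p_value = stats.ttest_ind(program_group, control_group, equal_var=False)
print(f"difference in mean outcomes: {difference:.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# In practice the comparison group must be constructed carefully (matched on firm
# size, sector, prior awards, and so on) so that the difference can be attributed
# to the program rather than to selection effects.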

