
8
Survey Management and Administrative Issues

The programs of the National Science Foundation’s (NSF’s) Science Resources Statistics Division (SRS) are more important than their relatively modest size and rather specialized audience would suggest. In a way, these programs may be seen as a harbinger of the model of a statistical agency that many are propounding these days. That model envisions a federal statistical agency with a small staff of federal employees consisting primarily of subject-matter experts, supported by a limited number of specialists in survey management, survey design, the cognitive sciences, information technologies, and data analysis. In this model, survey operations are outsourced to other organizations that have a comparative advantage due to their size and concentration. The agency funds and otherwise manages the program and communicates its intentions by promulgating guidelines and standards and specifying performance in contracts.

SRS does not conduct its own statistical collections. Although other statistical agencies in the federal statistical system also outsource their data collection programs, SRS may be seen as a testbed for management of statistical programs in a nonstandard, resource-constrained environment. With a skeletal staff, SRS is expected to meet the accepted standards of any federal statistical agency. Those standards pertain to a list of practices that include continual development of more useful data, openness about the data provided, wide dissemination of data, cooperation with data users, fair treatment of data providers, commitment to quality and professional practice, an active research program, professional advancement of staff, and coordination and cooperation with other statistical agencies (National Research Council, 2001b). The challenges it faces in meeting several of these standards with regard to the R&D statistics programs raise a cautionary flag to those who may propose that statistical agencies can thrive without conducting their own data collections.

INDEPENDENCE

To meet the needs of decision makers and other users of statistical data, a statistical agency must be able to provide objective, reliable information that may be used to evaluate programs and policies. To do so, the agency must have a widely acknowledged reputation for independence, which engenders trust in the accuracy and objectivity of the data. Unlike in many countries, where the independence of the statistical agency is organizationally ensured by virtue of the placement of that agency at a cabinet or subcabinet level, statistical agencies in the United States are part of mission-oriented departments and agencies. Within these departments, the independence of the statistical agency derives from its distinction from the parts of the department that carry out enforcement or policy-making activities.

The SRS division is fairly deeply embedded inside NSF, given that the agency has a fairly flat structure for a government agency (see Figure 8-1). Since 1991, SRS has been one of four divisions in the Directorate for Social, Behavioral, and Economic Sciences (see Figure 8-2). This directorate is headed by an assistant director of NSF and has responsibility for two mission-oriented divisions as well as SRS and the Office of International Science and Engineering.

FIGURE 8-1 National Science Board organization.

FIGURE 8-2 Directorate for Social, Behavioral, and Economic Sciences.

The mission of SRS extends beyond the mission responsibility of social, behavioral, and economic sciences. The division informs the operations of the Office of the Director and the Directorate for Social, Behavioral, and Economic Sciences. SRS also serves as an appendage of the National Science Board (NSB), certainly with regard to the preparation of Science and Engineering Indicators, and ultimately with regard to many other aspects of the mission of the NSB. However, the NSB does not have direct reporting units, instead conducting its activities through the National Science Foundation organization (see Figure 8-1).

There is no indication that this structure has had a negative influence on the independence of SRS. Within this structure, SRS has authority, if not autonomy, over decisions on the scope, content, and frequency of the data compiled, as well as for the selection and promotion of professional, technical, and operational staff.

The current reporting structure generates challenges in the area of resourcing. The SRS division budget is not separately identified as it winds its way through the directorate to NSF, to the Office of Management and Budget (OMB), and to Congress; hence it competes, often unfavorably, with the important mission areas of the directorate and NSF. The separate reporting of the budget of the SRS Division in the OMB compilation of statistical programs of the U.S. government is in a sense an artifact, in that the data on SRS do not derive from a line in the administration’s budget (U.S. Office of Management and Budget, 2003c). This lack of definition in the budget process can work to the detriment of an agency with programs that serve the needs of organizations and activities outside the home base of that agency, some of which derive directly from the interests of congressional committees.

The panel could find no compelling reason to suggest that SRS be relocated organizationally within NSF. However, we have the sense that an elevation of the visibility of the resource base for SRS would be a positive step and would serve to direct attention to the needs of the programs for sustainment and improvement (Conclusion 8.1). The recent elevation of the budget of the Office of International Science and Engineering, which has its own advisory committee and is only tangentially connected to the Directorate for Social, Behavioral, and Economic Sciences (SBE), can serve as a model. The budget for this organization has been identified as a separate line in the NSF budget.

ORGANIZATION AND STAFFING

SRS is organized functionally, with four major branches, each bearing responsibility for discrete program areas. Two of the branches have responsibility for data collection, analysis, and publication programs—the Research and Development Statistics branch has responsibility for programs addressed in this report, and the Science and Engineering (S&E) Education and Human Resources Statistics branch has responsibility for the human resources programs (see Figure 8-3). The two other branches are directed toward internal and external support—Information and Technology Services and Science and Engineering Indicators.

FIGURE 8-3 Organization of the Division of Science Resources Statistics.

The statistical practices of SRS have been strengthened in recent years, as is shown by the issuance of NSF-SRS Quality Guidelines in 2002. However, since SRS contracts out all of its survey work, it is left with a very thin staff. There is little or no opportunity for growing a bench of expertise in the necessary methodological specialties, including mathematical statistics, cognitive sciences, and survey design, such as exists in the larger statistical agencies. This expertise, which can respond to new demands on the data, new methods of collecting and issuing data, and new methodological innovations, must be imported. Similarly, the day-to-day pressures of survey management and the recurring demands of intensive projects, such as the biennial preparation of Science and Engineering Indicators, limit the ability of the division to deploy staff to the task of data analysis. Data analysis requires access to respondents and respondent record-keeping practices, microdata in pure forms prior to aggregation into estimates, and the tools for production of the estimates. Since these tasks are contracted out and the microdata from the Census Bureau are treated in a confidential manner, the division has not been able to develop a needed analytic capability for understanding the changing character of research and development. As a result, it has not been able to develop internal competence in how to revise and review the surveys in the changing environment.

In this regard, the panel reiterates the concern of the previous National Research Council (NRC) report that concluded that SRS lacks the staff to plan and test the methods and instruments necessary to respond to the major needs of data users (National Research Council, 2000). That report recommended three courses of action to alleviate the staffing and skill shortages:

  • augment staff expertise through professional development activities.

  • develop a long-range staffing plan to bring on new hires with skills in statistical methods and data analysis.

  • develop a more interactive relationship with external researchers to provide data and analytical support for the division’s range of professional capabilities.

Although some progress has been made in each of these areas in the past 3 years, particularly with the recruitment of additional expertise in statistical and cognitive methodology, as well as by obtaining the services of nationally recognized outside experts in these fields for consultation and teaching engagements, time and resources have not been sufficient to significantly enhance the division’s data analysis skills.

DATA QUALITY

The panel concludes that SRS has the rudiments of a data quality program and further that SRS has adopted organizational and administrative changes to foster an environment for data quality.

We are hopeful that these recent initiatives, buttressed by additional resources and supplemented by further initiatives such as those outlined in this report, will lay a basis for further improvements in the future.

One important aspect of this quality improvement program is the development and promulgation of the publication Statistical Guidelines for Surveys and Publications in 2002 to set a framework for ensuring data quality. The guidelines are intended to (1) increase the reliability and validity of data, (2) promote common understanding of desired methodology and processes, (3) avoid duplication and promote the efficient transfer of ideas across program areas, and (4) remove ambiguities and inconsistencies.

The standards set goals, and the SRS surveys attempt to meet these goals. Yet there is inconsistent application of the guidelines. The guidelines contain a checklist to evaluate different types of nonsampling errors in the surveys. The use of the checklist in the different surveys is sporadic. Some surveys describe some aspects of survey methodology and omit other aspects. For example, none of the surveys adequately describes all of the aspects of unit nonresponse from the checklist: description of characteristics of units not responding, examination of changes in response rates over time, intensive follow-up of a sample of nonrespondents to ascertain nonresponse bias, and assessment of the method of handling nonresponse imputation or weighting procedures. Nonresponse is difficult to assess, but it is not impossible.
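As one illustration of the checklist’s unit-nonresponse items, here is a minimal sketch of comparing frame characteristics of respondents and nonrespondents, a standard first check for nonresponse bias. The frame fields and values are invented for illustration, not drawn from any SRS survey.

```python
# A minimal sketch of one checklist item: comparing the characteristics of
# responding and nonresponding units to gauge potential nonresponse bias.
# The frame records below are illustrative assumptions.

frame = [
    {"responded": True,  "employment": 12000},
    {"responded": True,  "employment": 300},
    {"responded": False, "employment": 45000},
    {"responded": False, "employment": 150},
]

def mean_employment(units):
    return sum(u["employment"] for u in units) / len(units)

respondents = [u for u in frame if u["responded"]]
nonrespondents = [u for u in frame if not u["responded"]]

# A large gap between the two means suggests response propensity is related
# to unit size, a warning sign for nonresponse bias in unadjusted estimates.
print(mean_employment(respondents), mean_employment(nonrespondents))
```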

Because SRS has an interagency agreement with the Census Bureau to carry out the industrial survey of R&D, there are quality components written into the contract. The two tasks that are most pertinent to data quality are: (1) to prepare and maintain survey methodology and technical documentation and provide this information in a comprehensive methodology report and (2) to recommend quality improvements and methodological enhancements resulting from information gained in the previous survey cycle.

The panel notes that the interagency agreement with the Census Bureau covering survey operations for calendar year 2004 requires that the Census Bureau prepare and maintain complete and detailed survey methodology and technical documentation, and further that it recommend quality improvements and methodological enhancements resulting from information gained during previous survey sampling and processing.

The panel is concerned that, although the Census Bureau does provide a detailed methodological report every year, the information is quite uneven. Processing procedures, including complete editing procedures, are not fully described. The role of the data analyst in changing or adding data is not discussed. Imputation procedures are mentioned, but item imputation rates are not provided. Other important areas, such as the sample design, are fully documented and set the standard for making necessary improvements in overall documentation.

Census Bureau and SRS staff work together to identify areas of the industrial survey that need improvement. One recent, ongoing collaboration, noted in Chapter 3, concerns the provision of more stable estimates of state R&D. However, there are other areas that need improvement, such as a better understanding of the respondents and their difficulties in responding. Several recommendations made over a decade ago by a team of cognitive researchers at the Census Bureau have not been addressed. With OMB urging the development of a web-based questionnaire, an opportunity exists to do cognitive testing and to codify processing and editing rules.

The panel urges SRS to take the lead in the work on the industrial survey, using the tools of the interagency agreement, the oversight of a high-quality methodological staff, and the input of highly qualified outside experts. This lead role would be undertaken while working collaboratively with the Census Bureau (Recommendation 8.1).

In addition to strengthening the overall quality and relevance of the survey, a more active lead management role by NSF will certainly enhance the subject matter and statistical expertise of the NSF staff, as well as make them more mindful of the strengths and weaknesses of the data. This model, in which the sponsoring agency plays an active role in decision making and directing, while leaving day-to-day survey management to the experts at the Census Bureau, is employed in various degrees in the relationship of the Census Bureau with other sponsoring agencies, most notably, the Bureau of Labor Statistics, with positive results for the surveys.

ANALYSIS

Currently, SRS does almost no analysis except for the biennial input to Science and Engineering Indicators. The detailed reports are issued with many statistical tables, and the InfoBrief series, designed to release the data quickly, is issued with few. Most of the analysis in the abbreviated publications consists of table reading.

The panel observes that the survey data are almost unexplored because there is a lack of staff with the expertise and the time to look at the data intensively, explore issues raised by users, juxtapose data from different sources, and generally learn the strengths and weaknesses of the data in a substantive setting. There are no publications of high caliber in which staff data analysis can be readily published.

Lacking analysis, SRS loses an opportunity to learn more about the methodological weaknesses of the data. The methodology-analysis cycle is one of the strongest means of improving survey data. Until an agency uses its own data extensively, beyond descriptive statistics, it does not really understand the gaps and needs in its collection system. The panel recommends that SRS engage in research on substantive issues that would lead to improvements in the design and operation of data collection and to a fuller understanding of the data limitations. The panel further recommends that, over time, SRS develop both the internal staff capacity in data analysis and a suitable vehicle or vehicles for professional publication of data analysis by both internal staff and outside experts.

MEASURES OF EFFECTIVENESS

The discussion of the strengths and weaknesses of SRS has so far focused on the uniqueness of the agency, the organizational structure, and the principles and practices that must be in place for SRS to fulfill generally accepted minimum requisites of a federal statistical agency. These factors are inputs to a quality product. Several measures exist to judge performance, or the output of the agency. Among these are timeliness, relevance, and balance. Each of these output measures is discussed in turn.

Timeliness

It is axiomatic that data are more useful in a current than a historic context, and they are more useful when issued close to the reference period for the survey than some time after it. One of the most frequent complaints of potential data users about the R&D data is that the data are released too late. As a result, users rely on data that are less comprehensive, of lower quality, and, frequently, quite old when the analysis is performed.

The issue of timeliness of the data has been raised in several critiques of the R&D data over the years. The previous NRC study in 2000 concluded that “SRS must substantially reduce the period of time between the reference date and data release date for each of its surveys to improve the relevance and usefulness of the data” (National Research Council, 2000:5).

If anything, the recent trend has been to lengthen rather than shorten the time from reference period to publication. For example, the data from the Survey of Industrial Research and Development for calendar year 2000, which had a December 31, 2000, reference period, were not available in their full publication detail until June 2003, some 30 months after the reference period. A partial release of six tables was made in February 2002, and the InfoBrief, with some narrative table reading, was available in April 2002.

This 30-month period contrasts unfavorably with the timeliness experience for the industrial R&D survey as reported by SRS for earlier years, following the large-scale redesign of the survey between 1991 and 1992 (months from reference period to publication of full detail, by reference calendar year):

Reference year:               1993  1994  1995  1996  1997  2000
Months to full publication:     19    17    23    15    13    30
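The lag figures above are simple month counts from the reference period to the release date. A minimal sketch of the arithmetic, using the dates for the 2000 survey cited above:

```python
from datetime import date

def months_to_release(reference: date, release: date) -> int:
    """Whole calendar months from the survey reference period to data release."""
    return (release.year - reference.year) * 12 + (release.month - reference.month)

# The 2000 industrial survey: December 31, 2000, reference period;
# full publication detail appeared in June 2003.
print(months_to_release(date(2000, 12, 31), date(2003, 6, 1)))  # 30
```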

The data for the past few years for all of the major SRS surveys show that things have become worse, not better, since the previous NRC panel recommended improvement in timeliness. Table 8-1 traces the timeliness of the release process, as each R&D expenditure survey and the national patterns report moved from the reference period, to date of early release (month/year of posting on the web), to InfoBrief posting on the web, and to the posting/publication of the full set of detailed statistical tables for the past 4 years.

For every survey, there has been slippage and, in most cases, at least 2 years of slippage. This lack of timeliness adds to the underutilization of the data.

Some of the delay is in the processing and provision of the data by the Census Bureau and the contractor for the web-based surveys. The Census Bureau has recently set a goal to release the industry survey by the end of the calendar year in which the survey questionnaire was distributed. For example, it is investigating processing changes so that 2004 data are released to NSF by the end of calendar year 2005. NSF would still be responsible for assembling the data release to the public.

The industry survey is the largest of the R&D surveys. If it can improve timeliness by earlier mail-out and increased automation of the disclosure review, perhaps the four smaller web-based surveys can also find ways to accelerate data release.

Relevance of Program Content

The relevance and utility of the data are expected to increase simply by their earlier provision. Still, there are major substantive gaps in the R&D program that have been discussed throughout this report.

TABLE 8-1 Survey Release Schedules

Reference Period | Early Release Tables | InfoBrief | Full Set of Tables

Survey of Industrial Research and Development
Dec-99 | Jan-01 | May-01 | Apr-02
Dec-00 | Jan-02 | Dec-02 | Jun-03
Dec-01 | NA | Jun-03 | Dec-03
Dec-02 | NA | Jun-04 | NA

Survey of Federal Funds for Research and Development
Sep-99 | Dec-00 | Mar-01 | Jul-01
Sep-00 | NA | Feb-02 | Jun-02
Sep-01 | NA | Jun-03 | Apr-04
Sep-02 | NA | May-04 | NA

Survey of Federal Science and Engineering Support for Universities, Colleges, and Nonprofit Institutions
Sep-99 | Jan-01 | Mar-01 | Apr-01
Sep-00 | Feb-02 | Feb-02 | May-02
Sep-01 | NA | Mar-03 | May-03
Sep-02 | NA | May-04 | NA

Survey of Research and Development Expenditures at Universities and Colleges
Oct-99 | Dec-00 | NA | Jul-01
Oct-00 | Dec-01 | NA | Feb-02
Oct-01 | NA | Aug-03 | Apr-03
Oct-02 | NA | May-04 | Jul-04

Survey of Science and Engineering Research Facilities
Sep-99 | Feb-01 | NA | Jul-01
Jun-01 | Jan-02 | NA | Mar-02
Sep-03 | NA | NA | NA

National Patterns of R&D
1999 | NA | Oct-99 | Nov-99
2000 | Nov-00 | Nov-00 | Mar-01
2002 | NA | Dec-02 | Mar-03
2003 | NA | Mar-04 | Aug-04

NOTE: If the milestones do not apply (e.g., there were no early release tables, there was no InfoBrief written, or the full set of tables were not published), NA is noted.

At the present time, there are many unanswered questions. For example, what is the impact of academic research, both nationally and regionally? How do we measure innovation’s contribution to economic growth? What is the entire globalization picture? There is some information on research in the United States funded by companies abroad and some information about research abroad funded by the United States, but there is no overall picture of the entire process. Similarly, there is no accounting for R&D done in interdisciplinary centers, and interdisciplinary work is a growing trend.

In addition to the gaps in the substance of the program, there are gaps in the methodological development of the surveys. Four of the five surveys discussed in this report are done primarily on the web. Yet very little, if any, methodological work has been done to find out how paper and web data collection can support each other. Except for the two federal surveys, the other surveys all use imputation. It is handled in very different ways, with no one comparing and suggesting alternative methods, and no one measuring the impact on the estimates and, when appropriate, their variances.
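To illustrate the kind of comparison the panel has in mind, the sketch below applies two common imputation methods to the same item and shows that the published estimate depends on the choice. The values and methods are illustrative assumptions, not those used in any NSF survey.

```python
# A minimal sketch: two common imputation methods for the same missing item,
# and their differing effect on the published total. All values are invented.

reported = {"A": 10.0, "B": 30.0, "C": None, "D": 50.0}   # current year, C missing
prior    = {"A": 9.0,  "B": 28.0, "C": 20.0, "D": 45.0}   # prior year, all reported

# Method 1: mean imputation from current-year respondents.
obs = [v for v in reported.values() if v is not None]
mean_imputed = dict(reported, C=sum(obs) / len(obs))

# Method 2: ratio imputation, carrying the unit's prior-year value forward
# at the average year-over-year growth observed among respondents.
growth = sum(reported[k] / prior[k] for k in "ABD") / 3
ratio_imputed = dict(reported, C=prior["C"] * growth)

print(sum(mean_imputed.values()))   # total under mean imputation
print(sum(ratio_imputed.values()))  # total under ratio imputation (differs)
```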

The industrial R&D survey is the most complex of the five surveys, as well as the most costly; in 2003, its cost was $1.2 million. For a survey of this importance, it does not get the attention it needs from either SRS or the Census Bureau, which conducts it. For the Census Bureau, it is one of the smaller surveys it conducts, and for that reason it is used as a training ground for new employees, who then graduate to bigger surveys. For staff who do not move on, there is little incentive to suggest changes or improvements. The Census Bureau processes this survey much as it does the other manufacturing surveys it conducts. In the case of the R&D survey, this means a mail-out of the questionnaire to a company with no designated respondent, no means of compiling a response rate that reflects how many companies reported data, no information on item nonresponse, and an obscure editing process that is not documented. SRS does not have the staff it needs to manage the Census Bureau staff well and to insist that the survey be done to meet well-accepted statistical standards.

Balance

The panel has observed that SRS, in the press to produce its surveys, cannot devote sufficient attention to broader concerns of data quality and coverage. To maintain high standards of statistical quality and keep abreast of important trends in R&D, SRS should align its personnel so that internal experts can provide ongoing guidance on those matters without any direct responsibility for administering a particular survey.

Currently, many respondents do not seem to understand the definitions of the various items, and SRS staff do not seem to understand what is being reported. Each survey should have at least one staff person who could discuss the items and definitions with respondents. This should be a proactive measure. In the press of production work, there is no offsetting balance for dealing with respondent concerns and thereby ensuring that the data collected really reflect current R&D status.

Because the R&D environment is changing so rapidly, there is an ongoing need to keep abreast of users’ needs. Workshops should focus on users’ needs and types of questions. There should be a staff person whose job it is to assimilate the information provided in workshops, from users, from a proposed advisory committee for the industrial survey, and from groups in other countries who work on similar surveys. At the present time, all effort is focused on production; there is very little thought given to improvements for the future.

In order to keep the R&D program relevant, both programmatically and methodologically, there should be mechanisms for proposing studies, analyzing research, and commissioning studies to address specific suggestions for problem areas. The constitution of a methods test panel for testing new methodology, new questions, and new technology could be of great value to SRS. However, someone needs to decide on the focus of such a panel—to analyze data, to work with respondents, to advocate change. Such a person or persons cannot be running a survey with its own pressing deadlines.

NSF should appoint two high-level experts with the following responsibilities: to help create a balance between ongoing effectiveness in issuing survey data and the need for tracking important trends in research and technology that might affect the respondent bases of surveys, to work with individual surveys on adapting to new requirements, and to serve as liaisons with outside experts, while addressing some of the long-term statistical issues at SRS. One should have expertise in statistics and one in economics. These people should not be saddled with administrative chores. Instead they should be devoted to innovation and exploratory work on the R&D surveys. Each of them should work on analyzing the data, one with a focus on meeting user needs, and one with a focus on providing the best methodology for the surveys.

TOOLS TO IMPROVE DATA QUALITY

Quality starts with an organizational commitment and is buttressed by professional standards of practice. These are two recognized elements that help define an effective statistical agency (National Research Council, 2001b). The commitment should involve several key elements: use of modern statistical theory and sound statistical practice, strong staff expertise, an understanding of the validity and accuracy of the data, and quality assurance programs that include documentation of concepts and definitions, as well as data collection methodologies and measures of uncertainty and possible sources of error. In view of the importance of the overall environment for quality, early in our work we reviewed these elements of organizational and management commitment to quality.

As a benchmark, the panel turned to the earlier NRC study Measuring the Science and Engineering Enterprise: Priorities for the Division of Science Resources Studies (National Research Council, 2000) and evaluated the progress that SRS has made in implementing that report’s recommendations for quality improvement. This earlier report found that SRS had a good track record of improving data quality and meeting statistical standards in the recent past, and it recommended additional steps to ensure that standards are met across SRS operations. With respect to the R&D expenditure surveys, the previous NRC report recommended three specific steps to improve data quality: (1) require all contractors to provide methods reports that address quality standards, (2) continue recent efforts to provide NSF staff with professional development opportunities to improve statistical skills, and (3) continue to develop and strengthen a program of methodological research, undertaking rigorous analysis of the data collected to assess the quality of the data relative to the concepts they are supposed to measure.

Since the 2000 NRC report was published, SRS has implemented several specific improvements and has laid the groundwork for others. The change of name to the Division of Science Resources Statistics is one indication of the commitment of management to improving the statistical foundation of the R&D data. Other specific actions that have improved or will lead to improvements include the addition of mathematical statistics expertise to the SRS senior staff, the incorporation of specific contractual obligations for measuring and presenting measures of quality in contracts with the data collection organizations responsible for conducting the several survey programs, and development of a long-term plan for methodological research for each program.

Mathematical Statistics Expertise

The 2000 NRC report focused considerable attention on perceived staffing issues in SRS, calling for enhancing staff expertise through professional development activities, augmenting staff expertise in statistical methods and subject-matter skills, and developing a more interactive relationship with outside researchers. Over the past 2 years, SRS has added four full-time senior statisticians and has secured the services of experts in survey design and cognitive aspects of questionnaire design. These staff and consultant enhancements have served to focus attention on statistical methods for the R&D surveys.

Long-Term Plan for Methodological Research

The multiyear plan for methodological research on the survey of R&D in industry is perhaps the most advanced of these efforts to specify a roadmap for near-term and long-term statistical improvements. In a presentation at the panel’s July 2003 workshop, Ray Wolfe of the SRS research and development statistics program outlined a research agenda and plan of action for the R&D industrial survey. The plan is specified in the SRS Quality Improvement and Research Project List. The list was developed with input from, among others, the OMB clearance process, the SRS staff, the NRC’s Committee on National Statistics and the Board on Science, Technology, and Economic Policy, and SRS advisory committees.

The Quality Improvement and Research Project List that was current at the time of the panel’s workshop included 30 items, 9 of which were presented in some detail to the panel. Three of the research items were designed to address OMB clearance conditions: record-keeping and administrative records practices, effects of mandatory versus voluntary reporting on the state items, and web-based survey operations. Other items on the list of nine key activities include research on data collection below the company level, cognitive research on survey forms and instructions and identification of appropriate respondents, survey sample design issues, survey processing (data editing, cleaning, and imputation), research on disclosure and confidentiality issues, and an electronic documentation system for respondent contacts. Each of these research agenda items is discussed in this report.

In particular, the panel encourages SRS to develop plans to review and resolve several cross-cutting issues of statistical methodology. These issues are web-based collection, provision of prior-year data to survey respondents, the designation of respondents, and nonresponse adjustment and imputation.

Web-Based Collection

The four surveys of federal agencies and of colleges and universities rely mainly on the Internet to support data collection. (A few respondents are not able or willing to respond electronically, so a manual system is also maintained.)

Many of the traditional chores of survey operations are eliminated in web-based collection. The survey is delivered electronically, rather than in paper format through the mail. The respondent completes an electronic survey form. Follow-up is by e-mail or telephone rather than postal mail. Logs of the progress of the agencies and institutions in reporting are generated by the system electronically. The web-based questionnaire enables embedded edits that question the entry of erroneous data, so that respondents can check, repair, and explain their entries, and the data are tabulated by the web-based system. In this family of surveys, the target population is small, and the processing environment is simplified in that there is no sample, no nonresponse adjustment by weighting, and no other weighting.
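To make the embedded-edit idea concrete, here is a minimal sketch of the kind of consistency check such a questionnaire might run before accepting an entry. The field names, the rules, and the tolerance are illustrative assumptions, not NSF’s or its contractors’ actual edit specifications.

```python
# A minimal sketch of an "embedded edit": the web form questions an
# implausible or inconsistent entry before accepting it. Field names
# and the tolerance below are illustrative assumptions.

def check_entry(total_rd: float, federal_rd: float, company_rd: float,
                tolerance: float = 0.01) -> list[str]:
    """Return edit messages the respondent must resolve or explain."""
    messages = []
    if federal_rd > total_rd:
        messages.append("Federally funded R&D exceeds total R&D.")
    if abs((federal_rd + company_rd) - total_rd) > tolerance * max(total_rd, 1.0):
        messages.append("Components do not sum to total R&D; check or explain.")
    return messages

# An inconsistent entry triggers both edits.
print(check_entry(total_rd=1000.0, federal_rd=1200.0, company_rd=300.0))
```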

The very newness of the web-based system and the employment of automated functions raise several red flags in terms of data quality. For one thing, the system can force respondents to provide data that they are unsure about, simply to complete the entry. In this sense, it forces imputation by the respondent, rather than by NSF. In more traditional survey approaches, the failure to complete a data item may signal a problem with questionnaire design, concepts, or definitions. These useful signals are masked by self-imputation. Another consequence of forcing an entry is that traditional measures of nonsampling error, such as item nonresponse and imputation rates, may be artificially low; statements in NSF publications that “item nonresponse is not viewed as a problem” because item nonresponse rates are under 1 percent may be quite misleading as a measure of overall quality.

Against this backdrop, OMB has challenged NSF to explore the possibility of web-based collection in the larger and more stylized Survey of Industrial Research and Development. The terms of clearance of the survey, when it was last approved by OMB in January 2002, require that “NSF will initiate a web-based version of this survey by the next submission of this clearance request. If NSF is unable to complete a web-based version upon resubmission, NSF must provide justification as to why web-based surveys can not be implemented in this instance, and document the steps being taken to allow full electronic submission of this survey” (U.S. Office of Management and Budget, 2002:2).

NSF and the Census Bureau have introduced several technological initiatives for data collection over the years. For example, firms using the short form (RD-1A) that report no R&D activity can fulfill the reporting requirement by the use of toll-free touch-tone data entry. Nearly 90 percent of such respondents in 1998 and 1999 used the touch-tone system. In addition, the Census Bureau utilizes a computerized self-administered survey system, which allows respondents to complete the survey form electronically and submit it on a disk. Approximately 20 percent of firms that were sent both a paper form and a diskette in 1999 chose to report using the diskette. Although these initiatives offer reduced respondent burden and facilitate data processing at the Census Bureau, they fall short of representing the data collection and processing improvements offered by the kind of web-based collection utilized in other NSF surveys. In essence, the current initiatives represent merely an automation of the old way of doing business.

Clearly, an intensive regime of development and testing of further automation of data collection, processing, and estimation is required. In keeping with the OMB recommendation that research should focus on web-based collection from businesses, a proof-of-concept study was conducted by the Census Bureau in the 1990s (U.S. Census Bureau, 1998). Work on web-based collection should consider the dynamic aspects of questionnaire design and processing discussed in a recent Committee on National Statistics workshop on survey automation (Tourangeau, 2003). Such a study needs to be developed and conducted in SRS. A comprehensive feasibility study for an effective web-based instrument requires both use of interactive contact with individual respondents, as recommended in this report, and a carefully designed field pilot study.

Provision of Prior-Year Data to Survey Respondents

Many of the NSF surveys provide data collected in prior-year responses on the current year collection instruments. The industry survey imprints prior-year data on the RD-1 form and incorporates the information on diskettes forwarded to the companies under the Census Bureau’s computerized self-administered survey option. For the other surveys, prior-year data are available to respondents on request or automatically.

There are potential strengths and drawbacks to presenting historical data for a current reference period. A paper by Holmberg (2002) outlines motives that include, on the positive side, increasing the efficiency of the data collection, correcting previous errors, reducing response burden, and possibly reducing measurement errors and spurious response variability. Arrayed against these advantages are the possible drawbacks that preprinting can conserve errors rather than correct them, and that the practice might lead to underreporting of changes from one period to another. Holmberg’s experimental study of preprinting values on a self-administered questionnaire for business establishments by Statistics Sweden found that questionnaires with preprinted historical values outperformed questionnaires without preprinted values on several measures of data quality. These data quality improvements included (not surprisingly) a significant decrease in response variation in overlapping reference periods, a reduction in spurious reports of change, and less coding error and editing work. He further pointed out that some of the potential drawbacks were not realized (for example, the entry of the same value for different months was more common without preprinting), and there was no indication that change was underestimated with preprinting.

The favorable results evidenced in Holmberg’s research suggest that there is little danger in continuing this practice in the NSF surveys. However, there is the possibility that his results would not be replicated in U.S. surveys conducted with the spectrum of methods used in the NSF surveys. The panel notes that the Census Bureau has initiated testing of the impact of preprinting on its Survey of Residential Alterations and Repairs (personal communication, D.K. Willimack, U.S. Census Bureau, September 2, 2003).

The panel urges NSF to sponsor research into the effect of imprinting prior-period data on the industrial R&D survey in conjunction with testing the introduction of web-based data collection (Recommendation 8.2). The traditional means of testing the impact of differential data collection procedures using randomized split samples and respondent debriefings should be considered for this research.
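A minimal sketch of how such a randomized split-sample assignment might be drawn: half the sample receives forms with preprinted prior-year values and half does not, so the effect of preprinting can be compared across otherwise equivalent groups. The sample identifiers and the fixed seed are invented for illustration.

```python
# A minimal sketch of a randomized split-sample assignment for testing
# preprinted prior-year data. Sample IDs and the seed are illustrative.

import random

sample_ids = [f"company-{i:04d}" for i in range(1, 2001)]

rng = random.Random(2004)           # fixed seed so the assignment is reproducible
rng.shuffle(sample_ids)
half = len(sample_ids) // 2
treatment = set(sample_ids[:half])  # these units receive preprinted forms

assignment = {cid: ("preprint" if cid in treatment else "blank")
              for cid in sample_ids}
print(sum(v == "preprint" for v in assignment.values()))  # 1000
```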

Designation of Respondent

A major challenge in all of the NSF surveys is to ensure that the survey questionnaire will get to the appropriate and knowledgeable (and motivated) respondent. The role of the data provider is central to all of the surveys, and it is handled very differently from survey to survey.

The Survey of Science and Engineering Research Facilities identifies one person as an institutional coordinator. The survey recognizes that this person is critical to the success of this survey, which must be transferred among various parts of the institution. The institutional coordinator decides what others should provide and what sources of data to use, coordinates the pieces of the effort, and then reviews the data. This plan appears to generally work well for this survey.

In sharp contrast, the Survey of Industrial Research and Development is mailed to a company, with no contact person designated. Often the form is sent to a central reporting unit at corporate headquarters, with respondent “generalists” fanning out the collection to various organizations and functions within the corporation. There is little attempt to develop a standing relationship with these key reporters or to educate them on the nuances of the data collection.

The role of the data provider is critically important to the success of a survey. We note that, in response to discussions at a meeting of panel members with NSF and data collection staff in April 2003, it has been proposed that the NSF Quality Improvement and Research Project List research agenda will document respondent contacts for the industry R&D survey.

The panel supports the initiative to identify individual respondents in companies as a first and necessary step toward developing an educational interaction with respondents so as to improve response rates and the quality of responses (Recommendation 8.3).

Nonresponse Adjustment and Imputation

The surveys have very different approaches to nonresponse and imputation. The federal funds and college and university surveys attempt to eliminate the problem of unit nonresponse completely, by seeking 100 percent compliance from the universe of reporters. Item nonresponse is actively discouraged, either by making it difficult not to enter an item in a web-based report, or by encouraging reporters to estimate information when actual data are not available, as is the practice on the federal funds report. Thus there is little unit or item nonresponse in this family of surveys, although, as mentioned earlier, some of the methods of achieving these response rates have a questionable overall impact on data quality, and their impact needs to be investigated in an evaluation study.

The Survey of Industrial Research and Development has fairly significant response challenges. Of the companies surveyed in 2000, about 15 percent did not respond. Moreover, the means of counting and then treating nonresponse in this survey raises statistical issues. The Census Bureau lists a company as responding when it has heard back from the company, even if the company reported that it was out of scope, out of business, or merged with another company. In other words, a response is a form returned, not necessarily information received. The Census Bureau then imputes values to data items representing these nonresponding units. The panel is concerned about this procedure, in that it does not follow acceptable practices for reporting nonresponse, and we are concerned about the impact of this practice on both the reported nonresponse rates and the estimates. The reported values for item nonresponse rates for key data elements in the survey are also quite problematic. Imputation rates are published, but they are a poor proxy for item nonresponse rates.

The panel recommends that NSF, in consultation with its contractors, revise the Statistical Guidelines for Surveys and Publications to set standards for treatment of unit nonresponse and to require the computation of response rates for each item, prior to sample weighting (Recommendation 8.4). When these guidelines have been clearly specified, the panel expects that the Census Bureau and other consultants would adopt these standards. With clear specification by the NSF and adoption by the contracting organizations, information on true unit and item nonresponse can be developed and assessed.
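A minimal sketch of the distinction at issue, with invented records: a returned form counts as a unit response only if the unit is in scope and usable data were received, and item response rates are computed among responding units before any weighting. The fields and the usability rule are illustrative assumptions.

```python
# A minimal sketch: a returned form is not a response unless usable data
# were received, and item response rates are unweighted, per item, among
# in-scope responding units. Records and fields are invented.

records = [
    {"returned": True,  "in_scope": True,  "total_rd": 5.0,  "federal_rd": None},
    {"returned": True,  "in_scope": False, "total_rd": None, "federal_rd": None},  # out of business
    {"returned": False, "in_scope": True,  "total_rd": None, "federal_rd": None},
    {"returned": True,  "in_scope": True,  "total_rd": 2.0,  "federal_rd": 1.0},
]

in_scope = [r for r in records if r["in_scope"]]
unit_rate = sum(r["returned"] and r["total_rd"] is not None
                for r in in_scope) / len(in_scope)

responding = [r for r in in_scope if r["returned"] and r["total_rd"] is not None]
item_rate = {item: sum(r[item] is not None for r in responding) / len(responding)
             for item in ("total_rd", "federal_rd")}

print(unit_rate)   # 2/3: the out-of-scope return is excluded, not counted as a response
print(item_rate)   # federal_rd item response rate is 0.5 among responding units
```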

Survey Documentation

The methodological reports from the various surveys vary widely in their completeness. The SRS guidelines are quite clear on the policy for survey documentation in a methodological report expected at the completion of each survey. There is some evidence that recent methodological reports have improved in terms of depth and focus (see, for example, ORC Macro, 2002). The means of ensuring continued improvement in survey documentation is specified in the SRS guidelines—that is, survey contracts must include a requirement for a methodology report covering items addressed in this standard (National Science Foundation, no date).

Resources

The SRS budget is small for a statistical agency. Only about $3.7 million was allocated for carrying out the R&D expenditure surveys in 2003. None of the surveys has even as many as two full-time-equivalent (FTE) staff on the payroll of NSF. At this level of funding and staffing, it will be difficult, if not impossible, for NSF to implement the program of research and survey improvements recommended in this report.

In keeping with the thrust of the recommendations of this and previous NRC panels, the highest priority in the direction of available resources should be given to redesigning the industrial R&D survey. In particular:

  • Update the questionnaire to sharpen the focus of the survey and fix the problems identified in this report. This work should be conducted in conjunction with a program of research on web-based collection and tested by means of a small methods development panel.

  • Carry out a thorough investigation of sampling for the industrial survey, considering the use of other frames and multiple frames.

  • Research record-keeping practices of companies to determine if line-of-business data could be collected on the industry survey.

Across-the-board improvements will be made only when NSF augments its own internal staff expertise in order to exercise greater control over the content and operations of the surveys—a process that has begun in the agency but on a limited basis. The panel thinks that in order to facilitate the recommendations in this chapter, SRS should take the following steps:

  • Augment the staff with the services of high-level experts who can think ahead for the R&D surveys, filling gaps, improving methodology, and analyzing data. These could be additions to the staff but would probably be drawn from outside (academic) experts in the subject-matter fields.

  • Implement greater control over the conduct of the surveys by rigorous utilization of the contract and cooperative agreement vehicles.

On a longer-term basis, NSF can engage in a deliberate process to redesign, or at least revitalize, all of its surveys on a rotating schedule. Such a program will have many components, some of which are suggested in the discussions of the individual surveys. NSF should fund research on questionnaire design for many of the surveys and on how to make the web design truly useful.

Some of these initiatives can be accomplished within the resource base of NSF, simply requiring a shift in emphasis and direction. Others will require a new infusion of resources into the SRS R&D expenditure programs. The requirement for new funding arises in a budget situation that has seen slow and unsteady growth over time. The panel hopes that the fairly significant increase in budget authority in fiscal year (FY) 2003 can be preserved to enable SRS to make improvements in some of the survey activities. The budget for FY 2004 was about the same as the FY 2003 budget, with a slight increase for methodological work for the industrial R&D survey. We note that the costs of the nonindustry surveys are proportionately much higher. Although the surveys that collect data on federal, academic, and nonprofit R&D are sent to many fewer respondents, the surveys are more complicated than the industry survey, and response rates are generally higher, pushing up costs (see Table 8-2 for detailed expenditure information).

Advisory Committees

An advisory committee with rotating membership gives an agency a method to bring in experts in different areas, as needed, in both substantive and survey methodological fields. An advisory committee can listen to reports, respond to questions about processes, raise questions, suggest new questions, argue about gaps in the data, and make recommendations. At present, SRS has an advisory committee that is a break-out group of the Social, Behavioral, and Economic Sciences Advisory Committee. In view of the lack of a dedicated advisory committee, NSF has sought to obtain advice and guidance on content and structural aspects of its surveys from many sources over the years, some through standing bodies and others on an ad hoc basis. The panel believes that for all surveys, outside advice and consultation are necessary to provide expertise as well as to legitimize the surveys.

The Survey of Industrial Research and Development in particular would benefit from a dedicated, focused, formal, and ongoing external advisory panel knowledgeable about these data and the issues they help inform. In the late 1990s, external survey-relevant advice was obtained using alternative mechanisms. In 1998, NSF provided funding to the NRC’s Science, Technology, and Economic Policy Board to conduct a workshop assessing the utility and policy relevance of government’s data on industrial research (including specifically the industrial R&D survey) and industrial innovation. The purpose of the workshop was to generate suggestions for improving measurement, data collection, and analysis. Workshop participants included statisticians and economists concerned with industrial organization and innovation practices, industrial managers, association representatives, government officials representing diverse policy arenas and statistical agencies, and analysts from international organizations and other industrialized countries. As a follow-up to the workshop, the Census Advisory Committee of Professional Associations, specifically the American Economic Association component, convened an R&D miniconference to recommend changes and provide guidelines for improving the analytical relevance and utility of the industrial R&D survey statistics.

TABLE 8-2 Sample Size and Expenditures for NSF Surveys

Survey Year | Sample Size | Response Rate % | Mode | Survey Costs | Redesign & Related Costs | FTEs

All R&D surveys
1993 | | | | $1,289,000 | $433,000 | 7.50
1994 | | | | 920,000 | 459,000 | 7.50
1995 | | | | 1,698,000 | | 7.50
1996 | | | | 2,102,000 | | 7.50
1997 | | | | 2,580,000 | | 7.50
1998 | | | | 1,708,000 | | 7.50
1999 | | | | 2,216,000 | | 7.50
2000 | | | | 1,709,000 | | 7.50
2001 | | | | 2,259,000 | 200,000 | 7.50
2002 | | | | 2,668,000 | 200,000 | 8.25
2003 | | | | 3,687,000 | 600,000 | 8.50

Survey of Industrial Research and Development
1993 | 24,064 | 81.8 | Paper | 390,000 | 433,000 | 1.50
1994 | 23,541 | 84.8 | Paper | 405,000 | 459,000 | 1.50
1995 | 23,809 | 85.2 | Paper | 575,000 | | 1.50
1996 | 24,964 | 83.9 | Paper | 801,000 | | 1.50
1997 | 23,417 | 84.7 | Paper | 783,000 | | 1.50
1998 | 24,809 | 86.6 | Paper | 850,000 | | 1.50
1999 | 24,431 | 83.2 | Paper | 855,000 | | 1.50
2000 | 25,002 | 84.8 | Paper | 876,000 | | 1.50
2001 | 24,198 | 83.0 | Paper | 894,000 | 200,000 | 1.50
2002 | 31,200 | | Paper/web | 1,197,000 | 200,000 | 1.75
2003 | 31,200 | | Paper/web | 1,200,000 | | 1.75

Survey of Federal Funds for Research and Development (sample size includes subagencies; costs exclude database task)
1993 | 105 | 100.0 | Disk | 149,000 | | 1.50
1994 | 102 | 100.0 | Disk | 139,000 | | 1.50
1995 | 98 | 100.0 | Disk | 170,000 | | 1.50
1996 | 94 | 100.0 | Disk | 185,000 | | 1.50
1997 | 93 | 100.0 | Disk | 226,000 | | 1.50
1998 | 92 | 100.0 | Web | 267,000 | | 1.50
1999 | 90 | 100.0 | Web | 209,000 | | 1.50
2000 | 93 | 100.0 | Web | 315,000 | | 1.50
2001 | 74 | 100.0 | Web | 170,000 | | 1.50
2002 | 74 | | Web | 379,000 | | 1.75
2003 | 74 | | Web | 334,000 | | 1.75

Survey of Federal Science and Engineering Support to Universities, Colleges, and Nonprofit Institutions (sample size includes major agencies only; costs exclude state profiles task)
1993 | 15 | 100.0 | Disk | 90,000 | | 1.50
1994 | 15 | 100.0 | Disk | 147,000 | | 1.50
1995 | 15 | 100.0 | Disk | 169,000 | | 1.50
1996 | 18 | 100.0 | Disk | 148,000 | | 1.50
1997 | 20 | 100.0 | Disk | 164,000 | | 1.50
1998 | 19 | 100.0 | Web | 243,000 | | 1.50
1999 | 19 | 100.0 | Web | 194,000 | | 1.50
2000 | 18 | 100.0 | Web | 168,000 | | 1.50
2001 | 18 | 100.0 | Web | 125,000 | | 1.50
2002 | 18 | | Web | 397,000 | | 1.50
2003 | 18 | | Web | 334,000 | | 1.50

Survey of Research and Development Expenditures at Universities and Colleges
1993 | 681 | 96.9 | Paper/disk | 375,000 | | 1.50
1994 | 500 | 99.6 | Paper/disk | 229,000 | | 1.50
1995 | 499 | 90.3 | Paper/disk | 229,000 | | 1.50
1996 | 493 | 97.3 | Paper/disk | 453,000 | | 1.50
1997 | 493 | 98.0 | Paper/disk | 347,000 | | 1.50
1998 | 556 | 98.6 | Web/paper | 348,000 | | 1.50
1999 | 597 | 98.5 | Web/paper | 310,000 | | 1.50
2000 | 623 | 97.3 | Web/paper | 350,000 | | 1.50
2001 | 609 | 97.3 | Web | 620,000 | | 1.50
2002 | 610 | | Web | 695,000 | | 1.75
2003 | 610 | | Web | 552,000 | | 1.75

Survey of Science and Engineering Research Facilities (excludes biomedical sample)
1993 | 309 | 93.0 | Paper | 285,000 | | 1.50
1994 | | | | | | 1.50
1995 | 307 | 97.0 | Paper/disk | 555,000 | | 1.50
1996 | | | | | | 1.50
1997 | 350 | 87.0 | Paper/web | 545,000 | | 1.50
1998 | | | | | | 1.50
1999 | 556 | 73.0 | Paper/web | 648,000 | | 1.50
2000 | | | | | | 1.50
2001 (two-question survey) | 580 | 90.0 | Web/paper | 450,000 | | 1.50
2002 | | | | | | 1.50
2003 (complete redesign) | 585 | | Web/paper | 1,267,000 | 600,000 | 1.75

Survey of Research and Development Funding and Performance at Nonprofit Institutions
1996 | | | | 515,000 | |
1997 | 9,112 | 41.4 | Paper/web | 515,000 | |

NOTES: Research Facilities Survey is a biennial survey. FTE totals exclude the nonprofit survey.



The Survey of Research and Development Expenditures at Universities and Colleges benefits from ongoing annual site visits, respondent workshops, and advice from its special emphasis panel. This survey employs an external consultant, Jim Firnberg, to make multiple site visits to survey respondents. He is well known to and respected by institutional respondents, having once been a respondent to the NSF surveys himself, and he is active in several national university organizations, including the Association for Institutional Research. His findings and recommendations are included in an annual Institutional Response Practices report, which has been the basis for identifying topics to be investigated more fully through advisory panels and academic workshops. An advisory special emphasis panel was established in the late 1990s. In 2001, the discussion topics for this panel included (1) survey difficulties resulting from an overlap of sector boundaries (including specific discussion of hospitals and clinical trials and of consortia and pass-through funding), (2) better accounting for indirect costs, and (3) the status of optional items on non-S&E R&D performance and federal agency-specific reporting.

Annual workshops are held with respondents to discuss issues related to the academic R&D survey. Workshop topics have included technical data preparation guidance, cognitive response issues, and the policy relevance of the survey questionnaire and its content. The survey instrument has been revised (questions added and modified, instructions clarified) as a result of these workshops. Participants tend to be senior university budget or research administrators whose offices are responsible for the survey response. In 2002, for example, the workshop confirmed, particularly among smaller universities, support for making the optional items on non-S&E R&D and federal agency sources of support core questions on the survey.

The Survey of Science and Engineering Research Facilities has benefited from the advice of a special emphasis panel (an expert advisory group) since its inception in the late 1980s. Participants tend to be senior university facilities or budget office administrators whose offices are responsible for the survey response, along with representatives of nonacademic institutions that share similar facility concerns (such as the Howard Hughes Medical Institute). An expert meeting was convened in Chicago in 2003 to determine whether it would be possible to collect information on cyberinfrastructure expenditures.

Both the Survey of Federal Funds for Research and Development and the Survey of Federal Science and Engineering Support to Universities, Colleges, and Nonprofit Institutions benefit from respondent issue workshops. A formal special emphasis panel has not been established for these federal surveys. In 1998, in lieu of a formal advisory panel, a Federal Agency Workshop on Federal R&D was convened. Workshop participants included senior officials from budget and policy offices of most major federal R&D funding agencies and from major data users (including OMB, the Congressional Research Service, the National Research Council, the Consortium of Social Science Organizations, and the American Educational Research Association). Proceedings included discussions of the federal science and technology budget, R&D data by fields of science and engineering, how major R&D-performing agencies use the federal funds survey data, and agencies' accountability for reporting R&D data.

NSF has convened a variety of multiagency respondent issue workshops to review the content and survey reporting process for the combined federal funds and federal support surveys. In addition, NSF has sponsored several agency-specific workshops to address technical issues of direct concern and relevance to individual agencies, the latest in 1999.

For the Survey of Research and Development Funding and Performance at Nonprofit Institutions, a special emphasis panel was established at the outset to guide the development of survey content and sampling coverage. In 1996, this panel met to provide guidance and make suggestions for the planned 1996 and 1997 nonprofit R&D data collection effort. The panel consisted of 11 senior officials from the nonprofit, academic, and government communities with knowledge about, and an analytical interest in, the nonprofit sector.

Opportunities for Efficiencies

Although the SRS staff seems to be spread very thin, there may be opportunities for efficiencies. One, mentioned earlier, is investigation of a RaDiUS-like collection of contract, grant, and cooperative agreement data, either as a replacement for, or an adjunct to, the federal funds survey.

The academic R&D survey already makes use of the frame developed for the SRS division's human resources surveys to identify doctorate-granting institutions. Creating a database of universities and colleges, with their distinguishing characteristics, might be a worthwhile investment that would reduce the annual cost of constructing the frame for the academic R&D survey; such a frame would undoubtedly be useful for other purposes as well. Blending the R&D and human resources surveys, wherever possible, might also create efficiencies.
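To make the idea concrete, the sketch below shows one way such a persistent institutional frame might be stored and queried. The schema, field names, sample records, and selection threshold are illustrative assumptions, not an actual SRS or Census Bureau design:

    import sqlite3

    # Hypothetical institutional frame; schema and field names are
    # illustrative, not an actual SRS design.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE institution_frame (
            institution_id      TEXT PRIMARY KEY,  -- stable identifier across survey cycles
            name                TEXT NOT NULL,
            state               TEXT NOT NULL,
            doctorate_granting  INTEGER NOT NULL,  -- 1 if the institution grants doctorates
            rd_expenditures     INTEGER,           -- most recently reported R&D, in dollars
            frame_year          INTEGER            -- year the record was last verified
        )
    """)

    # Fabricated records, for illustration only.
    conn.executemany(
        "INSERT INTO institution_frame VALUES (?, ?, ?, ?, ?, ?)",
        [
            ("U0001", "University A", "CA", 1, 450_000_000, 2003),
            ("C0002", "College B",    "OH", 0,   1_200_000, 2003),
        ],
    )

    # With a maintained frame, drawing the academic survey frame each year
    # becomes a query rather than a rebuild: doctorate-granting institutions
    # plus any others above a (hypothetical) expenditure threshold.
    frame = conn.execute("""
        SELECT institution_id, name
        FROM institution_frame
        WHERE doctorate_granting = 1 OR rd_expenditures >= 150000
    """).fetchall()
    print(frame)

The same table could serve the human resources surveys, which is where the potential blending efficiencies would arise.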

Generic Clearance

SRS has recently received OMB approval for a 3-year generic clearance to conduct a broad range of survey improvement projects (pretests, case studies, pilot surveys, customer surveys, focus groups, and methodological research) and to establish a quick-response capability. SRS has suggested four studies to begin the process: three focusing on cognitive testing of survey items and one to develop the quick-response capability. With this generic clearance, SRS may be able to conduct more research in a timely way to improve survey content and methodology.

Mandatory Reporting

Item nonresponse is rampant in the R&D surveys other than the two surveys of the federal government. It is especially serious in the industrial survey, in which many large companies refuse, as a matter of company policy, to furnish data unless reporting is mandatory.

In most years, the industrial survey is a blend of mandatory and voluntary items. From 1958 to 1960, only the item on "cost of R&D performed" was mandatory, because it was deemed a requirement for the economic censuses. In 1961, the "federal government funds" portion of the cost of R&D performed became mandatory. Two more data items, "net sales and receipts" and "total company employment," were added as mandatory items in 1969. These four items were the only mandatory items until 2001, when "cost of R&D performed within the company by state" was made mandatory. For 2002, OMB approved making all items mandatory because of the 2002 economic censuses, but 2003 saw a return to the five mandatory items. The Census Bureau has requested that all items be mandatory in economic census years; it has also proposed that the number of items be scaled back in interim years.
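Restated as a simple lookup, the timeline reads as follows; this sketch merely encodes the history described above (item names paraphrased), not an official item list:

    def mandatory_items(year: int) -> list[str]:
        """Mandatory items on the industrial R&D survey, per the timeline
        described in the text. Item names are paraphrased."""
        if year < 1958:
            return []
        if year == 2002:
            # OMB approved making all items mandatory for the 2002 economic censuses.
            return ["all survey items"]
        items = ["cost of R&D performed"]  # mandatory since 1958
        if year >= 1961:
            items.append("federal government funds portion of R&D cost")
        if year >= 1969:
            items += ["net sales and receipts", "total company employment"]
        if year >= 2001:
            items.append("cost of R&D performed within the company, by state")
        return items

    # 2003 returns to the five mandatory items noted above.
    assert len(mandatory_items(2003)) == 5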

In a test during the 1990 cycle of the industrial survey, SRS examined the effect of collecting all items on a completely voluntary basis. The sample was split in two: one-half was asked to report as usual on the mix of mandatory and voluntary items; the other half was asked to report on a completely voluntary basis. The result was a decrease in the overall response rate; in particular, the ordinarily mandatory items experienced a sharp drop in response when voluntary reporting was permitted. However, no information is available on what would happen to voluntary items if mandatory reporting were enforced. The primary problem with the industry survey is the voluntary items, which can have very large nonresponse.
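The logic of such a split-sample comparison is straightforward; the sketch below works through it with fabricated counts (the actual 1990 sample sizes and results are not reproduced here), adding a standard two-proportion z-test that the text does not mention but that suits this design:

    import math

    def response_rate(responded: int, n: int) -> float:
        """Unweighted response rate as a proportion."""
        return responded / n

    # Fabricated counts for illustration; not the actual 1990 test data.
    n_mix, resp_mix = 12_000, 9_900   # half asked the usual mandatory/voluntary mix
    n_vol, resp_vol = 12_000, 8_200   # half asked all items on a voluntary basis

    p_mix = response_rate(resp_mix, n_mix)
    p_vol = response_rate(resp_vol, n_vol)

    # Two-proportion z-test for the drop attributable to voluntary status.
    pooled = (resp_mix + resp_vol) / (n_mix + n_vol)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_mix + 1 / n_vol))
    z = (p_mix - p_vol) / se

    print(f"mandatory-mix half: {100 * p_mix:.1f}%")
    print(f"all-voluntary half: {100 * p_vol:.1f}%")
    print(f"difference: {100 * (p_mix - p_vol):.1f} points (z = {z:.1f})")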

Neither SRS nor the Census Bureau knows why companies do not report the voluntary items, although many explanations have been suggested. The panel recommends increased reliance on mandatory reporting between economic censuses to improve data quality, reduce collection inefficiencies, and provide greater equity among reporters. However, the panel also recommends additional research on voluntary versus mandatory reporting, to investigate whether mandatory reporting is the most effective strategy (Recommendation 8.5).
