NONRESPONSE

IN SOCIAL SCIENCE SURVEYS

A RESEARCH AGENDA

Roger Tourangeau and Thomas J. Plewes, Editors

Panel on a Research Agenda for the
Future of Social Science Data Collection

Committee on National Statistics

Division of Behavioral and Social Sciences and Education

NATIONAL RESEARCH COUNCIL
OF THE NATIONAL ACADEMIES

THE NATIONAL ACADEMIES PRESS

Washington, D.C.

www.nap.edu




THE NATIONAL ACADEMIES PRESS  500 Fifth Street, NW  Washington, DC 20001

NOTICE: The project that is the subject of this report was approved by the Governing Board of the National Research Council, whose members are drawn from the councils of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. The members of the committee responsible for the report were chosen for their special competences and with regard for appropriate balance.

This study was supported by the Russell Sage Foundation (award number 97-10-04). Support for the Committee on National Statistics is provided by a consortium of federal agencies through a grant from the National Science Foundation (award number SES-1024012). Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the editors and do not necessarily reflect the views of the organizations or agencies that provided support for the project.

International Standard Book Number-13: 978-0-309-27247-6
International Standard Book Number-10: 0-309-27247-5

Additional copies of this report are available from the National Academies Press, 500 Fifth Street, NW, Keck 360, Washington, DC 20001; (800) 624-6242 or (202) 334-3313; http://www.nap.edu.

Copyright 2013 by the National Academy of Sciences. All rights reserved.

Printed in the United States of America

Suggested citation: National Research Council. (2013). Nonresponse in Social Science Surveys: A Research Agenda. Roger Tourangeau and Thomas J. Plewes, Editors. Panel on a Research Agenda for the Future of Social Science Data Collection, Committee on National Statistics. Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.

The National Academy of Sciences is a private, nonprofit, self-perpetuating society of distinguished scholars engaged in scientific and engineering research, dedicated to the furtherance of science and technology and to their use for the general welfare. Upon the authority of the charter granted to it by the Congress in 1863, the Academy has a mandate that requires it to advise the federal government on scientific and technical matters. Dr. Ralph J. Cicerone is president of the National Academy of Sciences.

The National Academy of Engineering was established in 1964, under the charter of the National Academy of Sciences, as a parallel organization of outstanding engineers. It is autonomous in its administration and in the selection of its members, sharing with the National Academy of Sciences the responsibility for advising the federal government. The National Academy of Engineering also sponsors engineering programs aimed at meeting national needs, encourages education and research, and recognizes the superior achievements of engineers. Dr. C. D. Mote, Jr., is president of the National Academy of Engineering.

The Institute of Medicine was established in 1970 by the National Academy of Sciences to secure the services of eminent members of appropriate professions in the examination of policy matters pertaining to the health of the public. The Institute acts under the responsibility given to the National Academy of Sciences by its congressional charter to be an adviser to the federal government and, upon its own initiative, to identify issues of medical care, research, and education. Dr. Harvey V. Fineberg is president of the Institute of Medicine.

The National Research Council was organized by the National Academy of Sciences in 1916 to associate the broad community of science and technology with the Academy’s purposes of furthering knowledge and advising the federal government. Functioning in accordance with general policies determined by the Academy, the Council has become the principal operating agency of both the National Academy of Sciences and the National Academy of Engineering in providing services to the government, the public, and the scientific and engineering communities. The Council is administered jointly by both Academies and the Institute of Medicine. Dr. Ralph J. Cicerone and Dr. C. D. Mote, Jr., are chair and vice chair, respectively, of the National Research Council.

www.national-academies.org

PANEL ON A RESEARCH AGENDA FOR THE FUTURE OF SOCIAL SCIENCE DATA COLLECTION

Roger Tourangeau (Chair), Methodology Group, Westat, Rockville, MD
Nancy Bates, Research and Methodology Directorate, U.S. Census Bureau, Washington, DC
Suzanne M. Bianchi, Department of Sociology, University of California, Los Angeles
J. Michael Brick, Methodology Group, Westat, Rockville, MD
Douglas D. Heckathorn, Department of Sociology, Cornell University
Larry Hedges, Institute for Policy Research, Northwestern University
Arthur Kennickell, Board of Governors, Federal Reserve System, Washington, DC
Kristen Olson, Department of Sociology and Survey Research and Methodology Program, University of Nebraska–Lincoln
Nora Cate Schaeffer, Department of Sociology, University of Wisconsin–Madison
Frank Stafford, Economics Department and Population Studies Center, University of Michigan, Ann Arbor

Thomas J. Plewes, Study Director
Brian Harris-Kojetin, Associate Study Director (on detail from U.S. Office of Management and Budget)
Michael J. Siri, Program Associate

COMMITTEE ON NATIONAL STATISTICS
2012–2013

Lawrence D. Brown (Chair), Department of Statistics, The Wharton School, University of Pennsylvania
John M. Abowd, School of Industrial and Labor Relations, Cornell University
David Card, Department of Economics, University of California, Berkeley
Alicia Carriquiry, Department of Statistics, Iowa State University
Constantine Gatsonis, Center for Statistical Sciences, Brown University
James S. House, Survey Research Center, Institute for Social Research, University of Michigan, Ann Arbor
Michael Hout, Survey Research Center, University of California, Berkeley
Sallie Ann Keller, Department of Statistics, University of Waterloo, Ontario, Canada
Lisa M. Lynch, The Heller School for Social Policy and Management, Brandeis University
Sally C. Morton, Department of Biostatistics, Graduate School of Public Health, University of Pittsburgh
Ruth D. Peterson, Criminal Justice Research Center, Ohio State University
Edward H. Shortliffe, Columbia University and Arizona State University
Hal Stern, Donald Bren School of Information and Computer Sciences, University of California, Irvine
John H. Thompson, NORC at the University of Chicago
Roger Tourangeau, Methodology Group, Westat, Rockville, MD

Constance F. Citro, Director

Contents

PREFACE ix

SUMMARY 1

1 THE GROWING PROBLEM OF NONRESPONSE 7
  Conceptualizing and Defining Nonresponse, 9
  Long-Term Trends in Response Rates, 12
  Response Rate Trends in Cross-Sectional Surveys, 14
  Response Rate Trends in Panel Surveys, 24
  Reasons for Nonresponse, 30
  Theoretical Perspectives on Nonresponse, 33
  Identifying Costs Associated with Approaches to Minimize Nonresponse, 36

2 NONRESPONSE BIAS 40
  Response Rates Matter, But …, 40
  Effects of Nonresponse Bias, 42
  Nonresponse Bias in Panel Surveys, 45
  Analyzing Nonresponse Bias, 46
  New Metrics for Understanding Nonresponse Bias, 47
  Need for a Theory of Nonresponse Bias, 50

3 MITIGATING THE CONSEQUENCES OF NONRESPONSE 51
  Nonresponse Weighting Adjustment Methods, 52
  Use of Paradata in Reducing Nonresponse and Nonresponse Bias, 57
  Concluding Observation, 59

4 APPROACHES TO IMPROVING SURVEY RESPONSE 61
  Understanding and Reducing Respondent Burden, 62
  Improving Response in Telephone and Mail Surveys, 65
  New Frames and Methods of Sampling, 68
  New and Emerging Data Collection Modes, 73
  Multiple Modes, 76
  Interviewer Effects, 81
  Incentives, 88
  Paradata and Auxiliary Data, 94
  Responsive Design, 96
  Administrative Records, 97
  Other Means of Collecting Social Science Data, 98

5 RESEARCH AGENDA 101
  Research on the Problem, 102
  Research on Consequences, 103
  Research on Coping, 103
  Research on Alternatives, 104

REFERENCES AND SELECTED BIBLIOGRAPHY 105

ACRONYMS AND ABBREVIATIONS 123

APPENDIXES
A Nonresponse Research in Federal Statistical Agencies 127
B Research Agenda Topics Suggested by the Literature 133
C Biographical Sketches of Panel Members 146

Preface

Nearly three decades have elapsed since the National Research Council (NRC) last convened a panel to undertake a comprehensive review of issues associated with nonresponse in sample surveys. The three-volume seminal study, Incomplete Data in Sample Surveys (National Research Council, 1983), reported the results of that early investigation. The 1983 panel focused mainly on statistical techniques that could illuminate and ameliorate the effects of nonresponse. Its study recommended a research agenda consisting of eleven far-reaching recommended programs, projects, and activities ranging from improvement of weighting methods to gathering and analyzing data on costs; these research recommendations are excerpted in Appendix B of this report. Many of these recommendations have been at least partially implemented.

Despite the significant improvements in general understanding of the causes and consequences of survey nonresponse and in methodology for compensating for the effects, the problems associated with the lack of response to surveys continue; in fact, nonresponse appears to be a growing issue. Response rates to government and privately sponsored household surveys that provide rich data for social science research have been falling throughout the richer countries of the world (see, e.g., De Leeuw and De Heer, 2002). To try to maintain response rates, sponsoring organizations have had to spend many more dollars in repeated efforts to contact sample units and address their concerns about participating. According to Curtin, Presser, and Singer (2005), the rapid decline in response rates has clearly increased survey costs (p. 97). Furthermore, this decline in response rates is challenging the underlying inferential assumption for estimation from

sample surveys, which is that there is 100 percent response to a probability sample selected from a designated frame with nearly complete coverage of the target population.

These challenges threaten to undermine the validity of inferences obtained through the collection of information from subjects through surveys. Survey nonresponse affects validity in a number of ways. One way is through the introduction of bias into the survey results, but the issue of bias is quite complex. For example, a recent meta-analysis of 59 methodological studies (Groves and Peytcheva, 2008) concluded that large nonresponse biases can occur in surveys and, further, that nonresponse rates themselves are a poor predictor of the magnitude of the biases (p. 2). This study concluded that high response rates do not always reduce the risk of nonresponse bias. Various survey attributes, such as the method used to calculate bias, survey sponsorship, and the survey population, also play a role in determining bias (p. 25).

In early 2009, members of the board of the Russell Sage Foundation expressed concern to the Committee on National Statistics (CNSTAT) about the threats to statistical inference from the problems associated with declining response rates in traditional social science surveys and indicated their willingness to support a planning meeting that would help develop the plans for a useful project, such as a workshop, a series of workshops, or a full-scale panel study. The planning meeting was held in Washington, DC, on December 14, 2009. A distinguished roster of experts participated in the planning meeting, including experts in survey design; social scientists who use survey data; government, academic, and private-sector managers of surveys for research and policy analysis; and experts in alternative data sources and data collection methods.
Two papers were commissioned for the meeting, which summarized the research literature on what is known about the causes of survey nonresponse and the effects of the growing levels of nonresponse on inference. In addition, a panel session explored technologies and methods that could potentially mitigate nonresponse bias and other threats to the quality of data upon which social science relies. Such technologies and methods include mixed-mode surveys, the use of administrative records (e.g., retail scanner data, payroll data, or state tax and transfer program data) to replace some interviews or questions in a survey, automatic data capture methods (e.g., personal data assistants, global positioning system locators), and the use of geographic information systems to develop area-based sampling frames. The participants indicated the nature and scope of a project that could be of most value in addressing the problems in this area.

In concluding the planning meeting, the participants agreed that the first priority would be to develop a research agenda to capture information about causes, consequences, and remedies for nonresponse and to move

forward the state of the science. As part of developing an agenda, it would be useful to identify short-term projects that would inform a larger, more comprehensive review of all ramifications of the problem and the solutions. This study derives from those outcomes of the planning meeting.

Statement of Task

A panel of experts under the National Research Council’s (NRC’s) Committee on National Statistics will conduct a study to develop a research agenda for addressing issues related to the deterioration in social science data stemming from the general decline in survey response by individuals and households. The panel will consider what is known about the causes and consequences of increasing nonresponse, the current state of survey methodology, and methods designed to improve response for surveys in the government, academic, and private sectors. The panel will identify high-priority research that can answer important unresolved questions about survey response and determine the most cost-effective ways to improve response and the quality of survey data for the advancement of knowledge in the social sciences. On the basis of its information-gathering activities, including a workshop, the panel will deliberate, make recommendations, and publish these recommendations along with supporting findings as an independent NRC report.

In November 2010, the Russell Sage Foundation commissioned the NRC’s CNSTAT to assemble a panel of experts to develop a research agenda for addressing issues related to the impact on social science data of the general decline in survey response by individuals and households. In the statement of task (shown above), the panel was asked to consider what is known about the causes and consequences of increasing rates of nonresponse, the current state of survey methodology, and methods designed to improve response for surveys in the government, academic, and private sectors.
The panel was asked to identify high-priority research that can answer important unresolved questions about survey response and determine the most cost-effective ways to improve response and the quality of survey data for the advancement of knowledge in the social sciences. For the most part, the panel has limited its purview to nonresponse in household surveys, both public and private, in keeping with the charge in the statement of task. Likewise, the report focuses largely on U.S. household surveys, although research and operational experience in several international surveys is discussed where it has a bearing on general nonresponse issues commonly

confronted in the conduct of household surveys regardless of where they are done.

The panel engaged in wide-ranging information-gathering activities, including an extensive literature search. The literature review identified a number of recommendations for research on survey nonresponse topics, which are reproduced in Appendix B of this report. The panel also conducted two workshops to which experts in various aspects of nonresponse research were invited. The results of the literature review and the information gathered in the two workshops are summarized in Chapters 1 and 4 of the report, which focus on documenting response trends and identifying means of improving response, and in Chapters 2 and 3, which summarize the state of the science for understanding and adjusting for response bias. Working with the information gathered from these activities, the panel deliberated in order to develop recommendations for a research agenda. These recommendations are presented in this report along with supporting findings and conclusions and are summarized in Chapter 5.

The panel especially and gratefully acknowledges the contributions of the many panel members and invited experts who participated in the two workshops and shared so freely of their knowledge. The findings of this report can be traced in large part to their input, although the guest experts bear no responsibility for the conclusions drawn by the panel.

In its first workshop on February 17–18, 2011, the panel focused on several topics that are basic to understanding nonresponse and its effects. Sessions featured reviews of the state of knowledge about the role of field operations in achieving high response rates, the current status of research on mode effects, evidence on effectiveness of incentives, research on post-survey adjustments for nonresponse, and new metrics for nonresponse. The presenters were asked to respond to questions about the state of the current knowledge on each topic.
In the first session, Cathy Haggerty and Nina Walker of NORC at the University of Chicago discussed recruiting, training, and managing field staff to achieve high response levels, summarizing their extensive experience. A panel on mode effects featured presentations on the reports of the American Association for Public Opinion Research task forces on cell phone surveys by Paul Lavrakas, consultant, and online panels by Reg Baker of Market Strategies International. Rounding out that session was a presentation on self-administered modes by Mick Couper of the University of Michigan, Ann Arbor, and the Joint Program in Survey Methodology. Eleanor Singer of the University of Michigan, Ann Arbor, gave a presentation on what is known about incentives, and James Wagner, also of the University of Michigan, Ann Arbor, spoke on new metrics of survey nonresponse. The importance of collecting and analyzing paradata was discussed by Frauke Kreuter of the Joint Program in Survey Methodology, who described the state of the

science on the use of paradata for post-survey adjustments. In the first of a series on federal statistical agency presentations, panel member Nancy Bates summarized the status and accomplishments of the U.S. Census Bureau research program on nonresponse. Panel member Mike Brick summarized the research and practice on using weighting to adjust for nonresponse.

Papers from this first workshop as well as from the planning meeting have been brought together in a volume of The ANNALS of the American Academy of Political and Social Science, “The Nonresponse Challenge to Surveys and Statistics,” edited by Douglas S. Massey and Roger Tourangeau (Volume 645, January 2013). These papers contain an extensive literature review, which is not repeated in this report.

The second workshop, which took place on April 27–28, 2011, continued the review of ongoing research on nonresponse at federal agencies and took up several new topics, including international research on nonresponse; the state of knowledge on the role of interviewers in achieving high response rates; a discussion of models for survey costs; current issues and practices in mixed-mode survey research; and a discussion of issues of nonresponse in social network surveys and respondent-driven sampling methods.

The session on federal agency research on survey nonresponse featured John Dixon from the Bureau of Labor Statistics, Jaki McCarthy from the National Agricultural Statistics Service, Jennifer Madans from the National Center for Health Statistics, and Steven H. Cohen from the National Center for Science and Engineering Statistics. Two international guests, Ineke Stoop of the Netherlands Institute for Social Research and Lilli Japec of Statistics Sweden, discussed the status of international research and practice on survey nonresponse. The status of research on interviewer effects on nonresponse was summarized by panel member Nora Cate Schaeffer of the University of Wisconsin–Madison.
Barbara O’Hare of the U.S. Census Bureau and François Laflamme from Statistics Canada led a session on survey costs, with the former discussing an interagency study coordinated by the Census Bureau and the latter summarizing important work in responsive design that is ongoing at Statistics Canada. Mixed-mode surveys were again a topic in this workshop and were discussed in a session featuring Don Dillman of Washington State University and Deborah Griffin of the U.S. Census Bureau. Douglas Heckathorn, a panel member from Cornell University, and Sandra Berry of RAND focused on nonresponse in the growing class of social network surveys.

Tom Plewes served as study director for the panel and ably supported its work. Michael Siri provided administrative support to the panel. The panel benefited greatly in the early phases of its work from the many contributions of Brian Harris-Kojetin, who served as associate study director while on an Intergovernmental Personnel Act assignment on leave from

the U.S. Office of Management and Budget. We are especially thankful for the personal participation of Constance F. Citro, director of CNSTAT, in the conduct of the workshops and in the preparation of this report. These people’s hard work greatly benefited the report in numerous ways.

This report has been reviewed in draft form by individuals chosen for their diverse perspectives and technical expertise, in accordance with procedures approved by the Report Review Committee of the NRC. The purpose of this independent review is to provide candid and critical comments that assist the institution in making its reports as sound as possible and to ensure that the reports meet institutional standards for objectivity, evidence, and responsiveness to the study charge. The review comments and draft manuscript remain confidential to protect the integrity of the deliberative process.

The panel thanks the following individuals for their review of the report: Rachel A. Caspar, Center for Survey Methodology, RTI International; Frederick Conrad, Program in Survey Methodology, University of Michigan, Ann Arbor, and Joint Program in Survey Methodology, University of Maryland, College Park; John Dovidio, Department of Psychology, Yale University; Simon Jackman, Department of Political Science, Stanford University; Frauke Kreuter, Joint Program in Survey Methodology, University of Maryland, College Park; Tom W. Smith, Center for the Study of Politics and Society, NORC at the University of Chicago; and Kirk M. Wolter, Survey Research, NORC at the University of Chicago.

Although the reviewers listed above have provided many constructive comments and suggestions, they were not asked to endorse the conclusions or recommendations, nor did they see the final draft of the report before its release. The review of the report was overseen by Eleanor Singer, Survey Research Center, University of Michigan, Ann Arbor.
Appointed by the NRC, she was responsible for making certain that the independent examination of this report was carried out in accordance with institutional procedures and that all review comments were carefully considered. Responsibility for the final content of the report rests entirely with the authoring panel and the NRC.

Roger Tourangeau, Chair
Panel on a Research Agenda for the Future of Social Science Data Collection