Bates also discussed nonresponse research associated with the American Community Survey (ACS), including a questionnaire format test (grid versus sequential layout), a test of sending additional mailing pieces to households without a phone number, and a test of adding an Internet response mode. For other Census Bureau demographic surveys, Bates described nonresponse tests that offered incentives (debit cards) to refusals in the Survey of Income and Program Participation and in the National Survey of College Graduates. Other examples were nonresponse bias studies, among them studies considering the use of propensity models in lieu of traditional nonresponse weighting adjustments. She concluded with a discussion of administrative records and their great potential for understanding non-ignorable nonresponse: currently, most Census Bureau studies using administrative records focus on assessing survey data quality, such as underreporting or misreporting, rather than on nonresponse itself.
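The idea of a propensity model in lieu of traditional weighting adjustments can be illustrated with a minimal sketch. The data and the cell-based propensity estimate below are hypothetical, not drawn from any Census Bureau study; the simplest propensity model, a saturated model on one covariate, is used for concreteness.

```python
# Minimal sketch of propensity-based nonresponse weighting.
# Hypothetical frame: each sampled case has a covariate known for the
# full sample (age group) and a response indicator.
from collections import Counter

sample = [  # (age_group, responded)
    ("18-34", True), ("18-34", False), ("18-34", False), ("18-34", True),
    ("35-64", True), ("35-64", True), ("35-64", True), ("35-64", False),
    ("65+", True), ("65+", True), ("65+", True), ("65+", True),
]

# Estimate the response propensity within each covariate cell
# (a saturated propensity model on age group).
n_total = Counter(g for g, _ in sample)
n_resp = Counter(g for g, r in sample if r)
propensity = {g: n_resp[g] / n_total[g] for g in n_total}

# Each respondent receives an inverse-propensity weight;
# nonrespondents drop out of the adjusted data set.
weights = [(g, 1.0 / propensity[g]) for g, r in sample if r]

print(propensity)                  # {'18-34': 0.5, '35-64': 0.75, '65+': 1.0}
print(sum(w for _, w in weights))  # weights sum back to the sample size, 12.0
```

In practice a logistic regression on many covariates (often paradata) would replace the cell means, but the mechanics are the same: respondents in groups that respond less often are weighted up to represent their nonresponding counterparts.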

Many Census Bureau nonresponse research projects are tied to a particular mode, namely mail, since both the decennial census and the ACS use this mode. Bates observed that many Census Bureau research projects are big tests with large samples and several test panels. The majority of tests try out techniques designed to reduce nonresponse, while only a few are focused on understanding the causes of nonresponse.

Bates concluded with the following recommendations:

• Leverage the survey-to-administrative-record match data housed in the new Center for Administrative Records Research and Applications. This could have great potential for studying nonresponse bias in current surveys.

• Make use of the ACS methods panel for future nonresponse studies; its multimode design makes it especially well suited to such research.

• Leverage decennial listing operations to collect paradata that could be used across surveys to examine nonresponse and bias.

• Select a current survey that produces leading economic indicators and do a “360-degree” nonresponse bias case study. (This ties into a recent Office of Management and Budget request on federal agency applications of bias studies.)

• Going forward, think about small-scale nonresponse projects that fill research gaps and can be quickly implemented (as opposed to the traditionally large-scale ones undertaken by the Census Bureau).

• Expand the collection and application of paradata to move current surveys toward responsive design (including multimode data collection across surveys).



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001