that indicated little bias in labor force statistics; the time-use survey studies have also found little bias except for “volunteering” (see Dixon, 2012). The Consumer Expenditure Survey studies have found very little bias in expenditures (Goldenberg et al., 2009).

In conducting these surveys, BLS tends to use six methods to evaluate nonresponse: linkage to administrative data; propensity scores and process data; the results of experiments with alternative practices and designs; comparisons to other surveys; benchmark data; and the R-index. When linking survey records to administrative data, BLS has found that the estimate of bias due to refusals based on the last 5 percent of cases to respond is similar to the estimate based on linkage to the Census 2000 long-form sample. However, these studies have shortcomings: rarely are all the records linked successfully, the linked measure may be defined differently from the survey estimate, and the linked measure may itself contain error.
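To make the comparison concrete, the following is a minimal sketch of the two kinds of refusal-bias estimates discussed above; it is illustrative only and not BLS's actual procedure. It assumes a survey variable observed for respondents, an ordering of respondents by when they were interviewed (so the last 5 percent can serve as proxies for refusers, a "continuum of resistance"-style approximation), and a linked administrative analogue of that variable available for the full sample. All names, the 5 percent cutoff rule, and the exact formulas are assumptions for illustration.

```python
# Hypothetical sketch: two ways to approximate bias due to refusals.
import numpy as np

def bias_from_late_responders(y_resp: np.ndarray, contact_order: np.ndarray,
                              refusal_rate: float) -> float:
    """Treat the last 5 percent of cases to respond as proxies for refusers
    (a 'continuum of resistance'-style approximation)."""
    cutoff = np.quantile(contact_order, 0.95)
    y_late = y_resp[contact_order >= cutoff]
    return refusal_rate * (y_resp.mean() - y_late.mean())

def bias_from_linkage(y_admin_resp: np.ndarray,
                      y_admin_full: np.ndarray) -> float:
    """Respondent mean minus full-sample mean of the linked administrative
    measure; usable only for cases that were linked successfully."""
    return y_admin_resp.mean() - y_admin_full.mean()
```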

The R-index uses a propensity score model for nonresponse and relates that to other variables (usually frame variables, such as urbanicity, poverty, etc.). The BLS studies used 95 percent confidence intervals for the R-index, which was somewhat flatter than the response rate. Since one of the major flaws in nonresponse studies lies in what is not known, confidence intervals that account for the estimation of both the measure of interest and the model of nonresponse would be helpful.
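As a concrete illustration of the propensity-based R-index idea, the sketch below fits a logistic response-propensity model on frame variables and computes the commonly used R-indicator formulation R = 1 − 2·S(ρ̂), where S(ρ̂) is the standard deviation of the estimated response propensities, along with a bootstrap interval that reflects uncertainty from both the sample and the propensity model. The data frame, variable names, and model choice are hypothetical, not BLS's actual specification.

```python
# Hypothetical sketch of an R-indicator with a bootstrap confidence interval.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def r_indicator(X: pd.DataFrame, responded: np.ndarray) -> float:
    """R = 1 - 2 * SD(estimated response propensities)."""
    model = LogisticRegression(max_iter=1000).fit(X, responded)
    rho = model.predict_proba(X)[:, 1]          # estimated propensities
    return 1.0 - 2.0 * rho.std(ddof=1)

def bootstrap_ci(X: pd.DataFrame, responded: np.ndarray,
                 n_boot: int = 500, alpha: float = 0.05, seed: int = 0):
    """Percentile interval reflecting both sampling and propensity-model error."""
    rng = np.random.default_rng(seed)
    n = len(X)
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)              # resample cases with replacement
        stats.append(r_indicator(X.iloc[idx], responded[idx]))
    return tuple(np.quantile(stats, [alpha / 2, 1 - alpha / 2]))

# Example usage with hypothetical frame variables:
# X = pd.get_dummies(frame[["urbanicity", "poverty_rate"]], drop_first=True)
# responded = frame["responded"].to_numpy()
# print(r_indicator(X, responded), bootstrap_ci(X, responded))
```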

CENSUS BUREAU

Panel member Nancy Bates from the Census Bureau reported that Census Bureau nonresponse research studies have run the gamut. Topics have included causes of nonresponse, techniques for reducing nonresponse, nonresponse adjustments, nonresponse metrics and measurement, consequences of nonresponse (bias, costs), nonresponse bias studies, responsive designs and survey operations, the use of administrative records, auxiliary data, and paradata, level-of-effort studies, and panel or longitudinal survey nonresponse. During her presentation, Bates offered different examples of research, including mid-decade decennial census tests to target bilingual Spanish-language questionnaires; a test adding a response “message deadline” to mail materials; the addition of an Internet response option; and varying the timing of the mail implementation strategy (e.g., the timing of advance letters, replacement questionnaires, and reminder postcards). Nonresponse research in conjunction with the 2010 Census included an experiment that tested different confidentiality and privacy messages and another that increased the amount of media spending in matched-pair geographic areas. Additionally, the Census Bureau sponsored three ethnographic studies to better understand nonresponse among hard-to-count populations.

