

8 Impact on Respondent Burden
Pages 163-174

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text from each page of the chapter.


From page 163...
... 3) stated that "conceptualizations and measures of burden are still underdeveloped and, as a result, findings from empirical research in this area remain equivocal." Kenneth Darga created one useful typology of the components of respondent burden in his discussion at a workshop, reported in Benefits, Burdens, and Prospects of the American Community Survey (National Research Council, 2013)
From page 164...
... In an era of heightened ethnic/minority sensitivities, identity theft, and concerns about government intrusiveness, it may be "impossible for the Census Bureau to fully anticipate which data items might be deemed sensitive by which particular individual" (National Research Council, 2013, p.
From page 165...
... Cognitive Requirements

Cognitive difficulty is determined in part by the substance of the desired information. For example, information requests that require precise details such as exact dates and counts are inherently more cognitively difficult than items asking for only general and approximate information.
From page 166...
... Sensitivity and Social Desirability of Questions

Sensitivity and social desirability factors influence whether respondents are willing to report, and report truthfully, about information (especially behaviors)
From page 167...
... Focusing on objective measures only, the study panel attempted to assess whether there was an overall change in respondent burden resulting from the SIPP redesign. The panel examined the length of the interview, the number of interviews, the difficulty of the interview process, nonresponse rates, and sample loss rates between waves.
From page 168...
... , by Household Size

                               Wave 1                 Wave 2
                               n         Minutes      n         Minutes
All Households                 23,983    103.71       14,637    91.73
Households with >2 Members      9,055    133.53        5,470    118.37
2-Person Households             7,888    101.52        4,720    89.99
Single-Person Households        7,040     67.81        4,447    60.79

NOTES:
• Interview duration was calculated using audit trail files associated with each household.
• The clock starts as soon as the interviewer enters the instrument and stops when the interviewer exits the instrument each day.
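The notes above describe a simple timing rule: accumulate time from each entry into the instrument to the matching exit. A minimal sketch of that calculation, assuming a per-household audit trail of timestamped enter/exit events (the record layout and field names here are hypothetical, not the Census Bureau's actual audit-trail format):

    # Minimal sketch of deriving interview minutes from audit-trail records.
    # Assumes each record has a household ID, an event type ("enter"/"exit"),
    # and a timestamp; this layout is hypothetical, not the actual SIPP format.
    from collections import defaultdict
    from datetime import datetime

    def interview_minutes(audit_records):
        """Sum enter-to-exit time per household across all interview days."""
        open_session = {}               # household_id -> time of last "enter"
        total = defaultdict(float)      # household_id -> total minutes
        for rec in audit_records:       # records assumed sorted by timestamp
            hh, event, ts = rec["household_id"], rec["event"], rec["timestamp"]
            if event == "enter":
                open_session[hh] = ts
            elif event == "exit" and hh in open_session:
                total[hh] += (ts - open_session.pop(hh)).total_seconds() / 60.0
        return dict(total)

    records = [
        {"household_id": "H1", "event": "enter",
         "timestamp": datetime(2014, 2, 3, 9, 0)},
        {"household_id": "H1", "event": "exit",
         "timestamp": datetime(2014, 2, 3, 10, 45)},
    ]
    print(interview_minutes(records))   # {'H1': 105.0}

Sessions spanning multiple days would simply contribute additional enter/exit pairs, consistent with the per-day clock described in the notes.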
From page 169...
... Census Bureau staff determined that the length-of-interview data for both the 2008 SIPP panel and the 2004 SIPP panel were unusable due to problems with the software. Because there is no information on length of interview for these earlier SIPP panels, there is no way to directly compare interview times across the two designs.
From page 170...
... This instrumentation approach offers a potentially significant advantage over the standard questionnaire approach by allowing respondents and interviewers to move back and forth across programs and activities, reporting events that are linked in time rather than being forced to follow the sequence incorporated in the instrument's structure. When used in this manner, the EHC has the potential to mitigate the cognitive challenges of an annual recall of certain events.
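As a rough illustration of that design idea, the sketch below shows an event-history-calendar-style record in which spells from any program or activity can be reported in any order and anchored to calendar months; the class, methods, and example spells are hypothetical and are not drawn from the actual SIPP instrument.

    # Hypothetical sketch of an event-history-calendar record: spells from any
    # domain can be added in any order and queried by calendar month.
    from dataclasses import dataclass, field

    @dataclass
    class EventHistoryCalendar:
        year: int
        spells: list = field(default_factory=list)   # (domain, start_month, end_month)

        def report(self, domain, start_month, end_month):
            """Record a spell for any domain at any point in the interview."""
            self.spells.append((domain, start_month, end_month))

        def active_in(self, month):
            """All spells active in a given month, across domains."""
            return [d for d, s, e in self.spells if s <= month <= e]

    ehc = EventHistoryCalendar(year=2013)
    ehc.report("job at employer A", 1, 6)   # reported first
    ehc.report("SNAP receipt", 4, 12)       # recalled later, linked in time to the job spell
    print(ehc.active_in(5))                 # ['job at employer A', 'SNAP receipt']

Because every spell is anchored to the same calendar, one event (a job change, a move) can serve as a memory cue for another, which is the mitigation of recall difficulty described above.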
From page 171...
... Six years separate these two panels, and any number of other factors may have contributed to the larger nonresponse rate in 2014, including factors that contributed to a general downward trend of response on most government surveys. Another approach to using nonresponse rates for insight regarding respondent burden in a longitudinal survey is to compare the sample loss rates for wave 2 across survey designs.
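For concreteness, the wave 2 sample loss rate can be expressed as the share of households interviewed in wave 1 that did not complete a wave 2 interview. The sketch below uses placeholder counts, not actual SIPP figures:

    # Hypothetical comparison of wave 2 sample loss across two designs.
    # The counts are placeholders for illustration, not SIPP results.
    def sample_loss_rate(wave1_completed, wave2_completed):
        """Fraction of wave 1 interviewed households lost by wave 2."""
        return (wave1_completed - wave2_completed) / wave1_completed

    old_design = sample_loss_rate(wave1_completed=40_000, wave2_completed=36_000)
    new_design = sample_loss_rate(wave1_completed=24_000, wave2_completed=21_000)
    print(f"old design: {old_design:.1%}, new design: {new_design:.1%}")
    # old design: 10.0%, new design: 12.5%

Comparing this rate across the two designs controls for differences in initial sample size, although, as with nonresponse rates, it cannot isolate the effect of the redesign from other period effects.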
From page 172...
... In addition, the panel examined nonresponse rates and sample loss rates at stages in the longitudinal process and summarized the following findings:

Finding 8-1: Interview times could not be adequately compared because of the absence of reliable information for the 2008 panel.

Finding 8-2: The redesigned 2014 panel reduced the number of interviews compared to the old design, which could reduce respondent burden.
From page 173...
... However, these increases could indicate some increase in perceived burden by respondents, or at least a decrease in their willingness to be burdened. Our analysis of these objective measures suffered from insufficient data, and no reliable determination is possible.

