3 Increasing American Community Survey Participation Through Improved Respondent Communication
Pages 89-168



From page 89...
... But, as the workshop day would make clear, firmer and more familiar ground is not easy ground, and decisions on communication contacts and strategies can have major impacts on the quality of ACS data, the burden on its respondents, and its costs. More succinctly, it is very much as Warren Brown (Cornell University and planning committee chair)
From page 90...
... 3.1 OVERVIEW OF RESPONDENT CONTACT STRATEGIES
In developing the workshop, the planning committee asked that the session on respondent communication begin with a de novo description of what ACS respondents actually see and what messages they are actually given. For contrast, this overview of the ACS process was followed by a review of Statistics Canada's implementation of its "wave methodology" for respondent interaction and its refinement over the past several iterations of the quinquennial Canadian census.
From page 91...
... (pressure seal mailer; no separate envelope) Figure 3.1 Target audience and mailing package contents, 2018 American Community Survey mailing strategy.
From page 92...
... • An FAQ Brochure, in trifold format on 14"×8.5" glossy stock, with short one- or two-paragraph answers to 6 questions:
– "What is the American Community Survey?"
– "How do I benefit by answering the American Community Survey?
From page 93...
... Figure 3.2 Introductory letter included in Mailing 1 of American Community Survey, 2018 version. SOURCES: Dorothy Barth workshop presentation; high-quality version retrieved from materials for Information Collection Review 201803-0607-001 on www.reginfo.gov.
From page 94...
... Figure 3.3 English and Spanish panes of multilingual brochure included in Mailing 1 of American Community Survey, 2018 version.
From page 95...
... , the Census Bureau found it beneficial to convert Mailing 2 into a one-piece "pressure seal mailer" rather than a standard letter-in-envelope. "Pressure seal" means that the document is folded over and the three exterior sides sealed, so that the respondent removes the edges (tearing along perforations)
From page 96...
... A few days ago, you should have received instructions for completing the American Community Survey online. Local communities depend on information from this survey to decide where schools, highways, hospitals, and other important services are needed.
From page 97...
... Figure 3.5 Follow-up letter included in Mailing 3 of American Community Survey, 2018 version. SOURCES: Dorothy Barth workshop presentation; high-quality version retrieved from materials for Information Collection Review 201803-0607-001 on www.reginfo.gov.
From page 98...
... Figure 3.6 Reminder postcard sent as Mailing 4 of American Community Survey, 2018 version. SOURCES: Dorothy Barth workshop presentation; high-quality version retrieved from materials for Information Collection Review 201803-0607-001 on www.reginfo.gov.
From page 99...
... Figure 3.7 Reminder letter sent as Mailing 5, American Community Survey, 2018 version. SOURCES: Dorothy Barth workshop presentation; high-quality version (dated February 2018)
From page 100...
... Mathieu noted that all national statistics offices and survey organizations are dealing with the current general trend of declining survey response rates, but Statistics Canada is pleased to have seen the opposite trend. Mathieu said that he believed that Canada's 68.3 percent Internet response rate was the highest online take-up rate achieved in a census until New Zealand's
From page 101...
... most recent census, but it is still a mark of pride -- but he conceded that the number that Statistics Canada is most proud of is the 88.8 percent overall self-response rate. The wave methodology used by Statistics Canada in the 2016 census is illustrated in Figure 3.8; both it and Barth's description of the cycle of ACS mailings (Figure 3.1)
From page 102...
... Mathieu said that Statistics Canada needed a cost-effective way to process and send a planned 13.3 million Wave 1 letters and 8.4 million Wave 2 reminder letters -- the latter "on demand," in the sense of removing responding addresses from the mail flow -- all with variable-imaged addresses and secure Internet access codes. To satisfy all those constraints, Mathieu noted that they chose a self-mailer approach, very similar to the pressure seal mailer now in use with the ACS.
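The "on demand" reminder flow Mathieu described, in which addresses drop out of the Wave 2 mail stream as soon as a response arrives, can be sketched in a few lines. This is a hypothetical illustration, not Statistics Canada's implementation; the function name and address records are invented for the example:

```python
# Hypothetical sketch of "on demand" Wave 2 reminder printing: only addresses
# with no recorded response remain in the reminder mail flow.

def wave2_mailing_list(sampled_addresses, responded_ids):
    """Return the sampled addresses still owed a Wave 2 reminder letter."""
    responded = set(responded_ids)  # fast membership testing
    return [a for a in sampled_addresses if a["id"] not in responded]

sample = [
    {"id": "A1", "address": "10 Main St"},
    {"id": "A2", "address": "22 Oak Ave"},
    {"id": "A3", "address": "5 Pine Rd"},
]

# A2 has already responded online, so it is removed from the reminder mail flow.
print(wave2_mailing_list(sample, {"A2"}))
```

In practice the response file would be refreshed daily as letters are imaged and printed, which is what makes the reminder volume (8.4 million versus 13.3 million Wave 1 letters) smaller than the initial mailout.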
From page 103...
... SOURCES: Adapted from original diagram in workshop presentation by Patrice Mathieu, for consistency with corresponding diagram of American Community Survey data collection.
From page 104...
... SOURCE: Workshop presentation by Patrice Mathieu.
From page 105...
... SOURCE: Workshop presentation by Patrice Mathieu.
From page 106...
... The proposed deadline for response having now passed, Mathieu said that they did not want respondents to think that "there's no
4 The Census Questionnaire Response referenced in the letters is an automated phone line; an option is provided for respondents to key in their ID from the form to receive a paper questionnaire.
From page 107...
... It's quick and easy. OR • Call 1-855-699-2016 if you prefer to receive the paper questionnaire.
From page 108...
... It is quick and easy. OR • Complete the paper questionnaire and return it in the enclosed envelope.
From page 109...
... Mathieu said that this success has Statistics Canada considering ways to improve on Internet take-up in 2021, perhaps by dropping a letter with Internet response options in lieu of a full questionnaire package; the challenge is doing so while still providing a mechanism for getting questionnaires to households that truly need them to participate. Various threads of the presentation, and strengths of the Canadian approach, came together in a humorous moment in Mathieu's presentation -- when discussion of differences between Internet response in 2011 and 2016 obliged him to mention the moment when the 2016 census experienced what could have been a serious stumbling block.
From page 110...
... is, by definition, top-most in the figure. Internet response rate (blue)
From page 111...
... ; NRFU, nonresponse follow-up. SOURCE: Workshop presentation by Patrice Mathieu.
From page 112...
... In 2011, roughly one-quarter of Mailout dwellings had received a paper questionnaire package at Wave 1, and -- consistent with the Census Bureau's experience in testing "Internet Push" versus "Internet Choice" methods -- some respondents who might otherwise respond electronically will use paper if the paper questionnaire is in hand. Mathieu said that Statistics Canada's research estimated that 25 percent of the improvement in self-response from 2011 to 2016 could be attributed to the simple change to Internet-only at Wave 1.
From page 113...
... They found that preferences for Internet response tend to be more fluid than for paper -- and that strong preferences for paper questionnaires by some respondents are "really sticky" and hard to overcome. He said that in the context of a multimode collection strategy, the method of initial contact does not seem to reduce self-response in general, as long as the respondent's preferred mode is offered.
From page 114...
... The wording of the Wave 1 and Wave 2 letters is being revised to encourage a try at online response before requesting a paper questionnaire. And, building off of 2016 successes, Statistics Canada is seeking ways to make its communication and media strategy for the 2021 census "more targeted" in nature.
From page 115...
... Contrasting with the ACS experience, where just putting "your response is required by law" in bold type could increase response rates by 3–4 percentage points, she observed that the Canadian census letters never call out the mandatory-response language in bold type and asked whether Statistics Canada had tested that variant.
From page 116...
... The Bureau is considering the language for the FAQ brochure or back-of-the-letter parts of the ACS mailings. Mathieu noted that the broader communications strategy around the Canadian census has played up the benefits and the importance to the community.
From page 117...
... But he said that Statistics Canada had managed to achieve response rates of 80 percent or higher even in testing in 2014, which argues for the methodology having a major effect. Michael Schober (New School for Social Research)
From page 118...
... summarized five major tests conducted on the ACS mailing materials since the 2016 workshop, noting the tests shared two overarching objectives: to improve self-response rates as much as possible by streamlining the materials where appropriate and to be responsive to both respondent and stakeholder concerns about the nature and tone of the mandatory-response language in the materials. Of the Pressure Seal Mailer Test, Poehler would note that "you've already seen where we came down on that."5 To explain how and why the format of two of the ACS mailings changed in late 2018, Poehler noted that the ACS has tested pressure seal mailings because they are less expensive to assemble than stuffing traditional letters into envelopes.
From page 119...
... Table fragment: response mode for those mailed Mailing 5, comparing the Pressure Seal (T3) and Postcard (T2) treatments.
From page 120...
... The Mail Design Test field-tested three bundles of revisions to the ACS mailing materials that had been developed and refined since preliminary versions were tested in 2015. The new treatments were meant to use a friendlier and more conversational tone in the ACS mailings, to stay positive by emphasizing the benefits of participating in the survey, and to reduce burden through the removal or combination of materials in the mailings.6 Specifically, the Mail Design Test focused on three experimental treatments:
• The Softened Revised Design used deliberately softer language: "Your Response is Important to Your Community" rather than "Your Response is Required by Law." It also included some modifications to the envelope, letters, and postcards.
From page 121...
... < 0.01∗
(b) Partial Redesign vs. Softened Revised Design
Point in Data Collection Cycle | Partial Redesign | Softened Revised Design | Difference | P-Value
After the first two mailings | 19.0 (0.3)
From page 122...
... At that time, the Census Bureau found that the "Internet Push" strategy generally increased the take-up rate but dampened response rates in some geographic areas. So, the notion of the test was to identify areas where sending a paper questionnaire in the first mailing might result in higher self-response rates.
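The targeting idea behind the test, choosing a first-mailing strategy per area based on expected internet take-up, can be illustrated with a minimal sketch. The threshold, area names, and predicted rates below are invented assumptions for illustration, not the Census Bureau's actual model:

```python
# Hypothetical targeting rule: areas with low predicted internet take-up get
# the paper-and-web "Internet Choice" package in the first mailing; all other
# areas get the internet-only "Internet Push" package. Threshold is assumed.

PUSH_THRESHOLD = 0.5  # illustrative cutoff on predicted internet take-up

def assign_strategy(areas, threshold=PUSH_THRESHOLD):
    """Map each area to a first-mailing strategy by predicted internet take-up."""
    return {
        name: ("Internet Push" if p >= threshold else "Internet Choice")
        for name, p in areas.items()
    }

# Invented per-area predictions, e.g. from a response-propensity model.
predicted_takeup = {"Tract 001": 0.72, "Tract 002": 0.31, "Tract 003": 0.55}
print(assign_strategy(predicted_takeup))
```

The substantive question the test addressed is how well such a rule identifies the areas where "Internet Push" dampens self-response enough that offering paper up front is worth the extra cost.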
From page 123...
... Table fragment, 2017 mailing schedules for the Experiment (Choice) and Internet Push treatments: 9/21/2017, initial mailing package / pre-notice letter; 9/25/2017, initial mailing package with web and paper option; 9/28/2017, reminder letter / reminder postcard; 10/13/2017, paper questionnaire package; 10/17/2017, reminder postcard; 10/19/2017, replacement questionnaire package; 11/2/2017, final reminder postcard (both treatments). (b)
From page 124...
... , while using the updated design of the letters and questionnaire; • De-emphasized Mandatory with Current Questionnaire, which is the same as the previous treatment except that it uses the current ACS questionnaire; and • Softer/Eliminated Mandatory, which softens the mandatory-response language in the letters and removes it entirely in some cases, while using the updated design of the letters and questionnaire. In closing, Poehler noted that future tests being considered for 2019 include variations on the use of due dates/deadlines in ACS mailings and the testing of a modified mailing schedule.
From page 125...
... In particular, he pointed out Treatment 3 of the Pressure Seal Mailer Test and the improvements achieved in Mailing 5 of the Mail Design Test. Williams commented that the current ACS materials are, successfully, recognized as official and functional, but that the null gains in response evidenced by the Pressure Seal Mailer Test raised some questions.
From page 126...
... At the time of the Pressure Seal Mailer Test, reserving the "feels official" pressure seal mailer for the final mailing was not a good option because a third contact method (telephone interviewing) was still being used in the ACS.
From page 127...
... The roughly two-thirds self-response by the Internet in some of the experiments is very good, but something like pushing out the mailing of the paper questionnaire even further would be useful to test.
• Foreshadowing a theme discussed in the next session of the workshop, Williams suggested experimenting with "push to device" reminder contacts.
From page 128...
... But, more fundamentally, Wagner asked: What does "improving" mean in the ACS context, and is "improving response rates" really the relevant metric? Wagner suggested that one way to think about "improvement" is to reduce measurement error; in that case, the first day of the workshop and its discussion of survey data compared with administrative and third-party data suggests a very different set of metrics.
From page 129...
... Of the Pressure Seal Mailer Test, Wagner noted that the total response rates were not that different; the test was able to detect higher response rates for Treatment 3 (pressure seal mailer at both Mailing 2 and 5) relative to Treatment 2 (pressure seal mailer only at Mailing 2)
From page 130...
... Wagner closed by noting that the Data Slide and Mail Materials Tests both suggest the need for rethinking of objectives and quality metrics. The Data Slide offers interesting information, but it does not directly address concerns over privacy or other reasons why a respondent might knowingly decline to answer the survey.
From page 131...
... . In the exercise of constructing this strategic framework, a message is simply information that the Census Bureau wants to communicate to the recipient of the ACS mailing materials (a possible self-respondent)
From page 132...
... noted that the Census Bureau has conducted a third CBAMS study, in specific support of the 2020 census, and asked whether it would be incorporated into the strategic framework. Oliver replied that they would take a look, and Jonathan Schreiner (Census Bureau)
From page 133...
... From the outset, the strategic framework makes the decision that the mailing materials should most strongly target the segment of the ACS respondent audience that is cynical about participation and distrustful. He said that this is akin to the point that Mathieu had made earlier in discussing the Canadian approach: the materials ought not "waste effort" by structuring an extensive conversation with "easy" cases, whose nonresponse is simply a matter of not having had the time to respond rather than actively declining to respond.
From page 134...
... Oliver said that further research is needed on the demographic profile and other characteristics of this toughest of response cases, who have declined to respond to four previous appeals.
3.3.2 Review of How Current ACS Mail Materials Mesh With the Strategic Framework, and Next Steps
Jonathan Schreiner (U.S.
From page 135...
... Fragments from a table of mailing-material messages; recoverable pieces include: "The Census Bureau is not permitted to publicly ..."; "... Act of 2015, your data are protected from cybersecurity risks through screening of the systems that transmit your data"; "complete a ... very important national survey, the American Community Survey"; "The enclosed brochures answer frequently asked questions about the survey"; and message 2.3.7, "Highlight importance of the survey."
From page 136...
... Schreiner noted that "there's no magic number," no basis for identifying an optimal number of messages for a survey like the ACS or even an individual mailing -- but, by any reasonable standard, 129 messages is a lot to expect a respondent to process. The second lesson identified by Schreiner is that the messages are repeated verbatim or only lightly paraphrased across the current ACS mail materials.
From page 137...
... • Census will pay for your return postage: As Schreiner noted, a message that only makes sense in a later mailing that actually provides a paper questionnaire (to be mailed)
From page 138...
... and no explicit statement that the survey is being conducted by the Census Bureau. The mailing materials use both the Census Bureau's wordmark-style logo and the formal seal of the Commerce Department, but do so in a somewhat haphazard way, with both logos appearing in some mailings but the paper questionnaire and the reminder postcard featuring only the Commerce seal.
From page 139...
... Census Bureau, stating that my household has been randomly selected to complete the American Community Survey. Further, it must be done online at https://respond.census.gov/acs (but a paper questionnaire will be mailed if needed)
From page 140...
... Closing with a final lesson and observation that visual design is not consistent in the mailings, with wide variation in color, logos, graphics, and other style elements, Schreiner said that the Census Bureau is in the process of developing a revised set of mailing materials based on these lessons learned and the strategic framework. The Bureau's next steps will be cognitive testing of the revised
From page 141...
... But the bottom line remains the same, he said: The mail materials present "a lot of text to read" -- some of which is contradictory, redundant, or mixed messaging -- and he added the basic fact that some recipients "don't read everything they receive." In a broad sense, Schober said that the current mailing materials developed from "tried and true" procedures. The basic strategy of the ACS is to contact a sample of address-based households (rather than individual people)
From page 142...
... to either receive a link to start the survey or to opt in to text messaging as their preferred mode of participation? There are, to be sure, obvious constraints to this in the ACS context, Schober noted up-front -- constraints that are both operational and legal.
From page 143...
... The underlying survey offered four response modes: Human Voice (phone interview with a human interviewer) , Human Text (text message exchanges with a human interviewer)
From page 144...
... He observed that, in looking at the current ACS questionnaire, the estimate that the survey will take 20–40 minutes for an average household to complete is relatively buried in a block on the last page of the questionnaire; he asked whether this was a deliberate choice and whether that was a choice being revisited in the revised mailing materials. Oliver replied that the time estimate is required to be presented on the questionnaire (by the Office of Management and Budget, for clearance of the survey under the Paperwork Reduction Act)
From page 145...
... Oliver commented that part of the current research is to learn more about the characteristics of the people who get Mailing 5, who are still nonrespondents after several mail attempts. He added that the major challenge is similar between the census and ACS contexts: fine-grained tailoring of materials for segments of the mailout population is not feasible at such a large scale, and so the core mailout pieces have to be "things that are applicable to the entire respondent base." A major objective of the ACS mailing piece redesign is to make the language plainer and simpler, providing the right cues to respondents who just need awareness of the response options and mild encouragement to respond while also directing messages to harder
From page 146...
... 3.4 LISTENING TO ACS RESPONDENTS
The final session of the workshop examined the other side of the fundamental ACS communication dynamic: considering what information is learned about respondents' experiences in completing the ACS questionnaire or in deciding whether to participate at all.
3.4.1 Assessing High-Burden ACS Questions
Reduction of respondent burden is an oft-stated, high-priority goal for survey programs like the ACS, and so a natural starting point is to try
From page 147...
... , median response rates for each particular item by county, and limited data on the number of complaints received by the ACS Office about the question. This information was complemented with the Interviewer Survey, an interview with a sample of 1,053 field representatives or telephone interviewers (computer-assisted telephone interviewing being offered as a response mode at the time)
From page 148...
... Raglin said that the Interviewer Survey revealed that four ACS questions showed up as the most confusing to respondents or required additional probing by interviewers: • Type of Internet Access, which 79.4 percent of the interviewers deemed confusing and 79.3 percent said typically required additional probing. In this case, Raglin said, a main source of trouble seemed to be that people instinctively thought of the "brand" or name of their Internet service provider when asked the question and so would want to reply "Verizon," "Comcast," or the like.
From page 149...
... 78
H14 Cost of utilities (all) 71
P16 Health insurance coverage 53
P15b Residence 1 year ago 48
P4 Age and date of birth 36
P13 Ancestry 34
H22 Mortgage 31
P41 Class of worker 30
P42 Name of employer 30
SOURCE: Workshop presentation by David Raglin.
From page 150...
... This work has already produced some changes in ACS question wording and several other changes are due to take effect in the ACS in 2019. He would close his presentation by summarizing that the work has already resulted in progress in improving some burdensome questions, but that the Census Bureau needs to continue the work of question burden reduction "while still getting the information we need to provide data to our Federal users." As examples, Raglin displayed the before-and-after text of several questions in Figure 3.12.
From page 151...
... Figure 3.12 Recent and upcoming changes to American Community Survey questions based on cognitive review and burden assessment. Panel labels include Time Left for Work (Question 10) and (c) Changes to Retirement Income (Question 11).
From page 152...
... Census Bureau) 12 presented additional recent qualitative research regarding the development of questions on the subjective burden of the ACS, work that flowed out of the 2016 ACS workshop (National Academies of Sciences, Engineering, and Medicine, 2016)
From page 153...
... .13 Focus group participants were recruited by telephone, with recruitment targets driven by the person's ACS response mode above all. In particular, participant demographics were not considered as part of the focus group selection, and that information was only collected at the focus group settings.
From page 154...
... Holzberg said that it was somewhat relieving that the focus group survey suggested that ACS participants do not find the survey to be a major imposition. To the question "How burdensome was the ACS to you?
From page 155...
... Holzberg noted some key limitations of this focus group research, chief among them that, just by nature of the exercise, "the most burdened respondents" to the ACS are unlikely to have been recruited into the focus groups. Moreover, the time gap between ACS response time and focus group participation (which could have been "one month or more")
From page 156...
... After a Congressional hearing in 2012 on "The Pros and Cons of Making the Census Bureau's American Community Survey Voluntary,"14 the Office of the Respondent Advocate for household surveys was created in the Census Bureau in 2013. Two years later, a Business Respondent Advocate was added to provide the same respondent-focused support to the business surveys.
From page 157...
... Structurally, one major piece of its efforts is providing technical assistance and educational materials to the ACS data user community, conducting data user training and webinars and maintaining the ACS website. Scanniello said that the ACS Communications Area plays a similar role to that Chan had described for the Respondent Advocate: striving to improve the respondent and data user experience and to understand and respond to stakeholder concerns.
From page 158...
... Letters and emails are tracked, and the responses are maintained in a database. In terms of the major themes that the ACS Communications Area sees in its respondent cases, Scanniello broke down the 1,264 communications from 2017 as follows:
• The plurality of the easily classifiable communications, 333, questioned whether the ACS is legitimate or a scam;
• 241 were problems with the ACS Internet response channel, most often a lost user ID or PIN needed to continue ACS response;
• 151 were to convey refusal/declination to participate;
• 137 concerned confidentiality or privacy matters;
• 111 questioned the "intrusive" or "invasive" nature of ACS questions;
• 67 were questions about the mandatory nature of response to the ACS;
• 66 were questions about the constitutionality of the ACS;
• 42 were questions about why they were included in the ACS sample (the "Why Me?
From page 159...
... praised the Census Bureau's Office of the Respondent Advocate and said she tries to build awareness of ACS communications as much as possible. She asked Chan and Scanniello whether their outreach to congressional offices is planned in any systematic way (or what the triggering event to seek a meeting might be)
From page 160...
... opened his remarks with the quip that he had been asked to comment on burden reduction even though, as a public opinion researcher, he is probably someone who has spent a career "inflicting maximum burden" on respondents rather than thinking of reducing it -- "but I cared!" In seriousness, though, Keeter noted that survey researchers and data users like himself have a major stake in the continued viability of the ACS, and so it is encouraging that the Census Bureau is taking steps like this workshop and the research presented at it.
From page 161...
... He added that he liked the notion of making people happier with the survey-taking experience as a way of reducing burden, helping them understand the importance of the survey. Keeter also supported Michael Schober's suggestions about better leveraging text messaging and mobile device access in the ACS context; the ACS has been optimized for mobile but respondents are not
From page 162...
... She noted that, earlier in 2018, she participated with Don Dillman and others in a similar, thorough review of survey materials and contact strategies used by the National Center for Education Statistics in its surveys, with the objective of curbing declining response rates. She said that many of the materials in that review had not changed in many years; the ACS has had the comparative advantage of continuous refinement and testing, as evidenced by the day's presentations.
From page 163...
... decomposed the basic survey communication process into a sequence of interactions that have direct analogues to the push-to-web, starting-with-mail survey context.
• Steiger said that Bradburn's focus on the first 5 seconds of the interview pitch has a direct analogue in the envelope used in ACS mailing materials; it is the envelope that must quickly and immediately convey the survey's importance and legitimacy.
From page 164...
... .
Factors Impacting Whether Envelope Will Be Opened:
• Type of delivery service/class
• Type of mailer
• Mailer size
• Mailer color
• Addressee on mailer
• Return address of sender
• Graphics on mailer
• Branding logo on mailer
• Topical phrasing/teaser on mailer
• Endorsement on mailer
• Font style
• Font size
Factors Impacting Whether Letter Will Be Read/Recalled:
• Letterhead
• Branding logo
• Endorsements
• Font style
• Font size
• Salutation
• Reading level of text
• Length of text
• Text about benefits to respondent
• Text about purpose of the study
• Eligibility criteria
• Text about length and burden
• Text about privacy protections
• Text about contact info
• Text about sampling
• Text about data collection mode
• Nature of the signature
• Signatory's affiliation
• Credentials of signatory
• Sex of signatory
• Ethnicity of signatory
• Language of text
• Use of informational inserts
controlled experiments -- not based nearly enough in solid qualitative measures and assessments of the respondent experience.
From page 165...
... Steiger said that this detailed research should include ethnographic work in people's homes -- saying that she "would love to see research on people processing their mail," interacting with and responding to pressure seal mailers relative to standard envelopes, and talking through what things resonate with the potential respondents. She suggested more eye-tracking research -- of which the Census Bureau has already done some, but it should be done in a way to have a better sense of where the eye is actually lingering on the outer envelope.
From page 166...
... Finally, Steiger commented that an explicit qualitative research phase should be added to the Census Bureau's plan for implementing the strategic framework. In particular, as Schober noted, communications conditions change rapidly -- so new rounds of focus groups and cognitive testing are always worth pursuing.
From page 167...
... Velkoff agreed, saying that text messaging and other new modes of communication give the ACS ways to convey messages that have not been exploited in the federal survey context before. The question was raised about "incomplete starts" in the ACS -- whether the Census Bureau has any metrics on people starting the ACS questionnaire, breaking off for whatever reason, and never returning to it.
From page 168...
... But she said those discussions inevitably come back to not wanting to distract the respondent from responding, or leading someone off path. Jonathan Schreiner concurred, adding that "we really want people to go to that online response site, and not someplace else." One option is an improved "splash" page on entry to the Internet response site that gives both information and the link to complete the questionnaire.

