
3

Data Collection Methods

Beginning in 2013, when an Internet data collection mode was added, American Community Survey (ACS) data have been collected using four modes: mail, Internet, telephone, and personal visit. The first phase of data collection is a request sent by postal mail urging sample members to respond by Internet. The next step is another mailing that includes a paper questionnaire and offers the option of responding by Internet or mail. If no response has been received to these requests, a telephone follow-up is attempted, followed by an in-person visit to a subsample of the remaining nonrespondents. Table 3-1 shows the sequence of the overlapping follow-up steps for each monthly sample panel.

The goal of the Census Bureau’s multimode data collection strategy is to maximize response rates in a cost-effective manner. In 2012, prior to the implementation of the Internet option, the weighted distribution of responses by mode was as follows: close to half (48 percent) of the eligible sample addresses completed the survey by mail self-response, 7 percent by computer-assisted telephone interview (CATI), and 42 percent by computer-assisted personal interview (CAPI) (U.S. Census Bureau, 2014a).

Although the full 2013 data are not yet available, results from the first half of 2013 (January-June panels) indicate that a little over half of the surveys completed via self-response were received by Internet and the rest by mail. These early results also suggest that the availability of the Internet mode could provide a slight boost to the overall self-response rate (Baumgardner et al., 2014).

The first section below details the four ACS data collection modes. The following three sections cover nonresponse follow-up, adaptive design, and mode effects and data quality; the panel’s recommendations on these topics are at the end of each section.


TABLE 3-1 Sequence of Data Collection Steps for the ACS

(Columns show the month of data collection.)

| ACS Sample Panel | January | February | March | April | May | June |
|---|---|---|---|---|---|---|
| January | Mail/Internet | Telephone | In-person visit | | | |
| February | | Mail/Internet | Telephone | In-person visit | | |
| March | | | Mail/Internet | Telephone | In-person visit | |
| April | | | | Mail/Internet | Telephone | In-person visit |
| May | | | | | Mail/Internet | Telephone |
| June | | | | | | Mail/Internet |

SOURCE: U.S. Census Bureau (2009c).
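The overlapping schedule in Table 3-1 follows a simple pattern: each monthly panel is worked for three consecutive months, moving from mail/Internet to telephone to in-person follow-up. The short sketch below is only an illustrative restatement of the table; the function and names are hypothetical, not Census Bureau code.

```python
# Illustrative sketch of the Table 3-1 pattern: each monthly panel is worked
# in three consecutive months, one data collection phase per month.
MONTHS = ["January", "February", "March", "April", "May", "June"]
PHASES = ["Mail/Internet", "Telephone", "In-person visit"]

def panel_schedule(panel_month: str) -> dict:
    """Return {calendar month: phase} for one ACS sample panel."""
    start = MONTHS.index(panel_month)
    return {
        MONTHS[start + offset]: phase
        for offset, phase in enumerate(PHASES)
        if start + offset < len(MONTHS)  # later phases fall outside the window shown
    }

print(panel_schedule("April"))
# {'April': 'Mail/Internet', 'May': 'Telephone', 'June': 'In-person visit'}
```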


DATA COLLECTION MODES

Mail and Internet

The first mailing is an advance letter that alerts sample members to the survey and encourages participation. It is followed by a mail package, which includes instructions for how to respond through the Internet. A reminder postcard is sent a few days after the mail package. Sample members who do not respond after the reminder postcard are sent a replacement mail package, which includes a paper version of the questionnaire and a postage-paid envelope for a mail response. Instructions for responding by the Internet are also included. The package is followed by another postcard reminder. Sample members who do not have a telephone number that can be used for telephone follow-up receive an additional postcard, alerting them that a field representative will be contacting them in person if they do not respond by mail or Internet.
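The mail contact strategy described above is, in effect, an ordered protocol. The sketch below simply restates that sequence as data; the step labels and field names are illustrative, not official operational terms.

```python
# Schematic restatement of the ACS mail contact sequence described above.
# Step labels and field names are illustrative only.
MAIL_CONTACT_SEQUENCE = [
    {"step": 1, "mailing": "advance letter", "purpose": "alert household, encourage participation"},
    {"step": 2, "mailing": "initial mail package", "purpose": "instructions for responding by Internet"},
    {"step": 3, "mailing": "reminder postcard", "purpose": "sent a few days after the mail package"},
    {"step": 4, "mailing": "replacement mail package", "purpose": "paper questionnaire plus Internet instructions"},
    {"step": 5, "mailing": "second reminder postcard", "purpose": "follow-up to the replacement package"},
    {"step": 6, "mailing": "additional postcard", "purpose": "only for addresses without a telephone number; warns of an in-person visit"},
]

for item in MAIL_CONTACT_SEQUENCE:
    print(f"{item['step']}. {item['mailing']}: {item['purpose']}")
```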


Telephone Follow-Up

The Census Bureau uses sample members’ mailing addresses to attempt to identify a telephone number for follow-up (with the help of vendors that do address matching). The panel’s understanding is that this effort is currently limited to locating landline numbers and does not include cell phones.

Telephone follow-up begins about 5 weeks after the first mailing. The number of follow-up calls made to a household depends on the disposition of prior calls. For example, if the household refuses to participate by telephone, then one additional refusal conversion attempt is made in this mode. The Census Bureau has been conducting research on the optimum number of follow-up calls based on historical data about call outcomes (Zelenak, 2013).

In-Person Visit

After the mail, Internet, and telephone follow-up phases, the cases that have not yet been completed are subsampled for in-person follow-up. Mailable addresses are sampled at a 1 in 2, 2 in 5, or 1 in 3 rate, depending on the response rate expected at the census tract level. Unmailable addresses are sampled at a 2 in 3 rate.1 The in-person follow-up operation typically begins approximately 2 months after the first mailing.
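To make the subsampling rule concrete, the sketch below applies the rates quoted above (1 in 2, 2 in 5, or 1 in 3 for mailable addresses and 2 in 3 for unmailable addresses). The thresholds used to map expected tract-level response rates to a rate band are invented for illustration; the Census Bureau's actual cutoffs are not specified here.

```python
import random

# Illustrative sketch of CAPI subsampling at the rates cited above.
# The response-rate thresholds are guesses for illustration only.
def capi_subsampling_rate(mailable: bool, expected_tract_response: float) -> float:
    if not mailable:
        return 2 / 3                      # unmailable addresses: 2 in 3
    if expected_tract_response >= 0.6:
        return 1 / 3                      # high expected response: 1 in 3
    if expected_tract_response >= 0.4:
        return 2 / 5                      # moderate expected response: 2 in 5
    return 1 / 2                          # low expected response: 1 in 2

def select_for_capi(cases, rng=random.Random(2015)):
    """Return the subset of unresolved cases selected for in-person follow-up."""
    return [c for c in cases
            if rng.random() < capi_subsampling_rate(c["mailable"], c["expected_rr"])]

cases = [{"id": i, "mailable": i % 4 != 0, "expected_rr": 0.3 + 0.05 * (i % 10)}
         for i in range(1000)]
print(len(select_for_capi(cases)), "of", len(cases), "cases sent to CAPI")
```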

Each case is assigned to a field representative, who first attempts to complete the interview by telephone, except when the household has already refused by telephone or the address was deemed unmailable; in those cases the field representative must visit the location to determine whether the housing unit exists and to establish its occupancy status. For most cases, an in-person visit is attempted only after three to five calls have been made at different times of the day. An in-person visit is ultimately needed for approximately 80 percent of the cases assigned to the field. Although CATI refusals are slightly more likely to also end as refusals in CAPI, field representatives are generally very successful at obtaining an interview, with a completion rate of over 95 percent (Zelenak, 2013).

Group Quarters Data Collection

As defined by the Census Bureau, group quarters are places where people live or stay in a group living arrangement and receive housing and services from an organization or other entity. This definition encompasses

______________

1All eligible addresses within designated Hawaiian homelands, Alaska native village statistical areas, and some American Indian areas are included in the personal visit follow-ups without subsampling.


such facilities as college dormitories, nursing homes, and correctional facilities. As did the census long-form sample, the ACS aims to be as comprehensive as possible in representing the entire U.S. population and therefore includes people living in nearly all forms of group quarters.

The group quarters sample is separate from the housing unit sample, and the data collection process is also different to address the unique challenges associated with interviewing in the context of such facilities. All group quarters cases are assigned to a field representative, who visits the facility after an initial mailing that introduces the survey has been sent. During the visit, the field representative obtains a roster of the residents, which is then used to generate a sample of individuals to interview. Up to 15 residents are interviewed at each facility included in the survey. An earlier report (National Research Council, 2012) examined the effect of the group quarters on the American Community Survey estimates and recommended changes to the survey design and operations.

NONRESPONSE FOLLOW-UP

As described above, cases that cannot be completed through CATI are subsampled for CAPI follow-up. Efficient design of a multimode survey depends on good information about the costs and contribution to survey accuracy of each phase of the survey. The subsampling rates (1 in 2, 2 in 5, or 1 in 3, noted above) are determined by taking into account the costs of CAPI relative to other data collection modes.

However, the Census Bureau’s approach to tracking costs is not well adapted to monitoring costs per completed interview by mode, much less distinguishing the costs of early respondents from those of respondents who require multiple contact attempts before an interview is obtained. A high-level analysis (Griffin and Hughes, 2012), which took into account the costs of unsuccessful follow-up attempts for nonrespondents, estimated that completed telephone interviews cost about three times as much as questionnaires returned by mail, and that in-person interviews (including those conducted over the telephone by field representatives) cost about six times as much as questionnaires completed by mail. These cost estimates are very rough and cannot be separated with reasonable precision into interview-related and other costs, or into marginal and fixed costs. More precise estimates might reveal an even larger gap between the cost of mail responses and the cost of in-person interviews, especially in-person interviews that require multiple contact attempts, which, in the experience of panel members, are frequent.
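As a rough illustration of what these ratios imply, the sketch below combines the relative cost figures cited above (mail ≈ 1×, telephone ≈ 3×, in-person ≈ 6×) with the 2012 mode shares quoted earlier in this chapter to compute a blended cost per completed interview. The base mail cost is an arbitrary unit, and combining figures from different sources in this way is purely illustrative.

```python
# Back-of-the-envelope use of the rough cost ratios cited above
# (mail ≈ 1x, telephone ≈ 3x, in-person ≈ 6x the cost of a mail return).
# Mode shares are the 2012 weighted distribution quoted earlier in this chapter;
# the base mail cost is a made-up unit for illustration.
RELATIVE_COST = {"mail": 1.0, "cati": 3.0, "capi": 6.0}
MODE_SHARE = {"mail": 0.48, "cati": 0.07, "capi": 0.42}   # shares of completed cases

def blended_cost_per_complete(base_mail_cost: float = 1.0) -> float:
    total_share = sum(MODE_SHARE.values())
    return sum(MODE_SHARE[m] * RELATIVE_COST[m] * base_mail_cost
               for m in MODE_SHARE) / total_share

print(round(blended_cost_per_complete(), 2))
# ≈ 3.3 — in-person cases dominate the blended cost despite being under half of completes
```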

Census Bureau staff have indicated that in the future they intend to refine the way they track cost data to better understand the costs associated with each interview. Such operational cost modeling is essential if the Census Bureau is to optimize the distribution of effort across the different survey modes, including subsampling rates for the CAPI nonresponse follow-up (and perhaps for telephone follow-up as well) and the number of callbacks for telephone and in-person operations. It would be particularly useful to have analyses of the costs and yield for each data collection mode by geographic area (e.g., by tract characteristics), as well as the marginal costs associated with changing the number of follow-up attempts.

It is important to note that the nonresponse follow-up subsampling rates were developed before the full implementation of the ACS and before the implementation of the Internet response option, which might affect response rates in other modes. On the basis of testing conducted before the Internet response was introduced, the Census Bureau anticipated that in addition to generating savings in areas such as printing, mailing, and data capture, the availability of the Internet response option might also lead to a slight overall increase in self-response. Thus, it is possible that since the time when the ACS was first launched, the optimal rates for subsampling have shifted considerably.

In addition to cost factors, several other considerations are relevant to follow-up design. Reaching sample households by telephone to complete a survey is becoming increasingly difficult as more households drop landlines in favor of cell phones and rely on such technologies as voicemail, caller ID, call blocking, and privacy managers to screen unwanted calls. The inefficiencies associated with a high number of unproductive calls can be especially challenging for surveys, such as the ACS telephone follow-up, that do not include cell phone numbers.

At the same time, even if telephone calls are less likely to result in a successful interview, a large number of calls to a household can contribute to the perceived burden of the survey. To address concerns of respondent burden associated with follow-up efforts, the Census Bureau has been conducting research on ways of adjusting the specifications for the number of follow-up attempts in both the telephone and in-person modes to reduce the overall number of contact attempts without an adverse effect on costs and data quality. One study found that moving cases from CATI to CAPI after fewer call attempts could increase efficiency (Griffin, 2013).

In addition to respondent burden considerations, an important question concerns the extent to which follow-up improves the representativeness of the sample. Because follow-up cases are typically more expensive than responses obtained from initial contact modes or contact attempts, they are an inefficient use of resources unless they reach a part of the population that differs from the part reached in earlier, less expensive operations. It is possible that in some geographic areas or for some demographic groups, higher response rates due to more extensive follow-up do not substantially change the estimates. In some cases, efforts made to increase the response rate by a few percentage points (such as through additional CATI attempts) can lead to interviews completed disproportionately among specific subgroups, which could even adversely affect the representativeness of the overall sample. As shown in Table 3-2, there are demographic differences in the likelihood of responding at each stage of the follow-up process. As a consequence, if follow-up were reduced on the basis of careful analysis, the resulting lower response rates would not necessarily lead to substantially different estimates once the data were weighted for differential nonresponse.


TABLE 3-2 Demographic Representation Among Completed Interviews After Each Follow-Up Mode (in percentage)

(The first three columns show the percentage of the population represented by the weighted data after each cumulative stage; the last three show the weighted percentage of responses obtained in each mode.)

| Demographic Group | After Mail | After Mail and Telephone | After Mail, Telephone, and In-Person Visit | Mail | Telephone | In Person |
|---|---|---|---|---|---|---|
| Total Population | 50 | 62 | 91 | 55 | 13 | 32 |
| Male | 49 | 61 | 90 | 54 | 13 | 32 |
| Female | 51 | 63 | 92 | 55 | 13 | 32 |
| Hispanic | 26 | 40 | 87 | 30 | 16 | 54 |
| White Alone | 52 | 62 | 87 | 60 | 11 | 29 |
| Black or African American Alone | 27 | 40 | 82 | 33 | 16 | 51 |
| American Indian or Alaska Native Alone | 29 | 39 | 75 | 39 | 13 | 48 |
| Asian Alone | 51 | 60 | 89 | 57 | 10 | 33 |
| Under Age 5 | 41 | 52 | 88 | 47 | 13 | 41 |
| Age 18 and Over | 52 | 64 | 91 | 57 | 13 | 30 |
| Age 65 and Over | 70 | 80 | 96 | 73 | 10 | 17 |

SOURCE: Data from 2003 ACS. For columns 1-3, Griffin and Raglin (2011); for columns 4-6, panel calculations.


Using existing data and paradata (data on survey operations), the Census Bureau could simulate the effect of truncating nonresponse follow-up in areas with various characteristics, combined with various methods of adjusting for nonresponse, such as nonresponse weighting of respondents. If such a study were to find that the effect of truncation on estimates is minimal in identifiable areas, then resources could be redirected from follow-up to increasing overall sample size in the same or other areas. The optimal subsampling rate for the in-person follow-up could also be assessed in terms of marginal variance reduction.
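A minimal sketch of such a simulation is shown below, under strong simplifying assumptions: late-stage (CAPI) respondents are dropped, the weights of the remaining respondents are inflated within weighting cells, and the truncated estimate is compared with the full-sample estimate. The column names and the single-stage cell adjustment are hypothetical; the actual ACS weighting system is far more elaborate.

```python
import pandas as pd

# Minimal sketch of the truncation simulation discussed above.
# "mode", "weight", "cell", and "poverty" are hypothetical column names;
# the real ACS weighting system is far more complex than this cell adjustment.
def full_estimate(df: pd.DataFrame, y: str = "poverty") -> float:
    return (df[y] * df["weight"]).sum() / df["weight"].sum()

def truncated_estimate(df: pd.DataFrame, drop_modes=("capi",), y: str = "poverty") -> float:
    """Re-estimate a mean after dropping late-stage respondents and
    reweighting the remaining respondents within weighting cells."""
    kept = df[~df["mode"].isin(drop_modes)].copy()
    cell_totals = df.groupby("cell")["weight"].sum()       # target weight per cell
    kept_totals = kept.groupby("cell")["weight"].sum()     # retained weight per cell
    factor = (cell_totals / kept_totals).rename("nr_factor")
    kept = kept.join(factor, on="cell")
    kept["adj_weight"] = kept["weight"] * kept["nr_factor"]
    return (kept[y] * kept["adj_weight"]).sum() / kept["adj_weight"].sum()

# Tiny synthetic example: late (CAPI) respondents differ on the outcome,
# so truncation shifts the estimate even after reweighting.
demo = pd.DataFrame({
    "mode":    ["mail", "internet", "capi", "capi", "mail", "capi"],
    "cell":    [1, 1, 1, 2, 2, 2],
    "weight":  [10, 12, 30, 25, 15, 20],
    "poverty": [0, 0, 1, 1, 0, 0],
})
print(round(full_estimate(demo), 3), round(truncated_estimate(demo), 3))
```

Comparing the two estimates across areas with different characteristics would indicate where dropping or reducing CAPI follow-up materially changes the results.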

An alternative, but related, approach of applying small area estimation methods would begin by modeling associations between estimates for the CAPI population and those for the self-response population by small geographic area (tract or block group). If the latter are highly predictive of the former, then it might be possible to reduce CAPI sample sizes and use model-based predictions to maintain the precision of estimates for the population of households who would respond only by CAPI (O’Malley and Zaslavsky, 2007). Alternatively, further subsampling could be imposed on extended CAPI follow-up if the cost per additional interview is found to be highest in later phases. Savings could then be used to increase initial sample sizes, especially in areas with low mail and Internet response rates. The large annual sample of the ACS and its repetition over time are important features for efficient estimation of such models (see the discussion of small area estimation in Chapter 4). Simple truncation of data collection assumes that early and late respondents are equivalent conditional on the covariates used in weighting; modeling can make less restrictive, more flexible assumptions that account for observed differences between them.
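The sketch below is a highly simplified stand-in for this modeling idea: tract-level estimates for the CAPI population are regressed on estimates from the self-response population, and the fit indicates how well model-based predictions could substitute for direct CAPI estimates. The data are synthetic, and the ordinary least squares fit is only a placeholder for the richer hierarchical models that would be used in practice.

```python
import numpy as np

# Simplified stand-in for the small-area idea described above: regress tract-level
# estimates for the CAPI-only population on estimates from the self-response
# population, then consider model-based predictions where CAPI samples shrink.
rng = np.random.default_rng(21653)
n_tracts = 500
self_resp_est = rng.normal(50, 10, n_tracts)                      # synthetic self-response estimates
capi_est = 5 + 0.9 * self_resp_est + rng.normal(0, 3, n_tracts)   # synthetic CAPI-population estimates

# Ordinary least squares fit of CAPI estimates on self-response estimates.
X = np.column_stack([np.ones(n_tracts), self_resp_est])
beta, *_ = np.linalg.lstsq(X, capi_est, rcond=None)
predicted_capi = X @ beta

resid_sd = np.std(capi_est - predicted_capi)
print(f"slope={beta[1]:.2f}, residual SD={resid_sd:.2f}")
# A residual SD that is small relative to the sampling error of direct CAPI
# estimates would suggest predictions could replace part of the CAPI sample.
```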

From a survey operations perspective, the most critical next step is to implement a much more precise system for tracking data collection costs than currently exists. Although this will involve some up-front costs due to changes or additions needed to the current survey management systems, the precise tracking of all aspects of the data collection costs is essential to inform the ongoing work to optimize the allocation of resources. A system that closely tracks data collection costs will clearly have significant long-term payoff for a survey of the scale of the ACS.

Two of the panel’s recommendations below for specific research projects relate to data collection methods and would need to be informed, in part, by cost analyses. Although the Census Bureau’s resources for research projects are limited, the panel considers at least the initial stages of this type of evaluation to be relatively low-cost investments that could identify options for increased efficiency and opportunities to free up resources that could be reinvested in changes that would ultimately improve overall data quality.


RECOMMENDATION 4: As a priority, the Census Bureau should develop systems for tracking American Community Survey data collection costs as precisely as possible, overall and by data collection mode.

RECOMMENDATION 5: Taking into account cost and yield and their variation across areas, the Census Bureau should periodically evaluate the optimal subsampling rate for the American Community Survey, as well as the number of follow-ups in each mode.

RECOMMENDATION 6: The Census Bureau should evaluate the possibility of improving the American Community Survey’s accuracy at a fixed cost by truncating nonresponse follow-up or using modeling techniques to replace some of the nonresponse follow-up, particularly for the more expensive data collection modes.

ADAPTIVE DESIGN

As discussed above, recent research indicates that the relationship between response rates and data quality is complex, and higher response rates do not necessarily lead to better data quality (Groves, 2006; Groves and Peytcheva, 2008). Moreover, aiming for the highest possible response rate can be costly and can increase respondent burden. Adaptive design, rooted in the total survey error perspective (Weisberg, 2005), increases efficiency in fieldwork management by aiming for an optimal balance between data quality considerations and costs. An important aspect of this approach is the ability to closely monitor quality and cost indicators and to make timely adjustments.

Specifically, adaptive survey design strategies

  • pre-identify a set of survey design features potentially affecting costs and errors of survey estimates;
  • identify a set of indicators of the cost and error properties of those features and monitor those indicators in the initial phases of data collection; and
  • alter the features of the survey in subsequent phases, as needed, based on cost-error tradeoff decision rules.

Although the Census Bureau’s efforts in this area are likely hindered by a lack of cost information at sufficient granularity, the ACS already incorporates several design features that are essentially characteristics of adaptive design. Some examples include sending an additional postcard to addresses that do not have telephone numbers for CATI follow-up; relying on call history information to determine the next call attempt in CATI; and subsampling for CAPI follow-up based on expected response rates at the tract level.
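A toy decision rule of the kind implied by the three-step framework above is sketched below. The indicator names (e.g., a representativeness indicator) and the thresholds are hypothetical, not ACS specifications; the point is only to show how monitored cost and quality indicators could drive mid-collection changes in follow-up effort.

```python
# Toy illustration of a cost-error tradeoff rule of the kind described above.
# Indicator names and thresholds are hypothetical, not ACS specifications.
def next_phase_action(tract: dict) -> str:
    """Decide follow-up effort for a tract partway through data collection."""
    if tract["self_response_rate"] >= 0.55 and tract["r_indicator"] >= 0.85:
        return "reduce CAPI subsampling"        # sample already representative enough
    if tract["cost_per_complete"] > 4.0 and tract["cati_contact_rate"] < 0.10:
        return "skip remaining CATI attempts"   # telephone effort is unproductive here
    return "standard follow-up"

example = {"self_response_rate": 0.38, "r_indicator": 0.72,
           "cost_per_complete": 5.2, "cati_contact_rate": 0.06}
print(next_phase_action(example))   # -> "skip remaining CATI attempts"
```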

The ACS offers an especially attractive vehicle for adaptive design research because of the continuous data collection in all four modes, and the research could benefit many of the other surveys at the Census Bureau. The research to date has focused primarily on analyzing information from contact attempts and outcomes to tailor subsequent attempts to contact the household, for example, by switching between data collection modes based on paradata or modeling.

Currently, the Census Bureau’s centralized telephone operations conduct interviews only on landlines. Although there are some special considerations when including cell phone numbers in a survey (e.g., the numbers cannot be autodialed), research on the possibility of integrating cell phones into the follow-up operations would be worthwhile given the rapid increase in the proportion of households that rely on cell phones and do not have landlines. As noted above, field interviewers complete approximately 20 percent of their workload by telephone rather than through an in-person visit, and these calls are presumably limited primarily to landlines. When an in-person visit is made, field representatives could also collect cell phone numbers if additional contact with the household is likely to be needed (e.g., if the interview cannot be completed at that time and the field representative expects to call back). The optimal approach for switching between landlines and cell phones could also be investigated as part of an adaptive design strategy during the field follow-up.

RECOMMENDATION 7: The Census Bureau should conduct research on potential ways of identifying cell phone numbers associated with adult household members and instruct American Community Survey field interviewers in proper protocols for calling cell phone numbers, as needed.

The Census Bureau has several ongoing research projects to identify ways in which adaptive design can be further incorporated into the survey. One study augmented the sample frame data from the 2011 ACS Internet test with administrative records data and developed a discrete-time logistic model to predict household-level daily Internet response propensities and optimize mode-switch strategies, focusing on switching Internet nonrespondents to mail (Chestnut, 2013). Table 3-3 shows the percentage of cases from each administrative records source that could be linked; most sources were linked at a high rate, with the exception of the Internal Revenue Service database.


TABLE 3-3 Administrative Records Data Sources Used in the Mode Switch Study

| Administrative Record Data Source | Variables | ACS Internet Test Sample Linked by Master Address File Identification Number (%) |
|---|---|---|
| 2010 Census—Housing Unit Response Data File | Self-administered questionnaire, language of interview or questionnaire, proxy respondent | 96 |
| 2010 Census—Edited Household Data File 2 | Householder—age, race, and Hispanic origin; tenure; large household | 96 |
| 2010 Census—Edited Person Data File | No spouse, not related | 87 |
| 2010 Census—Unedited Operation Data File | Mail enumeration area, response check-in date | 99 |
| Master Address File | Urban or rural | 100 |
| Info USA | Do-not-call flag, high-tech household | 85 |
| U.S. Postal Service—National Change of Address Database | Change of address flag | 100 |
| National Telecommunications and Information Administration | Broadband flag | 96 |
| Internal Revenue Service | Total income reported for 2010 (Form 1040) | 66 |
| 2010 Census—Advertising | Targeted stratum | 100 |

SOURCE: Chestnut (2013, p. 4).

The study found that timeliness could be increased by switching some cases that are not likely to respond by Internet to other follow-up modes sooner, or even prior to data collection.

These efforts are promising investigations into ways of increasing efficiencies in data collection. Models could be developed to include a variety of administrative records, paradata, and design characteristics. In particular, use of adaptive design strategies might necessitate concomitant modification of estimation procedures.
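As one illustration of this modeling direction, the sketch below fits an ordinary logistic regression to synthetic household-day records as a simplified stand-in for the discrete-time model described above. The covariates (a broadband flag and a high-tech-household flag) are loosely modeled on the Table 3-3 sources but are simulated here; none of this reflects the actual model specification in Chestnut (2013).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simplified stand-in for a discrete-time logistic model of daily Internet
# response propensity.  Each record is a household-day; all data are synthetic.
rng = np.random.default_rng(3)
n = 20_000
day = rng.integers(1, 15, n)                 # days since the first mailing
broadband = rng.integers(0, 2, n)            # loosely modeled on the NTIA broadband flag
high_tech = rng.integers(0, 2, n)            # loosely modeled on the Info USA high-tech flag
logit = -4.0 + 0.8 * broadband + 0.5 * high_tech - 0.05 * day
respond_today = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([day, broadband, high_tech])
model = LogisticRegression().fit(X, respond_today)
print(dict(zip(["day", "broadband", "high_tech"], model.coef_[0].round(2))))

# Households with low predicted propensity over the first two weeks would be
# candidates for an earlier switch to mail follow-up, as in the study above.
new_household = np.array([[3, 0, 0]])        # day 3, no broadband, not high-tech
print("predicted daily propensity:", model.predict_proba(new_household)[0, 1].round(3))
```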


RECOMMENDATION 8: The Census Bureau should continue to conduct research on how adaptive design techniques can benefit the American Community Survey.

RECOMMENDATION 9: The Census Bureau should continue to investigate the use of auxiliary data to develop nonresponse models for the American Community Survey.

MODE EFFECTS AND DATA QUALITY

As noted above, the Census Bureau began offering an Internet response option in January 2013. Early estimates indicate that over half of the self-responses (mail or Internet) are being completed using the Internet option. To date, relatively little research has been published evaluating the Internet mode since its implementation, and the Census Bureau has not yet conducted formal studies of mode bias. However, tests conducted prior to the implementation found few significant differences in response error between the Internet and mail modes and very low response error rates for most estimates examined (Horwitz et al., 2012).

An initial look at item nonresponse, using the raw (unedited) data from the 2013 January panel, compared item nonresponse rates between the mail and Internet modes and found that Internet nonresponse was lower than mail nonresponse for most of the survey items in the study (Clark, 2014). In the case of the basic demographic questions, item nonresponse rates were about 1-6 percentage points lower in the Internet mode than the mail mode. Internet nonresponse rates were also lower than mail nonresponse rates for all of the questions in the housing section.

Some housing items that require respondents to provide a dollar amount had particularly high nonresponse rates in the mail mode but produced much more complete data in the Internet mode. For example, the nonresponse rate for the question on gas costs was 14 percent in the mail mode and 4 percent in the Internet mode. The nonresponse rate for the question on fuel costs was 19 percent in the mail mode and 3 percent in the Internet mode. These questions are structured differently in the two modes, with the Internet instrument taking advantage of the easier integration of skip patterns and screening questions to facilitate responding.
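The sketch below illustrates the kind of screening-and-skip logic an Internet instrument can apply to a dollar-amount item such as gas costs. The wording, response categories, and function are paraphrased inventions for illustration, not the actual ACS questions or instrument code.

```python
# Sketch of screening-and-skip logic for a dollar-amount item such as gas costs.
# The wording and categories are paraphrased for illustration, not ACS wording.
def ask_gas_cost(uses_gas, included_in_rent, monthly_cost=None):
    if not uses_gas:
        return {"gas_cost": 0, "skip_reason": "household does not use gas"}
    if included_in_rent:
        return {"gas_cost": None, "skip_reason": "included in rent or condo fee"}
    if monthly_cost is None:
        return {"gas_cost": None, "skip_reason": "item nonresponse"}
    return {"gas_cost": monthly_cost, "skip_reason": None}

print(ask_gas_cost(uses_gas=False, included_in_rent=False))
# On paper, a respondent in this situation may simply leave the item blank,
# which is counted as nonresponse; the screener records a valid zero instead.
```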

Internet nonresponse rates were also lower than the mail rates for most of the population questions, including large differences for the income questions. For example, the nonresponse rate for the total income question was 23 percent in the mail mode and 13 percent in the Internet mode. However, there were a few population questions for which the Internet nonresponse rate was somewhat higher than the mail rate, including place of birth, citizenship, language other than English, health insurance, hearing difficulty, and vision difficulty. These differences were generally smaller (3 percentage points or less) than most of the item nonresponse differences that were favorable to the Internet mode.

The differences that were not favorable to the Internet mode could again be associated with differences in the way the questions are asked in the two modes. If additional research can shed further light on the potential causes of these differences, the flexibility of the Internet mode might enable the Census Bureau to bring the Internet nonresponse rates below the mail nonresponse rates for these items as well. The evaluation of the 2013 January panel (Clark, 2014) revealed that some of the item nonresponse in the Internet mode was associated with breakoffs (survey responses that were started but not completed). Measured breakoffs tend to be higher in the Internet mode than in the mail mode, partly because the Census Bureau can track breakoffs that happen on the Internet but has no information about mail questionnaires that someone started to fill out but did not complete and did not return. Further analysis of the breakoff patterns would be useful.

Some of the research conducted as part of the Internet test provides an indication of the many ways the unique features of the Internet mode can be used to improve not only the Internet instrument but also the survey itself. For example, one study analyzed paradata from the help link provided to respondents and found that the help link was requested at least once by approximately 40 percent of respondents and that 14 percent of all requests for help involved the ancestry question (Horwitz et al., 2013). These rates are surprisingly high in comparison with prior research that seems to suggest that help links are not frequently used by survey respondents (Conrad et al., 2006). The finding indicates that this feature of the Internet mode may be providing very useful assistance to respondents that the mail mode lacks. Indeed, the research revealed that in over half of the cases (54.8 percent) when the help link was accessed, the information appears to have been used to generate an answer when no response option had previously been selected, and that, in a small number of cases (5.1 percent), the help requests resulted in a changed answer. However, the high reliance on the help link raises some questions about the degree of difficulty associated with many of the questions overall, regardless of the mode of administration: this issue, too, would benefit from further research attention.

RECOMMENDATION 10: The Census Bureau should conduct a thorough evaluation of potential mode effects on both data quality and nonresponse in the American Community Survey, focusing in particular on the newly introduced Internet mode.


During the 2011 Internet test, less than 3 percent of respondents used a device other than a personal computer (such as a tablet or cell phone) to access the web-based survey instrument (Horwitz et al., 2013). Because a version customized for such devices was not available, it is difficult to know whether more people would have tried to complete the survey on a mobile device if a tailored option had been available. Because the use of mobile devices for functions previously performed on a personal computer is growing quickly, it is important to continue to monitor this trend and to begin planning for a survey instrument that works well on the majority of mobile devices. Work in this area could also serve as a test vehicle for the 2020 census.
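Device type can often be inferred from paradata such as the browser user-agent string. The sketch below is a deliberately crude classifier meant only to illustrate the idea; the patterns are simplistic and do not represent the Census Bureau's methods.

```python
import re

# Crude illustration of classifying response device from user-agent paradata.
# The patterns are simplistic and purely illustrative.
def device_type(user_agent: str) -> str:
    ua = user_agent.lower()
    if re.search(r"ipad|tablet|kindle", ua):
        return "tablet"
    if re.search(r"iphone|android.*mobile|windows phone", ua):
        return "smartphone"
    return "desktop/laptop"

print(device_type("Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) ..."))  # smartphone
print(device_type("Mozilla/5.0 (Windows NT 6.1; WOW64) ..."))                    # desktop/laptop
```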

RECOMMENDATION 11: The Census Bureau should conduct research to understand what types of devices are used by American Community Survey respondents to connect to the Internet and whether there are any associated data quality implications.


