A surprising degree of uncertainty surrounds one seemingly simple point of fact about the U.S. Census Bureau’s American Community Survey (ACS): When did the ACS begin? Some statements cast the ACS as a very new survey—largely through comparison with the 2010 decennial census, in which the ACS replaced the “long-form” questionnaire administered in previous decades to a sample of respondents. Others claim a longer lifetime for the concepts and questions of the ACS, variously describing the ACS as dating back to the first decennial census in 1790, which set the precedent of asking for more detail than a simple headcount; to 1940 and the first major use of statistical sampling for some content in a decennial census; or to 1960, the first census in which a full-fledged long-form questionnaire was administered.
In fact, a case can be made for all of the years listed above—and others—because some of the ideas behind the ACS are indeed as old as American census-taking itself. In its content, the 1790 census stuck to the basic information suggested in the Constitution’s enumeration clause, but its execution established a basic, subtle precedent that underlies today’s ACS and census: data on aggregates such as communities, states, or the nation as a whole are acquired by asking questions of individuals (in the 1790 case, asking heads of household for a name-by-name accounting of family members). The range of those questions began to widen over the course of the 19th century—more detailed age categories in 1800, rough indicators of occupation in 1810, queries on disability (deaf or blind) in 1830, and later expansion into topics ranging from educational status to amounts of wages earned and taxes paid. The 1940 census began the practice of covering some of the now-sprawling content of the census on a sample basis, asking certain supplemental questions only of those individuals appearing on certain lines of the census schedule. The 1960 census formalized the process, creating a separate long-form questionnaire; most households received the short-form questionnaire containing only basic data items while a roughly one-sixth sample received the more detailed long form.
As the 2000 census approached, a notion originally advanced in the early 1940s as a possible replacement for the census began to gain traction, albeit as a replacement for the long-form sample rather than the entire census. Sociologist and demographer Philip Hauser, working at the Census Bureau in 1941, is generally credited with first advancing the idea of an “annual sample census” to generate small-area estimates. Increasing calls for more timely estimates led, in part, to Congress amending Title 13 of the U.S. Code in 1976 to authorize a mid-decade census in years ending in 5, but no action was taken under the new law. In 1981, statistician Leslie Kish proposed mounting a continuous sample survey, creating “rolling” samples by accumulating one or more years of collected data and using those samples to create estimates; the Decade Census Program suggested by Roger Herriot and others in the late 1980s advanced similar notions of using rolling samples for state-level estimates (Herriot et al., 1989). Drawing from these ideas, the Census Bureau’s Charles Alexander took the lead in assessing options to replace the long-form sample, which had become a greater drain on census resources (and whose quality, conducted as it was as an adjunct to the main census, had arguably slipped). Alexander’s proposed Continuous Measurement Survey would pick up a new moniker—the American Community Survey—as it entered very early pilot testing in four counties in 1996. The number of test sites (counties or groups of counties) was slowly ramped up until 2000. In that year, one of the formal experiments associated with the 2000 census was to dramatically scale up ACS coverage to roughly one-third of all counties—mainly as a test of the feasibility of conducting both the ACS and a decennial census simultaneously.
Bolstered by favorable results, the ACS continued collection at these so-called Census 2000 Supplementary Survey levels (after the name of the 2000 census experiment) until 2005, when the ACS began operations in all counties nationwide.
To secure approval for full-scale implementation, the ACS was made an integral part of plans for the 2010 decennial census—the basic bargain being that shifting the historical long-form content to the ACS would free the main 2010 census to be conducted as a “short-form only” count. In this spirit, the content and questions of the ACS were closely patterned after the long-form sample questionnaire used in the 2000 census. As years have passed, new questions have gradually been added to the survey. For instance, questions on health insurance coverage (discussed more fully in Chapter 2) and marital history were added after testing in 2006, and the Census Bureau plans to add a question on Internet access beginning in 2013.1 As implemented in 2012, the ACS questionnaire contains 75 numbered questions:
- A lead respondent for the household is guided through the process of answering six roster questions about each member of the household; these questions include each person’s name, relationship to the lead respondent, age, and race and Hispanic origin.2
- The lead respondent is then asked 21 “housing” questions; these include attributes of the housing stock (e.g., age of construction of the building, presence or absence of plumbing and cooking facilities, and the amount spent in the previous month for electricity) and general characteristics of the household (e.g., monthly rent, whether the property is owned with a mortgage or loan, and whether anyone in the household received Supplemental Nutrition Assistance Program/Food Stamp benefits in the previous year).
- Finally, up to 48 numbered “person” questions are asked of each person in the household (some questions may be skipped depending on earlier responses); these questions include U.S. citizenship, language other than English spoken at home, place of residence 1 year ago, fertility (whether the person gave birth in the past year), employment status, and various types of income in the past year.
In brief, the basic parameters of the ACS are that on the order of 295,000 housing units—spanning all counties in the United States—are selected for the ACS sample every month (or roughly 3.5 million housing units per year).3 Those sample units are contacted by mail during their first month in the sample; in the second month, those who failed to respond by mail are selected for contact/interview by telephone, if a phone number can be determined; in the third month, a subsample of those who have responded by neither mail nor phone is selected for personal interview by Census Bureau field staff. In this way, the Census Bureau is continuously collecting data across all three modes: the data actually collected in month M include mail returns from the new housing units added to the sample in month M, phone interviews with some people from the
1The Census Bureau archives the questionnaires used in various years of ACS implementation on its website at http://www.census.gov/acs/www/methodology/questionnaire_archive/. Throughout this summary, when appropriate, question text is excerpted from the 2012 version of the questionnaire, the direct link to which is http://www.census.gov/acs/www/Downloads/questionnaires/2012/Quest12.pdf.
2This full detail is asked about only the first five people in the household; only name, sex, and age are asked about Persons 6–12, and the additional detail may be collected about these people in a followup phone interview.
3The full-scale ACS operated on a monthly sample of about 250,000 households until the increase to 295,000 with the June 2011 mailing. A separate sample of roughly 200,000 residents of group quarters—places like college dormitories, nursing homes, military barracks—is also covered in the ACS each year. The unique challenges presented by group quarters are largely beyond the scope of this workshop, but are addressed in detail by National Research Council (2012).
month M – 1 sample (plus late mail returns from that group), and personal visit interviews with some people from the month M – 2 sample (also including late mail returns from that group). The Census Bureau intends to implement an Internet response option for the ACS beginning in January 2013, rather than relying solely on return of the mailed paper questionnaire.
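As a rough illustration, the three-month, three-mode contact schedule described above can be sketched in a few lines of code. This is purely an illustrative sketch—the function name and structure are invented for exposition, and this is not Census Bureau code:

```python
# Illustrative sketch (not Census Bureau code) of the ACS contact
# schedule: a housing unit entering the sample in month M is contacted
# by mail in month M; mail nonrespondents are followed up by telephone
# in month M + 1; and a subsample of remaining nonrespondents is
# visited in person in month M + 2.

def contact_mode(entry_month: int, current_month: int) -> str:
    """Return the contact mode attempted in `current_month` for a
    housing unit that entered the sample in `entry_month`."""
    months_in_sample = current_month - entry_month
    if months_in_sample == 0:
        return "mail"
    elif months_in_sample == 1:
        return "telephone followup of mail nonrespondents"
    elif months_in_sample == 2:
        return "personal-visit followup of a subsample of nonrespondents"
    else:
        return "no further contact"

# In any given month M, three overlapping panels are active at once,
# which is why all three modes are in the field simultaneously:
for months_ago in (0, 1, 2):
    print(f"panel that entered {months_ago} month(s) ago:",
          contact_mode(0, months_ago))
```

The point of the sketch is simply that the modes overlap: in any single calendar month, the Bureau is simultaneously mailing to the newest panel, phoning the previous month's panel, and visiting a subsample of the panel before that.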
An important nuance of the ACS, as currently implemented, is that response to the survey is mandatory under law—inheriting this attribute from the ACS’s role as replacement for the former (mandatory) long-form sample and as a supplement or complement to the decennial count.4 Nonrespondents are subject to a fine,5 and messages emphasizing the mandatory nature of the survey are prominently displayed on ACS mailing materials and noted on the questionnaires. More will be said of this requirement—and proposed changes to it—below in Section 1–B.
The data from the ACS’s monthly samples are cumulated and averaged over 1 or more years—a single year’s data can provide information for large population subgroups, while several years of data are needed to yield reliable estimates for small groups. Specifically, the estimates currently produced by the ACS use three time windows:
- 1-year period estimates, for geographic and demographic groups of population 65,000 or greater;
- 3-year period estimates, for groups with populations between 20,000 and 65,000; and
- 5-year period estimates, for groups with populations less than 20,000—a category including the vast majority of cities and counties in the United States, as well as smaller geographies such as census tracts.
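The window rules above can be sketched as a simple threshold function. The snippet is purely illustrative (the Bureau’s actual estimation applies survey weights and population controls, not naive averages; the function names here are invented for exposition):

```python
# Illustrative sketch only: select the ACS period-estimate window for a
# geographic or demographic group from its population size, using the
# thresholds listed above. Not official Census Bureau code.

def period_window(population: int) -> int:
    """Return the number of years of pooled ACS data (1, 3, or 5)
    published for a group of the given population size."""
    if population >= 65_000:
        return 1
    elif population >= 20_000:
        return 3
    else:
        return 5

# A period estimate pools data across the whole window -- 12 monthly
# samples per year of accumulation. A naive (unweighted) average stands
# in here for the Bureau's weighted estimation.
def naive_period_estimate(monthly_estimates: list, years: int) -> float:
    months = 12 * years
    window = monthly_estimates[-months:]   # most recent `months` values
    return sum(window) / len(window)

print(period_window(100_000))  # large county -> 1-year estimates
print(period_window(30_000))   # mid-size place -> 3-year estimates
print(period_window(5_000))    # small town or census tract -> 5-year estimates
```

The design choice this illustrates is the trade-off noted in the text: smaller areas gain usable precision only by pooling more months of data, at the cost of timeliness.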
Being period estimates, and pooling several months of data, ACS estimates differ from the estimates produced by the former long-form sample, which could be interpreted as point estimates for the decennial census reference date (April 1). In addition to the conduct of the decennial census itself, the year 2010 was
413 USC § 193 states that: “In advance of, in conjunction with, or after the taking of each census provided for by this chapter, the Secretary [of Commerce—and by extension the Census Bureau—] may make surveys and collect such preliminary and supplementary statistics related to the main topic of the census as are necessary to the initiation, taking, or completion thereof.”
5As with the decennial census, the defined penalty for “refus[ing] or willfully neglect[ing] … to answer, to the best of his knowledge, any of the questions on any schedule submitted to him in connection with any census or survey provided for” elsewhere in the law (including § 193, discussed in the previous footnote) is a fine of not more than $100 (13 USC § 221(a)). The penalty for “willfully giv[ing] any answer that is false” is a fine of not more than $500 (13 USC § 221(b)). However, under the terms of the Sentencing Reform Act of 1984, a person guilty of either of these infractions could be subject to a fine of “not more than $5,000” (18 USC § 3571). Prior to enactment of P.L. 94-521 in 1976, violators were also subject to imprisonment: up to 60 days for nonresponse and up to 1 year for false answers. That said, the Census Bureau’s longstanding practice has been to emphasize the reasons for responding to the ACS rather than the possible legal penalties, and it has refrained from charging ACS nonrespondents.
pivotal for the Census Bureau in that it was the first year in which the full suite of ACS products was available for the entire nation—including many localities’ first real glimpse at ACS data, through the 5-year 2005–2009 estimates. The Census Bureau releases ACS estimates primarily through its American FactFinder website at http://factfinder.census.gov.
The unique challenges and opportunities presented by the ACS data were explored by a previous National Research Council (2007b) panel. However, because that panel completed its work before full nationwide ACS data were available, its work was necessarily abstract and hypothetical. So too was the Census Bureau’s own work to prepare data users, including the production of a series of “Compass” handbooks for different types of data users.6 Only with the release of actual data, and following the reaction to the data and to their accompanying and prominently featured margins of error, did the ACS become real and concrete to its potential users. Now that the ACS is in its second decade of large-scale implementation, the release of the first sets of full data marks an opportune time to assess the ACS, its uses, and its demands.
The U.S. Census Bureau requested that the Committee on National Statistics (CNSTAT) convene a:
workshop on the benefits to a broad array of non-federal users of the data products from the U.S. Census Bureau’s American Community Survey (ACS). The workshop will also address the burden on the American public of responding to the ACS questions. A workshop on the benefits of the ACS is timely because the survey has just completed its first full cycle of releasing 1-, 3-, and 5-year period estimates (for 2009, 2007–2009, and 2005–2009, respectively), and there is need to take stock of user experience with the data and to identify priority uses for the future. The workshop is part of a larger review the Census Bureau is conducting to assess the overall mission, vision, goals, and objectives of the ACS. The Bureau’s review also includes examining the data products along with the methods for conducting and managing the survey.
Pursuant to this charge, CNSTAT convened a steering committee for the workshop, which met in March 2012 to discuss the task and craft the structure of the workshop.
The steering committee made two major decisions that governed the final shape of the workshop. The first was to make the notion of ACS “burden” an integral part of the proceedings and to extend it beyond the single issue of time
6These “Compass” guides included, for instance, volumes targeted at state and local government users (U.S. Census Bureau, 2009), high school teachers (U.S. Census Bureau, 2008b), members of Congress and congressional staff (U.S. Census Bureau, 2008a), and business users (U.S. Census Bureau, 2008c).
spent by respondents in completing the survey. The workshop’s charge emphasized the benefits of the ACS, and its title retained the wording used at the project’s inception—literally expressing “burdens” as a parenthetical. But the steering committee sought to ensure that the challenges presented by the ACS were given an open and prominent airing, and decided to seek presentations and discussions concerning more general types of “burden” in the ACS. For instance, from the perspective of respondents, the ACS and its detailed questions raise privacy and confidentiality concerns that might serve as an impediment to response altogether. For data users, the ACS presents a significant communication burden: explaining “new” concepts (the interpretation of estimates based on multiple years of data) and old ones (margins of error) to potentially skeptical audiences. A final example of a broader burden created by the ACS follows directly from its design: large, populous geographic units have access to a wealth of ACS estimates of 1-, 3-, and 5-year vintages, while rural areas and smaller population groups face a relative scarcity of ACS estimates, necessarily waiting for 5-year accumulations that may still have very high standard errors. These types of “burden” considerations shaped the steering committee’s selection of presenters within sessions; to give the issues a dedicated airing, the committee also sought to carve out a specific section of the program for a structured discussion of various types of “burden,” recognizing that a single presentation or paper was unlikely to be sufficient.
The second was to cast as wide a net as possible to collect perspectives on ACS uses, and to make the workshop’s agenda book (background briefing materials) a virtual poster session in order to spotlight more voices than the limited number of speaking slots could accommodate. The steering committee drafted a short “feeler” notice, asking ACS users to write back with a short description of how (and how often) they use ACS data products, and for what purposes. This feeler notice was distributed through a variety of channels, email lists, and contact networks, and yielded dozens of expressions of interest. The steering committee filled most of the slots in its working agenda using these submissions. Later, this group of feeler-notice respondents was contacted again and asked to contribute short written thoughts for the meeting agenda book—whether a particularly interesting “case study” of ACS data usage or an expanded “user profile” describing how individual users work with the ACS (and work around any of its shortcomings). In all, this background briefing book included about 30 submissions.7
7The specific page link to the workshop materials is http://sites.nationalacademies.org/DBASSE/CNSTAT/ACS_Benefits_Burdens.
The workshop itself was held on the afternoon of June 14 and the full day of June 15, 2012, at the National Academies’ Keck Center in Washington, DC. The workshop drew roughly 80 attendees across both days, with the first afternoon’s proceedings drawing a particularly strong crowd. The final agenda for the workshop and a list of the contributors to the accompanying case study agenda book are reprinted in Appendix A.
While the workshop agenda was under development—and independent of the workshop and its charge—the ACS drew sudden legislative attention in spring 2012. Ultimately, this attention would set a most unusual context for the workshop: This workshop on the uses of the ACS took place slightly more than one month after one chamber of Congress voted to effectively terminate the survey.
Multiple bills filed in the 111th Congress sought to eliminate or weaken the requirement that response to the ACS be mandatory under law. H.R. 3131 (introduced by Rep. Ted Poe of Texas) would remove any penalty “for refusing or willfully neglecting to answer questions” on the ACS other than respondent name, contact information, “the number of people living or staying at the same address,” and the response date. H.R. 5046 (introduced by Rep. Todd Akin of Missouri) would take a stronger line, making the only mandatory question on the ACS (and the decennial census) “the number of individuals living in such individual’s residence.”8 Though the bills attracted some cosponsors (35 for the Poe bill, 8 for the Akin bill), neither advanced beyond referral to subcommittee during the 111th Congress.
In the 112th Congress, Rep. Poe reintroduced his legislation in March 2011 (now numbered H.R. 931), and it was referred to two subcommittees of the House Judiciary Committee as well as to the Oversight and Government Reform Committee (which has primary jurisdiction over the Census Bureau). After a year without action, the House Oversight subcommittee with census jurisdiction9 held a hearing on the topic of a voluntary (rather than mandatory) ACS on March 6, 2012. In addition to Rep. Poe, Census Bureau Director Robert Groves and three ACS data users from the business and economic development
8Filed on April 15, 2010, the Akin bill—even under the speediest of circumstances—could not have applied to the 2010 census, which was then well under way. It would have required placing “on the front of each [questionnaire or survey] in a conspicuous manner” an advisory statement of the form: “Constitutionally, in responding to this survey, you are only required to provide the number of individuals living in your residence. Answers to all other questions contained within this survey are optional.”
9Formally, the Subcommittee on Health Care, District of Columbia, Census and the National Archives.
sectors were invited to testify. Because the hearing concerned the subject matter rather than the bill itself, no further action was taken on the bill in the immediate wake of the hearing.
Several weeks later, on May 9, two amendments were offered during House floor consideration of H.R. 5326, the fiscal year (FY) 2013 Commerce, Justice, Science, and Related Agencies appropriations bill including funding for the Census Bureau. Both amendments took the form of policy “riders”—not explicitly striking or replacing a specific funding number, but simply prohibiting the use of any funds in the bill for certain purposes. The first amendment (introduced by Rep. Poe) would bar the use of funds in the bill to enforce penalties for nonresponse to the ACS, effectively making the survey voluntary rather than mandatory.10 The second, offered by Rep. Daniel Webster of Florida, would prohibit use of the funds to “conduct the survey” altogether. Under the rules governing debate on the bill, consideration of each amendment was limited to five minutes on each side; the central arguments made in favor of the amendments were concern for the intrusive nature of some of the ACS questions and the argument that the detail of the ACS queries exceeds the constitutional mandate for the decennial census. After those short debates, the Poe amendment was judged to have passed by voice vote and the Webster amendment was ultimately passed by a 232–190 vote.11 The next day, the House wrapped up consideration of the whole bill and passed it, as amended.
In the wake of the House vote, a version of the standalone Poe bill (to make the ACS voluntary) was introduced in the Senate by Sen. Rand Paul of Kentucky. Neither the Paul bill nor the Senate’s Commerce, Justice, and Science appropriations bill for FY 2013 has progressed to further consideration at this writing in summer 2012; the prospects for completing work on the FY 2013 bills could reasonably be described as uncertain heading into a presidential election and the closing sessions of the 112th Congress.12
Introducing the workshop on behalf of the steering committee, CNSTAT Director Constance Citro noted that these recent legislative developments were an “elephant in the room” for the workshop. She noted that the workshop was
10Both the Poe amendment and the full bill, H.R. 931, speak only to the penalty defined in 13 USC § 221(a)—for “willfully refus[ing] or neglect[ing]” to provide a response. They let stand the penalty applying under 13 USC § 221(b), which penalizes giving willfully false answers.
11There was no request for a recorded vote after the chair ruled that the yeas had prevailed on the voice vote on the Poe amendment, and—initially—the Webster amendment appeared to carry by voice vote as well. Rather than immediately call for a recorded vote on the Webster amendment, floor managers instead asked for special dispensation to let a late-arriving speaker comment on the amendment—after which the chair moved on to other amendments. Shortly thereafter, opponents of the Webster amendment prevailed on the bill managers to vacate the voice vote and instead hold a recorded vote, which produced the 232–190 result.
12The Senate appropriations subcommittee’s mark for the Census Bureau gave the Bureau its full request for FY 2013, including some funds for both the 2010 and 2020 decennial censuses and the 2012 economic census that were trimmed in the House version. As of early December 2012, no further congressional action had been taken on the relevant FY 2013 appropriations bills.
not developed, and should not be interpreted, as a response to the legislation or as advocacy, one way or the other, regarding the ACS. Rather, the workshop was intended to document the ways in which ACS estimates are being used and to construct a portrait of the nonfederal ACS user base. She noted that the steering committee put no limits on what the workshop presenters (or case study contributors) could say, and she encouraged the workshop participants to approach the session with the same candor.
The legislative developments that would either kill or impair the ACS—unfolding so soon before a workshop on the ACS—did not materially affect the content or the structure of the workshop, but they undeniably shaped its context and climate. The continued existence of the survey became a high-level “benefit” of the ACS that presenters wanted to address in their presentations. Likewise, the discussion sessions following topic blocks at the workshop that might—in different times—have involved more probing of the challenges or burdens of ACS data in specific applications instead reflected the underlying concerns: What would you do if the ACS had a smaller sample size? What, if any, other data sources could you use if the ACS were to go away? And could individual state agencies or private businesses generate similar data to meet basic needs? A full accounting of the workshop should confirm that it did not become an advocacy platform in any direction—both the benefits of and the deficiencies in the data were given ample time—but the proceedings also reflected the underlying tension over the basic prospects for the ACS in 2013 and beyond.
This report has been prepared as a factual summary of what occurred at the Workshop on the Benefits (and Burdens) of the American Community Survey. The workshop steering committee’s role was limited to planning and convening the workshop. Accordingly, the views contained in this report are those of individual workshop participants and do not necessarily represent the views of all workshop participants, the steering committee, or the National Research Council.
This workshop summary largely follows the topic blocks that were used in scheduling the workshop, though some rearrangement has been made where that seemed logical. The most prominent such rearrangement concerns the chapter describing the workshop’s dedicated panel discussion on various aspects of burden associated with the ACS. The workshop steering committee deliberately scheduled that session for the rough midpoint of the workshop—the morning of the second day—so that it might draw from themes raised in some of the user presentations and infuse discussions of others. However, for purposes of this summary, it makes logical sense to defer the summary of that session to the end, in Chapter 8, so as not to disrupt the flow of the previous chapters.
Chapter 2 summarizes presentations from the kickoff session of the workshop, focused on the applications of the ACS in planning for health care and transportation services. Based primarily on the experiences of nonprofit organizations and their use of the ACS, Chapter 3 describes the workshop block on ACS uses in planning for social services and preparing for (and responding to) natural and manmade disasters. To close the first afternoon of the workshop, the steering committee sought perspectives from three users in the print and online media; that discussion is in Chapter 4. The core of the second day of the workshop was oriented more around different “sectors” of ACS data users than specific topics; the experiences of state, local, and tribal government users and of business, economic development, and “data aggregator” users are discussed in Chapters 5 and 6, respectively. Chapter 7 turns to one important use of the ACS that is explicitly spelled out in federal law and regulation—to ensure compliance by state and local governments with the terms of the Voting Rights Act—as well as to related legal, political, and social equity uses of the ACS. The previously mentioned panel discussion on various aspects of burden associated with the ACS and the capstone discussion of the workshop are summarized in Chapter 8.