The Census Bureau's stated goals for ACS are to

• Provide federal, state, and local governments with an information base for the administration and evaluation of government programs;
• Facilitate improvement of the 2010 Census by allowing the decennial census to focus on counting the population; and
• Provide data users with timely demographic, housing, social, and economic statistics updated every year that can be compared across states, communities, and population groups.

The Census Bureau began developing the ACS in the mid-1990s. In the first few years, while the program was just beginning, preliminary ACS data were collected for a few test sites. In 1999, the number of test sites was increased to 31 locations, comprising 36 counties and representing a broad range of communities that were selected to provide different combinations of population sizes, population characteristics, population growth levels, and difficulty of enumeration. Table 2.1 lists the ACS test sites and the annual sampling rates that were used for each. The data collection effort for the 31 test sites has been performed annually since 1999, and among the most important outputs of the ACS testing phase has been the compilation of three complete years of data for 1999, 2000, and 2001. In addition to the test site program, the Census Bureau performed a large-scale (1,203 counties) operational test of ACS methods in the year 2000, entitled the Census 2000 Supplementary Survey (C2SS). These data allow for comparison with Census 2000 decennial data along several dimensions. The Census Bureau and other researchers have performed a wide range of analyses related to the ACS test data. Many of these efforts are summarized below. In addition, this section describes the key elements of the plan for implementing the full-scale ACS, and how these elements of the ACS program may affect ACS data analyses, particularly those analyses related to transportation planning applications.
CHAPTER 2
American Community Survey

2.1 ACS Implementation

The Census Bureau first described its plans for fully implementing the ACS in the American Community Survey Operations Plan and in associated website documents.1 These plans have evolved as a result of ACS testing and the federal appropriations process. The schedule for the transition to full implementation has slipped because of limitations and uncertainties in the appropriations process, but the operational components of the program appear to be established and are documented in an ACS technical paper on design and methodology.2 This section describes ACS operations and the implementation schedule based on the descriptions provided by the Census Bureau.

1 U.S. Census Bureau, American Community Survey Operations Plan: Release 1 (March 2003). www.census.gov/acs/www/index.html.

2.1.1 Operational Components of ACS3

Once fully implemented, the ACS will sample about 3 million addresses from the Master Address File (MAF) each year, plus about 2.5 percent of the group quarters population. This annual sample will be systematically divided into 12 monthly samples for interviewing, and the sampled units will then be contacted to provide data. The collected data will then be processed, refined, and made available to data users on an annual basis. Depending on the size of the geographic area under study and the analyses being performed, data users may need to combine multiple years of ACS data to analyze specific geographic areas: the more detailed the geography, the greater the number of years that will need to be combined.

To accomplish the ongoing implementation of ACS, the Census Bureau will need to continuously perform the following functions:

• Address list development and updating to provide the sample universe;
• Implementation of sample selection protocols to obtain a sample each month;
• Implementation of the following data collection:
  – Mail out/mail back data collection phase;
  – Computer-assisted telephone interviewing (CATI) data collection phase; and
  – Computer-assisted personal interviewing (CAPI) data collection phase.
• Implementation of data entry and telephone follow-up procedures for mail returns;

Table 2.1. American Community Survey test sites.

County                              1999-2001 Annual Sampling Rate
Pima County, Arizona                5%
Jefferson County, Arkansas          5%
Tulare County, California           5%
Upson County, Georgia               5%
Miami County, Indiana               5%
Black Hawk County, Iowa             5%
DeSoto Parish, Louisiana            5%
Calvert County, Maryland            5%
Hampden County, Massachusetts       5%
Madison County, Mississippi         5%
Iron County, Missouri               5%
Reynolds County, Missouri           5%
Washington County, Missouri         5%
Flathead County, Montana            5%
Lake County, Montana                5%
Douglas County, Nebraska            5%
Otero County, New Mexico            5%
Rockland County, New York           5%
Multnomah County, Oregon            5%
Fulton County, Pennsylvania         5%
Schuylkill County, Pennsylvania     5%
Sevier County, Tennessee            5%
Starr County, Texas                 5%
Zapata County, Texas                5%
Petersburg City, Virginia           5%
Yakima County, Washington           5%
Ohio County, West Virginia          5%
Oneida County, Wisconsin            5%
Vilas County, Wisconsin             5%
San Francisco County, California    3%
Broward County, Florida             3%
Lake County, Illinois               3%
Bronx Borough, New York             3%
Franklin County, Ohio               3%
Fort Bend County, Texas             1%
Harris County, Texas                1%

2 U.S. Census Bureau, Design and Methodology: American Community Survey, Technical Paper 67 (May 2006). U.S. Government Printing Office, Washington, D.C.
3 This entire section relies heavily upon the ACS Design and Methodology and Operations Plan documents.
• Data processing as follows:
  – Coding, editing, and imputation procedures; and
  – Weighting, disclosure editing, and tabulation; and
• Data product dissemination.

These elements of the ACS process are discussed below.

Address List Development and Update

The Census Bureau maintains the MAF, a national address sampling frame for the decennial census and other census data collection activities. Maintaining the quality of this database will be an essential element of successful implementation of the ACS program. Therefore, the Census Bureau is actively engaged in efforts to improve the database and to maintain it into the future. The MAF was developed for Census 2000 using the previous decennial census address list, the U.S. Postal Service's Delivery Sequence File, and address data supplied by local governments. The MAF is linked with the Census Bureau's Topologically Integrated Geographic Encoding and Referencing (TIGER) database.

The TIGER system and the MAF currently are being updated by the Census Bureau in preparation for the 2010 decennial census.4 One update process is called the MAF/TIGER Accuracy Improvement Project (MTAIP). The project, expected to be complete by 2008, will improve the positional accuracy of street centerlines in the TIGER database. The update process is using existing data sources whenever possible, including

• State/local/county/tribal GIS files;
• Commercial GIS files; and
• Existing imagery.

If existing data are not available, new sources, such as imagery and field collection, are used. Although the project will result in spatially more accurate TIGER/Line files, the TIGER/Line identifiers will not change. Attribute data will be conflated to the new geometry. For new segments, if city-style addresses are present in the file, they will be transferred to TIGER. The MTAIP process is focused on Census 2010 and is expected to be useful to ACS after 2008.
A pilot study of acquiring coordinates for residential structures also is being conducted, in which attributes from state/local/tribal/county GIS files, including feature names, address ranges, and address lists as appropriate, are collected.

The MAF is kept up to date by use of the U.S. Postal Service's Delivery Sequence File for both residential and non-residential addresses. The update takes place twice a year for those blocks that contain only city-style residential addresses. In addition, ACS field representatives note any address corrections found in visiting housing units during the personal visit non-response follow-up data collection phase. The Census Bureau also performs systematic listing and mapping of selected areas to support several of its data collection efforts. Finally, to address quality concerns relating to areas with high concentrations of non-city-style addresses, the Census Bureau has initiated a program called the Community Address Update System (CAUS).

A Guidebook for Using American Community Survey Data for Transportation Planning

4 Robert Lamacchia, U.S. Census Bureau, "TIGER/MAF Update Process." Presentation to U.S. DOT on July 9, 2004, as part of the ACS FAIP Program.

Sample Selection Protocols

According to the Design and Methodology document, when ACS is fully implemented, each year the Census Bureau will select a systematic sample of addresses (3 million addresses per year, or 250,000 addresses per month) from the most current MAF. Initially, this sampling rate will be equivalent to 2.5 percent of households each year, but the rate will decrease over time as the nation's population increases. In addition, about 2.5 percent of the
people in group quarters facilities will be included in the ACS. The sample will be selected from each county in the United States. No address will receive the ACS questionnaire more than once in any five-year period. To improve the reliability of estimates for small governmental units (such as small counties or American Indian reservations) with fewer than 1,200 addresses, some areas will be oversampled, similar to what was done for the Census 2000 Long Form design. For 2005, the actual sampling rates are expected to range from 1.6 percent to about 10 percent each year. In the future, the Census Bureau also will consider additional oversampling of certain counties to try to improve the reliability of estimates for geographically dispersed small minority population groups (such as Native Hawaiians and other Pacific Islanders, Asians, or American Indians and Alaska Natives) living in urban areas, but these changes would not be made until the current oversampling scheme for mail survey response is fully analyzed.

2.1.2 Questionnaires

The current ACS questionnaire (ACS 2003) is the result of several iterations of questionnaire implementation and revision. After the ACS demonstration period testing (1996-1998) and prior to the comparison period testing (1999-2001), the ACS questionnaire was modified. The questionnaire was modified again after the comparison period, and again for the 2003 ACS. The same questionnaire is being used for the 2003-2007 ACS efforts. In preparation for the 2008-2012 period, the Census Bureau has been conducting the 2006 ACS Content Test to evaluate potential reworded and reformatted questions and to try new questions related to marital history, health insurance coverage, and veterans' service-related disabilities. Potential rewording of questions related to work status would have the greatest impact on transportation planners.
It is expected that the results of the 2006 test will lead to the 2008 questionnaire, and that this questionnaire will remain the same through 2012.

Appendix A summarizes differences between the current ACS questionnaire and the decennial census Long Form (1990 and 2000) for housing and population questions. The ACS and Census 2000 questionnaires have roughly the same questions (in different question order). The differences in the data collection protocol, however, lead to a few key differences in the population questions, as described below.

Residence Rules

The ACS uses different residence rules than have been used in past decennial censuses. Decennial censuses and most surveys use the usual residence concept, which requires that respondents have only one place as their usual residence, most often the place where they spend the most time. The usual residence rule does not count people who are staying somewhere other than their usual residence as occupants of that place. For example, people who spend their winters in Florida and the rest of the year in Vermont, so-called "snowbirds," have in the past been enumerated in the census as residents of Vermont, not Florida.

The ACS, in contrast, uses the current residence concept and the Two-Month Rule. Under the Two-Month Rule, anyone who is living for more than two months in a survey unit when the unit is contacted (whether by mail, telephone, or personal visit) is considered to be a current resident of that unit. Persons who are away from a residence for two months or less, regardless of their temporary location or the purpose of their travel, are considered to be "in residence" at that residence. If a residence does not have any occupants for more than two months from when it is sampled for the ACS, it is classified as a vacant housing unit.
If a residence is occupied only by individuals who stay there for two months or less, and who have another permanent address, the residence
is classified as a temporarily occupied housing unit. Only limited housing unit data are collected for vacant and temporarily occupied housing units (no household or person data). The ACS Two-Month Rule has the following exceptions:

• Children (kindergarten through Grade 12) away at boarding schools are considered residents of their parental home. (College students' current residency is based on the Two-Month Rule.)
• Children living in joint custody who frequently move between separate residences are considered to be residents of the sampled residence if they are present at that residence when contact is initially made.
• Commuter workers who stay in a residence close to where they work and return regularly to their primary residence are considered to be residents of their primary residence, not the work-related one.

The current residence concept suits the ACS because the ACS continuously collects information from monthly samples throughout the year. The concept recognizes that people can live in more than one place over the course of a year, and that population traits for some areas may be noticeably affected by these shifts. Although ACS will not capture seasonal changes in the population (because ACS estimates are tied to Census Bureau annual estimates for July 1), ACS can capture the characteristics of the population for the full year.

Reference Date

An important difference between ACS data and previous decennial census data, brought about by the continuous nature of ACS data collection, is the reference period of the survey. In the decennial census, the questions are referenced to the beginning of April of the census year, and questions that require retrospective information are tied to the calendar year. For example, in Census 2000 respondents were asked the location of their places of work for the week before the April 1, 2000, census date and their household incomes for the 1999 calendar year.
For the ACS, the questions are referenced to the time the survey is conducted. Respondents are asked at what location they worked last week. The ACS household income reference period is the 12 months ending in the month prior to the survey. The ACS's variable reference dates will capture seasonal differences the decennial census could not capture, but it is important that analysts consider the changed reference date definitions before using ACS data, particularly in comparison with previous census estimates.

Other Questionnaire Differences

Although data elements between the ACS and decennial census Long Form are consistent, wording differences (for both the query and the answer categories) do exist. Such wording changes, however, have been common in the evolution of the Long Form because the Census Bureau has a program of continuous improvement for questionnaire items. As in the past, analysts will need to be cautious when trending data elements for which there have been wording changes. Survey methods research indicates that the wording of questions affects the corresponding answers. For example, in an instruction for the housing questions, the ACS directs "Please answer the following questions about the house, apartment, or mobile home at the address on the mailing label." The Long Form instructed, "Now, please answer [the housing] questions about your household." While the questions seek essentially the same information, some respondents could have interpreted them differently, and thus differences in the population could be identified where none really should exist. There also are small differences in the ways that the ACS and decennial census efforts collect data from respondents in larger households.

2.1.3 Data Collection Procedures

The ACS data collection occurs in continuous, three-month cycles using a combination of mail out/mail back, CATI, and CAPI data collection modes.
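As a rough illustration of how the three modes fit together, the toy sketch below resolves a monthly panel through the mail, CATI, and CAPI phases in sequence, subsampling the CAPI pool rather than visiting every remaining address. This is not Census Bureau software: the function name, the address identifiers, and the flat 1-in-3 CAPI rate are invented simplifications (the actual rates vary by address category, as discussed later in this section).

```python
import random

# Toy sketch (not Census Bureau software): resolve one monthly sample
# panel through the mail, CATI, and CAPI phases in sequence. Addresses,
# response sets, and the flat 1-in-3 CAPI rate are illustrative only.
def resolve_panel(addresses, mail_respondents, cati_respondents,
                  capi_rate=1 / 3, seed=1):
    rng = random.Random(seed)
    disposition = {}
    for addr in addresses:
        if addr in mail_respondents:        # month 1: mail out/mail back
            disposition[addr] = "mail"
        elif addr in cati_respondents:      # month 2: CATI follow-up
            disposition[addr] = "CATI"
        elif rng.random() < capi_rate:      # month 3: CAPI subsample only
            disposition[addr] = "CAPI visit"
        else:                               # never followed up
            disposition[addr] = "no follow-up"
    return disposition

panel = [f"addr{i}" for i in range(10)]
result = resolve_panel(panel, {"addr0", "addr1"}, {"addr2"})
print(result["addr0"], result["addr2"])  # mail CATI
```

The key point the sketch captures is that non-respondents are not all pursued in person; only a random subsample ever reaches the CAPI phase.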
The data collection protocols were
established based on the Census Bureau's experience with the decennial census and its demographic surveys. Figure 2.1 shows the workflow for the ACS data collection effort.

The data collection process begins with the mail phase. Sampled addresses are evaluated to determine whether they are accurate and complete. Thus far, over 95 percent of the sample universe has been eligible for the ACS mail out. Sampled units with non-mailable addresses are assigned to the CAPI follow-up. If a sampled unit has a valid address, the Census Bureau mails a prenotification letter, then the initial mailing package (which includes the ACS questionnaire, an instruction booklet, and related materials), and then a reminder card. If no response is received from an address after three weeks, a replacement mailing package is sent.

[Figure 2.1. Data collection process for ACS monthly sample panels.]

Currently, only English
language materials are available for U.S. states, but a Spanish language version is used in Puerto Rico, and future plans call for the general availability of a Spanish language package. Alternative language forms will be available upon request. Mail survey respondents are provided with a toll-free telephone number that they may use if they have questions or if they prefer to provide responses by phone. Assistance is provided in English and Spanish.

About six weeks after the first questionnaire is mailed, telephone data collection begins. The Census Bureau contracts with commercial vendors to obtain available telephone numbers for the identified addresses. Households that have not responded, but for which telephone numbers have been obtained, are contacted by telephone interviewers. Using the Census CATI system, interviewers from three call centers conduct the ACS over the phone (using the same data collection instrument as for the mail survey). The CATI operation uses quality assurance and training procedures comparable to those of the best commercial calling facilities. If a respondent refuses to participate in the CATI survey, a refusal conversion specialist calls again and makes a second attempt to complete the interview. The CATI surveys are performed in English and Spanish.

At the conclusion of the CATI operation (which lasts about four weeks for each sample panel), the Census Bureau selects a subsample of remaining uninterviewed addresses for CAPI. The CAPI subsample contains addresses categorized by their geography and whether or not they have mailable addresses. The different address categories are sampled at different rates, as discussed below. Over a four-week period, Census Bureau field representatives visit CAPI subsample addresses and, at each one, verify the existence of the address, determine its occupancy status, and conduct interviews if possible.
The field representatives collect the data using laptop computers with English and Spanish translations. The ACS interviewers are more experienced than decennial census interviewers because they are continuously employed by one of the Bureau's 12 regional offices. All interviewers are supervised by senior interviewers with three or more years of experience, and emphasis is given to recruiting bilingual staff to improve data collection from non-English-speaking households. Unlike in the decennial census, proxy interviews from a non-sample housing unit resident are not permitted in the ACS. Proxy interviews within sample housing units are permitted.

[Figure 2.2. Example data collection schedule for ACS monthly sample panels: mail, phone, and visit phases for the January through June 2009 sample panels, each panel spanning three calendar months. Source: David Hubble, Census Bureau presentation at Irvine.]

The ACS schedule means that each monthly sample panel is collected over a three-month period. As shown in Figure 2.2, the collection of data from the monthly sample panels overlaps
so that each step of the survey methodology proceeds in every month, which means that data collection staff can work continuously on their specialty tasks.

For example, in February 2009, the Census Bureau's mail phase team will concentrate on the portion of the ACS annual sample that has been assigned to January. Then, in March 2009, they will focus on the sample assigned to February, while the phone (CATI) team works on the January portion of the sample. In April, the mail team will turn their attention to the March sample; the phone team will work with the February sample; and the field representatives will work with a subsample of the January sample. This process continues indefinitely.

It is important to note that only a portion of the sample households that have not participated in the mail or telephone phases are included in the CAPI subsample. The sampling plan is designed so that desired sample sizes are achieved without having to complete the field interviews with all the households that remain after the mail and phone phases of the effort. The CAPI subsampling rates were initially established to be 1-in-3 of the mailable addresses that had not completed the mail or telephone phases and 2-in-3 of the unmailable addresses. Based on the initial ACS experience, the Census Bureau now applies the subsampling rates shown in Table 2.2.

The actual disposition of households in the 2001 ACS was approximately as shown in Figure 2.3. More than a quarter (28.4 percent in the figure) of the households in the original ACS sample did not respond to the mail phase of the data collection or the telephone (CATI) phase, and then were never contacted as part of the personal visit phase. Since the data collection effort is not completed for this group, the Census Bureau uses a weighted response rate that effectively discounts this group in the response rate calculation.
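The effect of this convention can be illustrated with the 2001 disposition shares reported in Figure 2.3. The short calculation below restates the arithmetic as code; note that dropping the never-contacted group from the denominator is a simplified reading of the procedure described above, not the Bureau's published formula.

```python
# Illustration of the weighted-response-rate convention, using the 2001
# disposition shares from Figure 2.3 (percent of the original sample).
# The never-contacted 28.4 percent is simply dropped from the
# denominator; this is a simplified reading of the procedure.
completed = {"mail": 36.6, "phone": 6.6, "personal visit": 26.0}
non_response = 2.4
no_follow_up = 28.4                              # excluded from the calculation

base = sum(completed.values()) + non_response    # 71.6 percent of the sample
response_rate = sum(completed.values()) / base

print(round(100 * response_rate, 1))             # 96.6, i.e., "almost 97 percent"
print({m: round(100 * v / base, 1) for m, v in completed.items()})
```

The per-mode shares this produces closely reproduce Figure 2.4 (small differences are rounding), which is why the figure's mail share (51.1 percent) exceeds the mail share of the original sample (36.6 percent).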
Figure 2.4 shows the completion results by data collection mode for the 2001 ACS based on the weighted response rate calculation method. By this definition, the weighted response rate for the 2001 ACS was almost 97 percent. The rates for subsequent years have been similarly high. It is important to note that the Census Bureau response rates do not reflect the substantial proportion of households for which the data collection effort is not completed. Thus, the potential for non-response bias is higher than one would infer from the reported weighted response rates.

Table 2.2. CAPI subsampling rates for the 2005 ACS.

Address and Tract Characteristics                                      CAPI Subsampling Rate
Unmailable addresses and addresses in remote Alaska                    66.7%
Mailable addresses in tracts with predicted levels of completed
  interviews prior to CAPI subsampling between 0% and 35%              50%
Mailable addresses in tracts with predicted levels of completed
  interviews prior to CAPI subsampling between 35% and 50%             40%
Mailable addresses in other tracts                                     33.3%

Source: U.S. Census Bureau, Design and Methodology: American Community Survey, Technical Paper 67 (May 2006). U.S. Government Printing Office, Washington, D.C.
Note: The CAPI subsampling rate is the percentage of addresses with uncompleted interviews prior to the CAPI phase that are included in CAPI data collection.

Data Entry and Follow-Up

The returned mail surveys are sent to the Census Bureau's processing center, checked in, and reviewed by staff to determine whether they are minimally complete. If so, the returned survey is keyed and automatically reviewed for completeness and internal consistency. If problems are detected, the return is subjected to the Census Bureau's telephone edit follow-up procedures, in which respondents are contacted by phone to clarify
their mailed responses. Because the decennial census process schedule cannot accommodate this data quality review and verification, the final ACS returns are more complete and internally consistent than the census Long Form data.

[Figure 2.3. 2001 ACS disposition of sample: completed mail, 36.6%; completed phone, 6.6%; completed personal visit, 26.0%; non-response, 2.4%; nonfollow-up disposition unknown, 28.4%. Source: David Hubble, TRB Conference: Census Data for Transportation Planning, May 2005, Irvine, CA.]

[Figure 2.4. 2001 ACS completion results by data collection mode: mail, 51.1%; phone, 9.2%; personal visit, 36.4%; non-response, 3.3%. Source: David Hubble, TRB Conference: Census Data for Transportation Planning, May 2005, Irvine, CA.]

Coding

In the coding phase of the ACS data collection, questionnaire fields with write-in values are coded to a prescribed list of valid values. Manual coding methods are used to assign codes for industry and occupation, and automated coding programs are used to assign codes for the following:

• Place of birth,
• Migration,
• Ancestry,
• Language,
• Race,
• Hispanic origin, and
• Place of work.

The most significant coding effort is the geocoding of reported work locations. In the processing of Census 2000 data, the work location was geocoded in a two-phase operation using both the workplace address and employer name given by respondents on the Long Form questionnaires.6 The first phase was an automated (computer-match) operation. Records not resolved during this phase moved on to a computer-assisted clerical phase.

In January 2000, the U.S. DOT and Census Bureau cosponsored a program called Work-UP to improve the quality of the employer file used by the Census Bureau in the automated and clerical coding process. In this program, local agencies (MPOs and state DOTs) used customized GIS software to examine and update employer locations. This effort resulted in about 75 percent of the responses being geocoded properly during the first phase.

For records not coded in the first phase, data attributes underwent an allocation process. The allocation procedures used both trip data and job data to assign workplace locations using "standard allocation" and "extended allocation." Standard allocation used travel time, residence tract, means of transportation, and industry to code work locations to a state, county, and place geocode. In addition, many records were allocated down to the block group and traffic analysis zone (TAZ) level during the standard allocation. The extended allocation procedure developed for use in CTPP 2000 was targeted at assigning workplace tract and block codes to workers who could not be coded during the standard allocation process. Extended allocation was done in two stages. In the first stage, a set of potential destination areas was identified for each recipient, based on trip characteristics (such as mode and travel time) and residence location.
In the second stage, the recipient was matched to a fully geocoded donor who matched the recipient's industry and occupation characteristics and who worked in any one of the potential destination areas.

Preliminary negotiations are underway between the Census Bureau and U.S. DOT on developing a Work-UP for ACS. The extended allocation system currently is not being used for ACS (because of cost and the insufficient number of donor records). It is expected that once five years of ACS data are collected, the extended allocation process may be implemented. Currently, the rate of successfully geocoded origin-destination pairs in the ACS is about 75 percent of that achieved for Census 2000. Once the above improvements are made, a better match between ACS and Census 2000 would be expected. The coded data for each residence are recompiled, and a data file is produced for editing and weighting.

6 Ed Limoges, Sabre Systems Inc., "Allocation of Missing Place of Work Data in Decennial Censuses and CTPP 2000," CTPP 2000 Status Report, January 2004.

Editing and Imputation

The Census Bureau's edit and allocation rules are used to account for missing, incomplete, and contradictory responses. As for the Long Form data, the Census Bureau has established specific rules for supplying values for variables that are missing. In the ACS, the values are based on other responses provided by the respondent and on responses for similar households. The editing and imputation procedures allocate the housing and population variables according to a predetermined hierarchy, similar to that used for the Census 2000 Long Form.

The ACS editing process begins with the determination of whether collected information constitutes a usable interview. Responses that are deemed to be non-interviews are included in the later non-response weighting effort. For those responses that are deemed to be interviews, the
Census Bureau staff use automated procedures to identify inconsistent and missing answers that require imputation, the substitution of reasonable values for missing and incorrect data items. ACS imputation is accomplished through the use of "assignments," which are rule-based procedures that use established relationships between different data items to fill in or correct the missing or incorrect items, and through the use of "allocations," which are statistical procedures (nearest neighbor methods and hot-deck methods) that use other respondents' data to infer reasonable values for missing or incorrect items.

Table 2.3 shows the item imputation rates of several data items for the ACS and for the decennial census Long Form data collection. As the table shows, data item imputation rates are significantly lower for the ACS than for the decennial census, and in many cases they appear to be improving over time. These improvements are likely the result of the superior survey design and procedures of the ACS compared to the decennial census.

It should be noted that although individual transportation-related items show reasonable allocation rates, many of the household items, when combined with person items, show unusual results. This is probably a result of the Census Bureau processing the allocations of household items and person items separately, without any cross-referencing. Table 2.4 shows that a number of workers from zero-vehicle households were allocated to driving alone for their commute to work.
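To make the "allocation" mechanism concrete, the toy hot deck below fills a missing item from the most similar fully reported donor. The records, matching variables, and scoring rule are invented for illustration; the production ACS allocation hierarchy is far more elaborate. The example also hints at why cross-referencing matters: when vehicle availability is among the match keys, a zero-vehicle worker borrows a transit mode rather than drove alone.

```python
# Toy hot-deck allocation (illustrative only; not the ACS procedure):
# fill a missing item from the donor that agrees on the most match keys.
def hot_deck(recipient, donors, match_keys, item):
    def score(donor):
        return sum(donor[k] == recipient[k] for k in match_keys)
    best_donor = max(donors, key=score)        # ties: first donor wins
    filled = dict(recipient)
    filled[item] = best_donor[item]
    return filled

donors = [
    {"vehicles": 0, "industry": "retail", "mode": "bus"},
    {"vehicles": 2, "industry": "retail", "mode": "drove alone"},
]
worker = {"vehicles": 0, "industry": "retail", "mode": None}

# Matching on vehicles and industry selects the zero-vehicle donor.
print(hot_deck(worker, donors, ["vehicles", "industry"], "mode")["mode"])  # bus
```

A procedure that allocated the household item (vehicles) and the person item (mode) in separate passes, without sharing match keys, could easily produce the zero-vehicle drive-alone combinations discussed next.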
Table 2.3. Selected ACS imputation rates of interest (percent)(a).

Description                                        Census 2000  2003 ACS  2002 ACS  2001 ACS  2000 (C2SS)
Number of Vehicles Available                           6.2         1.0       1.1       1.3        1.6
Place of Birth                                        10.1         6.2       4.4       4.6        6.4
Citizenship                                            0.8         0.4       0.4       0.4        0.5
Previous Residence(b) Mobility Status                  6.9         2.2       2.5       2.6        4.0
Previous Residence Geography (One or More Parts)      11.0         5.9       6.0       7.3       14.9
Employment Status Recode                              10.9         3.4       3.5       3.8        6.0
Place of Work Geography (One or More Parts)           10.7         5.2       4.9       5.3        9.9
Means of Transportation to Work                        7.6         3.1       3.0       3.1        4.6
Private Vehicle Occupancy (Carpooling)                10.0         4.1       3.9       4.1        5.8
Time Leaving Home to Go to Work                       15.0         9.6       9.2       9.9       11.3
Travel Time to Work (Minutes)                         11.8         7.0       6.9       7.2        8.7

Source: Data are based on 2003, 2002, 2001, and C2SS data from the American Community Survey detailed tabulations and Summary File 3 from the Census 2000 detailed tabulations ("Imputation Rates for Items of Branch Interest: 2003, 2002, 2001, and 2000 ACS, and Census 2000," compiled by David Hubble, Census Bureau, for the Irvine, CA presentation).
Note: Data are limited to the household population and exclude the population living in institutions, college dormitories, and other group quarters in the ACS tabulations. However, the Census 2000 data include these persons.
(a) The base of the imputation rate is the population at risk for the characteristic. For example, the imputation rate for "travel time to work" is based on "workers 16 years and over who did not work at home."
(b) Previous residence is for a one-year interval in ACS and for a five-year interval in Census.

While 16 percent of the weighted respondents from households without vehicles said they "drove alone" to work, almost 60 percent of the workers who reported that they did not have a vehicle and who did not report a mode to work were assigned by Census Bureau procedures to
the drove-alone-to-work category. Where both data items were missing, more than half of the respondents who were allocated to households with zero vehicles also were assigned to the drove-alone-to-work category.

Weighting, Disclosure Editing, and Variance Estimation

The coded ACS data for a calendar year are weighted so that the combined sample units reflect the actual population as closely as possible. Weighting includes the following three adjustments:

1. Initial weights are developed to account for differences in sampling units' probabilities of selection;
2. Initial weights for interviewed households are adjusted to account for non-interviews by month and census tract; and
3. Weights are then adjusted to match independent housing unit and population control totals.

Among its many other activities, the Census Bureau develops annual estimates of population by race/ethnicity, age, and sex. These are based on the previous decennial census counts and a range of administrative records databases. These post-census estimates serve as the weighting targets for ACS and other census surveys.

Because of the way they are developed, the post-census estimates will almost certainly be much more accurate immediately following the decennial census. Therefore, by the end of each decade, the ACS estimates are somewhat less likely to reflect the actual population, and larger year-to-year differences are likely to be detected in the ACS data as the post-census estimates are updated with new decennial census count data. ACS users will need to understand that reported changes in the ACS data between decennial census years and the preceding years are likely to be affected by larger-than-normal changes in the underlying population estimates by race/ethnicity, age, and sex. The Census Bureau population estimates for previous years are revised when new decennial census count data become available, but the ACS estimates will not be revised.
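The three weighting adjustments can be sketched with a toy single-cell example. The real ACS procedure operates within months, census tracts, and detailed demographic cells; the function below only illustrates the arithmetic of the three stages.

```python
def weight_sample(units, prob_selection, control_total):
    """Toy version of the three ACS weighting stages for one cell.

    units: list of dicts, each with an 'interviewed' flag.
    prob_selection: each unit's probability of selection.
    control_total: independent population control for the cell.
    Returns the final weights of the interviewed units.
    """
    # Stage 1: base weight = inverse of the probability of selection.
    for u, p in zip(units, prob_selection):
        u["w"] = 1.0 / p

    # Stage 2: non-interview adjustment -- interviewed units absorb the
    # weight of the non-interviews in the same cell.
    total_w = sum(u["w"] for u in units)
    respondents = [u for u in units if u["interviewed"]]
    nr_factor = total_w / sum(u["w"] for u in respondents)
    for u in respondents:
        u["w"] *= nr_factor

    # Stage 3: ratio-adjust so the weights sum to the control total.
    ctrl_factor = control_total / sum(u["w"] for u in respondents)
    for u in respondents:
        u["w"] *= ctrl_factor
    return [u["w"] for u in respondents]

# Three sampled units (one non-interview), each selected with p = 0.1,
# controlled to an (illustrative) independent total of 33 households.
units = [{"interviewed": True}, {"interviewed": True}, {"interviewed": False}]
weights = weight_sample(units, [0.1, 0.1, 0.1], control_total=33.0)
print(weights, sum(weights))  # the final weights sum to the control total
```

In this example each base weight is 10; the non-interview adjustment inflates the two respondents to 15 each, and the control adjustment scales them to 16.5 each, so the weighted respondents reproduce the control total of 33.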
The ACS estimates for each year will be weighted based on the initial Census Bureau population estimates.

Before releasing any ACS data, however, the Census Bureau first edits the database to ensure that it complies with disclosure rules. The Census Bureau's Disclosure Review Board (DRB) governs the release of census data as described below:

Title 13 of the United States Code authorizes the Census Bureau to conduct censuses and surveys. Section 9 of the same Title requires that any information collected from the public under the authority of Title 13 be maintained as confidential. . . . The Census Bureau's internal Disclosure Review Board (DRB) sets the confidentiality rules for all data releases.7

Table 2.4. 2002 ACS allocation rates for workers in zero-vehicle households.

Zero Vehicles                     Total Workers                          Drove Alone to Work
in Household          Mode Not Allocated   Mode Allocated     Mode Not Allocated   Mode Allocated
Not Allocated              5,065,639           380,191         824,431 (16.3%)     226,424 (59.6%)
Allocated                     34,425           108,451          18,624 (54.1%)      55,458 (51.1%)

Source: U.S. Census Bureau American Community Survey 2002 PUMS data.

7 See www.census.gov/eos/www/sestats.html.
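The percentages in Table 2.4 are simple cell shares, each drove-alone count divided by the corresponding worker count. A quick arithmetic check of the figures quoted in the text (counts taken directly from the table; the cell labels are shorthand for the allocation status of the vehicle and mode items):

```python
# Counts from Table 2.4 (2002 ACS PUMS, workers in zero-vehicle households).
# Key: (vehicle item status, mode item status).
total = {("veh_ok", "mode_ok"): 5_065_639, ("veh_ok", "mode_alloc"): 380_191,
         ("veh_alloc", "mode_ok"): 34_425, ("veh_alloc", "mode_alloc"): 108_451}
drove = {("veh_ok", "mode_ok"): 824_431, ("veh_ok", "mode_alloc"): 226_424,
         ("veh_alloc", "mode_ok"): 18_624, ("veh_alloc", "mode_alloc"): 55_458}

# Drove-alone share of each cell, in percent.
shares = {k: round(100 * drove[k] / total[k], 1) for k in total}
print(shares)
```

The computed shares reproduce the table: 16.3 percent of fully reported zero-vehicle workers drove alone, versus 59.6 percent of those whose missing mode was allocated, which is the anomaly the text highlights.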
The effects on the published data of implementing rules to ensure compliance with this law are discussed in Section 4.3 of this guidebook.

2.1.4 Data Product Dissemination

The many ACS data products are made available to users via the Census Bureau's "American FactFinder" web page. American FactFinder provides access to data and products related to the Census Bureau's

• Decennial census,
• ACS,
• Economic census,
• Annual economic surveys, and
• Population estimates program.

The process of locating and obtaining ACS data from American FactFinder is described in the next section of this guidebook.

2.2 Additional Information Sources on ACS Implementation

The key source of ACS operations and implementation information is, of course, the Census Bureau. Its website includes the Operations Plan, as well as other documents that summarize different aspects of ACS implementation.

The Census website also includes several archived documents that provide a useful perspective on how ACS was conceived and how it has developed. In addition to providing a historical record, these documents offer insights into how the survey may evolve over time. For these purposes, we recommend the documents described in the remainder of this section.

• United States Government Accountability Office, "American Community Survey: Key Unresolved Issues."8

The Government Accountability Office report on ACS noted the following as "key unresolved issues":

1. Introduction of a new concept of residence: "Sufficient research has not been conducted to make the final set of rules for the 'current residence' used for ACS."

2.
Uncertainty about the new methodology for deriving independent controls for population and housing characteristics: "The Census Bureau has not developed a methodology for using the Intercensal Population Estimates (ICPE) program for the full ACS to derive controls consistent with the ACS residence concept and ACS reference period, or at the same level of geography used for the Census 2000 Long Form."

3. Lack of guidance for users on the characteristics of multiyear averages for small geographic areas: "Because of statistical properties of multiyear averages and users' unfamiliarity with them . . . it is critical for Census Bureau to provide users with guidance on topics such as reliability of multiyear averages for areas with rapidly changing populations, reliability of trends calculated from annual changes in multiyear averages, and the use of multiple estimates from ACS data for geographic areas with populations greater than 20,000."

8 GAO-05-82, October 2004. See www.gao.gov/new.items/d0582.pdf.
4. Operational procedures, such as questionnaire design, adjustment of dollar-denominated values, and consistency between ACS and Census 2000 data.

Alternatives to improve small geographic area data: one alternative for providing more reliable small-area data is to additionally fund a larger sample for 2009-2011, providing a replacement for the Long Form one year earlier.

• Barry Edmonston and Charles Schultze, editors, "Modernizing the U.S. Census."9

9 Barry Edmonston and Charles Schultze, eds., "Modernizing the U.S. Census." Washington, D.C., National Academies Press, 1995. See www.nap.edu/openbook/0309051827/html/index.html.

This study provides a review of the traditional U.S. census; considers ways to improve coverage, reduce differential undercount, and limit enumeration; examines needs for small-area data during intercensal years; and explores the use of sampling methods. It recommends ways to improve initial response rates for both Short and Long Forms, examination and testing of questions on race and ethnicity, and the use of continuous measurement and other methods to obtain small-area data and reduce costs, and it suggests a new design for the census questionnaire. Several recommendations relate to the improvement of MAF/TIGER and the development of intercensal estimates for small areas.

Alternatives to the decennial census, such as use of administrative records, a national register, and a rolling census, also are presented.

The panel evaluated the uses of Long Form data and concluded that ". . . in addition to data to satisfy constitutional requirements, there are essential public needs for small area data and data on small population groups of the type and breadth now collected in the decennial census." In the panel's judgment,

– The Long Form is not responsible for the decline in response rates or the increase in costs in previous censuses;
– Dropping the Long Form would not have a very large effect on response rates; and
– The inclusion of the Long Form questionnaire for a large sample of households is a cost-effective way of obtaining highly valuable information.

• Daniel L. Cork, Michael L. Cohen, and Benjamin F. King, editors, Panel on Research on Future Census Methods, National Research Council, "Reengineering the 2010 Census: Risks and Challenges."10

10 Daniel L. Cork, Michael L. Cohen, and Benjamin F. King, editors, Panel on Research on Future Census Methods, National Research Council, "Reengineering the 2010 Census: Risks and Challenges." Washington, D.C., National Academies Press, 2004. See www.nap.edu/catalog/10959.html.

This study examined the Census Bureau's current plans for a reengineered Census 2010, with MAF/TIGER enhancements, the American Community Survey, and Early Integrated Planning as its core concepts. The panel strongly supported the major aims of the Census Bureau's emerging plan for 2010, while noting that considerable challenges must be overcome for the innovations to be successful. Specifically, the panel noted that the Census Bureau should

– Develop a sound evidentiary base for its 2010 census plan.
– Identify, articulate, and quantify risks in the census process (especially the impact of reduced funding on the quality of ACS estimates for small-area data). The panel especially noted the need for a clear and early decision on ACS and contingency plans for the traditional Long Form if full ACS funding were not forthcoming. Other ACS issues that concerned the panel included the collection of Group Quarters data, the risk of a
voluntary versus mandatory response, interaction with intercensal population estimates and the demographic analysis programs, and the use of sequential hot-deck imputation for the treatment of individual non-response.

– Develop a comprehensive plan for updating and improving the MAF. The panel notes that each of the tasks related to modernization of TIGER carries considerable risk, especially the timeliness of realignment of TIGER geographic features to be consistent with GPS coordinates and the conversion of MAF/TIGER from its current homegrown format to a modern object-oriented computing environment.
– Work with the Postal Service in assessing the quality of the Delivery Sequence File.
– Analyze the Community Address Updating System, and justify plans to implement a complete block canvass.

• Constance F. Citro, Daniel L. Cork, and Janet L. Norwood, editors, Panel to Review the 2000 Census, National Research Council, "The 2000 Census: Counting under Adversity."11

The panel's overall conclusion was that "Census 2000 experienced both major successes and significant problems." The successes pointed out in the report are the completeness of demographic coverage and the quality of basic demographic data. Census 2000 saw a halt to the decline in mail response rates, and operations were conducted in a timely manner. Net undercounts were lower in Census 2000 than in the 1990 Census. The problems cited included errors in the MAF, a large number of duplicates, problems with some Long Form items such as employment and income, and inaccuracies in the enumeration of the Group Quarters population.

The panel found that "census counts at the block level, whether adjusted or unadjusted, are subject to high levels of error, and hence should be used only when aggregated to larger geographies." The lack of agreement until 1999 on the basic design hampered planning and increased costs for Census 2000.
The panel recommended that the Census Bureau, the administration, and Congress agree on the basic design for Census 2010 and the ACS by 2006.

In its assessment of Census 2000 operations, the panel found "limited pieces of evidence to suggest some problems in the imputation of whole persons." An administrative records experiment conducted in five counties showed that 41 percent of imputed census households were larger in size than linked administrative households, while 27 percent were smaller. "Missing data rates for some Long Form items were high in many cases; in some cases, higher than the comparable rates in 1990. The Census Bureau relied on imputation of these items on procedures that it used for many censuses with little evaluation of their appropriateness or effectiveness." The panel also determined that "The Census Bureau should conduct experiments to test the relative costs of more imputation versus more follow-up before deciding whether to continue the 2000 strategy in 2010."

For the household population, missing data item rates were high (10 percent or more) for over one-half of the Long Form items, and very high (20 percent or more) for over one-sixth of the Long Form items. Given these high rates of imputation, the panel recommended that the Census Bureau develop procedures to quantify and report the variability of 2000 Long Form estimates, further study the effects of imputation, and conduct research on improving imputation methods for ACS (or the 2010 Census if it includes a Long Form).

11 Constance F. Citro, Daniel L. Cork, and Janet L. Norwood, editors, Panel to Review the 2000 Census, National Research Council, "The 2000 Census: Counting under Adversity." Washington, D.C., National Academies Press, 2004. See www.nap.edu/catalog/10907.html.
With respect to the MAF, the panel recommended that the Census Bureau develop procedures to accurately identify housing units within multi-unit structures, redesign the Local Update of Census Addresses (LUCA) program to benefit participating state and local governments, and plan evaluations of the MAF well in advance of the 2010 Census. The panel also recommended the development of an improved Accuracy and Coverage Evaluation (ACE) program for the 2010 Census.