Assessing the 2020 Census: Final Report (2023)

Chapter: Appendix C: Additional Detail and Reference on 2020 Census Operations

Suggested Citation:"Appendix C: Additional Detail and Reference on 2020 Census Operations." National Academies of Sciences, Engineering, and Medicine. 2023. Assessing the 2020 Census: Final Report. Washington, DC: The National Academies Press. doi: 10.17226/27150.

– C –

Additional Detail and Reference on 2020 Census Operations

This appendix summarizes procedures used to conduct the 2020 Census, based primarily on U.S. Census Bureau presentations made in public sessions of the panel in 2021 (see Appendix B in the interim report). It has sections on the following: (1) goals for the 2020 Census; (2) hiring, training, and retention of field staff; (3) development of the Master Address File (MAF) used to initially contact households through the U.S. Postal Service (USPS) or delivery by census field staff; (4) types of enumeration areas (TEAs); (5) attempts to encourage self-response by household residents through outreach, partnerships, and promotion; (6) contacting household residents to provide multiple ways of responding, and enumerating people in group quarters (GQs) and other nonhousehold situations; (7) following up with initial nonrespondents and closing out nonresponsive households; (8) data processing and creation of data products; and (9) assessments of the quality of the counts and the effectiveness of various census processes.

C.1 GOALS FOR THE 2020 CENSUS

The processes used to conduct the 2020 Census were designed to achieve the following goals: (1) to provide a complete and accurate count so that every resident is counted once, only once, and in the right location; (2) to contain costs and improve efficiency in field operations; (3) to maintain security of data during transmission and on the Census Bureau’s computing network; and (4) to provide alternative ways for the public to self-respond. A goal added toward the end of 2020 Census planning was to use newly developed disclosure avoidance techniques for the 2020 data products.

To achieve these goals in a cost-effective manner, an initial objective of the 2020 Census was to reengineer field operations. Field reengineering had three objectives: (1) streamline the office and staffing structure; (2) increase the use of technology; and (3) increase the effectiveness of field management. Steps to improve field management included giving managers tools to provide better data on the status of cases in the field, along with improved communications and redesigned quality-assurance operations. Technology was used to automate the provision of work assignments for field personnel, automate aspects of field personnel recruitment and training, automate compensation and expense reporting, and reduce reliance on paper and the manual processing of responses.

C.2 HIRING, TRAINING, AND RETENTION OF FIELD STAFF

Field staff were essential to many 2020 Census operations, including In-Field Address Canvassing, Remote Alaska, Update Leave, Update Enumerate, some Group Quarters Enumeration operations, and, most importantly, Nonresponse Followup (NRFU). The Census Bureau implemented a web-based Recruitment and Assessment System; over 3.9 million applicant profiles were created in the system over the course of the 2020 Census, and almost 3.1 million applicants were available for job selection. The Census Bureau employed over 500,000 temporary staff during the census, primarily for NRFU, with nearly 300,000 people working during the peak week of NRFU in early August 2020.

Because of COVID-19 restrictions, field training for NRFU had to be modified from a partially office-based approach to a virtual training program. There were also 2 hours of in-person enumerator training during which enumerators received equipment. (The original plan had been for 2 days of in-person training.) In addition, extraordinary efforts were made to retain field staff. The Census Bureau extended the recruitment period by 5 months to run through September 2020 and offered incentives to enumerators for greater production. High-performing enumerators who agreed to travel were asked to help complete NRFU in areas that were experiencing production shortfalls.

Field operations were facilitated by technology. Address canvassers used laptops equipped with the Listing and Mapping Application (LiMA) to complete in-field address work. NRFU enumerators used smartphones. Assignments were generated online and adjusted dynamically—for example, to account for late self-responses in the NRFU assignments.

C.3 CREATING THE ADDRESS LIST FOR THE 2020 CENSUS

As described in more detail in Chapter 5, the 2020 Census continued the practice of using an ongoing repository of address information—the MAF—as the basis for the operational frame for the census. The terminology differed slightly: in the 2000 and 2010 Censuses, a snapshot extract was constructed immediately prior to the start of main census operations and termed the Decennial MAF (DMAF), while the 2020 Census was described as applying various filters to the ongoing, fluid MAF database to define specific operational universes. The MAF-building process for 2020, however, differed greatly in methodology and approach, through the major innovation of in-office canvassing methods to review MAF addresses without fieldwork (and, pointedly, without the 100-percent field canvass of every block in the nation that was done in 2000 and 2010).

C.3.1 2010 Decennial Master Address File Development

An overview of the analogous process used for the development of the 2010 DMAF is offered for purposes of comparison. The goal of the 2010 DMAF development process was to provide the primary address list update and validation activity for the 2010 Census enumeration frame. The Census Bureau deployed a national-level, automated, paperless data-collection and transmission process, primarily making use of handheld computers that were provided to census field workers. The Address Canvassing operation, which covered 100% of the residential addresses in the United States, was managed by opening 150 Local Census Offices, under 12 Regional Census Centers, early in the census cycle. It was conducted by over 150,000 field staff, who started on March 30, 2009, and completed their work on July 10, 2009.

In addition to verifying addresses for census mailing purposes, a major focus of the 2010 Address Canvassing was the collection of GPS coordinates (a “map spot”) for every entry on the MAF, for inclusion in the MAF and the Census Bureau’s companion Topologically Integrated Geographic Encoding and Referencing (TIGER) System, so as to associate each MAF entry with a specific geographic location.1 The exception was large blocks containing more than 1,000 addresses, which were handled by the separate Large Block Address Canvassing (LBAC) operation, conducted from February 2, 2009, until June 17, 2009. LBAC used laptop computers to collect information on the addresses because the handhelds had limited storage space.

___________________

1 Prior to the collection of map spot data—or in the absence of those coordinates—geographic locations and placement of address numbers within census blocks can be done by exploiting the topological nature of the TIGER database. Line segments corresponding to streets are also coded with beginning and end addresses, on both sides of the street as applicable, so a specific street address can be assigned an approximate geographic location by interpolation along the line segment.
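The interpolation described in this footnote can be sketched in a few lines of code. This is a minimal illustration of the idea only; the segment record and its field names are hypothetical stand-ins, not the actual TIGER schema.

```python
def interpolate_address(house_number, seg):
    """Approximate a location for a house number by linear interpolation
    along a street segment, in the spirit of TIGER-style geocoding.

    `seg` is a hypothetical record holding the segment's endpoint
    coordinates and the address range along one side of the street.
    """
    lo, hi = seg["from_addr"], seg["to_addr"]
    if not (lo <= house_number <= hi):
        raise ValueError("house number outside segment's address range")
    # Fraction of the way along the segment (0 at the 'from' end).
    frac = (house_number - lo) / (hi - lo) if hi != lo else 0.5
    (x0, y0), (x1, y1) = seg["from_xy"], seg["to_xy"]
    return (x0 + frac * (x1 - x0), y0 + frac * (y1 - y0))

# Example: a segment covering house numbers 100-198 on one side of a street.
segment = {"from_addr": 100, "to_addr": 198,
           "from_xy": (-77.0100, 38.9000), "to_xy": (-77.0000, 38.9000)}
print(interpolate_address(149, segment))  # roughly the segment midpoint
```

Real geocoders must additionally handle odd/even address parity, multiple segments per street, and offsets away from the street centerline; none of that is attempted here.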

For GQs—primarily college/university student housing, residential treatment centers, skilled nursing facilities, group homes, correctional facilities, and workers’ dormitories—the 2010 Address Canvassing operation identified each possible residence as either a housing unit, a GQ that was included in the 2000 Census DMAF inventory, or a potential GQ from other sources, such as administrative records. Address canvass listers marked addresses that were not housing units as Other Living Quarters (OLQs); those addresses were included in the 2010 Census Group Quarters Validation (GQV) operation universe. Specially trained staff used the GQV questionnaire to verify that the address had the correct census geography and to determine the status of the OLQ address as a GQ, a housing unit, a transitory location, or as nonresidential, vacant, or nonexistent. Staff also classified the type of GQ and collected the maximum number of residents who could live or stay at the address. These additional GQs, housing units, and transitory locations were added to the developing decennial master address list.

C.3.2 Intercensal Additions to the MAF

Master Address File Coverage Study

In previous decennial censuses, the Census Bureau used a 100% block canvass to update the MAF from the previous census. This operation was increasingly viewed as inefficient, since a large percentage of residential blocks experience no change from one census to the next. Accordingly, development of the MAF for the 2020 Census was planned to proceed in two phases. First would come an “in-office” phase, conducted by census geographers who used imagery to distinguish stable blocks from “active” blocks that had experienced changes since the previous census (i.e., differed from the current MAF). It would be followed by an “in-field” phase, conducted by field personnel as a targeted block canvass of the subset of residential blocks identified as subject to change in the intercensal period. This canvassing attempted to physically locate each address and determine whether it was a housing unit.

The MAF Coverage Study was intended to be an ongoing field activity that listed a sample of 20,000 nationally representative blocks annually. This sample was planned to measure the coverage of the MAF, validate the use of in-office canvassing procedures that could determine which blocks had additions or deletions during the intercensal period, improve in-field data-collection methods, and provide updates to the MAF on a continuous basis. Field work started in April 2016, but only one report was produced—the June 2016 Monthly Status Report—and the program was discontinued in 2017 due to budgetary constraints.

U.S. Postal Service Delivery Sequence File Updates to the Master Address File

Throughout the 2010–2020 decade, various versions of the USPS’s Delivery Sequence File (DSF) were used to update the ongoing MAF. During the intercensal period, the USPS provided three files to the Census Bureau twice a year: (1) the DSF, a database of all mail delivery addresses that, as implied by the name, is used to structure routes for mail carriers; (2) a locatable address conversion service file; and (3) a list of ZIP codes and the plus-four codes served by USPS. MAF/TIGER uses these addresses and locations to update the information on current addresses and to add new addresses to the MAF in what is often termed a “refresh.” The version of the MAF for use in the subsequent census, the DMAF, is initiated about a year prior to the start of the census through an update using the most current DSF. That updating, in 2018, added 5.3 million new residential addresses beyond the final version of the DMAF from the 2010 Census. Another 2.3 million current residential addresses were in the 2010 DMAF alone. While there is substantial overlap, both systems—DMAF and DSF—have unique addresses.
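At its core, the “refresh” described above is a match-and-merge of DSF records into the MAF. The following is a minimal sketch, assuming addresses have already been normalized into comparable key strings; the real MAF/TIGER matching is far more elaborate, and the record fields shown are hypothetical.

```python
def refresh(maf, dsf_records):
    """Merge a DSF delivery into a MAF-like dict keyed by a normalized
    address string. Existing entries are updated in place; unmatched
    DSF addresses are added as new records."""
    matched, added = 0, 0
    for rec in dsf_records:
        key = rec["address"].upper().strip()   # stand-in for real normalization
        if key in maf:
            maf[key].update(rec)               # refresh attributes of a match
            matched += 1
        else:
            maf[key] = dict(rec)               # new residential address
            added += 1
    return matched, added

maf = {"101 OAK ST": {"address": "101 Oak St", "geocoded": True}}
dsf = [{"address": "101 Oak St", "vacant": False},
       {"address": "103 Oak St", "vacant": False}]
print(refresh(maf, dsf))  # one match refreshed, one address added
```

The one-way nature of the merge matters: a refresh adds and updates addresses but does not delete MAF entries absent from the DSF, which is one reason the two systems each retain unique addresses.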

Updates to the Master Address File through Partnerships

Geographic Support System (GSS) Beginning in 2011, the Census Bureau implemented a Geographic Support System Initiative for continuous updating of MAF/TIGER and related activities, working with public- and private-sector partners. The initiative became the Geographic Support System (GSS) Program in 2015. Its activities included: maintaining the MAF/TIGER system; collecting and assigning addresses to geographic locations; identifying, collecting, and maintaining geographic entities and their boundaries; preparing and maintaining maps that support data collection and dissemination of Census Bureau data; developing and maintaining partnerships with tribal, federal, state, and local governments and commercial entities for the acquisition of address and mapping data; and coordinating federal and international geospatial activities with other federal agencies and the appropriate international groups. The GSS was coordinated with the Local Update of Census Addresses (LUCA) Program in 2018, attempting to acquire address or road data for 15,000 census tracts located within government entities that were not participating in LUCA.

Through the GSS, 106.7 million addresses were acquired between 2012 and 2018. Of those addresses, 106.2 million (99.5%) matched addresses already in the 2010 DMAF, confirming its quality. Geospatial locations were improved or corrected for 75.1 million addresses. In addition, through the Ungeocoded Resolution Project, 810,899 addresses (72% of addresses reviewed) that previously were not assigned to a census block were geocoded.

Local Update of Census Addresses (LUCA) LUCA resulted from the Census Address List Improvement Act of 1994. The Act allowed tribal, state, and local governments to review and provide updates to the Census Bureau’s MAF for their jurisdictions under the condition of confidentiality.2 Updates could include addresses that were missed, addresses that were included in error, and addresses located in the wrong block. This updating began two years prior to the decennial census to allow time for validation (LUCA Validation) and an appeals process to be completed. The LUCA operation for the 2020 Census began in April 2018 and was completed in March 2019.

LUCA consisted of the following phases: (1) LUCA Outreach—promotions, training, and registration; (2) LUCA Review of Materials—submission of address lists by participants; (3) LUCA Returns and Validation—changes made to the MAF by the Census Bureau; and (4) LUCA Feedback and Appeals and LUCA Closeout. LUCA validation included a full in-office review of LUCA-submitted changes that were not validated through an automated match to the MAF. Such changes included potential additions that did not match an existing GSS or MAF record, addresses that were moved by the local participant to a different block than indicated in MAF/TIGER, and additions of nonresidential addresses to the MAF. Of the addresses provided, 81.2% were matched to the current version of the MAF, leaving 3.46 million unmatched addresses.
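As a rough consistency check on the figures above (an inference, not numbers from the report), the 81.2% match rate and 3.46 million unmatched addresses together imply a total of roughly 18.4 million LUCA-submitted addresses:

```python
unmatched = 3.46e6      # LUCA-submitted addresses not matched to the MAF
match_rate = 0.812      # share of submitted addresses that did match

# If 18.8% of submissions were unmatched, the implied total follows directly.
implied_total = unmatched / (1 - match_rate)
implied_matched = implied_total * match_rate
print(f"implied total submitted: {implied_total/1e6:.1f} million")
print(f"implied matched:        {implied_matched/1e6:.1f} million")
```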

In-Office Address Canvassing

In-Office Address Canvassing (IOAC), which completed its initial review of the country in June 2017 and finished entirely by October 11, 2019, used satellite imagery to detect change or stability in blocks of housing units throughout residential areas in the United States and Puerto Rico. The program compared the number of housing units identified in the imagery to the number in the Census Bureau’s MAF/TIGER system. IOAC included Interactive Review, Active Block Resolution, Ungeocoded Resolution, and Group Quarters/Transitory Locations review. It also reviewed LUCA addresses that failed automatic matching to the MAF.

Interactive Review (IR) IR used imagery and validated data sources to identify changes and potential changes (e.g., new construction) on the ground related to residential structures. The intent was to eliminate field work in areas with no measured change or where the residential structure inventory was stable. Office staff used validated data sources to correct any identified issues and to resolve addresses that could not be automatically geocoded. Staff measured the extent to which the number of housing units in the MAF was consistent with the number of residential structures visible in current imagery. The result was the assignment of blocks to one of three categories: (1) passive—no observable change in the number of housing units over time, and the MAF accurately represented the number of housing units shown in the imagery; (2) active—observed changes in the residential landscape, or the number of housing units in the MAF differed from the number shown in the imagery; or (3) on hold—a review could not be completed due to poor imagery.

___________________

2 In contrast, GSS was a one-way sharing of addresses from partners to the Census Bureau and not the reverse.
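The three-way IR outcome amounts to comparing a block’s MAF housing-unit count against the count of structures visible in imagery. The sketch below is schematic only: the `tolerance` knob is hypothetical, and the Bureau’s actual decision rules were more nuanced.

```python
def classify_block(maf_units, imagery_units, imagery_ok=True, tolerance=0):
    """Assign an Interactive Review-style category to a block.

    maf_units     -- housing units on the MAF for the block
    imagery_units -- residential structures counted in current imagery
    imagery_ok    -- whether the imagery was good enough to review
    tolerance     -- allowed MAF/imagery discrepancy (hypothetical knob)
    """
    if not imagery_ok:
        return "on hold"      # review could not be completed
    if abs(maf_units - imagery_units) <= tolerance:
        return "passive"      # MAF agrees with what the imagery shows
    return "active"           # discrepancy: candidate for further resolution

print(classify_block(24, 24))                     # passive
print(classify_block(24, 31))                     # active (apparent undercoverage)
print(classify_block(24, 31, imagery_ok=False))   # on hold
```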

Active Block Resolution (ABR) ABR was designed to research active blocks (those showing evidence of change in IR) using street-view imagery and other address resources, attempting to resolve them in the office rather than send them directly to In-Field Address Canvassing (IFAC). Blocks that were not fully resolved in ABR remained active and progressed to IFAC. ABR launched in April 2016 but was discontinued in February 2017 due to budgetary constraints, leaving IR as the main component of IOAC.

Ungeocoded Resolution (UR) This operation began in April 2017 and finished in February 2020. UR was designed to assign a block location—a geocode—to residential addresses lacking a geocode in the MAF. Staff used data sources in the Matching and Coding System and Geographic Information System viewer. They added or edited spatial features and address ranges in the MAF/TIGER system based on information from local source data.

Group Quarters/Transitory Locations (GQ/TLs) This operation began in September 2017 but stopped in March 2018 due to budgetary constraints. The GQ/TLs operation attempted to verify, update, and validate GQ/TL addresses in the MAF, using administrative data sources, local geographic information systems (GIS) data, public and commercial information, and by calling administrative contacts.

In-Field Address Canvassing

IFAC was similar to the 100% block canvass used in the 2010 Census but focused on the subset of blocks for which there was direct evidence of change since 2010. In residential blocks that showed evidence of change in the number of housing units over the decade, listers equipped with laptops were deployed to canvass over 50 million addresses. The resulting effort proved 31% more efficient than the initial plan, which had called for a 100% block canvass.

Blocks determined, through IOAC and other sources, to need canvassing were:

  • Blocks identified as having undercoverage during IR, except for those containing only single-family housing and with a history of complete and accurate updates from the USPS DSF within a specified timeframe. Blocks could include undercoverage only or undercoverage in combination with overcoverage, growth, and decline.
  • Any block with overcoverage of two or more housing units and no undercoverage. Blocks could include overcoverage only or overcoverage in combination with growth/decline.
  • Any block exhibiting growth or decline and an inconsistent history of DSF updates, even when IR did not classify the block as having overcoverage or undercoverage.
  • Any block for which IR detected decline but that was not classified as having overcoverage or undercoverage.
  • Any block with on-hold status, suggesting that the block was not covered by adequate imagery for update, or that future growth could occur.
  • Any block that was adjusted through use of the LUCA program.
  • Any block that was “triggered” for rereview (e.g., because of a MAF update subsequent to IR) and could not be resolved in-office.
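Taken together, the criteria above form a disjunction of per-block conditions. The following sketch mirrors that logic; the block record and its field names are hypothetical, not the Bureau’s production data model.

```python
def needs_in_field_canvass(block):
    """Decide whether a block goes to IFAC, mirroring the bulleted criteria.
    `block` is a hypothetical dict of IR findings and block history."""
    if block.get("on_hold"):
        return True   # inadequate imagery, or future growth possible
    if block.get("undercoverage"):
        # Exempt stable single-family blocks with a reliable DSF history.
        if not (block.get("single_family_only") and block.get("reliable_dsf")):
            return True
    if block.get("overcoverage_units", 0) >= 2 and not block.get("undercoverage"):
        return True
    if block.get("growth_or_decline") and not block.get("reliable_dsf"):
        return True   # growth/decline with an inconsistent DSF history
    if block.get("decline") and not (block.get("overcoverage_units")
                                     or block.get("undercoverage")):
        return True
    if block.get("luca_adjusted") or block.get("triggered_unresolved"):
        return True
    return False

# A stable single-family block with trustworthy DSF updates stays in-office:
print(needs_in_field_canvass(
    {"undercoverage": True, "single_family_only": True, "reliable_dsf": True}))
```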

IFAC began on August 4, 2019, and was completed on October 11, 2019. It was conducted in 22% of the Basic Collection Units (BCUs) used in 2020 Census operations,3 which contained 34.9% of addresses in Self-Response (TEA 1) areas (described below). Address information was collected using laptops managed by the Decennial Device as a Service program and equipped with LiMA. Field staff canvassed specific geographic areas to identify every place where people could live or stay, comparing what was seen on the ground to the existing MAF, verifying or correcting address and location information, adding records for missing addresses, and removing addresses for residences that did not exist.

The IFAC operation made use of a soft launch, so that the procedures could be beta tested prior to more universal adoption. After IFAC, field staff had to process LUCA appeals and addresses from the New Construction and Count Review operations.

During the 2020 Census itself, several operations provided additional addresses to the MAF. Areas designated for Update Leave (in-person delivery of questionnaire packages) and Update Enumerate/Remote Alaska (in-person interviewing) were ineligible for IFAC because of the “Update” component of those operations; field staff reviewed and updated MAF entries as part of their work in these areas, which were mainly rural and lacking mail delivery. Though address list work was not a primary focus of NRFU, NRFU field interviewers had the capacity to collect MAF information on new addresses that they encountered on their rounds. New for the 2020 Census, the Office-Based Address Validation operation applied IOAC methods to attempt to resolve addresses from Non-ID Processing that did not directly match the MAF. Finally, the special enumeration operations for GQs and transitory locations included an Advance Visit phase that was important for building and updating the address information for those sites.

___________________

3 The BCU superseded the collection block and assignment area constructs used in the 2010 Census. Collection blocks corresponded in most but not all cases to tabulation blocks, the difference being that collection blocks had to be bounded by visible features rather than invisible political boundaries. Assignment areas were groups of collection blocks assigned to field staff to conduct their work. The 2020 BCU does not have a readily available clear definition but combines aspects of the 2010 collection block and assignment area definitions to form a cluster of addresses that a field worker could cover in a single day. BCUs were adjusted over the course of the decade.

C.4 TYPES OF ENUMERATION AREAS

Prior to the census, the nation is delineated into TEAs. These assignments, made at fine geographic resolution (collection blocks in 2000 and 2010, BCUs in 2020), govern the way in which initial contacts to participate in the census are made, and also have implications for the way MAF updating proceeds in the areas.

The 2020 Census used five TEAs. The largest of these was TEA 1, Self-Response, which was designed to cover about 95% of the households in the nation. TEA 1 succeeded the Mailout-Mailback TEA designated in the 2000 and 2010 Censuses (the name change reflecting the innovation of internet response) and covered areas with predominantly home mail delivery rather than delivery to post office boxes. TEA 1 was further subdivided, based on American Community Survey (ACS) and other data, into two contact strategy groups (Internet Choice and Internet First, the former receiving a paper questionnaire earlier in the sequence of mailings and contacts than the latter) and into two language groups (households in areas with particularly strong concentrations of Spanish as the primary spoken language received a bilingual English-Spanish questionnaire). The contact strategies are discussed later in this appendix.

The second largest TEA in the 2020 Census was TEA 6, Update Leave,4 covering over 6.8 million addresses. It was applied in areas where the majority of households do not receive mail at their home’s physical location, such as small towns where mail is delivered only to post office boxes, or areas recently affected by natural disasters. In these areas, invitation packets that included paper census questionnaires were dropped off at the residences, with respondents asked to return the filled-in questionnaires either online or via the mail; there was no attempt to complete the interview in person. The “Update” part of the operation’s name refers to MAF updating; because census field staff were to visit addresses in these areas to deliver questionnaires, TEA 6 areas were excluded from IFAC and the address list update function was made part of the Update Leave work. Update Leave bears the numeric label 6 because it was not part of the original 2020 Census plan—the idea had been to handle all primarily rural, non-home-mail-delivery areas as Update Enumerate, described next—but the TEA was reinstated in later versions of the 2020 Census Operational Plan.

___________________

4 In previous censuses, TEAs bore labels punctuated with slashes—Update/Leave and Update/Enumerate—but these were dropped in the early documentation of 2020 Census plans.

The two remaining principal TEAs in the 2020 Census—TEA 2, Update Enumerate, and TEA 4, Remote Alaska—collectively comprised less than 1% of households in the nation, in mainly remote regions. They shared the same methodology: in-person interviewing at the time of the field enumerator’s visit, along with the MAF updating implied by the “Update” part of the name. TEA 2, Update Enumerate, was used in areas where it was difficult to deliver mail and where internet response was judged to be particularly difficult; in 2020, this included parts of northern Maine, southeast Alaska, and American Indian and Alaska Native areas that were offered their choice of enumeration procedures. TEA 4, Remote Alaska, as the name implies, covered large portions of Alaska where houses may not be connected by roads and internet connectivity is often poor; it was conducted in the same manner as Update Enumerate, with the significant difference of timing. Remote Alaska has become storied and publicized in recent censuses as the earliest field operation, beginning in January 2020, before the spring thaw and before residents leave their villages to fish and hunt. TEA 4 accounted for about 34,000 completed enumerations.

TEA 3, Island Areas, covered American Samoa, the Commonwealth of the Northern Mariana Islands, Guam, and the U.S. Virgin Islands. As with previous decennials, TEA 3 in the 2020 Census defined an essentially parallel census effort: the census was entirely paper-based in these areas and conducted by enumerators who began by listing the addresses and then conducting in-person interviews.

Early plans—and the initial delineation of TEAs for the 2020 Census—included TEA 5, Military, covering areas in and around military installations in the United States. The intent was to rely on administrative records-type data provided by the U.S. Department of Defense to count service personnel in TEA 5 areas and to use usual census methods for other residents. However, after reviewing initial data files, the Census Bureau opted in 2019 not to rely on records in these areas, dissolving TEA 5 and allocating its BCUs to TEA 1 (Self-Response) and TEA 6 (Update Leave) as appropriate.

C.5 CENSUS PARTNERSHIP, OUTREACH, AND PROMOTION ACTIVITIES

C.5.1 2020 Census Partnership Program

The goal of the 2020 Census Partnership Program was to motivate national, regional, and local organizations to assist the Census Bureau in urging people in their networks to respond to the census mailings or to cooperate with census field personnel. To this end, the associated Census Media Campaign ran local paid media in every market in the United States, in addition to national media, asking people to respond to census materials and personnel. Further, the Census Bureau relied on a number of programs, described below, to communicate the importance of cooperating with 2020 Census enumeration efforts.

Community Partnership and Engagement Program

The aim of the Community Partnership and Engagement Program (CPEP) was to enroll community partners to increase participation by people less likely to respond and often missed. The goals were to: (1) educate people about the 2020 Census and foster cooperation with enumerators; (2) encourage community partners to motivate people to self-respond; and (3) engage grassroots organizations to reach out to hard-to-count groups and those not motivated to respond. The CPEP partner organizations formed 8,470 Complete Count Committees, hosted more than 493,000 events, and completed more than 732,000 commitments to support the 2020 Census. CPEP partners and National Partnership Program partners (see below) organized activities such as digital weekends of action, recruiting events, Mobile Questionnaire Assistance events, Get Out the Count activities like block parties and caravans, phone banks, and text messaging campaigns. The partners also invited Census Bureau personnel to speak at their events. Further, the Census Bureau created more than 250 types of materials—fact sheets, posters, and social media toolkits—which were posted on the 2020 Census website and available in multiple languages. In addition, the Census Bureau shared information, resources, and operational updates with partners via an email list with nearly 80,000 subscribers. Local Partnership Specialists (Census Bureau staff hired specifically to facilitate CPEP) educated partners and stakeholders on updates and supported activities that encouraged households to respond.

National Partnership Program

The National Partnership Program (NPP) had goals very similar to those of CPEP, but it was targeted toward national rather than local organizations. The NPP signed up 1,090 national organizations (exceeding its goal of 900) by April 20, 2020.

Statistics in Schools

Regional Partnership Specialists partnered with local school districts, individual schools, Head Start programs, daycare centers, diaper pantries, and other community organizations to promote Statistics in Schools and the Counting Young Children initiative.

C.5.2 Integrated Partnership and Communications Operation

The goal of the Integrated Partnership and Communications Operation (IPC) was to communicate the importance of participating in the 2020 Census to the entire population of the United States, Puerto Rico, and the Island Areas. The hope was to engage and motivate people to self-respond, preferably via the internet. During later stages of the census, the goal shifted to raising and maintaining awareness throughout the full 2020 Census process to encourage response and cooperation. The IPC Operation ran local paid media in every market in the United States, in addition to paying for national media. The IPC Operation had four phases: Strategic Early Education Phase, Awareness Phase, Motivation Phase, and Reminder/NRFU Phase.

  • Strategic Early Education Phase (January–December 2019): The goal of this phase was to build public trust in Census Bureau efforts to encourage participation, and to educate residents about the importance of accurate counts. The strategy was to utilize the CPEP, Statistics in Schools, and various public relations outreach to get this message across.
  • Awareness Phase (planned for January 14–March 12, 2020; occurred January–February 2020): The goal of this phase was to notify the public about the upcoming 2020 Census, the purpose of the census, its importance, and how to respond to it. The strategy was to provide information on available means for completing the 2020 Census, explain where residents could access additional information and resources, and explain what residents could expect from the Census Bureau and its partners.
  • Motivation Phase (planned for February–April 2020; occurred March 13–August 2, 2020, due to COVID-19-related adjustments): The goal of this phase was to work toward census completion by informing residents that the 2020 Census was underway and reminding them to participate in one of several ways. The strategy was to deliver general and audience-specific messages to encourage participation, as well as to urge people to encourage friends and family members to participate. In particular, the hope was to focus on “fence-sitters” who were comfortable responding via the internet but may not yet have decided to do so.
  • Reminder/NRFU Phase (planned for May–August 2020; occurred August 3–September 27, 2020): The goal of this phase was to remind residents that the 2020 Census was taking place and encourage them to participate if they had not yet done so. The strategy employed was dependent on the availability of response modes.

C.5.3 Other Media and Outreach Efforts

Targeted E-mail Outreach

During July–August 2020, this operation delivered 48 million targeted e-mails to low-responding regions, resulting in 7.8 million digital impressions of motivational videos.

2020 Census Barriers, Attitudes, and Motivators Study

In the lead-up to the 2020 Census, the Census Bureau initiated an effort to better understand the attitudes and behaviors that underlie various levels of census participation across demographic and geographic groups. The Census Barriers, Attitudes, and Motivators Study (CBAMS) consisted of a population survey of 50,000 households (35% response rate) fielded in 2018 and 42 focus groups in 14 cities. The study found that motivators of census participation include the prospect of proper funding for community needs and services, such as additional hospitals, fire departments, and schools. CBAMS also discovered a general lack of knowledge about the decennial census's scope, purpose, and constitutional foundation, as well as general apathy toward the census and concerns rooted in privacy worries and distrust of government. These knowledge gaps and attitudes differed in important ways across subgroups.

Changes Due to COVID-19

The emergence of COVID-19 during the 2020 Census created problems for messaging and outreach efforts. First, the media campaign had to be shifted to account for the adjusted messaging and deadlines. Second, the pandemic made some of the outreach efforts more difficult to arrange. Third, the pandemic dominated various media communications, so it was difficult to prioritize intended census messaging.

C.6 SELF-RESPONSE AND OTHER TYPES OF ENUMERATION OPERATIONS

In most areas of the country, the 2020 Census, like previous censuses, strove to motivate self-response by households through a series of mailings (in addition to the outreach and media campaigns described in Section C.5). In some areas and for some types of living situations, census field staff played a major role in the enumeration: rural enumeration strategies (Update Leave, Update Enumerate, Remote Alaska) and GQ enumeration strategies (including Service-Based Enumeration and targeted non-sheltered outdoor locations, as well as the related Enumeration at Transitory Locations). Field staff also conducted Nonresponse Followup (NRFU); see Section C.7. COVID-19 affected all of these operations, to a greater or lesser degree, as summarized in Table C.1. The delays affected subsequent operations, including data processing and release.

Table C.1 Modification of Timing of 2020 Census Enumeration Operations Due to COVID-19

Operation Original Timing Actual Timing
Remote Alaska 1/21–4/30 1/21–3/18; 4/13–8/28
Self-Response Phase 3/12–7/31 3/12–10/15
Update Leave 3/15–4/17 5/4, 6/12–8/10
Update Enumerate 3/16–4/30 6/14–8/31
Service-Based Enumeration 3/30–4/1 9/22–9/24
Targeted Non-Sheltered Outdoor Locations 3/31–4/1 9/23–9/24
Group Quarters Enumeration (Residential) 4/2–6/5 4/2–9/3
Enumeration of Transitory Locations 4/9–5/4 9/3–9/28
Nonresponse Followup 5/13–7/31 7/16, 8/9–10/15

SOURCE: Fontenot (2021h:Slide 13).

C.6.1 Household Contact Strategies for Self-Response

2010 Census

A major difference between the 2010 and 2020 Censuses was that, in 2010, paper questionnaires were the primary medium for collecting data. Questionnaires were mailed to U.S. households in the Mailout-Mailback TEA using a mailing strategy consisting of multiple contacts. First, an advance letter was sent, informing households that the census questionnaire would arrive soon. Then the initial questionnaire was sent, followed by a direct mail postcard sent to select areas. Finally, an English-only replacement questionnaire was distributed using one of three treatments: (1) blanket replacement mailing; (2) targeted replacement mailing; or (3) no replacement mailing. During the 2010–2020 intercensal period, the Census Bureau tested a variety of alternative strategies to improve on the procedures used in the 2010 Census, drawing in particular upon the "wave methodology" used to promote self-response in recent Canadian censuses.

2020 Census

The goal of the Internet Self-Response (ISR) operation used in the 2020 Census was to communicate to the U.S. population the importance of self-responding, with the hope of generating the highest possible rate of self-response. High self-response would save considerable funds by reducing the need for expensive in-person follow-up. Toward this end, the Census Bureau developed communication and contact strategies to encourage use of the internet as the primary response mode, through a sequence of invitations and reminder mailings. A further goal was to increase the percentage of high-quality responses by increasing the opportunity to respond and the flexibility of response-mode options.

The first part of the mail strategy was to divide TEA 1 areas into “Internet First” and “Internet Choice” areas. Internet First areas were those for which the majority of residences had relatively easy access to the internet—specifically, census tracts not characterized as Internet Choice (see below). For those Internet First areas, the plan was to start with a letter from the Census Bureau, followed by a second letter, then a postcard, followed by a letter with a questionnaire included, and lastly an “It’s not too late” postcard. Each letter informed the residents about how to be enumerated online.

Internet Choice areas were census tracts with low overall self-response rates in the ACS and one of the following: low ACS internet response rates, high rates of people ages 65 and older, or low rates of internet subscribership, all determined using 5-year ACS data for 2013–2017. The sequence of mailings for Internet Choice areas included an initial letter with a questionnaire enclosed plus a link to respond online, followed by a letter, then a postcard, followed by a letter with a questionnaire, and then an “It’s not too late” postcard. Lastly, nonrespondents were mailed a final reminder postcard.

The timing of the originally planned contact strategy had to be modified as a result of the COVID-19 pandemic. Table C.2 provides the initial timings and adjustments made due to the pandemic.

In response to the pandemic, the Census Bureau added a sixth mailing (addressed postcards including a Census ID for internet response), sent to an estimated 50 million nonresponding housing units as of mid-July. There was also a seventh mailing, with a paper questionnaire, sent to about 16.2 million households in low-responding census tracts and targeted to households that had received only one paper questionnaire in the mail. Beyond these two added mailings, there was an "Every Door Direct Mail" postcard, sent to 4.5 million USPS boxes in areas where the physical addresses of housing units were not eligible for any form of USPS carrier delivery service.

Table C.2 2020 Census Self-Response Mail Contact Strategy and Adjustments for COVID-19

Internet First mailings: (1) Letter; (2) Letter; (3)a Postcard; (4)a Letter Plus Questionnaire; (5)a Postcard; (6)b Postcard; (7)c Letter Plus Questionnaire

Cohort 1 3/12 3/16 3/26 4/8 (4/14) 4/20 (4/27) 7/22–7/28 8/21–9/14
Cohort 2 3/13 3/17 3/27 4/9 (4/18) 4/20 (4/30) 7/22–7/28 8/21–9/14
Cohort 3 3/19 3/23 4/2 4/15 (4/22) 4/27 (5/4) 7/22–7/28 8/21–9/14
Cohort 4 3/20 3/24 4/3 4/16 (4/24) 4/27 (5/6) 7/22–7/28 8/21–9/14

Internet Choice mailings: (1) Letter Plus Questionnaire; (2) Letter; (3)a Postcard; (4)a Letter Plus Questionnaire; (5)a Postcard; (6)b Postcard

Cohort 3/13 3/17 3/27 4/9 (4/28) 4/20 (5/9) 7/22–7/28

a Targeted only to nonresponding households.

b Common postcard targeted to nonresponding housing units as of mid-July, including Census ID for internet response.

c Targeted to nonresponding housing units in low-responding census tracts that had received only one paper questionnaire in the mail.

NOTE: Dates are start dates for each contact; dates in parentheses are rescheduled start dates due to COVID-19. Only mailings (1)–(5) were part of the original planned contact strategy.

SOURCE: Fontenot (2021h:22–23); Hearns (2021:9–10); Fontenot (2021e).

Also in response to COVID-19, the Census Bureau had to adjust 2020 Census operations to protect the health of respondents and Census Bureau field personnel. New procedures had to be consistent with COVID-19 guidance from federal, state, and local authorities, without compromising the need to ensure a complete and accurate count. The Census Bureau monitored the rapidly changing conditions at the state and local levels and, in consultation with appropriate officials, updated the start dates for selected operations.

2020 Internet Instrument Design to Encourage Self-Response

Internet respondents received the following message on the 2020 Census online response landing page:

Welcome to the 2020 Census

  • It’s quick and easy. The 2020 Census questionnaire will take about 10 minutes to complete.
  • It’s safe, secure, and confidential. Your information and privacy are protected.
  • Your response helps to direct billions of dollars in federal funds to local communities for schools, roads, and other public services.
  • Results from the 2020 Census will be used to determine the number of seats each state has in Congress and your political representation at all levels of government.

Getting started:

  • You must complete your questionnaire once you begin. If you leave the questionnaire and return later, you will have to start over.
  • Do not use the web browser buttons (back, forward, or close browser). Use the buttons within the questionnaire to navigate.
  • For best results, use the latest version of Chrome, Firefox, Internet Explorer, or Safari. Enable cookies.

The login page asked for a 12-digit Census ID, which could be found in the delivered census literature, but the ID was not required for response. People responding without a Census ID (Non-ID cases) were directed to a page on which they entered street address information for the place where they usually resided on April 1, 2020, before proceeding with the rest of the questionnaire. The address information in these Non-ID cases was then matched to the MAF to mark the address as complete or to designate it for further resolution.

2020 Census Summary of Self-Response

The original dates for Self-Response were March 12–July 31, 2020, with July 31 also being the original target end date for NRFU. The COVID-19 pandemic resulted in several shifts in the schedule for 2020 Census data collection, with Self-Response and NRFU both ultimately closing on October 15, 2020. Addresses ultimately enumerated through self-response comprised 67.0% of the total, exceeding the 66.5% enumerated through self-response in 2010.5 Of the addresses resolved by self-response in the 2020 Census, 79.7% were enumerated online, 18.1% by paper, and 2.1% by telephone.6 While the operations in 2020 and 2010 were not directly comparable, in 2020 all 50 states and the District of Columbia met or exceeded the percentage of addresses resolved by Self-Response in 2010. By mode, self-response rates (summing to 67.0%) were as follows: internet, 53.5%; paper, 12.3%; and phone, 1.2%. Finally, 93 million responses were collected via ISR, of which 76 million had Census IDs and 17.6 million did not.

Non-ID Processing

The Non-ID Processing operation enabled households to be enumerated via Self-Response without a Census ID (hence the designation Non-ID). The aim was to make it easier and more convenient for households to respond to the census via the internet even when they had not received, or had misplaced, the Census Bureau mailings containing an identification code (Census ID). The operation matched respondent-provided Non-ID addresses to the census living quarters address inventory and attempted to assign any remaining nonmatching addresses to census geographic units.

Some form of Non-ID Processing was used in the 2010 Census, through the availability of "Be Counted" forms in public locations, but solely on paper. The use of the internet as the primary response mode for the 2020 Census led to a major expansion in the number of Non-ID cases, from 2.9 million in 2010 to 17.6 million in 2020. Non-ID Processing was central to promoting the 2020 Census as something respondents could complete at any time and from any device. The switch to internet collection also allowed for some improvements in Non-ID work. First, the internet-response instrument allowed for consistent address collection during ISR and Census Questionnaire Assistance (telephone) interviewing, facilitating immediate edits and quality checks on provided addresses. Second, real-time address matching during internet and telephone response supported a feedback loop in which a respondent could potentially provide a better address to increase the chance of a match. Third, the use of administrative records data could sometimes correct erroneous address information provided by a respondent (e.g., a misspelled street name) or supply omitted data (e.g., an apartment number).

Table C.3 shows the disposition of Non-ID Processing cases—whether the respondent-supplied addresses matched to the MAF or not—in the 2020 Census.

___________________

5 The denominator for these percentages is total addresses in the 2020 MAF, including addresses that turned out to be occupied or vacant housing units and addresses that turned out to be nonresidential or nonexistent.

6 Many of those who called Telephone Questionnaire Assistance were asked during the call whether they wanted to be enumerated on the call.

Table C.3 2020 Census Non-ID Workload by Process and Outcome

Outcome Percent
Total Non-ID Workload (17.6 million households) 100.0
  Matched 91.2
    Matched via Automation 83.9
      Matched in Real Time 83.3
      Matched Asynchronously 0.6
    Matched Clerically 7.3
  Nonmatched, geocoded, and verified 5.2
  Nonmatched, unable to verify 3.6

SOURCE: Donello (2021:Slide 13).

The 5.2% of the total 17.6 million addresses that could not be matched to the MAF but were ultimately geocoded and verified is particularly important, as it represents a set of previously unknown addresses that could be added to the MAF.

Coverage Improvement Program

The 2020 Census Coverage Improvement Program was an attempt to resolve possible erroneous enumerations and omissions (people missed, counted at the wrong place, counted in error, or counted more than once) by contacting households that had responded but may have had coverage errors. The Census Bureau used respondent-provided telephone numbers to conduct telephone follow-up interviews with specific kinds of households to determine whether the household rosters on their census returns should be updated. Respondents were probed to identify whether residents had been missed or counted in error. The hope was to identify people who were not living or staying at the address on Census Day (April 1, 2020), or who were missing from the original roster but were living or staying at the address on Census Day. There was also an attempt to collect address information for people who indicated that they usually lived or stayed somewhere else.

Households were identified for follow-up through several checks. First, a discrepancy between the number of names on the roster and the reported population count of the household triggered follow-up. Second, if the undercount question was answered affirmatively, the household was followed up. Third, if someone usually lived elsewhere (e.g., in a type of GQ, at a seasonal home, with relatives, or for a job), the household was followed up.

C.6.2 2020 Census Rural Enumeration Operations: Update Leave, Update Enumerate, and Remote Alaska

2020 Update Leave

Update Leave (UL) occurred in geographic areas (TEA 6) characterized by one or more of the following: (1) housing units did not have city-style addresses; (2) most people received mail at post office boxes or at drop points; (3) the area had been affected by natural disasters; or (4) the area had high concentrations of seasonally vacant housing. The objectives of the 2020 UL operation were to update the address and feature data for each housing unit and to leave an Internet Choice questionnaire package at every housing unit, beginning on March 15, 2020. UL personnel used the same LiMA software and device used for 2020 In-Field Address Canvassing (IFAC). The questionnaire packet used in areas with city-style addresses was expanded for this purpose to include all modes of Self-Response: online, by phone, or by mailing back the questionnaire. Questionnaires contained a generic ID and barcode that represented the housing unit's response once linked to the address during the update process. Two reminder mailings were planned: (1) a letter, mailed on or around April 1, with the associated ID for the address to respond online; and (2) a postcard, mailed on or around July 17, to any households from the first mailing that did not respond. Housing units in UL areas that did not self-respond were sent to the NRFU operation for follow-up.

UL operations were paused on March 18, 2020, due to COVID-19, at which point work was completed for only 10.8% of the addresses and quality control work had not yet begun. In coordination with state and local health officials, a phased restart began the week of May 4 with the reminder mailings rescheduled accordingly. Operations initially resumed in 23 Area Census Offices (ACOs) in 13 states. All ACOs resumed UL operations by the week of June 11. By July 16, nearly all (99.5%) of UL addresses were enumerated or otherwise resolved, and the remainder were completed by August 10. UL resulted in paper questionnaires and invitations to respond online being delivered to over 6.8 million addresses.

2020 Update Enumerate

The Update Enumerate (UE) operation was initially planned for areas (TEA 2) where the initial visit required enumeration while updating the address frame. The primary functions of UE included: (1) verifying and updating the address list and feature data; (2) determining the type and address characteristics of each living quarters; and (3) enumerating respondents at housing units. UE occurred in areas that were part of the 2010 Census Remote Update/Enumerate operation, such as northern parts of Maine and southeast Alaska, along with select tribal areas that requested to be enumerated in person during the initial visit. If no contact was made with the residents, the field staff left a notice of visit that was linked to the address, inviting the housing unit to respond online.

Original plans for the 2020 Census had UE as the primary TEA alternative to Self-Response. However, in May 2017, the Census Bureau decided to reinstate the UL operation for the 2020 Census, with most of the addresses planned for UE switched to UL. The original operational dates for UE were March 16–April 30, 2020, but due to the COVID-19 pandemic, UE was suspended on March 18 and did not resume until June 14. In the revised UE operation, field staff used masks, gloves, and hand sanitizer, and maintained a distance of at least 6 feet during enumeration. The UE operation concluded on August 31.

2020 Remote Alaska

The objectives of the methodology used in the Remote Alaska operation (TEA 4) were the same as those for the 2020 UE operation. The operation focused on address updating and enumeration in Alaska Native villages, and it included advance contact and enumeration of the GQs and transitory locations (TLs) within the Remote Alaska area. The initial workload was over 28,000 living quarters. The original operational dates were January 21–April 30, 2020. Due to the COVID-19 pandemic, the operation was halted on March 18, resumed on April 13, and finished on August 28.

C.6.3 Enumeration of Group Quarters in the 2020 Census

A GQ is a place where people live in a group living arrangement that is owned or managed by an entity. The services provided may include custodial or medical care, and residency is often restricted to people receiving those services. People living in GQs are usually not related to each other. GQs include such places as college/university student housing, residential treatment centers, skilled nursing facilities, group homes, correctional facilities, military barracks or ships, and workers’ group living quarters.

As part of the GQ operation, the Census Bureau developed special enumeration procedures to provide opportunities for people experiencing homelessness to be counted at service locations and pre-identified outdoor locations: soup kitchens, regularly scheduled mobile food vans, and targeted non-sheltered outdoor locations. In all, there were close to 30 types of GQ codes and definitions.

The GQ population comprised about 2.5% of the total population in 2020. While this is a relatively small percentage overall, GQs can comprise a substantial percentage of the population of the jurisdictions in which they are located (especially correctional institutions and college and university populations).

For all GQs, the Census Bureau endeavored to identify a GQ administrator, in some cases working through national associations and organizations to obtain contact information for administrators of the various GQ types. The administrator was to help the Census Bureau determine the type of GQ, how to gain access to residents, and how the residents would prefer to be counted. As an example, assisted living facilities are considered housing units, but any floor or wing where residents receive 24-hour care 7 days per week (e.g., skilled nursing) is itself considered a GQ. As a result, some of the residents would have received an invitation to respond via the internet or mail, while the administrator would decide how the GQ residents would participate in the census.

Some student housing facilities are considered housing units, while others are considered GQs. If a hotel is considered a person's "usual home elsewhere" (the place other than the GQ/TL where the person lives and sleeps most of the time), it is considered a housing unit, but it is a GQ if the hotel houses people because of a natural disaster. If all units of the hotel are housing individuals experiencing homelessness, it is considered a GQ. If a hotel is housing both individuals experiencing homelessness and individuals who are paying for extended care, it is considered a housing unit.

Strategies for Enumeration of Residential Group Quarters

Residential GQs represented a challenge to census enumeration. There were different enumeration strategies for many types of residential GQs. Also, GQ administrators for several types of GQs had more than one option that could be used to count residents, including various paper data-collection methods and an electronic data-collection method. Strategies were designed to be flexible because GQ administrators sometimes needed to change enumeration methods at the time of enumeration.

GQ enumeration procedures used in the 2020 Census included: (1) in-person interviews; (2) drop-off/pick-up of questionnaires; (3) facility self-enumeration; (4) paper listings sent to enumerators to send to ACOs; (5) mailout/mail-back of questionnaires from ACOs to GQ administrators and back; and (6) eResponse—the electronic transfer of data records, which entailed GQ administrators uploading administrative response data for individuals living or staying in their facility to a secured web-based Census Bureau portal.

The three key principles of the criteria for GQ enumeration in 2020 were essentially unchanged from 2010: (1) count people at their usual residence, the place where they live and sleep most of the time; (2) count people in certain types of group facilities on Census Day at the group facility; and (3) count people who do not have a usual residence, or who cannot determine a usual residence, where they are on Census Day. The 2020 Census GQ operations included: (1) Group Quarters Advance Contact (In-Office/In-Field) and Enumeration; (2) Service-Based Enumeration; (3) Maritime/Military Vessel Enumeration; and (4) Deployed Military Enumeration.

Group Quarters Advance Contact and Enumeration

Group Quarters Advance Contact ran from February 3–March 6, 2020. Area Census Office clerks called GQ administrators to confirm and explain the procedures to be used for the upcoming GQ enumeration. Census workers then verified the GQ’s name, address information, contact name, phone number, and business email address. For the upcoming Census Day, census workers collected an expected population count and maximum population count. Census workers also helped the facility manager to decide on the method of enumeration based on the GQ type and scheduled an appointment for enumeration. If there was no available contact name or phone number, census staff made a personal visit to collect the above information and to discuss the enumeration procedures to be used.

GQ enumeration was planned to run from April 1–June 5, 2020, and eResponse data collection from April 1–May 1, 2020. Due to COVID-19, GQ enumeration was extended to August 26, 2020, as were eResponse and the mailback of questionnaires.

COVID-19 complicated census enumeration in the GQ population. Various GQs (e.g., nursing homes, hospitals, other health-based facilities, colleges/universities, group homes) had restrictions on access, some colleges/universities were closed, some enumerators refused to work, individuals in quarantine on military bases or ships were more difficult to account for, and staff were sometimes not available to handle specific planned activities.

For colleges and universities, GQ administrators originally selected methods that allowed students to self-respond as in previous censuses. The approaches initially agreed upon were as follows:

  • 47% chose the eResponse methodology (administrators would upload directory-level information about each student onto the Census Bureau’s secure server);
  • 35% chose the drop-off/pick-up option (census field staff would drop off individual census questionnaires for the administrator to distribute to students, after which census field staff would return to pick up the completed questionnaires);
  • 9% chose in-person interviews (census field staff would conduct interviews with students to complete the questionnaires); and
  • 10% chose paper response data collection (administrators would provide census field staff with a paper listing containing directory information about their students).

However, many students were sent home due to COVID-19 and were not living in student housing at the time of GQ enumeration, so Census Bureau staff reached out to GQ administrators that had chosen the in-person and drop-off/pick-up methods to ask them to choose either eResponse or the GQ paper-response data-collection option. Based on feedback from the U.S. Department of Education, the eResponse and paper-response methods had implications for the types of data that could be provided under the Family Educational Rights and Privacy Act of 1974 (FERPA): schools could not provide “Directory Information” for students without their consent, nor could they include race, ethnicity, or gender as directory information. The Census Bureau believes it counted 88% of college students during the regular GQ enumeration data collection. After data collection (December 2020), the Census Bureau telephoned student housing facilities and other GQ types to verify counts and attempt to convert refusals. As a result, college/university student housing enumeration increased from 88% to 97%.

Service-Based Enumeration, Targeted Non-Sheltered Outdoor Locations Enumeration, and Enumeration at Transitory Locations

Enumeration was conducted at service locations such as soup kitchens and homeless shelters, and at targeted non-sheltered outdoor locations (TNSOLs), so that people experiencing homelessness could be counted in the census. Locations where people might be staying on a transitory basis were also visited in the Enumeration at Transitory Locations (ETL) operation, to determine their usual residence (elsewhere) or to enumerate them at the location if they had no other usual residence. Service-Based Enumeration (SBE) was originally scheduled for March 30–April 1, 2020; TNSOL enumeration for March 31–April 1, 2020; and ETL for April 9–May 4, 2020.

In late May to early June, in response to the COVID-19 pandemic, Census Bureau representatives consulted with major stakeholders to determine alternative dates for SBE/TNSOL. Choosing new dates required balancing the need for a thorough and accurate enumeration against the needs of the external partners critical to conducting SBE. These discussions resulted in the decision to conduct SBE/TNSOL in fall 2020. A primary reason had to do with the effects of seasonality on the population experiencing homelessness: weather in late September approximates what would have been experienced in March, whereas summer would have been a more difficult time to enumerate this population, since fewer people use shelters and people experiencing homelessness are more spread out. In addition, at the time, it was anticipated that COVID-19 infections could recede by fall. Finally, the service providers at SBE sites were also contending with the effects of the pandemic, and waiting until fall gave them time to prepare. Over 53,000 service-based locations were ultimately enumerated, including almost 37,000 TNSOLs.

The rescheduled SBE/TNSOL operations were conducted September 22–24, 2020. The service-based locations included shelters for people experiencing homelessness, soup kitchens, and regularly scheduled mobile food vans. The primary enumeration methods were in-person interviews and paper listings, the latter picked up by field staff from GQ administrators. Drop-off/pick-up of questionnaires was an added option for shelters only. The primary method of enumeration for ETL was in-person interviews; this operation was conducted September 3–28, 2020.

Maritime/Military Vessel and Deployed Military Enumeration

The Maritime/Military Vessel Enumeration was planned to run from April 1–June 20, 2020. Due to COVID-19, the end date was extended to August 24, 2020. Military personnel temporarily deployed overseas and those stationed overseas were enumerated using administrative records.7

C.7 NONRESPONSE FOLLOWUP

In TEAs 1 and 6, NRFU was a critical part of census operations. The goal of the NRFU operation was to enumerate people living at each address for which no self-response had been received, or to determine that the address was vacant or should be deleted from the census because it no longer existed, was nonresidential, or was uninhabitable. NRFU was originally planned to run from May 13–July 31, 2020. The advent of COVID-19 necessitated that these dates be modified: the Census Bureau began a soft launch of NRFU on July 16, went to full implementation on August 11, and completed NRFU on October 15, 2020. This section describes the NRFU contact strategy, identification of vacant and nonexistent addresses, outcomes of NRFU, additional impediments for NRFU in 2020, a related operation (Census Questionnaire Assistance), and use of administrative records for enumeration.

C.7.1 NRFU Contact Strategy

The NRFU contact strategy had two primary purposes: (1) determine the housing unit status for nonresponding addresses; and (2) enumerate housing units for which a 2020 Census response was not received. Other activities included: (1) field verification of addresses from respondents using Self-Response that did not contain a Census ID (Non-ID cases) and that were not on the MAF; (2) enumeration of Update Leave (UL) cases that did not self-respond on first contact; and (3) resolution of additional cases from other census operations, such as LUCA appeals and the USPS DSF refresh.

___________________

7 Military temporarily deployed were included in the resident population for reapportionment, redistricting, and other data uses at their home base. Military stationed overseas were included in the population totals for their home states for reapportionment of the U.S. House of Representatives only and were not included in any other data product. See “The Census and the Military” overview slides at https://www.doi.gov/sites/doi.gov/files/uploads/oia-02052020-census-and-the-military.pdf.

Phase 1 of the NRFU strategy included all addresses that did not self-respond. All cases were assigned to an enumerator based on the optimization of work availability and workload geography. Use of proxies was approved on the third visit and continued through all following visits. If administrative records data could be found that were considered of sufficient quality, administrative records modeling was used to classify cases as occupied, vacant, or delete after one field visit. Up to four visits were made during Phase 1. All unresolved cases were put on hold after the fourth attempt in preparation for Phase 2.

Phase 2 was referred to as Semi-Permanent Assignment. Census Field Supervisor (CFS) Areas (the geography for which a single census field supervisor was responsible) were eligible to move into Phase 2 when, initially, 60% of cases had reached four visits or were completed. That percentage was raised to 85% on August 19, 2020, to ensure that CFS areas did not move into Phase 2 too soon. All CFS areas were eligible to move into Phase 2 on September 4, 2020, regardless of their completion percentage.

Census Field Managers used the Field Operations Control System to put CFS areas (approximately 4,500 housing units in each area) into Phase 2 and designated high-performing enumerators to work those areas. Reinterviews and self-response quality-assurance cases were prioritized in both Phases 1 and 2.

Phase 3 was referred to as Closeout. CFS areas were eligible to move into Phase 3 when a sufficient percentage of cases had either been resolved or had the maximum number of attempts at enumeration. Prior to the start of the operation, this threshold was set at 85%, but it was updated to 90% on August 19, 2020, to ensure that CFS areas did not move into Phase 3 too soon.

All CFS areas were eligible for Closeout on September 11, 2020, regardless of their completion percentage. During Phase 3, an additional round of administrative records modeling was used to reduce the NRFU Closeout workload. Cases modeled through use of administrative records as occupied, vacant, or delete were closed if they had six visits. Cases continued to be assigned using the same methodology used in Phase 2. CFS areas closed out upon 100% completion or the arrival of the last day of the operation on October 15, 2020.
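The phase thresholds and date-based overrides described above amount to a small decision rule. The following Python sketch is illustrative only: the function and variable names are invented, and the Bureau's actual Field Operations Control System logic was more elaborate.

```python
from datetime import date

def phase_eligibility(pct_complete_or_maxed: float, on: date) -> int:
    """Return the highest NRFU phase a CFS area was eligible for on a date.

    pct_complete_or_maxed: share of the area's cases that were resolved
    or had reached the maximum number of visits.
    """
    # Thresholds were raised on August 19, 2020, to keep areas from
    # advancing too soon; later date-based overrides opened all areas.
    phase2_cut = 0.60 if on < date(2020, 8, 19) else 0.85
    phase3_cut = 0.85 if on < date(2020, 8, 19) else 0.90

    if on >= date(2020, 9, 11) or pct_complete_or_maxed >= phase3_cut:
        return 3  # Closeout
    if on >= date(2020, 9, 4) or pct_complete_or_maxed >= phase2_cut:
        return 2  # Semi-Permanent Assignment
    return 1      # initial phase
```

Under this rule, an area 70% complete would have been Phase 2-eligible in early August but not after the August 19 threshold increase, until the September date overrides took effect.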

C.7.2 Identification of Vacant and Nonexistent Addresses

In May 2020, administrative records were used to identify possible vacant and nonexistent addresses. To be classified as vacant or nonexistent, an address had to have at least one “undeliverable as addressed” (UAA) designation from the USPS in mailable areas. If the mail was undelivered and there was no sign of occupancy, the address received one field visit; otherwise, it received up to six field visits in NRFU.
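The visit rule just described reduces to a simple condition. A minimal sketch with hypothetical names follows; the real determination drew on the full administrative records model, not just these three flags.

```python
def nrfu_visit_limit(has_uaa: bool, mail_undelivered: bool,
                     sign_of_occupancy: bool) -> int:
    """Maximum NRFU field visits for an address flagged as possibly
    vacant or nonexistent by the administrative records model."""
    # Only addresses with at least one "undeliverable as addressed" (UAA)
    # designation were candidates for vacant/nonexistent status.
    if has_uaa and mail_undelivered and not sign_of_occupancy:
        return 1  # a single confirming field visit
    return 6      # otherwise, the full NRFU contact strategy applied
```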

The 2020 Census was the first to make use of administrative records and third-party data to assist in NRFU by identifying vacant or nonresidential units, and by using high-quality administrative records data to enumerate occupied households. Also, this was the first census that used an adaptive design strategy to reduce the number of contact attempts for each household.

C.7.3 Nonresponse Followup: Outcomes and Additional Impediments

Among housing units that were identified as occupied during NRFU, 55.5% were enumerated through an interview with a household member and 26.1% were resolved through interview of a proxy respondent. This is comparable to the 2010 Census, in which 24.7% of occupied housing units in NRFU were enumerated via proxy. Finally, 18.4% of the occupied housing units in NRFU were enumerated using high-quality administrative records.

For housing units that were determined to be vacant in NRFU, 85.5% were resolved through an interview with a proxy respondent and 14.5% were resolved through use of administrative records. Among those housing units that were determined to be deleted (nonexistent or nonresidential) in NRFU, 96.4% were resolved with a proxy respondent and 3.6% were resolved using administrative records.

Every decennial census encounters circumstances that make NRFU and other operations more difficult than anticipated, but the delay in starting 2020 field work raised particular concerns for NRFU—putting the most intricate of census operations in the midst of peak natural disaster season, which the regular schedule seeks to avoid. In 2020, the circumstances facing NRFU included (1) the emergence of COVID-19 in March 2020—accompanied by various local rules and regulations regarding public behavior, in-person contact, and availability of personal protective equipment; (2) Tropical Storm Marco, which hit the Gulf Coast on August 24, 2020; (3) Hurricane Laura, which hit the Gulf Coast on August 26, 2020; (4) California, Oregon, and Washington wildfires and associated poor air quality in early September 2020; (5) Hurricane Sally, which hit the Gulf Coast on September 16, 2020; (6) Tropical Storm Beta, which hit the Gulf Coast on September 21, 2020; and (7) various legal challenges to the timing of various census field operations.

C.7.4 Census Questionnaire Assistance and Telephone Response

The two goals of the 2020 Census Questionnaire Assistance (CQA) operation were: (1) provide questionnaire assistance for respondents by answering questions about specific items on the census form or other frequently asked questions about the census; and (2) provide an option for respondents who were interested in conducting their census interview over the telephone. A similar program was implemented during the 2010 Census; however, implementation changed substantially for 2020. During the 2010 Census, responses were collected primarily on paper questionnaires, and telephone support centered on supporting the paper operation. In 2020, by contrast, the primary mode for Self-Response was ISR, so CQA was intended to meet the needs of respondents who did not have access to the internet or computers. Other common questions about how to respond to the 2020 questionnaire were also anticipated, so toll-free telephone numbers were made available for respondents to call for help in completing their responses, which they could do over the telephone.

Other types of assistance for respondents included interactive voice response (IVR) and Frequently Asked Questions. Language support was provided in English, Spanish, Tagalog, Arabic, Haitian Creole, Polish, French, Chinese (Mandarin and Cantonese), Korean, Russian, Vietnamese, Japanese, and Portuguese; English and Spanish were the predominant languages in which assistance was provided. CQA received 13.5 million calls. Of these, 87.4% were received on English-language lines and 10.4% on Spanish-language lines. A total of 12.9 million calls accessed the IVR, and 7.9 million calls were handled within the IVR.

C.7.5 Use of Administrative Records for Enumeration

A major initiative of the 2020 Census intercensal research and development program was to make possible the use of administrative records in enumeration. Uses included: (1) administrative records modeling to support the reduction of contacts in the NRFU operation; (2) the assignment of a Census ID to self-respondents who did not provide one; and (3) Self-Response and NRFU quality assurance. This section focuses on the first application (reducing contacts during NRFU) and on how administrative records modeling was modified based on the delayed start of the NRFU operation due to COVID-19 and the extension of the Internal Revenue Service (IRS) tax filing deadline from April 15 to July 15.

Using Administrative Records to Identify Vacant and Nonexistent Addresses

Part of the original 2020 Census plan was to use administrative records data to help identify vacant and nonexistent addresses and, as a result, minimize field work to validate those determinations. Sources of administrative records included: USPS designations of UAA for census mailings made around April 1, 2020; the USPS DSF and other USPS information; IRS 1040 filings and 1099 information returns; the Centers for Medicare and Medicaid Services (CMS) Medicare Enrollment database; the Indian Health Service Patient database; the Veterans Service Group of Illinois third-party national files; the Census Bureau’s MAF; and ACS area-level estimates of the percentage of addresses that were vacant, in poverty, and occupied by people with other characteristics. The Census Bureau developed a distance function, which measured the degree to which information from these sources discriminated a vacant address from a nonexistent address, and both from an occupied address. The resulting algorithm required that, to be considered for possible vacant or nonexistent status, an address had to have at least one UAA in the census mailings.

The original plan was to send a mailing to each nonresponding address that the administrative records model determined to be possibly vacant or nonexistent, about six weeks after Census Day (mid-May). This was to be followed up by a single field visit. If the mail was undelivered and there was no sign of occupancy, the administrative records model was to be used to decide on the status of the address as vacant, nonexistent, or possibly occupied. If the mail was delivered or if there was a sign of occupancy, the address was placed into NRFU.

Due to COVID-19, there were changes to the tax filing deadline and to the NRFU deadlines. On March 21, 2020, IRS announced that the tax filing deadline was extended from April 15, 2020, to July 15, 2020. Throughout, however, the Census Bureau continued to receive monthly deliveries of processed IRS-1040 records. The start of the NRFU operation was also delayed until August 9, 2020. As a result, the Census Bureau revised the plan for use of administrative records modeling for possible vacant or nonexistent addresses. The Census Bureau delayed determination of vacant and nonexistent addresses until the end of May 2020; it also scaled back the addresses that would be enumerated via administrative records as vacant or nonexistent, to err on the side of caution. For example, administrative records modeling was not used for addresses in ZIP codes with 57% or higher rates of UAAs on the first mailing; instead all of these addresses were sent to NRFU.

Enumerating a Percentage of Nonresponse Followup Workload Using Administrative Records

As part of NRFU, the plan was to use administrative records to enumerate some nonresponding housing units that had high-quality administrative records information. The idea was to build a roster from the most recent administrative records sources, check that multiple sources indicated that the household resided at the address in question, determine how likely it was that the census was counting all the people rostered at the right location, and then decide whether to use administrative records data for the address in the census. The sources used for rostering were tax year 2019 IRS 1040 tax returns, tax year 2019 IRS information returns, the CMS Medicare Enrollment database, the Indian Health Service Patient database, and the Census Bureau’s Household Composition Key File. A wider range of sources was used to verify that a household lived at a specific address.
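The multi-source agreement idea can be illustrated with a toy check. The two-source rule and the source names below are assumptions for illustration only; the Bureau's actual model scored the likelihood of a correct roster rather than applying a simple count.

```python
from collections import Counter

def roster_is_high_quality(source_rosters: dict) -> bool:
    """source_rosters maps a source name (e.g., 'irs_1040', 'medicare')
    to the set of person IDs that source places at the address."""
    # Require at least two independent sources for the address...
    if len(source_rosters) < 2:
        return False
    support = Counter(pid for roster in source_rosters.values()
                      for pid in roster)
    if not support:
        return False
    # ...that agree on every person rostered there.
    return all(count >= 2 for count in support.values())
```

A household appearing identically in two sources would pass; a person appearing in only one of several sources would cause the roster to fail this simplified check.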

The process for using administrative records information to enumerate households was planned to operate as follows. For TEA 1 areas, if there was no Self-Response return after the final (originally the fifth) mailing, the household would be visited by an enumerator and receive a postcard reminder one week later. If there was still no response, then the address would be enumerated using administrative records information, if that information was found to be of sufficiently high quality. For TEA 6 areas, an enumerator visited if there was no response after the questionnaire had been dropped off and two reminder mailings had been sent.

With the delays in NRFU and tax filing, the Census Bureau shifted to June the identification of initial administrative records-occupied cases in which there were also count agreements with Self-Response returns. The Census Bureau continued modeling, making one visit to an address, and enumerating households via administrative records when that information was deemed reliable; the 2020 Census also used tax return data delivered by the IRS in June, July, and August.

Once a Census Field Supervisor area went into NRFU “Closeout,” then the criteria for use of administrative records for enumeration of an occupied address were relaxed. A single source could be used that provided a population count, even if it lacked characteristics.

Finally, for students living in off-campus housing, the Census Bureau decided in May to contact colleges and universities to ask whether they could provide the names, dates of birth, and ages of students who were living off campus in the spring 2020 semester. Between June 18 and August 14, the Census Bureau contacted over 1,300 schools; over 600 ultimately provided their information.

C.8 DATA PROCESSING AND CREATION OF DATA PRODUCTS

C.8.1 2020 Census Data Files

The following is a list of internal files that represent steps in the processing of data collected in the 2020 Census leading up to the release of state population counts for reapportionment and block data for redistricting, including the dates when each file was originally planned to be completed and when it was actually created.

  • Decennial Response File 1 (DRF1): The first file produced after data were collected was DRF1, containing all the response data, including duplicate responses. DRF1 is the complete inventory of every residential address in the nation linked to every response received during data collection. Planned dates: 9/15/2020–10/15/2020; actual dates: 10/29/2020–12/26/2020.
  • Decennial Response File 2 (DRF2): This file resulted from employing the Primary Selection Algorithm8 to select which data from DRF1 should represent a housing unit on the Census Unedited File (CUF). The DRF2 universe was smaller than that for DRF1 because duplicate and multiple responses had been resolved. Planned dates: 10/16/2020–11/7/2020; actual dates: 12/26/2020–2/24/2021.
  • Census Unedited File (CUF): After count imputation was applied to the remaining unresolved cases in DRF2, the result was the CUF. The CUF contained the final 2020 Census universes of housing unit records (including vacant units but not those deleted because they were nonresidential or nonexistent) and person records. The state-level population counts for apportionment are derived from the CUF. Planned dates: 11/8/2020–12/5/2020; actual dates: 2/25/2021–3/10/2021.
  • Census Edited File (CEF): The CEF resulted when editing and item imputation were applied to missing and erroneous values for all items in the CUF records. Hence, the CEF contains complete data for all items. Planned dates: 11/25/2020–1/26/2021; actual dates: 4/20/2021–6/24/2021.
  • Microdata Detail File (MDF): When confidentiality protection and recodes were applied to the CEF using the Disclosure Avoidance System (DAS), the result was the MDF. After tabulation geography was added, the data were ready for tabulation and dissemination. Planned dates: 1/26/2021–2/10/2021; actual dates: 6/25/2021–7/18/2021.
  • Tabulation: Table-generation software transformed the MDF into tables for public release in the 2020 Census Redistricting File. Planned dates: 2/10/2021–3/19/2021; actual dates: 7/19/2021–8/12/2021.9
  • Apportionment and Redistricting Key Milestones: The legislated deadlines are December 31 of the census year for apportionment counts and March 31 of the subsequent year for redistricting counts. Planned dates for apportionment preparation and release for the 2020 Census: 12/6/2020–12/27/2020; actual dates: 3/12/2021–4/26/2021. Planned dates for redistricting preparation and release: 2/17/2021–3/31/2021; actual dates: 8/13/2021–9/20/2021.

___________________

8 The Primary Selection Algorithm is a decennial census algorithm that chooses among multiple enumerations for a given residence.

9 The same process of creating an MDF and producing tables is being used for the Demographic and Housing Characteristics File (DHC); different processes, which skip the MDF stage, are being used for the Detailed DHC products (A and B) and the Supplemental-DHC product. See Chapter 11 for details.

C.8.2 Count Imputation

The Census Bureau tries hard to reduce the frequency of households that fail to respond, but there always remain housing units known to be occupied but for which nothing about the household is known, including the number of residents. Further, there are housing units for which even the vacancy status is unknown. In these situations, the Census Bureau uses count imputation to estimate the number of residents on Census Day after checking administrative records for relevant information.

At the beginning of data collection, the objective is to assign a status (occupied, vacant, or delete) to every address in the MAF. Through Self-Response and NRFU, the Census Bureau strives to determine the count at every occupied address, which can result in two possibilities: (1) the address is resolved—field staff determine whether the housing unit is occupied, vacant, or a delete (i.e., nonexistent or nonresidential), and, if occupied, how many people live there; or (2) the address is unresolved—field staff have insufficient or conflicting information about the status of the address or how many residents live there. Unresolved addresses go to count imputation for resolution.

Types of Count Imputation

Count imputation may involve status imputation. If it is unknown whether a housing unit exists, is vacant, or is occupied, count imputation can result in any of the following: the housing unit is nonexistent; the housing unit exists but is vacant; or the housing unit is occupied, with a population count from 1 through 9 or more. If it is known that the unit exists, count imputation can result in the unit being vacant or occupied with a count from 1 through 9 or more. Finally, if it is known that the housing unit is occupied, count imputation assigns a count from 1 through 9 or more.

Count imputation is implemented after all data collection is complete (i.e., after Self-Response and NRFU). Data collection completion also includes the conclusion of administrative records enumeration since addresses with high-quality administrative records have already been resolved. Count imputation is a key step in creating the CUF.

Count imputation is a component of two important quality metrics. The first is the percentage of addresses that were unresolved at the end of data collection. The second is the percentage of the total resident population added through count imputation. These two percentages, when compared across decennial censuses by state and other geographies, are an indicator of comparative quality of census information.
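The two quality metrics above are straightforward proportions; the following sketch uses illustrative variable names and made-up inputs.

```python
def imputation_metrics(unresolved_addresses: int, total_addresses: int,
                       imputed_persons: int, total_population: int):
    """Return (percent of addresses unresolved at the end of data collection,
    percent of the resident population added through count imputation)."""
    pct_unresolved = 100.0 * unresolved_addresses / total_addresses
    pct_imputed_pop = 100.0 * imputed_persons / total_population
    return pct_unresolved, pct_imputed_pop
```

Comparing these two percentages across censuses, states, and other geographies is how they serve as comparative quality indicators.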

2020 Count Imputation Procedure

In 2020, all addresses (resolved and unresolved) were placed into one of several mutually exclusive imputation cells, defined by characteristics related to operational paradata, such as: whether the address was a valid living quarters; whether it was likely a delete, vacant, or other; whether it was considered residential in the spring 2020 DSF; and whether it had a UAA designation. Within each state and imputation cell, addresses were sorted by geography (state, county, census tract, block group, block).

For unresolved addresses, the count imputation procedure was conducted by copying the status, population count, or both from the values obtained from a resolved nearest neighbor housing unit that had the same values for the operational paradata of the imputation cell. By using the nearest neighbor, the count imputation procedure took advantage of spatial correlations among nearby addresses. A donor housing unit could be used as the nearest neighbor for more than one other housing unit.
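The nearest-neighbor (hot-deck) procedure just described can be sketched as follows. The field names and the use of a one-dimensional geographic sort key are simplifying assumptions; the production system used richer paradata and geography.

```python
from itertools import groupby

def impute_counts(addresses):
    """Hot-deck count imputation sketch. `addresses` is a list of dicts with
    keys 'cell', 'geo_sort_key', 'resolved', 'status', 'count'.
    Unresolved records are filled in place from a resolved neighbor."""
    # Sort within imputation cells by geographic order.
    addresses.sort(key=lambda a: (a["cell"], a["geo_sort_key"]))
    for _, cell_group in groupby(addresses, key=lambda a: a["cell"]):
        group = list(cell_group)
        donors = [(i, a) for i, a in enumerate(group) if a["resolved"]]
        for i, a in enumerate(group):
            if a["resolved"] or not donors:
                continue
            # Copy status and count from the nearest resolved neighbor;
            # a donor may serve more than one unresolved address.
            _, donor = min(donors, key=lambda d: abs(d[0] - i))
            a["status"], a["count"] = donor["status"], donor["count"]
    return addresses
```

Sorting geographically before choosing donors is what lets the procedure exploit spatial correlation among nearby addresses.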

A different procedure was used for count imputation for GQs, due to the differences and complexity of GQs in comparison with housing units. (Count imputation for GQs was an innovation in 2020 because of the problems COVID-19 caused for the GQ enumeration.) Several forms of review occurred to improve accuracy for GQs. These included calling GQs that had not provided a population count, calling GQs with larger-than-expected counts, a review of colleges/universities using data from the U.S. Department of Education, and a review to reduce duplicate people in GQs. If it was still not possible to obtain a count of residents, even though there was evidence that the GQ was open and occupied on Census Day (e.g., refusals and GQs that were closed during the visit for GQ enumeration), the Census Bureau used information such as the expected or maximum capacity reported for a GQ in the Advance Contact operations or the ACS to estimate occupancy on Census Day and, less often, information from administrative records or other GQs.

C.8.3 Item Nonresponse and Characteristic Imputation

Characteristic imputation is used in the census to eliminate item nonresponse. Because demographic information is used for tabulations, characteristic imputation is important for accuracy of cross-tabulations. The objective is for each person, housing unit, and GQ on the final CEF to have consistent and valid values for all items.

The census item nonresponse rate is essentially the proportion of missing responses for a given item before any pre-editing or imputation procedures are applied. For example, if the respondent answers all questions for all household members except age, then age for each household member counts as an item nonresponse. If the respondent provides the number of people living at the housing unit but does not answer any of the demographic questions for any household member, all of those characteristics count as item nonresponse. Inconsistent responses (e.g., responses incompatible with other responses) are not considered item nonresponse but are put through edit and imputation processes as needed.

Table C.4 Item Imputation Rates (Percentages), 2010 and 2020 Censuses

Census/Item   Tenure   Sex   Age/DOB   Hispanic Origin   Race   Household Relationship
2010
  Imputed       3.5    1.6     5.1          4.5           4.1          2.1
  Assigned      N.A.   1.3     1.5          1.7           1.2          0.5
  Allocated     3.5    0.3     3.6          2.8           2.9          1.7
2020
  Imputed       7.6    6.2    14.8          8.7           9.2          8.0
  Assigned      1.9    5.8     8.7          4.7           4.9          2.1
  Allocated     5.7    0.4     6.1          4.0           4.3          5.9

SOURCE: DeJesus and Konya (2023:Table N).
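As an illustrative sketch (not the Census Bureau's production code; the record layout is a hypothetical stand-in), the item nonresponse rate described in this section is simply the share of person records missing a given item before any editing or imputation:

```python
def item_nonresponse_rate(responses, item):
    """Share of person records missing a given item before any editing or
    imputation; None marks a missing response."""
    total = len(responses)
    missing = sum(1 for r in responses if r.get(item) is None)
    return missing / total if total else 0.0

# A respondent answered sex for all three household members but skipped age:
household = [
    {"sex": "F", "age": None},
    {"sex": "M", "age": None},
    {"sex": "F", "age": None},
]
print(item_nonresponse_rate(household, "age"))  # 1.0: age counts as item nonresponse for everyone
print(item_nonresponse_rate(household, "sex"))  # 0.0
```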

2020 Edit and Characteristic Imputation Overview

In 2020, during the editing phase, responses were run through a series of checks to: (1) detect and correct out-of-range or inconsistent values; (2) remove invalid or duplicate responses (e.g., when more than one relationship code was checked on a paper questionnaire); (3) convert date-of-birth values to age values; and (4) assign race/Hispanic-origin responses to numeric codes. The optimal scenario was for all responses to be valid and consistent, so that no imputation would be needed. When there were missing, invalid, or inconsistent values, the Census Bureau used characteristic imputation.
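The editing checks above can be sketched in miniature as follows. The record fields and the simplified rules (an age range of 0–150, keeping the first of any duplicate relationship codes) are illustrative assumptions, not the Bureau's actual edit specifications:

```python
from datetime import date

CENSUS_DAY = date(2020, 4, 1)

def edit_person(record):
    """Simplified editing checks: derive age from date of birth, clear
    out-of-range ages for later imputation, and keep a single relationship
    code when duplicates were checked. Illustrative only."""
    # (3) Convert a date-of-birth value to an age value as of Census Day.
    dob = record.get("dob")
    if dob is not None and record.get("age") is None:
        record["age"] = CENSUS_DAY.year - dob.year - (
            (CENSUS_DAY.month, CENSUS_DAY.day) < (dob.month, dob.day)
        )
    # (1) Detect out-of-range values; an invalid age is cleared so that
    # characteristic imputation can supply one later.
    if record.get("age") is not None and not 0 <= record["age"] <= 150:
        record["age"] = None
    # (2) Remove duplicate responses, keeping one relationship code.
    rel = record.get("relationship", [])
    record["relationship"] = rel[0] if rel else None
    return record

edited = edit_person({"dob": date(1980, 7, 4), "age": None,
                      "relationship": ["Spouse", "Spouse"]})
print(edited["age"], edited["relationship"])  # 39 Spouse
```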

Characteristic imputation involves assignment and allocation. Assignment occurs when information can be determined from other responses provided for the same person, from previous census responses, or from administrative records for that person. Allocation occurs when responses that are missing or inconsistent are imputed using information from responses provided for other persons in the household or from a similar nearby household. Table C.4 provides item imputation rates for the 2010 and 2020 Censuses. Note that 2020 rates are higher than 2010 rates; enumerator returns in both censuses have higher imputation rates than do self-returns (data not shown).

Edit and Imputation Procedures for Specific Questionnaire Items

If a person’s sex was missing, invalid, or inconsistent (e.g., if both male and female were checked), the Census Bureau attempted the following, in order: (1) infer sex from the person’s first name; (2) link the person to high-quality administrative records, such as their 2010 Census response or information from the Social Security Administration (SSA) (i.e., the Numident File); (3) assign sex to maintain household consistency; or (4) allocate sex from the nearest neighbor hot deck.10
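The ordered fallback from assignment sources to hot deck allocation can be sketched as a simple priority cascade; the lookup tables here are hypothetical stand-ins for the Bureau's actual files (e.g., the Numident File):

```python
def impute_from_cascade(sources):
    """Return the first non-None value produced by a priority-ordered list
    of zero-argument callables; return None if every source comes up empty."""
    for source in sources:
        value = source()
        if value is not None:
            return value
    return None

# A person whose sex response is missing; all lookups are hypothetical.
person = {"first_name": "Maria", "id": 42}
name_to_sex = {"Maria": "F"}   # (1) inference from first name
admin_records = {}             # (2) no administrative record available
hot_deck_value = "F"           # (4) nearest neighbor donor as the last resort

imputed = impute_from_cascade([
    lambda: name_to_sex.get(person["first_name"]),
    lambda: admin_records.get(person["id"]),
    lambda: hot_deck_value,
])
print(imputed)  # "F", supplied by the first source that has a value
```

The same pattern covers the age/DOB, race/Hispanic-origin, and relationship cascades described below, with different sources plugged in at each priority level.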

If age or date of birth (DOB) were missing, invalid (e.g., over 150 years old), or inconsistent, there were several ways the Census Bureau could impute an age value. If either age or DOB was missing but not both, the Census Bureau calculated age from the DOB or created a random DOB from the age. If an inconsistent age and DOB were provided, the Census Bureau chose the response that was consistent with the person’s relationship to the householder. Age and DOB could also be assigned from the person’s responses to the 2010 Census or from the Numident File. If none of these options were available, the Census Bureau could allocate age and DOB from the nearest neighbor hot deck.

Some respondents did not provide Hispanic-origin or race information, or provided responses that had to be edited to a single Hispanic-origin response and up to eight race responses. The Census Bureau did not impute detailed Hispanic-origin and race responses when only a major group was reported. If Hispanic origin and/or race were missing or invalid, the Census Bureau: (1) assigned Hispanic origin and race from the 2010 Census response, from the ACS, from the Numident File, or from other federal sources; (2) assigned race from Hispanic origin, or vice versa; (3) allocated from another household member’s response; or (4) allocated from the nearest neighbor hot deck.

Some respondents chose not to provide relationship information or provided invalid or inconsistent responses. A response was invalid if, for example, multiple relationship categories were selected, and inconsistent if the relationship did not match the age or sex of other members of the housing unit. When this happened, the Census Bureau: (1) assigned or allocated relationship to maintain consistency with other household members; (2) assigned relationship from the Census Bureau’s Kidlink file, which contains SSA information and links parent-child relationships; or (3) allocated a value using nearest neighbor hot deck imputation.

If housing tenure status (owner or renter) was missing or invalid (e.g., when multiple values were selected), the Census Bureau: (1) assigned tenure using federal administrative records sources, such as information from the U.S. Department of Housing and Urban Development, or commercial tax and deed information; (2) allocated from the nearest neighbor hot deck using administrative records values as a nearness measure; or (3) allocated from the nearest neighbor hot deck.

___________________

10 “Hot deck” imputation is a method whereby records are sorted geographically (state, county, census tract, block group, block), reported values from each record are input into a matrix of relevant variables for the variable requiring imputation, and the most recent reported value in the matrix is used to fill in the record with a missing value. The method is dynamic (although a donor may provide imputed values for more than one record) and takes advantage of correlations among values for people living in close proximity.

Table C.5 Differences in Numbers of Write-Ins for the Race and Hispanic-Origin Questions, 2010 and 2020 Censuses

Census/Question               2010 Write-Ins   2020 Write-Ins   Difference (2020 − 2010)
Race                          37.7 million     335.5 million    297.8 million
Ethnicity (Hispanic origin)   17.1 million      15.0 million    −2.1 million

SOURCE: Jones et al. (2021:Slide 9).

Detailed vacancy status was typically collected by enumerators during the NRFU operation when a housing unit was identified as vacant, but housing units could also be identified as vacant using high-quality administrative records and through Count Imputation. If detailed vacancy status (e.g., seasonally vacant) was missing, the Census Bureau allocated a value using nearest neighbor hot deck imputation.

Finally, if all characteristics information was missing for all persons in an occupied housing unit, such as in “population count only” cases, and when the housing unit status was count imputed, the Census Bureau allocated characteristics from administrative records associated with the housing unit when such records were available. Otherwise, the Census Bureau allocated using entire neighboring households from nearest neighbor hot deck imputation. These cases were termed whole-person imputations.
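The sequential hot deck described in footnote 10 can be sketched as follows, assuming records already sorted geographically and a single variable per pass; this is a minimal illustration, not the production method:

```python
def hot_deck_allocate(records, item):
    """Sequential nearest neighbor hot deck for one variable: walk the
    geographically sorted records, remember the most recent reported value,
    and use it to fill each missing value. A donor can serve several records."""
    donor = None
    for r in records:
        if r.get(item) is not None:
            donor = r[item]          # update the "matrix" with the latest report
        elif donor is not None:
            r[item] = donor          # fill the gap from the nearest prior donor
    return records

# Four housing units in one block, two of them missing tenure:
block = [
    {"tenure": "Owner"},
    {"tenure": None},     # takes "Owner" from the unit above
    {"tenure": "Renter"},
    {"tenure": None},     # takes "Renter" from the unit above
]
print([r["tenure"] for r in hot_deck_allocate(block, "tenure")])
# ['Owner', 'Owner', 'Renter', 'Renter']
```

Because the records are sorted by geography, each missing value is filled from a nearby household, which is what lets the method exploit local correlations.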

C.8.4 Changes in Format and Coding of the 2020 Race and Ethnicity Questions

Chapter 10 documents significant changes in the format of the race and ethnicity questions in 2020—most important, adding write-in spaces under the White and Black checkboxes—and in the coding of race and ethnicity responses. The greater opportunity for the provision of write-in responses resulted in very large increases in the number of write-in responses for race in the 2020 Census compared to the 2010 Census. However, this was not the case for write-ins for ethnicity—see Table C.5.

C.8.5 Disclosure Protection and Tabulation

The 2020 Census used a new disclosure avoidance system (DAS) for publicly released data products, which has had major adverse impacts on the timeliness, quality, and utility of those products compared with earlier censuses. Many data products have only recently been released or have yet to be released as of this writing. Chapter 11 describes the 2020 DAS and its implementation for the various 2020 data products.

C.9 2020 CENSUS EVALUATION

C.9.1 Coverage Evaluation

The Census Bureau traditionally uses two methods for evaluating the completeness of the population count, in total and for population groups and some geographies (e.g., states): Demographic Analysis (DA) and the Post-Enumeration Survey (PES). See Chapter 4 for details on each method and for comparisons with the 2010 Census coverage measurement.

C.9.2 Census Program for Evaluations and Experiments

Like several of its predecessors, the 2020 Census Program for Evaluations and Experiments (CPEX) combined experiments conducted during or shortly after the census, evaluations and assessments of the census, and quality profiles. Census experiments are generally forward looking, intended to identify techniques and methods that might prove beneficial for subsequent censuses. Census evaluations are analyses that interpret and synthesize the effectiveness and efficiency of census components and their impact on data quality and coverage. Evaluations go beyond how successful (in terms of budget and deadline) a component process was: they can delve into the performance of a component process for various demographic groups, types of geographies, or other subsets of the population (e.g., owners/renters). Census assessments document final volumes, rates, and costs for individual operations or processes, using data from production files and information collected from debriefings; they are more focused and more descriptive, and less analytical, than evaluations. Quality profiles present the results of quality-assurance programs for census operations.

Below, we catalog the titles of experiments, evaluations, operational assessments, and quality profiles from the 2010 and 2020 CPEX, providing release dates and scheduled release dates, as applicable. Publication of reports from the 2020 CPEX is running behind that for 2010. The titles and timelines are derived from the evaluations landing webpages for the two censuses.11

___________________

11 See https://www.census.gov/programs-surveys/decennial-census/decade/2020/planning-management/evaluate/eae.html for 2020 and https://www.census.gov/programs-surveys/decennial-census/decade/2020/planning-management/evaluate/eae.2010.html for 2010.

  • Experiments
    • 2020—3 reports
      • Real-Time 2020 Administrative Records Simulation—5/5/2023
      • Optimization of Self-Response in the 2020 Census—By December 2023
      • Extending the Decennial Census Environment to the Mailing Materials—4/10/2023
    • 2010—6 reports
      • Race and Hispanic Origin Alternative Questionnaire Experiment—2/28/2013
      • 2010 Census Nonresponse Followup Contact Strategy Experiment Report—1/30/2012
      • 2010 Census Confidentiality Notification Experiment Report—1/12/2012
      • 2010 Census Deadline Messaging Compressed Mailing Schedule Experiment—12/14/2011
      • The Paid Advertising Heavy-Up Experiment—4/24/2011
      • 2010 Census AQE: Census 2000 Form Replication Panel—1/10/2011
  • Evaluations
    • 2020—12 reports
      • Analysis of Census Internet Self-Response Paradata by Language—By September 2024
      • Research on Historically Undercounted Populations: Non-English Speakers and Complex Household Residents including Undercount of Children—By September 2024
      • Evaluation of Reengineered Address Canvassing Operation—By September 2024
      • Administrative Records Dual System Estimation—By March 2024
      • 2020 Census Communications Campaign Evaluation: Census Mindset Measures Before and After the Campaign—By September 2024
      • Privacy and Confidentiality Concerns—By September 2024
      • Matching 2018 Census Barriers, Attitudes, and Behaviors Study (CBAMS) Survey Sample to the 2020 Census—By December 2023
      • 2020 Census Tracking Survey—By December 2023
      • Group Quarters Advance Contact: Refining Classification of College or University Student Housing—By December 2023
      • Investigating Digital Advertising and Online Self-Response—By December 2023
      • Comparing 2019 Census Test and 2020 Census Self-Response Rates to Estimate the “Decennial Environment”—1/23/2023
      • 2020 Census Quantitative Creative Testing Final Report—11/16/2022
    • 2010—13 reports
      • Comparative Ethnographic Studies of Enumeration Methods and Coverage—3/29/2013
      • Evaluation of Address List Maintenance Using Supplemental Data Source—2/20/2013
      • Administrative Records Use for Coverage Problems Evaluation Report—2/13/2013
      • Evaluation of Address Frame Accuracy and Quality—1/14/2013
      • Observing Census Enumeration of Non-English Speaking Households—12/6/2012
      • Behavior Coding Report Coverage Measurement Person Interviews—12/4/2012
      • 2010 Census Match Study—11/16/2012
      • Alternative Coverage Followup Questions and Design Evaluation—10/16/2012
      • Effectiveness of Unduplication Evaluation—9/25/2012
      • Investigation of the Methods to Evaluate the Coverage of Group Quarters—9/25/2012
      • 2010 Census Evaluation to Assess CCM Search Area & Address List Rules—9/20/2012
      • Global Positioning System Evaluation—9/12/2012
      • 2010 Census Avoid Followup Evaluation—9/7/2012
  • Operational Assessments
    • 2020—46 reports
      • 2 Assessment Reports (Scheduled)—By June 2025
      • 2 Assessment Reports (Scheduled)—By December 2024
      • 2 Assessment Reports (Scheduled)—By September 2024
      • 12 Assessment Reports (Scheduled)—By March 2024
      • 17 Assessment Reports (Scheduled)—By December 2023
      • 2 Assessment Reports (Scheduled)—By September 2023
      • 2 Assessment Reports: Count Review and Post-Enumeration Survey Initial Housing Unit Matching Assessment—August 8 and 17, 2023
      • 2 Assessment Reports: Item Nonresponse and Imputation; Census Questionnaire Assistance (CQA)—February 7 and 9, 2023
      • 1 Assessment Report: Post-Enumeration Survey Independent Listing—January 24, 2023
      • 2 Assessment Reports: Local Update of Census Addresses (LUCA); Archiving—November 9 and 28, 2022
      • 2 Assessment Reports: In-Field Address Canvassing; In-Office Address Canvassing—August 18, 2022
    • 2010—59 reports
      • 1 Assessment Report—January 2013
      • 52 Assessment Reports—January–December 2012
      • 6 Assessment Reports—March–December 2011
  • Quality Profiles/Quality Assurance Reports
    • 2020—3 reports
      • Nonresponse Followup Quality Assurance Results—8/7/2023
      • Update Leave Quality Assurance Results—12/7/2022
      • In-Field Address Quality Assurance Results—9/14/2022
    • 2010—6 reports
      • 2010 Census Enumeration at Transitory Locations Reinterview Operation—6/26/2012
      • 2010 Census Address Canvassing Quality Profile—6/21/2012
      • 2010 Census: Nonresponse Followup Quality Profile—3/20/2012
      • 2010 Census: Update/Leave Quality Profile—12/12/2011
      • 2010 Census Coverage Measurement Independent Listing Quality Profile—11/30/2011
      • 2010 Census Coverage Measurement Initial HU Followup Quality Profile—11/16/2011