
Assessing the 2020 Census: Final Report (2023)


– 12 –

Learning from 2020, Preparing for 2030

The U.S. Census Bureau overcame massive challenges and withstood immense pressures in carrying out the 2020 Census. The delays and operational changes induced by the COVID-19 pandemic strained the process of census-taking—and were arguably even more injurious to the Post-Enumeration Survey (PES), a fundamental tool for ascertaining census quality—but did not break the process. The 2020 Census was not perfect, to be sure; group quarters (GQs) are challenging to enumerate in any census and were even more so in 2020, and the diminished quality of proxy responses (as evident in the striking age-heaping phenomenon) calls the very utility of proxy reporting into question. But it is unrealistic to expect perfection in any census. The Census Bureau should rightly take pride in having a solid plan for the 2020 Census, adhering to that plan as much as possible, and making careful and reasonable adaptations as necessitated by the pandemic.

Yet it serves no one well to downplay a very important point: some challenges were forced upon the 2020 Census, but others were the result of the Census Bureau’s own decisions. The Census Bureau decided in the very late stages of 2020 Census planning to completely replace its Disclosure Avoidance System (DAS) with an entirely new approach, one that had not been tested, prototyped, or deployed in the population census context. While confidentiality protection is a critically important responsibility of a statistical agency, this decision was made without appropriate consideration and balance regarding the utility of resulting census data products to fulfill the many important functions of census data. In short, the new DAS was not ready for use in 2020 Census production, and substantially degraded the value of 2020 Census data products in terms of both quality and availability. Moreover, lingering questions about both the simulated database reconstruction attack that motivated the new DAS and the degree of confidentiality protection that was ultimately realized in practice have arguably harmed 2020 Census data products and the Census Bureau’s reputation.

Going forward, to best build a research and evidentiary base for conducting the 2030 Census, we echo and strongly endorse the core guidance of our predecessor Panel to Review the 2010 Census (National Research Council, 2011): to effect major change in census conduct, the Census Bureau should focus its primary attention on a small number of carefully selected major innovation areas and conduct a rigorous program of strategic testing and systems development to address the challenges of how to implement those innovations rather than whether to do so. The predecessor panel articulated the four major innovation areas—field reengineering, response options (including internet response), administrative records, and geographic resource improvement—that were adopted by the Census Bureau in framing the 2020 Census. We believe that the capacities and efficiencies associated with these four priority areas were crucial to the successful conduct of the 2020 Census, as each helped the Census Bureau adapt to the supremely adverse conditions of 2020 as well as could be expected.

With this chapter, we close this report by presenting our detailed recommendations from the preceding chapters in a unified framework, along with some additional and overarching suggestions. These recommendations fall under three general headings: a suggested set of priority goals for 2030 Census research and development, along with other overall recommendations for improving census evaluation; suggested changes for 2030 Census procedures and operations that remain very important to overall census quality even if they fall outside the top-priority tier; and improvements in the legal and regulatory environment in which the decennial census operates.

12.1 PRIORITY GOALS FOR 2030 CENSUS RESEARCH AND DEVELOPMENT

The predecessor National Research Council (2011:Recommendation 3) Panel to Review the 2010 Census suggested that the Census Bureau adopt a broad motivational goal: “to reduce significantly (and not just contain) the inflation-adjusted per housing unit cost relative to [the 2010 Census], while limiting the extent of gross and net census errors to levels consistent with both user needs and cost targets.” To progress toward that goal, that panel suggested that the Census Bureau focus its research and testing efforts on the four innovation areas previously mentioned in Section 2.1. We concur that this approach should continue into 2030 Census planning, and the list of key objectives that we suggest has important points of overlap and divergence with last decade’s guidance. Specifically, we recommend as follows (note that there is no rank ordering of importance in this list):

Recommendation 12.1: To prioritize its testing and development of both procedures and technical systems for the 2030 Census, the U.S. Census Bureau should take the following as primary objectives:

  • Maximize self-response to the census, including better matching of contact and communication strategies to the desired response mode, with particular attention to hard-to-reach, at-risk populations;
  • Improve the quality of data in Nonresponse Followup, including reducing if not eliminating the use of low-quality proxy reporting when there is an alternative available;
  • Reduce gaps in coverage and data quality associated with race, ethnicity, and socioeconomic status;
  • Improve the quality of address listings and contact strategies for all living quarters, including group quarters; and
  • Realign the balance between utility, timeliness, and confidentiality protection in 2030 Census data products.

12.1.1 Maximize Census Self-Response

The “maximize self-response to the census” plank of Recommendation 12.1 is, admittedly, an inevitable reaction to the experience with promoting census self-response—particularly via the internet—in other countries. The U.S. 2020 Census achieved an Internet Self-Response rate of about 62–63% of occupied housing units,1 which was a formidable achievement, but one need only look at the progression to 80%-plus online response in some recent international censuses (see Table 12.1) to wonder what is possible in 2030. But, by this goal, we mean something more ambitious than it may seem at first blush, and more than just hitting a higher internet-uptake rate in the 2030 Census. Rather, the challenge going forward is to cast the traditional type of enumeration area (TEA)/contact-strategy assignment as less a mechanical exercise in questionnaire delivery and more as a call to action, tailoring that call as much as possible to specific target audiences for maximum effect. There is some portion of the population that will readily reply to the census, and the goal is to make those people aware of the Internet Self-Response option as quickly and efficiently as possible. There is another part of the population that will always be reluctant to participate, if not refuse outright, and the challenge for that segment is to find the messaging and modes of delivery that could achieve the best results. There are many gradations between these extremes. The Census Bureau’s work in delineating TEAs (Self-Response, Update Leave, and Update Enumerate) in 2020, and particularly in subdividing Self-Response into Internet First and Internet Choice, was solid. A challenge for 2030 is revisiting the whole nexus of TEA delineation—from information delivery/contact strategy, to the sequence of mailings and information broadcasts, to basic communication strategy and messaging—to efficiently steer potential respondents to the mode of choice (or the mode most likely to result in completion). This might involve more nuanced approaches than a simple Internet First/Internet Choice divide, a further shift from Update Leave to Self-Response, and—for the truly difficult to count—an in-person initial contact akin to Early Nonresponse Followup or Update Enumerate, if that is most likely to yield results.

___________________

1 Panel calculation from U.S. Census Bureau (2021a).

Table 12.1 Response Rate (%) Via Online Forms in Recent International Censuses

Census Year   Australia   New Zealand*   Canada   United Kingdom–England and Wales
2011          34.3        34.0           53.9     16.4
2016          58.8        87.9           68.3     —
2021          78.9        —              84.1     88.9

* New Zealand’s 2011 Census was canceled and postponed to 2013 by the Christchurch earthquake, shifting the usual 5-year cycle by 2 years—2016 to 2018 and 2021 to 2023. Data collection in the 2023 New Zealand Census was delayed by cyclone activity.

SOURCE: Harding et al. (2022:Table 3.3.2).

Specific recommendations consistent with this theme include:

Recommendation 6.1: The U.S. Census Bureau should continue to investigate the performance of the contact strategies used in the 2020 Census, to increase overall self-response (particularly via the internet) in 2030, more in concordance with international experience with online census response. This investigation should include review of the effectiveness of the 2020 Census’s segmentation into Internet First and Internet Choice strategies, as well as the timing and the operative contact/enumeration strategy (e.g., Internet First or Internet Choice mailings or Update Leave delivery) underlying successful Non-ID enumerations in 2020.

Recommendation 6.2: The U.S. Census Bureau should engage in further research to more specifically identify social, economic, and housing characteristics associated with less-than-accurate forms of enumeration, and should also research related communication and operational strategies that would improve self-response and associated data accuracy for those population and housing segments associated with poorer-quality census data in 2020.

Recommendation 6.3: The U.S. Census Bureau should assess the magnitude of the impact of imputation on the accuracy of 2020 Census data. Assessment of the differential impact of imputation on subpopulations should be used to prioritize research into strategies to improve self-response.

Recommendation 5.5: The U.S. Census Bureau should research and test approaches to increase the proportion of dwellings that can be mailed to while still being associated with a physical geographical location. This would reduce the proportion of addresses in Update Leave and expand the use of approaches yielding higher Self-Response for more dwellings in the 2030 Census.

12.1.2 Improve Quality of Nonresponse Followup Data

Census quality is a very complex construct, and it can be subject to dangerous oversimplification. We adopt the basic premise that self-response to the census is preferred over other modalities because it is likely to provide higher-quality information but there are no absolute guarantees. A census return is not necessarily perfectly accurate by dint of being returned by mail or transmitted with a keystroke; answers can still be missing or misleading. Likewise, administrative records data can vary in quality according to their source and recency, and neighbors or landlords contacted to provide proxy responses might have great familiarity with the households in question—or such responses might not be materially better than guesswork. Thus, even with attempts to maximize Self-Response, Nonresponse Followup (NRFU) will remain an important part of 2030 Census data collection, and so it is natural to ask how NRFU data might be collected more accurately—in terms of both coverage accuracy and content completeness/accuracy.

There are many possible dimensions for improvement of NRFU data that the Census Bureau can and should explore, including reassessing the minimum and maximum number of visits to a nonresponding housing unit (or the number of unsuccessful visits before resorting to alternative means) and considering whether centralized telephone follow-up could improve efficiency when accurate phone contact information is available. Our analyses of data on NRFU resolutions and the marked degree of age heaping in the 2020 Census consistently suggest that proxy reporting in 2020 NRFU was particularly problematic. For reasons that are not yet fully understood, proxy returns were markedly diminished in quality in 2020 compared with 2010, which makes the assessment of proxies especially important. Much could be learned in census testing about the nature of proxy response and about differentiating between more reliable and less reliable proxy responders—that is, about attaching a measure of confidence to the information provided during a proxy interview. Proxy-related questions should also be considered for research as part of a more detailed interrogation of 2020 Census operational data. But, in terms of broader visions for the 2030 Census and beyond, it is worth considering whether proxy reporting ought to be reserved for extraordinary circumstances—if permitted at all—when high-quality alternative information is available.
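
To make the idea of attaching a confidence measure to proxy interviews concrete, the following is a minimal, purely illustrative sketch rather than an actual Census Bureau method. It assumes a hypothetical research file in which 2020 proxy returns have been matched to administrative records and scored for agreement; the variable names are invented placeholders and the data are synthetic.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
proxies = pd.DataFrame({
    "is_landlord": rng.integers(0, 2, n),        # 1 = landlord/manager, 0 = neighbor (hypothetical)
    "contact_attempts": rng.integers(1, 7, n),   # in-person attempts before a proxy was used
    "multiunit_structure": rng.integers(0, 2, n),
})
# Synthetic "truth": whether the proxy roster agreed with administrative records
logit = -0.5 + 0.8 * proxies["is_landlord"] - 0.2 * proxies["contact_attempts"]
proxies["agrees_with_admin"] = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = proxies[["is_landlord", "contact_attempts", "multiunit_structure"]]
y = proxies["agrees_with_admin"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)

# The predicted probability of agreement serves as a per-interview confidence score
proxies.loc[X_test.index, "confidence"] = model.predict_proba(X_test)[:, 1]
print(proxies.loc[X_test.index, "confidence"].describe())
```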

Specific recommendations along these lines include:

Recommendation 7.1: As a goal for the 2030 Census, the U.S. Census Bureau should consider a major reduction in the use of proxy interviewing for enumeration, if not the elimination of proxy reporting in all but very limited circumstances. Work toward this goal should be predicated on the results of research that only the Census Bureau can conduct, reexamining and comparing proxy reports in the 2020 Census with the information that would have been available from administrative records or third-party data resources.

Recommendation 7.2: The U.S. Census Bureau should consider conducting research on the quality, and factors associated with variation in the quality, of proxy and count-only responses, with the aim of minimizing the occurrence of these factors.

Recommendation 7.3: The U.S. Census Bureau should conduct fine-grained analysis of Nonresponse Followup enumerations (separately for household, proxy, and administrative records responses) for small geographic areas and population groups as part of its research to develop and test ways to improve the quality of enumeration in the 2030 Census. The Census Bureau should publish the results of its analysis as an aid to users of 2020 Census data.

12.1.3 Reduce Quality and Coverage Gaps by Race, Ethnicity, and Socioeconomic Status

In considering various approaches to enumeration and ways to optimize their implementation, it is important for the Census Bureau to pay special attention to outcomes for traditionally hard-to-count populations, such as racial and ethnic minorities, those with lower socioeconomic status, and babies and young children. Our analyses show clear patterns of less-optimal response modalities (i.e., lower Self-Response and higher NRFU proxies) for these groups, and the PES shows increases in differential undercoverage relative to 2010 for racial and ethnic minorities and for renters.

The challenge for 2030 Census planning is not just identifying these kinds of quality and coverage deficits through analysis of 2020 Census and other data, but developing specific, implementable actions to address the deficits. One clear direction in this regard is for the Census Bureau, in conjunction with the U.S. Office of Management and Budget and with Congressional oversight, to reach closure early on the mechanics of collecting race and ethnicity data in 2030. The reporting of race and Hispanic origin in the 2020 Census was partially impaired by the relatively late-stage continuation of the two-question format for these items on the questionnaire itself, coupled with changes in the coding and processing of write-in categories. Early closure on the form of the relevant race and Hispanic-origin questions to be used in 2030 is certainly not a panacea for quality and coverage differentials, but is a good step nonetheless.

Along these lines, we recommend:

Recommendation 10.1: The U.S. Census Bureau should conduct research to determine how the changes in format and processes that were made in the 2020 Census and in the American Community Survey beginning in 2020 affected the distributions of race and ethnicity. Such research should use qualitative, quantitative, and simulation methods to ascertain: how respondents viewed and used the 2010 and 2020 formats; trends in multirace reporting by age, sex, race, and ethnicity; how samples of 2020 respondents would have been categorized using the 2010 format, data capture, and coding; and the implications of differences in write-ins by response mode (e.g., more write-ins for internet responses) on race distributions among geographic areas. The Census Bureau should communicate the results of this research to data users to assist them to understand the implications of the changes for time series, and should also use the findings to inform 2030 Census planning.

Recommendation 10.2: To improve the quality of reporting on race and ethnicity in the census, American Community Survey (ACS), and other federal data collections, the U.S. Office of Management and Budget (OMB) should revise Statistical Policy Directive No. 15 to adopt a combined, check-more-than-one race/ethnicity classification with both Hispanic and Middle Eastern or North African as main categories, in addition to White, Black, American Indian and Alaska Native, Asian, and Native Hawaiian and Other Pacific Islander. OMB should formalize any changes to the race and ethnicity reporting standards as soon as practicable, to permit the U.S. Census Bureau to implement the new race/ethnicity question in the ACS and in the 2030 Census. OMB should allow agencies to collect data only for the major categories in the combined question when added detail (as in the 2020 Census) would impose undue administrative burdens. A decision by the Census Bureau on including added detail and write-in spaces, along with expanded data capture and coding procedures, for a combined race/ethnicity question for the ACS and the 2030 Census should await completion of research on such matters as how respondents viewed the 2020 question format and the effects of response-mode differences in write-in responses on diversity among geographic areas.

Recommendation 10.3: The U.S. Census Bureau should produce a crosswalk or bridge between the 2010 and 2020 Census race and ethnicity questions and responses. Similarly, if a combined race/ethnicity question (with Hispanic and Middle Eastern or North African categories) is adopted as the standard by the U.S. Office of Management and Budget, the Census Bureau should produce a crosswalk or bridge between the 2020 version and revised race and ethnicity questions and responses, as soon as the revisions are implemented in the American Community Survey and then in the 2030 Census. The Census Bureau should involve data users in this important work.

Recommendation 10.4: The U.S. Census Bureau should consult as early as possible with the redistricting community, which includes state legislators, redistricting commissions, political parties, and political consultants, among others, to determine the optimum set of tabulations to include on the 2030 Redistricting File, whether the current race and ethnicity categories are retained or the proposed revisions take effect. The consultation should include consideration of streamlining the number of race/ethnicity categories in the file and combining blocks with small populations (using input from localities) to maximize the ability to protect confidentiality while maintaining accuracy.

12.1.4 Continuous Modernization of Geographic Resources for All Living Quarters

An understated success story of the 2020 Census is that there was not a comprehensive, listers-walking-every-street Address Canvassing operation in 2019. Rather, through partnership work early in the decade and development of systems for in-office review of imagery and other data resources, the Census Bureau shifted the great bulk of Address Canvassing work to in-office review rather than in-field listing. This fourth plank of Recommendation 12.1 echoes the “modernizing geographic resources” innovation area suggested by National Research Council (2011) and urges the Census Bureau to continue its good work in the geographic arena, but also calls for escalation in important ways. First, In-Office Address Canvassing (IOAC) should endure—early, often, and continuously—but it should also be allowed to fulfill its original vision. The early discontinuation of Active Block Resolution due to budget constraints was regrettable, as it diminished IOAC’s role to one of simply documenting discrepancies in housing unit counts between the Master Address File (MAF) and available imagery (rather than actually correcting address issues through in-office research). The seemingly solid performance of the Office-Based Address Validation operation (verifying respondent-supplied addresses in census returns without a MAFID through query of in-office data) portends good things about a fully realized IOAC program. Second, the small but very relevant nonhousehold population living in GQs warrants commensurate attention earlier in the decade; the Census Bureau has taken first steps toward making the MAF an ongoing repository of all living quarters, household and GQ alike, and these efforts should continue. Finally, in both the household and GQ context, an address repository is necessary but not sufficient; attention should be paid to the best ways to actually contact residents at those addresses (e.g., by mail or in person) and collect that contact-strategy information on more than a once-a-decade-refresh basis.
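
As a purely illustrative sketch of the kind of triage a fully realized IOAC program implies (not the Census Bureau's actual Active Block Resolution logic), the following flags blocks whose MAF housing-unit counts diverge from a hypothetical imagery-derived tally; the thresholds, field names, and data are invented for illustration.

```python
import pandas as pd

blocks = pd.DataFrame({
    "block_geoid":      ["010010201001000", "010010201001001", "010010201001002"],
    "maf_hu_count":     [42, 7, 118],
    "imagery_hu_count": [41, 15, 119],   # hypothetical tally from in-office imagery review
})

blocks["abs_diff"] = (blocks["maf_hu_count"] - blocks["imagery_hu_count"]).abs()
blocks["pct_diff"] = blocks["abs_diff"] / blocks[["maf_hu_count", "imagery_hu_count"]].max(axis=1)

# Flag blocks whose counts diverge enough to warrant active resolution (further
# in-office research or in-field checking) rather than mere documentation.
needs_resolution = blocks[(blocks["abs_diff"] >= 5) | (blocks["pct_diff"] >= 0.10)]
print(needs_resolution[["block_geoid", "maf_hu_count", "imagery_hu_count", "abs_diff"]])
```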

Accordingly, we recommend:

Recommendation 5.4: Lessons learned from the Office-Based Address Verification operation should be documented and incorporated into the tools and techniques for ongoing, continuous address and map data refinement (e.g., continuous In-Office Address Canvassing) as well as revisions of the Non-ID Processing operation.

Recommendation 9.1: The U.S. Census Bureau’s steps to make the Master Address File an ongoing, continuous inventory of all living quarters—including group quarters (GQs)—should continue, with an eye toward developing updated information for GQs on par with the regular updating of conventional housing units. This should include a separate inventory of GQ types that have longstanding address provenance and those that need to be tracked more closely because they are continually changing.

Recommendation 9.2: Group quarters (GQs) require more than updated address information; they require continually updated contact strategies as well. The U.S. Census Bureau should develop “selected advance contact” protocols with GQ facilities by type (and their parent companies, as appropriate), akin to follow-ups performed with businesses, and state, local, and tribal partners should be engaged in this update activity.

Recommendation 9.3: In light of the 2020 experience, the U.S. Census Bureau should convene group quarters (GQ) stakeholders in a discussion of more effective and useful options for electronic provision of data for GQ residents, and on the issues involved in obtaining access to administrative records for GQ facilities. The completeness of such records should also be assessed by GQ type.

Recommendation 5.3: The U.S. Census Bureau should extend and improve during-the-decade programs, akin to the Geographic Support Systems Initiative of the 2010s, with the goal of continuous address- and map-database refinement. Such efforts should include engagement with state, local, and tribal partners, including on the question of whether some addresses can safely be removed from the Master Address File.

12.1.5 Realign Balance between Utility, Timeliness, and Confidentiality Protection in 2030 Census Data Products

As discussed in Section 1.2.2, census quality is a multidimensional construct, and unduly weighting any one dimension to dominate all others can be detrimental. One important dimension of data quality—timeliness—fell particularly by the wayside in the 2020 Census experience, in part due to delays arising from the COVID-19 pandemic but mainly due to the late-stage implementation of disclosure avoidance based on differential privacy (see Chapter 11), which had the effect of weighting confidentiality protection above all. This fifth plank of Recommendation 12.1 calls for the balance to be righted among critical aspects of quality in 2030.

Both national and international sources identify timeliness as an aspect of data quality. Principles and Practices for a Federal Statistical Agency specifies as part of practice 9 that “dissemination should be timely, and information should be made readily available on an equal basis to all users” (National Academies of Sciences, Engineering, and Medicine, 2021:93). In both domestic and international standards for statistical quality, timeliness is one of many dimensions that comprise overall quality (Federal Committee on Statistical Methodology, 2020; see also Box 1.1).2

For most uses of the census, data are time sensitive. Census law in Title 13 of the U.S. Code requires that apportionment counts be delivered by December 31 of Census Year and the P.L. 94-171 redistricting data three months later (by April 1 of Census Year + 1). The time sensitivity of these data is due to the constitutional necessity to reapportion the U.S. House of Representatives and the desire by most states to have recent data for redistricting. Yet there are additional purposes—some governmental and others in the private sector—for which census data are not just time sensitive but actually perishable. Stale data can become useless data.

In prior U.S. censuses, timeliness and geographic disaggregation were both major foci of the decennial census data products program. Through 1990, basic counts for places were reported as “preliminary” counts by the field staff very shortly after the field enumeration. This was done as early as May or June of the census year, so that local officials could see and challenge counts (as appropriate) before they were made official, and this was done even for small places.3 In the 20th century, cross-tabulations were released in “advance” unbound formats, which were then assembled into hardback books by state, usually by Census Year + 2. (Computer summary files, as described in Chapter 11, were also released on a timely basis.) Moreover, in providing these timely glimpses, the Census Bureau was well aware of the needs of localities. The introductory description of the “publication program” in the 1960 Census Procedural History emphasizes both the “timeliness of the data” and the “quantity of 1960 statistics published” (U.S. Census Bureau, 1966):4

The goal of the publication program for the 1960 Censuses, like that of previous censuses, was to make the data available as soon as possible after the census was taken, and to do so while maintaining the Bureau’s standards relating to the quality of the statistics issued and the way presented.

___________________

2 See also https://unstats.un.org/unsd/methodology/dataquality/references/1902216UNNQAFManual-WEB.pdf.

3 See, for example, the account in the New York Times describing the August 1990 release of preliminary 1990 Census estimates for about 39,000 local governments in 26 states (“Earliest Figures for Census Fall Below Estimate,” August 23, 1990, at https://www.nytimes.com/1990/08/23/us/earliest-figures-for-census-fall-below-estimate.html).

4 The 1960 Census procedural history continues (U.S. Census Bureau, 1966):

The quantity of statistics published for the 1960 Censuses was greater than that of earlier censuses. One of the major policy decisions resulting from consultation with the Census Advisory Committees and with the Conferences of Census Users was that more data should be published for small areas, both for use by local communities and for detailed research on variations of characteristics within larger areas. Users of the census data could then group the data for small areas to build statistics for whatever larger area was appropriate for their purposes, such as a homogeneous economic area or the area served by a particular facility. In addition there was an increase in the subject matter for which statistics were published.

Other English-speaking countries that conducted traditional censuses in the COVID-19-affected year of 2021—Australia, Canada, and England and Wales—have already released their findings. By comparison, the U.S. 2020 Census has slipped further behind in releasing data—some of which (i.e., missing the December 31, 2020, deadline for apportionment totals) is eminently explainable by the lengthy fieldwork delay induced by the pandemic. But the COVID-19 pandemic is not an adequate explanation for all of the subsequent delays; rather, it appears that the adoption of differential privacy-based disclosure avoidance presented both anticipated and unanticipated obstacles to releasing the data. The Census Bureau itself most forcefully expressed this in its May 31, 2023, press release announcing the delay of some detailed 2020 Census data products until September 2024: “Our existing disclosure avoidance system could not ensure confidentiality protections for people within substate geographies for these tables while also meeting the Census Bureau’s data quality standards” (U.S. Census Bureau, 2023b). Moreover, the same press release announced a startling reduction in the number of geographies that would be available in the Supplemental Demographic and Housing Characteristics Files product—from original plans to publish some tables at the “county, census tract, block group, American Indian/Alaska Native/Native Hawaiian [area], and [place]” level to the national and state levels only.

Withholding local data is also a departure from previous U.S. census practice. The planned unavailability of most 2020 data for local geographies affects school districts, emergency planners, land-use planners, and many other users. The identification of local needs, and the creation of policies and programs to address those needs, depend on data that accurately reflect the relevant population. The absence of timely population data for small geographic areas, for example, can seriously compromise the ability of local officials to respond to public health crises or, more generally, to prioritize use of scarce resources to support local communities. The delay and reduced detail constitute a markedly diminished return on the nation’s investment. Importantly, this is a data-equity issue for nonmetropolitan areas, many of which are already classified as persistent-poverty areas.5 Small communities are especially disadvantaged, and, in many states, the majority of communities are relatively small. Larger metropolitan areas, big cities, and other large jurisdictions may employ their own demographers and access other local resources for decision making.

___________________

5 See https://www.census.gov/library/stories/2023/05/persistent-poverty-areas-with-long-term-highpoverty.html.

Moreover, the size and resources of such larger jurisdictions make American Community Survey (ACS) data a viable and timelier alternative. Small towns, villages, and many tribal areas have much more limited choices, given their size, with no real alternatives to decennial census data. This is especially the case for states in which decennial census data are mandated for the distribution of revenue and other resources, leaving very small and frequently vulnerable communities at risk.

We recognize and agree with the importance of confidentiality protection as a major responsibility of the Census Bureau, as a statistical agency. Yet we are concerned that this one dimension of census and data quality has become so dominant in the Census Bureau’s approach as to threaten to preclude other critical considerations, such as utility and timeliness. Accordingly, in early planning for 2030, we strongly recommend that order and balance be restored—including a plea for that most basic of census information, the total population count unperturbed by disclosure avoidance, to be made available to functional geographic units.

Recommendation 11.1: For 2030 Census data products, the U.S. Census Bureau should adopt the risk-utility framework recommended by the Advisory Commission on Data for Evidence Building, which accepts that disclosure risk is a continuum, that not all data items are equally sensitive, and that federal statistical data need to be accessible and useful for a wide range of users and uses.

Recommendation 11.2: At a minimum, the 2030 Census should provide as-collected (i.e., unaltered) total population counts for all governmental units (states, counties and equivalents, minor civil divisions, incorporated places, recognized tribal areas, school districts) and quasi-governmental units (census county divisions, census-designated places, and tribal statistical areas), no matter how small. In addition, census block population totals should add up to block groups, census tracts, counties, and states.

Recommendation 11.3: For the 2030 Census data product plan, the U.S. Census Bureau should begin immediately on a multipronged research program with ample testing and opportunities for feedback and dialogue with the data user and stakeholder community, broadly defined. The goal should be an end-to-end plan by 2027–2028 for producing a suite of 2030 data products that serve user needs, are appropriately protected, and meet the time schedule of the 1990–2010 Censuses.

Recommendation 11.4: U.S. Census Bureau research and dialogue with the user community, broadly defined, on the end-to-end plan for the 2030 Census suite of data products should include:

  • A review of the 2020 Census experience, including rigorous assessment of the Census Bureau’s reconstruction-reidentification studies under plausible scenarios of potential attack and comprehensive debriefing of users.
  • Consultation with the redistricting community on race/ethnicity and geographic detail (accounting for the possibility of a combined race/ethnicity question) that best trade off accuracy and confidentiality protection for the 2030 Redistricting File.
  • Issuance of challenges to the research community, in addition to focused contracts, for research and development on a range of confidentiality-protection methods and understandable metrics of risk and accuracy to accompany those methods.
  • Research on practical methods for users to account for noise injected into 2030 Census data by the selected confidentiality-protection techniques.
  • Research on the application of cost-benefit and risk-utility analysis for making tradeoffs between confidentiality protection and utility of census data products.
  • Research on the sensitivity of individual data items to relevant communities at differing levels of geographic aggregation and the implications for confidentiality protection.

12.2 IMPROVING CENSUS RESEARCH AND DEVELOPMENT

We will return to specific areas for 2030 Census research-and-development work in the next section, but believe it important to offer some general precepts for such work first.

12.2.1 Continue to Learn from the 2020 Census Experience

The first of these general precepts is simply that there are still many important lessons to be drawn from the 2020 Census—from its as-yet-unreleased data products, from its internal evaluations and assessments, from the vast array of information that electronic data collection made uniquely possible in 2020, and from the history and evolution of 2020 Census procedures and practices.

Although everyone hopes that the 2030 Census will not be impacted by a global force as disruptive as the COVID-19 pandemic, we are concerned that the 2020 Census could be treated too much as a one-off set of never-to-be-repeated conditions and pressures.

As a first step, we urge the Census Bureau to proceed with its evaluation, experiment, and assessment reports as expeditiously as possible, particularly to the extent that those reports do not simply stop at rote cataloging and instead fuel deeper dives and simulations. In the same vein, a formal, documented procedural history has been invaluable for understanding the development of past decennial censuses. It is true that many pieces can be gleaned from dozens of detailed memoranda, but there is great value in having a coherent record. The lack of a procedural history for the 2010 Census is unfortunate, but one should be created for 2020.

Recommendation 2.1: In addition to completing its own program of evaluation and assessment reports, the U.S. Census Bureau should complete and publish a comprehensive procedural history of the 2020 Census.

By this precept of not closing the book on the 2020 Census too early, we also acknowledge the time and resource limitations on the work of our panel’s data analysis subgroup—and recognize that many of the essential analyses of 2020 Census and PES data can only be done by the Census Bureau itself. The Census Bureau should study issues that our panel was unable to examine, and should do so with analyses at subnational spatial and demographic resolution whenever possible. Among the many topics that fall under this heading are: assessment of the reengineered field-operation systems and their impact/efficiency; effectiveness of Non-ID Processing and timing of Non-ID responses; efficacy of differing numbers of NRFU enumeration attempts; and exploration of the impact of individual components of the 2020 Census contact strategy, including whether the effectiveness of the impromptu mailings performed as a result of the COVID-19 pandemic can be assessed.

We encourage the Census Bureau to approach reanalysis of 2020 Census operational and output data, as well as data from the 2020 PES, as analysis of a natural experiment on a grand scale. In many respects, 2020 Census data and paradata offer an ideal opportunity to learn about census procedures, as they were applied under the full trappings of a decennial census and in ways that cannot be replicated well in intercensal tests. In addition, the census itself provides huge sample sizes not available in census tests.

Our analyses in Chapters 6 and 7 illustrate instructive associations between response modes and many ACS variables, at the tract level. Census Bureau analysts have the knowledge and resources to dig much deeper, by incorporating operational variables and performing multivariate analyses (via regression or higher-dimensional tables) to more plausibly approximate causal relationships.

Also, by examining similarities between outlying domains identified through such modeling, the Census Bureau might discover regions or types of housing units not well served by current procedures. Further, the variety of data-collection procedures applied across different types of geographies and housing units motivates the use of richer statistical models that better represent how census processes interact with different types of housing units and households.6
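
The following is a minimal, illustrative sketch of such a tract-level multivariate analysis, using synthetic data and hypothetical variable names rather than the panel's or the Census Bureau's actual files: a regression of self-response rates on ACS-style covariates and an operational flag, followed by residual screening for poorly fit tracts.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2_000
tracts = pd.DataFrame({
    "tract_geoid": [f"T{i:05d}" for i in range(n)],
    "pct_renter": rng.uniform(0, 100, n),
    "pct_poverty": rng.uniform(0, 60, n),
    "internet_choice": rng.integers(0, 2, n),   # 0 = Internet First, 1 = Internet Choice (hypothetical flag)
})
tracts["self_response_rate"] = (
    75 - 0.15 * tracts["pct_renter"] - 0.20 * tracts["pct_poverty"]
    - 3.0 * tracts["internet_choice"] + rng.normal(0, 5, n)
)

model = smf.ols(
    "self_response_rate ~ pct_renter + pct_poverty + internet_choice",
    data=tracts,
).fit(cov_type="HC1")           # heteroskedasticity-robust standard errors
print(model.summary())

# Residual screening: tracts the model fits worst may point to regions or
# housing types not well served by current procedures.
tracts["residual"] = model.resid
print(tracts.nsmallest(10, "residual")[["tract_geoid", "self_response_rate", "residual"]])
```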

With specific reference to the PES, we recommend:

Recommendation 4.2: The U.S. Census Bureau should conduct additional analyses of the 2020 Post-Enumeration Survey (PES) to learn as much as possible about census omissions. One such analysis should include P-sample and E-sample households with at least one member in common, to assess not only erroneous census enumerations but also omissions and how they vary with household characteristics and type of enumeration (e.g., proxy interview, internet response). In addition, the Census Bureau should perform discriminant or similar analyses to identify variables that contributed to census component errors (e.g., omission, duplication) for 2020 P-sample and E-sample addresses, households, and individuals, which could identify census processes to target for improvement in 2030.

As a final example, there is likely useful information in the operational paradata that the Census Bureau may have retained, concerning difficulties that households experienced during the enumeration process. For instance, what parts of the census questionnaire did individuals have difficulty filling out, based on latencies in response to the internet instrument? What households contacted the Census Bureau via telephone for questionnaire assistance, and what languages did those households primarily speak? We are of the impression that paradata remain a useful, underutilized source of information for exploring which parts of the census process were problematic for various subsets of the census universe.
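
As an illustration of the kind of paradata analysis this implies, the following minimal sketch summarizes per-screen dwell times from a hypothetical event log of the internet instrument; the field names and data are invented and do not reflect actual Census Bureau systems.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
screens = ["address", "roster", "relationship", "race_ethnicity", "tenure"]
events = pd.DataFrame({
    "case_id": rng.integers(1, 1_000, 10_000),
    "screen": rng.choice(screens, 10_000),
    "seconds_on_screen": rng.lognormal(mean=3.0, sigma=0.8, size=10_000),
})

# Median time and share of long dwells per screen: screens where respondents
# linger are candidates for wording or design review in 2030 testing.
summary = events.groupby("screen")["seconds_on_screen"].agg(
    median="median",
    share_over_2min=lambda s: (s > 120).mean(),
)
print(summary.sort_values("share_over_2min", ascending=False))
```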

12.2.2 Improving Census Experimentation and Testing

Over the decades, it has become the norm to think of a census experiment as something that must be done at large scale and that must pit some completely novel approach or technique against standard procedures. Similarly, it has become the norm to equate “census test” with “test census,” favoring large-scale dry runs of substantial parts of census data-collection machinery. Both of these concepts are limiting in important ways, and we encourage the Census Bureau to approach census experimentation more broadly in planning the 2030 Census and the program of tests that will precede it.

___________________

6 Some candidates for use as independent and dependent variables in such models are provided in Appendix D.

The Census Bureau has regularly conducted randomized experiments as part of intercensal tests of census procedures, and randomized experiments are the most effective way to identify actionable, causal relationships. Accordingly, in the panel’s opinion, the Census Bureau should look for opportunities to conduct such experiments to learn about operations under consideration. But we argue that the Census Bureau should also seek opportunities to use randomized experimentation as a learning/evaluation tool. One drawback of 2020 data is that, to our knowledge, there were no substantial randomized experiments that would allow causal inference for any aspects of the enumeration procedures employed in 2020. One example of an opportunity for randomized experimentation was the delineation of Internet First and Internet Choice contact strategies in the 2020 Census. There are still some things to be learned about internet versus paper take-up in the two strategy groups, but insights would be enriched if a small (not disruptive to overall census conduct) experiment with randomized assignment had been embedded in 2020 Census production. Understandably, the Census Bureau’s overriding objective in fielding the census is to conduct the best possible (current) count, but we suggest that small provisions for randomizing application of methods could garner much valuable information at very little cost or disruption (and also without severely impacting individuals’ ability to be counted in the census). We encourage the Census Bureau to think ahead about embedding such possibilities in 2030 Census production to inform the 2040 Census.

Embedded randomized experimentation might also be the basis for studying the effectiveness and return on investment of the decennial census partnership and communication programs, as suggested in our interim report (National Academies of Sciences, Engineering, and Medicine, 2022:Conclusion 4.5). In the panel’s view, planning for the 2030 Census should consider designed evaluations of issues in the formation, coverage, continuity, and timing of these partnership and outreach functions. For a true experiment, one member of a pair of tracts or other domains evaluated as “similar” on relevant attributes could be randomized to intensive outreach and the other to less intensive (including no) outreach. Mode-specific participation could then be compared, yielding actionable, causal estimates. Alternatively, the focus of the experiment could be on ways to increase self-response for groups for which alternatives to the Self-Response mode are weak (e.g., low-quality administrative records sources).
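
A minimal sketch of the analysis such a paired design would support appears below; the pairing, outreach effect, and response rates are all synthetic placeholders, not estimates of any real program.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_pairs = 200

# Pairs of tracts already matched as "similar" on relevant attributes; each
# pair shares a baseline self-response propensity (synthetic, in percentage points).
baseline = rng.uniform(40, 80, n_pairs)

# Simulated outcomes under the design: within each pair, one tract (chosen at
# random in the field) receives intensive outreach with a hypothetical 2-point
# effect; its partner receives the standard outreach.
effect = 2.0
treated = baseline + effect + rng.normal(0, 4, n_pairs)
control = baseline + rng.normal(0, 4, n_pairs)

# Paired comparison of self-response rates across the matched tracts
t_stat, p_value = stats.ttest_rel(treated, control)
print(f"mean difference = {np.mean(treated - control):.2f} points, "
      f"t = {t_stat:.2f}, p = {p_value:.4f}")
```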

12.2.3 Systems Design to Facilitate Evaluation

A series of National Academies of Sciences, Engineering, and Medicine census study panels, beginning with National Research Council (1985) and perhaps expressed most forcefully by National Research Council (2004b:§ 8–B), made calls for the decennial census to retain a Master Trace Sample—effectively, a complete audit trail through all census enumeration and follow-up operations for a sample of households that could then be analyzed for cost and error structures. Attempting to shore up the entire technical/information infrastructure of the census, the latter National Research Council (2004b:Recommendation 8.7) panel argued that the vision should be a Master Trace System rather than a sample—meaning that this type of audit trail should be available for all households in the census, as a byproduct of regular system functioning.

As we understand it, the Census Bureau made important strides toward this vision in its development of information technology (IT) systems for the 2020 Census and its allowance for operational dashboards by which census managers could monitor real-time flows. Underlying the 2020 Census IT plans was the concept of a Census Data Lake, representing the initial essential step of data and paradata retention. However, the Census Bureau’s decision to convert to another software platform—and the subsequent reprogramming effort (see Box 6.2)—made for a late scramble to finalize the systems for deployment.

Accordingly, we echo our predecessor study panels in urging the Census Bureau to develop operational and technical systems for the 2030 Census with the requirement that census systems be amenable to both real-time monitoring and post hoc evaluation and reanalysis. Such a requirement would benefit the understanding of census quality and could ultimately improve all census operations. This recommendation is, in part, a reflection of the panel’s difficulty in identifying areas for analysis: our ability to ask for information and paradata was limited because we did not know exactly what information was being retained and how. We were commonly referred to the Census Bureau’s evaluation teams’ study plans, which we were grateful to be able to review behind the Census Bureau IT firewall, but those documents left open similar questions about what data were actually available because they were generally developed 2–3 years earlier. In some instances, our data requests were modified precisely because of the anticipated difficulty in retrieving information from the Census Data Lake without impeding the Census Bureau’s ongoing work on 2020 Census production. Technical systems developed with an eye toward facilitating eventual research and evaluation would be a great boon to any evaluator of census methods—most notably to the Census Bureau itself.
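
To illustrate what a per-case audit trail might look like in practice, the following is a minimal sketch of a structured event record of the kind a Master Trace System implies; the field names and operations are hypothetical, not an actual Census Bureau schema.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class CaseEvent:
    case_id: str          # MAFID or other case identifier (hypothetical)
    timestamp: datetime
    operation: str        # e.g., "mailout_1", "nrfu_visit", "admin_rec_enum"
    outcome: str          # e.g., "undeliverable", "proxy_interview", "resolved"
    mode: str             # e.g., "internet", "paper", "CAPI"

@dataclass
class CaseTrace:
    case_id: str
    events: List[CaseEvent] = field(default_factory=list)

    def log(self, operation: str, outcome: str, mode: str) -> None:
        """Append an event as a byproduct of normal processing."""
        self.events.append(CaseEvent(self.case_id, datetime.now(),
                                     operation, outcome, mode))

    def contact_attempts(self) -> int:
        """Example post hoc query: how many NRFU visits did this case receive?"""
        return sum(1 for e in self.events if e.operation == "nrfu_visit")

# Usage: systems write events during production; evaluators query them later.
trace = CaseTrace("case-0001")
trace.log("mailout_1", "delivered", "paper")
trace.log("nrfu_visit", "no_contact", "CAPI")
trace.log("nrfu_visit", "proxy_interview", "CAPI")
print(trace.contact_attempts())   # -> 2
```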

12.3 IMPROVING CENSUS PROCESSES

For a variety of reasons, some topics on which we offer specific recommendations in this report did not rise to the level of priority goals outlined in Recommendation 12.1. This is not to diminish the importance of these topics—they remain vitally important to a quality 2030 Census—and they merit further explication under the general heading of improving census processes.

12.3.1 Continued Work on Administrative Records to Supplement Census Operations

The reason administrative records work is listed here but omitted as a priority goal area in Recommendation 12.1 is expressed in detail in Section 8.6. The way administrative records were used as a supplement to NRFU and other census operations in 2020 was appropriate and careful, including the use of records for enumeration in NRFU instances in which at least one in-person attempt failed to yield a return and the Census Bureau had high confidence in the availability of administrative records information for the household in question. There is considerable room for refinement and improvement in this limited, supplementary use of administrative records data, and we encourage it:

Recommendation 8.1: The U.S. Census Bureau should continue its research and development program concerning the best ways to use administrative records as a supplement to decennial census operations. Potential uses of administrative records include expanding enumeration of limited subsets of the 2030 Census population in the Nonresponse Followup workload, reducing proxy responses and whole-person imputations, and possibly redressing the long-standing net undercount of children ages 0–4.

But, as Section 8.6 argues, we do not believe that achieving administrative records enumeration of a very large share of the population, absent any contact attempt, should be a priority goal of 2030 Census planning and implementation. The move to an administrative records-based census is a much longer term proposition for which the evidentiary, technical, and legal base must be carefully developed. We understand the appeal of the approach from a cost-reduction standpoint, but pursuing minimum cost to the exclusion of all other aspects of census quality in 2030 would be akin to weighting confidentiality protection above and beyond all other considerations in 2020. Just as the Census Bureau is still grappling with the anticipated and unanticipated difficulties in completely supplanting its disclosure-avoidance and data-production routines, we worry that rushing to low-cost original enumeration via administrative records raises a great many conceptual and procedural issues that are unlikely to be resolved by 2030, and thus threatens to erode the role of the decennial census as an essential civic ceremony.

12.3.2 Continued Development of Master Address File Curation and Updating

The MAF figures importantly in Recommendation 12.1, from the standpoint of making the MAF a resource covering all living quarters—housing units and GQs alike. But several technical aspects of MAF curation and updating are also important to address in the coming years, along with clarifying the rules to make participant-input address-building operations more of an active partnership with the Census Bureau.

Recommendation 5.1: The U.S. Census Bureau should continue to research and refine the filters it applies to the Master Address File to derive functional operational extracts, with the intent to reduce the number of addresses cancelled during collection (i.e., flagged as deletes in Nonresponse Followup and other field operations). Along those lines, the Census Bureau should continue to study ways to partner with and more fully utilize U.S. Postal Service data (including the Delivery Sequence File) and potential sources for address addition and revision information between censuses, such as address data recoverable from administrative records extracts, with particular eye toward more regular updating in areas without mail delivery.

Recommendation 5.2: The rules and scope of participant-input address-building operations of the census require clarification, including the Local Update of Census Addresses, New Construction, and Count Review. The U.S. Census Bureau should continue to improve ways to work with user-supplied input or data resources, and make it easier for state, local, and tribal authorities to supply input in usable form, while more clearly laying out the expectations for those address-building operations.

12.3.3 Improvements to Major Census Coverage Evaluation Methods

The 2020 Census experience spotlighted stresses and strains not just on the mechanics of conducting the census but on the primary tools used to assess census quality as well. These tools are Demographic Analysis (DA), which produces basic national counts from sources including birth, death, Medicare, and international migration records and estimates, and comparison of census data with the independent PES. Chapter 4 discusses in considerable detail the important improvements that can be made to the application of these evaluation methods in 2030.
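
As a rough illustration of the DA logic just described, the sketch below builds a cohort benchmark from hypothetical counts of births, deaths, and net international migration and compares it with a census count to compute net coverage error. All numbers are invented for illustration.

    # Minimal sketch of the Demographic Analysis comparison: build a population
    # benchmark from vital records and migration estimates, then compare it with
    # the census count. All figures below are hypothetical.
    def da_benchmark(births: int, deaths: int, net_international_migration: int) -> int:
        """Demographic balancing for a birth cohort: births - deaths + net migration."""
        return births - deaths + net_international_migration

    def net_coverage_error(census_count: int, benchmark: int) -> float:
        """Positive values indicate a net overcount relative to the DA benchmark."""
        return (census_count - benchmark) / benchmark

    benchmark = da_benchmark(births=4_000_000, deaths=25_000, net_international_migration=15_000)
    print(f"{net_coverage_error(census_count=3_850_000, benchmark=benchmark):.1%}")
    # -3.5% for this hypothetical cohort, i.e., a net undercount

The research items in the recommendation that follows bear directly on the inputs to such a calculation: assumptions about Medicare enrollment, the assignment of Hispanic origin to births, and the range of net international migration estimates all shift the benchmark, and hence the implied coverage error.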

Recommendation 4.1: For the Demographic Analysis (DA) coverage evaluation method, the U.S. Census Bureau should conduct research on:

  • The reasonableness of the assumption about the incompleteness of Medicare enrollment in the high series, which produced coverage estimates that diverged greatly from the low and middle series for people ages 75 and older, to be assessed through an appropriate match study;
  • The suitability of Medicare data for coverage evaluation of the entire population ages 65 and older as in the 2010 Census and for improving the census count of this age group in 2030;
  • The reasonableness of the methods for assigning Hispanic and non-Hispanic births for coverage estimates for children and young adults, which produced wide differences between the low and high DA series in 2020;
  • Methods for narrowing estimates of net international migration, which affect estimates for Hispanic people and the total population; and
  • Methods for developing experimental subnational DA estimates, starting with young children.

Recommendation 4.3: For the Post-Enumeration Survey (PES) method, the U.S. Census Bureau should:

  • Perform sensitivity analyses, based on plausible assumptions, to put error bounds around such key operations as matching and imputation of match and enumeration status, to evaluate the quality of the 2020 PES and plan such analyses from the outset for the 2030 PES (see the illustrative sketch following this recommendation);
  • Plan analyses for the 2030 PES of components of error for census operations in exhaustive and mutually exclusive categories that are as comparable as possible between 2020 and 2030;
  • Seek adequate funding to increase the PES sample size in 2030 to at least 2010 levels; and
  • Experiment with modeling to estimate net undercoverage and overcoverage for more detailed geographic and demographic groups for which direct estimates could not be provided in 2020.
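
The illustrative sketch referenced in the first bullet follows. The PES relies on a dual-system (capture-recapture) estimator; here, hypothetical counts show how uncertainty about imputed match status for unresolved cases translates into bounds on the population estimate. The figures and the simple all-or-nothing bounding rule are illustrative assumptions, not the Census Bureau's estimation methodology.

    # Sketch of a sensitivity analysis around match-status imputation, using a
    # Lincoln-Petersen style dual-system estimator. All counts are hypothetical.
    def dual_system_estimate(census_correct: float, pes_total: float, matched: float) -> float:
        """N = (correct census enumerations * PES sample total) / matched persons."""
        return census_correct * pes_total / matched

    census_correct = 9_500      # census enumerations judged correct in the domain
    pes_total = 10_000          # PES sample persons (weighted)
    matched_resolved = 9_000    # PES persons matched to the census after resolution
    unresolved = 300            # PES persons whose match status had to be imputed

    # Bound the estimate by treating all unresolved cases as matches, then as non-matches.
    low = dual_system_estimate(census_correct, pes_total, matched_resolved + unresolved)
    high = dual_system_estimate(census_correct, pes_total, matched_resolved)
    print(f"population estimate between {low:,.0f} and {high:,.0f}")
    # between 10,215 and 10,556 under these hypothetical counts

In practice, plausible assumptions would be far more refined than this all-or-nothing bound, but the exercise conveys how sensitivity analysis can place honest error bounds around matching and imputation operations.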

Recommendation 4.4: The U.S. Census Bureau should prioritize research on potential sources of coverage errors—both undercounts and overcounts—for geographic areas and population groups, using additional methods besides further analysis of the Post-Enumeration Survey. To address omissions from the census, the Census Bureau should match 2020 Census records with the 2019–2021 American Community Survey and a wide variety of administrative records for census tracts—perhaps sampling those with low self-response rates. The goal would be to provide an evidence base for methods and data sources that could potentially reduce omissions in the 2030 Census. To address duplications in the census, the Census Bureau should match 2020 Census records with themselves to identify duplicate enumerations of people with more than one residence, college students enumerated at college and at home, children in joint custody arrangements, and the like. The goal would be to provide an evidence base for methods to reduce duplicates in the 2030 Census and potentially to remove them from the count, just as people duplicated within a household are dropped from the count.
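
As a simplified illustration of the within-census matching this recommendation envisions, the sketch below flags pairs of person records that share identifying characteristics but appear at different addresses, as with a college student counted at school and at the parental home. The fields and the deterministic rule are illustrative assumptions; production matching would rely on probabilistic techniques, blocking, and far richer identifying information.

    # Illustrative sketch: match census person records against themselves to flag
    # potential duplicate enumerations at different addresses. Fields and the
    # matching rule are simplified assumptions, not production methodology.
    from itertools import combinations

    def is_potential_duplicate(a: dict, b: dict) -> bool:
        """Flag two records at different addresses that share name and date of birth."""
        return (
            a["address_id"] != b["address_id"]
            and a["dob"] == b["dob"]
            and a["last_name"].lower() == b["last_name"].lower()
            and a["first_name"].lower() == b["first_name"].lower()
        )

    records = [
        {"first_name": "Ana", "last_name": "Lopez", "dob": "2002-03-14", "address_id": "DORM-17"},
        {"first_name": "Ana", "last_name": "Lopez", "dob": "2002-03-14", "address_id": "HOME-42"},
        {"first_name": "Ben", "last_name": "Lopez", "dob": "1999-07-02", "address_id": "HOME-42"},
    ]
    flagged = [(a, b) for a, b in combinations(records, 2) if is_potential_duplicate(a, b)]
    print(len(flagged))  # 1: the student counted at both the dorm and the home address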

12.3.4 Revisit Unplanned Contingency Changes from 2020 that Merit Further Consideration

Finally, in taking stock of the procedural history and lessons learned from the 2020 Census, we encourage the Census Bureau to review the operational revisions made dynamically in 2020, in reaction to the COVID-19 pandemic and other difficult enumeration conditions, and to consider which lessons could be applied on a planned basis in the 2030 Census. One such unplanned contingency measure that merits consideration for 2030 is the Census Bureau's use of its separate Contact Frame to derive phone numbers, which permitted enumerators to make telephone contact with some NRFU cases, particularly in hard-to-access settings. A role for some form of centralized telephone callout as an alternative response mode in NRFU might increase the number of contact attempts made with actual household members and reduce the need for proxy enumeration. Another 2020 contingency measure worthy of assessment and reconsideration is the Every Door Direct Mail postcard that was sent to post office box addresses; it is worth considering the benefit of a bolstered call to action sent to all addresses, not just those with regular delivery by postal mail carriers.

12.4 IMPROVING CENSUS LAW

We close with a note on a topic that is not traditionally a focus of National Academies panels, and hence not an area in which we offer specific recommendation language. But the 2020 Census experience points to some foundational risks and challenges for which the only viable solution may be serious reexamination and reinterpretation of census law in Title 13 of the U.S. Code, if not legislative revision of related statutes.

The 400-plus sections of Title 13 of the U.S. Code represent an omnibus compilation of provisions: establishing the Census Bureau, authorizing the Secretary of Commerce to administer various census and statistical data collections, both demographic and economic, and providing for data-sharing collaborations among federal agencies and, in several cases, between the federal government and state and local governments. Title 13 includes quite specific provisions on details ranging from the development and purchase of "mechanical and electronic" equipment to the penalties for violating provisions of the title (including violations of the confidentiality of the information collected). As described in Box 12.1, major changes to Title 13 have been relatively infrequent since the act was codified in 1954.

The crux of the debate over the implementation of new disclosure avoidance methods in the 2020 Census is that the privacy and confidentiality protections of 13 U.S.C. § 9, some version of which has been in statute since 1909, are strong but open to interpretation. It is unclear, for instance, whether the section's prohibition of "any publication whereby the data furnished by any particular establishment or individual under this title can be identified" extends to any characteristics of individuals that might, in combination with other auxiliary data, leak some personal information. Moreover, Section 9's protections do not operate in a vacuum; there is a wide array of case law and other privacy and confidentiality law alongside which the decennial census and its confidentiality protections must operate. Prominent among these, as discussed in Chapter 9, is the problem experienced with college and university student housing in the 2020 Census: the Family Educational Rights and Privacy Act's limitation of shareable student information to "directory information" posed a major challenge as the Census Bureau scrambled to get colleges and universities to respond electronically amid campus shutdowns.

In this report, we offer one specific suggestion along these lines—shifting some of the burden of confidentiality protection to the user community.

Recommendation 11.5: The U.S. Census Bureau should welcome initiatives to add language to appropriate legislative vehicles, such as the Foundations for Evidence-Based Policymaking Act of 2018, that prescribes responsibilities and penalties for data users, in addition to agency staff, for willful, harmful disclosure of confidential information.

More generally, there may be a need to revisit the interpretation of Title 13’s confidentiality protections and reconcile them with provisions in other laws, lest divergent interpretations raise new challenges to census operations.

We also note that the current language of Title 13 does not include an explicit mission statement for the Census Bureau or the decennial census, nor does it speak to the broader purposes of census data collections and products (save for the specificity of the redistricting data file in P.L. 94-171). It could be useful to the vital, participatory civic ceremony that is the U.S. decennial census to have its importance noted—and for the Census Bureau’s function as a scientific, statistical agency to be affirmed—in positive law, for 2030 and beyond.

In 2012–2018, it was impossible to know the full array of challenges that would ultimately confront the 2020 Census, and the same can be said looking ahead to the 2030 Census and beyond. There are broader societal and physical trends that portend great new challenges to the collection of quality census data, among them increasing fluidity of sex and gender identity and increased vulnerability to major disruption from natural (and manmade) disasters. Yet there are also broader forces that may actually be beneficial to the cause of quality census taking, such as wider availability of broadband internet and fuller saturation of smartphone use (and hence comfort with electronic response). Reshaped workforces and workplaces coming out of the COVID-19 pandemic, perhaps with wider adoption of telework and corresponding shifts in the understanding of "usual residence," are also major forces whose effects on census conduct remain to be determined. In this context, it is critically important that the goals and designs for the 2030 Census be developed in true partnership with census data users and the myriad community of stakeholders and state, local, tribal, and federal government partners (communities defined as broadly as possible) that make the census the vital, grand civic ceremony upon which the nation relies.
