The 2000 Census: Counting Under Adversity (2004)

Appendix A

Panel Activities and Prior Reports

Formed in response to a 1998 request by the U.S. Census Bureau, the Panel to Review the 2000 Census conducted a variety of activities to carry out its broad charge to review the statistical methods of the 2000 census. In this appendix, we provide additional detail on the meetings and previous publications of the panel.

A.1 LIST OF PANEL MEETINGS, WORKSHOPS, AND TRIPS BY PANEL MEMBERS

The Panel to Review the 2000 Census met 20 times as a whole between its inception in 1998 and the end of 2003. Additional detail on the panel’s schedule of meetings is given in Table A.1.

In addition, members and staff of this panel were joined by members of the Panel on Research on Future Census Methods in visiting local and regional offices during various phases of census operations. Visits conducted during census data collection are listed in Table A.2, and visits conducted to monitor the Accuracy and Coverage Evaluation (A.C.E.) Program are listed in Table A.3.


Table A.1 Meetings of the Panel to Review the 2000 Census

Meeting   Date                   Format
1         November 9, 1998       Open and closed sessions.
2         January 15, 1999       Open and closed sessions.
3                                Panel chair and four members toured the National Processing Center in Jeffersonville, Indiana, with local census personnel.
4         March 19, 1999         Open and closed sessions.
5         June 28, 1999          Open and closed sessions.
6         October 6–7, 1999      Workshop, and open and closed sessions.
7         February 2–3, 2000     Workshop, and open and closed sessions.
8         May 8–9, 2000          Open and closed sessions.
9         October 2–3, 2000      Workshop, and closed session.
10        March 22, 2001         Closed session.
11        June 20–21, 2001       Closed session.
12        October 22–23, 2001    Closed session.
13        July 17–18, 2002       Closed session.
14        October 30–31, 2002    Closed session.
15        March 12, 2003         Open and closed sessions. Open session held jointly with Panel on Research on Future Census Methods.
16        April 25, 2003         Closed session.
17        May 19, 2003           Closed session.
18        July 17, 2003          Closed session.
19        September 15, 2003     Closed session.
20        November 18, 2003      Closed session.

A.2 PUBLICATIONS

A.2.a The 2000 Census: Interim Assessment

On October 9, 2001, the panel released its interim report in pre-publication format. Titled The 2000 Census: Interim Assessment, the interim report assessed the Census Bureau’s March 2001 recommendation regarding statistical adjustment of census data for redistricting and reviewed census operations. By design, the interim report did not address the Census Bureau’s decision on adjustment for nonredistricting purposes, which was anticipated to occur on or about October 15 (the decision was actually announced on October 17). Subsequently, on November 26, the panel sent a letter report to William Barron, acting director of the Census Bureau. In the letter report, the panel reviewed the new set of evaluations prepared by the Census Bureau in support of its October decision.

In late 2001, these two reports—the letter report and the interim report—were issued as a combined volume that retained the title The 2000 Census: Interim Assessment (National Research Council, 2001a).


Table A.2 Site Visits to Regional and Local Census Offices, 2000

Location                  Date          Participants

Update/Leave
  Rapid City, SD          March 6       Heather Koball

Special Places
  Madison, WV             March 17      Heather Koball

List/Enumerate
  Remote Maine            April 6–7     Heather Koball

Nonresponse Follow-up
  Rochester, NY           April 20      Constance Citro
  Boston/Cape Cod, MA [a] April 28–29   David Binder [b], Heather Koball
  Sampson County, NC      May 15        Allen Schirm [b], Heather Koball
  Philadelphia, PA [a]    May 19        Lawrence Brown, Heather Koball
  New Orleans, LA         May 26        L. Bruce Petrie, Heather Koball
  New York City, NY [a]   May 26        Joseph Salvo [b], Andrew White
  Los Angeles, CA [a]     June 2        Robert Bell, Andrew White
  Chicago, IL [a]         June 9        Robert Hauser, Heather Koball
  Miami, FL               June 12       Benjamin King [b], Heather Koball
  Dallas, TX              June 21       L. Bruce Petrie, Heather Koball
  Atlanta, GA             July 28       Robert Hauser, Heather Koball

Data Capture Center
  Baltimore, MD           March 30      David Binder [b], William Eddy, Sallie Keller-McNulty [b], Janet Norwood, Joseph Salvo [b], Allen Schirm [b], Michael Cohen, Heather Koball, Andrew White

NOTES: All visits (except to data capture centers) included panel members and staff accompanying census enumerators during their work; the Rochester, NY, visit was to the local census office before follow-up commenced.

[a] Visits included both a local and a regional A.C.E. office.

[b] Member, Panel on Research on Future Census Methods.


A.2.b LUCA Working Group Report

Jointly commissioned by the Panel to Review the 2000 Census and the Panel on Research on Future Census Methods, the Working Group on LUCA brought together representatives of state and local governments to assess participation in the Census Bureau’s Local Update of Census Addresses (LUCA) Program. The Working Group was chaired by Joseph Salvo, director of the New York City Department of City Planning’s Population Division and a member of the Panel on Research on Future Census Methods.


Table A.3 Additional Site Visits to Accuracy and Coverage Evaluation Offices, 2000

Location                             Date        Participants
Seattle, WA [a]                      June 19     Michael Meyer [b]; Andrew White
McAllen, TX                          June 27     Heather Koball
San Francisco, CA                    June 30     Donald Ylvisaker [b]; Heather Koball
Greenville, MS                       July 7      L. Bruce Petrie; Heather Koball
Los Angeles, CA [a]                  July 10     Robert Bell; Heather Koball
Detroit, MI [a]                      July 14     William Eddy; Andrew White
Newark, NJ, and New York City, NY    July 21     Joseph Salvo [b]; Allen Schirm [b]; Michael Cohen; Heather Koball
Chicago, IL [a]                      August 4    Keith Rust [b]; Heather Koball

NOTES: All visits included panel members and staff accompanying A.C.E. interviewers during their work.

[a] Visits included both a local and a regional A.C.E. office.

[b] Member, Panel on Research on Future Census Methods.

Members of the working group (and their local government affiliations) were: Shoreh Elhami (Delaware County, Ohio); Abby Hughes (formerly with Arkadelphia, Arkansas); Terry Jackson (Georgia Department of Community Affairs); Tim Koss (Snohomish County, Washington); and Harry Wolfe (Maricopa Association of Governments, Arizona). Patricia Becker of APB Associates served as consultant to the working group.

The working group found that the interaction between local governments and the Census Bureau in LUCA helped identify important problems in the Master Address File (MAF) and revealed limitations in the Bureau’s methods to create and update the MAF. The report—including a detailed chapter on case studies of LUCA implementation in various communities—described strategies that local governments employed to locate and rectify address list gaps in their areas. Overall, the working group concluded that the LUCA program was beneficial for those communities that participated. The group suggested that the Census Bureau continue to build partnerships with local governments and to encourage broader participation by local authorities in any planned continuous address updating effort.

The working group’s final report (Working Group on LUCA, 2001) was published and distributed by the panels and the Committee on National Statistics (CNSTAT). It is not sold by The National Academies Press, but it is available on the Internet through the CNSTAT Web site (http://www7.nationalacademies.org/cnstat/1LUCAReport.pdf [9/1/03]).

A.2.c Workshop Proceedings

As part of its work, the panel held three open workshops on topics related to the A.C.E. and possible adjustment of the census counts for population coverage errors. In 2001, the panel issued a proceedings volume for each of these workshop meetings; each volume is an edited transcript of the presentations and discussions.

The first workshop (National Research Council, 2001e) was held October 6, 1999. It considered issues of the A.C.E. design that had not yet been completely worked out by the Census Bureau staff. Topics discussed included methods and issues for determining poststrata for estimation, obtaining the final sample of block clusters from a larger initial sample, and imputing values for missing responses on characteristics needed to define poststrata.

The second workshop (National Research Council, 2001f) was held February 2–3, 2000. It covered the dual-systems estimation process, as planned for the 2000 census, from beginning to end.

The third workshop (National Research Council, 2001g) was held October 2, 2000. It laid out the process the Census Bureau planned to follow in order to reach a decision by March 1, 2001, on whether to adjust the census counts for purposes of congressional redistricting.
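
Of the topics taken up at the first workshop, the imputation of missing responses on characteristics needed to define poststrata lends itself to a brief illustration. The sketch below is not the Census Bureau’s procedure; it is a minimal hot-deck-style example in Python, with all field names and values invented, that fills a missing characteristic from a donor record matching on observed characteristics so that every person can be placed in a poststratum.

    import random

    # Toy records from a coverage survey; tenure is sometimes missing.
    # All field names and values are hypothetical.
    records = [
        {"age_group": "18-29", "race_eth": "Hispanic", "tenure": "renter"},
        {"age_group": "18-29", "race_eth": "Hispanic", "tenure": None},
        {"age_group": "30-49", "race_eth": "Non-Hispanic White", "tenure": "owner"},
        {"age_group": "30-49", "race_eth": "Non-Hispanic White", "tenure": None},
        {"age_group": "30-49", "race_eth": "Non-Hispanic White", "tenure": "owner"},
    ]

    def hot_deck_impute(records, key_vars, target):
        """Fill missing target values from donors that match on key_vars."""
        donors = {}
        for r in records:
            if r[target] is not None:
                donors.setdefault(tuple(r[v] for v in key_vars), []).append(r[target])
        all_donor_values = [v for vals in donors.values() for v in vals]
        for r in records:
            if r[target] is None:
                pool = donors.get(tuple(r[v] for v in key_vars)) or all_donor_values
                r[target] = random.choice(pool)   # borrow from a matching donor
        return records

    hot_deck_impute(records, key_vars=["age_group", "race_eth"], target="tenure")
    for r in records:
        print(r)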

A.3 COMMISSIONED PAPER

To supplement its work, the panel commissioned a paper by David Harris, Department of Sociology and Institute for Social Research at the University of Michigan, to review the measurement of race and ethnicity in the 2000 census.

In the paper, Harris (2003) considers the process and method by which the Office of Management and Budget (OMB) revised its Directive Number 15 to create the racial and ethnic classifications for the 2000 census, and examines how the 2000 data can be used by government agencies and researchers. As part of this discussion, Harris looks at the social perspective on race that informed OMB’s revisions, and offers a critique of the new classification system and its uses.

Harris concludes that OMB’s classification and tabulation guidelines are often somewhat arbitrary and fail to acknowledge that racial classifications emerge from social interactions. Harris recommends that future attempts at data collection, and current users of the 2000 census data, employ a social theory of race. He also suggests that data users look to other data sources to help develop models for classification patterns for multiracial respondents.

A.4 LETTER REPORTS

Although the letter reports issued by the panel are available on the National Academies Press Web site (http://www.nap.edu), we reprint here the two of the panel’s three letters that have not previously appeared in any of the bound panel reports. References and acronyms in the letter reports have been minimally edited for consistency with the rest of the volume.

A.4.a May 1999 Letter Report

Dr. Kenneth Prewitt

Director

U.S. Bureau of the Census

Room 2049, Building 3

Washington, DC 20233

Dear Dr. Prewitt:

As part of its charge, the new Panel to Review the 2000 Census offers this letter report on the Census Bureau’s plans for the design of the Accuracy and Coverage Evaluation (A.C.E.) survey, a new post-enumeration survey. This survey is needed in light of the recent U.S. Supreme Court ruling regarding the use of the census for reapportionment.

In general, the panel concludes that the A.C.E. design work to date is well considered. It represents good, current practice in both sample design and poststratification design, as well as in the inter-relationships between the two. In this letter the panel offers observations and suggestions for the Census Bureau’s consideration as the work proceeds to complete the A.C.E. design.

Background

Because it is not possible to count everyone in a census, a postenumeration survey is an important element of census planning. The survey results are combined with census data to yield an alternative set of estimated counts that are used to evaluate the basic census enumeration and that can be used for other purposes. For 2000, an Integrated Coverage Measurement (ICM) survey had been planned for evaluation and to produce adjusted counts for all uses of the census.1 The recent U.S. Supreme Court ruling against the use of sampling for reapportionment among the states eliminates the need for a post-enumeration survey that supports direct state estimates, as was originally planned for the ICM survey. (The state allocations of the ICM sample design deviated markedly from a proportional-to-size allocation in order to support direct state estimation. Specifically, the ICM design required a minimum of 300 block clusters in each state.) Alternative approaches are now possible for both sample and poststratification designs for the 2000 A.C.E. survey. As a result, the planned A.C.E. post-enumeration survey will differ in several important respects from the previously planned ICM survey.
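
The logic by which a post-enumeration survey is combined with census data is dual-systems (capture-recapture) estimation. As a hedged illustration only, the simple Lincoln-Petersen form estimates a poststratum total as the census count times the survey count divided by the number of people found in both systems; the production A.C.E. estimator adds corrections for erroneous enumerations, imputations, and movers that this sketch omits, and all numbers below are invented.

    def dual_systems_estimate(census_count, survey_count, matched_count):
        """Lincoln-Petersen estimator: N_hat = N1 * N2 / M.

        Assumes the census and the survey are independent and matching is
        error-free; the production A.C.E. estimator is more elaborate.
        """
        if matched_count == 0:
            raise ValueError("need at least one matched person")
        return census_count * survey_count / matched_count

    # Invented numbers for one poststratum: 900 people counted in the census,
    # 100 counted in the coverage survey, 88 found in both.
    n_hat = dual_systems_estimate(900, 100, 88)
    print(round(n_hat))         # about 1023
    print(round(n_hat) - 900)   # implied net undercount of roughly 123 people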

Plans for A.C.E. Sample and Poststratification Design

Our understanding of the current plans for the A.C.E. survey is based on information from Census Bureau staff.2 Building on its work for the previously planned ICM, the Census Bureau will first identify a sample of block clusters containing approximately 2 million housing units and then will independently develop a new list of addresses for those blocks.3 In a second stage, a sample of block clusters will be drawn from the initial sample to obtain approximately 750,000 housing units, which was the number originally planned for the ICM. (Larger block clusters will not be drawn in their entirety; they will first be subsampled to obtain sampling units of 30–50 housing units. Because the costs of interviewing are so much greater than the costs of listing addresses, this subsampling approach allows the interviewed housing units to be allocated in a more effective manner.) Finally, in a third stage, a sample of block clusters will be drawn from the second-stage sample to obtain the approximately 300,000 housing units required for the A.C.E. sample. The target of 300,000 housing units for the A.C.E., which may be modified somewhat, will be based on a new set of criteria that are not yet final.

1. See National Research Council (1999b).

2. See Kostanich et al. (1999).

3. The use of the term block cluster refers to the adjoining of one or more very small blocks to an adjacent block for the purpose of the A.C.E. sample design. Large blocks often form their own block clusters.

The Census Bureau is considering three strategies for selection of the 300,000 A.C.E. subsample from the 750,000 sample: (1) reducing the sample proportionately in terms of state and other block characteristics from 750,000 to 300,000; (2) reducing the sample by using varying proportions by state; or (3) differentially reducing the sample by retaining a higher proportion of blocks in areas with higher percentages of minorities (based on the 1990 census).4 These options for selection of the 300,000 A.C.E. housing units from the 750,000 units first selected will be carefully evaluated. The plans include three evaluation criteria for assessing the options: (a) to reduce the estimated coefficients of variation for 51 poststratum groups (related to the 357-cell poststratification design discussed below); (b) to reduce the differences in coefficients of variation for race/ethnicity and tenure groups; and (c) to reduce the coefficients of variation for estimated state totals. (Option (3) above is motivated by criterion (b).) Without going into detail, it is also useful to mention that the Census Bureau has instituted a number of design changes from the 1990 Post-Enumeration Survey for the A.C.E. that will reduce the variation in sampling weights for blocks, which will reduce the sensitivity of the final estimates to results for individual blocks. This represents a key improvement in comparison with the 1990 design.
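
The coefficient-of-variation criteria just described can be made concrete with a toy calculation. The sketch below is not the Bureau’s variance methodology; it treats each illustrative poststratum group’s estimated coverage rate as a simple binomial proportion and shows how shifting a fixed subsample among groups, in the spirit of options (1) through (3), changes the groups’ coefficients of variation. All rates and allocations are invented.

    import math

    def cv_of_coverage_rate(p, n):
        """CV of an estimated proportion p from a sample of size n,
        using the simple binomial variance p * (1 - p) / n."""
        return math.sqrt(p * (1.0 - p) / n) / p

    # Invented coverage rates for three illustrative poststratum groups.
    coverage = {"owners": 0.985, "renters": 0.955, "minority renters": 0.930}

    # Two hypothetical ways of splitting the same 300,000-unit subsample.
    allocations = {
        "proportional": {"owners": 210_000, "renters": 60_000, "minority renters": 30_000},
        "oversample minority areas": {"owners": 180_000, "renters": 60_000, "minority renters": 60_000},
    }

    for name, alloc in allocations.items():
        print(name)
        for group, n in alloc.items():
            cv = cv_of_coverage_rate(coverage[group], n)
            print(f"  {group:18s} n={n:7d}  CV = {cv:.5f}")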

The current plan to produce poststrata involves modification of the 357-cell poststratification design suggested for use in 1990-based intercensal estimation. Current modifications under consideration by the Census Bureau include expansion of the geographic stratification for non-Hispanic whites from four regions to nine census divisions, adding a race/ethnicity group, changing the definition of the urbanicity variable, and adding new poststratification factors, such as mail return rate at the block level. Logistic regression, modeling inclusion in the 1990 census, is being used to help identify new variables that might be useful, as well as to provide a hierarchy of the current poststratification factors that will be used to guide collapsing of cells if that is needed. (In comparison, the analysis that generated the 357-cell poststratification was based on indirect measures of census undercoverage, such as the census substitution rate.)

4. The Census Bureau is aware that mixtures of strategies (2) and (3) are also possible, although such mixtures are not currently being considered.
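
The logistic-regression screening described above can be sketched roughly as follows. This is not the Census Bureau’s model; it is a hedged illustration on entirely synthetic data, using the statsmodels package, of regressing an indicator of inclusion in the census on candidate person- and block-level variables so that well-determined effects can suggest which factors merit a place in the poststratification and which are candidates for collapsing.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 5_000

    # Synthetic candidate predictors (all invented): renter indicator,
    # block-level mail return rate, and an urban indicator.
    renter = rng.integers(0, 2, n)
    mail_return = rng.uniform(0.4, 0.95, n)
    urban = rng.integers(0, 2, n)

    # Synthetic "included in the census" outcome whose true model favors
    # owners and high mail-return blocks.
    true_logit = 1.0 + 2.5 * (mail_return - 0.7) - 0.8 * renter + 0.1 * urban
    included = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

    X = sm.add_constant(np.column_stack([renter, mail_return, urban]))
    fit = sm.Logit(included, X).fit(disp=False)

    # Large, well-determined coefficients point to variables worth carrying
    # into the poststratification; weak ones are candidates for collapsing.
    for name, coef, se in zip(["const", "renter", "mail_return", "urban"],
                              fit.params, fit.bse):
        print(f"{name:12s} coef={coef:+.2f}  se={se:.2f}")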

The Census Bureau plan demonstrates awareness of the interaction of its modification of the 750,000 housing unit sample design with its modification of the 357 poststrata design. (On the most basic level, the sample size allocated to each poststratum determines the variance of its estimate.) The plan also makes clear that even though much of the information used to support this modification process must be based on the 1990 census, it is important that the ultimate design for the A.C.E. survey (and any associated estimation) allows for plausible departures from the 1990 findings. For example, significant differences between the 1990 and 2000 censuses could stem from the change in the surrounding block search for matches, the planned change in the treatment of A.C.E. movers, or changes in patterns and overall levels of household response.

Observations and Comments
Sample Design to Select the 300,000 Housing Units

Because of the need to keep the A.C.E. on schedule by initiating resource allocations that support the independent listing of the 2 million addresses relatively soon, as well as the need to avoid development and testing of new computer software, the Census Bureau has decided to subsample the 300,000 A.C.E. housing units from the 750,000 housing units of the previously planned ICM design. The panel agrees that operational considerations support this decision.

The cost of the constraint of selecting the 300,000 A.C.E. housing units from the 750,000 ICM housing units, in comparison with an unconstrained selection of 300,000 housing units, is modest. While the constrained selection will likely result in estimates with somewhat higher variances, the panel believes that careful selection of the subsample can limit the increase in variance so that it will not be consequential. (By careful selection, the panel means use of the suggested approaches of the Census Bureau, or new or hybrid techniques, to identify a method that best satisfies the criteria listed above.) This judgment by the panel, although not based on a specific analysis by itself or the Census Bureau, takes into account the fact that a large fraction of the 750,000 housing units of the ICM design are selected according to criteria very similar to those proposed for the A.C.E. design.

In addition, the panel notes that the removal of the requirement for direct state estimates permits a substantial reduction in sample size from the 750,000 ICM design in sparsely populated states, for which A.C.E. estimates can now pool information across states. As a result, the A.C.E. design could result in estimates with comparable reliability to that of the previously planned, much larger ICM design.

Given the freedom to use estimates that borrow strength across states, the final A.C.E. sample should reduce the amount of sampling within less populous states from that for the preliminary sample of 750,000 housing units. However, there is a statistical basis either for retaining a minimum A.C.E. sample in each state, or what is nearly equivalent, for retaining a sample to support an A.C.E. estimate with a minimum coefficient of variation. The estimation now planned for the A.C.E. survey assumes that there will be no important state effects on poststratum undercoverage factors. In evaluating the quality of A.C.E. estimates, it will be important to validate this assumption, which can only be done for each state if the direct state estimates are of sufficient quality to support the comparison, acknowledging that for some of these analyses one might pool data for similar, neighboring states. (Identification of significant state effects would not necessarily invalidate use of the A.C.E. estimates for various purposes but would be used as part of an overall assessment of their quality.)

This validation could take many forms, and it is, therefore, difficult to specify the precise sample size or coefficient of variation needed. We offer one approach the Census Bureau should examine for assessing the adequacy of either type of standard. Using the criteria for evaluating alternative subsample designs (i.e., the estimated coefficients of variation for 51 poststratum groups, the differences in coefficients of variation for race/ethnicity and tenure groups, and the coefficients of variation for state totals), the Census Bureau should try out various state minimum sample sizes to determine their effects on the outputs. It is possible that a moderately sized state minimum sample can be obtained without affecting the above coefficients of variation to any important extent. There are a variety of ways in which the assumption of the lack of residual state effects after accounting for poststratum differences could be assessed, including regression methods. We encourage the Census Bureau to consider this important analytic issue early and provide plans for addressing it before the survey design is final.

The panel makes one additional point on state minima. The state minima will support direct state estimates that will be fairly reliable for many states. The Census Bureau should consider using the direct state estimates not only for validation, but also in estimation—in case of a failure of the assumption that there will be no important state effects on undercoverage factors. Specifically, the Census Bureau should examine the feasibility of combining the currently planned A.C.E. estimates at the state level with the direct state estimates, using estimated mean-squared error to evaluate the performance of such a combined estimate in comparison with the currently planned estimates. We understand that the necessity of prespecification of census procedures requires that the Census Bureau formulate an estimation strategy prior to the census, which adds urgency to this issue.
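
One standard way to carry out the combination suggested above, offered here as a hedged sketch rather than a description of what the Bureau planned, is a composite estimate that weights the direct state estimate and the poststratified (synthetic) state estimate inversely to their estimated mean-squared errors. All figures below are invented.

    def composite_estimate(direct, synthetic, mse_direct, mse_synthetic):
        """Weight two estimates inversely to their estimated MSEs.

        As the direct estimate's MSE shrinks, its weight grows toward 1;
        if state effects bias the synthetic estimate, its MSE rises and
        the composite leans more heavily on the direct estimate.
        """
        w = mse_synthetic / (mse_direct + mse_synthetic)
        return w * direct + (1.0 - w) * synthetic

    # Invented figures for one small state (population in thousands).
    direct, mse_direct = 712.0, 9.0          # direct A.C.E. state estimate
    synthetic, mse_synthetic = 705.0, 4.0    # poststratified (synthetic) estimate

    print(round(composite_estimate(direct, synthetic, mse_direct, mse_synthetic), 1))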

Finally, the panel has two suggestions with respect to the criteria used for assessing the A.C.E. sample design. First, there should be an assessment of the quality of the estimates for geographic areas at some level of aggregation below that of states, as deemed appropriate by the Census Bureau. (This criterion is also important for evaluating the A.C.E. poststratification design, discussed below.) Second, the importance of equalizing the coefficients of variation for different poststrata depends on how estimates for specific poststrata with higher coefficients of variation adversely affect the variance of estimated counts for certain areas. Coefficients of variation for poststrata that do not have much effect have less need to be controlled, assuming that the estimates for these poststrata do not have other uses.

Poststratification Plans

The 1990 census adjusted counts used 1,392 poststrata, but post-production analysis for calculating adjusted counts for intercensal purposes resulted in the use of 357 poststrata. The panel believes that the use of these 357 poststrata (and the hierarchy for collapsing poststratification cells) was a reasonable design for 1990, and that, in turn, the 1990 design is a good starting point in determining the poststrata to be used in the 2000 A.C.E. The Census Bureau is considering four types of modifications to the 357 poststrata design, although it has not yet set the criteria for evaluating various poststratification designs. Logistic regression will be used to identify new variables and interactions of existing variables that might be added to the poststratification. Finer poststrata have the advantage of greater within-cell homogeneity, potentially producing better estimates when carried down to lower levels of geographic aggregation. Some gains with respect to the important problem of correlation bias might also occur. However, stratifying on factors that are not related to the undercount will generally decrease the precision of undercount adjustments. The tradeoff between within-cell homogeneity and precision needs to be assessed to determine whether certain cells should be collapsed and whether additional variables should be used.
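
The collapsing decision discussed in the preceding paragraph can be illustrated with a minimal sketch. This is not the Bureau’s collapsing rule; it simply pools one stratification factor (here, region) within any group whose cells fall below a sample-size threshold, trading some within-cell homogeneity for precision. All labels, counts, and the threshold are invented.

    from collections import defaultdict

    # Hypothetical poststrata keyed by (race/ethnicity, tenure, region),
    # with their A.C.E. sample sizes.
    sample_sizes = {
        ("Hispanic", "renter", "Northeast"): 180,
        ("Hispanic", "renter", "South"): 950,
        ("Hispanic", "owner", "Northeast"): 700,
        ("Hispanic", "owner", "South"): 1200,
    }

    MIN_SAMPLE = 300   # hypothetical minimum sample per cell

    def collapse_over_region(cells, min_n):
        """Pool regions within a (race/ethnicity, tenure) group whenever any
        of its regional cells is smaller than min_n; otherwise keep detail."""
        groups = defaultdict(dict)
        for (race, tenure, region), size in cells.items():
            groups[(race, tenure)][region] = size
        out = {}
        for (race, tenure), by_region in groups.items():
            if min(by_region.values()) < min_n:
                out[(race, tenure, "all regions")] = sum(by_region.values())
            else:
                for region, size in by_region.items():
                    out[(race, tenure, region)] = size
        return out

    for cell, size in collapse_over_region(sample_sizes, MIN_SAMPLE).items():
        print(cell, size)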

It is also important to examine the effects of various attempts at poststratification on the quality of substate estimates, especially since certain demographic groups are more subject to undercoverage, and so substate areas with a high percentage of these groups will have estimates with higher variances. (This argument is based on the fact that, as in the binomial situation, the mean and the variance of estimated undercounts are typically positively related.) We believe it is extremely important that analyses at substate levels of aggregation be conducted to inform both the sample design and the poststratification scheme. Furthermore, this issue needs to be studied simultaneously with that of the effect of the design and poststratification on the poststratum estimates. The fact that analysis of substate areas appears in both sample design and poststratification design is an indication of the important interaction between these two design elements and justifies the need for studies of them to be carried out simultaneously. The panel encourages the Census Bureau to work on them at the same time.

The panel notes that the decision to use a modification of the 357-strata system from 1990 for the A.C.E. poststratification design will probably not permit many checks against estimates from demographic analysis that use direct estimates from A.C.E. This limitation may increase the difficulty of identifying the precise source of large discrepancies in these comparisons. However, the panel does not view this as a reason not to proceed, since the precision of direct estimates at the finest level of detail of poststratification (using 1,392 strata in this context) could make such comparisons more difficult to interpret, and the estimates from demographic analysis are not extremely useful for this purpose (except for blacks, and then only nationally).

As work on both the sample design and poststratification design progresses, the Census Bureau should not rely entirely on information from the 1990 census: substantial differences might occur between the 1990 and the 2000 censuses that would lead to either a sample design or a poststratification design that was optimized for 1990 but that might not perform as well in 2000. Instead, the Census Bureau should use a sample design that moves toward a more equal probability design than 1990 information would suggest. Similarly, the Census Bureau, using whatever information is available since 1990 on factors related to census undercoverage, should develop a poststratification design that will perform well for modest departures from 1990.

Finally, when considering criteria for both sample design and poststrata, it is important to keep in mind that the goal of the census is to provide estimated counts for geographic areas as well as for demographic groups. Since the use of equal coefficients of variation for poststrata will not adequately balance these competing demands, the Census Bureau will need to give further attention to this difficult issue. The balancing of competing goals is not only a poststratification issue, but also a sample design issue. For example, if block clusters that contain large proportions of a specific demographic group are substantially underrepresented in the A.C.E. sample, the performance of the estimates for some areas could be affected.

Documentation

Given the importance of key decisions and input values for the A.C.E. design, it is important that they be documented. In particular, the Census Bureau should produce an accessible document in print or in electronic form that (1) gives the planning values for state-level, substate level, and poststratum level variances resulting from the decisions for the sample and poststratification designs and (2) provides the sampling weights used in the A.C.E. selection of block clusters.

Summary

From its review of the Census Bureau’s current plans for design of the A.C.E. survey, the panel offers three general comments:

  • The panel concludes that the general nature of the Census Bureau’s work on the A.C.E. design represents good, current practice in sample design and poststratification design and their interactions.

  • The panel recognizes that operational constraints make it necessary for the Census Bureau to subsample the A.C.E. from the previously planned ICM sample. The subsampling, if done properly, should not affect the quality of the resulting design if compared with one that sampled 300,000 housing units that were not a subset of the 750,000 housing units previously planned for the ICM.

  • The panel believes that removal of the constraint to produce direct state estimates justifies the substantial reduction in the A.C.E. sample size from the ICM sample size. The planned A.C.E. could result in estimates with comparable reliability to that of the larger ICM design.

The panel offers three suggestions for the Census Bureau as it works to finalize the A.C.E. design, some of which the Census Bureau is already considering: (1) a method for examining how large a state minimum sample to retain; (2) some modifications in the criteria used to evaluate the A.C.E. sample design and poststratification, namely, lower priority for coefficients of variation for excessively detailed poststrata and more attention to coefficients of variation for substate areas; and (3) a possible change in the A.C.E. estimation procedure, involving use of direct state estimates in combination with the currently planned estimates. In addition, the Census Bureau should fully document key decisions for the A.C.E. design.

The panel looks forward to continuing to review the A.C.E. design and estimation as the Census Bureau’s plans are further developed. The panel is especially interested in the evolving plans for poststratification design, including the use of logistic regression to identify additional poststratification factors; plans for the treatment of movers in A.C.E.; and the treatment of nonresponse as it relates to unresolved matches in A.C.E. estimation. In addition, after data have been collected, the panel is interested in the assessment of the effect of nonsampling error on A.C.E. estimation and the overall evaluation criteria used to assess the quality of A.C.E. estimates.

We conclude by commending you and your staff for the openness you have shown and your willingness to discuss the A.C.E. survey and other aspects of the planning for the 2000 census.

A.4.b November 2000 Letter Report

Dr. Kenneth Prewitt

Director

U.S. Census Bureau

Room 2049, Building 3

Washington, DC 20233

Dear Dr. Prewitt:

This letter comments on the Census Bureau’s plans for deciding whether to release adjusted estimates of the population of states and substate areas from the 2000 census. That decision will be made in March 2001 so that the Bureau can meet its April 1 deadline under Public Law 94-171 for providing data to the states for redrawing congressional districts.

This is the second letter report of our Panel to Review the 2000 Census. The first report, issued in May 1999, commented on the Bureau’s plans for the design of the Accuracy and Coverage Evaluation (A.C.E.) Survey. The results of that survey will be used along with other information to evaluate the census, and if deemed beneficial, will be combined with census data, using dual-systems estimation methods, to yield adjusted estimates of the population. In this letter report we comment on the Bureau’s plans for evaluating the census data and the A.C.E. data and for deciding whether to release both the census population estimates, unadjusted for coverage errors, and the adjusted estimates.

The Census Bureau presented its plans for the evaluation and decision process that will lead up to the adjustment decision at an open panel workshop on October 2 in Washington, D.C. Bureau staff provided 16 papers that are now in draft form. The papers contain table shells that will be filled in, as data become available over the next few months, with information that is important for the evaluation and decision. The papers cover a variety of topics, including: overall census and A.C.E. quality indicators, quality of census processes, demographic analysis results, person interviewing, person matching and follow-up, missing data, variance estimates by size of geographic area, correlation bias, synthetic assumptions, and other topics. Bureau staff discussed the draft papers and responded to questions and comments from panel members and other workshop participants.

The panel commends the Census Bureau for the openness and thoroughness with which it has informed the professional community of the kinds of evaluations that it plans to conduct of the census and A.C.E. data prior to March 2001. The papers presented at the panel workshop provide evidence of the hard work and professional competence of Census Bureau staff in specifying a series of evaluations that can inform the adjustment decision.

The panel recognizes the difficult task faced by the Census Bureau in evaluating the census and A.C.E. data by the time that it must provide congressional redistricting data to the states. Since it will not be possible for the Bureau to complete all possible analyses by March 2001, it will have to act on the basis of analyses that can be conducted before that time. In view of that constraint, the panel concludes that the set of papers presented to the workshop in draft form reflects competent, professional work to develop an informative set of evaluations for the short term. The planned analyses appear to cover all of the evaluations that can reasonably be expected to be completed within the time available. Furthermore, they appear to be sufficiently comprehensive that they will likely provide support for a reasonably confident decision on adjustment in March.

However, since the numbers themselves, which are, of course, critical to the evaluation process, are not yet available, it is not possible at this time to comment on what the adjustment decision should be nor to conclude definitively that the planned short-term evaluations will be adequate to support the decision. Such commentary will be possible only after the Bureau has completed its work and has provided the supporting data to the professional community.

In addition to the evaluations that are planned specifically to inform the adjustment decision in March 2001, the Census Bureau has a longer term evaluation agenda that includes projects to assess all major systems used in the 2000 census and many aspects of census data quality. That agenda, which will take several years to complete, also includes evaluations that are related to the A.C.E., such as a study of error in the process by which census enumerations and A.C.E. enumerations are matched in A.C.E. blocks. The panel urges the Census Bureau to identify those longer term studies that are likely to provide useful information with which to evaluate the adjustment decision, to give priority to these evaluations, and to provide detailed plans for these evaluations and a schedule that allows for completing them as soon after March 2001 as possible. Users of census data need to know when and what kinds of evaluations will be available in the longer term, just as they have been made aware of the Bureau’s plans for the evaluations to be completed by March.

The short-term evaluations that are planned to inform the adjustment decision in March will provide voluminous, complex data and analyses on a range of aspects of the census and the A.C.E. Review and assessment of this necessarily complex set of information will present a challenge for the Census Bureau’s Executive Steering Committee for A.C.E. Policy (ESCAP), which is charged to recommend an adjustment decision to the Bureau director, as well as for the professional community and stakeholders. The panel believes it would be useful for all concerned parties for the Census Bureau to develop a summary tabular presentation of the factors affecting its decision.

One of the Census Bureau’s 16 papers, “Data and Analysis to Inform the ESCAP Recommendation,” is intended to summarize the analyses and the approach; it should usefully serve this purpose. However, that paper is itself lengthy, and we encourage the Bureau to provide in addition a summary table that focuses on key pieces of evidence in the decision-making process. The summary table would indicate for each piece of evidence such information as the nature of the empirical findings, the type(s) of analysis that were the basis for the findings, the strengths and weaknesses of the analysis, and the implications of the findings for the adjustment decision. The summary should also note relevant types of evidence that are not yet available.

We understand that no simple formula will lead from the available data to a recommendation whether or not to issue adjusted census counts by March 2001 for use in congressional redistricting. A sound recommendation will ultimately rest on professional judgment informed by the available scientific evidence. However, we believe that preparation of a summary presentation of key evidence will assist ESCAP to integrate what will necessarily be a large volume of complex information, some of which may be conflicting, in reaching an adjustment decision. Such a presentation will also assist the professional community and stakeholders to understand the basis for the decision.

In summary, the panel appreciates the openness and professionalism with which you and your staff have undertaken to provide an extensive set of data for use in determining the quality of the enumerated census and, alternatively, of any adjustments that might be made to improve the census data. In furthering your efforts, we make the following two suggestions that were discussed above:

  1. The Census Bureau should prioritize its plans for previously planned long-range evaluation studies with a view toward completing those evaluations most directly relevant to the adjustment decision as soon as possible. While we understand that some studies, especially those involving new data collection, cannot be concluded before the statutory requirements for release of the data, we believe it would be useful for you to produce a schedule that allows for completing them soon after March 2001. The Bureau should make public its plans for these evaluations and their release.

  2. Although the Census Bureau has developed plans for comprehensive review of large bodies of data for use in its decision on whether to release adjusted census counts, the panel believes that it would also be useful to have a summary presentation of key evidence. In particular, a summary table listing each piece of evidence and how it relates to the adjustment decision would be helpful to all parties concerned.

We conclude by thanking you and your staff for your cooperation in providing information for our workshop on the forthcoming decision process.

A.4.c November 2001 Letter Report

The panel issued a third letter report in November 2001, regarding the Census Bureau’s October 2001 decision not to adjust 2000 census data for such purposes as fund allocation. This letter is not reproduced here since it was printed in full in our interim report (National Research Council, 2001a).
