3 Reconsideration of Important Census Bureau Decisions
Pages 37-67



From page 37...
... This should be done with the benefit of evaluation results and data collected from 2000. This chapter discusses the following features involved in these decisions, some of which will be used in 2000 and some not, in roughly the chronological order of their appearance in the census process:
· the decision to carry out a full master address file canvass prior to the census;
· the decision not to move census day;
· the use of multiple response opportunities;
· the use of blanket replacement questionnaires;
· the use of four sampling rates for the long form (assuming use of the long form in 2010);
From page 38...
... The completeness and accuracy of the geographically referenced address list (MAF-TIGER) is important to provide adequate support for key data collection operations planned for the 2000 census:
· mailout and postal delivery of the census questionnaires for mailback return;
· census delivery of questionnaires for mailback return in rural areas;
· unduplication of multiple questionnaire responses from the same household, which results from multiple response options and mailout of replacement questionnaires; and
· enumerator field follow-up for nonresponse, including accurate sampling to achieve 90 percent direct enumeration in each census tract.
From page 39...
... The new plan called for expanded field canvassing operations in 1998 and 1999 in a manner similar to the traditional, blanket canvassing operations used in prior censuses. This effort was to be in combination with an opportunity for local governments to review the Bureau's address list under the Local Update of Census Addresses (LUCA)
From page 40...
... The ability to carry out the above operations with little error depends on (1) the recruitment and supervision of a high quality field staff to carry out the expanded field canvassing operations in the limited time available, (2)
From page 41...
... Finally, the Census Bureau must incorporate the local challenges under LUCA into its field canvassing operations and then provide feedback to local governments on their challenges. It is extremely important that local governments are assured that the needed MAF-TIGER quality has been achieved in order to garner their support for the enumeration operations that follow.
From page 42...
... The panel suggested that LUCA pay special attention to structures that have units without clear or unique labels or units that are not otherwise clearly distinguishable. The panel also suggested that it might even be preferable in some cases to treat an entire structure as the "dwelling unit" for purposes of the MAF, nonresponse follow-up, and integrated coverage measurement (ICM)
From page 43...
... DATE OF CENSUS DAY In preparing for the 2000 census the Census Bureau considered pursuing legislation to move the date of the census from April 1 to mid-March (while retaining the mandated delivery dates of state counts by December 31 and counts for redistricting by the following April 1). The panel regrets that this change was not pursued.
From page 44...
... Finally, it makes the census reference date somewhat ambiguous and may increase the number of situations in which people who move and are in the integrated coverage measurement survey have a different residence for April 1 or later but had a March residence for the census. USE OF MULTIPLE RESPONSE OPPORTUNITIES For the past few censuses there have been complaints from individuals who thought either that their residence had been left off the census mailing list or that they had been omitted from the questionnaire returned for their household of residence.
From page 45...
... to produce an infeasible amount of "unduplication." However, it might be a far greater problem in the 2000 census, because of either increased amounts of undiscovered duplication or a more compressed time schedule, making unduplication either much more time consuming or error prone. Therefore, the results from the 1998 census dress rehearsal in evaluating the primary selection algorithm, which determines which forms are considered to be duplicates, should be used to better understand the problems.
²Another concern is the number of fictitious or incorrect enumerations that are received.
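The report does not spell out the primary selection algorithm itself. As a purely illustrative sketch (the completeness-based rule, the tie-breaking by receipt order, and the field names are all hypothetical, not the Bureau's actual procedure), a routine that keeps one form per household might look like:

```python
from collections import defaultdict

def select_primary(returns):
    """Keep one return per household ID: the form answering the most
    items, with earlier receipt breaking ties.

    `returns` is a list of dicts with 'household_id' and
    'answered_items' keys. Hypothetical rule for illustration only.
    """
    by_household = defaultdict(list)
    for order, form in enumerate(returns):
        by_household[form["household_id"]].append((order, form))
    selected = []
    for forms in by_household.values():
        # Most answered items wins; smaller (earlier) order breaks ties.
        best = max(forms, key=lambda t: (t[1]["answered_items"], -t[0]))
        selected.append(best[1])
    return selected
```

The real algorithm must also cope with fictitious and incorrect enumerations, which no simple completeness rule addresses.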
From page 46...
... The 1998 dress rehearsal will provide some important evidence as to the value of blanket replacement questionnaires, and final decisions about their use in the 2000 census should not be made until the dress rehearsal experience is evaluated. In addition, the Census Bureau should determine early in the next decade whether it will be technologically feasible to use a targeted replacement questionnaire in the 2010 census, since that is the strongly preferred procedure.
From page 47...
... with fewer than 800 housing units, while the new 1-in-4 rate will be applied to governmental units with 800 to 1,200 housing units. Governmental units that exceed this size will have sampling rates set by census tract.
From page 48...
... Early plans for the 2000 census called for follow-up visits to a 1-in-10 sample of postmaster returns identified as vacant and estimation for the other 90 percent, in an operation separate from the main nonresponse follow-up. In census tests in Oakland, California, and Paterson, New Jersey, 66 and 59 percent, respectively, of households initially identified as vacant were in fact discovered to be vacant. It therefore makes sense to handle vacant units from postmaster returns separately from the main nonresponse follow-up, both to reduce the time between census day and follow-up operations and, in the event of a targeted replacement questionnaire, to avoid the cost of mailing replacement questionnaires.
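The arithmetic of visiting a 1-in-10 sample and estimating for the remaining 90 percent can be illustrated with a toy systematic subsample and inverse weighting (the data are made up; the real operation would use the design weights of the census plan):

```python
def estimate_occupied(uaa_units, take_every=10):
    """Estimate how many postmaster-returned ('vacant') units are in
    fact occupied by field-visiting only every `take_every`-th unit and
    weighting each sampled finding up by `take_every`.

    `uaa_units` is a list of booleans: True if a field visit would find
    the unit occupied. Illustrative sketch only.
    """
    sample = uaa_units[::take_every]          # systematic 1-in-10 subsample
    occupied_in_sample = sum(sample)
    return occupied_in_sample * take_every    # inverse-probability weight
```

With 100 hypothetical units of which 40 are truly occupied, the weighted estimate from the 10-unit subsample recovers a figure in the right range while requiring only a tenth of the field visits.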
From page 49...
... With an occupancy rate of about 30 percent, the variance of household counts for units identified as vacant by postmaster returns should be roughly the same as that for the main nonresponse follow-up sample, arguing for a sampling rate closer to that for nonresponse follow-up.⁶ The panel therefore recommended that optimal design theory guide the choice of higher sampling rates for units that postmasters identified as vacant. While optimal design theory would seemingly support a higher UAA-vacant sampling rate than the selected one of 30 percent, since the overall sampling rate for nonresponse follow-up is expected to be roughly 70 percent, it is important to keep in mind that the sampling rate for nonresponse follow-up was not determined solely through optimal design considerations.
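The optimal design argument can be made concrete with a Neyman-type allocation, in which a stratum's sample size is proportional to N_h × S_h: when per-unit standard deviations are roughly equal across strata, the implied sampling rates are roughly equal too. A minimal sketch with made-up stratum sizes (this is textbook allocation, not the Bureau's actual computation):

```python
def neyman_rates(strata, total_sample):
    """Neyman-optimal allocation: stratum sample size proportional to
    N_h * S_h, returned as sampling rates n_h / N_h.

    `strata` maps stratum name -> (N_h, S_h), where N_h is the stratum
    size and S_h the per-unit standard deviation. Rates can exceed 1
    for extreme inputs; illustrative numbers only.
    """
    weight = {h: N * S for h, (N, S) in strata.items()}
    total = sum(weight.values())
    return {h: (total_sample * weight[h] / total) / strata[h][0]
            for h in strata}
```

With equal per-unit standard deviations in the two strata, a total sample of 840 out of 1,200 units yields the same 70 percent rate in both, mirroring the argument that roughly equal variances call for roughly equal rates.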
From page 50...
... The current proposed sampling rate of 3 in 10 is therefore preferred. USE OF NONRESPONSE FOLLOW-UP TO DIRECTLY ENUMERATE AT LEAST 90 PERCENT OF HOUSEHOLDS In 1990 the Census Bureau conducted nonresponse follow-up of all housing units that failed to respond by mail to the census questionnaire.
From page 51...
... The present plan ensures low coefficients of variation from the use of sampling for nonresponse follow-up for relatively small units of census geography. The panel understands and endorses the Census Bureau's desire to use a plan with this property, even at the expense of retaining a relatively high overall rate of nonresponse follow-up with the associated limits on cost savings and quality improvements.⁸ Recommendation 3.6: The Census Bureau should explore the advantages of sample designs for nonresponse follow-up that do not require a predetermined response rate and that can therefore achieve near equity in coefficients of variation across regions, regardless of initial response rates.
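One way to read Recommendation 3.6: rather than fixing the follow-up rate in advance, each region could be given the rate that achieves a target coefficient of variation. Under a simple SRS variance formula for an estimated proportion, Var(p̂) = p(1−p)(1/m − 1/M) for a subsample of m out of M nonresponding households (an assumption for illustration, not the Bureau's design), the required fraction solves in closed form:

```python
def followup_fraction(nonrespondents, target_cv, p=0.5):
    """Follow-up sampling fraction m/M such that a proportion p
    estimated from the subsample has roughly the target coefficient
    of variation.

    From CV^2 = (1-p)/p * (1/m - 1/M), so 1/m = CV^2 * p/(1-p) + 1/M.
    Simple SRS sketch, not the Bureau's design.
    """
    M = nonrespondents
    inv_m = target_cv ** 2 * p / (1 - p) + 1 / M
    m = 1 / inv_m
    return min(1.0, m / M)
```

A region with 10,000 nonrespondents needs only a 20 percent follow-up rate to hit a 2 percent CV, while a region with 1,000 needs over 70 percent: exactly the equity-in-CV behavior that a single predetermined rate cannot deliver.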
From page 52...
... The Census Bureau has decided to use sequential nearest-neighbor hot deck imputation, the methodology that has a long history of use in the decennial census in treating various forms of nonresponse. Given its relative ease of use, its success in the past, and the fact that geographical proximity is usually a relatively strong predictor of similarity of race and housing type, it is a sensible choice.
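A minimal sketch of the sequential hot deck idea, ignoring the matching on household characteristics that the census procedure adds: records are processed in geographic sort order, and each missing value is filled from the nearest preceding observed record, so geographic proximity does the predictive work:

```python
def hot_deck_impute(values):
    """Sequential nearest-neighbor hot deck: walk records in geographic
    sort order, carry the most recently observed value as the 'donor',
    and fill each missing (None) entry from it.

    Records before any donor appears are filled from the first observed
    value. Minimal sketch; the census procedure also conditions on
    characteristics such as housing type.
    """
    donor = next((v for v in values if v is not None), None)
    out = []
    for v in values:
        if v is None:
            out.append(donor)
        else:
            donor = v
            out.append(v)
    return out
```

Because the donor is always a nearby real record, imputed values are drawn from the empirical local distribution rather than a model, which is part of the method's appeal.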
From page 53...
... USE OF COMPUTER-ASSISTED PERSONAL INTERVIEWING FOR INTEGRATED COVERAGE MEASUREMENT The Census Bureau has decided that its integrated coverage measurement (ICM) interviewing staff will use laptop computers and computer-assisted personal interviewing (CAPI)
From page 54...
... While the panel cannot offer any suggestions in this difficult area, it hopes that further evaluation based on the dress rehearsal experience will make clear the tradeoffs in the use of CAPI for this important data collection in 2000. TREATMENT OF MISSING DATA IN INTEGRATED COVERAGE MEASUREMENT Plans have been made for dealing with several types of missing data that will appear in the ICM process, with an emphasis on simplicity of methodology.
From page 55...
... If missing-data rates are shown to have been low in the 1998 census dress rehearsal, the panel considers this decision appropriate, but it should be reconsidered if missing-data rates are at the 1990 levels. DEMOGRAPHIC ANALYSIS AND INTEGRATED COVERAGE MEASUREMENT Given current plans, there will be two methods available to assess the amount of net undercoverage in the 2000 census: integrated coverage measurement and demographic analysis.
From page 56...
... in a sample of blocks or block clusters, using an independently created list of housing units. Residents of a housing unit are asked who lived there on census day, using an interview with special probes designed to elicit as complete a roster of household members as possible.
From page 57...
... As in the PES, in the Census Plus coverage measurement survey interviewers go to a sample of housing units and first ask the residents who lived there on census day. Census Plus adds a second phase to the interview in which the interviewer attempts to reconcile the roster from the first phase of the interview with the initial census enumeration that has been loaded into the interviewer's laptop computer, to obtain a "resolved roster." Little or no follow-up is conducted after this two-phase interview.
From page 58...
... For this reason, procedures for the 1995 and 1996 test censuses defined the PES sample as consisting of people resident in PES sample housing units on census day ("PES-A"); when a household moves shortly after census day, the PES requires finding and interviewing the family that moved out. The Census Bureau's plan for the 1998 census dress rehearsal called for use of a hybrid, third method ("PES-C")
From page 59...
... However, the ultimate sample size was only about 160,000 housing units, which necessitated borrowing of information across states to obtain state-level estimates of marginally acceptable precision. Given the 750,000 housing unit PES currently planned for the 2000 census, which will permit useful estimates at lower geographic levels than the planned PES in 1990, there should be less need for smoothing.
From page 60...
... is based on estimates of the number of poor children in counties, and regression models that blend information across counties and states are used to allocate considerable amounts of federal funds to counties and states (for a description of the method, see National Research Council, 1998). Also, with the use of sequential hot deck imputation, it is clear that at some small level of geography, the analogue of the same-state constraint is not adhered to.
From page 61...
... At the other extreme, a criterion of equal variance of direct population estimates for every state would imply larger sampling rates (and therefore disproportionately larger sample sizes) in larger states.
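The point can be checked numerically: under simple random sampling the variance of an estimated state total is roughly N²S²/n (ignoring the finite-population correction), so holding that variance equal across states forces n to grow with N², meaning the sampling rate itself grows with N. Illustrative numbers only:

```python
def sample_for_equal_variance(N, S, target_var):
    """Sample size n making Var(N * ybar) = N^2 * S^2 / n equal to
    target_var, under SRS with the finite-population correction
    ignored. Illustrative sketch with made-up numbers.
    """
    return (N ** 2) * (S ** 2) / target_var
```

For a state ten times larger (10 million versus 1 million housing units, same per-unit variability), equal variance of the direct estimate demands a sample 100 times larger, i.e., a sampling rate 10 times higher: a disproportionately larger sample in the larger state.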
From page 62...
... The second form of this constraint is restricting the allocation of state population shares to substate areas based only on information from that state. Similar to the argument above, assuming there are consistent patterns to substate variation in adjustment factors, accepting this constraint increases the sampling variance for estimates of substate population shares.
From page 63...
... Although this assumption can only be approximately true, the synthetic estimates still should be more accurate than direct estimates at low levels of aggregation, since direct estimates would be based on very small samples. Synthetic estimation also has the somewhat conservative property that the adjustment for a poststratum in a small area is never more extreme than that estimated for the poststratum in a larger area, unlike some regression methods that can extrapolate beyond the range of values estimated directly.
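A minimal sketch of synthetic estimation under the constant-factor assumption (the poststratum names, counts, and factors below are made up): each poststratum's census count in the small area is multiplied by the adjustment factor estimated for that poststratum in the larger area, so the small-area adjustment can never be more extreme than the larger-area one.

```python
def synthetic_estimate(small_area_counts, adjustment_factors):
    """Synthetic estimation: scale each poststratum's census count in a
    small area by the adjustment factor estimated for that poststratum
    in a larger area, assuming the factor is constant within
    poststrata. Illustrative numbers only.
    """
    return {ps: small_area_counts[ps] * adjustment_factors[ps]
            for ps in small_area_counts}
```

The small area's estimated total then reflects differential coverage only through its poststratum mix, which is what makes the method usable at levels of aggregation where direct estimates would rest on very small samples.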
From page 64...
... Technically, a log-linear model for adjustment factors is fit to the population data.) The poststratum adjustment factors are used in synthetic estimation as described above, which preserves consistency with direct estimates for substate regions and statewide sociodemographic groups.
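The marginal-consistency property can be illustrated with iterative proportional fitting (raking), a standard way to fit log-linear models of this general kind; this toy two-way version is for illustration and is not the Bureau's actual model:

```python
def rake(table, row_targets, col_targets, iters=50):
    """Iterative proportional fitting: alternately scale the rows and
    columns of a 2-D table of cell counts until its margins match the
    target row and column totals (which must share a grand total).

    The fitted cells stay consistent with both sets of direct marginal
    estimates, analogous to how the adjustment-factor model preserves
    consistency with substate and sociodemographic margins.
    """
    t = [row[:] for row in table]
    for _ in range(iters):
        for i, target in enumerate(row_targets):      # match row sums
            s = sum(t[i])
            t[i] = [x * target / s for x in t[i]]
        for j, target in enumerate(col_targets):      # match column sums
            s = sum(row[j] for row in t)
            for i in range(len(t)):
                t[i][j] *= target / s
    return t
```

After a few dozen sweeps the table reproduces both margins to high precision while preserving the interaction structure of the starting counts.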
From page 65...
... Consequently, the Census Bureau has worked to develop methods to produce a household data file that is consistent with the official person counts produced by integrated coverage measurement, while assigning all persons to realistic households. Isaki et al.
From page 66...
... Accurate assignment of persons to households would benefit from direct evidence about the types of assignment errors made in the basic census enumeration and about the true distribution of household types. The panel encourages the Census Bureau to conduct research on methods for using integrated coverage measurement to estimate the frequency of household-type assignment errors.
From page 67...
... However, the Census Bureau should continue research on production of public-use files that are consistent for persons, housing units, and households, along the lines of current research on a transparent file. Considerable effort should be made to avoid use of a special nonhousehold category in the 2010 census.

