Planning the 2010 Census: Second Interim Report (2003)


5 The 2003 and 2004 Census Tests

An important part of planning for the decennial census is testing: trying out new procedures and techniques in order to finalize the census design before the count begins. A regular feature of the census process since the 1940 census, the Census Bureau's program of intercensal tests has pursued several major directions (Bailar, 2000):

· major changes in census methodology (most notably the conversion to mailout/mailback as the dominant mode of census collection and the use of sampling);
· techniques to improve coverage and to better measure census coverage;
· optimal questionnaire wording and format;
· new technology; and
· improved census processing.

From all indications, the Census Bureau is not eager to repeat the experience of the 2000 census, in which the lateness in reaching a general census design limited the effectiveness of operational testing. Under the heading "Lessons Learned from Census 2000," Waite (2002) emphasized the importance of effective testing: "If we want to achieve our Census 2010 Goals, operational testing of design infrastructure must start early in the decade and continue through the Dress Rehearsal." In particular, the census dress rehearsal, typically held 2 years prior to census deployment, should properly be a comprehensive run-through of census machinery to fine-tune the final census design. However, in 1998, the dress rehearsal had to serve as a feasibility test for three quite different general designs, involving different levels of sampling techniques (National Research Council, 2001a).

The Census Bureau's proposed plans for the 2010 census, particularly the elimination of the census long form, are sufficiently different from the plans for the 2000 census that all possible opportunities for testing design options must be fully exploited in order to finalize an effective design for 2010. As depicted in Table 1-1, milestones in the 2010 planning process include major census tests roughly every other year leading up to 2010. The Census Bureau shared plans for the 2003 census test with the panel at its September 2002 meeting and in subsequent discussions. However, the plans were shared with us too late to allow us to suggest, or for the bureau to effect, any meaningful change in the 2003 test plan. Plans for 2004 and 2006 are still under development, and the panel looks forward to continued work with the Census Bureau on those plans. In this chapter, we offer comments on the 2003 test and initial comments on the 2004 test; we will revisit the 2004 and 2006 tests in greater detail in our final report.

2003 NATIONAL CENSUS TEST

As presented to the panel, the 2003 National Census Test will be a nationwide test involving 250,000 households. It is strictly a mailout test; no field enumerators will be used to conduct nonresponse follow-up, thus distinguishing this test from the proposed 2004 Census Field Test, which will have a follow-up component. Households selected for inclusion in the test were set to be notified by an advance mailing in late January (U.S. Census Bureau, Public Information Office, 2003a). The 2003 test focuses primarily on two issues:

· Multiple Self-Response Modes. Offering respondents the opportunity to respond by mailing the questionnaire back (the traditional method), by filling out an Internet version of the questionnaire, or by responding to the questionnaire via an automated touch-tone telephone system known as interactive voice response (IVR).
· Race and Ethnicity Question Wording. Altering the way in which the questions on race and Hispanic origin are presented, including the omission of the "some other race" category currently offered as an alternative under race.

Prior to the 2000 census, it was hoped that a second mailing of replacement questionnaires could be used to increase responses to the initial mailed questionnaire; however, logistical concerns dictated that it be dropped from the 2000 census plan. The 2003 census test is not directly a test of the use of targeted replacement questionnaires; however, the current plan is to use replacement questionnaires as part of the limited (not in-person) follow-up process. Hence it is thought that the test could yield some insight as to the effectiveness of replacement questionnaires in improving response.

Plans call for the test to consist of 16 experimental groups, in each of which the households will be sent a letter in advance of the actual questionnaire delivery. A control group will be eligible for a replacement questionnaire in nonresponse follow-up but will have the race and ethnicity questions as worded in the 2000 census. Eight treatment groups will be used in the response mode portion of the test; each group will vary on whether IVR or Internet response is mentioned on the initial questionnaire or encouraged in a reminder postcard. An additional seven treatment groups on race and ethnicity wording will round out the experimental groups. Each of these groups has slight variations on the wording of the questions: for example, whether "some other race" is allowed as a possible response or whether examples of possible Hispanic origin groups are listed in the question. A particularly subtle variation involves whether respondents are asked "Is Person x Spanish/Hispanic/Latino?" versus the alternative "Is Person x of Spanish, Hispanic, or Latino origin?"
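As a purely illustrative sketch of this structure, the assignment of sampled households to the 16 groups might be organized as follows, in Python. The panel labels, the round-robin allocation, and the toy sample are our own assumptions for exposition, not the Census Bureau's specifications.

```python
import random

# Illustrative labels for the 16 experimental panels (assumed, not official).
CONTROL = ["control: 2000-style questions, replacement questionnaire eligible"]
RESPONSE_MODE_PANELS = [f"response mode variant {i}" for i in range(1, 9)]        # 8 panels
RACE_ETHNICITY_PANELS = [f"race/ethnicity wording variant {i}" for i in range(1, 8)]  # 7 panels
PANELS = CONTROL + RESPONSE_MODE_PANELS + RACE_ETHNICITY_PANELS                    # 16 in all

def assign_panels(households, seed=2003):
    """Randomly assign households to the 16 panels within each response-rate stratum.

    `households` is a list of (household_id, stratum) pairs, where the stratum is
    'high' or 'low' mail response in the 2000 census, mirroring the test plan's
    high/low response-rate grouping.
    """
    rng = random.Random(seed)
    assignment = {}
    for stratum in ("high", "low"):
        ids = [h for h, s in households if s == stratum]
        rng.shuffle(ids)
        # Deal households round-robin across panels so each panel receives a
        # roughly equal share of the stratum.
        for i, hid in enumerate(ids):
            assignment[hid] = PANELS[i % len(PANELS)]
    return assignment

# Example: a toy frame of 32 households, half in each stratum.
frame = [(f"HH{i:03d}", "high" if i % 2 else "low") for i in range(32)]
print(assign_panels(frame)["HH001"])
```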

From a purely conceptual point of view, developing a test strategy for a new decennial census design should flow from and build upon the experience of the previous census. In this case, evaluations from the 2000 census would be used to identify problem areas (parts of the census process in which procedures may not have performed as expected or in which problems occurred), and tests would be devised to evaluate alternative routines. In this context, the selection of the preceding two topics for the first major census test leading up to the 2010 census is intriguing but somewhat confusing. That the Census Bureau should investigate the effectiveness of accommodating responses by phone and Internet is clear, but whether the issue merits a large, resource-intensive test this early in the decade is a valid point of debate.

With regard to both topics, it could be argued that resources might better be channeled into completing evaluations of the 2000 census that could guide more effective targeted testing. For example, one might naturally prefer to exploit data already in hand to answer key questions before launching a large-scale study: How many respondents submitted both Internet and paper responses? What was the level of item nonresponse in the limited number of Internet responses relative to mail responses? Is there evidence that the restrictive security measures surrounding the 2000 census Internet option deterred respondents from answering via the Internet? How many Internet respondents started to provide data but did not complete the instrument? How does the population that identified itself as "some other race" compare to other subgroups, and why is dropping the possible response "some other race" a useful or meaningful possibility?

Another consideration for timing the tests of Internet responses, in particular, is that this is an extremely dynamic area, with tremendous changes in the uptake of Internet-capable home computers, high-speed connections, and other technologies. It would be risky to predicate a test of methods for 2010 on any predictions that could be made now, other than that current state-of-the-art technologies will be more widely dispersed and new technologies will have appeared. In fact, even without a test of Internet technologies, we probably could say more today about results that will be obtained in 2003 than we can about the relationship between those results and what will actually be found in 2010. Thus, data on Internet response collected in 2003 might contribute little to the planning process, relative to competing activities.

A primary concern for the panel was the design of the race and ethnicity component of the test. The plan for this portion involves seven treatment groups that, like the control and response mode treatment groups, are stratified only by their mail response rate in the 2000 census (specifically, they are grouped into "high" and "low" response groups based on a selected cutoff). The panel's concern is whether this design is sufficient to answer the primary questions of interest. The differences in the questions administered to the different treatment groups are often quite subtle (the distinction between being "Hispanic" versus "of Hispanic origin," for example). Hence it is unclear whether the sample design will generate enough coverage in Hispanic communities to facilitate conclusive comparisons, whether it will reach enough of a cross-section of the populace to gauge sensitivity to slight changes in question wording, and whether it will reach a sufficiently heterogeneous mix of Hispanic nationalities and origins to decide whether including instructions or examples improves response.

To this end, there are two methods by which the effective number of respondents relevant to this question could be augmented. First, either through selection of test sites or through oversampling of blocks within test sites, the number of respondents who are likely to have relevant characteristics can be increased. Second, given that the two major treatment factors being examined (response mode and questionnaire wording) are likely to be relatively independent (orthogonal) in action, it seems reasonable to completely cross the two treatment factors in this experiment. By varying response mode and race/ethnicity wording for the complete sample, power to distinguish between alternatives may be gained relative to the current design, which effectively runs separate experiments.
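To make the distinction concrete, the following sketch (in Python, with factor levels that are our own simplified assumptions rather than the actual 2003 treatments) enumerates the cells of a completely crossed two-factor design within the two response-rate strata.

```python
from itertools import product

# Assumed factor levels for illustration only; the actual treatments in the
# 2003 test plan differ in number and in detail.
response_mode = ["mail only", "mail + Internet push", "mail + IVR push"]
wording = ["2000-style", "no 'some other race'", "origin examples listed"]
strata = ["high response", "low response"]

# A fully crossed design assigns sample to every cell of the
# response_mode x wording grid within each stratum, so both factors (and, if
# needed, their interaction) can be estimated from one sample.
cells = list(product(strata, response_mode, wording))
for stratum, mode, word in cells:
    print(f"{stratum:14s} | {mode:22s} | {word}")
print(f"{len(cells)} design cells in total")
```

In such a design the full sample informs both factors at once, which is the sense in which crossing may yield more power than the current plan's effectively separate experiments.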

Also, the alternative race and Hispanic origin questionnaire wordings planned for the 2003 test are variations along three different factors: identification as "Hispanic" versus "of Hispanic origin," inclusion of "some other race" as a valid choice, and provision of examples of detailed Hispanic origin. The panel is concerned that these alternatives are relatively narrow and, as a result, may not provide sufficient information on question formats for use in 2010. It is also unclear that these particular selections are derived from an established research base. For the purposes of a true experiment, it is our view that it may be useful to consider even more radical question alternatives, such as folding race and Hispanic origin into a single question in a "check one or more" format. Such an alternative may in fact not be tenable as a choice in 2010: it may force higher levels of imputation for groups who select Hispanic but no legally defined race category, and it may be counter to Office of Management and Budget guidelines on race and ethnicity categorization. But from a research perspective, such might be the best way to determine how people prefer to categorize themselves, and the results may inform strategies for analysis and tabulation of final census results.

2004 CENSUS FIELD TEST

In October 2002, the Census Bureau announced the selection of sites for the 2004 Census Field Test. The selected sites are Colquitt, Thomas, and Tift counties in Georgia; Lake County in Illinois; and a portion of northwestern Queens County in New York. Original plans called for the 2004 test to involve approximately 450,000 housing units across the various sites; since the test involves a field work component, approximately 3,000 temporary jobs will be created to conduct the test (U.S. Census Bureau, Public Information Office, 2002). As discussed at the panel's September 2002 meeting, still-developing plans for the 2004 test call for work on at least seven different topic areas. Test sites were selected to try to obtain a variety of geographic types (urban/suburban/rural) and racial groups (Waite, 2002). Though field work will be done in each of the test sites, and, in some respects, the activity will almost seem to be a census in miniature, the Census Bureau is not promising, or even offering, participating sites a population count at the end of the test.[1]

[1] Under the fiscal year 2004 budget proposed by President Bush in January 2003, the Census Bureau would scale down the 2004 test to drop Lake County, Illinois, as a test site and to reduce the planned workload in Queens County, New York (Lowenthal, 2003b).

Mobile Computing Devices

A clear, primary thrust of the 2004 test is work with mobile computing devices (MCDs) for field work. To date, the Census Bureau's testing of MCDs has consisted of small pilot tests of basic skills. For instance, small numbers of field staff with different levels of local familiarity were assigned to locate an assigned set of addresses on TIGER-based maps on a Pocket PC-class device in a pilot test in Gloucester County, Virginia. This test concentrated only on locating features using a small-screen map and not on using the computer to calculate a route to those features (U.S. Census Bureau, Mobile Computing Device Working Group, 2002).

The Census Bureau's hope is that the 2004 test will be a more comprehensive test of MCD capabilities, including use of global positioning system (GPS) receivers and computer-assisted personal interviewing software to administer short-form interviews in English or Spanish. As we commented in Chapter 2, prototyping of MCDs in 2004 is very important for getting a sense of current capabilities, but it is more important to clarify the requirements and information flows associated with the devices. That said, we hope that sufficient information is gained about the process of MCD and GPS use during the 2004 test to inform final design decisions later in the decade.

Race and Hispanic Origin Questionnaire Wording

The Census Bureau also intends to follow up its work in the 2003 mailout-only test by including alternate wordings of the race and Hispanic origin census questions in the 2004 field test; these proposed wordings were discussed earlier. It is unclear whether the full range of alternatives will be worked into the 2004 test; given the panel's concern that the 2003 test alone is unlikely to provide definitive evidence in favor of any one of the subtle alternatives under consideration, we hope that the 2004 test is not used merely to test one "favored" alternative from the 2003 results.

Foreign Language Questionnaires

In the 2000 census, respondents requiring a questionnaire in a language other than English could request it from the Census Bureau, from local census offices, or from follow-up enumerators. The 2004 test will be a first test for some proposed improvements, including test deployment of a dual-language (English and Spanish) questionnaire. Delivery of the English/Spanish questionnaires is intended to be done in a targeted manner (e.g., based on high concentrations of Hispanic-origin responses in the 2000 census for a particular tract or block group).
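The precise targeting rule has not been described to the panel; the short sketch below (in Python, with an assumed 25 percent threshold and hypothetical field names) merely illustrates the kind of block group-level rule that targeted delivery implies.

```python
# Hypothetical threshold: flag a block group for English/Spanish forms when the
# share of households reporting Hispanic origin in the 2000 census exceeds it.
BILINGUAL_THRESHOLD = 0.25  # assumed value, not from the 2004 test plan

def needs_bilingual_form(block_group):
    """Return True if a block group would receive dual-language questionnaires.

    `block_group` is a dict with assumed keys 'hispanic_origin_households'
    and 'total_households' drawn from 2000 census tabulations.
    """
    total = block_group["total_households"]
    if total == 0:
        return False
    share = block_group["hispanic_origin_households"] / total
    return share >= BILINGUAL_THRESHOLD

# Example block group (illustrative numbers only).
example = {"hispanic_origin_households": 412, "total_households": 980}
print(needs_bilingual_form(example))  # True: 42% exceeds the assumed threshold
```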

Other 2004 Test Topics

Other topics have been identified for testing in 2004; however, the panel has not yet seen information on them, or, more precisely, on how exactly the Census Bureau plans to test them, in fuller detail than bulleted lists of possible directions. Consequently, further comment on them awaits further interaction between the panel and the Census Bureau as plans continue to take shape. These general topics include:

· Field-Based Coverage Improvement. Strategies for reducing person duplication by clearer explication of residence rules and by better tracking of housing unit occupancy status during nonresponse follow-up.
· Targeted Canvass to Update MAF. Use of administrative records and the Census Bureau's housing unit estimates program to identify localities with potential MAF coverage problems and comparison of address canvasses in those areas with existing MAF coverage.
· Contact Strategy for Self-Response. Strategies for reminding and encouraging respondents to submit their questionnaires on their own (as opposed to having to be contacted by a field enumerator during nonresponse follow-up). One such strategy, a targeted mailing of a second questionnaire, is to be tested in both the 2003 Census Test and the 2004 Field Test.
· Special Place/Group Quarters. Development and testing of new definitions for group quarters and refinement of the techniques used to list group quarters for enumeration.

Counting Americans Overseas

Distinct from the 2004 Census Field Test, the Census Bureau has announced plans for another test program for 2004. The Overseas Enumeration Test would attempt to count American citizens residing in France, Kuwait, and Mexico. The test will rely on a publicity campaign to be mounted using English-language media in the three countries. Potential respondents would be urged to request that a questionnaire be mailed to them, to pick up a questionnaire at an embassy or consulate, to obtain a questionnaire from Census Bureau "partner organizations that serve Americans overseas," or to complete the questionnaire via the Internet (U.S. Census Bureau, Public Information Office, 2003b). The panel has not seen any more detailed information on this test, which is an apparent reaction to concerns raised in litigation shortly after the release of reapportionment totals.[2]

Timeline

Under the basic timetable presented in Waite (2002), active preparation for the 2004 Census Test will begin in April-June 2003 with the hiring and training of staff. Address canvassing and the authoring of initial test evaluations are scheduled for late 2003, as is questionnaire printing. Questionnaires are scheduled for mailout in early 2004, with the reference date (Census Day) set at April 1, 2004; questionnaire check-in, data capture, and nonresponse follow-up continue through July 2004. Processing of the results and continued drafting of evaluation reports are expected to continue through the rest of 2004 and extend into early 2005.

[2] Under the reapportionment counts issued by the Census Bureau in December 2000, North Carolina was allocated the 435th seat in the U.S. House of Representatives, as prioritized by the method of "equal proportions" used to reapportion the House. A fourth congressional district for Utah was ranked 436th, falling short of the additional seat by less than 1,000 people. Consequently, Utah challenged the reapportionment counts on the basis that the Census Bureau's limited overseas enumeration omitted Mormon missionaries and other private citizens who should have been tallied in the census. However, Utah's case was rejected by a federal appeals court, and the U.S. Supreme Court declined to take the case on appeal (Lowenthal, 2003a). Utah subsequently challenged the Census Bureau's imputation strategy as well but lost that challenge in a U.S. Supreme Court ruling (Utah v. Evans, 536 U.S. 452, 2002).
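For reference, the "equal proportions" ranking cited in the footnote can be computed directly: each state receives priority values of the form population divided by the square root of n(n + 1) for the seat after its nth, and seats beyond the automatic first seat per state are awarded in descending priority order, which is how the 435th and 436th positions are determined. The sketch below, in Python, uses placeholder populations rather than the 2000 apportionment counts.

```python
import math
from heapq import nlargest

def apportion(populations, seats=435):
    """Apportion House seats by the method of equal proportions (Huntington-Hill).

    `populations` maps state name -> apportionment population. Each state gets
    one seat automatically; the remaining seats go to the highest priority
    values pop / sqrt(n * (n + 1)), where n is the number of seats the state
    already holds when the value is computed.
    """
    def priority(pop, n):
        # Priority value for a state's (n+1)th seat, given it currently holds n seats.
        return pop / math.sqrt(n * (n + 1))

    # Generate enough priority values per state, then keep the top (seats - #states).
    values = []
    for state, pop in populations.items():
        for n in range(1, 60):  # no state gains 60+ additional seats
            values.append((priority(pop, n), state))
    counts = {state: 1 for state in populations}              # automatic first seat
    for _, state in nlargest(seats - len(populations), values):
        counts[state] += 1
    return counts

# Placeholder populations for illustration only (not the 2000 census counts).
toy = {"A": 20_000_000, "B": 7_500_000, "C": 2_200_000, "D": 900_000}
print(apportion(toy, seats=30))
```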

