
Experimentation and Evaluation Plans for the 2010 Census: Interim Report (2008)

Chapter: APPENDIX A THE CENSUS BUREAU'S SUGGESTED TOPICS FOR RESEARCH

Suggested Citation: "Appendix A: The Census Bureau's Suggested Topics for Research." National Research Council. 2008. Experimentation and Evaluation Plans for the 2010 Census: Interim Report. Washington, DC: The National Academies Press. doi: 10.17226/12080.


APPENDIX A
THE CENSUS BUREAU'S SUGGESTED TOPICS FOR RESEARCH

The following chart was provided to the panel by the Census Bureau as a partial summary (augmented by several other reports and presentations) of its deliberations on the research topics that should be considered either for experimentation during the 2010 census or for evaluation shortly after. Each topic carries an identification key and is defined by a short series of questions or a brief discussion. Each entry then gives the criteria intended to help rank the topics, beginning with a high-medium-low rating of the topic's overall importance; the remaining criteria are the anticipated impacts on cost and data quality, whether the topic involves a new census component process, and whether it is accomplishable. Finally, each entry indicates whether the topic is better suited to the 2010 or the 2020 census and whether a census environment is needed to assess alternatives to current census processes.


TABLE A-1 2010 Census Program for Evaluations and Experiments: Appendix to Summaries of Suggested Research

Each topic below is followed by a line giving its ratings on the criteria and considerations: Rank (1 = high, 2 = medium, 3 = low); Cost (Big Payoff); Quality; New to Census; Accomplishable; Other Criteria (where noted); For 2010?; For 2020?; and Census Environment Required? The criteria and considerations are defined at the end of this appendix.

A. Coverage Measurement

A.1 Census Coverage Measurement (CCM) is the program that will answer the question: How accurate was the coverage of the population? (See the dual-system sketch at the end of this section.)
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

A.2 How effective is the CCM interview and subsequent processing in determining the members of the household at each housing unit on CCM interview day and the usual residence of each household member on Census Day? When there are errors in determining household membership and usual residence, what are the causes and what are the possible remedies? What are the effects of recall errors and reporting errors on the CCM interview?
Rank: 1 | Cost: No | Quality: Yes | New to Census: Yes | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes (note: some things can be done outside the census)

A.3 Can we start to learn if comparing the history of census operations with the CCM results in the sample blocks can help us explain how and when errors occur, and also suggest potential remedies?
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes, as a feasibility study in a small number of blocks in the CCM sample | For 2010: No | For 2020: Yes | Census Environment Required: Maybe

A.4 Our knowledge about Group Quarters (GQ) coverage is very limited. Efforts to estimate GQ coverage in 1980 and 1990 were limited (in both scope and success), and the GQ population was out of scope for the 2000 Accuracy and Coverage Evaluation (A.C.E.). Since the GQ population will also be out of scope for CCM in 2010, we need to consider that the problems are likely to be different for different types of GQs (e.g., college dorms vs. nursing homes vs. migrant farm worker camps).
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | Other: Always a problem | For 2010: No | For 2020: Yes | Census Environment Required: Yes

A.5 How can we develop a standard of comparison for household membership on CCM interview day and usual residence on Census Day? What are the effects of recall errors and reporting errors on the CCM interview? Candidate methods for developing the standard include ethnographic studies matched to the census and CCM interviews, respondent debriefings following a CCM interview, and an in-depth Living Situation Survey.
Rank: 2 | Cost: No | Quality: Yes | New to Census: Yes | Accomplishable: Yes, with limitations (scope must be very limited, and any "standard" cannot be expected to get exact truth) | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes, ideally

A.6 Can administrative records augment CCM fieldwork from telephone follow-up to reduce cost and improve CCM data quality? For example, administrative records information may aid in confirming which enumerations linked in the computerized search for duplicates are the same person when the determination cannot be made in the field. An evaluation of A.C.E. Revision II estimates of duplication in Census 2000 using administrative records information demonstrated potential for improving CCM data quality in this manner.
Rank: 2 | Cost: Yes | Quality: Yes, unclear | New to Census: N/A | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: Yes
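Topic A.1 turns on coverage estimation. Coverage measurement programs of this kind are conventionally built on dual-system (capture-recapture) estimation, treating the census and the independent CCM survey as two chances to capture each person. The sketch below is illustrative only; the function and the block-level counts are hypothetical, not the Bureau's production methodology:

```python
def dual_system_estimate(census_correct: float, survey_total: float,
                         matched: float) -> float:
    """Lincoln-Petersen-style dual-system estimate of the true population.

    census_correct: correct (non-erroneous) census enumerations in the area
    survey_total:   persons counted by the independent coverage survey
    matched:        persons found in both the census and the survey
    """
    # If the two systems are independent, the survey's match rate estimates
    # the census coverage rate, so N is about census * survey / matched.
    return census_correct * survey_total / matched

# Hypothetical block-level counts, for illustration only.
n_hat = dual_system_estimate(census_correct=950, survey_total=940, matched=900)
print(f"Estimated population: {n_hat:.0f}")                    # ~992
print(f"Implied net undercount: {(n_hat - 950) / n_hat:.1%}")  # ~4.3%
```

In practice such estimators are applied within population subgroups and adjusted for erroneous enumerations and unresolved cases, which is part of what topics A.2 through A.6 probe.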

B. Race and Hispanic Origin

B.1 Evaluate alternative race and Hispanic origin questions to include (1) double-banking of response categories and shared write-in spaces, (2) modified examples to follow the Advisory Panel recommendations, (3) separate evaluation of the features of 2005 National Census Test (NCT) panel 6, to better understand how each influences Hispanic and race reporting and to inform future decisions, and (4) a modified Hispanic question that allows multiple Hispanic reporting (Y/N, yes multiple types). The latter must be tested in both Nonresponse Followup (NRFU) and the mailout. Samples must adequately represent small groups. Re-interview is needed to assess data quality.
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: Desirable, especially for small groups

B.2 Develop a combined race and Hispanic origin question.
Rank: 3 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

B.3 Conduct research to support rules for editing problematic race and Hispanic origin responses (e.g., Y/N responses to Hispanic origin). A goal is to better understand respondent intent of write-in entries in the presence of, and in the absence of, marked checkboxes.
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: Yes

C. Coverage Improvement

Address List Development

C.1 How accurate was the final address list?
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

C.2 How should we deal with updating the address frame coming out of the 2010 Census, so that we can avoid a large and expensive address canvassing operation in the future, or so that the operation could be conducted at a much reduced cost?
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: See C.2.a below | For 2010: Yes | For 2020: Yes | Census Environment Required: Not sure

C.2.a How can the quality of the address frame be improved with a more scientific extract process?
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes; we need evaluations to demonstrate that this can be done | For 2010: Yes | For 2020: Yes | Census Environment Required: Not sure

C.2.b How can we use additional information (like the Delivery Sequence File in rural areas, American Community Survey (ACS) Time of Interview data, Carrier Route data, and the National Change of Address file) to improve address list maintenance?
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes; we need evaluations to demonstrate that this can be done | For 2010: Yes | For 2020: Yes | Census Environment Required: Not sure

C.3 Can we target Address Canvassing activities better?
Rank: 1 | Cost: Yes | Quality: No | New to Census: No | Accomplishable: Yes; there are concerns about the ability to reliably match persons with common names and across long distances | Other: Politically acceptable: can we convince stakeholders that we don't need to do an address canvassing in their jurisdiction, while we do need to do it in the neighboring jurisdiction? | For 2010: Yes | For 2020: Yes | Census Environment Required: No; need 2010 data

C.4 How accurate were the data collected in Address Canvassing? How can we improve Address Canvassing quality?
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

C.4.a How well does automated Global Positioning System (GPS) collection work in terms of completeness and accuracy of GPS coordinate data?
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

C.4.b How can we improve GPS collection—increase human intervention, improve automated collection, or both?
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

C.5 How can we improve address list maintenance, operational procedures, and enumeration of small multi-unit structures (2-10 units)?
Rank: 2 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

Administrative Records

C.6 How can we avoid the need for followup and use administrative records to:
a. Identify coverage problems?
b. Identify and classify duplicates?
c. Resolve potential coverage problems identified by the coverage probes?
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes | Other: Privacy concerns and issues with file access; stakeholders have expressed reservations about the use of administrative records | For 2010: Not sure | For 2020: Yes | Census Environment Required: Yes

Coverage Followup (CFU)

C.7 Does Coverage Followup actually work?
a. How effective is it?
b. Is CFU effectively identifying omissions?
c. Is it introducing bias?
d. How do recall and reporting errors affect its determination of residency, and hence erroneous enumerations (EEs)?
e. How can we afford to follow up on more coverage improvement cases?
f. Is the expense of CFU worth the coverage gain?
g. Can certain categories of response to coverage questions be automatically coded, or field coded by interviewers, to reduce followup workload? (See the routing sketch following C.8 below.)
h. What recall and reporting problems affect CFU's ability to identify missed people the respondent had in mind when filling out the undercount question? Are enumerators screening out people who are eligible to be listed?
i. How to optimize which cases are coded for CFU?
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

C.8 Develop and experimentally evaluate alternative designs for coverage followup instruments. Alternative methodologies might involve dependent questions; self-response by all relevant household members; immediate follow-up; and other methodological improvements to facilitate recall and reporting in CFU (i.e., conduct an integrated experiment).
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes
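Item C.7(g) asks whether some responses to the coverage questions can be coded automatically rather than added to the follow-up workload. As a toy illustration of the idea only (the categories, phrases, and routing rules below are invented, not Census Bureau coding rules):

```python
import re

# Invented routing rules: clear-cut responses are auto-coded; anything
# else stays in the CFU workload for an interviewer to resolve.
AUTO_RULES = [
    (re.compile(r"\b(away at|in) college\b", re.I), "college_student"),
    (re.compile(r"\bnursing home\b", re.I), "group_quarters"),
    (re.compile(r"\bmilitary|deployed\b", re.I), "military"),
]

def route_response(text: str) -> str:
    for pattern, code in AUTO_RULES:
        if pattern.search(text):
            return code            # auto-coded; no field follow-up needed
    return "needs_followup"        # ambiguous; keep in CFU workload

print(route_response("My son is away at college most of the year"))  # college_student
print(route_response("Grandmother stays with us sometimes"))         # needs_followup
```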

C.9 To what extent did nationwide person matching improve the identification and removal of duplicates of housing units and persons in the census? In particular, what improvements can be made in the identification and removal of census duplicates of persons across some distance, given the challenges created by chance agreements of names and birth dates? (See the matching sketch following C.11 below.)
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes; there are concerns about the ability to reliably match persons with common names and across long distances | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

C.10 Develop and experimentally evaluate alternative designs of the undercount (and overcount?) questions in the mail form to effectively identify census coverage errors for follow-up. Variations might include format (open vs. closed) and wording of questions and response categories, and placement in the form.
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

Residency Rules/Questionnaire Design

C.11 Implement and experimentally evaluate alternative residence rules and presentation of roster instructions in paper and other modes, including the National Academy of Sciences (NAS) recommendations to ask a sufficient number of residence questions to determine residence and to obtain alternative addresses. Panels would be included in an alternative questionnaire experiment (AQE) and would require a coverage re-interview. Cognitive testing is needed for development, along with research on respondents' reading behavior and use of flashcards or other ways of presenting instructions. Alternative approaches might include:
a. A de facto approach
b. A worksheet approach
c. Alternate address elsewhere
Rank: 2 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: Yes, for some aspects
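The "chance agreement" concern in C.9 can be made concrete with a back-of-the-envelope calculation: in a national file, exact agreement on a common name plus date of birth is not rare among distinct people. A sketch under invented assumptions (the counts are illustrative, not census figures):

```python
# Rough expected number of chance agreements when matching on
# (name, date of birth) alone. Illustrative assumptions only, not the
# Census Bureau's actual matching parameters.
carriers = 45_000        # hypothetical count of people sharing one common name
birthdays = 365 * 80     # distinct plausible birth dates (~80 birth-year cohorts)

# Expected pairs of *different* people who agree on both name and birth date:
expected_pairs = carriers * (carriers - 1) / 2 / birthdays
print(f"Expected chance (name, DOB) agreements: {expected_pairs:,.0f}")
# ~35,000 pairs for this one name alone: exact agreement by itself cannot
# separate duplicates from distinct people, which is why distance and
# household context matter to the matching.
```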

Be Counted

C.12 What effects does the Be Counted Program have in filling gaps in coverage?
a. Is it worth it?
b. Is including it better than trying to ensure people are counted in other ways?
c. Does it introduce coverage errors?
Rank: 3 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | Other: Public perception—of missing people or not giving them an opportunity to be enumerated in the census if not for the Be Counted program | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

General

C.13 How accurate was vacancy/occupancy status of housing units in the census? Are there ways to improve accuracy?
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

C.14 Through ethnographic research, can we learn more about American Indian and Alaska Native households, Hispanic households, and immigrant communities that might result in different methods for enumeration? This research could provide insight into CFU and Census Coverage Measurement (CCM) to understand deficiencies.
Rank: 2 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: Yes

D. Field Activities

Automation

D.1 What was the impact of adding expanded automation to field data collection for Address Canvassing, Nonresponse Followup (NRFU), and Census Coverage Measurement-Personal Interview (CCM-PI)? Did we gain in efficiency? Did we see cost savings? Did automation contribute to operational improvements? Should we use the hand-held computers (HHCs) in operations other than Address Canvassing, NRFU, and CCM-PI in 2020 (e.g., U/E)?
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

D.2 What was the impact on field staff of using HHCs to conduct field data collection operations? Did using HHCs help us to improve the effectiveness and efficiency of field staff? Did using the HHC help us improve the productivity of field workers? What impact did the HHC have on field staff training?
Rank: 2 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: No

Training

D.3 How can enumerator training be improved?
a. Can we make enumerator training more efficient/effective through the redesign of enumerator training materials and job aids?
b. Can we make enumerator training more effective by expanding the use of technology-based training?
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: —

D.4 How can we better prepare enumerators to ensure they are effective and efficient in their job? What is the optimal contact strategy for NRFU? How many contacts should we make for NRFU? (Requires experimental design.)
Rank: 1 | Cost: Yes | Quality: TBD | New to Census: No | Accomplishable: This research requires an experimental design with different contact strategies in different locations; implementing such a design in the census environment may not be practical or worthwhile | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

D.5 How can enumerator training be improved to reduce/minimize errors that may be introduced by interviewers? The focus is on interviewers' contributions to coverage errors (in NRFU, CFU, and CCM-PI) in particular, but also errors in other short-form items. Possible research approaches might include an interviewer variance study, in which interviewer assignments are randomized, or assigning a sample of mail returns to NRFU enumerators for re-interview.
Rank: 2 | Cost: No | Quality: Yes | New to Census: Yes | Accomplishable: There are studies that could be done to determine interviewer contribution to error; however, we are not sure whether training could be improved or changed to address those contributions | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

Quality Control (QC)

D.6 How can Global Positioning System (GPS) technology be used as a QC tool for field work, e.g., to identify curbstoning or inefficient field work? (A distance-check sketch follows item E.2 below.)
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes | Other: Privacy concerns—GPS tracking of employees | For 2010: No | For 2020: Yes | Census Environment Required: Yes

D.7 How can the QC design for field operations be improved to be more effective/efficient?
a. How much does the QC improve the quality of the census operations? Does the QC have a high probability of identifying data falsification and/or violation of procedures?
b. Is there an efficient way to verify the QC work? Is it worth it to verify the QC work?
Rank: 2 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

D.8 Can a batch-level approach to re-interview sampling improve efficiency and/or effectiveness of field re-interview operations?
Rank: 3 | Cost: Yes | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: Yes

E. Language

E.1 Can an alternative design for the bilingual English/Spanish questionnaire result in improved data?
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes | Other: Backlash, referring to lowered mail response from non-Spanish-speaking populations | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

E.2 Is there a better or more efficient way to stratify the mailing of the bilingual forms?
Rank: 2 | Cost: Low | Quality: Low | New to Census: Yes | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: No
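Returning to D.6: one simple QC reading is to compare the GPS fix captured at interview time with the coordinate on file for the housing unit and flag interviews completed implausibly far away, a possible curbstoning signal. A minimal sketch; the record fields and the 200-meter threshold are assumptions for illustration, not Bureau practice:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_suspect_interviews(interviews, max_distance_m=200):
    """Yield case IDs whose capture point is far from the unit's coordinate."""
    for iv in interviews:
        d = haversine_m(iv["unit_lat"], iv["unit_lon"],
                        iv["capture_lat"], iv["capture_lon"])
        if d > max_distance_m:
            yield iv["case_id"], round(d)

# Hypothetical records: the second case was captured ~2.5 km from the unit.
sample = [
    {"case_id": "A1", "unit_lat": 38.8951, "unit_lon": -77.0364,
     "capture_lat": 38.8953, "capture_lon": -77.0366},
    {"case_id": "A2", "unit_lat": 38.8951, "unit_lon": -77.0364,
     "capture_lat": 38.9180, "capture_lon": -77.0364},
]
print(list(flag_suspect_interviews(sample)))  # [('A2', 2546)]
```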

E.3 What information can systematic observations yield about how census enumerators are obtaining information from households with little or no understanding of English? Are there changes we can or should make to our methodologies and practices to improve these interviews?
Rank: 2 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes, but may present complex stratification issues | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

E.4 Can we obtain better mail response and/or higher quality data by mailing a Language Assistance Guide booklet that depicts the questionnaire in five languages?
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: Yes | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: Yes

F. Mode Effects

F.1 What is the magnitude of the effects of mode on responses to 2010 census questions? This research would compare the 2010 mail mode content to the adaptation of the specific content items for other modes used in the 2010 census. An example would be comparing the 2010 mail form relationship question, which shows all 14 categories, to the proposed 2010 telephone-adapted version, which asks an open-ended question. In general, this study would examine response distributions (or reliability and other data quality measures) for the 2010 mail items compared to the adapted versions used in 2010 for other modes; comparable random samples would be ideal to avoid self-selection confounds.
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Designing an experiment that reduces/eliminates the self-selection bias is complex and may not be feasible from a field/budget/schedule standpoint | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes
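With comparable random samples, the distributional comparison that F.1 (and F.2 below) describe reduces to a contingency-table test per item. A sketch with hypothetical counts for one categorical item under two modes:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of responses to one categorical item, by mode.
# Rows: modes (mail, telephone); columns: response categories.
table = [
    [5200, 310, 140, 90],   # mail (closed-ended, all categories shown)
    [4880, 420, 180, 95],   # telephone (open-ended, recoded to categories)
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.4f}")
# A small p-value signals that the response distributions differ by mode,
# though only randomized mode assignment rules out self-selection as the cause.
```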

F.2 What are the effects of mode for alternative adaptations of the 2010 census questions in non-mail modes? This research would compare the 2010 mail mode content to alternative adaptations of the specific content items for other modes. These alternative versions for the non-mail modes are adaptations which show promise in terms of providing comparable data to the mail form, but were not used in the 2010 census. We would examine response distributions (or reliability and other data quality measures) for the 2010 mail items compared to the alternative adapted versions for other modes; comparable random samples would be ideal to avoid self-selection confounds.
Rank: 2 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Designing an experiment that reduces/eliminates the self-selection bias is complex and may not be feasible from a field/budget/schedule standpoint | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

G. Content

G.1 What are the combined effects on the data of all questionnaire changes made in the 2010 mail questionnaire?
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

G.2 What are the consistency and reliability of reporting in the 2010 census? (See the reinterview sketch at the end of this section.)
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

G.3 How well do questions perform in interviews?
Rank: 2 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

G.4 How comparable are Census 2010 data and American Community Survey (ACS) data?
Rank: 1 | Cost: No | Quality: Yes | New to Census: Yes | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Census data required, but not the census environment

G.5 Do current methods for identifying the householder/Person 1 perform well in all modes? If not, can improved method(s) be developed?
Rank: 2 | Cost: No | Quality: Yes, possibly | New to Census: No | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: No
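Consistency and reliability questions like G.2 are classically assessed with a content reinterview: the same respondents answer the same item twice, and agreement statistics such as the gross difference rate (GDR) and the index of inconsistency are computed from the cross-classification. A sketch with a hypothetical 2x2 reinterview table:

```python
def reinterview_consistency(a, b, c, d):
    """Agreement statistics for a 2x2 original-vs-reinterview table.

    a: yes/yes   b: yes/no   c: no/yes   d: no/no
    Returns (gross difference rate, index of inconsistency).
    """
    n = a + b + c + d
    gdr = (b + c) / n                         # share of cases that switched answers
    p1, p2 = (a + b) / n, (a + c) / n         # 'yes' rates in the two interviews
    expected = p1 * (1 - p2) + p2 * (1 - p1)  # disagreement expected by chance
    return gdr, gdr / expected

# Hypothetical reinterview counts for one yes/no census item.
gdr, ioi = reinterview_consistency(a=820, b=40, c=55, d=85)
print(f"GDR={gdr:.1%}, index of inconsistency={ioi:.2f}")
# Common rule of thumb in reinterview studies: below 0.20 is low response
# variance, 0.20-0.50 moderate, above 0.50 high.
```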

H. Self-Response Options

H.1 How can we improve alternatives for increasing mail response? An experiment in 2006 showed that a deadline plus delayed mailing of questionnaires improved the mail response rate by two percentage points. This should be replicated in the 2010 Census as an experiment to see if the results hold up in a census environment, and to get good data on the timing of returns under a deadline. If the design permits, the effects of deadline messaging and of the compressed schedule could be teased out. (A significance sketch follows item I.2 below.)
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: Yes

H.2 Experiment testing an additional reminder contact after the replacement questionnaire. This contact would contain stronger language, relative to the reminder postcard and replacement questionnaire, indicating that failure to comply would mean inclusion in the Nonresponse Followup (NRFU) workload (more expense, etc.). Different types of contacts could be tested, such as a postcard, a full-size letter, a phone message, etc.
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: A census environment is optimal, but at least a site test is required to utilize the NRFU message

I. Special Places/Group Quarters

I.1 Did the revised Group Quarters (GQ) definitions improve the identification and classification of GQs (GQs versus housing units, and by type)?
Rank: 1 | Cost: No | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

I.2 Evaluate methods for improving GQ data collections by: (1) assessing the yield from the various sources used to update the MAF/TIGER database (MTDB), as well as assessing how the various census operations update the MTDB; (2) studying effects of allowing a Usual Home Elsewhere in more types of GQs; and (3) collecting additional information to assist with unduplication of college students.
Rank: 2 | Cost: No | Quality: Yes | New to Census: Yes | Accomplishable: Yes | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes
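On H.1's two-percentage-point finding: the proposed replication is a two-arm comparison of mail response rates, so the basic analysis is a two-proportion test, and census-scale panels make a two-point difference easy to detect. A sketch with hypothetical panel sizes (the rates echo the 2006 result; the sample sizes are invented):

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent proportions."""
    p = (p1 * n1 + p2 * n2) / (n1 + n2)           # pooled response rate
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Hypothetical panels: control vs. deadline-plus-delayed-mailing treatment.
z = two_proportion_z(p1=0.665, n1=10_000, p2=0.685, n2=10_000)
print(f"z = {z:.2f}")   # ~3.0, comfortably significant at conventional levels
```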

I.3 The National Academy of Sciences recommends: "The U.S. Census Bureau should participate in a comprehensive review of the consistency of content and availability of prison records. The accuracy of prisoner-reported prior addresses is uncertain, and should be assessed as a census experiment. A research and testing program, including experimentation as part of the 2010 census, should be initiated by the Census Bureau to evaluate the feasibility and cost of assigning incarcerated and institutionalized individuals, who have another address, to the other location" (National Academy of Sciences report Once, Only Once, and in the Right Place: Residence Rules in the Decennial Census, September 2006, pp. 9-10).
Rank: 3 | Cost: No | Quality: No | New to Census: Yes | Accomplishable: Maybe—the scope of this comprehensive review of prisoner-reported address information would be massive if all levels of correctional facilities (e.g., federal, state, local, and private) were included in the review; additionally, the Census Bureau has concerns about the feasibility of actually collecting the prisoner address information | Other: There could be possible implications for other types of GQs; this topic has generated much discussion and varying views among census stakeholders | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

J. Marketing/Publicity/Paid Advertising/Partnerships

J.1 How effective was the communication strategy for improving response and accuracy of the census?
a. How do the separate components of the communications strategy contribute to the improvements (e.g., advertising, partnerships, etc.)?
b. How effective were the targeted messages at reaching specific audiences?
c. Did the communications strategy change attitudes or behavior toward and/or increase awareness of participation in the census?
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: No | Accomplishable: The efforts in 2000 to evaluate the advertising/marketing approach were inconclusive; whether or not we can evaluate the 2010 approach depends on the solution put forth by the vendor and our ability to develop reliable technology | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

K. Privacy

K.1 Test alternative presentation and placement of privacy messages in the cover letter, etc.
Rank: 1 | Cost: Yes | Quality: Yes | New to Census: No | Accomplishable: Yes | For 2010: No | For 2020: Yes | Census Environment Required: Yes

K.2 Monitor public concerns about privacy and confidentiality in a series of quick-turnaround surveys conducted during the census to provide U.S. Census Bureau executives with timely information about emerging concerns and issues. Data from monitoring surveys can also augment (or replace) traditional outreach evaluation surveys, which are slow and do not provide useful information on a timely basis.
Rank: 1 | Cost: No | Quality: No | New to Census: Yes | Accomplishable: Yes | Other: Can help decision makers respond to emerging issues, crises | For 2010: Yes | For 2020: Yes | Census Environment Required: Yes

Criteria and Considerations for Assessing Proposed Research Topics and Questions

Criteria:
Cost (Big Payoff) [Yes/No]: Will results potentially lead to substantial cost savings in the 2020 Census?
Quality [Yes/No]: Could results conclusively measure effects on data quality?
New to Census [Yes/No]: Does the question address operations that are new since Census 2000, experienced significant procedural change, or experienced significant issues during Census 2000?
Accomplishable [Yes/No]: Will data be available to conclusively answer the question? Will there be a high demand on resources to address and answer the question? Are complex or untested methods foreseen to address and answer the question?

Considerations:
For 2010 [Yes/No]: Is this research question intended to assess an operation in the 2010 Census?
For 2020 [Yes/No]: Is this research question intended to assess a 2010 Census operation to inform the 2020 Census?
Census Environment Required? [Yes/No]

SOURCE: 2010 Census Program for Evaluations and Experiments—Appendix to Summaries of Suggested Research (planning document shared with the panel by the U.S. Census Bureau, April 13, 2007).

About This Report

For the past 50 years, the Census Bureau has conducted experiments and evaluations with every decennial census, involving field data collection during which alternatives to current census processes are assessed for a subset of the population. An "evaluation" is usually a post hoc analysis of data collected as part of decennial census processing to determine whether individual steps in the census operated as expected. The 2010 Census Program for Evaluations and Experiments, known as CPEX, has enormous potential to reduce costs and increase the effectiveness of the 2020 census; the panel reduced the initial list of potential research topics from 52 to 6 and identified three priority experiments for inclusion in the 2010 census to assist 2020 census planning: (1) an experiment on the use of the Internet for data collection; (2) an experiment on the use of administrative records for various census purposes; and (3) an experiment (or set of experiments) on features of the census questionnaire. The panel also made 11 recommendations to improve the efficiency and quality of data collection, including allowing use of the Internet for data submission and including one or more alternative questionnaire experiments to examine issues such as the representation of race and ethnicity.
