4
Revitalizing Census Research and Development
The Census Bureau not long ago led the world in goal-oriented research and development (R&D) for continuous improvement of its censuses and surveys. The fruits of that R&D included such pathbreaking achievements as:
- the use of probability sampling in censuses and surveys (first used in the decennial census in 1940), which dramatically reduced respondent burden and the costs of data collection compared with a complete census, while allowing the collection of detailed information with known error due to sampling;
- computerized processing of census returns, begun on a small scale in the 1950 census and fully implemented in the 1960 census, which made it possible to deliver detailed census results on a faster schedule, improve methods for handling missing data by using “hot decks” instead of “cold decks,” and dramatically increase the data products provided to users, including public-use microdata samples, first produced from the 1960 census in 1963;
- mailout-mailback enumeration, partially implemented in the 1960 census (the mailout portion) and fully implemented for much of the country in the 1970 census, which reduced errors in coverage and content (self-reports on census mail questionnaires are more accurate than enumerator reports) and, at least initially, reduced the size of the enumerator workforce;
- the use of dual-system estimation for census coverage measurement, first implemented in the 1980 census, which made possible more accurate estimation of net undercount by a “do it again, independently” approach, compared with the “do it again, better” approach used in the 1950 and 1960 censuses, in which enumerators rechecked the counts of housing units and people in sampled areas; and
- the TIGER geographic coding and mapping system, developed for the 1990 census, which made it possible for the first time to generate maps and geocode addresses by using a computerized database that represented physical features, census geography, and street networks for the entire country.
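The “do it again, independently” logic of dual-system estimation can be conveyed with a minimal sketch of the estimator in its simplest (Lincoln-Petersen) form; the counts below are hypothetical and purely illustrative, not drawn from any census:

```python
def dual_system_estimate(census_count, recount_count, matched):
    """Dual-system ("capture-recapture") estimator, Lincoln-Petersen form:
    N = (N1 * N2) / M, where N1 people are found by the census, N2 by an
    independent recount, and M appear on both lists.  Independence of the
    two counts is the key assumption behind the method."""
    return census_count * recount_count / matched

# Hypothetical block: 900 people counted by the census, 800 by an
# independent post-enumeration survey, 750 matched to both lists.
estimate = dual_system_estimate(900, 800, 750)
print(estimate)  # 960.0, implying a net undercount of about 60 people
```

Actual coverage measurement involves far more elaborate matching, poststratification, and adjustment for erroneous enumerations; the sketch conveys only the core principle.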
More recently, the Census Bureau has successfully designed and implemented the American Community Survey (ACS) as a replacement for the census long-form sample. And the Census Bureau has many innovations to its credit in other programs, such as its economic censuses and surveys and its household surveys.
Yet over the past two or three decades, there has been significant erosion in the Census Bureau’s once preeminent position as a world leader in statistical research and development. The cumulative effects of actions and inactions—on the part not only of the Census Bureau, but also of the Department of Commerce and Congress—have led to a situation in which research and development for the decennial census and other programs too often:
- is limited to incremental improvements in existing systems;
- is planned from the bottom up, without sustained top-down strategic direction;
- is executed without the benefit of best practices for the design of experiments and tests;
- expends scarce resources on testing factors that are already well established in the literature, while neglecting to test factors that are unique to the scope and scale of the census or another program;
- is fragmented organizationally;
- is not well integrated with operations;
- is not considered a key driver of future directions or new operational procedures; and
- lacks resources commensurate with needs.
The results of an inadequate and unfocused research infrastructure for the decennial census are evident in:
- the failure to carry out the planned development of handheld technology for nonresponse follow-up in the 2010 census;
- the failure, even after several decades of on-again, off-again effort, to make significant use of administrative records in the census and household surveys;
- the failure to use the Internet in the 2010 census or in household surveys (a test of an Internet response option is planned for the ACS);
- the failure to adequately evaluate and improve the procedures for updating the Master Address File;
- the limited and unfocused experiments planned for the 2010 census; and
- the lack of clearly specified “stretch” goals for planning the 2020 census that are designed to break the unsustainable trend of escalating costs and complexity of census operations.
In this chapter, we not only describe the functions and properties of an effective R&D program for a major statistical agency in general terms, but also make specific recommendations to revitalize the R&D function at the Census Bureau. Given our charge, we focus on R&D for the decennial census, although many of our comments may apply to R&D for other bureau programs as well. Section 4–A begins by fleshing out what we mean by R&D in the context of a statistical agency, followed by a description in Section 4–B of the properties of a successful R&D program for the Census Bureau. We then turn our attention to the organizational structures around R&D (4–C) before closing in Section 4–D with recommendations for developing an improved census R&D environment.
4–A
IN-HOUSE R&D—WHY AND WHAT
We begin by dismissing any thought that a statistical agency, such as the Census Bureau, does not require a significant in-house R&D capability. R&D is central to the ability of a statistical agency to carry out its mission to deliver relevant, accurate, and timely statistics to the public and policy makers in the face of changing data needs that reflect a changing society, declining public cooperation with censuses and surveys, constrained staff and budget resources, and changing technology for data collection, processing, estimation, and dissemination. The Committee on National Statistics in its Principles and Practices for a Federal Statistical Agency (National Research Council, 2009b:11–12, 43–45) specifies an “active research program,” including substantive analysis and research on methodology and operations, as 1 of 11 essential practices for a statistical agency. Indeed, unless an agency is simply a data collection contractor to other agencies that provide the ongoing scientifically based leadership for censuses and surveys, then it must itself have an ongoing, high-quality, adequately resourced in-house R&D capability. Even for those surveys in which the Census Bureau is the data collection contractor, it behooves the Bureau to continually improve all of its statistical capabilities, such as sampling, editing, quality assurance, data collection, data processing, software development, and analytic approaches, similar to what the major private-sector survey data contractors do in their efforts to be competitive and provide customer value.
There are a number of ways to define R&D, including the classic distinctions of “basic research,” “applied research,” and “development,” which actually work well for our discussion. We define R&D to include the following components:
- “Basic research,” by which we mean analytical work that is ongoing and devoted to fundamental problems of improving relevance, accuracy, timeliness, and efficiency of a statistical agency’s data programs. Such research might, for example, investigate alternative methods for imputing missing responses, including not only the traditional hot-deck method, but also model-based multiple imputation, in a wide variety of survey contexts. Or such research might investigate ways to improve the timeliness and accuracy of census and survey response through redesign of questionnaires in a variety of modes, including mixed-mode census and survey designs.
- “Applied research,” or “applied methods,” by which we mean analytical work that is directed to the specific needs of a specific census or survey program. Such work would take research findings and adapt them to a specific context by, for example, providing weighting or imputation specifications for a particular census or survey.
- “Development,” by which we mean work, involving some combination of researchers, methodologists, and operations people, to implement research findings on the necessary scale for a census or survey. Some of the recent failed attempts to reengineer the decennial census, such as the collapse of the plan to use handheld technology for nonresponse follow-up, have involved a failure to conduct the needed developmental work with sufficient lead time.
While we strive to make clear when we are talking about one of the three components listed above, we also use “research” as a shorthand for the entire array of activities that must be part of a statistical agency’s R&D portfolio in order to ensure that its data are as relevant, accurate, and timely as possible within resource constraints.
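To make the imputation example above concrete, sequential hot-deck imputation can be sketched in a few lines; this is an illustrative toy, not the Bureau’s production procedure, and real hot decks select donors within classes of similar respondents rather than by simple record order:

```python
def hot_deck_impute(records, key):
    """Sequential hot-deck imputation: each missing value is replaced by
    the most recently seen reported value (the "donor"), so donors come
    from the current, "hot" data set rather than from a prior one."""
    donor = None
    result = []
    for rec in records:
        rec = dict(rec)  # copy so the caller's data are not mutated
        if rec[key] is None and donor is not None:
            rec[key] = donor
        elif rec[key] is not None:
            donor = rec[key]
        result.append(rec)
    return result

# Toy example: two respondents did not report age.
people = [{"age": 34}, {"age": None}, {"age": 51}, {"age": None}]
print(hot_deck_impute(people, "age"))
# the missing ages are filled with 34 and 51, from the preceding donors
```

A “cold deck” would instead draw donor values from an earlier census or survey; model-based multiple imputation would draw several plausible values from a fitted model so that imputation uncertainty is reflected in the estimates.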
4–B
PROPERTIES OF A SUCCESSFUL R&D PROGRAM
To be successful, a research and development program for a major statistical agency of the size and scope of the Census Bureau should have the following characteristics:
- Research activities related to strategic goals and objectives: In the case of the decennial census, the overarching goals of methodological research and development are to materially reduce costs and increase (or at least maintain) quality in terms of the coverage of the population and the completeness and accuracy of responses to content items. Therefore, all R&D projects should be justified on that basis. Furthermore, each cycle of census design work needs to start with the development of a small number of competing visions for the next census, in which the ultimate selection of the vision to use as the foundation for the design of the next census depends on the resolution of a handful of basic research questions. Any research that helps to address these fundamental questions should be given a higher priority than research that is not associated with those questions.
- Research-supported decision making: There is evidence that some of the major census innovations, or attempts at innovation, have been implemented without sufficient support from census experiments or tests. Examples include the inadequate testing of the census handhelds in the 2010 planning cycle, the inadequate testing of the optical scanning procedure in the 2000 cycle (which nearly resulted in a major delay for the 2000 census data collection effort—see U.S. General Accounting Office, 2000), and the inadequate operational testing (as preparation for implementation) of the use of a targeted replacement questionnaire leading up to the 2000 census. Research needs to be seen as an initial, key step in all major decisions concerning decennial census design. Accordingly, the outputs, or evaluation metrics, for each research project need to be carefully specified—for example, whether a particular test of a handheld device for census-taking is primarily to assess data quality or operational feasibility or costs or some combination—and provision made to collect the necessary information in a form that can readily be analyzed.
- Appropriate balance between fundamental research and applied methodology: The research program at the Census Bureau needs to emphasize basic studies aimed at establishing general principles for the design of censuses and surveys as much as, if not more than, it emphasizes applied studies designed to determine how these principles apply to specific surveys. Thus, research on the census that is too context-dependent and too focused on the immediately upcoming census will probably not yield results that are helpful for the next census, with the consequence that the R&D cycle for that next census will have to start afresh with little cumulative knowledge gained from prior research. Moreover, while the decennial census is relatively singular in such features as its large scale, extent of public scrutiny, and unforgiving timetable, there are important commonalities between the census and other household surveys, in particular the ACS. Consequently, research that addresses fundamental issues—such as why certain types of question formats or certain data collection modes elicit more or less complete and accurate responses—is more likely to yield results that help more than one census or survey in more than one time period than is research that is too specific to a particular survey and time period.
- Continuity over the decades: Successive stages of research on a given topic need to build on previous results; otherwise, they reinvent the wheel, or the resulting disparate research findings from isolated tests and experiments will be difficult to evaluate and connect to existing theory. Each successive research activity needs to incorporate what was learned in previous research activities about the question at hand through the choice of appropriate control treatments, alternative procedures, and environments of study. For example, a postcensus questionnaire test should include as a control the previous census questionnaire or, alternatively, a questionnaire that was tested in that census and proved efficacious (this was not done in the 2010 questionnaire testing conducted in 2003). Moreover, substantial development research and testing followed by operational testing will generally be needed for innovations in decennial census design given the heterogeneity of the U.S. population, its living situations, and questions of scale. Ideally, such research would build on work conducted for the previous census and the ACS.
- Adequate expertise and professional development: Research should be seen as having a very high importance in the organization, and this would be evident in the size and funding of the research group, the talent of the staff, and their role in decision making. An effective research program for the census and surveys would have staffing—with many personnel at the doctorate level—with expertise in experimental design, survey design, the technology of survey data collection, cognitive methods in survey research (especially questionnaire design), geographic information systems, database management tools, and statistical methods in such areas as record linkage, analysis of complex survey data, survey variance estimation, and methods for treatment of missing data. Such staff should have adequate support to not only maintain, but also continually develop their human capital—for example, by being funded to attend several technical conferences a year and encouraged to prepare research papers for publication. The research staff should be afforded opportunities for direct and frequent interaction in teams with Census Bureau field and program staff across the Bureau’s organizational divisions. In addition, the research staff should have the capability for regular interaction with external experts through not only advisory committees, but also appropriate contracting mechanisms that provide for more extended interaction. The ability to work directly with external experts is critical to enable the in-house research staff to keep abreast of innovations in survey methodology in academia and the major private survey research corporations.
- Information technology development: Another important area of expertise for the Census Bureau’s research staff should be information technology (IT) knowledge and skills that permit the staff to work effectively with IT contractors. If it is difficult to attract a sufficient core of in-house staff with expertise in systems and software design, then it becomes even more important to reach out to academic and private-sector experts who can function as part of the in-house group and provide valuable guidance on such matters as evaluating proposals from contractors and overseeing the work on major IT contracts. Undoubtedly contributing to the Census Bureau’s failure to successfully manage the contract for use of handheld computing devices in the 2010 census was the lack of integration of the contractor staff with in-house technical staff.
- Consistent use of state-of-the-art experimental design methods: The research group should identify and follow sound principles and practices for the design of experiments and tests and update them as the state of the art advances; Chapter 3 discusses current deficiencies along these lines in more detail.
Fundamentally, as we discuss in Sections 3–B and B–1.c in this report and in our interim and letter reports, census experiments and tests are rarely sized through explicit estimation of the power needed to support the statistical tests that will be used to compare the effects of alternative treatments. An undersized test, in terms of the number of completed sample cases, will not permit conclusive analysis of the effects of one or another treatment. Relatedly, testing resources are often wasted by the failure to target relevant population groups. For example, if tests of variation in question wording for eliciting responses from small ethnicity groups, such as Afro-Caribbean, are not targeted to areas of expected concentrations of such respondents, the tests are likely to collect too few cases of interest while at the same time wasting taxpayer resources and respondents’ time by collecting a large number of irrelevant cases.
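The kind of power calculation at issue can be sketched with a standard two-proportion sample-size formula; the response rates below are hypothetical, and an actual census test would also need to account for design effects and the incidence of the targeted group:

```python
from math import ceil, sqrt
from statistics import NormalDist

def two_proportion_sample_size(p1, p2, alpha=0.05, power=0.80):
    """Completed cases needed per treatment arm to detect a difference
    between two response proportions with a two-sided z-test, using the
    standard textbook approximation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Detecting a 2-point gain in mail response (60% versus 62%) requires
# on the order of 9,000-plus completed cases in each treatment arm.
print(two_proportion_sample_size(0.60, 0.62))
```

The point of such a calculation is not its particular numbers but its timing: it should precede the fielding of a test, so that an undersized or untargeted design is caught before resources are spent.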
- Appropriate balance of types of research and testing: The R&D cycle in recent censuses has focused on large-scale tests, such as a complete census operation in a locality or mailings of thousands of questionnaires to test wording alternatives. The only inputs to large-scale questionnaire tests have generally been cognitive testing with very small numbers of respondents (fewer than 10 people). While both large- and very small-scale tests have their place, it is important for decennial census R&D to include other research and testing methods. Targeting of questionnaire tests, for example, could reduce the number of respondents required and thus make better use of scarce resources. A series of smaller tests focused on potential new features of census-taking—for example, a series of Internet data collection tests—would probably be more cost-effective than two large tests, the first of which does not typically provide results in time to affect the second test. In addition, cost-effective cumulative R&D for the census would make extensive use of such techniques as simulation of changes in operations—such as a targeted rather than complete address canvass—using well-documented databases from the previous census. Relatedly, wherever possible, the “not-invented-here syndrome” would be rejected in favor of adopting well-established methods from other organizations. For example, as mentioned above, the Census Bureau conducted an elaborate line of testing of multiple questionnaire mailings in the early 1990s. This work was solid and demonstrated gains in response rates that could be achieved through replacement questionnaire mailing; however, to a large extent, it replicated work on mailing package research and confirmed findings that were already known in the survey research literature. The consequence was that relatively little had been done on developmental work—developing operational specifications to determine whether multiple mailings were feasible on the scale and timetable required for the census—until it was determined that a second questionnaire mailing could not be successfully used in the 2000 census.
- Facilitated access to data outputs: Data on the outcomes of experiments, tests, and other research—such as effects on response rates or the distribution of imputations or the costs of operations—should be made available to the research group in a form that facilitates exploratory and confirmatory analysis. More concretely, this means that research projects should produce outputs that are well documented and provided in databases that are easy to access for a wide range of different analyses, using different covariates and statistical measures. It also means that operational tests and, indeed, full-scale census operations—for example, nonresponse follow-up or data capture—should record and store transactions in well-documented formats that researchers can readily access for cost modeling or evaluating the effects of one or another operation on data quality.
- Research on implementation and human factors: There is a role in census research for small-scale tests or experiments of potential innovations in methodology, just as there is a need for research that establishes the feasibility of those innovations at a census scale of operations. The trick lies in balancing these activities and not—as in recent censuses—favoring complete tests of all census operations in one or more locations to the exclusion of smaller, focused tests that could have been more efficient and effective. Useful, midlevel research between these extremes could involve working with vendors and Census Bureau field division staff to identify requirements to bring innovations to scale and to conduct tests of specific components to determine operational feasibility. An important aspect of feasibility testing should be the explicit consideration of human factors, such as whether an innovation alters the division of responsibilities among enumerators, local census offices, regional offices, and census headquarters and the flows of information among them in productive or counterproductive ways. Although the planned use of handheld devices for nonresponse follow-up had major implications for the interactions of enumerators, local and regional offices, and census headquarters, such human factors were not explicitly part of the testing program.
4–C
STRUCTURING A SUCCESSFUL R&D PROGRAM
The conduct of relevant, high-quality, and timely censuses and surveys within resource constraints is a complex enterprise, which depends on research that is integrated into, yet independent from, daily practice. The success of an effective, well-integrated research program depends critically on the Census Bureau’s structure for research and how leadership and organization permit research to interact with, and not be impeded by, the constraints of census operations.
Unfortunately, there is no pat answer to the question of the most appropriate organizational structure for basic and applied statistical R&D. In this section, we briefly describe some possibilities for the organization of research in the Census Bureau; we offer these suggestions based in part on our reading of Principles and Practices for a Federal Statistical Agency (National Research Council, 2009b) and in part on the experiences of members of our panel in the management and oversight of censuses and complex survey operations.
4–C.1
Leadership
It is essential to have someone at the level of top management of a statistical agency who provides overall leadership for the technical side of the agency’s work and who can articulate and defend the resources needed for basic research and applied methodology. This person should be responsible for methodology and statistical standards, as well as for informatics. It is extremely useful for this individual to be a noted expert in statistical methodology, who therefore can speak authoritatively about the importance of research and methodology not only in broad terms, but also in the context of particular projects.
At the Census Bureau, the appropriate level for this position is the associate director, a senior executive service position. Indeed, the top advocate of sound methodology in the Census Bureau until recently was the associate director for methodology and standards, a position that existed within the Bureau as far back as 1929 (under different names—see Box 4-1). But this position was abolished in 2005 in response to a refusal by the Department of Commerce to appoint the person recommended for the position by the Census Bureau director and deputy and, previously, was left vacant for several years after the resignation of the associate director.
4–C.2
Organization
The R&D function is organized in different ways in different national statistical organizations. In some, it is distributed to individual divisions responsible for a given program or subject, such as education or labor. In others, it is distributed to divisions with responsibility for broad subject-matter fields (e.g., demographic or business statistics). In still others, it is more fully centralized, reporting to the equivalent of an associate director of the Census Bureau and organizationally independent of subject-matter or field operations areas.
There are arguments in favor both of centralization and of decentralization. Decentralization can facilitate the integration of methodology into daily practice. However, since the operational entities are typically not headed by methodologists, this model tends to result in lower hierarchical positions for the heads of these decentralized methodology units, which makes it more difficult for them to assume a leadership function. Also, a lack of critical mass makes it more difficult to support specialization and basic research and to maintain high-quality standards for research and practice. Conversely, a centralized model is at greater risk of isolation from the daily practice of the agency, potentially endangering the viability of this function.
The Census Bureau seems at present to have the worst of both worlds. The Bureau’s applied methodology work is decentralized, so there is no central leadership speaking on its behalf, yet its basic research is centralized and even more cut off from the rest of the Bureau than research tends to be intrinsically (see Box 4-1). The lack of central leadership for R&D at the top of the Bureau makes it difficult to integrate the work of the applied statisticians and the researchers with each other and with operational practice; it also makes it nearly impossible to plan research that supports fundamental, long-term changes.
Centralization has the following advantages that are useful to retain. First, it supports the professional independence and functional leadership of applied methodology. While methodologists need to be full and valued members of project teams (that is, staff groups who are working on methodological applications for components of specific programs, such as sample design and weighting for a particular survey), at the same time it is crucial that the methodologists receive expert guidance and technical supervision. This can best be achieved in a centralized organization in which the hierarchical position of everyone is strongly influenced by his or her technical competence. Professional independence is also vitally important, since on the rare occasion in which it makes a difference, these staff should be able to assert themselves and appeal, on professional grounds, decisions that are made within their project team with which they strongly disagree.

Box 4-1 Historical Overview of the Census Bureau’s Organization of R&D
[Organization charts for the early years, the 1950–1980 censuses, and the current organization are not reproduced here; intercensal changes in names and responsibilities of directorates and divisions are omitted.]
SOURCES: U.S. Census Bureau (1955, 1966, 1976, 1989, 1993, 1995a,b, 1996); for current staff counts, searches of the staff directory on the Census Bureau web site, http://www.census.gov, on January 20, 2010.
4–C.3
Project Teams
The contribution of methodology to an applied project, as well as the funds needed to finance the project, should be considered in the planning process before the project gets under way. This includes an assessment of the costs of the contributions of all members of the team, including methodologists, subject-matter specialists, and operations and IT people as appropriate, and a broad project plan formulated by high-level specialists from the participating disciplines.
On these projects, methodologists perform two general functions. First, at a strategic level, they help to ensure that the overall plan strikes an optimal balance between costs, timeliness, and respondent burden constraints on one hand, and other desired outcomes, especially improvements in data quality, on the other. While this is a leadership function and involves the entire project team, it is the methodologists who provide the framework and techniques enabling the team to grapple with trade-offs that must be considered. At a more tactical level, the methodologist is concerned with providing the statistical methods that are to be incorporated into the overall project design, which may include sample design, weighting, quality control, editing and imputation strategies, estimation, and analytic methods.
While the tactical contributions of methodology are easily understood, it is the strategic contributions that most benefit from leadership and sophistication, and they therefore argue for a centralized approach to managing R&D. Furthermore, in the case of major efforts, which will be directed by higher-level groups that may include directors of the participating divisions, a centralized approach will make it easier to judge major trade-offs and to resolve any conflicts.
4–C.4
Funding
Funding of the basic research and applied methodology unit(s) should provide for pure research, applied methodological work, developmental projects, and maintenance work (quality control, routine reviews of edit failures, variance estimation, and minor design adjustments), together with supplementary resources from requests from operational units for additional methodological work. It is essential to have a sound planning process that secures the funding needed to support the top-priority basic and applied research projects, both for the agency as a whole and for particular programs, such as the decennial census.
4–C.5
Training
A substantial portion of the R&D budget must be devoted to training, with additional emphasis on career development. Training can be carried out not only through formal internal courses, but also through professional education courses at conferences and similar venues. Training serves a multitude of purposes. Most important, it should not only inculcate a basic knowledge of all that is involved, but also drive home the critical importance of teamwork and respect for the professional contributions of all the relevant disciplines.
4–C.6
Advisory Committees
A key tool for developing best practices and integrating them into the daily work of the organization is the effective use of advisory committees. Such committees can be used to critique all significant basic and applied research projects. Such critiques provide not only important contributions to the design and analysis of specific projects, but also a form of training for staff and, given the professional standing of the committee members, a validation of the work. To be effective, advisory committees need to be given substantive information and important issues to address. In addition, their work needs to be buttressed by arrangements for bringing outside experts into the organization for intensive collaboration with in-house research staff.
4–C.7
Opportunities to Participate in Research
Most basic research should be conducted in partnership with applied methodologists to help ensure that the research carried out is relevant and that the results have the best opportunity to lead to changes in practice. Cooperative project work also helps with morale: while not everyone wants to do research (or is able to do so), a number of staff want to try their hand at it. And the very act of conducting some research, by those capable of it, leads to more open mindsets and a better informed practice.
4–D
A NEW CENSUS RESEARCH AND DEVELOPMENT PROGRAM
4–D.1
Organization and Leadership
Consistent with the above discussion, the Census Bureau, as a high priority, should reorganize its basic research and applied methodology functions and how research and applied methods units interact with operational units. The objective should be to ensure that sound methodology pervades census and survey practice and that research programs are motivated by strategic issues facing the Bureau. To inform an appropriate reorganization, the Census Bureau should undertake a fast-track, high-level management review of how research and development is organized in other national statistical offices and leading survey research organizations in academia and the private sector.
Recommendation 4.1: The Census Bureau should comprehensively review the research and development practices and organization in other national statistical offices and in survey organizations in academia and the private sector, with the goal of modernizing and strengthening the Bureau's own research and development program. Such a review should include assessments of and recommendations about:
- How to organize and direct basic and applied methods research to best serve the decennial census and other Census Bureau programs;
- How to organize information technology and database management to best serve research and operations, including how to manage the development of new technologies and ensure access to adequate expertise in these technical areas;
- How to operate collaborative project teams to facilitate timely innovation;
- How to ensure adequate training in survey methods and related fields;
- How to achieve extensive and intensive interaction with external research organizations and academic departments so that Census Bureau researchers and methodologists can benefit from related research work and ideas elsewhere; and
- How to fund and establish priorities for research and applied methodology work.
To carry out the findings of this review, the Census Bureau should consider reestablishing and filling an associate director–level executive staff position to head the statistical and survey research activities at the Census Bureau, with authority to organize the Bureau's research and applied methods activities. This position should have line authority for the basic research function. If the Census Bureau adopts a centralized R&D model, the position should also have line authority for the applied methodology function. If the Bureau retains a decentralized structure for applied methodology work, the position should have strong functional authority over applied methods staff, including input on staff recruitment, promotion, and training; quality standards; and project priorities. The position should carry sufficient authority to ensure that research findings play a fundamental role in decision making on the design of the decennial census and other major data collection programs. Given the scale and importance of the decennial census, the position should also have authority to set the census R&D agenda, including the selection of census experiments and evaluations.
Recommendation 4.2: To carry out the findings from the review recommended above, the Census Bureau should consider reestablishing and filling an associate director–level executive staff position to head the statistical and survey research activities at the Census Bureau, with authority to organize the Bureau’s research and applied methods activities.
In addition, the Census Bureau should consider reestablishing the Center for Survey Methods Research as a unit under an associate director for statistical and survey research to conduct research on census and survey data collection instruments. This unit, which had a proud history of important work on questionnaire design, residency rules, and ethnographic studies of enumeration, no longer exists as a separate entity (see Box 4-1). Moreover, the subunits of the Statistical Research Division that engage in questionnaire design and measurement research, language and measurement research, questionnaire pretesting for household surveys, and human factors and usability research are no longer headed by a researcher of national reputation.
Recommendation 4.3: The Census Bureau should give greater emphasis to survey methodology. One possibility for doing so would be to establish a core survey methods research center, staffed by full-time survey researchers and headed by a nationally recognized expert in census and survey data collection instruments. Such a high-profile center could give priority to research on making effective initial contacts with census and survey respondents, including those made with new technologies.
In this report, we focus principally on methodology and operations research and not substantive analysis—basic research in social sciences. Nonetheless, we strongly support substantive research programs by statistical agencies in the subjects covered by their data collections. Such research is one of the best ways for an agency to obtain input on social, economic, and other kinds of changes that necessitate rethinking data collection and processing methods and the kinds of data that need to be provided to data users; basic research can be an important source of innovative ideas. For example, we echo the comments by the National Research Council (2006:175), recommending an office for research on population changes in geographic location and family living arrangements that relate to census residence rules and have implications for effective enumeration procedures. Substantive research by agency analysts should be relevant to policy and public information needs, although it should not take policy positions or be designed to focus on any particular policy agenda.
4–D.2
Integration
We have noted the importance of close collaboration between basic research and applied methods work, and of integrating the latter with operations. We have also noted that research findings need to drive strategic decisions about census and survey operations. Achieving these goals requires that operational staff welcome and act on research results, which can be difficult when they are in the midst of data collection and processing under budget and schedule constraints. The integration of research into the daily life of the Census Bureau should be the joint responsibility of the director and the associate director responsible for R&D. It should be facilitated by a planning process that sets aside a block of funds for basic research, makes explicit the unresolved development issues that must be addressed for a given project to have a sound basis, and allocates the funds required. It is further incumbent on the leadership of a statistical agency to put in place incentives and structures so that research is integrated with operational planning. Such incentives might take the form of performance criteria and rewards for operational leaders who are assiduous in integrating research into their planning and, conversely, for research leaders who are assiduous in remaining relevant to the operational needs of the agency.
In addition, the Census Bureau has an opportunity and an obligation to thoroughly integrate decennial census research with ACS research. For the first time in census history, the ACS affords a continuous test bed not only for its own needs but also for the decennial census, covering contact strategies, questionnaire design, data capture technology, and data processing. Although there are significant differences between the two programs, there are sufficient commonalities that basic and applied research and development should be conducted with continuous cross-fertilization between them.
Recommendation 4.4: The Census Bureau should put in place incentives and structures so that research is fully integrated and collaborative not only across programs, but also with operational planning. Research should be responsive to operational needs, and, in turn, research findings should play a primary role in informing operational decision making.
Recommendation 4.5: The Census Bureau should integrate decennial census and American Community Survey research—for example, by using the ACS methods panel as a test bed for the Internet and other data collection methods to consider in the census and by matching census and ACS records to evaluate coverage in both programs. To support comparative census-ACS research and to inform users, the Census Bureau should carry out analyses that explore, at both the aggregate level and the level of individual households, the degree and sources of differences in demographic characteristics and residence between the ACS and the decennial census.
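The record matching envisioned in Recommendation 4.5 rests on the same dual-system (capture-recapture) logic long used for census coverage measurement: two independent enumerations of the same population, with records matched between them, yield an estimate of the true total and hence of net undercount. The following minimal sketch illustrates the arithmetic only; the counts are hypothetical, not Census Bureau figures, and real coverage estimation involves far more elaborate matching and adjustment.

```python
def dual_system_estimate(census_count: int, survey_count: int, matched: int) -> float:
    """Lincoln-Petersen (dual-system) estimator of total population.

    Assumes the two enumerations are independent and that matching
    between the two sets of records is accurate.
    """
    if matched == 0:
        raise ValueError("no matched records; estimator is undefined")
    return census_count * survey_count / matched


# Hypothetical figures: 950 people on the census list, 900 on an
# independent survey list, 880 matched in both sources.
estimate = dual_system_estimate(950, 900, 880)
net_undercount = estimate - 950

print(round(estimate))        # estimated true population: 972
print(round(net_undercount))  # estimated census net undercount: 22
```

Under these assumptions, the estimator simply scales the census count by the inverse of the survey's observed match rate; violations of independence ("correlation bias") are a well-known limitation of the method.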
4–D.3
Fostering Outside Collaboration
We have stressed, and cannot stress enough, the importance of extensive and intensive collaboration of in-house R&D staff with outside experts. No in-house R&D program can or should be sufficient unto itself. The attempt to do so is wasteful of scarce resources—whether the outcome is to reinvent the wheel or, even worse, to fail to make improvements in methods because of lack of familiarity with advances in other organizations, including leading survey and computer science research centers in academia and the private sector. In developing these relationships with advisory committees and external researchers, it is important that the Census Bureau view them less as a means of oversight and more as legitimate collaborators in the study and improvement of census operations.
Recommendation 4.6: The Census Bureau should renew and augment mechanisms for obtaining external expertise from leading researchers and practitioners in survey and census methodology and in relevant computer science fields. These mechanisms might include (1) a more active census professional advisory committee program in which the members have an opportunity to work more closely with Census Bureau staff in developing and evaluating ideas for improved census and survey methods; (2) increased opportunities for sabbaticals at the Bureau for university faculty and other short-term appointments for both senior- and junior-level (graduate student) academics at the Census Bureau; (3) increased opportunities for sabbaticals for Census Bureau staff at academic institutions and private-sector survey organizations; (4) the awarding of design contracts early in the decade to support research and development of innovative technologies for census and survey data collection and processing; and (5) more effective use of contracting processes to obtain expert services.
4–D.4
Budgeting for Research
A complication for the Census Bureau's decennial census research program is the budget process. The timeline of the decennial census is such that the census—its level of spending, the extent of its coverage of and programs for specific population subgroups, and so forth—is a matter of intense attention in the period immediately surrounding the census year. However, that attention from a wide range of census stakeholders—including Congress, other executive branch agencies, and advocacy and interest groups—can drop off in the years following a census. So too can the funds appropriated to the Census Bureau, a result that can restrict or preclude serious research and early planning for the next census.
The decennial census is necessarily a high-stakes program, and to some extent the escalating costs of the census and the steady accretion of coverage improvement operations (without review of their cost-effectiveness) described in Chapter 2 result from this pressured environment. Absent the resources to conduct research on strategic design issues early in a decade—to guide the selection of principal design components and to test the feasibility and interoperability of new and alternative methods—an incremental approach to the census is virtually inevitable. To their credit, Congress and presidential administrations have historically been unstinting in providing resources for the census as census dates have drawn near; the challenge going forward is to make the case that investment in research early in the decade, and the changes that develop from that research, will yield a more efficient and effective census in the end. Likewise, a Census Bureau research program should engage the full range of stakeholders throughout the decade on key research and quality issues rather than attempt to pile on last-minute changes in years ending in 8 or 9.
Our urging in Recommendation 2.1 that the Census Bureau commit to bold and public cost and quality goals for the 2020 census is meant to promote a commitment to change early in the decade. We close this report on directions toward a new vision for the 2020 census by suggesting that national conversations on the nature of the census—and the research needed to effect real change—need to take place early, and over the whole decade.
Recommendation 4.7: The Census Bureau's planning for the 2020 census, particularly for research in the period 2010–2015, should be designed to permit proper evaluation of significant innovations and alternatives to the current decennial census design that could accomplish substantial cost savings in 2020 without impairing census quality. Otherwise, either the 2020 census design will be an incremental change from the 2010 design, with increased costs, or the Census Bureau may be compelled to implement a poorly evaluated and tested alternative design under severe time and cost constraints, with a risk of substantially reduced quality. All involved, including Congress and the administration, should recognize that substantial cost savings in 2020 can be achieved only through effective planning over the course of the 2010–2020 decade and should fund and pursue research efforts commensurately.