5
Decision-Making Process for a New Victimization Measurement System

IN THIS CHAPTER, we focus on two broader issues related to moving forward with refinements to the National Crime Victimization Survey (NCVS). The first is the need to consider ways to best develop the survey in order to shore up and expand constituencies for it (Section 5–A), and the second is the choice of the data collection agent for the survey (5–B). Several of the topics and recommendations in this chapter differ from the rest of the report in that they are agency-level in focus, aimed at better equipping the Bureau of Justice Statistics (BJS) to understand its own products and to interact with its users. This is in keeping with the panel's charge to focus on the complete portfolio of BJS programs. We make these recommendations here, in initial form, because they are pertinent to the NCVS; however, we emphasize that we expect to expand on them in our final report.

5–A BOLSTERING QUALITY AND BUILDING CONSTITUENCIES

NCVS data and estimates are routinely used by researchers and the public to understand the patterns and consequences of victimization. Researchers can access the raw data through the National Archive of Criminal Justice Data at the Inter-university Consortium for Political and Social Research and thus can analyze the data to fit the needs of their investigation. The vast majority of the public, in contrast, has access to the data primarily in the form of routine annual estimates available on the BJS website, or through
special topic reports developed and released periodically on the website. However, when the public has interest in specific topics for which no regular NCVS report exists (for example, trends in rural victimization1), it is often beyond people's expertise to use the survey data or even to determine whether they can compile this information themselves. This problem can be addressed by using an advisory committee charged with providing BJS with information about public interest in specific kinds of NCVS reports; improving the organization of the victimization component of the BJS website so that it is clear what NCVS reports are available and what requires special analyses; and expanding the number of trend charts and spreadsheets to include compilations of interest to the public.

Any federal statistical agency must constantly strive to maintain clear communications with its users and with the best technical minds in the country relative to its data. While BJS some years ago took the initiative to stimulate the creation of the American Statistical Association's (ASA) Committee on Law and Justice Statistics, the committee is not a formal advisory committee to BJS. This means that the meetings are not public, the recommendations of the committee have no real formal documentation, and the agency does not consistently turn to the committee for key problems facing it. Furthermore, the committee consists exclusively of ASA members, who may or may not have all the expertise needed to advise BJS. A formal advisory committee has both the benefits and costs of Federal Advisory Committee Act oversight, yet it would address many of the issues cited above. Most other federal statistical agencies actively use their advisory committees (e.g., the National Center for Health Statistics, the Census Bureau, the Bureau of Labor Statistics) to seek technical input into critical challenges.
This is especially true now because of the growing pressures on survey budgets arising from declining U.S. response rates.

A formal advisory committee should have membership that is appointed for its expertise. It should have experts in criminology, law enforcement, judicial processes, and incarceration. It should include state and local area experts. This expertise in the substance of the statistics should be supplemented with expertise in the methods of designing, collecting, and analyzing statistical data.

Recommendation 5.1: BJS should establish a scientific advisory board for the agency's programs; a particular focus should be on maintaining and enhancing the utility of the NCVS.

1 Comparisons of trends in urban, suburban, and rural victimization were the focus of a BJS report issued in 2000 (Duhart, 2000), but this specific analysis has not been replicated since that time.
The NCVS is largely designed and conducted for BJS by the Census Bureau. Complex survey contracts cannot be wisely administered without highly sophisticated statistical and methodological expertise. Federal statistical agencies that successfully contract out their data collection (either to the Census Bureau or a private contractor) generally have mathematical statisticians and survey methodologists who direct, coordinate, and oversee the activities of the contractor. While many of the BJS staff are labeled "statisticians," the panel observed a lack of the statistical expertise that is crucial in dealing with the trade-offs of costs, sample size, numbers of primary sampling units, interviewer training, questionnaire length, use of bounding interviews, etc. The expressions of displeasure about the Census Bureau's management of the NCVS were not matched with BJS statistical analyses and simulations of design alternatives that might offer better outcomes for the agency. Furthermore, the panel thinks that the number of BJS full-time staff dedicated to the analysis of NCVS data and the generation of reports is insufficient to exploit the full value of the survey and to navigate its challenging future. Some of the issues that require analysis (e.g., the effects of declining response rates on estimates, trade-offs of waves and questionnaire length) need statistical and methodological expertise that goes beyond current in-house capabilities.

Following the lead of other federal statistical agencies, BJS could usefully enhance statistical expertise on its staff with a program of outside research funds. When federal agencies form useful partnerships with academic researchers, they can reduce their overall costs of innovation. BJS has a track record of small research grants connected to the NCVS. The panel applauds these and urges an expansion to tackle the real methodological issues facing the NCVS.
Recommendation 5.2: BJS should perform additional and advanced analysis of NCVS data. To do so, BJS should expand its capacity in the number and training of personnel and the ability to let contracts.

One reason that the panel thinks that technical staffing and external research are important is that many of the questions posed about the NCVS have not been evaluated sufficiently for us to provide recommendations to BJS on the final design of the survey. The panel thinks that this is the long-term result of "eating its seed corn," of using the operating budget too much to release the traditional reports and too little to scope out the problems of the future. It was well known 15 years ago that household survey response rates were falling; the impact on survey costs of these falling rates was clear (de Leeuw and de Heer, 2002). Federal statistical agencies (see CNSTAT's Principles and Practices of a Federal Statistical Agency) must consistently
probe and analyze their own data, beyond the level required for descriptive reports, in order to see their weaknesses and their strengths. Only with such detailed knowledge can wise decisions about cost and error trade-offs be made.

Recommendation 5.3: BJS should undertake research to continuously evaluate and improve the quality of NCVS estimates.

Another way that federal statistical agencies improve their data series is by nurturing a wide community of secondary analysts, using as much data as can be released within confidentiality constraints. Such analysts form a ready-made informed constituency for improving data products over time. Such analysts act as a multiplier of the impact of federal data series. Using the Internet, some agencies have expanded their impact by making available various "predigested" forms of survey data in tables, spreadsheets, graphing capabilities, etc. The panel thinks that BJS should consider such capabilities linked to the NCVS website. These might be time series of individual population rates and means in spreadsheet form, attractive to a very broad audience, as well as microdata predesigned to have commonly desired analytic variables on observation units that are popular.

Recommendation 5.4: BJS should continue to improve the availability of NCVS data and estimates in ways that facilitate user access.

BJS and the Census Bureau must keep their pledges of confidentiality to NCVS respondents. They also have the obligation to maximize the good statistical uses of the data collected with taxpayer money. Geographically identified NCVS data were available to qualified researchers from approximately 1998–2002 at the Census Bureau's research data centers (Wiersema, 1999); however, access was subsequently suspended because the data did not conform to technical conditions for research access and oversight.
A project to reestablish the availability of these data by documenting and formatting internal Census Bureau data files so that they conform to Census Bureau standards began in 2005 and should be completed by the time of this report. As soon as such work is completed, these data should be made available to qualified researchers. Access to geographically identified NCVS data would permit analyses of how local characteristics and policies are associated with victimization risk and its consequences.

Recommendation 5.5: The Census Bureau and BJS should ensure that geographically identified NCVS data are available to qualified researchers through the Census Bureau's research data centers, in a manner that ensures proper privacy protection.
At this writing, the U.S. statistical budget has been relatively flat for some years (except for the advent of the American Community Survey budget). These flat-line budgets have occurred at the same time that the difficulty and costs of measuring U.S. society have increased. In a climate of tight budgets and increasing costs of demographic measurement, federal statistical agencies face real threats. Such are the times that need real statistical leadership and careful stewardship of the statistical information infrastructure of the country. We fear that many surveys, the NCVS among them, can easily die "deaths from a thousand cuts." Attempts to live within the budgets lead to short-term cuts in features of surveys without certain knowledge of their effects on survey quality. Each such decision runs the risk that the country will be misled due to increased errors in data products. At some point, the basic goals of a survey cannot be met under restricted funding. The country deserves to know this when it is occurring.

The panel thinks that one opportunity for such communication comes in the annual report on statistical program funding that the U.S. Office of Management and Budget is required to prepare by a provision of the Paperwork Reduction Act of 1995 (44 U.S.C. 3504(e)(2)). This annual report, Statistical Programs of the United States Government, has been published for each fiscal year since 1997. The report can serve as a vehicle for alerting the executive and legislative branches to how the budget has affected the quality of statistical programs, both to the good and to the bad. With specific regard to BJS, the annual reports have generally documented the agency's responses to declining budgets. For instance, the reports for fiscal years 2007 and 2008 bore a similar warning (U.S.
Office of Management and Budget, 2006c:8):

BJS did not receive the funding requested to restore its base funding necessary to meet the growing costs of data collection and the information demands of policymakers and the criminal justice community. To address base adjustments insufficient to carry out ongoing operations of its National Crime Victimization Survey (NCVS) and other national collection programs, BJS has utilized many strategies, such as cutting sample, to keep costs within available spending levels. However, changes to the NCVS have had significant effects on the precision of the estimates: year-to-year change estimates are no longer feasible and have been replaced with two-year rolling averages.

The guidance provided by these annual reports could be enhanced through fuller explication of the impact of budget reductions (or increases) on the precision of estimates, as well as articulation of constraints and effects on federal statistical surveys systemwide. An example of the latter is the Census Bureau's sample redesign process; following the decennial census, the Census Bureau realigns the sample frames for the various demographic
surveys that it conducts (including the NCVS) so that the household samples are updated and coordinated across the various data collection programs. This work is done in collaboration with the agencies that sponsor Census Bureau–conducted surveys; "the portion of the sample redesign work that can be linked to a specific survey is funded by the sponsoring agency as part of the reimbursable cost of the survey," while portions that are not directly identified with a specific survey are funded by the Census Bureau. "Thus, the approach combines central funding with user fees for survey specific redesign activities" (U.S. Office of Management and Budget, 2000:45–46). Although the sample redesign process has been routinely mentioned as an ongoing, cross-cutting activity in Statistical Programs of the United States Government, little detail on the progress (and consequences) of the effort was provided in the annual reports from 2001 to 2007. Ultimately, conversion from a sample deriving from the 1990 census to one using the 2000 numbers was not fully achieved for the NCVS until 2007; the redesign work was originally planned to be complete in fiscal year 2004.2 We recommend that the annual report provide additional discussion, and warning, of budget-related effects on basic survey maintenance when appropriate.

Recommendation 5.6: The Statistical Policy Office of the U.S. Office of Management and Budget is uniquely positioned to identify instances in which statistical agencies have been unable to perform basic sample or survey maintenance functions. For example, BJS was unable to update the NCVS household sample to reflect population and household shifts identified in the 2000 census until 2007. The Statistical Policy Office should note such breakdowns in basic survey maintenance functions in its annual report Statistical Programs of the United States Government.
2 The new sample was phased in panel by panel. One panel of addresses based on the 2000 census was introduced in January 2005 for areas already included in the sample. "Beginning in January 2006, [the Census Bureau] introduced sample based on the 2000 decennial census in new areas. The phase-in of the 2000 sample and the phase-out of the 1990 sample will be complete in January 2008" (Demographic Surveys Division, U.S. Census Bureau, 2007b).

5–B DATA COLLECTION AGENT FOR THE NCVS

A review of any survey, particularly one conducted with an eye toward reducing costs, must inevitably consider the question of who collects the data (in addition to exactly how the data are collected). In the case of the NCVS, the U.S. Census Bureau of the U.S. Department of Commerce has been engaged as data collection agent since the survey's inception. In fact, as described in Box 1-1, the Census Bureau was heavily involved in the prehistory of the survey, entering into discussions with BJS's predecessor in the
late 1960s and convening planning conferences that would give shape to the NCVS and its pretests. Since "it was clear from the pilot studies that large samples would be required to obtain reliable estimates of victimization for crime classes of intense interest (e.g., rape)," "the Census Bureau was the only organization that could field such a large survey" and hence was the natural choice as the data collection agent for the new NCVS (Cantor and Lynch, 2000:105).

The choice of the Census Bureau as the data collector for the NCVS had implications for the survey's design, as summarized by Cantor and Lynch (2000:107):

Other design features of NCS were occasioned by the need to fit into the organization of the Census Bureau and the Current Population Survey (CPS). CPS is the largest intercensal survey conducted in the world and, at the time, NCS was to be the second largest of these surveys. Sharing interviewers between the two surveys would mean great efficiencies for the [Census Bureau]. CPS employed a rotating panel design. This was viewed as an advantage to NCS for a number of reasons. One was the ability to use prior interviews to "bound" subsequent interviews. . . . A second was that the rotating panel design substantially increased the precision of the year-to-year change estimates. The panel design feature produces a natural positive correlation across annual estimates. This, in turn, substantially reduces the standard error on change estimates.

As may be expected, the experience of decades of work has illustrated both advantages and disadvantages of the relationship between BJS as sponsor and funder of the NCVS and the Census Bureau as its data collector. Relatively few of the conceptual pros and cons are unique to the BJS-Census relationship; rather, they are generally applicable to any contractor and client. Others, however, in the panel's view deserve comment.
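The precision gain that Cantor and Lynch describe follows from a standard identity for the variance of a difference. The notation below is ours, introduced only for illustration: V_1 and V_2 stand for the variances of two annual estimates, and ρ for the correlation between them induced by the overlapping panel sample.

```latex
% Variance of a year-to-year change estimate built from two annual
% estimates \bar{y}_1, \bar{y}_2 with variances V_1, V_2 and
% correlation \rho (illustrative symbols, not the report's notation):
\operatorname{Var}(\bar{y}_2 - \bar{y}_1) = V_1 + V_2 - 2\rho\sqrt{V_1 V_2}
```

With equal annual variances (V_1 = V_2 = V) this reduces to 2V(1 − ρ), so the positive correlation produced by re-interviewing the same households directly shrinks the standard error of the change estimate relative to drawing two independent samples (ρ = 0).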
A basic concern that has arisen about the Census Bureau as the data collection agent for the NCVS is the lack of transparency in costs. Historically, the Census Bureau has not provided its federal agency survey sponsors with detailed breakdowns in survey costs (and rationales for changes in costs, over and above the known increasing costs of gaining compliance in survey research). It is the panel's view that disaggregated costs are key to effective innovation in large-scale surveys. The data collector must know what survey design choices are associated with the largest portions of costs in order to effectively consider trade-offs of costs and errors. Recent attention to survey costs (e.g., at conferences hosted by the Federal Committee on Statistical Methodology and the National Institute of Statistical Sciences) has shown the value of detailed cost accounting.3

3 See http://www.fcsm.gov/events/program/2006FCSMFinalprogram.pdf (see the session on "modeling survey costs"); Karr and Last (2006).
Recommendation 5.7: Because BJS is currently receiving inadequate information about the costs of the NCVS, the Census Bureau should establish a data-based, data-driven survey cost and information system.

Some features of the NCVS are not shared by other survey designs and lack a strong evidentiary base for their choice, which leads the panel to wonder why the Census Bureau and BJS have chosen them. These include the recycling of cases from the field to centralized computer-assisted telephone interviewing (CATI) (instead of using a dispersed field interviewing corps for the telephone interviews). They include the slowness of moving from paper questionnaires to computer-assisted personal interviewing (CAPI). They include the failure to study the use of audio computer-assisted interviewing for many of the sensitive topics in the survey, despite its widespread use in other federal surveys (e.g., the National Survey of Drug Use and Health and the National Survey of Family Growth, as well as BJS-sponsored data collections as required by the Prison Rape Elimination Act). They include the lack of study of how best to use the bounding interview in estimation.

Finally, the panel notes that there is very little substantive expertise in criminology and justice programs within the Census Bureau staff working on the NCVS. That means that the Census Bureau focuses on field and statistical issues without the advantage of formal educational background in the substance of the NCVS. Just as the BJS staff would be stronger with more technical and statistical expertise, the panel thinks that the Census Bureau could mount a better NCVS and partner more effectively with BJS with more substantive expertise.

That said, it must be noted with equal force that there are important advantages to the use of the Census Bureau as data collector.
Census Bureau household surveys, by and large, achieve higher response rates than comparable surveys conducted by a private contractor on behalf of the federal government. It is common throughout the world that central government statistical agencies achieve higher response rates than private-sector survey organizations (Groves and Couper, 1998). The Census Bureau has maintained a strong confidentiality pledge through the force of the Title 13 law, although under the widened protection of the Confidential Information Protection and Statistical Efficiency Act of 2002, it is not clear that that advantage will be maintained. Furthermore, interagency agreements within the federal government appear to be simpler and less burdened by regulation than federal contracts. Finally, in the event that a radical option for collecting victimization data were necessary, continued partnership with the Census Bureau could offer the benefit of more readily piggybacking some victimization measures on one of the Census Bureau's ongoing surveys (e.g.,
the American Community Survey or Current Population Survey; see Section 4–B.1).

BJS has sought input regarding contracting out the NCVS to the private sector. We urge careful consideration of survey cost structures prior to such a move. The panel notes that this review would be greatly facilitated if BJS could obtain disaggregated costs from the Census Bureau for the current NCVS. BJS should study other federal surveys contracted out to the private sector to determine the extent to which flexibility in dealing with changes and innovations was or was not realized. It should also study the implications of contracting out on the desired staff skills within BJS.

One way to increase understanding of the trade-offs of different NCVS designs and different contracting models is to seek formal design alternatives from the Census Bureau and others. A formal design competition could be mounted, perhaps through a set of commissioned designs, both from the Census Bureau and other survey methodologists. The designs would be guided by the same goals, articulated by BJS, but would be left to the creativity of the designers. The design options should be costed out in as much detail as possible, and the designs should be critiqued through peer review.

Recommendation 5.8: BJS should consider a survey design competition in order to get a more accurate reading of the feasibility of alternative NCVS redesigns. The design competition should be administered with the assistance of external experts, and the competition should include private organizations under contract and the Census Bureau under an interagency agreement.