3 Improving the Tools and Uses of Policy Analysis
Pages 52-88

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 52...
... Microsimulation models, in our view, offer important capabilities to the policy analysis process: in particular, the ability to evaluate fine-grained as well as broader policy changes from the perspective of their impact on subgroups of the population that are of interest to the user. However, microsimulation models do not serve all policy analysis needs, and the capabilities they provide typically require highly complex model structures and databases that can be resource-intensive for development and use.
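To make the subgroup idea concrete, here is a minimal sketch of a microsimulation in Python; the household fields, the benefit formula, and all parameter values are hypothetical, invented purely for illustration and not drawn from any actual model.

# Minimal illustration of the microsimulation idea described above: apply a
# policy rule to individual records and tabulate impacts by subgroup.
# Record fields and the benefit formula are hypothetical.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Household:
    income: float      # annual income in dollars
    children: int      # number of children
    region: str        # subgroup of interest to the analyst

def benefit(h: Household, guarantee: float, phaseout: float) -> float:
    """Hypothetical cash benefit: a per-child guarantee phased out with income."""
    return max(0.0, guarantee * h.children - phaseout * h.income)

def simulate(households, guarantee, phaseout):
    """Total benefit cost by region under one parameter setting."""
    cost = defaultdict(float)
    for h in households:
        cost[h.region] += benefit(h, guarantee, phaseout)
    return dict(cost)

sample = [
    Household(12_000, 2, "urban"),
    Household(30_000, 1, "rural"),
    Household(8_000, 3, "rural"),
]

baseline = simulate(sample, guarantee=1_000, phaseout=0.10)
reform = simulate(sample, guarantee=1_200, phaseout=0.10)
for region in baseline:
    print(region, reform[region] - baseline[region])  # subgroup-level impact of the change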
From page 53...
... Hence, agencies will benefit from adopting a broad perspective as they consider how best to improve the tools and associated data they need for policy analysis. In framing an investment strategy, agencies confront the fact of continual change in the policy landscape even though the basic concerns of social welfare policy have not changed much in the years since the Great Depression and World War II: for example, the current interest in revamping the nation's patchwork system of health care financing carries echoes of similar debates going back at least as far as the Truman administration.
From page 54...
... Still another worthwhile approach is for the agencies to work for changes in the policy analysis community, to foster wider use of complex models by analysts and researchers, to encourage production of research that is relevant to modeling needs, and to improve upon some of the ways in which agencies have traditionally operated, both individually and as a group. We discuss these approaches in detail with respect to microsimulation models in Part II.
From page 55...
... However, for many extant policy analysis models, the level of quality is simply unknown. In our review, we found that policy analysis agencies have generally skimped on investment in model validation and related activities, such as archiving and documentation, that support validation.
From page 56...
... needed for social welfare policy analysis seems warranted in light of the resources that are at stake. The federal government spends more than $300 billion annually on social insurance programs (including social security, Medicare, unemployment insurance, and workers' compensation)
From page 57...
... With regard to information specifically needed for social welfare policy, we note first that federal and state spending for social insurance and public assistance programs increased by 32 percent from fiscal 1980 to 1988 in real terms (Bureau of the Census, 1991: Table 583).3 In contrast, spending for the statistical agencies that produce relevant data, including the Bureau of Economic Analysis, Bureau of Labor Statistics, Census Bureau, National Center ... all statistical activities of the federal government, including programs of large and small statistical agencies, statistics-related activities of policy research agencies, and the programs of administrative agencies (such as the Immigration and Naturalization Service and the Internal Revenue Service) that generate statistical data as a byproduct of administrative actions, at $1.7 billion, including $0.2 billion for the 1990 decennial census.
From page 58...
... , reductions in survey samples and in the availability of administrative records, and inadequate staff resources are among the factors cited for deterioration in basic economic data series. These deficiencies in the quality and relevance of economic data have had important policy consequences.
From page 59...
... Census Bureau (current programs only; not including censuses)
National Center for Health Statistics
Statistics of Income Division (SOI)
From page 60...
... Inability to provide adequate descriptions of today's complex family structures and relationships has made it increasingly difficult to assess many important policy initiatives for social welfare. Thus, analysis of child support enforcement programs, which offer the potential to reduce government income support costs, is hampered in the absence of joint information on the family circumstances of both the custodial and the noncustodial parents.
From page 61...
... . Some of the proposals that are relevant to social welfare policy analysis data needs include conducting research on measurement of poverty and
From page 62...
... Coordination of Data Production In addition to budget and staffing constraints, the federal statistical system over the past decade has suffered a deterioration in mechanisms for interagency coordination and the ability to draw on and integrate information from a range of databases, particularly administrative records. The consequences have been reduced timeliness, quantity, and quality of policy-relevant data.
From page 63...
... It is important for reasons of both total cost and analytical usefulness that the major surveys on particular topics—health care, income support, retirement income, tax policy—be designed with a broad focus and in ways that facilitate relating the survey data to other survey and administrative data. Cost concerns are frequently used to argue that surveys be focused in terms of subject matter and that they not duplicate topics covered in other surveys.6 While not denying that costs are important, we believe that false economy is frequently introduced in large-scale data collection efforts by not also considering the benefits gained from more inclusive survey strategies.
From page 64...
... Another way in which federal data collection efforts should be broadened concerns the need for data that relate characteristics of individuals and institutions. There is increasing recognition that the success of social welfare policies depends importantly on complex delivery systems.
From page 65...
... We recommend that federal statistical agencies give more attention to data collection strategies that recognize key interactions among individuals and institutions—employers, hospitals, government agencies, and others.
From page 66...
... But in the past decade, many agencies, particularly the Census Bureau, tightened their policies and procedures to protect confidentiality and restrict access to both administrative records and survey data. In so doing, the agencies responded directly to legislation, such as the 1974 Privacy Act and the 1976 Tax Reform Act, that reflected heightened public unease about potential abuses of government data and indirectly to a broader set of concerns about growing disinclination to cooperate with government surveys.13
10For example, Denmark conducted its most recent census by extracting information from administrative data registers rather than by canvassing the population, and Sweden is moving in this direction (Redfern, 1987)
From page 67...
... The consequences are also evident for the quality and breadth of aggregate data series developed by statistical agencies from internal microdata sources. To cite one example, the Census Bureau no longer prepares for public release exact-match files (with all identifiers removed)
From page 68...
... Such mechanisms include:
· setting up "enclaves" whereby statistical agencies could share survey and administrative records; such data sharing would reduce costs and enhance quality even if public access remained limited (e.g., the Census Bureau could use IRS data to improve imputations for missing income data in the CPS and SIPP and to develop improved estimates of the income distribution for publication);
· using sophisticated techniques to mask or blur data values so that microdata files containing survey and administrative data could be made publicly available (a simplified illustration follows this list);
· swearing in analysts as special employees to use confidential data onsite at a statistical agency or in a secured facility; and
· requiring researchers to sign agreements that provide them with access to more complete data sets but also subject them to stiff penalties for any data disclosure that breaches confidentiality.
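As a concrete, if greatly simplified, illustration of the masking idea in the second item above, the following Python sketch perturbs sensitive values with random noise before release. The noise level and the values are invented, and the actual disclosure-limitation methods used by statistical agencies are far more sophisticated.

# Simple sketch of one masking idea: perturb sensitive values with random
# multiplicative noise before public release. Parameters are illustrative only.
import random

def mask_incomes(incomes, relative_noise=0.05, seed=12345):
    """Return incomes with multiplicative noise of roughly +/- relative_noise."""
    rng = random.Random(seed)
    return [x * (1.0 + rng.uniform(-relative_noise, relative_noise)) for x in incomes]

# Hypothetical survey incomes; the released values differ slightly from the originals.
released = mask_incomes([18_500.0, 42_000.0, 7_300.0])
print(released)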
From page 69...
... In addition, we believe there is a need for reallocation of resources within statistical agencies to emphasize analysis and amelioration of data quality problems, together with a realignment of the data production functions of statistical agencies vis-a-vis those of users in policy analysis agencies (and elsewhere)
From page 70...
... For example, in processing surveys like the CPS and SIPP, the Census Bureau has concentrated on such tasks as adjusting the data records for nonresponse by households and individuals and editing item responses for consistency. It has not performed additional data adjustments—such as correcting income amounts for reporting errors or modifying family structure to reflect population coverage errors—that would involve use of administrative records and other external data sources.
From page 71...
... Census data are often used directly for policy analysis and research. In addition, they indirectly affect the quality of many other data sets because of their use in the design of survey samples, in adjusting survey data to match census-derived population controls, as denominators for vital rates and other socioeconomic indicators, and as the basis for postcensal population projections.19
19See our more detailed discussion in Chapter 5 of realigning data production responsibilities between statistical and policy analysis agencies with respect to the Census Bureau's recently announced intentions to develop a database from the March CPS, SIPP, and administrative records that will support an improved system of income statistics.
From page 72...
... Recommendation 3-7. We recommend that the Census Bureau conduct a thorough evaluation of population coverage errors in the major household surveys and decennial census and their potential impacts on policy analysis and research uses of the data.
From page 73...
... Nor is it restricted to social welfare policy issues or to the use of particular analytical tools. It is the exceptional analysis that can be assessed for validity, not the typical analysis.
From page 74...
... For example, there are only a handful of validation studies of microsimulation models (see Cohen, Chapter 7 in Volume II), but a considerable literature evaluating cell-based population projection models (see Grummer-Strawn and Espenshade, in Volume II)
From page 75...
... First, data may never become available on the actual outcome of an analyzed policy. Although any given analysis may consider a range of policy options, the specific policy ultimately enacted is frequently not included among those analyzed: the policy process often begins with a specification of the potential range of policies but ends with a compromise outcome of the legislative debate.
From page 76...
... Moreover, without information to assess uncertainty, policy analysis agencies cannot determine the greatest need for their investment dollars in order to improve the quality of future estimates. They must make resource allocation decisions based largely on instinct rather than on a cumulative body of evidence.
From page 77...
... External Validation One principal technique for model validation, as we have defined it, is "external validation," or assessment of the validity of a model's estimates compared with measures of reality. For example, externally validating the cost and caseload estimates produced during the legislative debate on the Family Support Act would involve comparing them with the corresponding costs and caseloads obtained from AFDC program administrative records after the act took effect.
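A minimal sketch of what such an external validation might look like in practice is given below; the figures are invented placeholders standing in for a model's pre-enactment projections and the outcomes later observed in program administrative records.

# Sketch of external validation as described above: compare projections made
# during the legislative debate with outcomes later observed in administrative
# records. All figures below are invented placeholders.
projected_caseload = {1989: 3_750_000, 1990: 3_800_000}   # model estimates (hypothetical)
actual_caseload = {1989: 3_900_000, 1990: 4_050_000}       # administrative records (hypothetical)

for year in projected_caseload:
    error = actual_caseload[year] - projected_caseload[year]
    pct = 100.0 * error / actual_caseload[year]
    print(f"{year}: projection off by {error:,} cases ({pct:.1f}% of actual)")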
From page 78...
... Internal validation refers to all of the procedures that are part of conducting an intensive step-by-step analysis of how model components work, including the theory behind the various modules, the data used, the computer programming, and the decisions made by the analysts running the model. All aspects of internal validation are important; in the context of our discussion of the measurement of uncertainty in a model's estimates, however, we focus on internal validation techniques—namely, variance estimation and sensitivity analysis—that contribute to such measurement.
From page 79...
... What one gives up when going from a variance estimation methodology to a sensitivity analysis is that the probabilistic mechanism underlying a sensitivity analysis is not rigorously determined. Thus, construction of confidence intervals—a type of formal "error bound" to express the uncertainty
24Another way of learning about deficiencies in a model is to make use of completely different modeling approaches to the entire problem, rather than experimenting with individual components.
From page 80...
... For example, according to the Census Bureau, the estimate of the number of people below the poverty level in 1988, based on the March 1989 Current Population Survey, is 31.9 million, with a 90 percent confidence interval of plus or minus 0.9 million.
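As a check on how such a figure is typically constructed, the arithmetic below works backward from the published interval, assuming the usual normal approximation (critical value 1.645 for a 90 percent interval); the implied standard error is an inference on our part, not a Census Bureau figure.

# Working backward from the published figure above: a 90 percent confidence
# interval of +/- 0.9 million around 31.9 million implies, under a normal
# approximation, a standard error of roughly 0.9 / 1.645.
z_90 = 1.645              # normal critical value for a 90 percent interval
half_width = 0.9          # millions of people
estimate = 31.9           # millions of people

standard_error = half_width / z_90                        # about 0.55 million
interval = (estimate - half_width, estimate + half_width)  # (31.0, 32.8)
print(f"Implied SE: {standard_error:.2f} million; 90% CI = {interval}")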
From page 81...
... Hence, we urge that the heads of policy analysis agencies assume the challenge of working toward the goal of having information on uncertainty available as a matter of course for the estimates their agencies produce. Agency heads can take several actions.
From page 82...
... Information on sources of error obtained from sensitivity analysis, along with the results of external validation, is important for determining the priorities for resources for the improvement of policy analysis tools. The focus of the independent evaluation studies will necessarily be on the feedback process whereby evaluation results give rise to better analysis tools that, in turn, produce better numbers for future policy debates.
From page 83...
... . Sensitivity analysis techniques, in which data inputs and model components are systematically varied in a series of model runs, can also be used to assess the magnitude and major sources of variation.
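A minimal sketch of such a sensitivity analysis follows, assuming a deliberately toy cost model and invented parameter values; the point is only the pattern of repeated runs over alternative settings of an uncertain input.

# Sketch of sensitivity analysis as described above: rerun the same model
# under alternative settings of an uncertain input and report the spread of
# the resulting estimates. The model and all values are hypothetical.

def estimated_cost(participation_rate: float, eligible: int, avg_benefit: float) -> float:
    """Toy cost model: expected participants times average benefit."""
    return participation_rate * eligible * avg_benefit

# Systematically vary one uncertain component (the participation rate).
scenarios = [0.60, 0.70, 0.80]
estimates = [estimated_cost(r, eligible=5_000_000, avg_benefit=2_400.0) for r in scenarios]

low, high = min(estimates), max(estimates)
print(f"Cost estimate ranges from ${low:,.0f} to ${high:,.0f} across scenarios")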
From page 84...
... We recommend that policy analysis agencies routinely provide periodic error analyses of ongoing work.
DOCUMENTATION AND COMMUNICATION OF THE RESULTS OF POLICY ANALYSIS
Documentation and Archiving as Aids to Validation
We turn next to the critical role of good documentation practices for the proper use of models that provide estimates to the policy debate and for evaluation of the quality of their outputs.
From page 85...
... We recommend that policy analysis agencies allocate sufficient resources for complete and understandable documentation of policy analysis tools. We also recommend that, as a matter of standard practice, they require complete documentation of the methodology and procedures used in major policy analyses.
From page 86...
... Articles based on government statistical reports sometimes cite error bounds as well. To make the job easier for the media and other users, the Census Bureau recently instituted a practice in its reports of including error bounds (90% confidence intervals)
From page 87...
... Policy analysts can provide error bounds for their estimates that represent total uncertainty by presenting the widest range obtained through the variety of techniques used in the evaluation, including sensitivity analysis and variance estimation. This approach strongly—perhaps too strongly—communicates the total uncertainty to model users.
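One simple way to operationalize that "widest range" idea is sketched below; the intervals shown are invented placeholders, not results from any actual evaluation.

# Sketch of the "widest range" approach described above: combine the intervals
# produced by different evaluation techniques into one conservative bound.
intervals = {
    "variance estimation": (30.8, 32.6),    # hypothetical 90% interval, millions
    "sensitivity analysis": (30.2, 33.1),   # hypothetical range across model runs
}
low = min(lo for lo, _ in intervals.values())
high = max(hi for _, hi in intervals.values())
print(f"Reported total-uncertainty bound: {low} to {high} million")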
From page 88...
... Finally, decision makers can make use of measures of uncertainty to help judge the utility of allocating additional resources to the improvement of policy analysis models and databases. Today, many policy makers recognize the growing deficiencies in data series and modeling tools that support the policy analysis function, but they are not able to relate those problems to the quality of the resulting estimates of costs and distributional effects that are of concern in the policy debate.

