Improving Information for Social Policy Decisions—The Uses of Microsimulation Modeling: Volume II, Technical Papers

Computing Technology
5  Future Computing Environments for Microsimulation Modeling

Paul Cotton and George Sadowsky

INTRODUCTION

Microanalytic simulation models have been used worldwide in a variety of contexts for the past 25 years to assess the impact of alternative economic and social programs. In particular, the analysis of tax and transfer systems applied to families and individuals now depends critically on the construction and evolution of such models, and their use is routine and expected in public agencies and research organizations that address these issues.

Microanalytic simulation models of nontrivial size or complexity have relied for solution on the use of digital computers. The ability to use these models in a practical manner has depended on rapid technical progress in the computing industry. This progress has allowed the complexity of microanalytic models to increase and the costs associated with a specific simulation experiment to decrease substantially over time.1

Here the term microsimulation model describes some universe of elements solved under a variety of conditions by using computer-based microsimulation techniques. In the context of social and demographic analysis, the microanalytic units of such models are individuals and groupings of individuals, such as families or households.

This chapter addresses the impact that computing technology has had and will have, in the medium term, on the conceptualization, design, implementation, and use of microanalytic simulation modeling. Two areas are studied in detail: (1) the current state of computer systems that support static microsimulation models, with specific reference to the TRIM2 (Lewis and Michel, 1990) and SPSD/M (Statistics Canada, 1989a) models, and (2) advances in computing technology anticipated in the medium term (1990–1995) and their implications for additional investment in microanalytic simulation models, with special attention given to static models and the computer systems that support them.

This chapter also provides an introduction to microanalytic simulation models in terms of their characteristics and the history of their development. The TRIM2 and SPSD/M models, their histories, and model support systems are described and their characteristics are compared. An assessment is made of current computer hardware trends, with emphasis on desktop computing environments that are likely to be available to support such modeling activity in the medium term. Factors affecting the demand for and availability of microanalytic models also are assessed, with special emphasis on shifts in the production function through software advances and the ability to exploit future desktop computing environment characteristics. Finally, an overall assessment is presented of alternatives for investing in the evolution of TRIM2, as well as recommendations for investment planning for future microanalytic simulation model developments in general.

Because the focus of this chapter is on examining and comparing a U.S. microsimulation model and a Canadian one, much of the development of microanalytic simulation models outside North America is not covered here. Readers are referred to Orcutt, Merz, and Quinke (1986) for information on the state of activity in Europe.

Paul Cotton is senior technical advisor at Fulcrum Technologies, Inc., in Ottawa, Canada; George Sadowsky is director of the Academic Computing Facility at New York University. The authors gratefully acknowledge the assistance of a number of colleagues in contributing to the improvement of the first draft of the chapter. The authors note that most of the analysis for this chapter was done during 1989. Since then, some of the microsimulation models we describe have been revised, and some of the computing developments that we forecast have begun to happen.

1 The costs of performing a simulation experiment are distributed over several areas and include formulating the underlying microanalytic model or revising its components and setting up the specific simulation exercise, executing it, and analyzing its results. Although the actual computer-based simulation portion has decreased substantially in cost in the past 30 years, the cost and turnaround time of this step are central in that they determine the feasibility and scope of studies that can be attempted. Overall costs of microsimulation activities are increasingly dominated by the cost of research, programming, support, and operations related to microsimulation, not by computing costs.

Characteristics of Microanalytic Simulation Models

Layered Structure

It is useful to think of microanalytic simulation models as being composed of a number of related but distinct components that to some extent can be organized into layers. The component approach is useful in separating underlying knowledge, procedural modeling, and computer-based implementation aspects of a model.2 Our suggested components and their structure are as follows:

1. Knowledge regarding the world and how it works, from either a theoretical or an empirical basis, having two distinct parts:
   - the substantive social science and related knowledge in demography, economics, and other disciplines underlying the content of the socioeconomic model, and
   - a representation of the population of microanalytic units that will be used as a basis for simulation exercises.

2. The process of defining, organizing, and shaping this knowledge into a procedural and computer-executable form, including:
   - defining procedures, or operating characteristics,3 for each component of the model, that describe how the knowledge is to be applied to microanalytic simulation units to derive their behavior under alternative assumptions;4
   - coding each operating characteristic into a computer-executable module that applies that procedure to a specific micropopulation unit used as the basis for the simulation exercise; and
   - preparing the population microdata for simulation, including precise definition through a data dictionary (or equivalent) and possibly including physical medium transformation, record and file reformatting, subset extraction, value remapping, demographic and economic aging of the data, and similar operations.
3. The application computer system that provides a framework for integrating the operating characteristics into a single module that will execute complete simulation exercises, including:
   - the interface(s) seen and used for model construction and execution;
   - the supervisor program that invokes and sequences the collection of operating characteristics;
   - the machine-readable versions of the operating characteristics;
   - the input and output programs for manipulating files, including the large population data files; and
   - the data dictionary and related files that define population microdata and, possibly, parameters, aggregate time series, and other related entity types.

4. The computing environment in which the simulation exercises are defined and executed, including:
   - the hardware used—the processor(s), primary and secondary memory, and input and output devices;
   - the system software used to support the simulation system;
   - the network configuration used if more than one computer system is used to support the simulation system, or if the application uses linked comodels on different systems; and
   - the program development and support tools that the system software supports to allow program construction, modification, and testing.

The separation of microanalytic simulation models into such components is useful in that the intellectual, human, and physical capital associated with a given model is distributed among these various parts, and knowledge of this distribution leads to a better understanding of a given model's flexibility and adaptability to change. In addition, the extent to which existing and previous models have either benefited or been limited by current and past investment policies in each of these areas is likely to lead to a better choice of investment policies in the future.

2 The components and their structure presented here are somewhat simplified. See Sadowsky (1977:6–9) for a more detailed decomposition of the structure and activities associated with another model, MASH.

3 The term operating characteristics was introduced by Orcutt et al. (1961) and is adopted here. Operating characteristics consist of models of the specific behavioral or structural responses by a microunit to changes in its external environment. They depend on inputs such as the existing state of the microunit as well as the state of the environment that affects the unit; they produce outputs, or changes of state, in the microentity and possibly in other microunits in the population and in the state of the environment. They can be deterministic or stochastic. Their outputs can include state information for present, future, or past time periods.

4 Such a change may represent a simulated behavioral decision by the entity, such as a decision to change portfolio composition because of changes in the market rate of return structure, or it may be deterministic, such as recalculation of a tax unit's liability based on revised tax legislation.

Single Period Versus Future Projections

While microanalytic simulation models implicitly have an element of future prediction in them, this aspect is not necessary to the concept of such modeling. Initial simulations of the effects of changes in tax policy computed changes based on the state of the population in the base year only, although later work included projections of income, expenses, and tax unit population sizes.
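Such a base-year calculation reduces to applying an accounting rule to every weighted microunit and comparing weighted aggregates under current and proposed parameters. As a purely illustrative sketch (in Python; the records, weights, threshold, and rate below are invented for this example and are not taken from any model discussed in this chapter):

```python
# Illustrative static accounting simulation: apply a hypothetical flat-tax
# rule to each weighted microunit under two parameter sets and compare totals.

def tax(income, threshold, rate):
    """Deterministic operating characteristic: tax owed by one unit."""
    return max(0.0, income - threshold) * rate

# A toy micropopulation: (survey weight, annual income) pairs.
population = [(1500.0, 12_000.0), (900.0, 35_000.0), (600.0, 80_000.0)]

baseline = {"threshold": 10_000.0, "rate": 0.20}   # invented "current law"
reform   = {"threshold": 15_000.0, "rate": 0.25}   # invented "proposed law"

def total_revenue(params):
    """Weighted aggregate over the micropopulation."""
    return sum(w * tax(y, params["threshold"], params["rate"])
               for w, y in population)

change = total_revenue(reform) - total_revenue(baseline)
```

Real systems differ mainly in scale (hundreds of thousands of records and parameters) and in the detail of the accounting rules, not in this basic structure.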
When such models have neither behavioral change nor future projection elements, they are sometimes referred to as static accounting models, since the simulation reduces to a set of accounting rules applied to a static population database. However, while the notion of future projection is not a necessary part of a microanalytic simulation model, the use of such models in a public policy setting is oriented toward providing useful estimates of the future impact of alternative programs and legislation. Such a process is often referred to as aging the population, so that it will be representative of future years and will take into account a predicted pattern of social, demographic, and economic changes.5

The effects of time passing can be incorporated entirely outside the model by scaling the results of simulations to match independent estimates of present or future aggregate demographic composition and economic activity, or they can be incorporated in a variety of ways within the simulation model, using data at either the micro or the macro level. Since an accurate representation of all of the factors involved in providing accurate estimates from a microsimulation is probably beyond the scope of what is possible today with microanalytic models, such models generally use a mix of techniques.6

5 Aging a population can have several dimensions. Demographic aging generally means applying rules for modifying population weights over time. Economic aging involves applying rules for modifying a set of economic variables over time. Both sets of rules apply to individual units in the initial micropopulation. Application of these rules during the progress of the simulation exercise is performed such that key aggregates produced will match control totals that have been defined using methods independent of the simulation. This process may be thought of as a complex normalization process.

6 A good discussion of the mix of techniques used by various models is contained in Devine and Wertheimer (1981).

Static Versus Dynamic Simulation

Microsimulation models are often categorized as static or dynamic with respect to the method they use to predict future outcomes. Static models are characterized by a lack of direct interaction of microanalytic units within the context of the model during the time period simulated. Static models rely on a combination of time-dependent weighting of the micropopulation units and application of normalization factors from external sources to attributes of each micropopulation unit. Dynamic models are characterized by varying degrees of direct interaction between micropopulation units within the simulation process. Such interaction includes the birth, death, and recombination of micropopulation units in a manner intended to simulate accurately those processes in the entire population. Dynamic microanalytic models rely on an accurate knowledge of the dynamics of such interactions. Models that support such general interaction patterns between microunits must carefully sequence the application of operating characteristics to individual micropopulation units so that such interactions are consistent in simulated time.

The choice of model used is determined to a large extent by the external requirement for its creation or use. Broad social science research objectives often dictate very general model structures that can evolve in a very flexible manner, tracking intermediate results and decisions in the research process. In contrast, many uses of socioeconomic microsimulation models arise from specific policy initiatives and executive or legislative processes that are more limited in scope and more detailed in focus. In general, dynamic models have arisen from the former area, whereas static models have found greater acceptance in addressing specific legislative issues.
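The weight-based static aging described in note 5 can be sketched in the same illustrative spirit (Python; the demographic cells and control totals are invented): weights within each cell are scaled so that weighted population counts match externally projected totals.

```python
# Illustrative static aging: scale survey weights within demographic cells so
# that weighted population counts match externally projected control totals.

# Toy base-year weights by age group (cell name -> list of unit weights).
weights = {"under_65": [1200.0, 800.0, 1000.0],
           "65_plus":  [500.0, 500.0]}

# Hypothetical projected population counts for the simulation year.
controls = {"under_65": 3300.0, "65_plus": 1200.0}

def age_weights(weights, controls):
    """Return new weights whose cell totals equal the control totals."""
    aged = {}
    for cell, ws in weights.items():
        factor = controls[cell] / sum(ws)     # normalization factor per cell
        aged[cell] = [w * factor for w in ws]
    return aged

aged = age_weights(weights, controls)
```

Production models typically adjust against many overlapping control totals at once (for example, by age, region, and income class), which requires iterative fitting rather than the single-pass scaling shown here.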
Static models can be deterministic or stochastic. Stochastic or probabilistic models have operating characteristics that determine specific outcomes on the basis of probability distributions and use pseudorandom number generators to select those outcomes. Dynamic models require some stochastic processes in order to provide a meaningful representation of some of the processes being modeled. Deterministic models contain operating characteristics that have fixed rules, such as are embodied in an income tax algorithm, with no variability due to chance selection. In practice, most models contain a mix of deterministic and stochastic operating characteristics.7

Static models do not require solutions for simulated future time periods, although they are almost always used in this way. Dynamic models have little meaning unless their solution provides a number of results over the course of simulated time. There is general acceptance that dynamic models provide a more realistic representation of micropopulation unit behavior. However, static models are regarded as more effective at times for specific short-run projection purposes because of their greater simplicity and the often lower costs associated with building such models and obtaining computer-generated model solutions.

7 For example, while an algorithm ascertaining eligibility for a specific public assistance transfer program may be completely deterministic, a user may choose to assume that only a percentage of those eligible (less than 100%) actually choose to participate in the program. Such an operating characteristic has a stochastic condition for participation, with a deterministic rule applied for those who do participate.

Historical Background

The path-breaking work that created the field of socioeconomic microsimulation—Microanalysis of Socioeconomic Systems: A Simulation Study (1961)—was performed by Guy Orcutt and his colleagues in the late 1950s. The underlying behavioral model was dynamic and stochastic, and the simulation system was implemented in assembly language on an IBM 704 computer system.

Initial public policy analysis based on this methodology was first applied to the federal individual income tax system in the United States (Pechman, 1965) and Canada (Bossons, 1967). Initial versions of these models were strictly static accounting models that embodied neither behavioral assumptions nor forward projections in time. Both models were implemented in FORTRAN using IBM 7090/94 and System 360 computers, respectively, and both have been widely used. Additional Canadian models were subsequently built, using similar methodology, for tax reform analysis for the province of Ontario and to study tax and transfer program integration for the province of Quebec.

Another initial use of microanalytic simulation methodology was to project the economic status of retired older persons (Schulz, 1968). The underlying microanalytic model was dynamic and stochastic, and it emphasized labor market participation and accrual of private and public pension rights.

Interest in the late 1960s in welfare reform and negative income tax proposals led to the creation of the RIM model (Wilensky, 1970) for use by the President's Commission on Income Maintenance in the United States. RIM embodied a static model used to project the effects of alternative tax and transfer policies on families in the United States. Its success in supporting the work of the commission led to the development of TRIM in 1971 and TRIM2 in 1979 to support continued exploration of tax and transfer policy alternatives focused on the lower end of the income distribution. RIM was implemented on a Control Data computer system, while TRIM and TRIM2 have largely been implemented in FORTRAN on IBM System 360/370/3090 computing environments.

At the same time, another approach was undertaken under the leadership of Guy Orcutt in the 1970s to develop DYNASIM, a dynamic microanalytic simulation model embodying expanded household-sector submodels (Orcutt et al., 1976). The initial underlying computer system, MASH, was written in FORTRAN for a DECsystem-10 (Sadowsky, 1977). A later implementation, MASS, was created at Yale University by Amihai Glazer and his colleagues in PL/I for an IBM System 370. In 1981 the DYNASIM model was reimplemented as DYNASIM2 for reasons of efficiency and portability.

The development of TRIM2 spawned several other microeconomic modeling developments. MATH (Doyle and Neyland, 1979), a model based on TRIM, was developed by Mathematica Policy Research, Inc., to provide more precise estimates of U.S. Department of Agriculture transfers in kind, such as the food stamp program.8 The KGB model (Betson, Greenberg, and Kasten, 1980) was developed by the U.S. Department of Health and Human Services to provide official cost and distributional analyses for President Carter's major welfare reform initiative in 1977. Independently, ICF, Inc., developed HITSM, a proprietary model used by, inter alia, the Office of the Assistant Secretary for Planning and Evaluation in 1987 to estimate the impact of proposed legislation regarding the Aid to Families with Dependent Children program.

More recently, Statistics Canada initiated a substantial modeling effort and has developed SPSD/M (Statistics Canada, 1989a), a microanalytic model of the Canadian household sector, on an MS-DOS-based microcomputer platform. SPSD/M is a static model and is oriented toward assessment of the revenue and distributional effects of Canadian household tax and transfer policies.

Further development of dynamic socioeconomic simulation models has continued at Cornell University under Steve Caldwell. The Cornell CORSIM model is written in C language and is currently being used on an IBM System 3090–600 mainframe and on MS-DOS microcomputers. CORSIM exemplifies a new generation of mixed microsimulation modeling systems that use both dynamic and static aging. Core attributes, for which sufficient dynamic knowledge exists, are aged dynamically. Noncore attributes are adjusted using appropriate cross-sectional joint distributions.

8 See Lewis and Michel (1990) for a discussion of the development and use of the MATH model.

CURRENT STATIC SOCIOECONOMIC MICROANALYTIC SIMULATION MODELS

Overview

During the past 25 years, socioeconomic microanalytic simulation models have been widely used to study the effects of alternative individual tax and transfer policies. The use of such models was encouraged by the relatively deterministic nature of the programs that implemented such policies.9

Static models have consistently been used to perform such studies. The choice is supported by the following observations: (1) the time horizon for such studies is generally relatively short; (2) the operating characteristics of greatest interest are detailed and deterministic in nature; and (3) the methods of growing the different dimensions of the micropopulation, although independent, are all intuitively straightforward and easily explained to policy makers.

Two computer-based static modeling systems now being used to conduct such studies are SPSD/M in Canada and TRIM2 in the United States. In this section these two systems are described and compared.

SPSD/M

History

The Social Policy Simulation Database/Model (SPSD/M) was developed in 1986 by the Social and Economic Studies Division of Statistics Canada. A beta test version of SPSD/M was completed in 1987, and the first publicly available version was completed in November 1988 (see Availability, Customer Base, and Technical Support below). Before SPSD/M existed, a few departments of the Canadian government had a virtual monopoly on the ability to do detailed analyses of the impacts of tax and transfer policy changes.
These departments use models that were developed independently and that are substantially nonoverlapping in their capabilities. For example, the Department of Finance uses a microsimulation model that can recompute income tax liabilities for a sample of taxpayers, while the Department of Employment and Immigration has its own microsimulation model for the federal unemployment insurance system based on a sample of its own administrative data files.

9 The implicit behavioral assumption underlying many such models is that individuals and tax and transfer units have knowledge of the programs that apply to them and will take maximum financial advantage of them. For some tax/transfer programs where it is known that all eligible units do not participate, a participation or takeup function is used to determine which units receive program benefits.

SPSD/M was constructed to provide a single integrated framework to model personal income tax, unemployment insurance, major transfer programs,10 and commodity taxes. It was created by combining individual administrative data from personal income tax returns and unemployment insurance claimants' histories with survey data on family incomes and expenditure patterns (see Database Creation below).

The software used initially to create the SPSD and a prototype version of SPSM was written using the Statistical Analysis System (SAS) statistical package on Statistics Canada's IBM System 3090 mainframe computer. To ensure the widest possible distribution of SPSD/M, a second version of the modeling software was written in C language (Kernighan and Ritchie, 1978) and was implemented under MS-DOS11 on microcomputer systems having an Intel 80x86 microprocessor (Cotton, 1986). During the rewrite of SPSM, a new compressed data format had to be designed for the SPSD since the SAS version occupied more disk space than was commonly available on MS-DOS microcomputers at the time (see Database Structure and Size below).

Database Creation

The 1984 SPSD was constructed from four 1984 sources of microdata:

- The Survey of Consumer Finances (SCF): Statistics Canada's main source of data on the distribution of income among individuals and families, which served as the host data set. The SCF is rich in data on family structure and income sources, but it lacks detailed information on unemployment history, tax deductions, and consumer expenditures. The 1984 SCF surveyed 34,000 households.
- Personal income tax return data: The 3 percent sample (380,000) of personal income tax returns used as the basis of Revenue Canada's annual Taxation Statistics (Green Book) publication.
- Unemployment insurance (UI) claim histories: A specially drawn 1 percent sample of histories (about 33,000 records) from the Ministry of Employment and Immigration's administrative system.
- The Family Expenditure Survey (FAMEX): Statistics Canada's periodic survey of very detailed data on Canadian income and expenditure patterns at the

10 Income related to pensions and welfare is not modeled but instead is based on actual data from the sources used to create the SPSD.

11 The de facto standard for such microcomputers was initially set by the IBM Corp.; however, many firms now manufacture and sell compatible computer systems that support an MS-DOS operating system environment. The bulk of the SPSM implementation was performed on a Compaq Deskpro 386/20 or equivalent machine.
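The participation (take-up) behavior described in notes 7 and 9, in which eligibility is a deterministic rule but only a fraction of eligible units receive benefits, can be sketched as follows (illustrative Python; the income cutoffs, take-up rate, and benefit amount are all invented for this example):

```python
import random

# Illustrative take-up function: eligibility is a fixed rule, but only a
# fraction of eligible units (a hypothetical 70% here) receive benefits.

TAKEUP_RATE = 0.70

def eligible(income, family_size):
    """Deterministic rule: hypothetical income cutoff scaled by family size."""
    return income < 8_000.0 + 2_000.0 * family_size

def simulate_benefits(units, rng):
    """Apply the stochastic take-up decision to each eligible unit."""
    benefits = []
    for income, family_size in units:
        if eligible(income, family_size) and rng.random() < TAKEUP_RATE:
            benefits.append(1_000.0)   # flat benefit, purely illustrative
        else:
            benefits.append(0.0)
    return benefits

units = [(9_000.0, 1), (9_000.0, 3), (30_000.0, 2)]
benefits = simulate_benefits(units, random.Random(1))
```

A seeded pseudorandom generator keeps runs reproducible, which matters when comparing a baseline run and a reform run of the same stochastic model.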
* * *
…public and private parts of the directory; it was originally designed to be used as a central repository for all TRIM2 activity. If multiple copies of TRIM2 were distributed among multiple desktop computer systems, copies of the central directory would also have to be distributed to those systems. Over time the directories could diverge because of different types of activity on each system. It would not matter if the activities of the users were independent. However, if the various users were to cooperate with each other on joint projects, the likely result of attempting to join entities referenced in independent, inconsistent directories would be chaos. Since the current model of TRIM2 use includes a number of users in different physical locations using a single CTD, once the divergence in directories began, there would be uncertain ability to combine results of runs and therefore little benefit in decentralizing the operation.

The IBM mainframe version of TRIM2 still contains input/output routines written in assembly language. While this implementation decision reflects relatively inefficient FORTRAN input/output code in the late 1970s, FORTRAN input and output operations are still relatively inefficient. Given the dependence of TRIM2's efficiency on fast sequential input and output, an additional cost of conversion to any other platform (as was the case with the VAX implementation) may be either significantly slower input and output or a laborious assembly-language-to-assembly-language coding task.

The basic difficulty in extracting benefits from a desktop version of TRIM2 goes back to the notion of TRIM2 as a piece of capital embodying the technology of the late 1970s as well as the batch technology of the late 1960s. The software architectures of RIM, TRIM, TRIM2, and most if not all of the variants of these systems have all been optimized to run on a central mainframe computing system that relies primarily on batch processing. Moving these systems into a very different environment minimizes their operational strengths and exposes their lack of ability to exploit the new environment.

The benefits of porting TRIM2 in its present form as suggested above are moderate at best and justify neither the real costs involved nor the opportunity costs of preempting investment resources that could better be used elsewhere. To the extent that there are benefits in using desktop systems for extensions of TRIM2, they are more oriented toward increasing programmer productivity than assisting user effectiveness. A case can be made for investing in some front-end tools that would help programmers modify and manage the existing TRIM2 system better. Such tools can and should probably be desktop based and linked by network connection to the TRIM2 mainframe host as appropriate. Even though such investments are likely to be short term, they can assist productivity in the short run, which should not be ignored. The underlying model of operation being suggested is one of specialization of function, with the front and back ends of the operation split, specialized, and relinked for the purpose of increasing programming and system management productivity.
Recommendation 1. TRIM2 should not be moved to a microcomputer environment. The investment would be sizable without corresponding short-term benefits. Medium-term benefits might be negative in terms of preempting more efficient longer-run strategies. Investment in TRIM2 should remain incremental and should track current areas of policy evaluation interest.

Medium-Term Issues

Turning to medium-term issues, our first question is: What are the characteristics of environments in which next-generation microanalytic simulation systems should be implemented for maximum utility? In the medium run, we believe that the arguments for implementing new microanalytic simulation system software on desktop systems are compelling and overwhelming, as discussed earlier. Major reasons are as follows:

- Both hardware and software capabilities on desktop systems now equal if not surpass those offered by mainframe environments in most important respects.
- The rate of technical progress and innovation in desktop environments is both substantial and sustainable. In the medium term (1990–1995), desktop systems will be appropriate platforms for efficiently implementing microsimulation models that are regarded as large scale today.
- The synergy of desktop environments allows specialization of function without negatively affecting the user's ability to exploit an integrated environment.
- Interactive user interfaces, dedicated hardware, and tightly integrated bit-mapped graphics are an integral part of quality user environments and are increasingly better and more readily available in desktop systems than in mainframe environments.

Recommendation 2. Medium- and long-term investment in computing environments for microanalytic simulation activities should focus on systems implemented on desktop hardware platforms.
Another question is: What should the next-generation system for implementing microanalytic simulation models look like, and how should its design be approached? We believe that current developments in graphical user interfaces, CASE tools, object-oriented systems, and desktop development environments will provide a radically more powerful and effective environment. Our reasons are briefly discussed below. Graphical user interfaces have been commonly available on desktop systems only for the past 5 years, starting with Apple’s Macintosh Operating System. During this time a variety of such interfaces have proliferated in the
MS-DOS and UNIX workstation world. The computing industry is now gaining substantial experience in how to implement and use such interfaces effectively at both the systems and applications software levels. Existing interfaces can now be exploited for the next generation of microanalytic simulation model systems. CASE tools, object-oriented systems, and desktop development environments lag behind graphical user interfaces in maturity, but not by much. The recent explosion in CASE tools indicates that these markets are becoming established and available for exploitation. It is now time to consider seriously using these emerging tools to change in a fundamental way how microanalytic models are constructed and simulated. The availability of such a rich set of techniques means that a user-oriented approach can be taken in designing the specifications for the next generation of microanalytic simulation modeling tools. Rather than letting the specification be driven by established computing patterns, it should be possible to design specifications for a user interface, including model construction and simulation activities, that have an object and action orientation and contain the objects and actions that are the basic elements of microanalytic simulation. While it may not be possible to meet such specifications exactly, within the next few years it should be possible to approach them closely enough for practical purposes. Such specifications are, of necessity, an interdisciplinary effort. Formulating them must be the joint responsibility of microanalytic modeling specialists and computer industry specialists.
The computing specialists can provide paradigms of what is possible with the new evolving software tools, and the modeling specialists can use the set of possibilities to construct a computer-based, object-oriented environment that best represents their activities in constructing, refining, and using microanalytic simulation models. For the reasons given above, neither the current TRIM2 environment nor the current SPSD/M environment is an appropriate base on which to build a next-generation model. Neither simulation system was built from a user-oriented perspective (i.e., starting with a user interface and working from it to the underlying system structure). Such an interface could be retrofitted to SPSD/M, but the existing rigidities and limitations of the system's structure would limit the functionality of the retrofitted interface. And while such an interface could in theory be retrofitted to TRIM2, the functionality of the fit would be limited and would serve as neither a good basis nor a good example for new investment in a system for microanalytic simulation. While any new specification will inevitably be influenced by what users can currently do with existing systems, it should be free of the implementation histories that old systems necessarily carry with them.

Recommendation 3. An in-depth, medium-term study should be initiated to define over the next 1–2 years a next-generation computing environment for supporting microanalytic simulation modeling activities. Specification of such
a system should be oriented to the objects used and the actions taken by system users.

Still another question is: How can the capital embedded in current microanalytic simulation models be retained as we move to new computing environments? A substantial part of the capital embedded in TRIM2 is contained in the large body of complex code that describes in detail the many tax and transfer programs and rules that have formed the basis for past evaluations. If this capital could not eventually be moved to a new system that supports microsimulation modeling activities, it would have to be recreated. The required reinvestment would be very large and should be avoided at all costs. Both existing and any new microanalytic simulation systems ultimately depend on executing a bottom layer of assembled computer program modules, no matter how those modules are specified, bound together, and invoked at higher levels. It therefore may be possible to convert TRIM2 code into modules that can be used as the fundamental low-level building blocks of models implemented in the next-generation system. Even if it is possible, such a conversion will not fall out easily from the system specification process. Nevertheless, the amount of capital involved in the TRIM2 code (and perhaps other simulation model code) is such that conversion of the code for use in the new system is highly desirable. Work to explore the feasibility and cost of this course of action must proceed in parallel with the system specification process.

Recommendation 4. The medium-term next-generation system specification study should assess how to move the capital embodied in the TRIM2 model program modules to the new environment in a verifiably functionally equivalent form.
Our last question is: Can the occasion of the recommended move be used to fashion more general systems that would span more types of models and have longer effective lifetimes? Another cycle of significant investment in application systems code to support microanalytic simulation seems clearly in order, to be made after a thorough study and specification of such a system from a user perspective, as discussed above. Given the importance of the methodology and the limited opportunities to obtain funding for an effort de novo, if an investment is made, it is likely to be the one major investment of the decade. Thus, it is important to ensure that the system created is as general as possible and spans the implementation of as broad a class of microanalytic simulation models as possible. Future obsolescence is a threat to every system, and, for a given level of investment, the lifetime of the resulting system should be maximized. In this regard, some conceptual and definitional progress would be very helpful with respect to what is referred to as aging of the
micropopulation. The term aging refers to actions taken, generally at the microunit level, to project the unit forward in time by a specific period, usually 1 year at a time. The actions taken depend on the nature of the model and the attributes being projected forward. The two main classes of attributes requiring projection are demographic and economic. Core demographic attributes are birth, death, age, and marital status. Static aging techniques, such as those used in TRIM2, implement aging by changing the population weight associated with each micropopulation unit so that the distribution of core attributes for the population as a whole corresponds to a set of control distributions derived from sources external to the microsimulation model. Such static aging techniques do not change the values of any attributes in the record describing the micropopulation unit. Dynamic aging techniques, such as those used in DYNASIM (Orcutt et al., 1976), change the actual record structure and content of the micropopulation file in a manner believed to be consistent with reality, based on the demographic operating characteristics used; units are created and destroyed to simulate birth and death, relinked to simulate family formation and dissolution, and aged by incrementing individual age variables. A variety of economic attributes are often aged, or projected forward in time, by applying growth assumptions for the years over which the projection is made. Such projections account for changes in economic status from one year to the next.
For static simulation, projection of economic variables generally relies on assumptions about the level of economic activity that are exogenous to the microsimulation model; dynamic simulation operating characteristics would be likely to regard the level of economic variables for individuals as at least partially endogenous to the model, affected by such variables as age, education level, occupation, and industry. Aging techniques can be applied at either of two stages of microanalytic model simulation activity. They are often applied at initial population creation time, when the initial population is projected forward from the date at which the data were collected to the date at which the simulation experiment is to begin. The natural lag between data collection and availability makes aging at this stage necessary unless exercises are structured to begin simulation at the point in time when the data were collected. Such aging steps are often combined with initial adjustments to the microdata file to compensate for deficiencies in the data collection process, or with file augmentation through imputation or exact or stochastic record linkages. Aging techniques can also be applied at simulation time (i.e., during the course of the simulation exercise), and generally are in most such exercises. Assuming that the exercise concerns itself with future projections (which most do), application of these techniques provides the raison d'être for the simulation exercise.
Casual discussions of the role of aging in microanalytic simulation often do not differentiate between the type of projection used and the time at which it is used. It is our opinion that this lack of precision has created an artificial barrier: from the viewpoint of system implementation, the similarities of static and dynamic aging techniques matter more than their differences. By reconsidering the structural implementation of the various aging or projection activities within microsimulation models, it may be possible to create an overall system structure for microsimulation that can incorporate both static and dynamic models and therefore increase the return on system investment. There are two reasons for adopting this approach. First, most microanalytic simulation models do not use purely static or purely dynamic techniques. As reported by Devine and Wertheimer (1981:6):

Virtually all of the major microsimulation models summarized earlier [TRIM, MATH, KGB, STATS, DYNASIM/MASH, DYNASIM/MASS, OTA's personal income tax model, and HHS's health care financing model] contain elements of both the static and the dynamic approach. For example, although TRIM relies primarily on static aging techniques, unemployment and labor force participation are both adjusted using a dynamic technique. Conversely, although DYNASIM relies primarily on dynamic aging techniques, immigration is handled entirely through static aging, and other static techniques can be applied if desired by the user.

Thus, the capability to implement a heterogeneous aging strategy would be exploited by most models, including TRIM2. Second, investment in a microsimulation software system that is capable of supporting both static and dynamic models is an efficient investment, provided that the overhead involved in satisfying both needs is low.
From a consideration of the run sequence in TRIM2 and the simulation agenda in MASH, we believe that if aging at simulation time within TRIM2 were made an explicit entry in its run sequence, the execution pattern of TRIM2's run sequence would parallel that of the simulation agenda in MASH, and both could be mapped into a common software framework. In that case, all of the investment in graphical user interfaces, in linkages with complementary programs on the same hardware platform, and in the other benefits of the new system to be constructed would automatically benefit static, dynamic, and mixed models implemented within the context of the software system. Further, implementation within the same system could lead to the ability to compare the outputs of models of different types and orientations, running within a common system structure and using common micropopulations, aggregate time series data, and common parameter libraries.

Recommendation 5. The imprecise treatment of the notion of aging needs conceptual and definitional attention. A better conceptual framework would
allow aging components to be implemented in such a way that static, dynamic, and mixed models could be implemented within the same software system framework for performing microanalytic simulation experiments. This issue should receive attention, and the results should be used in the specification of the next-generation software construction recommended above.

Discussion of the issues raised in this chapter represents only a beginning, although it may be a sufficient beginning toward understanding some of the issues involved in strategic planning for future microanalytic simulation systems. The recommendations above are among the most important that we wish to make. Beyond these primary conclusions, there are a variety of further implications that could be drawn from the discussion and analysis presented above and that merit further work. We believe that the issues involved are sufficiently important to merit that work, and we look forward to its continuation by parties interested in all aspects of the field.

REFERENCES AND BIBLIOGRAPHY

Balderston, F.E., and Hoggatt, A.C. 1962 Simulation of Market Processes. Institute of Business and Economic Research. Berkeley: University of California.

Bell, Gordon 1989 The future of high performance computers in science and engineering. Communications of the ACM 32(9):1091–1101.

Bergsman, Anne 1989 TRIM2 CPS Codebook. Project Report 3826–01. Washington, D.C.: The Urban Institute Press.

Bergsman, Anne, et al. 1975 TRIM User's Guide. Working Paper 718–3. Washington, D.C.: The Urban Institute Press.

Betson, D., Greenberg, D., and Kasten, R. 1980 A microsimulation model for analyzing alternative welfare reform proposals: An application to the Program for Better Jobs and Income. Pp. 153–188 in R. Haveman and K. Hollenbeck, eds., Microeconomic Simulation Models for Public Policy Analysis. Vol. I.
Distributional Impacts. New York: Academic Press.

Bossons, J. 1967 Studies of the Royal Commission on Taxation, Number 25: A General Income Tax Analyzer. Ottawa: Queen's Printer.

Cartwright, David W. 1986 Improved deflation of purchases of computers. Pp. 7–10 in Survey of Current Business. Bureau of Economic Analysis. Washington, D.C.: U.S. Department of Commerce.

Cartwright, David W., and Smith, Scott D. 1988 Deflators for purchases of computers in GNP: Revised and extended estimates, 1983–88. Pp. 22–23 in Survey of Current Business. Bureau of Economic Analysis. Washington, D.C.: U.S. Department of Commerce.
Cole, R., Chen, Y.C., Barquin-Stolleman, J.A., Dulberger, E., Helvacian, N., and Hodge, J.H. 1986 Quality-adjusted price indexes for computer processors and selected peripheral equipment. Pp. 41–49 in Survey of Current Business. Bureau of Economic Analysis. Washington, D.C.: U.S. Department of Commerce.

Cotton, P. 1986 Options for Creating a Portable Modelling System for Social Policy Simulation Database and Model. Internal Report. Social and Economics Studies Division, Statistics Canada, Ottawa.

Cotton, P., Turner, M.J., and Hammond, R. 1979 RAPID: A data base management system for statistical application. In Proceedings of the 42nd Biennial Congress. Paris: International Statistical Institute.

Devine, Judith, and Wertheimer, Richard 1981 Aging Techniques Used by the Major Microsimulation Models. Working Paper 1453–01. Washington, D.C.: The Urban Institute Press.

Doyle, Pat, and Neyland, Kevin, eds. 1979 The MATH Technical Description. Washington, D.C.: Mathematica Policy Research, Inc.

Duesenberry, J.S., Fromm, G., Klein, L.R., and Kuh, E., eds. 1965 The Brookings Quarterly Econometric Model of the United States. Chicago: Rand McNally and Company.

Flamm, Kenneth 1987 Targeting the Computer. Washington, D.C.: The Brookings Institution.

Food and Agriculture Organization 1987 Microcomputer-Based Data Processing: 1990 World Census of Agriculture. FAO Statistical Development Series No. 2a. Rome: Food and Agriculture Organization.

Giesbrecht, F.G., and Field, L. 1969 Demographic Microsimulation Model POPSIM II: Manual for Programs to Generate Vital Events, Open Core Model. Technical Report No. 5, Project SU-285. Chapel Hill, N.C.: Research Triangle Institute.

Greenberger, M.H. 1957 Computer Simulation of the United States Social Economy. Dissertation submitted to Applied Mathematics Department, Harvard University.
Greenberger, M.H., Jones, M.M., Morris, J.H., Jr., and Ness, D.N. 1965 On-line Computation and Simulation: The OPS-3 System. Cambridge, Mass.: MIT Press.

Guthrie, H.W., Orcutt, G.H., Caldwell, S., Peabody, G.E., and Sadowsky, G. 1972 Microanalytic simulation of household behavior. Annals of Economic and Social Measurement 1(April):141–169.

Hammel, E.A., Hutchinson, D.W., Wachter, K.W., Lundy, R.T., and Deuel, R. 1976 The SOCSIM Demographic-Sociological Microsimulation Program Operating Manual. Institute of International Studies, Research Series, No. 27. University of California, Berkeley.

Holt, C.C., Shirey, R., Steward, D., Schrank, W.E., Palit, D., Midler, J.L., and Stroud, A.H. 1967 Program Simulate II: A User's and Programmer's Manual. Madison, Wisc.: Social Systems Research Institute.

Huber, P., Jensen, K., and Shapiro, R.M. 1989 Hierarchies in Coloured Petri Nets. Paper presented at the Tenth International
Conference on Application and Theory of Petri Nets, Bonn, Federal Republic of Germany.

Johnson, J., Roberts, T.L., Verplank, W., Smith, D.C., Irby, C.H., and Beard, M. 1989 The Xerox Star: A retrospective. Computer 22(9):11–29.

Jorgensen, D.W., and Landau, R. 1989 Technology and Capital Formation. Cambridge, Mass.: MIT Press.

Kaufer, S., Lopez, R., and Pratap, S. 1988 Saber-C, an interpreter-based programming environment for the C language. In USENIX Conference Proceedings. Berkeley, Calif.: USENIX Association.

Kernighan, Brian W., and Ritchie, Dennis M. 1978 The C Programming Language. Englewood Cliffs, N.J.: Prentice-Hall.

Kiviat, P.J., Villanueva, R., and Markowitz, H.M. 1968 The SIMSCRIPT II Programming Language. Santa Monica, Calif.: The RAND Corp.

Koenig, Lou P. 1973 TRIM CPSEO Codebook. Working Paper 718–2. Washington, D.C.: The Urban Institute Press.

Kuck, David J. 1978 The Structure of Computers and Computations. Volume 1. New York: John Wiley & Sons.

Lewis, G.H., and Michel, R.C., eds. 1990 Microsimulation Techniques for Tax and Transfer Analysis. Washington, D.C.: The Urban Institute Press.

Maling, W. 1971 Preliminary Version of the TROLL/1 Reference Manual. Cambridge, Mass.: MIT Press.

McClure, Carma 1989 The CASE experience. BYTE 14(April):235–246.

Meyers, Edmund D., Jr. 1973 Time Sharing Computation in the Social Sciences. Englewood Cliffs, N.J.: Prentice-Hall.

Moeller, John F. 1973 TRIM Technical Description. Working Paper 718–1. Washington, D.C.: The Urban Institute Press.

Okner, B.A. 1966 Income Distribution and the Federal Income Tax. Institute of Public Administration. Ann Arbor: University of Michigan.

Orcutt, G.H., Merz, J., and Quinke, H., eds. 1986 Microanalytic Simulation Models to Support Social and Financial Policy. Amsterdam: Elsevier Science Publishers.

Orcutt, G.H., Greenberger, M., Korbel, J., and Rivlin, A.M.
1961 Microanalysis of Socioeconomic Systems: A Simulation Study. New York: Harper & Row.

Orcutt, G.H., Caldwell, S., Wertheimer, R., Franklin, S., Hendricks, G., Peabody, G., Smith, J., and Zedlewski, S. 1976 Policy Exploration Through Microanalytic Simulation. Washington, D.C.: The Urban Institute Press.

Pechman, J.A. 1965 A new tax model for revenue estimating. In A.T. Peacock and G. Hauser, eds., Government Finance and Economic Development. Paris: Organization for Economic Cooperation and Development.
Pechman, J.A., Okner, B.A., and Munnell, A. 1969 Simulation of the Carter Commission tax proposals for the United States. National Tax Journal 22(March):2–23.

Phister, Montgomery, Jr. 1979 Data Processing Technology and Economics. Second Edition. Bedford, Mass.: Digital Press and Santa Monica Publishing Company.

Sadowsky, George
1972 MASH: An online system for socioeconomic microsimulation of the U.S. household sector. In Online 72 Conference Proceedings. Brunel University, England: Online Computer Systems, Ltd.
1977 MASH: A Computer System for Microanalytic Simulation for Policy Exploration. Washington, D.C.: The Urban Institute Press.

Schulz, J.H. 1968 The Economic Status of the Retired Aged in 1980: Simulation Projections. Social Security Administration Research Report No. 24. Washington, D.C.: U.S. Department of Health, Education, and Welfare.

Statistics Canada
1989a SPSD/M Product Overview. Ottawa: Statistics Canada.
1989b SPSD/M Introductory Manual. Ottawa: Statistics Canada.
1989c SPSD/M User's Manual. Ottawa: Statistics Canada.
1989d SPSD/M Reference Manual. Ottawa: Statistics Canada.
1990 SPSD/M Programmer's Guide. Ottawa: Statistics Canada.

Sulvetta, Margaret 1976 An Analyst's Guide to TRIM—The Transfer Income Model. Paper 996–03. Washington, D.C.: The Urban Institute Press.

Thompson, M.M., and Shapiro, G. 1973 The Current Population Survey: An overview. Annals of Economic and Social Measurement 2(April):105–129.

Triplett, J.E. 1986 The economic interpretation of hedonic methods. Pp. 36–40 in Survey of Current Business. Bureau of Economic Analysis. Washington, D.C.: U.S. Department of Commerce.

U.S. Department of Defense 1980 Reference for the ADA Programming Language: Proposed Standard Document. Washington, D.C.: U.S. Government Printing Office.

Webb, Randall L.
1979 Toward A New Generation of TRIM: Efficient Data Structure and Supporting Features. Paper 1281–02. Washington, D.C.: The Urban Institute.

Webb, Randall L., Michel, Richard C., and Bergsman, Anne B. 1990 The historical development of the Transfer Income Model (TRIM2). Pp. 33–76 in Gordon H. Lewis and Richard C. Michel, eds., Microsimulation Techniques for Tax and Transfer Analysis. Washington, D.C.: The Urban Institute Press.

Webb, R.L., Hager, C., Murray, D., and Simon, E. 1982 TRIM2 Simulation Modules. Working Paper 3069–02. 2 vols. March 1982 plus updates. Washington, D.C.: The Urban Institute Press.

Webb, R.L., Bergsman, A., Hager, C., Murray, D., and Simon, E. 1986 TRIM2 Reference Manual—The Framework for Microsimulation. Working Paper 3069–01. Washington, D.C.: The Urban Institute Press.

Wilensky, Gail R. 1970 An income transfer computational model. In The President's Commission
on Income Maintenance Programs: Technical Studies. Washington, D.C.: U.S. Government Printing Office.

Wolfson, M., Gribble, S., Bordt, M., Murphy, B., and Rowe, G. 1989 The Social Policy Simulation Database and Model: An example of survey and administrative data integration. Survey of Current Business (May):36–39.

Wyscarver, Roy A. 1978 The Treasury Personal Individual Income Tax Simulation Model. Office of Tax Analysis Paper 32. Washington, D.C.: U.S. Department of the Treasury.