This report presents the findings of an 18-month study conducted by the Panel on Modeling Human Behavior and Command Decision Making: Representations for Military Simulations. For this study, the panel, working within the context of the requirements established by military simulations, reviewed and assessed the state of the art in human behavior representation—the modeling of the processes and effects of human behavior—at the individual, unit, and command levels. The aim was to determine what is required to move military simulations from their current limited state to one that incorporates realistic human and organizational behavior.
The need to represent the behavior of individual combatants, as well as teams and larger organizations, has been expanding as a result of the increasing use of simulations for training, systems analysis, systems acquisition, and command decision aiding. For both training and command decision aiding, the behaviors that are important to represent realistically are those that can be observed by the other participants in the simulation, including physical movement and the detection and identification of enemy forces. It is important that observable actions be based on realistic decision making and that communications, when they originate with a simulated unit, be interpretable as the result of sensible plans and operations. A team should manifest a range of behaviors consistent with the degree of autonomy it is assigned, including detection of and response to expected and unexpected threats. It should be capable of carrying out actions on the basis of communications typically received from its next-highest-echelon commander.
In the panel's view, achieving realism with respect to these observable outcomes requires that the models of human behavior employed in the simulation be based on psychological, organizational, and sociological theory. For individual combatants, it is important to represent the processes underlying the observable behavior, including attention and multitasking, memory and learning, decision making, perception and situation awareness, and planning. At the unit level it is important to represent the command and control structure, as well as the products of that structure. Added realism can also be achieved by representing a number of behavior moderators at the individual and organizational levels. Moderators at the individual level, such as workload and emotional stress, serve to enhance or degrade performance, as reflected in the speed and accuracy of performance. Moderators at the organizational level, including the average level of training, whether standard operating procedures are followed, the level and detail of those procedures, and the degree of coupling between procedures, all affect performance.

In each of these essential areas, this report presents the panel's findings on the current state of knowledge, as well as goals for future understanding, development, and implementation. The goals found at the end of each chapter are presented as short-, intermediate-, and long-term research and development needs. The report also provides descriptions of integrative architectures for modeling individual combatants. Overall conclusions and recommendations resulting from the study are presented as well. This summary presents the panel's overall recommendations in two broad areas: a framework for the development of models of human behavior, and infrastructure and information exchange. Detailed discussion of these recommendations is provided in Chapter 13 of this report.
A FRAMEWORK FOR THE DEVELOPMENT OF MODELS OF HUMAN BEHAVIOR
The panel has formulated a general framework that we believe can guide the development of models of human behavior for use in military simulations. This framework reflects the panel's recognition that given the current state of model development and computer technology, it is not possible to create a single integrative model or architecture that can meet all the potential simulation needs of the services. The framework incorporates the elements of a plan for the Defense Modeling and Simulation Office (DMSO) to apply in pursuing the development of models of human behavior to meet short-, intermediate-, and long-term goals. For the short term, the panel believes it is important to collect real-world, wargame, and laboratory data in support of the development of new models and the development and application of human model accreditation procedures. For the intermediate term, we believe DMSO should extend the scope of useful task analysis and encourage sustained model development in focused areas. And for the long term, we believe DMSO should advocate theory development and behavioral research that can lead to future generations of models of human and organizational behavior. Work on achieving these short-, intermediate-, and long-term goals should begin concurrently. We recommend that these efforts be focused on four themes, in the following order of priority: (1) collect and disseminate human performance data, (2) develop accreditation procedures for models of human behavior, (3) support sustained model development in focused areas, and (4) support theory development and basic research in relevant areas.
Collect and Disseminate Human Performance Data
The panel has concluded that all levels of model development depend on the sustained collection and dissemination of human behavior data. Data needs extend from the kind of real-world military data that reflect, in context, the way military forces actually behave, are coordinated, and communicate, to laboratory studies of basic human capacities. Between these extremes are data derived from high-fidelity simulations and war games and from laboratory analogs to military tasks. These data are needed for a variety of purposes: to support the development of measures of accreditation, to provide benchmark performance for comparison with model outputs in validation studies, to help set the parameters of the actual models of real-world tasks and test and evaluate the efficacy of those models, and to challenge existing theory and lead to new conceptions that will provide the grist for future models. In addition to the collection of appropriate data, there must be procedures to ensure that the data are codified and made available in a form that can be utilized by all the relevant communities—from military staffs who need to have confidence in the models to those in the academic sphere who will develop the next generation of models. It is important to note that clear measures of performance for military tasks are needed. Currently, these measures are poorly defined or lacking altogether.
Create Accreditation Procedures for Models of Human Behavior
The panel has observed very little quality control among the models that are used in military simulations today. DMSO should establish a formal procedure for accrediting models to be used for human behavior representation. One component needed to support robust accreditation procedures is quantitative measures of human performance. In addition to supporting accreditation, such measures would facilitate evaluation of the cost-effectiveness of alternative models so that resource allocation judgments could be made on the basis of data rather than opinion. The panel does not believe that the people working in the field are able to make such judgments now; DMSO should therefore promote the development of simulation performance metrics that could be applied equivalently to live exercises and simulations. The goal would be to create state-of-health statistics that would provide quantitative evidence of the payoff for investments in human behavior representation.
There are special considerations involved in human behavior representation that warrant accreditation procedures specific to this class of behavioral models. The components of accreditation should include those described below.
Provide proof that the model actually runs and meets the design specifications. This level of accreditation is similar to that for any other model, except that verification must be accomplished with human models in the loop and, to the extent that such models are stochastic, will require repeated runs with similar but not identical initial conditions to verify that the behavior is as advertised.
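The repeated-run idea can be sketched in a few lines of code. The model function, the specification value, and the tolerance below are hypothetical stand-ins invented for illustration, not drawn from the report; the point is only that a stochastic model is verified against its specification in aggregate, over many jittered runs, rather than from any single run.

```python
import random
import statistics

def run_model(seed, initial_strength):
    """Hypothetical stochastic model of a unit's detection time (seconds).
    Stands in for a real human behavior model under verification."""
    rng = random.Random(seed)
    # Detection time lengthens as unit strength drops; the Gaussian noise
    # models run-to-run variability in the simulated humans' behavior.
    return 30.0 / initial_strength + rng.gauss(0.0, 2.0)

def verify_stochastic(n_runs=200, spec_mean=30.0, tolerance=1.5):
    """Run the model repeatedly with similar but not identical initial
    conditions and check that aggregate behavior meets the specification."""
    outputs = []
    for seed in range(n_runs):
        # Jitter the initial condition slightly on each run.
        strength = 1.0 + random.Random(seed).uniform(-0.05, 0.05)
        outputs.append(run_model(seed, strength))
    mean = statistics.mean(outputs)
    return abs(mean - spec_mean) <= tolerance, mean

ok, observed = verify_stochastic()
```

With fixed seeds the procedure is reproducible, so a verification report can cite the exact runs behind its pass/fail conclusion.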
Show that the model accurately represents behavior in the real world under at least some conditions. Validation with full generality is not possible for models of this complexity; rather, the scope and level of the required validation should be very focused and matched closely to the intended uses of each model. One approach to validation is to compare model outputs with data collected during prior live simulations conducted at various military training sites (e.g., the National Training Center, Red Flag, the Joint Readiness Training Center). Another approach is to compare model outputs with data derived from laboratory experiments or various archival sources. The panel suggests that to bring objectivity and specialized knowledge to the validation process, the validation team should include specialists in modeling and validation who have not participated in the actual model development. For those areas in which the knowledge base is insufficient and the costs of data collection are too high, it is suggested that the developers rely on expert judgment. However, because of the subjectivity of such judgments, we believe this should be the alternative of last resort.
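One way such a comparison between model outputs and live-exercise data might look in practice is sketched below. The data values and the acceptance rule (a standardized mean difference below a threshold) are hypothetical assumptions made for illustration; a real validation study would choose measures and criteria matched to the model's intended use, as the text argues.

```python
import statistics

def validate_against_benchmark(model_outputs, exercise_data, max_effect=0.5):
    """Hypothetical acceptance rule: the standardized mean difference
    (Cohen's d) between model outputs and benchmark data must fall
    below a pre-agreed threshold."""
    pooled_sd = statistics.pstdev(model_outputs + exercise_data)
    d = abs(statistics.mean(model_outputs)
            - statistics.mean(exercise_data)) / pooled_sd
    return d <= max_effect, d

# Invented engagement times (seconds): one set from the model under
# validation, one from a live training exercise.
model_times = [12.1, 11.8, 13.0, 12.4, 11.5, 12.9]
exercise_times = [12.6, 11.9, 12.2, 13.1, 12.0, 12.7]

ok, effect = validate_against_benchmark(model_times, exercise_times)
```

The threshold itself is a policy decision; the value of a quantitative rule is that it makes the scope and stringency of the validation explicit and repeatable.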
Describe the range of predictions that can be generated by the model. This information is necessary to define the scope of the model; it can also be used to link this model with others. Analysis is hampered by the complexity of these models, which makes it difficult to extract the full range of behavior covered. Thus investment in analysis tools is needed to assist in this task.
The accreditation procedures should include standards for the documentation that explains how to run and modify the model, as well as a plan for maintaining and upgrading the model. Models will be used only if they are easy to run and modify to meet the changing needs of the user organization. Evaluation of the documentation should include exercising specific scenarios to ensure that the documentation facilitates the performance of the specified modeling tasks.
As a high priority, the panel recommends that the above accreditation procedures be applied to military models of human behavior that are either currently in use or being prepared for use, most of which have not had the benefit of rigorous quantitative validation, and that the results of these analyses be used to identify high-payoff areas for improvement. Significant improvements may thereby be achievable relatively quickly for a small investment.
Provide Support for Sustained Model Development in Focused Areas
Several specific activities are associated with model development. They include the following:
Develop task analysis and structure. Researchers and model users must continue and expand the development of detailed descriptions of military contexts—the tasks, procedures, and structures that provide the foundation for modeling of human behavior at the individual, unit, and command levels.
Establish model purposes. The modeler must establish explicitly the purpose(s) for which a model is being developed and apply discipline to enhance model fidelity only to support those purposes.
Support focused modeling efforts. Once high-priority modeling requirements have been established, we recommend sustained support in focused areas for human behavior model development that is responsive to the methodological approach outlined in Chapter 12 of this report.
Employ interdisciplinary teams. It is important that model development involve interdisciplinary teams composed of military specialists and researchers/modelers with expertise in cognitive psychology, social psychology, sociology, organizational behavior, computer science, and simulation technology.
Benchmark. Periodic modeling exercises should be conducted throughout model development to benchmark the progress being made and to enable a focus on the most important shortfalls of the prototype models. These exercises should be scheduled so as not to interfere with further development advances.
Promote interoperability. In concert with model development, DMSO should evolve policy to promote interoperability among models representing human behavior. Although needs for human behavior representation are common across the services, it is simplistic to contemplate a single model of human behavior that could be used for all military simulation purposes, given the extent to which human behavior depends on both task and environment.
Employ substantial resources. Improving the state of human behavior representation will require substantial resources. Even when properly focused, this work is at least as resource demanding as environmental representation. Further, generally useful unit-level models are unlikely to emerge simply through minor adjustments in integrative individual architectures.
In the course of this study, the panel examined the current state of integrated computational models of human behavior and human cognitive processes that might lead to improved models of the future. However, the current state of the art offers no single representation architecture that is suited to all individual human or organizational modeling needs. Each integrated model we reviewed implies its own architecture, and the chapters of this report on particular cognitive content areas each suggest specific alternative modeling methodologies. It is not likely, even in the future, that any single architecture will address all modeling requirements.
On the other hand, we recognize the value of having a unitary architecture. Each new architecture requires an investment in infrastructure beyond the investment in specific models to be built using that architecture. Having an architecture that constrains development can promote interoperability of component modeling modules. As applications are built with a particular architecture, the infrastructure can become more robust, and some applications can begin to stand on the shoulders of others. Development can become synergistic and therefore more efficient.
At this point in the maturity of the field, it would be a mistake for the military services to make a choice of one or another architecture to the exclusion of others. Therefore, we recommend that the architectures pursued within the military focus initially on the promising approaches identified in Chapter 3 of this report. This recommendation is especially important because the time scale for architecture development and employment is quite long, and prior investment in particular architectures can continue to produce useful payoffs for a long time after newer and possibly more promising architectures have appeared and started to undergo development. On the other hand, this recommendation is in no way meant to preclude exploration of alternative architectures. Indeed, resources need to be devoted to the exploration of alternative architectures, and in the medium and especially long terms, such research will be critical to continued progress.
Support Theory Development and Basic Research in Relevant Areas
There is a need for continued long-term support of theory development and basic research in areas such as decision making, situation awareness, learning, and organizational modeling. It would be short-sighted to focus only on the immediate payoffs of modeling; support for future generations of models needs to be sustained as well. It might be argued that the latter is properly the role of the National Science Foundation or the National Institutes of Health. However, the kinds of theories needed to support human behavior representation for military situations are not the typical focus of these agencies. Their research tends to emphasize toy problems and predictive modeling in restricted experimental paradigms for which data collection is relatively easy. To be useful for the representation of military human behavior, the research needs to be focused on the goal of integration into larger military simulation contexts and on specific military modeling needs.
RECOMMENDATIONS FOR INFRASTRUCTURE AND INFORMATION EXCHANGE
The panel has identified a set of actions we believe are necessary to build consensus more effectively within the Department of Defense modeling and simulation community on the need for and direction of human performance representation within military simulations. The focus is on near-term actions DMSO can undertake to influence and shape modeling priorities within the services. These actions are in four areas: collaboration, conferences, interservice communication, and education/training.
Collaboration

The panel believes it is important in the near term to encourage collaboration among modelers, content experts, and behavioral and social scientists, with emphasis on unit/organizational modeling, learning, and decision making. It is recommended that specific workshops be organized in each of these key areas.
Conferences

The panel recommends an increase in the number of conferences focused on the need for and issues associated with human behavior representation in military models and simulations. The panel believes the previous biennial conferences on computer-generated forces and behavioral representation have been valuable, but could be made more useful through changes in organization and structure. We recommend that external funding be provided for these and other conferences and that papers be submitted in advance and refereed. The panel believes organized sessions and tutorials on human behavior representation, with invited papers by key contributors in the various disciplines associated with the field, can provide important insights and direction. Conferences also provide a proactive stimulus for the expanded interdisciplinary cooperation the panel believes is essential for success in this arena.
Expanded Interservice Communication
There is a need to actively promote communication among the services, model developers, and researchers. DMSO can lead the way in this regard by developing a clearinghouse for human behavior representation, perhaps based in an Internet web site, with a focus on information exchange. This clearinghouse might include references and pointers to the following:
Military task descriptions
Data on military system performance
Live exercise data for use in validation studies
Resource and platform descriptions
DMSO contractors and current projects
Military technical reports
Education and Training
The panel believes that, at a national level, opportunities for education and training in the professional competencies required for human behavior representation are lacking. We recommend that graduate and postdoctoral fellowships in human behavior representation and modeling be provided. Institutions wishing to offer such fellowships would have to demonstrate that they could provide interdisciplinary education and training in the areas of human behavior representation, modeling, and military applications.
A FINAL THOUGHT
The modeling of cognition and action by individuals and groups is quite possibly the most difficult task humans have yet undertaken. Developments in this area are still in their infancy. Yet important progress has been and will continue to be made. Human behavior representation is critical for the military services as they expand their reliance on the outputs from models and simulations for their activities in management, decision making, and training. In this report, the panel has outlined how we believe such modeling can proceed in the short, medium, and long terms so that DMSO and the military services can reap the greatest benefit from their allocation of resources in this critical area.