focused on four themes, in the following order of priority: (1) collect and disseminate human performance data, (2) develop accreditation procedures for models of human behavior, (3) support sustained model development in focused areas, and (4) support theory development and basic research in relevant areas.

Collect and Disseminate Human Performance Data

The panel has concluded that all levels of model development depend on the sustained collection and dissemination of human behavior data. Data needs extend from the kind of real-world military data that reflect, in context, the way military forces actually behave, are coordinated, and communicate, to laboratory studies of basic human capacities. Between these extremes are data derived from high-fidelity simulations and war games and from laboratory analogs to military tasks. These data are needed for a variety of purposes: to support the development of measures of accreditation, to provide benchmark performance for comparison with model outputs in validation studies, to help set the parameters of the actual models of real-world tasks and test and evaluate the efficacy of those models, and to challenge existing theory and lead to new conceptions that will provide the grist for future models. In addition to the collection of appropriate data, there must be procedures to ensure that the data are codified and made available in a form that can be utilized by all the relevant communities—from military staffs who need to have confidence in the models to those in the academic sphere who will develop the next generation of models. It is important to note that clear measures of performance for military tasks are needed. Currently, these measures are poorly defined or lacking altogether.

Create Accreditation Procedures for Models of Human Behavior

The panel has observed very little quality control among the models that are used in military simulations today. DMSO should establish a formal procedure for accrediting models to be used for human behavior representation. One component needed to support robust accreditation procedures is a set of quantitative measures of human performance. In addition to supporting accreditation, such measures would facilitate evaluation of the cost-effectiveness of alternative models, so that resource allocation judgments could be made on the basis of data rather than opinion. The panel does not believe that those working in the field are able to make such judgments now, but DMSO should promote the development of simulation performance metrics that could be applied equally to live exercises and simulations. The goal would be to create state-of-health statistics that would provide quantitative evidence of the payoff for investments in human behavior representation.

There are special considerations involved in human behavior representation that warrant having accreditation procedures specific to this class of behavioral


