
Human-System Integration in the System Development Process: A New Look (2007)

Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.

10
Conclusions and Recommendations

In this chapter, we report our broad conclusions related to each of the themes introduced at the start of the report. These conclusions reflect detailed consideration of (1) our research into current views of systems engineering, (2) what the committee learned about what is needed to adequately support the role of humans in systems, (3) our survey of the methods and tools available to support what is needed, and (4) our assessment of the state of the art in human-system integration (HSI).

Our most fundamental conclusion is that human performance and human-system integration will never be fully effective in system design unless they are seen by all stakeholders as an integral part of the entire systems engineering process, from initial exploration and concept evaluation through operational use, reengineering, and retirement. Many systems have failed because the role of humans was considered only after design problems were identified—when it was too late to make the kind of changes required to produce systems responsive to users’ needs. We conclude that the definition of user requirements should begin when the system is first being conceived, and those requirements should continue to provide important evaluation criteria right up to the time the system is placed in use.

The military services are recognizing the need for more emphasis on human considerations in design through the introduction of MANPRINT, SEAPRINT, and, most recently, AIRPRINT requirements. More and more commercial software, hardware, and service industries are realizing that commercial success requires attention to the customer’s needs and that achieving that success has implications for the product engineering team, not just the marketing and sales teams. That attention begins with product conceptualization and continues throughout the product development cycle.

As a process for integrating human considerations into the systems engineering process, the committee has built on the strengths of existing systems engineering process models (waterfall, V-model, concurrent, incremental, spiral, evolutionary, agile) to synthesize an incremental commitment model (ICM) that helps to situate HSI activities within a system’s life cycle. As described in the introduction, this model is based on five critical success factor principles: (1) negotiation to “satisfice” system stakeholders’ (e.g., users, acquirers, developers) requirements; (2) incremental growth of system definition and stakeholder commitment; (3) concurrent system definition and development; (4) iterative system definition and development; and (5) risk management. The incremental commitment model is consistent with current approaches to systems engineering, including the U.S. Department of Defense (DoD) 5000 series of system acquisition policies and guidelines, and provides the kind of emphasis that the committee considers important to achieving human-system integration. Although it is not the only model that could be used on future human-intensive systems, it has served as a reasonably robust framework for explaining the study’s HSI concepts and for evaluating them via the case studies in Chapter 5. There are also ways to extend or reinterpret existing process models to accommodate the five critical success factor principles and HSI activities.

In the paragraphs below we build on the six themes first mentioned in the introduction, and highlight features based on them that require special attention from the perspective of human-system integration.

Begin HSI contributions to development early and continue them throughout the development life cycle. If there were a single message to communicate to program managers and developers, it would be that HSI expertise matters from the very beginning of the life cycle, when systems are first being conceived. HSI specialists are trained to explore and understand the environment in which a system will be used. Developing an operational concept requires a full understanding of the context of use, and these factors need to be assessed even before a conceptual design is put forward. Human factors specialists have a collection of methods and tools for efficiently understanding the system environment and context of use. Considering these factors early can have orders-of-magnitude impacts on system performance. If human factors and other HSI input are left until the test and evaluation stage, only small-percentage improvements can be realized under the best of circumstances, and there is a risk that the system will not satisfy the original goals. We have also emphasized that system development needs to be an iterative process and that human-system design considerations evolve and need to be iterated along with every other aspect of system development.


Integrate across human-system domains as well as across the system life cycle. The domains identified in the MANPRINT methodology—human factors, manpower, personnel, training, system safety and health, habitability, and survivability, the first five of which are potentially as relevant to commercial products as to military systems—are not independent, and consideration of them must not be treated separately (i.e., “stove-piped”). While each has its own methods, there are many areas in which the methodologies we describe in Part II can serve multiple purposes across the domains and do not have to be repeated for each. For example, task analysis, risk analysis, and workload analysis can support human factors, manpower, training, and safety. Ergonomic analysis can support human factors, training, safety, and health hazards. For it to do so requires that the individual specialists in each area cooperate up front to ensure that the resulting shared representations meet the requirements of all the domains. This is a critical aspect of negotiation to “satisfice” system stakeholders’ requirements.
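The idea of a single analysis serving several domains can be sketched concretely. In this hypothetical Python fragment, one shared task-analysis record feeds human factors, manpower, and safety checks; the record fields, task names, and thresholds are illustrative assumptions, not drawn from the report:

```python
# One task-analysis record, consumed by several HSI domain checks.
# The record structure and thresholds are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class TaskRecord:
    task: str
    duration_min: float   # time on task, minutes
    operators: int        # people required concurrently
    workload: float       # 0.0-1.0 subjective workload estimate
    hazards: list         # identified safety/health hazards

def human_factors_flags(t: TaskRecord):
    """Human factors review: flag tasks with excessive workload."""
    return ["high workload"] if t.workload > 0.8 else []

def manpower_demand(tasks):
    """Manpower review: peak concurrent staffing implied by the tasks."""
    return max(t.operators for t in tasks)

def safety_flags(t: TaskRecord):
    """Safety review: surface hazards captured during task analysis."""
    return list(t.hazards)

refuel = TaskRecord("hot refuel", 12.0, operators=2, workload=0.9,
                    hazards=["fuel vapor exposure"])
taxi = TaskRecord("taxi", 5.0, operators=1, workload=0.4, hazards=[])

# Each domain reads the same shared record; no duplicate task analysis.
```

The point of the sketch is the shared data structure: each specialty consumes the same record rather than redoing the analysis in its own format.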

Adopt a risk- and opportunity-driven approach to determining needs for HSI activity. At each of the system development milestones, the systems engineering team analyzes the development risks and opportunities before proceeding to the next milestone. It is essential that the HSI team contribute an evaluation of HSI risks and opportunities to be considered in collaboration with the rest of the systems engineering team. It is through the risk analysis that the argument may be made for assigning resources to evaluate particular risks further or to find ways to mitigate the risks that the system will fail: for example, that it is unsafe, that training personnel in its use proves too costly, or that it is maladapted to the people who must use it. In addition, considering opportunities may allow the HSI team to improve program execution and system capabilities.

There is often a tendency for the HSI team to insist on a complete HSI analysis. The purpose of the risk and opportunity analysis is to focus attention on the risks whose likelihood and seriousness are both appreciable, as well as the opportunities with the greatest payoffs. It will also serve to identify the areas of development in which the risks are minimal and do not need further attention. HSI risk and opportunity analysis becomes a component of the overall system development risk analyses and is given equal importance to other system risk factors. The use of human-sensitive mission effectiveness models, simulations, and exercises can be highly effective in this regard.
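A minimal sketch of such a triage, assuming a simple likelihood-times-severity exposure score; the risk names, scores, and threshold below are illustrative, not prescribed by the committee:

```python
# Hypothetical HSI risk triage: rank risks by exposure (likelihood x severity)
# and split them into those warranting further analysis and those that can be
# deferred. Scores and the cutoff are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class HsiRisk:
    name: str
    likelihood: float  # 0.0-1.0, estimated probability of occurrence
    severity: float    # 0.0-1.0, relative impact on mission effectiveness

    @property
    def exposure(self) -> float:
        return self.likelihood * self.severity

def triage(risks, threshold=0.2):
    """Return (attend, defer): risks above/below the exposure cutoff,
    each ordered by descending exposure."""
    ranked = sorted(risks, key=lambda r: r.exposure, reverse=True)
    attend = [r for r in ranked if r.exposure >= threshold]
    defer = [r for r in ranked if r.exposure < threshold]
    return attend, defer

risks = [
    HsiRisk("training cost exceeds budget", likelihood=0.6, severity=0.7),
    HsiRisk("display illegible in sunlight", likelihood=0.3, severity=0.4),
    HsiRisk("seat adjustment range inadequate", likelihood=0.1, severity=0.2),
]
attend, defer = triage(risks)
```

Only the high-exposure items receive further HSI resources; the deferred list documents what was judged not to need attention, which is itself an output of the analysis.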

If there are integrated product teams (IPTs) for which HSI issues are relevant, there should be at least one HSI representative on each such team, and that person should be responsible for ensuring that HSI risks and opportunities are considered.


Tailor methods to time and budget constraints. Every system development takes place under time and budget constraints. It is not possible to undertake full-scale HSI evaluation of every aspect of a system development. Early in the life cycle, as part of iterative system definition and development, it is important to evolve the human-system requirements, prepare the HSI part of the business case for designing and fielding the system, and undertake the risk and opportunity analysis. The business case should include quantitative performance objectives based on human capacities and limitations. From that point on, it is important that the HSI team, driven by its risk, opportunity, and requirements analyses, focus further attention only on the critical issues and requirements identified in the risk and opportunity analysis. With respect to each identified issue, the team should evaluate the analysis requirements carefully, consider alternative approaches to achieving them, and select the methods and tools that are most cost-effective for answering the questions at hand. The proposed budget should be based on a careful, realistic analysis of what needs to be done to satisfy the critical and most risky requirements. Doing so will gain the respect and confidence of the program manager and will improve the chances that an adequate budget will be provided.

Ensure communication among stakeholders of HSI outputs. Many of the contributions of the HSI team—especially those developed early in the development process—tend to be based on observation, interview, and questionnaire methods. The individuals who collect the data become the most knowledgeable about the characteristics of the system environment and context of use. Similarly, the knowledge acquisition associated with developing task and process analyses results in very rich information in the heads of the analysts. However, much of this information is needed by all the system stakeholders, from the funders and program managers to the detail designers and developers. In following the principle of negotiating to satisfice all stakeholders’ requirements, it is very important that the HSI team provide outputs and deliverables that capture the information and its interpretation in forms that are understandable and usable by these stakeholders—we have called them shared representations. We have discussed the kinds of methods to be used for generating the needed information and, for each method, suggested the kinds of shared representations that should be developed as outputs. Effort should be made to create these shared representations in a form that is readily assimilated into the engineering process, that is, expressed in terms that are compatible with other engineering outputs. This might be accomplished through the generation of scenarios of use, models and/or simulations based on the task analysis output, or analyses of the context of use. Effective shared representations can be very helpful in smoothing the flow of information among team members and in ensuring that the HSI team output is influential.


Design to accommodate changing conditions and requirements in the workplace. There have been, and continue to be, significant changes in many factors that influence the way work gets accomplished and the nature and complexity of the systems that are developed. Personnel costs are a significant percentage of the operational cost of systems, and everywhere there is pressure to reduce the numbers of personnel. Technology is often seen as a panacea to reduce personnel costs, increase efficiency, and improve safety. Technological evolution has become much more rapid, and the systems developed last year may already be out of date.

It is impossible to capture all the requirements up front, so it is valuable to develop systems that can be more easily adapted or modified in order to continue to provide support as the work context changes. The ultimate ideal is to create evolvable systems that can be “appropriated” (or reinvented) by the users and tailored to meet the inevitable changes that will arise. This argument is consistent with the principles of incremental growth of system definition and concurrent system definition and development—the idea that requirements should not be assumed to be fixed but instead expected to evolve over the life cycle of the system.

The design of systems of systems involves a level of complexity and challenge much greater than the design of individual complex systems themselves. For example, the military is designing command and control systems that span the activities of many logistics, battlefield operations, manned and unmanned aerial systems, and multinational forces. Telephone companies are now faced with integrating digital phone systems with cell phones, Internet access, and television delivery systems. Hospital information systems must be integrated with the accounting, nursing unit, pharmacy, and individual physician’s workstations, not to mention supply systems and inventory control, and they must do it in a way that promotes patient safety.

Complex systems of systems demand new approaches to uncover the multiple points of interdependency across systems and anticipate their impacts on the people operating in those environments. New envisioning methods and modeling tools are needed to predict the kinds of challenging situations that are likely to arise, the kinds of adaptations that will be required of people to cope with new complexities, and the kinds of errors and failures that may emerge in the future (Woods and Dekker, 2000; Woods, 2002; Winograd and Flores, 1987; Feltovich et al., 2004). The ability to anticipate likely reverberations of technology insertions early in the design process can contribute substantively to the design of complex systems and systems of systems that are resilient in the face of a wide range of operational perturbations (Hollnagel, Woods, and Leveson, 2006).

The emergence of systems of systems further emphasizes the importance of considering human-system integration as an integral part of the development process. We have highlighted the value of iterative design and the role that shared representations, especially models and simulations, can play in ensuring that all stakeholders remain informed about the current state of development. In this kind of very dynamic development environment, it is important to keep in mind the potential for changes after the system is implemented. Information currency is likely to be a very important consideration, since design requirements can change with each new iteration. In Chapter 2, we described a procedure for accommodating rapid change and high assurance through incremental development. Designing for evolvable systems requires anticipating the scope of changes that might take place, making the design modular, leaving appropriate entry points for the changes, providing thorough software documentation, and providing scalable service-oriented architectures.
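The guidance on modularity and designed-in entry points can be sketched in miniature. The registry mechanism, function names, and report formats below are illustrative assumptions, not drawn from the report:

```python
# Minimal sketch of a modular design with an explicit entry point for change:
# new output modules can be added after fielding without touching the core.
from typing import Callable, Dict

# The registry is the designed-in "entry point" for later change.
FORMATTERS: Dict[str, Callable[[dict], str]] = {}

def register_formatter(name: str):
    """Decorator that plugs a new output module into the system."""
    def wrap(fn: Callable[[dict], str]):
        FORMATTERS[name] = fn
        return fn
    return wrap

@register_formatter("text")
def text_report(status: dict) -> str:
    """Original fielded capability: plain status listing."""
    return ", ".join(f"{k}={v}" for k, v in sorted(status.items()))

def render(status: dict, fmt: str = "text") -> str:
    """Core system code: unchanged no matter how many modules are added."""
    return FORMATTERS[fmt](status)

# A later requirement arrives; only a plug-in is added, the core is untouched.
@register_formatter("alarm")
def alarm_report(status: dict) -> str:
    faults = [k for k, v in status.items() if v == "FAIL"]
    return "ALARM: " + ", ".join(faults) if faults else "all clear"
```

The design choice illustrated is that change is anticipated at a named seam (the registry) rather than requiring modification of already-verified core code.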

RESEARCH AND POLICY RECOMMENDATIONS

These recommendations identify further critical steps to facilitate the kind of integration into systems engineering that we consider of paramount importance. Our intent is to provide sufficient detail to guide the development of a research plan and the formulation of policy initiatives for the DoD and other government and private organizations. The recommendations are organized into four areas: (1) realizing the full integration of human systems and systems engineering; (2) methods for defining opportunities and the context of use; (3) methods for defining requirements and design; and (4) methods for evaluation. Accomplishing these steps will provide needed support to realize the future scenarios outlined in Chapter 9. The committee was not able to prioritize these research recommendations, as they cover diverse areas of equal importance. We believe work in these areas should proceed concurrently.

Realizing the Full Integration of Human Systems and Systems Engineering

This report presents the incremental commitment model as an example framework for system development activities and discusses how human-system integration fits within this framework. Here we present our policy and research recommendations regarding the principal areas of research, development, and policy initiatives needed to facilitate integration throughout the development life cycle and across HSI disciplines. These areas include

  • Institutionalizing the success factors associated with the incremental commitment model.

  • Accommodating the emergence of HSI requirements.

  • Ensuring that HSI operational requirements are included in the initial system development contract and acquisition documents.

  • Managing integrated system development.

  • Providing traceability of HSI objectives, decision points, and the rationale for decisions across life-cycle design phases.

  • Developing approaches to human-system integration and systems of systems research.

  • Sizing the HSI effort.

  • Designing shared representations to facilitate communication across disciplines and life-cycle phases.

  • Creating knowledge-based planning aids for providing HSI information.

  • Developing human-system integration as a discipline and as a lead for the IPT.

  • Fostering more synergy between research and practice.

Institutionalizing a System Development Process Based on the Success Factors

Through our analyses of more and less successful HSI projects, our evaluation of alternative HSI process models, and our case studies, the committee makes the case that a model like the incremental commitment model better enables the kind of human-system integration that will be needed for the complex, human-intensive systems of the future. It embodies the success factor principles of stakeholder satisficing; incremental growth of system definition and stakeholder commitment; iterative system definition and development; concurrent system definition and development; and risk-driven activity levels, product levels of detail, and anchor point milestones. It does this in clearer ways than the spiral model, particularly for HSI considerations, and in a manner compatible with the DoD acquisition milestones and with the milestones of the commercial IBM/Rational Unified Process and the Eclipse Process Framework OpenUP. It provides a process framework for the top-level recommendation of realizing the full integration of human engineering and systems engineering.

Recommendation: The U.S. Department of Defense and other government and private organizations should refine and coordinate the definition and adoption of a system development process that incorporates the principles embodied in the incremental commitment model. It should be adopted as the recommended approach for realizing the full integration of human-related design considerations with systems engineering in organizational policies and process standards, such as the DoD 5000 series and the ISO systems engineering standards.

Accommodating the Emergence of HSI Requirements

Particularly for complex systems of systems and for collaboration-intensive systems, human-system interface states, modes, and functional requirements are not known at the time of program initiation. Many current system acquisition policies and standards require these human considerations to be fully defined before proceeding into development.

Although it is risky to leave HSI requirements completely undefined, it is equally risky to insist on defining them before they are fully understood or allowed to emerge through experience. A reasonable middle approach is to use incremental and evolutionary development processes and to define HSI requirements in terms of capabilities, with more detail provided for later increments, but sufficient detail provided for earlier increments to ensure proper preparation for the later increments. This approach is consistent with the principle of risk-driven levels of product detail.


Recommendation: The U.S. Department of Defense and other government and private organizations should revise current system acquisition policies and standards to enable incremental, evolutionary, capabilities-based system acquisition that includes HSI requirements and uses risk-driven levels of requirements detail, particularly for complex systems of systems and for collaboration-intensive systems.

HSI Operational Requirements in Contracts and Acquisition Documents

In discussing risk management, we have alluded to the importance of considering HSI aspects when negotiating baseline metrics for program execution. This negotiation is a critical phase in product development, when estimates and assumptions are formulated and agreed on by all stakeholders. Customer requirements and value propositions, technical performance measures that gauge compliance with technical requirements, schedule milestones, and requisite resources all contribute to that negotiation. Involving HSI practitioners in the negotiation process ensures that their perspective and knowledge are accounted for, reducing the likelihood that HSI risks and issues will arise during program execution. This recommendation focuses on policy, rather than research, and addresses the need to have human-system integration considered in establishing program execution baselines. Crisp requirements that have been properly vetted are key to successful contract execution, resulting in an end product that fills a specified role and meets operational needs.


Recommendation: The U.S. Department of Defense and other government and private organizations should put the operational requirements of human-system integration on a par with traditional engineering requirements at the beginning of initial requirements analyses to determine which requirements have priority and provide an opportunity for negotiation.


Recommendation: When developing system acquisition programs, the U.S. Department of Defense and other government and private organizations should define potential means for verifying and validating HSI requirements to enable supplier program managers to establish clearly specifiable HSI technical performance measures for contracts.


The procuring agency can drive contractor HSI efforts by establishing the extent to which HSI considerations are accounted for contractually and their degree of importance. Without the inclusion of HSI considerations throughout program definition efforts, contractors have little basis for addressing HSI considerations in their business offer.


Recommendation: The U.S. Department of Defense and other government and private organizations should account for HSI considerations in developing the technical, cost, and schedule parameters in the business offer. In particular, contracts need to reflect an understanding of how human-system integration affects the ability to reuse existing technical solutions or the feasibility of inserting new technologies, as well as an appreciation of how anticipated HSI risks may affect meeting program award fee criteria. It is also important that contractors understand how the HSI elements in their product offerings contribute to achieving market capture goals and subsequently to the viability of their business case.


Overall, the procuring agencies are able to directly influence the extent to which HSI elements are addressed in contracts by establishing well-articulated HSI requirements reflective of end-user needs and working with the contractor to establish verification and validation methods that overcome program management concerns about the typically subjective nature of HSI elements. The contractors or suppliers should take the time to involve HSI practitioners in their business development efforts to account for HSI elements in the business offer, thereby mitigating a portion of potential HSI risks and issues that may arise during program execution.

Managing Integrated System Development so That All Representations Are Kept in Synchronization

In our vision for an integrated system development methodology, a serious concern is configuration control of the various entities that are being developed in order to support it. It is likely that new developments in web technology will be able to support some of these requirements.


Recommendation: Explore the usefulness of the technologies associated with Web 2.0 and related web developments for providing support for configuration control and synchronization of the component representations in a large system development project as they evolve and become more quantitatively defined.
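One way to picture the configuration-control need is a version stamp on each shared representation recording the source versions it was built from, so that stale artifacts can be flagged for rework as their inputs evolve. This is an illustrative sketch, not a committee-specified design; the artifact names are hypothetical:

```python
# Illustrative sketch of keeping shared representations in sync: each derived
# artifact records the version of each source it was built from, so artifacts
# whose inputs have since changed can be flagged for rework.
from dataclasses import dataclass, field

@dataclass
class Artifact:
    name: str
    version: int = 1
    built_from: dict = field(default_factory=dict)  # source name -> version seen

    def derive(self, source: "Artifact"):
        """Record that this artifact was (re)built from `source` as it is now."""
        self.built_from[source.name] = source.version

    def stale_inputs(self, sources):
        """Names of sources that changed since this artifact last used them."""
        return [s.name for s in sources
                if self.built_from.get(s.name, 0) < s.version]

task_analysis = Artifact("task analysis")
prototype = Artifact("interface prototype")
prototype.derive(task_analysis)

task_analysis.version += 1  # context-of-use findings revise the analysis
print(prototype.stale_inputs([task_analysis]))  # prints ['task analysis']
```

A web-based configuration-control service of the kind the recommendation envisions would maintain this dependency record centrally, so every stakeholder sees which representations are current and which await synchronization.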


Recommendation: Support a research program to explore how to provide flexible and open systems with appropriate security protections. The apparent conflict between openness and protection is not a matter of balance or trade-off, but rather of providing strong forms of both attributes.

Traceability and Requirements

The committee has argued for the importance of capturing the context of use in a form that can inform later phases of design. This is important to ensure that operational objectives and constraints and their design implications are taken into account in the system design process, so that the final “as-built” system meets the support objectives and constraints identified in earlier phases. This goal can be met only if methods and tools facilitate capture and traceability of HSI design objectives, decision points (together with the rationale for those decisions) and constraints across design phases.

Our vision is to adapt existing tools or to develop new software tools to facilitate the traceability of HSI design objective implications and how they are being met, to ensure that they are preserved across design phases. This includes traceability across multiple intermediate human-system integration shared representations, starting with (1) outputs of context of use analyses that specify domain demands, stakeholder objectives, human performance needs, and design implications; through (2) the products of intermediate design phases, such as scenarios, personas, models, and prototypes; through (3) the decision rationale and system hardware and software design specifications intended to reflect the support objectives embodied in the design concepts; through (4) the final as-built system. Traceability across design phases is important to ensure that HSI objectives and constraints are preserved across design phases or when modification or redesign is undertaken. It also makes it easier to assess whether the as-built system meets the operational and support objectives and design implications uncovered by earlier design phases.


Recommendation: Adapt existing or develop new methods and tools that facilitate capture and traceability of HSI design objectives, design rationale, and constraints across design phases. Specifically:

  1. Develop shared representations that effectively communicate how the output of one design activity meets the objectives, design rationale, constraints, and design implications uncovered in the prior design phase.

  2. Develop shared representations that effectively communicate essential design characteristics and their rationale, in forms that can be interpreted and used by multiple system development stakeholders—including individuals who did not participate in earlier design activities (see Wampler et al., 2006, for an example of an effort toward this goal).

  3. Adapt existing and develop new software tools to support traceability and updating when changes arising in later design phases require updates to the outputs of earlier design phases.

  4. Adapt existing and develop new tools and techniques for explicitly connecting HSI objectives and design implications to higher level system requirements tracked in formal system requirements tracking systems. This is important to ensure explicit links between HSI design objectives and system-level requirements that reflect contractual commitments.

  5. Adapt existing and develop new methods for generating scenarios that reflect the range of complexities uncovered by context of use analyses. This corpus of scenarios can be used to support development and evaluation of designs, procedures, and training, including human reliability and safety analyses. It could also be used to exercise models and simulations as part of the system development process. The goal would be to ensure that systems have been explicitly designed and tested to support performance across a comprehensive range of representative situations, as identified by context of use analyses. Context of use scenarios are also essential to the meaningful definition of such key performance parameters as response time, reliability, and accuracy.

  6. Develop methods to identify meaningful human (and joint person-computer system) performance metrics that can provide the basis for objective system acceptance criteria. This is important to encourage incorporating HSI objectives as part of formal contractual requirements that are established early in the systems acquisition process. Steps include

  1. Developing methods for identifying individual, team, organization, and joint person-computer system (as well as systems of systems) performance metrics that provide objective measures of factors that are key to successful performance of tasks, of system design, and of accepted systems.

  2. Developing methods for establishing objective acceptance criteria that accurately reflect human-system integration and context of use goals while being straightforward to evaluate.
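The two steps above can be made concrete as objective acceptance criteria expressed as machine-checkable thresholds over measured human-system performance metrics. A minimal sketch in Python follows; the metric names and threshold values are hypothetical, not recommended values:

```python
from dataclasses import dataclass

@dataclass
class AcceptanceCriterion:
    """An objective pass/fail threshold over a measured HSI metric."""
    metric: str        # e.g., task completion time, error rate
    threshold: float   # maximum acceptable measured value
    unit: str

    def evaluate(self, measured: float) -> bool:
        return measured <= self.threshold

# Hypothetical criteria for a joint person-computer task
criteria = [
    AcceptanceCriterion("mean_task_completion_time", 90.0, "seconds"),
    AcceptanceCriterion("operator_error_rate", 0.02, "errors/task"),
]

# Hypothetical measurements from an evaluation session
measurements = {"mean_task_completion_time": 72.5,
                "operator_error_rate": 0.035}

results = {c.metric: c.evaluate(measurements[c.metric]) for c in criteria}
system_accepted = all(results.values())
```

Criteria in this form are straightforward to evaluate and can be attached to contractual requirements, since acceptance reduces to comparing measured values against agreed thresholds.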

Shared Representations

The committee has argued for the importance of shared representations, sometimes referred to as boundary objects. They can serve an important role in fostering communication across the various systems engineering disciplines. Focusing explicitly on representations that communicate across discipline boundaries is novel. Although we have provided many examples of artifacts that could serve as shared representations, research is needed to understand just what this means and how best to achieve it. We identified a specific issue concerning shared representations for task analysis among the specialists supporting the various MANPRINT domains, especially the domains of human factors, staffing requirements, training, and safety. Each tends to undertake its own task analysis, resulting in substantial duplication of effort.


Recommendation: Conduct research to identify characteristics of shared representations that communicate effectively across HSI domains and engineering disciplines. We recommend the following steps:

  1. Identify characteristics of a useful shared representation:

    1. Define what it means to share an understanding.

    2. Characterize the mental models and representations associated with design used by various stakeholders, such as flow charts, blueprints, wiring charts, or Gantt charts, as well as more work-oriented representations, such as prototypes and mock-ups.

    3. Define the areas of overlap between those who are practitioners in HSI domains and other stakeholders that represent fruitful areas in which to develop shared representations.

  2. Consider a specific area, such as cognitive task analysis or risk analysis:

    1. Review and evaluate existing and proposed representations.

    2. Identify common aspects and differences.

    3. Synthesize representations that have the potential for improving communication across stakeholders.

  3. Assemble an IPT representing the MANPRINT domains with the assignment to reach agreement on a single set of generic specifications for what needs to be included in shared representations for task analysis.

  4. Design a multimedia database format and tool, including coordinated video, as a shared representation derived from HSI evaluations. Build on multimedia software and tools used for documenting usability evaluations.

Systems of Systems

There is a gap in the arsenal of HSI methodologies in that many of them (perhaps most of them) fail to scale up to the systems of systems level. For example, usability methods are typically suited to the single user–single interface scenario: How can these methods be adapted to complex systems of systems, and how can organizational modeling approaches (National Research Council, 1998) be applied to human-system integration? Similarly, how can other HSI methods, such as cognitive task analysis and participatory design, be adapted for this complexity? Is cognitive work analysis as suited for network-centric command and control environments as it is for process control systems (Cummings, 2006)? Other methodological issues, such as envisioned worlds (i.e., systems that do not yet exist in any form and may even be revolutionary, resulting in the need for methods that are not anchored in existing systems) and tailorability to the situation, are exacerbated by the complexity of systems.


Recommendation: Conduct research and development on HSI methods for systems of systems in the following manner:

  1. Develop a test-bed that provides a research environment simulating systems of systems in the context of a domain by working closely with domain users, experts, and developers to design the test-bed, and to ensure transition of work in the test-bed to the real world.

  2. Select methods and identify potential ways to adapt them for complex systems. Include state-of-the-art methods and technologies, such as data mining, wikis, social software applications such as blogs and tagging systems, and virtual collaboration and envisioned worlds.

  3. Apply the methods in the context of the test-bed to test reliability and validity, compare methods with each other, and identify methods that scale up and aspects of methods that seem to scale better than others.

  4. Feed back scalable methods and methods for envisioning new systems to system developers.

In addition to a gap in metrics applicable to systems of systems, other problems arise in regard to human-system integration and systems of systems. For example, the human capability for understanding or developing a mental model of a system of systems stretches the limits, raising issues for training, operations, and maintenance of these systems, as well as for determining risks or degree of system resilience (Feltovich et al., 2004; Hollnagel, Woods, and Leveson, 2006). Furthermore, systems of systems inherit potentially incompatible human-system interfaces from the best suppliers and legacy systems. Systems of systems also bring together stakeholders who must collaborate effectively despite a wider range of linguistic, cultural, and technical backgrounds than is found in smaller systems. Finally, systems of systems must support multiple missions with different objectives, constraints, and success-critical stakeholders.


Recommendation: Conduct research and development studies to

  1. Develop mental models and system transparency as applied to large and complex systems of systems.

  2. Undertake efforts toward envisioning methods and models to uncover the sources of complexity and points of interdependency across systems and anticipate their impacts on the people operating in those environments.

  3. Undertake studies to develop methods and tools for identifying and reconciling incompatibilities inherited from the best suppliers and legacy systems.

  4. Undertake studies to develop methods and tools for analyzing and synthesizing candidate multimission solutions and supporting stakeholders’ convergence on a mutually satisfactory solution.

  5. Undertake studies to develop methods and tools for analysis and design of resilient systems that foster adaptability to cope with unanticipated disturbances and change (Hollnagel, Woods, and Leveson, 2006).

Sizing the HSI Effort

Systems engineering maturity models, such as the Capability Maturity Model Integration (CMMI), require organizations to have objective and experience-based methods for estimating systems engineering effort, but in practice the methods for estimating HSI effort are largely ad hoc. In general, the estimation community has a number of methods for estimating effort, but their relative applicability to HSI effort estimation is not well understood.

Major relevant classes of effort estimation include

  1. Bottom-up or activity-based methods, in which individual performers estimate their needed amount of effort and the results are summed.

  2. Top-down or system-based methods, which involve various forms of analogy-based estimation (using comparisons with the effort expended on similar previous systems).

  3. Unit-cost methods, which involve counting the number of work units (operational threads or scenarios, transaction types, etc.), perhaps weighted by complexity, volatility, and reuse, and multiplying the number of work units of each type by the average effort for each type.

  4. Expert consensus methods, which involve IPTs or consensus-determination techniques such as Delphi to converge on an effort estimate.

  5. Parametric models, which attempt to characterize and parameterize the factors that cause variations in effort per work unit and to develop models that account for the variations.

  6. Risk-based “how much is enough” models, which involve balancing the risk of expending too little HSI effort (operational shortfalls, expensive rework, project overruns) against the risk of expending too much (critical path delays in making project progress; non-value-adding effort).

Each of these approaches has strengths and weaknesses.
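Two of these classes, unit-cost methods and parametric models, can be combined in a small sketch: count work units by type, weight them by complexity, and apply multiplicative cost drivers to the base estimate. All unit types, effort figures, weights, and driver values below are hypothetical, not calibrated values:

```python
# Unit-cost component: hours of HSI effort per work-unit type (hypothetical)
EFFORT_PER_UNIT = {"operational_scenario": 12.0,
                   "transaction_type": 4.0,
                   "user_interface_screen": 8.0}

COMPLEXITY_WEIGHT = {"low": 0.8, "nominal": 1.0, "high": 1.4}

def estimate_hsi_effort(work_units, cost_drivers):
    """work_units: list of (unit_type, complexity) tuples.
    cost_drivers: multiplicative parametric factors (e.g., team
    experience, requirements volatility), as in parametric models."""
    base = sum(EFFORT_PER_UNIT[t] * COMPLEXITY_WEIGHT[c]
               for t, c in work_units)
    factor = 1.0
    for f in cost_drivers.values():
        factor *= f
    return base * factor

units = [("operational_scenario", "high"),
         ("transaction_type", "nominal"),
         ("user_interface_screen", "low")]
drivers = {"team_hsi_experience": 0.9, "requirements_volatility": 1.2}
hours = estimate_hsi_effort(units, drivers)  # 27.2 base hours * 1.08
```

In practice, the effort-per-unit figures and driver values would come from the calibration research the recommendation below calls for, rather than being assumed.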


Recommendation: Conduct research to develop, experimentally apply, evaluate, and refine versions of these methods for HSI effort estimation.

Knowledge-Based Planning Aids for Human-System Integration

As described in our vision for knowledge-based planning, currently human-system integration most often takes place as stand-alone activities that are not well integrated with the mainstream system development processes. Research is needed to develop a framework for integrating and adapting HSI methods and techniques into complex system development environments, supported by a tool implementing the framework that can be used to select the most cost-effective methods and techniques based on operational, business, organizational, and project needs. Human-system integration and systems engineering activities rely on different methods, techniques, languages, and tools.

The basis for integration exists in ISO/IEC 15288 (systems engineering—system life-cycle processes), ISO/TR 18529 (human-centered life-cycle process descriptions), and ISO/PAS 18152 (specification of the process assessment of human-system issues), as well as in approaches to human-system integration; see International Organization for Standardization (2000b, 2002, 2003). Schaffer (2004) has published an example of how to institutionalize usability.

Some example planning tools that could be leveraged to support this kind of development are

  • Logistics planning tools, such as DART and Cougaar from the Defense Advanced Research Projects Agency.

  • Hardware, software, and systems engineering resource estimation tools, such as the Price Systems, Galorath, SEER/SEM, and USC COCOMO/COSYSMO tool suites.

  • Risk assessment tools, such as Active Risk Management, @Risk, the Software Technology Risk Advisor (Toth, 1995), and Expert COCOMO/COCOTS (Madachy, 1995; Yang et al., 2006).

  • Experience base management systems, such as those at the NASA–University of Maryland’s Software Engineering Lab and the Mitre Corporation’s risk repository.


Recommendation: Develop a framework for integrating and adapting HSI methods and techniques into complex system development environments.


Recommendation: Establish a top-down framework for integrating human-system integration with contrasting development environments to provide the common ground to leverage the integration of HSI methods, languages, and techniques into systems development.


Recommendation: Develop a tool for selecting the most cost-effective methods and techniques for human-system integration based on business, organizational, and project needs and for integrating them with system engineering processes. There is currently little agreement in textbooks or the literature on appropriate methods and techniques, with conflicting advice from different sources.


Recommendation: Based on the framework outlined above, develop a set of criteria for selecting methods and techniques derived top-down from specific organizational, project, and life-cycle needs. The criteria will promote effective integration with mainstream system development processes.

  1. Provide estimates of the relative costs and benefits that would be obtained by using different combinations of HSI methods and techniques.

  2. Develop support tools incorporating the criteria.
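A selection tool of the kind recommended above could score candidate HSI methods against stated project needs and rank them by a crude benefit-per-cost measure. The catalog entries, relative costs, and benefit scores below are placeholders for illustration only, not recommendations about the methods themselves:

```python
# Hypothetical catalog: method -> (relative cost, benefit score per need)
CATALOG = {
    "cognitive_task_analysis": (5.0, {"safety": 4, "usability": 3}),
    "heuristic_evaluation":    (1.0, {"safety": 1, "usability": 2}),
    "participatory_design":    (3.0, {"safety": 1, "usability": 4}),
}

def rank_methods(project_needs):
    """Rank methods by total benefit over the project's stated needs
    divided by relative cost: a crude cost-effectiveness score."""
    scored = []
    for method, (cost, benefits) in CATALOG.items():
        benefit = sum(benefits.get(n, 0) for n in project_needs)
        scored.append((benefit / cost, method))
    return [m for _, m in sorted(scored, reverse=True)]

ranking = rank_methods(["usability"])
```

The research challenge is precisely that the numbers in such a catalog do not yet exist in any agreed form; the criteria and cost-benefit estimates called for above would supply them.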

Developing Human-System Integration as a Discipline

This report makes the case that improved system performance and reduced development and operational risk would result from proper attention to the human user’s capacities and limitations and from better integration of user requirements with technical specifications, especially concerning the introduction of computer support and automation. The committee has established a vision for human-system integration to emerge as a distinct discipline. Such a discipline would be made up of components of systems engineering; occupational health and safety; human factors and ergonomics; manpower, personnel, and training; and business economics. It would provide specialists who could serve as the lead on HSI IPTs or as the HSI representative on multidisciplinary IPTs and who, with the appropriate experience, could be selected as system development program managers. As systems and systems of systems become increasingly complex, the kind of expertise associated with this discipline will be a requirement.


Recommendation: Human-system integration should be developed as a distinct discipline. Several questions must be answered, and several actions taken, to reach this goal:

  1. What is HSI expertise?

    1. Building on the work of Booher (2003a, 2003b), develop a consensus-based taxonomy of skills, knowledge, and abilities by surveying leading HSI subject matter experts in both commercial and military domains. Use the definitions and assumptions from Booher (2003a, 2003b) and from this report to define human-system integration and to design the survey instrument.

    2. Perform a market study that quantifies the benefits and costs associated with formalizing HSI curricula and continuing education programs in current or emergent academic departments. Kleiner and Booher (2003) provide a template and details for such a curriculum. Experience gained thus far with the Naval Postgraduate School HSI program can be benchmarked for additional education programs. There is a need to serve both the military and nonmilitary communities.

  2. What does it mean to be proficient at human-system integration?

    1. Benchmark current best practices and requirements derived from this report; create a standardized HSI program management job description (knowledge-skills-abilities expectations).

    2. Assuming the results of the market study are suggestive of further development, fund a number of pilot HSI graduate programs. The details of curricula and proficiency requirements will be established by the academicians in these departments, with input from HSI subject matter experts.

  3. What is the rationale for selecting alternative HSI methods for different purposes?

    1. Research is needed to establish the reliability, validity, and scalability of HSI methods, as well as the knowledge, skills, and abilities required to carry them out. These results are needed so that contractors can justify their selection of methods and procuring organizations can evaluate their selections.

  4. How can the discipline grow internationally?

    1. Establish a source of HSI research funding that requires cross-cultural or international teams. This can be funded by a single agency (e.g., the National Science Foundation or DoD) or can be multiagency (e.g., Department of Commerce/European Union).

    2. Establish international HSI symposia within recognized professional conferences (e.g., International Ergonomics Association, North Atlantic Treaty Organization).

    3. Establish dedicated international HSI meetings and conferences.

    4. Establish or reinforce HSI technical groups in relevant professional organizations, such as the Human Factors and Ergonomics Society, which has a systems development technical group, and the International Council on Systems Engineering, which has an HSI working group; both can help promote the recommendations in this report.

    5. Establish an International Journal of Human System Integration to disseminate applied research and appropriately evaluated case studies related to human-system integration. Ideally, a relevant government agency or university with appropriate funding would host such a journal. The objective of the journal would be to serve as a repository of applied research, including appropriately designed and evaluated case studies that will expand the depth and breadth of knowledge, skills, and abilities associated with human-system integration worldwide, across application domains and sectors.

Fostering More Synergy Between Research and Practice

One factor that has hampered the advancement of human-system integration as a discipline is the chasm that exists between research and practice. Practitioners are not sufficiently aware of relevant research, and research is not sufficiently informed by the insights and body of knowledge gained from practice (Norman, 1995; Woods and Christoffersen, 2002). There is a need to develop more effective ways to abstract knowledge and models from individual application contexts in a form that can be readily transferred to new application domains. While there are many examples of excellent HSI designs, their successes rely heavily on local knowledge and expertise. There is a need to develop methods and tools to more effectively leverage the knowledge and insights gained from practice and improve the cross-dialogue between research and practice.


Recommendation: Develop methods and tools to facilitate knowledge generalization and transfer across application domains and improve cross-fertilization between research and practice.

  • Develop methods and tools for extracting abstract descriptions of behavioral patterns and the conditions that shape them that can be generalized across specific application domains (e.g., conditions that lead to specific error forms or foster specific types of expertise).

  • Develop abstract reusable design patterns that embody specific aiding principles and can be transferred across application domains.

  • Create publication vehicles for presenting field studies and design case studies that offer generalizable insights (the new Journal of Cognitive Engineering and Decision-Making is one such example).

  • Encourage practitioner-oriented publications that synthesize research results in a form that can be readily assimilated and applied by HSI practitioners.

Methods for Defining Opportunities and Context of Use

We make research and development recommendations in two major areas. First, we recommend the development of software tools to capture and disseminate the results of context of use analyses so that they can more easily be applied in various phases of system life-cycle development. Second, we make a series of recommendations concerning the active participation of users in engineering design, the future of unobtrusive, passive data collection, and the ethical considerations of both.

Tools to Support Capture and Dissemination of Results of Context of Use Analyses

The committee has argued for the importance of capturing the context of use in a form that can be more readily communicated and used throughout the HSI design life cycle. Improved software tools are needed to support capture, organization, dissemination, update, and retrieval of results of context of use analyses. This includes capture of the results of task and cognitive task analyses, field observations, participatory analysis and design activities, contextual inquiry, and work domain analyses. The research objective is to provide a suite of software tools to enable analysts to build and maintain a core corpus of work domain and context of use knowledge that can be updated easily as new information is learned, communicated to stakeholders effectively, and accessed and reused more readily across the life cycle of a development project. This core corpus of knowledge would then be available to inform design, the development of procedures, the development of training, the development of safety case submittals, etc. Such a resource would be especially valuable in complex design projects whose development can span multiple years and multiple organizations. Some promising research efforts toward developing core multimedia knowledge repositories include the work domain analysis workbench developed by Sanderson and her colleagues (Skilton, Cameron, and Sanderson, 1998) and the CmapTools software suite created at the Institute for Human and Machine Cognition. More research is needed to produce more robust systems with broader applicability.
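One way to make such a core corpus updatable and retrievable across the life cycle is to store each finding as a tagged record with provenance, so that different stakeholders can pull the findings relevant to their concern. A minimal sketch follows; the record fields, tags, and example findings are assumptions for illustration, not a proposed standard:

```python
from dataclasses import dataclass, field

@dataclass
class ContextOfUseRecord:
    """One captured finding: a field observation, task-analysis result,
    expert-strategy video, etc., with provenance for later update."""
    summary: str
    media: list = field(default_factory=list)   # paths to video/images
    tags: set = field(default_factory=set)      # e.g., {"safety", "training"}
    source: str = ""                            # analysis that produced it

corpus = [
    ContextOfUseRecord("Operators batch alarms during shift change",
                       tags={"workload", "safety"},
                       source="field observation"),
    ContextOfUseRecord("Novices skip confirmation step under time pressure",
                       tags={"training", "safety"},
                       source="cognitive task analysis"),
]

def retrieve(corpus, tag):
    """Let any stakeholder pull the findings relevant to their concern."""
    return [r for r in corpus if tag in r.tags]

safety_findings = retrieve(corpus, "safety")
training_findings = retrieve(corpus, "training")
```

Because each record carries its source, a finding can be revisited and updated when the underlying analysis is redone, which is what keeps the corpus usable across multiyear, multiorganization projects.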


Recommendation: Conduct research to provide a suite of software tools to enable analysts to build and maintain a core corpus of work domain and context of use knowledge. Specific steps include

  1. Identify characteristics of a core corpus of work domain and context of use knowledge required to support a variety of stakeholders across the system life cycle. This would include HSI system designers, individuals responsible for system verification and validation, individuals responsible for development of risk analyses and safety case submittals, and individuals responsible for personnel selection, personnel training, and procedure and document development for system operation and maintenance.

  2. Explore multimedia databases and software architectures to support development and retrieval of a variety of shared representations derived from context of use analyses. This would include graphic representations of domain and context of use knowledge, such as concept maps and abstraction hierarchies, and multimedia capture of elements of the work context and proposed design concepts (e.g., videos illustrating expert strategies, still images of work environments illustrating user-generated artifacts and workarounds compensating for poor system designs, sketches of design concepts generated during participant design sessions).

  3. Identify and develop a demonstration project that would exercise and evaluate the approach in a particular complex application—ideally one that involves system of systems design challenges.

User Participation in Systems Engineering and Event Data Analysis and Their Ethical Implications

In the vision for user participation in systems engineering, we have argued for new ways to understand conditions in the field and the work practices of end-users through unobtrusive, passive logging and interpretation of their activities. We have also argued that the new technologies of Web 2.0 and related web development will allow end-users to modify, create, and revise systems that are already in use, thus providing a significantly greater role for end-users in designing the systems they will use. Finally, we have argued for greater use of what we called event data analysis: collecting users’ actions and other occurrences in the field and finding emergent patterns in the data. These trends converge into three related sets of recommendations.


Recommendation: Conduct a research program with the goal of revolutionizing the role of end-users in designing the system they will use.

  1. Conduct lab and field studies to understand current practices in server-log data extraction and analysis and develop tools to efficiently generate logs whose format is more useful to analysis than server logs.

  2. Develop tools to facilitate re-representing automatically generated data, such as server logs, reflecting users’ perspectives on their work, their tools, and their experiences. We note that some of these issues may also involve issues of credit or payment or digital rights management of the users’ ideas (intellectual property). Specific research and development activities include

    1. Conduct research (lab and field studies) and develop designs and technology to support user control or influence over the online display of the user’s identity, i.e., impression management (Goffman, 1956) and reputation management (Beard, 1996).

    2. Conduct research (lab and field studies) and develop designs and technology to support user control or influence over descriptions and representations of the user’s experience (individually or collectively with other users).

    3. Conduct research (lab and field studies) and develop designs and technology to support users in developing shared representations that effectively communicate the users’ needs, goals, intentions, strategies, and user-generated solutions to problems (individually or collectively with other users).

    4. Conduct research (lab and field studies) exploring the usefulness of collaborative communication technologies for accomplishing the goals of improved user participation in system development.

  3. Conduct research (lab and field studies) and develop designs and technology to support users in transforming existing technologies and systems into modified or new systems that meet their needs. This process has variously been described as “re-invention,” “evolution,” “evolvability,” “appropriation,” and “field-modification.” Specifically:

    1. Identify, develop, or refine (as necessary) new software architectures that make it easier for users to modify systems or tailor configurations to support new uses, for subsequent use by other users or for subsequent “harvesting” by organizations.

    2. Develop tools to support users in maintaining credit or ownership for their innovations.

  4. Conduct research on the interactions of the new technologies—such as the introduction of sensors in objects (spimes) and locations (geospatial web)—in the targeted contexts. Specifically:

    1. Determine what the introduction of spontaneously communicating ubiquitous devices will do to the work. Develop methods for users to reform or reshape those technologies to change those interactions, as needed.

    2. Determine how users will understand the functionality and security or privacy challenges of the new sensing and data integration technologies. Determine effective ways of presenting these new technologies and new challenges to end-users.

    3. Identify the users’ mental models of the technologies. Determine how the technologies should be changed or packaged to match these mental models. Determine what education or training will be needed on the part of end-users.

    4. Determine how these new technologies can be made useful and usable by end-users in offices, homes, and military theatres through reinvention or field modification and other practices.


The first set of recommendations in this section explored the use of these data and technologies for end-users’ recording of data and experiences, harvesting of insights, communication of lessons learned, and expression of needs and recommendations. By contrast, the second set of recommendations explores the analyst’s role in the use of such data for somewhat different purposes. In these recommendations, we focus on a more analytic approach to real-time data collection, with an emphasis on data collection that does not intrude on the users’ consciousness and therefore may provide a more traditional view of time and motion and other quantitative measures of how users do their work. Data such as keystrokes, communications, emails, and web sites visited can be logged unobtrusively over the course of a day, weeks, or years as the user performs the task and can potentially serve as a rich source of ethnographic and usability data for human-system integration. In addition, Web 2.0 and the emerging concept of “attention data” (i.e., Where does the user spend time and effort?) promise to create innumerable possibilities for rich yet unobtrusive data collection.


Recommendation: Refine event data analysis methods and develop new methods in line with the following series of interrelated activities:

  1. Explore the data sources described above for types of data that can be collected without interfering with the users’ ongoing work (e.g., keystroke analysis, observational cameras, and transportation data).

  2. Instrument a setting (real or test-bed) for collection of event data of a targeted variety to understand the practical implications for obtaining these kinds of data.

  3. Collect other indices of performance/usability/cognition as well to serve as criterion measures.

  4. Request users to provide their own perspective on their work (e.g., according to selected methods in the first set of recommendations in this section).

  5. Apply, adapt, and develop data-mining or pattern recognition algorithms to identify regularities, anomalies, and changes in the data.

  6. Map the patterns onto meaningful outcomes by associating them with other criteria.

  7. Derive a small number of data structure standards for the records of such a behavioral instrumentation log, to facilitate quick analysis, searchable storage, and (when appropriate) data exchange of behavioral instrumentation logs in a (secured) group of collaborators or analysts.
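Steps 2, 5, and 6 above can be sketched in miniature: events from an instrumented setting are grouped into per-user action sequences, the most common pattern is mined, and deviations from it are flagged for the analyst. The event format, action names, and the simple modal-pattern heuristic are illustrative assumptions, standing in for the data-mining algorithms the recommendation calls for:

```python
from collections import Counter

# Hypothetical unobtrusive event log: (timestamp_s, user, action)
event_log = [
    (0.0, "u1", "open_form"), (2.1, "u1", "fill_field"),
    (3.0, "u1", "fill_field"), (4.2, "u1", "submit"),
    (9.0, "u2", "open_form"), (9.4, "u2", "submit"),   # skipped fields
    (15.0, "u3", "open_form"), (16.8, "u3", "fill_field"),
    (18.1, "u3", "fill_field"), (19.5, "u3", "submit"),
]

def action_sequences(log):
    """Group logged events into per-user action sequences (step 2)."""
    seqs = {}
    for _, user, action in log:
        seqs.setdefault(user, []).append(action)
    return seqs

def find_anomalies(seqs):
    """Flag users whose sequence deviates from the most common
    pattern: a stand-in for steps 5-6, mining regularities and
    mapping departures from them onto meaningful outcomes."""
    counts = Counter(tuple(s) for s in seqs.values())
    modal = counts.most_common(1)[0][0]
    return [u for u, s in seqs.items() if tuple(s) != modal]

anomalous_users = find_anomalies(action_sequences(event_log))
```

A flagged deviation is only a candidate pattern; as step 6 notes, it becomes meaningful only when associated with other criterion measures (e.g., whether skipping fields correlates with errors).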


The collection of these kinds of data raises new issues in security, privacy, and (ultimately) ethics. Some organizations provide guidelines or policies in these areas, but even in those cases, there are many questions for which the researcher/practitioner/engineer must take responsibility. Many systems inform the user that her or his data may be used for research purposes. For large-scale systems, users often form a reasonable assumption that their limited use of the system will be under the radar of any research program. However, contemporary and near-future quantitative techniques address very large data sets and can easily find individual users who match certain search criteria. Indeed, many of the commercial applications of attention data operate on just such a basis. Thus, no one can be confidently under the radar any longer, but most users are not aware of this change.

Convergently, there have been major advances in data mining and data extraction by several communities whose interests are not necessarily aligned with the interests of the users, such as advertisers, fraud artists, and intelligence agencies (e.g., legitimate agencies as well as competitive agencies in the commercial space and enemy agencies in the military intelligence space). Various low-visibility industries exist for the purpose of understanding users’ interests and habits from the perspective of manipulating or taking advantage of them. When researchers or engineers compile large data sets, they are producing targets of high value for this shadowy industry.

A third set of issues arises in different national policies. In the United States, most users consider their data privacy to be their own responsibility. By contrast, countries in the European Union are more likely to have rules that govern the privacy of personal data, in which personal data can in some cases include not only private records created by an individual, but also private and public records that make reference to that individual.


Recommendation: Conduct research on technologies to protect privacy and security and on the broader ethical and legal issues surrounding privacy and security.

  1. Develop a graduated scale of data privacy. Some data about users should be generally available; other data should have greater protection. What models of data protection are technically feasible? What options for user privacy and permission should be provided, beyond the two current approaches summarized as “opt-in” and “opt-out”?1 How can the available privacy options be effectively presented and explained to users? What technology and user experience are required to allow users to define and implement their own data protection and security policies?

  2. Examine the programs of nonprofit organizations that have proposed to store users’ data in a protected repository, so that users can negotiate for some benefit in exchange for allowing other organizations to access and use their data.2 How can these options be implemented technically on a large (market) scale? How can they be effectively presented and explained to users? What commercial models of benefit for personal-data access transactions should be available? What fraud protections are possible? What are effective mechanisms through which users can (a) make their personal data available to third parties and then, when needed, (b) withdraw both their permission and their data from those third parties?

  3. Explore ways in which large data sets of user information could be made available to authorized users yet protected from unauthorized users. Determine ways to (a) detect unauthorized access, (b) record the extent of unauthorized access to data stores, and (c) automatically notify affected users, so that they know what kinds of self-protection to invoke following such unauthorized access.
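One possible shape for a graduated privacy scale is sketched below. The three tiers, their names, and the consent rules are illustrative assumptions only; the point is to show how opt-in and opt-out regimes could coexist at different sensitivity levels within a single policy check:

```python
from enum import IntEnum

class Sensitivity(IntEnum):
    # Hypothetical graduated scale: higher values demand stronger protection.
    PUBLIC = 0       # generally available
    AGGREGATE = 1    # shareable by default, in de-identified aggregates
    PERSONAL = 2     # requires explicit opt-in consent

def may_share(level, opted_in=False, opted_out=False):
    """Sketch of a tiered sharing policy: public data is always
    shareable; aggregate data follows an opt-out regime; personal
    data follows an opt-in regime."""
    if level == Sensitivity.PUBLIC:
        return True
    if level == Sensitivity.AGGREGATE:
        return not opted_out   # opt-out: shared unless the user objects
    return opted_in            # opt-in: shared only with explicit consent

print(may_share(Sensitivity.PERSONAL, opted_in=False))  # -> False
```

A real graduated scale would need many more tiers and a way for users to inspect and override the defaults, which is precisely the research question posed above.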

Methods for Defining Requirements and Design

The committee makes recommendations concerning the research and development needs related to human-system development and to developing prototypes of organizations and training programs.

Human-System Model Development

Human-system models have been shown already to be useful in the system acquisition and development process as a means to reduce uncertainty and development risk; however, they are not employed to the extent that even the current state of development would justify. There is a perception that models that reflect human performance characteristics are too hard to use or understand. Potential users focus on the limitations and not on the advantages. In fact, models exist at all levels of complexity, from simple mathematical expressions to complex computer programs. That said, it is true that the more sophisticated models, particularly those derived from discrete event simulators and cognitive architectures, are often brittle, costly, and time-consuming to develop and are not yet well validated for all uses in design. A wide variety of both research developments and policy changes have the potential to affect the usefulness and usability of human-system models.

1 In an opt-in approach, the user’s permission (e.g., to store or share data) is explicitly requested, and no action is taken unless the user takes an action to permit storage or sharing of data. In an opt-out approach, the user is informed that storage and/or sharing will be done unless the user takes action to prevent it or to revoke permission. In this latter case, the burden is on the user to prevent the storage or sharing of her or his data. In general, users prefer opt-in approaches, whereas merchants prefer opt-out approaches.

2 See http://www.attentiontrust.org.


Recommendation: Conduct an in-depth study of how human-system models are created, used, and shared, together with their strengths and limitations. The study should consider not only the various structures and architectures in which to build models, but also how data are acquired and represented in these models. What makes a model easy or difficult to use? To what extent are models reusable? Why aren’t they reused more often? Such a study would support improved education about how to develop models as well as provide recommendations for improving the quality, robustness, usefulness, and usability of the models that are developed. The study should include a retrospective review of a range of models, from Fitts’s law, signal detection theory, GOMS, and Micro-Saint-based models to complex cognitive architectures, such as ACT-R and EPIC.
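At the simple end of this spectrum, Fitts’s law predicts movement time to a target from its distance and width. A minimal sketch follows; the coefficients a and b are placeholder values, since in practice they are fitted empirically for a given device and task:

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Fitts's law: MT = a + b * log2(2D/W), where D is the distance
    to the target center and W is the target width.  The intercept a
    and slope b are device- and task-specific; the defaults here are
    illustrative placeholders, not fitted values."""
    return a + b * math.log2(2 * distance / width)

# Widening the target at a fixed distance reduces predicted movement time.
print(fitts_movement_time(distance=200, width=20))
print(fitts_movement_time(distance=200, width=40))
```

Even a one-line model like this illustrates the review's question of reusability: the functional form transfers across applications, but the coefficients must be re-estimated for each.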


Recommendation: Pick, as a case study, a class of models at an intermediate level of complexity and invent a high-level human-system model development language, with the goal of making the building of such models as simple as customizing an Excel spreadsheet for a specific application.


Recommendation: Explore the applicability of computer learning and adaptation algorithms for growing more robust models.


Currently, in models such as IMPRINT, the user models included in the systems are useful, but the theories from which they are derived often lead to a basically linear, single-thread model of human attention to tasks. Increasingly, multiple-task management, the impact of interruptions, and the role of situation awareness in decision making and planning are important in complex system analysis.


Recommendation: Expand the fidelity of the user representations to include these aspects of behavior and how these aspects change with time on task, workload, heat, stress, and other behavior moderators.


Recommendation: Expand models, particularly human behavior representation and cognitive models, to include the effects of culture, social processes, and emotion. This will also require gathering additional data, as many studies in these areas are not performed with the application to models in mind.


There is much research on validating models, and it is recognized as a very complex and difficult problem. The consensus is that face validity is inadequate, but that achieving “application validity” is realistic and should be required. Application validity is defined as the degree to which a model or simulation is a faithful representation of the real world from the perspective of the intended users. Models are developed for specific purposes, and it is validation with respect to those purposes that is important.


Policy Recommendation: Require all human-system performance models that are to be used in system acquisition risk reduction to meet the standards of application validity.


Recommendation: At a research level, better validation criteria need to be created. How good is good enough? Better model validation criteria are needed for specific model types and for models in general. Currently, when models are applicable, how much risk they reduce, and how valid they need to be to reduce risk are not well defined or even well explored.


Models and simulations have the potential to serve as effective shared representations for communicating the state of system development across the range of stakeholders. Their major uses will be to support coordination and integration of multiple viewpoints, to provide shared envisioning of operational concepts and predicted performance characteristics, and to support system integration. Current examples fall short of achieving the required goals and require further development.


Recommendation: Conduct research on how to make the design rationale and the relationships among model and simulation assumptions, execution, and derived performance measures more transparent and understandable.

Prototyping Training and Organizational Design

The committee has explained the role of prototyping in the systems development process. One of the challenges in developing integrated systems is balancing the prototyping of elements of the proposed system in isolation (in order to support parallel development and validation of the elements) against prototyping the collection of subsystems (in order to evaluate the overall behavior of the linked subsystems and the trade-offs among them). In conventional systems engineering practice, both are done. The real challenge comes when the human operator, team, or organization must be considered in a more inclusive HSI design effort. It is clear from increasingly complex system development efforts that the earlier HSI issues can be addressed, the better.

One way of addressing the challenge early is to create, like the early machine system prototypes, early prototypes of the human organization that will be interacting with the mediating technology or system hardware and software components. Organizational prototypes could take many forms. They could be simply verbal or descriptive concepts and theories, involving walkthroughs or talkthroughs with hypothetical organizational structures. Or the rules defining the relationships between organizational elements could be defined and individuals could stand in for each organizational element, a kind of interactive role-playing, and carry out prototypical interorganizational operations (i.e., follow predefined scenarios) while observing these rules and constraints (for summaries of successful applications of these types of approaches, see Bjerknes, Ehn, and Kyng, 1987; Bødker et al., 2004; Muller, 2007; Muller et al., 1997). Alternatively, organizational elements could be represented by computational models and simulations, including the rules and constraints for interacting with each other (National Research Council, 1998). For example, a synthetic teammate based on a computational model could serve as a training or operational aid, as well as a component prototype for system design (Gluck et al., 2006). This means that one must understand not only all the interrelationships or links involved (human-to-human, human-to-machine, and machine-to-machine interactions) but also the nonmachine elements or nodes. As in any prototyping problem, choosing the appropriate level of resolution (person, team, organization) will become even more critical as person-machine coproduction defines the success or failure of the teams, organization, and system involved.

Similarly, it is important to consider prototyping the training program of potential team members as early in the design process as possible. This involves postulating alternative ways the training could be accomplished and testing their usefulness at varying levels of specificity as the design matures. Success in this approach will mean that the system design, organization, and training program all co-influence each other. This kind of work is at such an early stage of development that there are many unanswered questions:

  1. What does it mean to prototype an organization, and what is the current state of the art? Are there differences between prototyping formal and informal organizations? What are the implications for prototyping static versus dynamic teams and organizations? And how are artifacts, the nonhuman components of the system, accounted for?

  2. Are the prototyping issues different for individual, team, and organizational prototypes? What disciplines should be involved in supporting prototyping at the individual (cognitive psychology), team (social psychology), and organizational (sociology, economics, anthropology, political science) levels?

Recommendation: Undertake a review of the current state of the art in prototyping organizations. Define a set of requirements that effective prototyping methods should meet. Select a candidate relatively complex domain, perhaps a system of systems domain, and define alternative organizational structures that might be effective in this domain. Define alternative prototyping methods designed to span the range from very abstract to very specific. Apply the different methods to evaluate the different possible organizations for this domain and revise the methods until they meet the requirements proposed.


Recommendation: Undertake a review of the current state of the art in prototyping training systems. Define a set of operational domains and compare training requirements. Examine use of synthetic agents in the development of training prototypes.


Methods for Evaluation

We have discussed two classes of evaluation methods: usability evaluation and risk analysis. Here the committee provides research and development objectives in both areas.

Improve the Use of Usability Objectives

The quantification of usability goals through the use of usability objectives is a recognized human factors and HSI best practice for many kinds of systems, but such objectives are not employed often or consistently. The main goal of specifying usability objectives (also known as usability requirements, usability goals, performance goals, or human factors requirements) is to create a metric that can be applied during usability testing, providing quantitative acceptance criteria for the test. Usability objectives are one way to create a quantitative quality-related goal and to avoid the qualitative conclusions that are sometimes claimed about devices (e.g., “This device is user-friendly”). Typically, quantified usability objectives include

  • Human performance goals (objective goals), such as task completion time, success rate or error rate and type, learning time, and accuracy.

  • Efficiency (total number of steps and missteps), such as the number of references to instructions or online help.

  • User satisfaction (subjective goals) using such approaches as rating scales (Likert, e.g., agree or disagree or comparative ratings) and semantic differential (pick rating between two opposite adjectives).

In systems in which usability objectives are relevant, they should be validated as part of customer requirements (using common market research techniques, such as interviews, surveys, and focus groups) and compared with competitive benchmarks (usually obtained from published studies or from comparative usability testing of best-in-class competitors’ products). Only a few critical task-related usability objectives are typically necessary. Examples of quantitative usability objectives or goals are

  • 90 percent of experienced nurses will be able to insert the infusion pump tubing set on the first try with no instructions. And 100 percent will be able to correct any insertion errors.

  • 90 percent of experienced anesthesiologists will be able to calibrate the cardiac monitor within 2 minutes with no errors.

  • Experienced operators working in port security will be able to detect potential dangerous substances with a sensitivity of d’ = 3 or greater.

  • Unmanned aerial vehicle operators will be able to fly 3 planes at the same time in level flight and be able to land the 3 planes within 15 minutes, with no more than a 5-percent failure rate.

  • 80 percent of experienced maintenance technicians will rate their satisfaction with the usability of device X as 7 or higher on a 10-point satisfaction scale.

  • After reading the quick reference card, 90 percent of experienced clinicians will be able to properly configure the display on the first try to show the two ECG lead traces.

  • 80 percent of experienced intensive care unit nurses will prefer the readability of the display for the latest generation ventilator monitor compared with the existing monitors.

  • 95 percent of technicians with no prior experience with this type of network management system will achieve the target mastery level in 2 or fewer hours of use.
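Objectives of this kind lend themselves to statistical acceptance testing. The sketch below is illustrative only; the sample size, the observed data, and the use of an exact one-sided binomial test are assumptions, showing how one could ask whether an observed success count is consistent with a 90 percent target:

```python
from math import comb

def binomial_meets_target(successes, n, target=0.90, alpha=0.05):
    """One-sided exact binomial test used as an acceptance criterion:
    the objective fails only if observing `successes` or fewer out of
    `n` would be too unlikely (p-value < alpha) were the true success
    rate equal to `target`."""
    p_value = sum(comb(n, k) * target**k * (1 - target)**(n - k)
                  for k in range(successes + 1))
    return p_value >= alpha  # True: data are consistent with the target

# Hypothetical result: 16 of 20 nurses inserted the tubing set
# correctly on the first try.
print(binomial_meets_target(successes=16, n=20))  # -> True
```

Note that with small samples such a test has little power, so a passing result means only that the data do not contradict the objective, one reason the research below on sample sizes and acceptance statistics is needed.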


Recommendation: For cases in which usability objectives have been shown to be useful, conduct research to develop better ways to investigate, set, and use them as acceptance criteria. This research would specifically show the value and limitations of usability objectives in achieving overall project goals. Specifically:

  1. Improve methods for demonstrating when usability objectives are valuable by surveying DoD and commercial projects on their successes and failures in using usability objectives and collecting examples of usability objectives from surveys and literature reviews; create a taxonomy of usability objectives.

  2. Improve methods for creating and setting usability objectives by surveying methods that have been used and their strengths and weaknesses; conducting experimental research on whether using risk-management techniques, such as failure mode and effects analysis and fault tree analysis, to set usability objectives helps projects succeed and meet project/mission goals; and searching the literature in other domains, such as software and electrical engineering, to learn how they have used quantifiable performance objectives and how they set and validate them.

  3. Improve methods for validating usability objectives by surveying validation methods that have been used and their strengths and weaknesses and conducting literature reviews of techniques in other domains, such as marketing research, used to validate their objectives.

  4. Improve methods for using usability objectives as a subset of project acceptance criteria by surveying techniques (including their strengths and weaknesses) for using usability objectives as acceptance criteria, including hypothesis testing and appropriate statistical techniques that have been used.

Maximize the Cost-Effectiveness of Usability Evaluation

Although usability evaluation methods are widely used, no systematic and generalizable research has been carried out on the study size, scope, or protocols that cost-effectively identify the most important usability problems. Nielsen et al. (1994) analyzed the results of usability studies in the early 1990s to produce a formula relating the number of test participants to the proportion of usability problems identified. This formula has been criticized as applicable to only a limited class of products. Molich and Nielsen (1990) have shown that different usability evaluation procedures identify different subsets of problems; however, there is not yet a good matching of problems to evaluation procedures. Furthermore, it is rarely cost-effective to evaluate every permutation of user type and task.
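The formula in question models the proportion of problems found by n participants as 1 − (1 − p)^n, where p is the probability that a single participant exposes a given problem. A minimal sketch (the default p ≈ 0.31 is the average value reported in the studies Nielsen analyzed; it varies widely across products, which underlies the criticism just noted):

```python
def proportion_found(n_participants, p=0.31):
    """Nielsen's estimate of the proportion of usability problems
    found by n participants: 1 - (1 - p)**n.  The per-participant
    discovery probability p is product-dependent; 0.31 is the
    average from the early-1990s studies, not a universal constant."""
    return 1 - (1 - p) ** n_participants

# Diminishing returns: each added participant finds mostly known problems.
for n in (1, 3, 5, 10):
    print(n, round(proportion_found(n), 2))
```

Generalizing this formula, for example by letting p depend on system complexity or user diversity, is exactly what the recommendation below calls for.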

There is little applied research evidence and few practices to assist a practitioner in deciding how many studies are cost-effective for reducing the risk of human-system mismatches in a particular development environment, or in determining which groups or strata of users to include. Market researchers have developed efficient methods from sociology for defining and using segmented or stratified samples, and there has been a small amount of research in human-computer interaction (principally by Siegel and Dray, 2001) to integrate these market-oriented methods with traditional methods.


Recommendation: Conduct research to generalize the sample size formula developed by Nielsen so that it can be applied to a wider range of products and systems, including such factors as system complexity, job function diversity, end-user demographics, and other relevant factors.


Recommendation: Conduct research to understand which evaluation procedures are most appropriate for different types of products and systems, and how the evaluation procedure can be refined to maximize the number of problems identified most cost-effectively while producing valid and reliable results.


Recommendation: Conduct research to understand how to choose culturally appropriate evaluation methods, how to treat each method as a lens on a potentially larger set of usability problems, and how to translate from the constraints of a particular evaluation method into a more general or canonical description of the usability problems that were discovered and clarified by that particular method.


Recommendation: There is often a shortage of skilled personnel to carry out usability evaluations. Conduct research to establish whether members of a development team without a formal human factors background could be trained to carry out simple usability evaluations that produce valid and reliable results or, failing that, to understand the trade-offs in data collection and quality when HSI methods are carried out by untrained practitioners.


Recommendation: Conduct research to establish how precisely the evaluation procedure needs to be specified to ensure that two organizations will produce acceptably similar usability measures for summative evaluation.

Identify and Assess HSI Risks

It is often stated in the HSI discipline that usability or human factors risks that are not addressed in the engineering design process are the basis for catastrophic errors. The profession tends to fall back on the history of well-known events, such as Three Mile Island, Bhopal, and the Vincennes downing, to illustrate the perils of failing to address human-system issues in design. While these examples can be compelling, they do not provide a rigorous basis for understanding risks in developing systems or for analyzing potentially catastrophic error conditions that may result from human operation. There are many human error classification schemes, but they tend to be locally focused and do not scale up to system-wide implications.

We envision the initial research activity resulting from this recommendation to be the development of a comprehensive database of HSI risks that are described at multiple levels and from multiple perspectives, from the initiating activity (e.g., a cognitive error) to a system-wide or society-wide result (e.g., melting the core of a power plant). This is both a theoretical and a practical research activity, requiring the integration and extension of various error classification schemes with larger scale system impacts (such as costs, malfunctions, and rework) and with multiple theoretical and even political frameworks. We see this research activity as going well beyond the typical cost-justification exercise for human factors engineering and as resulting in a systems model of HSI risks.


Recommendation: Conduct research to develop a robust HSI risk taxonomy and a set of methodologies for analyzing and comparing relevant risk representations and conflicting values.


The general nature of the problem is to define the confluence of human and system factors that may align to create operational problems that exceed the design basis of the system or result in operations that were totally unanticipated (Reason, 1990, 1997). Such features have been referred to as “emergent” in discussions of systems of systems, and that is really the principal focus of this research: linking the human, hardware, and software systems with analytic techniques that can better identify extreme situations. Incorporating concepts from the relatively new domain of resilience engineering (Hollnagel, Woods, and Leveson, 2006) will help move this approach forward.


Recommendation: Extend traditional fault tree and risk analysis techniques to better identify the “boundary cases” that may lead to extreme operational consequences.
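As a reminder of the arithmetic being extended, a basic fault tree combines independent basic-event probabilities through OR and AND gates. The sketch below is entirely hypothetical, including all probabilities and the scenario itself; it shows how a human factor such as operator overload can enter the same calculation as hardware faults, which is the kind of linkage the extended techniques would need to handle systematically:

```python
def or_gate(*probs):
    """P(at least one input event occurs), assuming independence."""
    q = 1.0
    for p in probs:
        q *= (1 - p)
    return 1 - q

def and_gate(*probs):
    """P(all input events occur), assuming independence."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical boundary case: an alarm is missed only if the display
# fails AND the operator is overloaded; either a sensor fault or a
# software fault can take down the display.
display_fails = or_gate(0.01, 0.005)         # sensor OR software fault
missed_alarm = and_gate(display_fails, 0.2)  # AND operator overload
print(missed_alarm)
```

The independence assumption baked into these gates is precisely what breaks down in the boundary cases of interest, where human and system failures are correlated; extending the technique means relaxing it.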


The benefit to the HSI field of conducting this research will be to establish a more robust basis for risk analysis and design than exists today. The error taxonomies are a start, but they tend to leave off where theoreticians stop, well before examining the linkages in complex systems during the design process. The overall vision of this research recommendation is that the results will place HSI risk analysis on a more even footing with well-accepted risk methods, such as the probabilistic risk analysis performed in designing complex process plants, and that they will extend traditional fault analysis techniques to identify and address situations that are beyond the typical design basis faults.

Improve the Communication of Risk

The analysis of risk must be done systematically, with great attention to use error or operational risk, business risk and mission risk, and (when appropriate) societal risk. In this report, the theme of risk reduction is mentioned quite often. Techniques such as failure modes and effects criticality analysis and fault tree analysis are recommended to analyze and control risk. These methods have been in use for many years, but they suffer from methodological problems, mostly involving how to make reliable estimates of risk parameters such as fault likelihood and severity of consequences. Another major issue concerns weaknesses in the ways these project and user risks are communicated to decision makers and other stakeholders, as well as the political processes that may be required to reconcile and integrate views of risks across multiple constituencies that may have different perspectives on systems and their implications (e.g., to achieve a satisficed solution).
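One concrete instance of these risk parameters is the risk priority number used in classic failure mode and effects analysis: the product of severity, occurrence, and detection ratings. The sketch below uses illustrative ratings for a hypothetical failure mode, and multiplying ordinal scales in this way is itself one of the methodological weaknesses noted above:

```python
def risk_priority_number(severity, occurrence, detection):
    """Classic FMEA risk priority number: each factor is rated on a
    1-10 ordinal scale (10 = worst) and the product is used to rank
    failure modes.  Treating ordinal ratings as multiplicable numbers
    is a known methodological weakness of the technique."""
    for rating in (severity, occurrence, detection):
        assert 1 <= rating <= 10, "ratings are on a 1-10 scale"
    return severity * occurrence * detection

# Hypothetical failure mode: infusion pump tubing set mis-inserted.
print(risk_priority_number(severity=9, occurrence=3, detection=4))  # -> 108
```

Communicating what a number like 108 means to a decision maker, relative to other failure modes and other stakeholders' views of risk, is the communication problem the recommendations below address.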


Recommendation: Conduct research studies to show the value of improved assessment and of shared representations that quantify risk levels for communicating business and operational risks to management and development-team stakeholders. Specifically:

  1. Survey communication techniques in other domains, such as advertising, sales, and news, and categorize success factors that could apply to business and operational risk communications.

  2. Conduct literature searches and analyses of successful communication techniques used in other domains that might be applicable to risk-management communication with stakeholders.

  3. Conduct experiments comparing different risk communication techniques; for example, use risk estimation calibration exercises to improve risk communication, as measured by changes in operator or decision-maker behavior.

Recommendation: Support applied interdisciplinary investigations into the communication, representation, and negotiation of risks and related issues, with the goal of assisting conflicting parties in mutual understanding and satisficed decision making.

Identify and Assess HSI Contributors to System Adaptability and Resilience

While humans are often viewed as the “weak links” in systems, contributing to errors and risk, a growing body of literature has shown that people in fact play a critical role in system resilience: the ability of systems to operate effectively in the face of unanticipated disturbances. Individuals, teams, and organizations contribute to system resilience by planning for, recognizing, and adapting to perturbations and surprises, especially ones that fall outside the range of situations that the system was designed to handle (e.g., Carthey, deLeval, and Reason, 2001; Weick and Sutcliffe, 2001). Conversely, individuals and management policies can also be detrimental to resilience. This has led to a newly emerging area called resilience engineering, which attempts to advance the study and design of systems that exhibit resilience (Hollnagel, Woods, and Leveson, 2006; Woods and Hollnagel, 2006). More research is needed to understand the role people play in contributing to or inhibiting system resilience and how new tools and technologies can be deployed to strengthen the former role.


Recommendation: Conduct research to understand the factors that contribute to system resilience, the role of people in resilient systems and how to design more resilient systems. Some of the key questions that need to be addressed include the following:

  • What kinds of knowledge and strategies enable people (particularly experts) to catch and recover from error and adapt to unanticipated situations?

  • What methods can be used to analyze, measure, and monitor the resilience of organizations, systems, and systems of systems?

  • What traits and metrics enable systems to be developed and evaluated according to their adaptability and resilience?

  • What methods can be used to model and predict the short- and long-term effects of change on adaptability and resilience?

×
Page 306
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 307
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 308
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 309
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 310
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 311
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 312
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 313
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 314
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 315
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 316
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 317
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 318
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 319
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 320
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 321
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 322
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 323
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 324
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 325
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 326
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 327
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 328
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 329
Suggested Citation:"10 Conclusions and Recommendations." National Research Council. 2007. Human-System Integration in the System Development Process: A New Look. Washington, DC: The National Academies Press. doi: 10.17226/11893.
×
Page 330
Next: References »
Human-System Integration in the System Development Process: A New Look Get This Book
×
Buy Hardback | $80.00 Buy Ebook | $64.99
MyNAP members save 10% online.
Login or Register to save!
Download Free PDF

In April 1991 BusinessWeek ran a cover story entitled, “I Can't Work This ?#!!@ Thing,” about the difficulties many people have with consumer products, such as cell phones and VCRs. More than 15 years later, the situation is much the same--but at a very different level of scale. The disconnect between people and technology has had society-wide consequences in the large-scale system accidents from major human error, such as those at Three Mile Island and in Chernobyl.

To prevent both the individually annoying and nationally significant consequences, human capabilities and needs must be considered early and throughout system design and development. One challenge for such consideration has been providing the background and data needed for the seamless integration of humans into the design process from various perspectives: human factors engineering, manpower, personnel, training, safety and health, and, in the military, habitability and survivability. This collection of development activities has come to be called human-system integration (HSI). Human-System Integration in the System Development Process reviews in detail more than 20 categories of HSI methods to provide invaluable guidance and information for system designers and developers.

  1. ×

    Welcome to OpenBook!

    You're looking at OpenBook, NAP.edu's online reading room since 1999. Based on feedback from you, our users, we've made some improvements that make it easier than ever to read thousands of publications on our website.

    Do you want to take a quick tour of the OpenBook's features?

    No Thanks Take a Tour »
  2. ×

    Show this book's table of contents, where you can jump to any chapter by name.

    « Back Next »
  3. ×

    ...or use these buttons to go back to the previous chapter or skip to the next one.

    « Back Next »
  4. ×

    Jump up to the previous page or down to the next one. Also, you can type in a page number and press Enter to go directly to that page in the book.

    « Back Next »
  5. ×

    Switch between the Original Pages, where you can read the report as it appeared in print, and Text Pages for the web version, where you can highlight and search the text.

    « Back Next »
  6. ×

    To search the entire text of this book, type in your search term here and press Enter.

    « Back Next »
  7. ×

    Share a link to this book page on your preferred social network or via email.

    « Back Next »
  8. ×

    View our suggested citation for this chapter.

    « Back Next »
  9. ×

    Ready to take your reading offline? Click here to buy this book in print or download it as a free PDF, if available.

    « Back Next »
Stay Connected!