Human-System Integration in the System Development Process: A New Look

3 Human-System Integration and the System Development Process

An important theme of this report is the integration of human-system methods within the system development process, so that multiple human-system integration (HSI) concerns can be addressed effectively with the least resource expenditure. This reflects the position of Miller (1953) in his initial description of the task analysis method as a procedure that can serve design and training needs analysis equally well. The committee findings indicate that a core set of human factors method classes can serve as integrating links across the diverse HSI concerns of human factors, manpower, personnel, training, system safety, health hazards, and survivability. Furthermore, shared representations of the outputs of these methods can be developed that will effectively communicate findings and conclusions among HSI domains, as well as with hardware and software developers and other stakeholders. Three general classes of human factors methods provide a robust representation of the multiple HSI concerns and are applicable at varying levels of effort throughout the development life cycle. These broad classes include methods to:

Define opportunities and context of use: Methods for analyses that contribute to early definitions of opportunities and requirements and that attempt to characterize the context of use, including characteristics of users, their tasks, and the broader physical and organizational environment in which they operate, so as to build systems that will effectively meet users' needs, support their work, and function smoothly within the broader physical and organizational context.
Define requirements and design solutions: Methods to identify requirements and design alternatives to meet the requirements revealed by prior front-end analyses.

Evaluate: Methods to evaluate the adequacy of proposed design solutions and propel further design innovation. These methods generate objective data concerning critical human-system issues, leading to incremental growth of system definition and stakeholder commitment.

In Chapter 2 we showed how system development activities of the incremental commitment model (ICM) were distributed across system life-cycle phases (see Figure 2-3). Here, Figure 3-1 illustrates broadly how these major classes of HSI activity relate to those phases. For example, activities related to understanding the context of use are likely to be concentrated early in system development, when characteristics of the users, their work, and the environmental context are first being understood. However, because the context of use is constantly evolving, and because the introduction of new technology is likely to produce operational and organizational changes, not all of which will have been anticipated, it is important to continue to devote some (albeit lower) level of effort to examining the context of use and how it evolves throughout system development and deployment, both to guide midcourse design corrections and to lay the groundwork for next-generation system development.

FIGURE 3-1 Activity level of HSI methods across system life-cycle phases.
HUMAN-SYSTEM INTEGRATION IN THE INCREMENTAL COMMITMENT MODEL

In order to place human-system integration and its associated methods in the risk management context that is central to the incremental commitment model, it is important to distinguish several types of risk. End-state operational system risks include low usability, high rates of human error, low productivity, and safety problems. These types of risks tend to become manifest during the development process as a failure to properly manage HSI risks. They include such problems as specifying the user interface too early in design (or, alternatively, not considering it at all), poorly understood work domain constraints, insufficient stakeholder engagement, and lack of personnel-organizational system interoperability in systems of systems. Often these types of risks are simply accepted or minimized because they pose a threat to maintaining program cost and schedule (program management risks). Properly balancing these various categories of risk can be accommodated in the incremental commitment model, as it is a risk-driven process that aims to identify and properly manage each of them. By engaging appropriate HSI methods during the incremental development process, risks can be reduced throughout the engineering life cycle, increasing the likelihood of a system's meeting user requirements and satisficing stakeholders. Appendix Table 3-A1 lists best practices for human-system integration taken from ISO/PAS 18152 (International Organization for Standardization, 2003), organized by activity category. Each of these practices is valuable for successful human-centered design. Examples of methods to use in implementing these practices are also shown in the table. A risk assessment can be used to decide how much effort is needed to implement each practice in the context of a particular project. For example:
Are the objectives that the user or user organization wants to achieve through use of the system already known, or is some field investigation necessary? How important is it to establish measurable usability criteria for the system in its intended context of use? What are the risks if end users are not involved in each evaluation?

Figure 3-2 illustrates the links between the desired system end state (stakeholder satisficing), system phases, and HSI activities. The figure conveys the multiple determinants of the ultimate system design goal, stakeholder satisficing. The system development principles identified in Chapter 2 are shown as inputs to the system engineering processes or phases. Each of the phases is conducted iteratively, as described in the incremental commitment model, and requires inputs from multiple HSI methods. This is illustrated by the network of links between HSI activities and the systems engineering phases. As with the activity-level diagram in Figure 3-1, our main point is that HSI activities are concurrent, iterative processes carried out as needed to reduce development risks at various incremental stages of system design.

FIGURE 3-2 Linkage of system engineering principles to HSI activities that reduce risks.

The role of human-system integration in the management of engineering development risk is a relatively new concept. Human factors engineering methods are traditionally conceived as design-aiding techniques, to be used when it is time to design or test very specific elements of the human-system interface. However, this conception is too narrow for complex systems that increasingly involve multiple teams of distributed operational personnel. Instead, human factors methods can be more broadly conceived both as design-aiding techniques and as methods for progressive risk reduction during the life cycle. In this sense, human factors methods contribute to the development process in much the same way that, for example, prototyping or simulation is employed by systems engineers, and they can be used during the early to middle stages of development to evaluate alternatives and to narrow design choices based on various constraints. Such risk-reduction approaches allow developers to select one design approach over another, to understand unanticipated effects through simulation, and generally to gain confidence that system development efforts are on track to meet requirements and avoid the operational-stage risks of disuse, error, and high life-cycle costs.
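The risk-driven tailoring described here can be sketched concretely. The following is a hypothetical illustration only, not a procedure from this report or from ISO/PAS 18152: it assumes a simple likelihood-by-severity scoring on 1-5 scales and an arbitrary three-tier mapping from risk score to level of HSI method application.

```python
# Hypothetical sketch of risk-driven tailoring of HSI effort.
# The scoring scheme (likelihood x severity on 1-5 scales) and the
# effort tiers are illustrative assumptions, not part of ISO/PAS 18152.

def hsi_effort_level(likelihood: int, severity: int) -> str:
    """Map a rough risk score to a level of HSI method application."""
    score = likelihood * severity          # ranges from 1 to 25
    if score >= 15:
        return "full application (e.g., cognitive task analysis, usability testing)"
    if score >= 6:
        return "targeted application (e.g., quick-look studies, heuristic review)"
    return "minimal or no application"

# Invented example issues with assumed (likelihood, severity) ratings.
issues = {
    "premature user-interface commitment": (4, 4),
    "poorly understood work domain constraints": (3, 5),
    "COTS user interface accepted by program": (2, 2),
}

for issue, (lik, sev) in issues.items():
    print(f"{issue}: {hsi_effort_level(lik, sev)}")
```

The point of such a sketch is only that the extent of method application is a function of assessed risk, not a fixed property of the method; the thresholds themselves would be a program-level judgment.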
Extensive experience with major system development efforts by committee members and colleagues in the profession suggests that human factors expertise has often been underutilized during system development because of a perception by program managers that the cost and schedule risks associated with human-system integration exceed the benefit to be delivered. Part of this perception is associated with the standard waterfall model of design, in which specific milestones are fixed in time, and HSI analytic methods tend to be time-intensive if performed in a linear fashion. Program managers often perceive human factors professionals as overly focused on comprehensive application of methods, whereas the art of engineering is to accommodate the realities of schedule and cost, conducting studies and analyses only as necessary to manage risks. It is thus important for HSI practitioners to adopt an incremental and iterative approach to analysis and design, recognizing that if there are no HSI risks associated with a particular aspect of a project, then it is unnecessary to apply various methods. The HSI profession has seen a trend in this direction with the development of
such approaches as quick look reports,1 in which the precision of laboratory experiment or exhaustive observation is traded off against the expedience of providing the most critical design inputs through rapid prototyping, contextual inquiry, and various forms of participatory design. By incorporating human-system integration as an integral thread within the incremental commitment model, program risks and HSI risks can be balanced against each other. This is especially true if HSI professionals are incorporated as members of an integrated product team structure that is involved continuously throughout the design cycle. Early iterations of work domain analysis, for example, may be conducted at a fairly high level to ensure that all appropriate stakeholders are identified and represented. As designs become more elaborated, participatory techniques can be applied to the point of reaching stakeholder consensus for purposes of a specific increment. This linkage of HSI activities to incremental development permits the design and risk management process to serve as a way to select the most appropriate HSI methods (and their extent of application) for a particular phase of the engineering cycle. It is not necessary to apply HSI techniques in a monolithic fashion, and in some cases it may not be necessary at all because there is no risk associated with a particular HSI issue; alternatively, program managers may accept certain identified risks (e.g., commercial off-the-shelf user interfaces) in order to avoid having to reject the use of certain technologies. An important aspect of the incremental commitment model and the spiral representation for HSI professionals is the notion that methods are not only progressive but also iterative, and that risk analysis determines the frequency and extent of their application.
This has important implications for sizing the HSI effort, since the resources devoted to HSI activities should reflect the requirements for their application. This is currently a difficult task to accomplish, since there are no well-established methods for estimating the resource requirements for human-system integration. Various systems engineering approaches to level-of-effort sizing include activity-based costing, comparison with previous projects of similar scope, applying a unit-cost basis (as when a request for proposal specifies how many human factors engineers should work on a system), parametric models that link effort to project complexity, expert consensus, and risk trade-off analysis. None of these approaches has been systematically examined for sizing HSI efforts, and this area represents a knowledge gap that could be addressed through research.

1 Quick look report is a term used primarily by the National Aeronautics and Space Administration and the Department of Transportation to describe the results of a rapid field observation or appraisal of a prototype system in a test situation. These appraisals are less detailed than formal operational test and evaluation procedures.
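To illustrate one of the sizing approaches mentioned above, here is a toy parametric model in the spirit of COCOMO-style estimation. Every coefficient, multiplier, and cost-driver name below is an invented placeholder; as the text notes, no validated parameters for sizing HSI work currently exist.

```python
# Toy parametric model linking HSI effort to project complexity.
# All calibration constants and cost-driver values are invented
# placeholders for illustration; no validated HSI sizing model exists.

def hsi_effort_person_months(size_kloc: float, drivers: dict) -> float:
    """Nominal effort scaled by multiplicative cost drivers (COCOMO-style)."""
    a, b = 0.5, 1.05                       # assumed calibration constants
    nominal = a * size_kloc ** b           # effort grows superlinearly with size
    multiplier = 1.0
    for value in drivers.values():         # each driver inflates or deflates effort
        multiplier *= value
    return nominal * multiplier

effort = hsi_effort_person_months(
    100,                                   # assumed system size, KLOC
    {
        "user-population diversity": 1.3,  # >1.0 inflates effort
        "safety criticality": 1.5,
        "precedentedness": 0.9,            # <1.0 deflates effort
    },
)
print(f"Estimated HSI effort: {effort:.1f} person-months")
```

The research gap identified in the text is precisely that constants such as `a`, `b`, and the driver values have never been empirically calibrated for HSI work.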
COMMUNICATING HSI ISSUES AND OPPORTUNITIES THROUGH SHARED REPRESENTATIONS

A critical concern of HSI professionals, and a major theme of the committee's work, is the need not only to communicate effectively within a specific HSI domain but also to share findings across all systems engineering domains. This can be a daunting challenge for groups tasked with complex systems design. In addition, there is a clear need to share design and process artifacts at all phases of the systems development process, especially with the software and hardware developers who are actually implementing the system, who not only rely on clear specifications for development but are also expected to contribute to the generation of those specifications at critical decision points in the process. As a consequence, the committee has pursued the concept of shared representations as a means of addressing this concern. Shared representations—particularly diagrammatic models and other more visual, holistic representations—can serve as the fundamental medium for interactions among individuals, teams, and the organization.

Imagine a scenario from the commercial software development world. A team meets to kick off a new project. Groups of people from business, technology, and human systems discuss the goals of the project, the timeline, and other variables. Notes from the meeting are distributed within a few hours. Each group works on its tasks for several weeks. At the next meeting, team members discover that each discipline has taken its interpretation of the goals in directions different from those of the other groups, and now weeks have gone by with little truly collaborative progress to show for their efforts. Everyone attended the meeting and everyone read the notes, so what happened? The verbal description of goals and the written documentation of the events were not enough to provide common ground for the team.
They lacked a shared representation of the event, their views, and what needed to be done by each of the groups. A shared representation is an artifact or experience that mediates the interaction between and among people coming from multiple perspectives (different organizational roles, distinct technical or business backgrounds, etc.). It can be useful for an individual, a team, or an organization (Curtis, Krasner, and Iscoe, 1988). A representation can provide support or scaffolding for effective collaboration among people in transdisciplinary design teams, and at the same time be used at the organizational level to “communicate up” to forge understandings between and among the various project stakeholders. A shared representation is most powerful when used not only to facilitate activities, but also to make people’s assumptions and individual mind sets explicit. Shared representations act as a means for synchronization, clarification, and grounding in the socially constructed process of design (D’Astous
et al., 2004; Olson et al., 1992). In the process, solutions are negotiated, the representation acts as mediator, and subsequent modifications made to the representation make the results of those negotiations explicit.

Why Shared Representations Are Useful

Models, diagrams, and other, more visual shared representations are effective for people as they participate in design activities. Don Norman writes: "Without external aids, memory, thought, and reasoning are all constrained. But human intelligence is highly flexible and adaptive, superb at inventing procedures and objects that overcome its own limits." He goes on to suggest that one can enhance cognitive ability by producing representations or artifacts to help one think (Norman, 1993; see also Hutchins, 1995; Nardi, 1996; Pasztory, 2005). When people externalize their thinking via representations (e.g., get their ideas out on paper or on screen), they produce a representation of their thinking that not only can be examined critically but also can be used to reduce their working memory load (Nardi, 1996; Suwa and Tversky, 2002). In addition, by taking information beyond words into a new form or medium, relationships among meaningful elements of the design must either be made explicit or be allowed to emerge through the simple act of creating an explicit shared representation of the component elements in a well-framed space (e.g., a two-dimensional sketch, a three-dimensional volumetric representation, or some higher dimensional parametric space characterizing the design elements). Seeing these placements can not only lead the readers or the originators to recognize previously unacknowledged connections or relations but also produce new connections and ideas. Suwa and Tversky (2002) call this activity the detection of unintended relations and features.
For example, in the applied cognitive work analysis (ACWA) method, the functional abstraction hierarchy is designed to highlight critical domain relationships that define the problem space confronting domain practitioners. Each subsequent artifact in the process builds on the original model and, through negotiation, points to a model of what the system should be in such a way that it can finally be prototyped. Shared representations act as mediators in the collaborative and iterative construction of knowledge in the design process. When multiple people build, share, comment on, and change a shared information base, they are collaboratively constructing new knowledge (Bucciarelli, 1988; Suthers, 2005). The different types of artifacts, representations, or models that participants produce as they engage in the design and development process should, as Norman suggests, "help them think" and make their assumptions explicit. In a sense, shared representations work in the same way that blueprints work for architects in moving from what might be to what is built. Many different views are produced. Everyone involved in the process—from stakeholders to the various disciplines involved—is able to use the abstraction reflected in the blueprints to make meaningful decisions.

For all these reasons, producing shared representations in transdisciplinary teams can be critical to creating innovative solutions, because they help teams collectively see and communicate about novel connections, spawn new ideas, and facilitate a more effective design process (Détienne, 2006; Evenson, 2005). There is a dynamic between the representation's role in facilitating externalization (making explicit the group's assumptions and beliefs) and its role in acting as an environment for conversation, facilitating subsequent negotiation and elaboration. The task of creating the representation initiates the process of making explicit the underlying goals, assumptions, and viewpoints of the different design team members; the process of updating the representation following design team negotiations captures the results of negotiating design meaning out of the nonverbal, semiverbal, and verbal conversations engaged in by the various team members while discussing the current model or representation (Suthers, 2005).

Attributes of Good Shared Representations

Nearly every activity in the system development life cycle results in some form of tangible design artifact, but it may not necessarily be a good shared representation. To be useful, a shared representation should:

- establish a shared language that is appropriately aligned with the development or communication problem to be solved;
- provide a strategically chosen extent of ambiguity versus definition;
- facilitate the desired social process (e.g., critique and redesign versus accept/reject decisions);
- make differences and relationships apparent;
- facilitate group "thinking with" (Norman, 1993) to transform knowledge and create new understandings (Carlile, 2002); and
- provide a meaningful structure, content, and appearance to both the creators and the consumers of the shared representation.

Of the six attributes listed above, the two most important are the shared language and the facilitation of a social process. To establish the language, a shared representation should be easily read by all of its users (creators and recipients)—that is, the structure and content should be easily perceived and comprehended, reflect the structure and content of the ideas or mental representation, and create a sort of resonance. The participants in the construction process should agree that
the thing produced adequately represents what they want it to (Tversky, Morrison, and Betrancourt, 2002). To facilitate the social process, the representation must stand in for and mediate communication among the people engaged in the collaborative process (Boland and Collopy, 2004). In other words, the representation must be suitable for facilitating negotiation among the participants.

Shared Representations in the Design Process

What is useful as a shared representation can change over time in the systems development process. In practice, the process of constructing the shared representation may be more important for team building than the artifact itself. Early on, mapping out the territory the system is expected to address can draw out the preconceptions and knowledge held by the various team members, helping to bound research activities. A territory map is an example of a shared representation that captures a gestalt or overview of the system. It is suggestive of everything the system is and—by virtue of what is left out—everything it is not. A good territory map accounts for all the stakeholder interests in the system; a great territory map provides a picture of the system that is comprehensive, cohesive, and visionary. Completed early enough in the process, a territory map can even serve to mediate the communication of the participants in the acquisition process. Teams that agree on what is in the territory have established common ground that can be carried forward throughout the design and development process. Documentation that focuses on activities identified from a territory map often becomes a successful shared representation. Such documentation is produced in the midst of extensive task, process, or environmental research and provides a way to discuss what currently happens and what should or could happen.
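A territory map need not be elaborate to serve as common ground. As a purely hypothetical sketch, one could even capture it as a small, reviewable data structure, so that what is in and out of scope is explicit and easy to diff between team meetings. The field names and example entries below are invented for illustration; nothing in the chapter prescribes this form.

```python
# Hypothetical sketch: a territory map captured as a simple, reviewable
# data structure, making explicit what the system is and is not.
# Field names and example entries are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class TerritoryMap:
    stakeholders: list = field(default_factory=list)
    in_scope: list = field(default_factory=list)
    out_of_scope: list = field(default_factory=list)

    def covers(self, activity: str) -> bool:
        """Membership check: is this activity part of the agreed territory?"""
        return activity in self.in_scope

tmap = TerritoryMap(
    stakeholders=["operators", "maintainers", "training command"],
    in_scope=["mission planning", "fault diagnosis"],
    out_of_scope=["logistics scheduling"],
)
print(tmap.covers("fault diagnosis"))
```

The value of such an artifact lies less in the code than in the negotiation it forces: every entry added to `in_scope` or `out_of_scope` records an explicit team agreement.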
A standard recording language (UML, activity diagrams, business process modeling notation, etc.) facilitates discussion and contributes to the production of the shared representation. A less common but often effective shared representation early in the process can be developed with a focus on the target users of the system. For example, the findings from user-generated field journals (incorporating a standardized and embedded framework for users to record their observations) help to extend the language of the team and build a model of the system attributes important to the target user group. Sometimes shared representations can function as a vehicle for the clarification of ideas (e.g., Suthers, 2005) or as an opportunity for groups to combine their different perspectives and knowledge into new insights (Muller, 2003; Muller et al., 1994). This is often the case when a low-fidelity prototype (such as a blank shape that is used symbolically "in place of" a real device or prototype) is used as a candidate for eliciting different
stakeholders' ideas about what would be done with the product, system, or service, as well as for eliciting and exploring different stakeholder concepts and assumptions. This was successfully demonstrated in the UTOPIA project, in which the implications of a new print shop technology for working relationships were explored by placing low-fidelity mock-ups of the new technology in the existing print shop and by acting out the new work practices around those prototypes (Ehn and Kyng, 1991). Erickson reviewed the importance of "roughness" in a shared representation, noting that less formal and less finished representations were more likely to elicit useful comment, critique, and improvement, whereas more formalized or polished representations were more likely to lead to simple accept/reject decisions (Erickson, 1996). People feel more open to participating in and refining ideas in sketches than they do in finished prototypes. Personas are also often an excellent category of shared representation because they are composite user archetypes based on behavioral data gathered from many actual people during discovery research (see Chapter 7). Personas are useful because they build on people's ability to anticipate another person's behavior from what is known about that person. Developing solid profiles or personas contributes to serving individual user needs, aids in integration with customer processes, and leads to a design that the various constituents can participate in co-evolving. As a shared representation, personas and profiles are a tool for making user needs explicit, differentiating between and among different stakeholders, and prioritizing different and sometimes competing goals (Cooper, 2004). Even physical spaces can become successful shared representations of the system or systems to be designed.
For example, in a design project intended to reconceptualize 35mm point-and-shoot cameras, a physical design space was built initially to contain the results of qualitative research conducted to understand the existing paradigm (Rheinfrank and Welker, 1994). The space contained images and relevant artifacts that characterized different aspects of the use and users of cameras. Initially, each wall of the space individually represented a particular aspect of the experience and was used mainly as a repository for information about that dimension. Over time, however, the space came to be seen as a whole and became a shared representation in the collaborative process of solving a multidimensional camera design problem (Star, 1989). Specifically, the physical space evolved into a shared representation offering the internal and external design teams multiple views of possible 35mm camera futures, in which the view depended on one's position on the floor in juxtaposition to the walls of the room (Rheinfrank and Welker, 1994). In later cycles of development, scoping maps (1) illustrate the features, functionality, and content of the designed system, (2) illustrate anticipated user experiences, and (3) enable team members to prioritize a
plan for staged release. These types of shared representations allow the stakeholders to make decisions about what can and should be produced from the potential things that could be built. Eventually, a high-fidelity representation, such as a functional prototype, is a good candidate for validating what the team collectively knows about the system and for communicating a clear idea of the system from one design group to another, as well as "up" the organization. As the design and development process unfolds, when groups shift from one type of shared representation to another or change the way they are using a shared representation, the shift signals a qualitative change in the know-how needed to continue making progress in the design and development process (Cook and Brown, 1999; Gasson, 2005).

CONCLUSION

Effective use of shared representations depends on understanding the team's or organization's current issues and needs in communication and on strategically choosing the right kind of shared representation to mediate at the right time. When used appropriately, shared representations enable the design team to coalesce around a shared view while providing a capacity for increasing the conceptual complexity that can be attended to—activities that are crucial in the design of complex systems. Shared representations can provide a bridge among analysis, design, implementation, and training in complex systems design, development, and fielding. The act of producing the representation can help teams detect unanticipated relations and features that can be exploited to lead to new connections and ideas. Shared representations can be anything from a simple sketch, to a "Wizard of Oz" prototype, to a fully active simulation of system design and behavior.
Although conventional project planning schedules or spreadsheets can support the design and development process, they can never take the place of consciously planning, producing, and seeding discussion around shared representations to improve the quality of collaboration and productive outcomes of transdisciplinary design teams (Carroll, 2002). Shared representations provide a means for teams to transcend conventional project management paradigms and to coalesce around their ideas to produce work that is a reflection of their shared understanding of the mission to be supported, the user needs, and the best that technology can deliver.
APPENDIX 3-A

TABLE 3-A1 Best Practices for Risk Mitigation

Activity category: Envisioning opportunities
Best practices for risk mitigation from ISO/PAS 18152:
- Identify expected context of use of systems [forthcoming needs, trends, and expectations].
- Analyze the system concept [to clarify objectives, their viability, and risks].
Example HSI methods and techniques: Field observations and ethnography; Participatory analysis; System scoping

Further practices:
- Describe the objectives that the user or user organization wants to achieve through use of the system.
- Define the scope of the context of use for the system.
Example HSI methods and techniques: Organizational and environmental context analysis; Field observations and ethnography; Participatory analysis; Work context analysis

Activity category: Understanding needs (context of use, tasks, usability needs, design options)
Best practices for risk mitigation from ISO/PAS 18152:
- Identify and analyze the roles of each group of stakeholders likely to be affected by the system.
- Describe the characteristics of the users.
- Describe the cultural environment/organizational/management regime.
- Describe the characteristics of any equipment external to the system and the working environment.
- Describe the location, workplace equipment, and ambient conditions.
- Decide the goals, behaviors, and tasks of the organization that influence human resources.
- Present context and human resources options and constraints to the project stakeholders.
- Analyze the tasks and worksystem.
- Perform research into required system usability.
- Generate design options for each aspect of the system related to its use and its effect on stakeholders.
- Produce user-centered solutions for each design option.
Example HSI methods and techniques: Organizational and environmental context analysis; Field observations and ethnography; Task analysis; Cognitive task analysis; Participatory analysis; Contextual inquiry; Event data analysis; Prototyping; Models and simulations; Usability evaluation methods; Success-critical stakeholder identification; Context of use analysis; Work context analysis; Investigate required system usability; Usability benchmarking
Goals/objectives and requirements (Context requirements; Infrastructure requirements; User requirements)
Best practices for risk mitigation from ISO/PAS 18152: Analyze the implications of the context of use. Present context of use issues to project stakeholders for use in the development or operation of the system. Identify, specify, and produce the infrastructure for the system. Build required competencies into training and awareness programs. Define the global numbers, skills, and supporting equipment needed to achieve those tasks. Set and agree on the expected behavior and performance of the system with respect to the user. Develop an explicit statement of the user requirements for the system. Analyze the user requirements. Generate and agree on measurable criteria for the system in its intended context of use.
Example HSI methods and techniques: Usability requirements methods; Scenarios; Personas; Define the intended context of use, including boundaries; Identify staffing requirements and any training or support needed to ensure that users achieve acceptable performance; Storyboards; Establish performance and satisfaction goals for specific scenarios of use; Define detailed user interface requirements; Prioritize requirements (e.g., QFD).

Architecting solutions (System architecting)
Best practices for risk mitigation from ISO/PAS 18152: Generate design options for each aspect of the system related to its use and its effect on stakeholders. Produce user-centered solutions for each design option. Design for customization. Develop simulation or trial implementation of key aspects of the system for the purposes of testing with users. Distribute functions between the human, machine, and organizational elements of the system best able to fulfill each function. Develop a practical model of the user's work from the requirements, context of use, allocation of function, and design constraints for the system. Produce designs for the user-related elements of the system that take account of the user requirements, context of use, and HF data. Produce a description of how the system will be used.
Example HSI methods and techniques: Task analysis; Work domain analysis; Participatory design; Prototyping; Models and simulations; Function allocation; Generate design options.
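The goals/requirements row above lists "Prioritize requirements (e.g., QFD)" among its methods. A minimal sketch of QFD-style weighted prioritization follows; the user needs, requirement names, weights, and relationship scores are all invented for illustration, and a real quality function deployment matrix carries far more detail (correlations, benchmarks, targets) than this.

```python
# Simplified QFD-style prioritization (illustrative only): user needs are
# weighted, each candidate requirement is scored for how strongly it
# addresses each need on the conventional 0/1/3/9 QFD scale, and the
# weighted column sums rank the requirements.

needs = {"fast task completion": 5, "low error rate": 4, "easy to learn": 3}

# relationship[need][requirement] on the 0/1/3/9 scale (invented values)
relationship = {
    "fast task completion": {"keyboard shortcuts": 9, "undo support": 1, "guided tutorial": 0},
    "low error rate":       {"keyboard shortcuts": 1, "undo support": 9, "guided tutorial": 3},
    "easy to learn":        {"keyboard shortcuts": 0, "undo support": 1, "guided tutorial": 9},
}

def prioritize(needs, relationship):
    """Rank requirements by the weighted sum of their relationship scores."""
    scores = {}
    for need, weight in needs.items():
        for req, strength in relationship[need].items():
            scores[req] = scores.get(req, 0) + weight * strength
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = prioritize(needs, relationship)
# ranking: keyboard shortcuts (49), undo support (44), guided tutorial (39)
```

The ranking feeds the "generate and agree on measurable criteria" practice: the highest-scoring requirements are the ones worth pinning down with measurable acceptance criteria first.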
Human elements
Best practices for risk mitigation from ISO/PAS 18152: Decide the goals, behaviors, and tasks of the organization [that influence human resources]. Define the global numbers, skills, and supporting equipment needed to achieve those tasks. Identify current tasking/duty. Analyze the gap between existing and future provision. Identify skill requirements for each role. Predict staff wastage between present and future. Calculate the available staffing, taking account of working hours, attainable effort, and nonavailability factor. Identify and allocate the functions to be performed. Perform functional decomposition and allocation of function. Specify and produce job designs and the competence/skills required to be delivered. Calculate the required number of personnel. Generate costed options for delivery of training and/or redeployment. Evolve options and constraints into an optimal [training] implementation plan (4.3.5). Define how users will be re-allocated, dismissed, or transferred to other duties. Compare to define the gap and communicate the requirement to design of staffing solutions.
Example HSI methods and techniques: Task analysis; Usability requirements methods; Work domain analysis; Workload assessment; Participatory design; Contextual design; Situation awareness; Methods for mitigating fatigue; Human performance model; Design for alertness; Plan staffing.

Hardware elements
Best practices for risk mitigation from ISO/PAS 18152: See (a) System architecting.
Example HSI methods and techniques: Participatory design; Physical ergonomics; Prototyping; Usability evaluation methods.

Software elements
Best practices for risk mitigation from ISO/PAS 18152: See (a) System architecting.
Example HSI methods and techniques: Participatory design; Prototyping; Usability evaluation methods; User interface guidelines and standards.
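The human-elements row asks practitioners to "calculate the available staffing, taking account of working hours, attainable effort and nonavailability factor" and to compare it against the required number of personnel. The standard does not give a formula; the sketch below is one plausible reading, with the 40-hour FTE baseline and all numbers assumed.

```python
# Hypothetical sketch of the staffing calculation named in the table.
# The formula and the 40-hour full-time-equivalent baseline are
# assumptions, not prescriptions of ISO/PAS 18152.

def available_staffing(headcount: int,
                       weekly_hours: float,
                       attainable_effort: float,
                       nonavailability: float) -> float:
    """Effective staffing actually available, in full-time equivalents.

    attainable_effort: fraction of paid hours spent on productive work.
    nonavailability: fraction of staff time lost to leave, sickness, training.
    """
    effective_hours = (headcount * weekly_hours
                       * attainable_effort * (1.0 - nonavailability))
    return effective_hours / 40.0  # express as 40-hour FTEs (assumed)

def staffing_gap(required_fte: float, available_fte: float) -> float:
    """Positive result = shortfall to communicate to staffing solutions."""
    return required_fte - available_fte

# 20 people, 40 h/week, 80% attainable effort, 15% nonavailability
avail = available_staffing(20, 40.0, 0.80, 0.15)  # ~13.6 FTE
gap = staffing_gap(16.0, avail)                   # ~2.4 FTE shortfall
```

The gap figure is exactly the "validated statement of staffing shortfall by number and range of competence" that the monitoring-and-control row asks to be promulgated, minus the competence dimension, which this sketch omits.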
Life-cycle planning (Planning; Risks; User involvement; Acquisition; Human resources)
Best practices for risk mitigation from ISO/PAS 18152: Develop a plan to achieve and maintain usability throughout the life of the system. Identify the specialist skills required and plan how to provide them. Plan and manage use of HF data to mitigate risks related to HS issues. Evaluate the current severity of emerging threats to system usability and other HS risks and the effectiveness of mitigation measures. Take effective mitigation to address risks to system usability. Identify the HS issues and aspects of the system that require user input. Define a strategy and plan for user involvement. Select and use the most effective method to elicit user input. Customize tools and methods as necessary for particular projects/stages. Seek and exploit expert guidance and advice on HS issues. Take account of stakeholder and user issues in acquisition activities. Implement the HR strategy that gives the organization a mechanism for implementing and recording lessons learned. Enable and encourage people and teams to work together to deliver the organization's objectives. Create capability to meet system requirements in the future (conduct succession planning). Develop and trial the training solution with representative users. Deliver final training solutions to designated staff according to the agreed timetable. Provide means for user feedback [on human issues].
Example HSI methods and techniques: Usability requirements methods (common industry format); Risk analysis; Plan to achieve and maintain usability; Plan use of HSI data to mitigate risks; Identify HSI issues and aspects of the system requiring user input; Develop a plan for user involvement; Select and use the most effective methods; Customize tools and methods as necessary.
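Several rows, including life-cycle planning above, call for evaluating "the current severity of emerging threats to system usability and other HS risks and the effectiveness of mitigation measures." A common way to operationalize this is a likelihood-by-impact risk register; the sketch below assumes 1-5 scales and severity bands that the standard itself does not prescribe, and the example risks are invented.

```python
# Hypothetical likelihood x impact register for human-system (HS) risks.
# The 1-5 scales and the high/medium/low severity bands are assumptions.

from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (negligible) .. 5 (severe)

    @property
    def severity(self) -> int:
        return self.likelihood * self.impact

    @property
    def band(self) -> str:
        if self.severity >= 15:
            return "high"    # escalate; active mitigation required
        if self.severity >= 8:
            return "medium"  # monitor; plan mitigation
        return "low"         # accept; review periodically

register = [
    Risk("users bypass confirmation dialogs", 4, 4),
    Risk("training material outdated at release", 3, 2),
]

# Re-scoring a risk after a mitigation step shows whether it was effective.
worst = max(register, key=lambda r: r.severity)
```

Repeating the scoring at each review gives the trend data needed to judge "the effectiveness of mitigation measures" rather than a one-off snapshot.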
Evaluation (Risks; Plan and execute; Validation; HSI knowledge; Staffing)
Best practices for risk mitigation from ISO/PAS 18152: Assess the health and well-being risks to the users of the system. Assess the risks to the community and environment arising from human error in the use of the system. Evaluate the current severity of emerging threats to system usability and other HS risks and the effectiveness of mitigation measures. Assess the risks of not involving end-users in each evaluation. Collect user input on the usability of the developing system. Revise design and safety features using feedback from evaluations. Plan the evaluation. Identify and analyze the conditions under which a system is to be tested or otherwise evaluated. Check that the system is fit for evaluation. Carry out and analyze the evaluation according to the evaluation plan. Understand and act on the results of the evaluation. Test that the system meets the requirements of the users, the tasks, and the environment, as defined in its specification. Assess the extent to which usability criteria and other HS requirements are likely to be met by the proposed design. Review the system for adherence to applicable human science knowledge, style guides, standards, guidelines, regulations, and legislation. Decide how many people are needed to fulfill the strategy and what ranges of competence they need. Develop and trial the training solution with representative users. Conduct assessments of usability [relating to HR]. Interpret the findings. Validate the data. Check that the data are being used.
Example HSI methods and techniques: Usability requirements methods (common industry format); Prototyping; Models and simulation; Risk analysis; Usability evaluation methods; Obtain user feedback on usability; Compare with requirements; Performance measurement; HR.
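The evaluation row pairs "carry out and analyze the evaluation" with the method "Compare with requirements": measured usability metrics are checked against the measurable criteria agreed earlier in the life cycle. A minimal sketch follows; the metric names, directions, and target values are invented for illustration.

```python
# Compare measured usability metrics against agreed criteria. Each
# criterion carries a target and a direction: "min" means the measured
# value must reach the target, "max" means it must not exceed it.
# All names and numbers here are illustrative assumptions.

criteria = {
    "task_completion_rate": (0.90, "min"),
    "mean_task_time_s":     (120.0, "max"),
    "errors_per_task":      (0.5, "max"),
}

measured = {
    "task_completion_rate": 0.93,
    "mean_task_time_s":     135.0,
    "errors_per_task":      0.4,
}

def compare_with_requirements(criteria, measured):
    """Return {metric: passed} so failures can drive design revisions."""
    results = {}
    for metric, (target, direction) in criteria.items():
        value = measured[metric]
        results[metric] = value >= target if direction == "min" else value <= target
    return results

results = compare_with_requirements(criteria, measured)
failed = [m for m, ok in results.items() if not ok]  # the metrics to act on
```

The failed list is the input to the "understand and act on the results of the evaluation" practice: each failing metric points at a design element to revise before re-testing.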
Negotiating commitments (Business case; Requirements)
Best practices for risk mitigation from ISO/PAS 18152: Contribute to the business case for the system. Include HS review and sign-off in all reviews and decisions. Analyze the user requirements. Present these requirements to project stakeholders for use in the development and operation of the system. Identify any staffing gap and communicate the requirement to design of staffing solutions.
Example HSI methods and techniques: Usability requirements methods (common industry format); Risk analysis; Value-based practices and principles (identify success-critical stakeholder requirements); Environment/organization assessment.

Development and evolution
Best practices for risk mitigation from ISO/PAS 18152: Maintain contact with users and the client organization throughout the definition, development, and introduction of a system. Evolve options and constraints into an implementation strategy covering technical, integration, planning, and manning issues.
Example HSI methods and techniques: Usability requirements methods (common industry format); Models and simulation; Risk analysis; Usability evaluation methods; User feedback on usability; Performance measurement.

Monitoring and control
Best practices for risk mitigation from ISO/PAS 18152: Analyze feedback on the system during delivery and inform the organization of emerging issues. Manage the life-cycle plan to address HS issues. Take effective mitigation to address risks to system usability. Take account of user input and inform users. Identify emerging HS issues. Understand and act on the results of the evaluation. Produce and promulgate a validated statement of staffing shortfall by number and range of competence.
Example HSI methods and techniques: Organizational and environmental context analysis; Risk analysis; User feedback; Work context analysis.
Operations and retirement (Operations; Retirement)
Best practices for risk mitigation from ISO/PAS 18152: Analyze feedback on the system during delivery and inform the organization of emerging issues. Produce a personnel strategy. Review the system for adherence to applicable human science knowledge, style guides, standards, guidelines, regulations, and legislation. Deliver training and other forms of awareness-raising to users and support staff. Assess the effect of change on the usability of the system. Review the health and well-being risks to the users of the system. Review the risks to the community and environment arising from human error in the use of the system. Take action on issues arising from in-service assessment. Perform research to refine and consolidate the operation and support strategy for the system. Collect and analyze in-service reports to generate updates or lessons learned for the next version of the system. Identify risks and health and safety issues associated with removal from service and destruction of the system. Define how users will be re-allocated, dismissed, or transferred to other duties. Plan the break-up of social structures. Conduct debriefing and retrospective analysis for the replacement system.
Example HSI methods and techniques: Organizational and environmental context analysis; Work context analysis.
Organizational capability improvement (HSI capability data collection, analysis, and improvement; Organizational skill/career and infrastructure development planning and execution)
Best practices for risk mitigation from ISO/PAS 18152: Identify and use the most suitable data formats for exchanging HF data. Have a policy for HF data management. Perform research to develop HF data as required. Produce coherent data standards and formats. Define rules for the management of data. Develop and maintain adequate data search methods. Feed back into future HR procurement, training, and delivery strategies. Define usability as a competitive asset. Set usability, health, and safety objectives for systems. Follow the competitive situation in the marketplace. Develop a user-centered infrastructure. Relate HS issues to business benefits. Establish and communicate a policy for human-centeredness. Include HR and user-centered elements in support and control procedures. Define and maintain HCD and HR infrastructure and resources. Increase and maintain awareness of usability. Develop or provide staff with suitable HS skills. Take account of HS issues in financial management. Assess and improve HS capability in processes that affect usability, health, and safety. Develop a common terminology for HS issues within the organization. Facilitate personal and technical interactions related to HS issues. Create capability to meet system requirements in the future (conduct succession planning). Identify any opportunities for redeployment. Develop a strategy for [HR] data gathering.
Example HSI methods and techniques: Organizational and environmental context analysis; Assess and improve HSI capability; Develop and maintain HSI infrastructure and resources; Identify required HSI skills; Provide staff with HSI skills; Establish and communicate a policy on HSI; Maintain an awareness of usability.

NOTES: Italicized items are methods not covered in Chapters 6-8. HF = human factors. HS = human-system. HR = human resources. QFD = quality function deployment. HCD = human-centered design.