Human Factors Research and Nuclear Safety (1988)

Chapter: 4. Human-System Interface Design

Suggested Citation: "4. Human-System Interface Design." National Research Council. 1988. Human Factors Research and Nuclear Safety. Washington, DC: The National Academies Press. doi: 10.17226/789.

4 Human-System Interface Design

Human-system interface design addresses the boundaries between the technical systems and the people who interact directly with those systems (see Boundary A in Figure 1). At present, the central issues in this area are the introduction and effect of rapidly advancing computing technology and, at least implicitly, the allocation of functions between people and software-hardware combinations.

We have grouped research needs in this area under three topics: (1) computer-based control and display, (2) automation and computer-based performance aids, and (3) human factors in software development. Each of these topics is described in the following sections. Topics identified by the panel as higher priority are treated in greater detail.

A successful research program on human-system interaction should help reduce the probability of human error due to poor design of human-computer interfaces, decision aids, and automation. It should also support the further upgrading of control room design begun with the control room design reviews. Maintenance activities should also be improved because, increasingly, maintenance personnel will be supported by portable computers, robots, and intelligent decision aids. The need for research on this topic will increase with increasing levels of automation.

COMPUTER-BASED INFORMATION AND DISPLAY SYSTEMS

Rationale and Background

In nuclear power plants, large quantities of data are made available through some medium (e.g., hard-wired instruments, sets of displays) to people who are responsible for controlling, troubleshooting, maintaining, and supervising operations. Research in a variety of disciplines has addressed the issue of how to build effective systems to deliver data to people; this is part of the body of research on human-computer interaction. One part of this research addresses domain-independent issues such as the legibility of displays and the potential accessibility of data bases. Although this research is generally applicable to nuclear power plants, it is not a higher-priority need because factors affecting legibility and accessibility are generally well known and practical guidance is available (e.g., NUREG/CR-4617, 1987e; NUREG/CR-4227, 1985e). The problems involved are concerned with deploying this knowledge when building or evaluating interfaces.

The Safety Parameter Display System (SPDS) requirement has become, for all practical purposes, an effort to establish the use of computer-based information handling and display systems in the control room. There is now or soon will be a computerized data base and a computer-based display medium in a majority of control rooms. This provides an opportunity to expand the use of computer-based information and display systems, if meaningful innovations can be defined.

What is missing from current design and evaluation of computer-based information and display systems is a focus on how to influence human performance positively. People use data displayed about the world in order to solve problems in that world. To do this, problem solvers must collect and integrate available data in order to characterize the state of the world, to identify disturbances and faults, and to plan responses. A basic fact in cognitive science is that the representation of the world provided to problem solvers can affect their problem-solving performance (Lenat and Brown, 1984; Rasmussen, 1986). Thus, questions about the display of data can be reinterpreted as questions about how types of representations vary in their effect on the problem solver's information processing activities and problem-solving performance.

This viewpoint goes beyond questions of the media of display, legibility, the potential availability of the data presented, and other questions that are independent of the tasks involved, and focuses attention on ways in which information displays can assist human performance. It is research on this aspect of human-computer interaction that should be pursued (see also the section on automation and computer-based human performance aids).

The default design position has been simply to make accessible to the problem solver all of the available data and evidence from which judgments can be constructed. If there is only one kind of representation available, then the common belief is that it must be the most detailed representation of the state of the world that would ever be needed under any circumstances (Rasmussen, 1986). This is often referred to as separable display of data or as a one-measurement-one-display-unit philosophy of data display (Goodstein, 1981). Given this view, it is the role of research on human-computer interaction to provide guidance to ensure that elemental data are legible and potentially accessible. However, simply making data legible and accessible does not guarantee that the user will access the correct data at the correct time (NUREG/CR-4532, 1986b).

In many cases in the history of human-machine systems (especially nuclear power), technological choices contained serendipitous relations between the form and content of a representation that were used to advantage by the problem solver. For example, the position of a device usually controlled by an automatic system was indicated via digital hard-wired counters. These mechanical step counters happened to make clearly audible clicks when the device position changed. Operators were able to make use of the clicking sounds to monitor this system because the clicks and click rate contained information about the behavior of the automatic controller. However, in many other cases, changes in technology hindered user performance because these serendipitous relations disappeared and no functional alternative was provided. One example is tile-annunciator-based alarm systems and initial attempts to shift to computer-based chronological lists of alarms. Recent studies have shown that the spatially distributed tile system supports operator extraction of many useful forms of information (Kragt and Bonten, 1983). The shift from a spatial to a temporal organization that occurred when the tile technology was replaced with chronological listing of the same data on abnormal conditions removed the serendipitous benefits provided by the spatial organization inherent in the tile medium. For this reason, in at least one case the new technology was abandoned, forcing a return to the tile system (Pope, 1978). (A minimal sketch of this spatial-versus-chronological contrast appears at the end of this section.)

Research Recommendations

Given today's computer technology and the evolution from analog systems to distributed digital systems, new human-computer interfaces and decision aids will be introduced into nuclear power plants, and new problems may arise with the improper use of the new capabilities. Regulatory positions about these kinds of developments may retard the introduction of systems that could enhance safety. On the other hand, an inability to regulate the introduction of new technology may create new safety problems, such as new opportunities for human error. Thus, a method to specify and evaluate the adequacy of new interfaces and decision aids is important. Computerized versions of paper procedures may be the first of these items to challenge the NRC and the industry.

Research on what constitutes effective computer-based information and display systems is not yet complete. It is important that the nuclear industry foster and track progress in this area; otherwise this research may be driven by other applications that are not generalizable to the nuclear power plant context. Most design guidelines for display are oriented toward dialog applications, such as text editing or data base retrieval, and not toward real-time graphic display of data. There is an opportunity for informed innovation to create more effective control room information and display systems; this opportunity exists because most control rooms now contain a computer-based display medium. Additional opportunities exist in other areas of the nuclear power plant such as maintenance, tests, calibration, and monitoring of technical specifications.

There have been a number of applications of computer-based displays already in the nuclear industry, both prototypes and systems in use (e.g., SPDS). However, no useful interpretation of experiences with the introduction of computer-based interfaces has emerged. This gap is unfortunate, and eliminating it is important to future innovations and development.
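
As a minimal, hypothetical illustration of the representation contrast discussed above, the Python sketch below renders the same alarm data two ways: as a fixed spatial grid in the spirit of a tile annunciator panel, and as a chronological list. The alarm tags, tile positions, and times are invented for illustration; the point is only that the spatial view preserves at-a-glance pattern information that the chronological view discards.

```python
# Hypothetical sketch: the same alarm data rendered two ways.
from dataclasses import dataclass

@dataclass
class Alarm:
    tag: str    # annunciator identifier (invented)
    row: int    # fixed tile position: row
    col: int    # fixed tile position: column
    t: float    # time of annunciation, seconds

alarms = [
    Alarm("RCS-PRESS-HI", 0, 0, 12.0),
    Alarm("PZR-LVL-HI",   0, 1, 12.4),
    Alarm("CHG-FLOW-LO",  2, 3, 13.1),
]

def tile_view(alarms, rows=3, cols=4):
    """Spatial organization: each alarm always lights the same tile,
    so the overall pattern of lit tiles itself carries information."""
    grid = [["....."] * cols for _ in range(rows)]
    for a in alarms:
        grid[a.row][a.col] = a.tag[:5]
    return "\n".join("  ".join(row) for row in grid)

def chronological_view(alarms):
    """Temporal organization: position reflects only order of arrival,
    so the spatial pattern information is lost."""
    ordered = sorted(alarms, key=lambda a: a.t)
    return "\n".join(f"{a.t:6.1f} s  {a.tag}" for a in ordered)

print(tile_view(alarms))
print()
print(chronological_view(alarms))
```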

AUTOMATION AND COMPUTER-BASED HUMAN PERFORMANCE AIDS

Rationale and Background

In the past, allocation of function has been based on catalogs of "things computers do better" and "things people do better." With the current rate of technological development, however, existing catalogs are obsolete, and this distinction may soon cease to be relevant in most situations. As artificial intelligence technology develops, the idea of fixed allocation is no longer appropriate. NUREG/CR-3331 (1983b) outlined an approach to functional allocation that correctly emphasizes an iterative approach to the solution for conventional systems, but the panel believes a different conceptual framework is required.

For a given allocation of functions and design of controls and displays, appropriate performance aids may be identified. The question of new automation and new human support systems will arise because of changes in technology. First, changes in current technology will naturally occur with system upgrades and system obsolescence and perhaps with plant life extension. A notable example is the changeover in many plant systems to newer distributed digital technologies. Second, new opportunities will exist for aiding and automation afforded by new technologies, such as the opportunities for decision aiding and automation afforded by artificial intelligence techniques and expert systems.

Research Recommendations

Innovations are currently under development in several areas that will challenge the NRC's and the industry's ability to deal with issues surrounding new computer-based support systems. Research is therefore needed to deal with them. Questions concerning utilization of available research include: What is effective computer-based support? How can brittle problem-solving systems be avoided (Brown, Moran, and Williams, 1982; Roth, Bennett, and Woods, in press)? How should human and machine intelligence be combined into an effective overall system (Hollnagel, Mancini, and Woods, 1986; Mancini, Woods, and Hollnagel, in press)? What are effective human-plus-machine decision-making systems (Sorkin and Woods, 1985)? What are effective supervisory control architectures (Sheridan and Hennessy, 1984; Moray, 1987)? What factors affect decision making in multiperson, multifacility systems (Fischhoff, 1986)? How can one measure the effect and quality of different kinds of computerized aids? Research on questions such as these should be undertaken or tracked and used when applicable. For example, a research program is under way to track the effects of recent changes in commercial flight deck automation (Curry, 1985; Wiener, 1985b). The results of this research should be tracked and transferred to the nuclear industry.

The near future will be a time of widespread innovation and learning from the results of trial innovations. The industry needs to establish mechanisms to enhance access and to support this innovation process. Important areas in which innovation should be encouraged include the management of maintenance, test, and calibration activities; alarm systems; and aids for supervisory activities, especially in the first two areas. Disturbance analysis has waned as a topic of interest to the industry because of the realization that it is a difficult problem; however, it is re-emerging under the label of artificial intelligence. While artificial intelligence techniques offer many advantages, the important problem of providing useful diagnostic and emergency management information to human problem solvers remains unsolved.

There are other areas in which innovations are currently under way that will challenge the NRC's and the industry's ability to deal with issues surrounding computer-based support systems. New computer-based system aids for measuring compliance with technical specifications and for emergency operating procedures are now available. (Some utilities have contracted for such systems, and some have already applied them, with mixed results.) Several new kinds of computer-based alarm systems or alarm system supplements have been designed, and several utilities have expressed interest in alarm system upgrades. Several projects are ongoing to develop support systems based on artificial intelligence for nuclear power plant personnel. For example, a prototype expert system has been developed to support emergency notification decisions during accidents. These developments mean that there is an immediate need for effective tools to evaluate and measure the impact of new aids and automation on the human-technical system. Developing these measuring techniques is the highest immediate priority in this area.
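
As a purely illustrative sketch of what one such measuring tool might look like, the fragment below compares crew performance on the same simulated fault with and without a computer-based aid. The metric, trial counts, and numbers are invented assumptions, not data from any study; a real tool would be far more extensive.

```python
# Hypothetical sketch: comparing time to correct diagnosis (minutes) for
# simulated trials run without and with a computer-based decision aid.
from statistics import mean, stdev

unaided_trials = [14.2, 11.8, 16.5, 13.0, 15.1]  # invented numbers
aided_trials   = [9.6, 12.3, 8.8, 10.4, 11.0]    # invented numbers

def summarize(label, times):
    print(f"{label}: mean {mean(times):.1f} min, sd {stdev(times):.1f} min")

summarize("Unaided", unaided_trials)
summarize("Aided  ", aided_trials)

# A fuller evaluation would also track error types and frequency, workload
# ratings, and performance on faults the aid was NOT designed for (to detect
# over-reliance), across enough trials for statistical power.
```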

The challenge of new technologies such as artificial intelligence or other forms of automation goes beyond the simple bottom-up construction of what the technology allows to be built. A nuclear power plant is already a highly automated system. When performance deficiencies are noted, they are usually attributed to the human element alone; and, from a purely technological point of view, the obvious solution is more automation. However, there are dangers in accepting a purely technological view of automation. Cases of new automation leading to new kinds of poor human performance are not unknown. Thus, it is important to conceive of new automation decisions as sociotechnical opportunities. These opportunities will arise because of changes in current technology and because of new capabilities for building decision automation.

While research is needed to evaluate emerging support systems requiring regulatory review, the same questions will arise continuously with future systems. The problem for the nuclear industry and its regulators is to learn how to achieve the benefits of new technological developments, to avoid retarding technological change, and to find and mitigate deficiencies in the use of the new technologies. Careful examination of the history of automation reveals that shifts have sometimes created new types of errors or accidents because they have changed the entire human-machine system in unforeseen ways (e.g., Noble, 1984; Hirschhorn, 1984). Some examples of the unintended and unforeseen negative consequences that have followed from purely technologically driven deployment of new automation capabilities are summarized below:

- Shifts from manual to supervisory control in process control, in which productivity actually fell from previous levels when there was a failure to support the new supervisory control demands (British Steel Corporation, 1976);
- Automation-related disasters in aviation (e.g., Wiener, 1985a);
- The shift in power plant control rooms from tile annunciator alarm systems to computer-based alarm systems that eventually failed and forced a return to the older technology, because strategies to meet the cognitive demands of fault management that were implicitly supported by the old representation were undermined in the new (Pope, 1978);
- Shifts from paper-based procedures to computerized procedures that have also failed due to disorientation problems, because of the failure to anticipate the cognitive implications of technological changes for human problem solving (Elm and Woods, 1985).

Automation may make significant improvement possible, but it is not always beneficial. The point is not that new technology should be avoided, because automation does make significant improvements possible. Similarly, the point is not that new technology is always beneficial, because there are post-conditions associated with its introduction that must be satisfied in order for the potential to be achieved and for undesirable consequences to be avoided or mitigated. Furthermore, these post-conditions can influence the way the technology is employed. Examples of ignored post-conditions undermining the benefits of technological change are numerous. Automation decisions then become a trade-off between the degree and magnitude of the potential benefits achieved and the costs of implementing the new technology, plus either the costs of identifying and meeting post-conditions or the costs associated with accepting the new failures that may occur.

The relation of operators and maintenance workers to their equipment should be a symbiotic one. Research should not be conducted on allocation by simple assignment of tasks to either the human or the machine. Formulating the allocation issue in this way assumes that human-related problems can be solved by just a little more engineering. Human-related problems, however, are symptoms, not causes, of underlying problems in the sociotechnical system. One should therefore ask how to design the system so that each can support the other, request and give help as needed, and produce the most effective joint outcome.

There is little understanding, at present, of what makes a person trust or distrust a machine, the advice it gives, or the action it takes, and there is only the beginning of an understanding of the nature of the human cognitive processes that underlie the acquisition and assessment of evidence and the genesis of decisions on which trust is based.

Yet these processes lie at the core of the human control of complex systems and center on the nature of the operators' mental models, through which they interpret the demands of the task, be it operation or maintenance. Researchers should investigate the merits and methods of dynamic allocation, consider whether the human role should be exclusively in a high-level supervisory capacity, and determine the difference in allocation for routine and emergency operations to ensure good performance under both. (A toy sketch of one dynamic allocation policy follows this discussion.)

One critical behavioral science issue related to questions of automation is the choice between man-in-the-loop and man-out-of-the-loop architectures. This issue involves the relation between the human and the machine roles in controlling and managing a complex machine process; it is not an issue of the level of automation. In the past, the result of increased automation was to move the human role farther away from direct contact with the controlled process. In this new role, the human becomes a supervisor and manager of partially autonomous machine resources (e.g., Sheridan and Hennessy, 1984). This means that, with increases in the level of automation, the human is moved "out of the loop." Research on the effects of automation is beginning to suggest (e.g., Wiener, 1985a) that this architecture may be poor for person-machine control of complex processes. As an alternative, this research suggests the need for new architectures in which the level of machine involvement is high (a highly automated system) but in which the human plays a more continuously active role in the control and management of the process and is more "in the loop" than in past, partially automated systems. Such a concept has profound implications for the course of future automation and decision support in nuclear power plants.
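
The toy sketch below illustrates what dynamic allocation could mean in concrete terms: the task-to-agent assignment is recomputed as operator workload and plant state change, rather than being fixed at design time. The task names, workload scale, and thresholds are invented for illustration; this is a conceptual sketch, not a proposed plant design.

```python
# Hypothetical sketch of a dynamic allocation policy.
def allocate(tasks, operator_load, emergency):
    """Map each task to "operator" or "automation" under a simple policy.

    tasks: dict of task name -> whether the task is automatable.
    operator_load: 0.0 (idle) to 1.0 (saturated); invented scale.
    """
    assignment = {}
    for task, automatable in tasks.items():
        if not automatable:
            assignment[task] = "operator"    # judgment-heavy work stays human
        elif emergency or operator_load > 0.8:
            assignment[task] = "automation"  # shed routine work under pressure
        else:
            assignment[task] = "operator"    # otherwise keep the human in the loop
    return assignment

tasks = {
    "log routine readings": True,
    "adjust feedwater flow": True,
    "diagnose unexpected alarm pattern": False,
}

print(allocate(tasks, operator_load=0.9, emergency=False))
print(allocate(tasks, operator_load=0.4, emergency=True))
```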

Numerous attempts have been made in recent years to improve plant operating procedures, particularly emergency operating procedures (EOPs). For example, symptom-based or function-based procedures seem to be an improvement. However, frequent complaints are still heard from operators about the size and complexity of EOPs. It has been suggested that computer-based EOPs may improve performance, but this will be critically dependent on their design. When EOPs are computerized, the resulting system is one form of a computer-based human performance aid. This means that all of the research issues discussed in this section become relevant to improved procedure design and presentation.

Research on the mechanisms that produce human error is also critical to the development of improved procedures. For example, there is evidence that people lose their place in hierarchically organized computer data bases, and questions arise about legibility, accessibility by more than one operator, and place-keeping (a minimal sketch of place-keeping support appears at the end of this section). The design of EOPs, as with any computer-based human performance aid, should be a systems process in which the layout of the control room, manning levels, and training are taken into account. EOPs must be validated in the control room or in a full-scale simulator to determine whether it is possible for crews to carry them out in a timely manner and to observe what will happen if a reduced-size crew has to use them. At present, there is no coherent theory for the design of EOPs, and research is required to develop such a theory (Fehrer et al., 1986a, 1986b).

The current literature on the evaluation of expert systems and artificial intelligence simulation of cognition should be reviewed. The nature of validation as applied to such systems should also be examined, since the common criterion, a system's ability to simulate human performance, is inadequate for a system designed to enhance human performance. Attention should also be given to the problem of verifying that software has been fully debugged. Experimental studies should be conducted in simulators with operators who have had many hours of practice with the proposed systems: probably a minimum of 50-100 hours is required. Attempts should be made to find situations in which the system is tested against unknown faults or faults beyond the design basis, and against cascading faults whereby many small failures combine to place intense cognitive demands on the operator. The whole concept of validating or assessing such systems is itself a major research problem whose solution is at present unknown. Research should concentrate on the validation and assessment of such systems, leaving their development to industry. Meeting this research challenge is one example in which access for researchers to nuclear power plant facilities and personnel is critical.

The benefits of research in this area include improved safety through new human performance aids and improved evaluation of new systems. This research will also be needed to address such questions as the impact of high technology, expert system applications, severe accident management, enhanced technical support centers, and measurement of human reliability with new forms of support systems.
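
As a minimal sketch of the place-keeping support mentioned above, the fragment below keeps an explicit breadcrumb of the crew's position in a hierarchically organized procedure, so that "where am I?" always has an answer. The procedure content is invented and drastically simplified; it is not an actual EOP.

```python
# Hypothetical sketch: place-keeping in a computerized procedure hierarchy.
eop = {
    "E-0 Reactor Trip": {
        "Step 1: Verify reactor trip": {},
        "Step 2: Verify safety injection": {
            "2a: Check SI pumps running": {},
            "2b: Check valve alignment": {},
        },
    },
}

class ProcedureNavigator:
    def __init__(self, tree):
        self.tree = tree
        self.path = []  # breadcrumb: the crew's current place

    def enter(self, step):
        node = self.tree
        for s in self.path:  # walk down to the current node
            node = node[s]
        if step not in node:
            raise KeyError(f"{step!r} is not a substep of {self.where_am_i()}")
        self.path.append(step)

    def back(self):
        if self.path:
            self.path.pop()

    def where_am_i(self):
        return " > ".join(self.path) or "(top level)"

nav = ProcedureNavigator(eop)
nav.enter("E-0 Reactor Trip")
nav.enter("Step 2: Verify safety injection")
nav.enter("2a: Check SI pumps running")
print(nav.where_am_i())
# E-0 Reactor Trip > Step 2: Verify safety injection > 2a: Check SI pumps running
```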

HUMAN FACTORS IN SOFTWARE DEVELOPMENT

Rationale and Background

One result of the general trend toward increasing levels of automation in nuclear power plants is that system performance (safety, reliability, availability, etc.) will increasingly become a function of the quality of software performance, and there are many human factors issues in the development of high-quality software (Soloway and Iyengar, 1986; Shneiderman, 1980). A typical current estimate of the best human performance in computer programming is 3 programming errors per 1,000 lines of code; this is comparable to error rates in many other human tasks. While many of the errors in software development may not be critical, this error estimate suggests that for the software-intensive systems of the future, which can be expected to contain hundreds of thousands of lines of code, software errors will be a major problem unless the rate of programming errors is reduced (see the illustrative calculation below). Other types of errors, such as conceptual modeling errors and errors due to mismatches between the mental models of programmers, engineers, and users or operators, may be less numerous but of greater importance. As additional decision making is allocated to "intelligent" software, the potential for more severe consequences from a single error is increased, and testing and validation of this software is likely to become a critical bottleneck to its acceptance.
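
For a sense of scale, the illustrative calculation below applies the error rate quoted above to an assumed 200,000-line system; the system size is an invented figure, not one taken from the report.

```python
# Hypothetical scale estimate: latent errors implied by the quoted rate of
# 3 programming errors per 1,000 lines, for an assumed 200,000-line system.
lines_of_code = 200_000        # assumed system size
errors_per_thousand_lines = 3  # rate quoted in the text
latent_errors = lines_of_code // 1_000 * errors_per_thousand_lines
print(latent_errors)  # 600 errors before testing, review, and debugging
```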

Research Recommendations

A program should be initiated to review and assess the literature on software psychology to identify problems of importance to nuclear plant safety, to determine what, if any, nuclear-plant-specific research is required, and to carry out research as appropriate. The panel expects that the results of an initial study will disclose that continued tracking, assessment, and adaptation of general research in the field will be sufficient. However, since many of the problems may involve industry-specific issues and technology, development for the nuclear industry will be needed, especially in the long term. The nuclear industry should encourage research in the area, since it will benefit from the results.

Some requirements and guidelines for incorporating human factors concerns into software development can be developed quickly. However, other requirements and guidelines cannot be developed without further advances in software psychology.
