5
Human Factors Considerations for Automatic Identification System Interface Design

From the perspective of the human operator of automatic identification systems (AIS), the “interface” is defined as the display and control mechanisms that enable the exchange of information between the person and the AIS. The interface includes not only the display of information, such as cathode ray tube graphics and auditory warnings, but also data entry and control elements, such as a keyboard or switches.

Developing an effective AIS interface requires a systematic process that considers the capabilities of the users and the demands of the operational environment. Although several researchers have investigated mariner collision avoidance and navigation strategies and information needs, no one has systematically evaluated how AIS can support these and other information needs (Hutchins 1990; Laxar and Olsen 1978; Lee and Sanquist 1993; Lee and Sanquist 2000; Schuffel et al. 1989). To date, neither the design of AIS controls nor the information needs of the mariner and the method of displaying that information have been defined and evaluated sufficiently well. Thus, a focus on human factors considerations for AIS interfaces is needed.

Once a system has been designed, manufactured, and put in service, it must be maintained. The goal of human factors in maintenance, as in design, is to enhance safe, effective, and efficient human performance in the system. In recent years it has become apparent that human factors methodology has as much to contribute to maintenance as it does to design. In the aviation and process control industries, for example, structured human factors methods (e.g., Maintenance Error Decision Aid) are being applied to maintenance with some success (Johnson and Prabhu 1996; Maurino et al. 1998; Reason and Maddox 1998). According to the National Aeronautics and Space Administration (NASA 2002), four primary activities are undertaken in maintenance human factors: (a) human factors task/risk analysis, (b) procedural improvements, (c) maintenance resource management skills and training, and (d) use of advanced displays (to clarify procedures and to make information resources more accessible without task interruption). Because of the changing nature of the workplace and required tasks, especially given



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




Shipboard Automatic Identification System Displays: Meeting the Needs of Mariners - Special Report 273

the increasing use of automation in maintenance, workers in these jobs must acquire new skills for tasks that will not necessarily reduce their workloads. In addition, issues of software version control and data maintenance (e.g., updated chart information, updated cargo information) may require special procedures and training as well as more specialized personnel. As will be seen in the discussion below, many of these types of activities are relevant to AIS shipboard displays. Although maintenance issues are important and merit consideration and comprehensive evaluation before implementation of a specific AIS, these system-level (not display-specific) issues were beyond the scope of this report.

Some of the key human factors considerations important in interface design are outlined in this chapter. A description of the human factors design process is given first. How the three stages of understand–design–evaluate might be applied to the design of AIS interfaces is then discussed. A number of human factors guidelines that can assist in the design of current and future AIS interfaces are also provided.

CORE ELEMENTS OF THE HUMAN FACTORS DESIGN PROCESS

Human factors design activities are an integral element of the overall systems analysis and design process described in Chapter 4. The focus of human factors design is on the interaction between the design and the human. Thus, human factors design processes can be simplified into three major phases: understand the user and the demands of the operational environment, design a system on the basis of human factors engineering principles and data, and evaluate the system to ensure that the system meets the needs of the user (Woods et al. 1996) (see Figure 5-1). These steps are mapped to the systems analysis and design framework outlined in Chapter 4 and shown in Figure 5-2.
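The understand–design–evaluate cycle can be sketched schematically as a loop in which evaluation findings seed the next iteration. Everything in this sketch (the stage functions, the stopping rule, and the example findings) is hypothetical scaffolding for illustration, not an actual design tool:

```python
# Schematic sketch of the iterative understand-design-evaluate cycle
# (after Woods et al. 1996). All names, data, and the stopping rule
# are illustrative only.

def understand(findings):
    # Combine findings from evaluating the existing system with new requirements.
    return {"requirements": findings + ["support collision avoidance"]}

def design(model):
    # Turn requirements into a design concept (stand-in for real design work).
    return {"concept": list(model["requirements"])}

def evaluate(concept):
    # Return unmet needs discovered in evaluation; in practice this is
    # heuristic evaluation, usability testing, and operational testing.
    return []  # an empty list means no further findings this cycle

findings = ["mariners ignored cluttered display"]  # seed: evaluation of the existing system
for iteration in range(3):
    model = understand(findings)   # understanding begins each iteration
    concept = design(model)
    findings = evaluate(concept)   # evaluation ends it...
    if not findings:               # ...and its findings begin the next
        break
```

The point of the loop structure is simply that design rarely starts from a blank slate: the seed findings come from an existing system, and each evaluation both closes one iteration and opens the next.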
To begin with, a task or work analysis can be used to provide the initial data to understand the user and the demands of the operational environment (Kirwan and Ainsworth 1992; Vicente 1999). This understanding and the requirements that result are combined with human factors engineering guidelines and principles to create initial design concepts. As shown in Figure 5-1, design often begins by building on findings from the evaluation of an existing system rather than by starting with a blank slate. This is

FIGURE 5-1 Iterative cycle of system development. (Adapted from Woods et al. 1996.)

FIGURE 5-2 Systems analysis and design framework.

true also for AIS. AIS design will occur in the context of previously developed navigation, communication, and planning aids. After these initial concepts are developed, designers conduct heuristic evaluations and usability tests with low-fidelity mock-ups or prototypes (Carroll 1995). Usability evaluations in realistic operational contexts are particularly useful because they often help designers better understand the users and their needs. AIS deployment may result in mariners using the technology in ways that were not anticipated during the initial design. For this reason, analysis of how mariners interact with prototypes is critical to a better understanding of system requirements. This enhanced understanding can then be used to refine design concepts.

When the system design becomes more defined, it may be placed in an operational environment for comprehensive testing and evaluation. This final evaluation can be considered the final step of system development. It can also be considered the first step in developing a better understanding of the user for the next version of the prototype or product. For this reason, it is important to consider AIS design and the associated standards and certification development as a continuous process that evolves as more is learned about how mariners use AIS and how AIS affects the maritime industry. This continuous cycle is reflected in the link between evaluation, which ends one iteration, and understanding, which begins the next iteration of the design cycle. Some of the more critical elements of each of these three phases are described in the remainder of this chapter.

The most obvious focus of the design process is the physical display and controls that make up the operator interface. However, with complex technologies such as AIS, training and documentation also represent important elements of design.
Ignoring documentation (manuals, instruction cards, help systems) and training can lead to errors, poor acceptance, and ineffective use of the system.

UNDERSTANDING THE NEEDS OF THE OPERATOR

New technology can change demands on the bridge crew dramatically. If they are properly developed, technological advancements should make operators more efficient and safe. Under proper conditions, workload declines and performance improves with the introduction of new navigation technology, even when the number of crew members declines (Schuffel et al. 1989). Other studies, however, have shown significant performance declines with the introduction of new technology, particularly under medium- and high-stress conditions (Grabowski and Sanborn 2001). Studies in other domains suggest that poorly designed automation may reduce workload under routine conditions but can actually increase workload during stressful operations (Wiener 1989; Woods 1991). One possible explanation for these apparently contradictory findings has been suggested by Lee and Sanquist (2000), who point out that the evaluation of modern technologies often addresses only routine performance and does not consider more stressful and nonnormal conditions where new technology can actually impair performance. (A fuller discussion of automation-related issues and their potential impact on operator performance is given in the section “Human/Automation Performance Issues.”)

In addition, new technology can introduce new cognitive demands, such as the need to monitor more ships during collision avoidance, to form mental models of the new technology, and to perform complex mental scaling and transformations to bridge the gap between the data presented and the information needed by the operator. Although problems abound, properly implemented technology (such as AIS) promises to enhance ship safety as it eliminates time-intensive, repetitive, and error-prone tasks. To realize the promise of new maritime technology requires a clear understanding of the needs of the operator. Historical data concerning shipping mishaps indicate that many navigation errors result from misinterpretations or misunderstandings of the signals provided by technological aids (NTSB 1990). Moreover, Perrow (1984) notes that poor judgment in the use of radar contributes to many maritime accidents.
In some situations the mariner may receive so many targets and warnings that it may be impossible to evaluate them, and that display may be ignored. In addition, production pressures could force mariners to use the devices to reduce safety margins and operate their vessels more aggressively. The demands and pressures that new technology can place on mariners can induce unanticipated errors. These findings suggest that poorly designed and improperly used technology may jeopardize ship safety. In addition, AIS may eliminate many tasks, make complex tasks appear easy at a superficial level, and lead to less emphasis on training and design. AIS may also introduce new

phenomena that affect mariner decision making, such as automation bias and overreliance on a single source of information to guide collision avoidance and navigation. In this situation, if the display fails to contain the information necessary to specify operator actions, errors will result (Rasmussen 1986; Vicente and Rasmussen 1992). This is particularly problematic with AIS because it may provide information on only a subset of the vessels the operator must consider in navigating a safe course. Thus, it is clearly important to understand the cognitive tasks involved with AIS to guide design and training.

As a demonstration of this process, the committee conducted a preliminary task analysis using observations of a towing vessel representative of those that operate on the upper Mississippi River and its tributaries. This type of inland towing operation involves transiting locks and relatively long voyages. This compares with fleeting vessels, which operate in a relatively small area of the river, and vessels operating on the lower Mississippi, which may rarely encounter locks. Although towing vessels on the lower Mississippi might not encounter locks, they tend to have a much larger cargo and are likely to interact with deep-draft vessels. The towing vessel observed was also a technological leader that already uses electronic charts. Although many towing companies have adopted electronic charts, many smaller companies have not. The towing industry includes many types of vessels and operations, which may lead to different applications of AIS, particularly compared with the application of AIS for deep-draft vessels. To understand the nature of these differences, preliminary observations and a task analysis were conducted. Similar analyses should be performed for other classes of vessels as well.
A simple way to organize observations of navigation and communication activity is according to information, functions, and events (see Figure 5-3). A complementary approach would be to address the underlying constraints of the work domain on behavior (Vicente 1999). Both approaches would be useful in a comprehensive analysis of how AIS could support mariners. Information refers to the categories of information that pilots and captains use to guide their actions. In some cases, such as the “lock ticket,” the information is contained on a piece of paper, but it may also be updated by radio communications, so “lock ticket” represents more than the physical piece of paper. For each information

category, Table 5-1 shows the different activities and their descriptions, including a source for the activity, such as radio communication or visual observation. Functions are the information transformation processes that achieve system goals. These capture what people and technology do in the pilothouse. As shown in Figure 5-3, each function takes information as input and generates information as output. The functions are triggered by events, and they also initiate events. Events are the triggers that initiate functions and the state changes that are a consequence of the information transformation and activities associated with a function. Table 5-1 shows a sample of representative information, functions, and events.

FIGURE 5-3 Towing vessel meeting example.

An example of the relationships between functions, events, and information can be seen in the towing vessel meeting diagram shown in Figure 5-3. In this example, a towing vessel meeting another vessel experiences at least two events: hazard detection (e.g., fixed-object hazards) and vessel threat detection. The two events result in functions being performed aboard the towing vessel: communication, meeting planning, establishment of a passing agreement, vessel speed and position control, hazard monitoring, and position establishment. Those functions result in another event, identification of a meeting point. Figure 5-3 also shows that the information being used

for the functions includes dynamic chart information, traffic situation information, estimated meeting locations, meeting conventions, meeting constraints, and immediate river/vessel interaction information, among other items.

TABLE 5-1 Representative Information, Functions, and Events

Type of Activity: Information

Event log: Documents progress along the river, anomalies, and crew changes. This information is stored and communicated using a computer, note pad, and formal paper log.

Lock ticket: Data needed to coordinate lock passage (e.g., 600- versus 1,200-ft locks): tow configuration, length, draft, cargo, and barge numbers and types. This information is stored and communicated using a paper ticket and note pad; changes are communicated by radio.

Vessel/tow configuration: Information that affects safe passage through channels, locks, and bends, including draft readings, leaks and water in barges, and tow length (visual inspection, notes).

Equipment calibration: Depth estimated by a second vessel, physical state of the depth gauge, confirmation with vessel/river interaction. This information is communicated by radio, visual observation, and vessel response.

Lock waiting location: Array of vessels stopped along the bank before a lock. This information is communicated by radio.

Informal chart data: Addresses lack of detail in charts: fleeting areas, location of private docks, steepness of bank, type of bank and bottom, eddies. Critical for picking an appropriate place to stop and identifying upcoming river hazards (visual confirmation, local knowledge, general river knowledge, radio communication, e-mail).

Dynamic chart information: Addresses changing features of the river: river height, sandbars, current, lock status, obstacles, and hazards. This information is stored and communicated using daily USCG updates on the radio, e-mail, and radio communication with other boats.

Traffic situation: Number, distance, and distribution of approaching boats. This information is communicated and tracked using radio and visual and radar targets.

Estimated meeting location: Point where vessels are likely to pass on the basis of estimated speed and distance. This information is communicated using radio and a chart.

Meeting conventions: Southbound has right-of-way; the Ohio River convention is for the southbound vessel to take the outside of the curve, with subsequent vessels of a sequence following the first; the opposite holds for the lower Mississippi. This information is communicated by radio.

Meeting constraints: Space available for passage, intended track, mechanical problems of a vessel. This information is stored and communicated using the charts and radio.

Immediate river/vessel interaction: Depth of water below barges and response of the tow to control input, based on the actual compared with the expected rate of turn, behavior of lead barges, cavitation, speed/rpm relationship, and current. This information is communicated through visual cues, haptic cues (vibration), and auditory cues.

Lock coordination: Configure tow and loading to match lock capacity; share tow information and any changes with the lock manager to establish lock type and order.

Type of Activity: Functions

Meeting planning: Broadcast position and intention to identify relevant forward vessels (northbound responsibility); use estimated speed, distance, and location of hazards to establish a meeting location.

Establishing passing agreement: Agreeing where and how the vessels will pass (port-to-port or starboard-to-starboard).

Fleeting area and service coordination: Plan for support services, such as maintenance personnel boarding and fleet boats.

Speed and position control: Moment-to-moment control of the vessel's course through the water.

Identify waiting location: Determine availability and suitability of places where the tow can be temporarily stopped against the riverbank.

Stopped at riverbank: Stopped, waiting for fog to lift or for a turn through the lock.

Hazard monitoring and detection: Scanning the river to identify hazards, which include other vessels, upcoming turns, sandbars, and recreational boaters.

Type of Activity: Events

Hazard detected: Detection that a hazard is present.

Upcoming vessels detected: Realizing that other vessels are in the vicinity.

Meeting point identified: Determining the location at which vessels will meet.

Passing agreement made: Agreement made between vessels as to where and the manner in which they will meet.

Lock delay identified

Change in river height: Changes in water depth.

Lock approached

Lock passed: Have gone through the lock.

This simple example provides an idea of the relationships between information, functions, and events in a towing vessel meeting situation. Note that the relationship between events and functions is sequential: events trigger functions, and functions result in changes in events. Note also that information is a critical input to functions; information is needed for functions to occur.

The relationships between information, functions, and events can be modeled in a variety of ways: for instance, by using data flow diagrams (Hoffer et al. 2002), the Unified Modeling Language for object-oriented software and hardware (Kobryn 1999), data modeling (Date 2002), transition matrices, and input/output matrices. Computer-aided software engineering tools, as described earlier, are electronic repositories for each of these different types of models that can be used for analysis, design, and development activities. Each of these models and approaches focuses on the use of information to facilitate activities in response to, and in order to successfully execute or anticipate, events in a domain.
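The sequential relationship just described, in which events trigger functions and functions consume information and produce new events, can be sketched as a small data model. The specific trigger and information mappings below are simplified abstractions of Figure 5-3 and Table 5-1, and the dictionary structure itself is only an illustrative choice:

```python
# Sketch of the information/functions/events organization from the meeting
# example: each function is triggered by events, takes information as input,
# and produces events as output. Mappings are abstracted for illustration.

functions = {
    "meeting planning": {
        "triggered_by": ["vessel threat detected", "hazard detected"],
        "information": ["dynamic chart information", "traffic situation",
                        "estimated meeting location", "meeting conventions",
                        "meeting constraints"],
        "produces": ["meeting point identified"],
    },
    "hazard monitoring": {
        "triggered_by": ["hazard detected"],
        "information": ["informal chart data", "dynamic chart information"],
        "produces": [],
    },
}

def functions_triggered(event):
    """Which pilothouse functions does an event initiate?"""
    return [name for name, f in functions.items() if event in f["triggered_by"]]

def resulting_events(event):
    """Events are sequential: an event triggers functions, which yield new events."""
    return sorted({e for name in functions_triggered(event)
                   for e in functions[name]["produces"]})
```

For example, `resulting_events("hazard detected")` traces how a detected hazard leads, via the meeting planning function, to the meeting-point-identified event.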
The patterns evidenced by different events, information, and functions in a domain provide important clues

as to appropriate technology design and development strategies to assist human operators. The variety of functions that AIS might support and the variety of information sources demonstrate the challenge of integrating AIS into the mariner's decision-making process. The variety of information, events, and functions also demonstrates the vessel- and operation-specific nature of AIS display design. While some of the elements of navigation, communication, and planning tasks remain constant across different types of vessels and operating environments, others change. Thus, systematic analysis of the information, functions, and events that describe mariner activities is needed to derive AIS display and control requirements.

The results of this analysis might include a transition matrix that identifies the potential challenges that might interfere with the functions occurring. Such a matrix can also identify the interface strategies that could help AIS support these transitions (e.g., separate alarms versus integration with other displays). Another result could be an input/output matrix that describes the information flow between functions, whether the information is an input or an output of each function, and the data entry and data flow requirements. Unneeded data entry should be avoided, and data output should be organized to avoid overwhelming the operator. An input/output matrix can help identify how AIS outputs can be integrated, combined, and formatted to support the functions with minimal data entry and cognitive transformations. An input/output analysis can also result in specific strategies for supporting efficient manipulation and use of the information and can identify potential breakdowns in information flows and functions.
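An input/output matrix of this kind can be represented directly in code. The entries below are hypothetical, loosely drawn from the towing example rather than from the committee's actual analysis; the point of the sketch is that information consumed by several functions is a candidate for integrated display rather than repeated data entry:

```python
# Sketch of an input/output matrix: rows are functions, and each entry marks
# whether an information item is an input ("I") or output ("O") of that
# function. The entries are illustrative, not the committee's analysis.

io_matrix = {
    "meeting planning": {
        "traffic situation": "I",
        "meeting conventions": "I",
        "estimated meeting location": "O",
    },
    "passing agreement": {
        "estimated meeting location": "I",
        "meeting constraints": "I",
    },
    "speed and position control": {
        "immediate river/vessel interaction": "I",
        "estimated meeting location": "I",
    },
}

def shared_inputs(matrix):
    """Information consumed by more than one function: candidates for an
    integrated AIS presentation that avoids repeated entry and transformation."""
    counts = {}
    for entries in matrix.values():
        for item, role in entries.items():
            if role == "I":
                counts[item] = counts.get(item, 0) + 1
    return sorted(item for item, n in counts.items() if n > 1)

def producers(matrix, item):
    """Functions whose output could automatically feed the item to its consumers."""
    return [f for f, entries in matrix.items() if entries.get(item) == "O"]
```

Here `shared_inputs(io_matrix)` flags the estimated meeting location, and `producers` shows it is generated by meeting planning, suggesting that its consumers could receive it automatically rather than through manual entry.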
For instance, the initial observation of towing operations identified several considerations for AIS implementation in the inland towing industry:

Combining a radar overlay may clutter the electronic chart and require substantial adjustments to avoid inconsistencies in electronic chart orientation. AIS information can further complicate these tasks if it is not carefully integrated.

Geographic constraints make meeting point coordination an important task for inland towing. The variable speed and intention of other vessels make meeting location estimation for towing vessels difficult. Any AIS implementation should consider how to address this challenge.
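To make the estimation task concrete: under a constant-speed assumption the baseline calculation is simple, and the difficulty noted above is precisely that real speeds and intentions vary. The one-dimensional sketch below uses hypothetical river-mile positions and speeds, not data from the observations:

```python
# Sketch: estimating where two vessels approaching each other on a river will
# meet, assuming constant speeds. In practice, AIS position and speed updates
# would refine this estimate as the vessels' actual behavior varies.
# Positions are river miles; speeds are miles per hour.

def estimated_meeting_point(pos_a, speed_a, pos_b, speed_b):
    """Meeting point for vessel A (moving upriver, toward higher mile numbers)
    and vessel B (moving downriver, starting above A), constant speeds assumed."""
    closing_speed = speed_a + speed_b
    if closing_speed <= 0:
        return None  # the vessels are not closing; no meeting predicted
    hours_to_meet = (pos_b - pos_a) / closing_speed
    return pos_a + speed_a * hours_to_meet

# A northbound tow at mile 100 making 5 mph and a southbound tow at mile 130
# making 10 mph close 30 miles at 15 mph, i.e., in 2 hours:
mile = estimated_meeting_point(100.0, 5.0, 130.0, 10.0)  # -> 110.0
```

Even this baseline depends on both vessels' broadcast speed and position being current, which is one way AIS data could support the meeting planning function.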

as remote collaboration, engineering analyses, scientific data interpretation, and aircraft cockpits (Barrass and Kramer 1999). These applications show that sonification can convey subtle changes in complex time-varying data that are needed to promote better coordination between people and automation. Because sound does not require the focused attention of a visual display, it may enable operators to monitor complex situations. Just as with visual displays, combining sounds generates a gestalt from the interaction of the components (Brewster 1997). These findings support a theoretical argument that sonification can be a useful complement to visual displays.

Another sensory channel that is still underutilized is the haptic sense. The sense of touch shares a number of properties with the auditory channel. Most important, cues presented via these two modalities are transient in nature and difficult to miss, and thus are well suited for alerting purposes. The advantage of tactile cues over auditory feedback is their lower level of intrusiveness, which helps avoid unnecessary distractions. Also, like vision and hearing, touch allows for the concurrent presentation and extraction of several dimensions, such as frequency and amplitude in the case of vibrotactile cues.

The distribution of information across sensory channels is not only a means of enhancing the bandwidth of information transfer; it can support the following additional functions:

Redundancy, where several modalities are used for processing the same information. Given the independence of error sources in different modalities, redundancy in human–computer interaction can support error detection and reduce the need for confirmation of one's intention, especially in the context of safety-critical actions.
For example, the AIS could have a redundant auditory alert for important warnings that are displayed on the screen.

Complementarity, where several modalities are used for processing different chunks of information that need to be merged. It has been suggested that such a complementary or synergistic use of modalities is in line with users' natural organization of multimodal interaction.

Substitution, where one modality that has become temporarily or permanently unavailable is replaced by some other channel. This may become necessary in case of technical failures or changes in the environment (e.g., high ambient noise level). For example, the AIS could read text

messages to the mariner, making it possible for the mariner to keep watching the surrounding vessels rather than reading messages on the display.

In summary, the design of a multimodal AIS interface may be a means of avoiding problems related to data overload. It may allow a reduction in competition among attentional resources and thus support effective attention allocation. For example, a graphic representation of the traffic situation can be combined with speech output or other AIS-specific auditory and tactile alerts that capture the officer's attention in potential traffic conflicts or other critical events that may be missed because visual or auditory attention is focused on other tasks.

In addition to creating multisensory system output, it will be desirable to consider different modalities for providing input to AIS. For example, in some circumstances, the use of a keyboard for AIS data entry may not be possible or desirable. In those cases, voice input or a touch screen could serve as alternatives. Thus, the benefits and limitations of the combined use of input and output modalities should be explored, as well as the need for the adaptive use of modalities.

An adaptive approach to the design of multimodal interfaces may be appropriate for various reasons. Factors that vary over time and that may require a shift in modality usage include the abilities and preferences of individual mariners, environmental conditions, task requirements and combinations, and degraded operations that may render the use of certain channels obsolete. For example, the responsiveness to different modalities appears to shift from the visual to the auditory channel if subjects are in a state of aversive arousal (Johnson and Shapiro 1989). Also, modality expectations and the modality shifting effect play a role.
The feasibility of multimodal interfaces also needs to be carefully evaluated. If a person expects information to be presented via a certain channel, on the basis of either agreements or frequency of use, then the response to the signal will be slower if it appears in an unexpected channel. If people have just responded to a cue in one modality, they tend to be slower to respond to a subsequent cue in a different modality (Spence and Driver 1997). Environmental conditions also affect the feasibility or effectiveness of using a certain modality. For example, high levels of ambient noise may make it impossible for the mariner to use the auditory channel and thus require a switch to a different modality that would otherwise be less desirable.
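These considerations can be illustrated with a minimal sketch of adaptive modality selection. The channel names, noise threshold, and priority rules below are hypothetical illustrations of the redundancy and substitution ideas discussed above, not a proposed standard:

```python
# Sketch: choosing output modalities for an AIS alert, illustrating
# redundancy (safety-critical alerts use several channels) and substitution
# (an unavailable channel is replaced by another). The threshold and channel
# names are hypothetical.

NOISE_LIMIT_DB = 85  # assumed level above which the auditory channel is masked

def select_modalities(critical, ambient_noise_db, visual_attention_free):
    channels = []
    auditory_ok = ambient_noise_db < NOISE_LIMIT_DB
    if auditory_ok:
        channels.append("auditory")
    else:
        channels.append("tactile")   # substitution: transient and hard to miss
    if critical or visual_attention_free:
        channels.append("visual")    # redundancy supports error detection
    if critical and auditory_ok:
        channels.append("tactile")   # critical alerts get a third channel
    return channels

select_modalities(critical=True, ambient_noise_db=70, visual_attention_free=False)
# -> ["auditory", "visual", "tactile"]
select_modalities(critical=False, ambient_noise_db=95, visual_attention_free=True)
# -> ["tactile", "visual"]
```

A real adaptive interface would also need to account for the expectation and modality-shifting effects noted above, since rerouting an alert to an unexpected channel slows the response to it.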

Human factors design principles and promising multimodal display alternatives may help define useful AIS display and control designs; however, no research has addressed specific design parameters for AIS. Likewise, multimodal display alternatives seem promising, but research is needed to verify their effectiveness in conveying AIS information.

EVALUATION

Heuristic Evaluation of AIS Interface

Heuristic evaluation, first proposed by Nielsen and Molich (1990), is a low-cost usability testing method for the initial evaluation of human–machine interfaces. The goal of heuristic evaluation is to identify problems in the early stages of design of a system or interface so that they can be attended to as part of an iterative design process. Heuristic evaluation involves having a small set of evaluators examine the interface and judge its compliance with recognized usability principles (the “heuristics”). Each evaluator first inspects the interface independently. Once all evaluations have been completed, the evaluators communicate and aggregate their findings.

Heuristic evaluation does not provide a systematic way to generate fixes for the observed problems. However, because heuristic evaluation aims at explaining each observed usability problem with reference to established usability principles, many usability problems have fairly obvious fixes as soon as they have been identified. Interestingly, a typical human–computer interface expert will identify about a third of the problems with a particular interface using this technique. Another expert, working independently, will tend to discover a different set of problems. For this reason, it is important that two to four experts evaluate the system independently. Heuristic evaluation tends to catch common interface design errors but may neglect more severe problems associated with system functionality.
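The pooling of independent evaluators' findings can be sketched as a set union; the problem labels below are invented for illustration:

```python
# Sketch: aggregating independent heuristic evaluations. Each evaluator
# typically finds only a fraction of the usability problems, and different
# evaluators find different ones, so findings are pooled as a set union.
# The problem labels are invented examples.

evaluations = [
    {"inconsistent target symbols", "no feedback after data entry"},
    {"no feedback after data entry", "alarm wording ambiguous"},
    {"inconsistent target symbols", "small touch targets"},
]

aggregate = set().union(*evaluations)  # pooled findings from all evaluators

# Each evaluator alone covers only part of the pooled set, which is why
# two to four independent evaluators are recommended:
coverage = [len(e) / len(aggregate) for e in evaluations]
```

In this invented example each evaluator finds half of the four pooled problems, mirroring the observation that a single expert typically identifies only a fraction of an interface's problems.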
For this reason, usability tests are needed to evaluate whether the system is actually useful. Heuristic evaluation relies on design principles that tend to be formulated in a context-independent manner. Thus, while it is important to ensure that a new system interface meets those general guidelines and common practices for human–computer interaction, some problems cannot be uncovered without examining device use in context (Woods et al. 1994). As suggested by
Norman (1991, 1): “Clumsiness is not really in the technology; clumsiness arises in how the technology is used relative to the context of demands and resources and agents and other tools.” Thus, heuristic evaluation is a necessary but not a sufficient first step in the evaluation of any new system.

Usability Tests and Controlled Experiments

Although heuristic evaluations can identify many human interface design problems, testing and experimentation are required to understand how people actually use the system. This is particularly true for AIS because it has the potential to change operators’ tasks substantially and in ways that cannot be predicted. In addition, relatively little research has addressed AIS interface design. Usability testing has become a standard part of the design process for many major software companies, and the safety-critical nature of AIS makes it important for usability testing to be part of AIS design as well.

Operational Test and Evaluation

Usability testing typically involves relatively few people using relatively few functions in a controlled environment. These limits mean that important design flaws may go unnoticed until the system is deployed on actual ships. For this reason, operational test and evaluation is a critical element of the design and evaluation process: it places the AIS interface in an actual operational environment to assess how it supports the operator across the full range of conditions that might be encountered. The committee did not identify many examples of operational test programs for AIS interfaces; such programs are needed.

ENSURING GOOD INTERFACE DESIGN: DESIGN, PROCESS, AND PERFORMANCE STANDARDS

Good interface design can be guided by three general types of standards: design, process, and performance.
Design standards specify the range or value of design parameters. These might take the form of very specific guidance concerning the color and size of display elements or more general guidelines, such as the 13 human factors design principles described above. Although design standards are attractive because they can specify equipment
precisely, they can also be vague and conflicting, which can lead to poor designs (Woods 1992). Adherence to design standards does not guarantee a good design.

Process standards define the required design and evaluation process but do not define any features or characteristics of the device. For AIS interface design, process standards might mandate a process that begins with a task or work domain analysis, applies human factors guidelines during design, and culminates in an operational evaluation.

Performance standards define the required performance of the human–AIS interface but specify neither the interface features nor the design process. Performance standards require a comprehensive test and evaluation process that assesses how AIS supports the operator in a variety of situations. They can be complex and costly to administer, and because it is impossible to test all possible use scenarios, they too cannot guarantee a good design.

No one type of standard will guarantee an acceptable AIS interface; a combination of design, process, and performance standards may be necessary to promote effective AIS displays and controls.

SUMMARY

Human factors considerations for AIS span a broad range that includes standards development, operational testing, training and certification, and research and development. The rapid pace of navigation technology development and the limits of traditional design standards make process and performance standards potentially useful mechanisms for addressing the human factors considerations of AIS display development. Performance standards require operational test and evaluation, and these evaluations provide information that can help refine process and design standards.
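To make the connection between performance standards and operational evaluation concrete, the following sketch shows the kind of check such an evaluation might automate. The scenarios, trial data, and 10-second criterion are hypothetical assumptions for illustration, not requirements from this report.

```python
# Illustrative sketch (hypothetical data and criterion, not from the report):
# a performance standard might require that operators using the AIS interface
# identify a target vessel within a time limit across a set of test scenarios.

CRITERION_S = 10.0  # hypothetical maximum acceptable identification time

# measured identification times (seconds) per scenario, per test participant
trial_times = {
    "open_water":     [6.2, 7.1, 5.8],
    "congested_port": [9.5, 11.2, 8.7],  # one participant exceeds the limit
    "restricted_vis": [8.0, 9.9, 9.1],
}

def scenarios_failing(times: dict, criterion: float) -> list:
    """Scenarios in which any participant exceeded the criterion."""
    return [name for name, ts in times.items() if max(ts) > criterion]

print(scenarios_failing(trial_times, CRITERION_S))  # ['congested_port']
```

A real performance evaluation would involve far more scenarios and measures; the sketch simply illustrates why broad scenario sampling matters and why exhaustive testing is impractical.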
Too frequently, system design focuses on the physical system and its operation and fails to consider training and certification programs as part of the overall system design. Training and certification can have an important effect on overall system performance and should be considered with the same care as the development of display icons and color schemes. Shipboard navigation and communication technology is changing quickly, and there are many different operating environments, each with unique requirements for the AIS interface. More important, AIS may be used
in a variety of novel ways that cannot be anticipated until mariners start using it. For these reasons, it is critical to remain flexible and not to mandate a single interface standard. At the same time, the success of AIS depends on developing interfaces that are compatible with the capabilities of the operator and the demands of the operator’s tasks. These factors argue for an ongoing process of user research, design, and evaluation that continues after the initial deployment of AIS. A combination of design, process, and performance standards is needed to ensure adequate interface design without interfering with designers’ ability to create effective AIS interfaces in the context of rapidly changing technology. These standards should evolve as mariners’ use of AIS changes over time.

Currently, the effect of AIS on the mariner is not well understood. General guidelines, such as the 13 heuristics described above, can help guide design, but research into the following issues is needed:

- Design, process, and performance standards for the human factors considerations of AIS;
- The potential benefits of multimodal interfaces to support mariners’ attention management;
- How technology development and trends in other fields, such as aviation, might influence AIS design; and
- How interface design can help address the trade-off between information requirements and the associated cost of complex shipboard displays of AIS information.

REFERENCES

Abbreviations

DATE Design, Automation, and Test in Europe
ISO International Organization for Standardization
NASA National Aeronautics and Space Administration
NTSB National Transportation Safety Board

Bainbridge, L. 1983. Ironies of Automation. Automatica, Vol. 19, No. 6, pp. 775–779.
Barrass, S., and G. Kramer. 1999. Using Sonification. Multimedia Systems, Vol. 7, No. 1, pp. 23–31.
Borgatti, S. P., M. G. Everett, and L. C. Freeman. 1992. UCINET IV Version 1.00. Analytic Technologies, Columbia.
Brehmer, B., and R. Allard. 1991. Dynamic Decision Making: The Effects of Task Complexity and Feedback Delay. In Distributed Decision Making: Cognitive Models for Cooperative Work (J. Rasmussen, B. Brehmer, and J. Leplat, eds.), John Wiley and Sons, New York, pp. 319–334.
Brewster, S. A. 1997. Using Non-Speech Sound to Overcome Information Overload. Displays, Vol. 17, Nos. 3–4, pp. 179–189.
Brewster, S. A. 1998. The Design of Sonically-Enhanced Widgets. Interacting with Computers, Vol. 11, No. 2, pp. 211–235.
Brewster, S. A., and M. G. Crease. 1999. Correcting Menu Usability Problems with Sound. Behaviour and Information Technology, Vol. 18, No. 3, pp. 165–177.
Brown, C. 1988. Human–Computer Interface Design Guidelines. Ablex Publishing, Norwood, N.J.
Carroll, J. M., ed. 1995. Scenario-Based Design: Envisioning Work and Technology in System Development. John Wiley and Sons, New York.
Cook, R. I., D. D. Woods, and M. B. Howie. 1990a. The Natural History of Introducing New Information Technology into a High-Risk Environment. Presented at the Human Factors Society 34th Annual Meeting, Orlando, Fla.
Cook, R. I., D. D. Woods, E. McColligan, and M. B. Howie. 1990b. Cognitive Consequences of “Clumsy” Automation on High Workload, High Consequence Human Performance. Presented at the Space Operations, Applications and Research Symposium, NASA Johnson Space Center.
DATE. 2002. 2002 Design, Automation, and Test in Europe Conference and Exposition. www.computer.org/cspress/CATALOG/pr01471.htm.
Dzindolet, M. T., L. G. Pierce, H. P. Beck, and L. A. Dawe. 2002. The Perceived Utility of Human and Automated Aids in a Visual Detection Task. Human Factors, Vol. 44, No. 1, pp. 79–94.
Grabowski, M. R., and S. D. Sanborn. 2001. Evaluation of Embedded Intelligent Real-Time Systems. Decision Sciences, Vol. 32, No. 1, pp. 95–123.
Guerlain, S., and P. Bullemer. 1996. User-Initiated Notification: A Concept for Aiding the Monitoring Activities of Process Control Operators. Proc., 1996 Annual Meeting, Human Factors and Ergonomics Society, Santa Monica, Calif.
Hellier, E. J., J. Edworthy, and I. Dennis. 1993. Improving Auditory Warning Design: Quantifying and Predicting the Effects of Different Warning Parameters on Perceived Urgency. Human Factors, Vol. 35, No. 4, pp. 693–706.
Helmreich, R. L., and H. C. Foushee. 1993. Why Crew Resource Management? Empirical and Theoretical Bases of Human Factors Training in Aviation. In Cockpit Resource Management (E. L. Wiener, B. G. Kanki, and R. L. Helmreich, eds.), Academic Press, San Diego, Calif., pp. 3–45.
Hoffer, J. A., J. F. George, and J. S. Valacich. 2002. Modern Systems Analysis and Design, 3rd ed. Prentice Hall, Inc., Upper Saddle River, N.J.
Hutchins, E. 1990. The Technology of Team Navigation. In Intellectual Teamwork: Social and Technical Bases of Cooperative Work (J. Galegher, R. Kraut, and C. Egido, eds.), Lawrence Erlbaum Associates, Hillsdale, N.J., pp. 191–220.
Hutchins, E. 1995a. Cognition in the Wild. MIT Press, Cambridge, Mass.
Hutchins, E. 1995b. How a Cockpit Remembers Its Speeds. Cognitive Science, Vol. 19, No. 3, pp. 265–288.
ISO. 1984. Development and Principles for Application of Public Information Symbols. ISO/TR 7239. Geneva, Switzerland.
Johnson, T. L., and K. L. Shapiro. 1989. Attention to Auditory and Peripheral Visual Stimuli: Effects of Arousal and Predictability. Acta Psychologica, Vol. 72, pp. 233–245.
Johnson, W., and P. Prabhu. 1996. FAA/AAM Human Factors in Aviation Maintenance and Inspection Research Phase VI Overview. hfskyway.faa.gov/HFAMI/lpext.dll/FAA%20Research%201989%20-%202002/Infobase/7ffd?fn=main-j-hfami.htm&f=templates.
Kerns, K. 1991. Data-Link Communication Between Controllers and Pilots: A Review and Synthesis of the Simulation Literature. International Journal of Aviation Psychology, Vol. 1, No. 3, pp. 181–204.
Kirwan, B., and L. K. Ainsworth, eds. 1992. A Guide to Task Analysis. Taylor and Francis, Washington, D.C.
Kobryn, C. 1999. UML 2001: A Standardization Odyssey. Communications of the ACM, Oct. cgi.omg.org/news/pr99/UML_2001_CACM_Oct99_p29-Kobryn.pdf.
Kusiak, A. 1999. Engineering Design: Products, Processes, and Systems. Academic Press, New York.
Laxar, K., and G. M. Olsen. 1978. Human Information Processing in Navigation Displays. Journal of Applied Psychology, Vol. 63, pp. 734–740.
Lee, J. D., and N. Moray. 1992. Trust, Control Strategies and Allocation of Function in Human–Machine Systems. Ergonomics, Vol. 35, No. 10, pp. 1243–1270.
Lee, J. D., and N. Moray. 1994. Trust, Self-Confidence, and Operators’ Adaptation to Automation. International Journal of Human–Computer Studies, Vol. 40, pp. 153–184.
Lee, J. D., and T. F. Sanquist. 1993. A Systematic Evaluation of Technological Innovation: A Case Study of Ship Navigation. IEEE International Conference on Systems, Man, and Cybernetics, pp. 102–108.
Lee, J. D., and T. F. Sanquist. 1994. Classes of Maritime Automation and Implications for System Design and Training. Presented at First Automation Technology and Human Performance Conference, Washington, D.C.
Lee, J. D., and T. F. Sanquist. 1996. Maritime Automation. In Automation and Human Performance (R. Parasuraman and M. Mouloua, eds.), Lawrence Erlbaum Associates, Mahwah, N.J., pp. 365–384.
Lee, J. D., and T. F. Sanquist. 2000. Augmenting the Operator Function Model with Cognitive Operations: Assessing the Cognitive Demands of Technological Innovation in Ship Navigation. IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, Vol. 30, No. 3, pp. 273–285.
Lee, J. D., and K. A. See. In press. Trust in Technology: Designing for Appropriate Reliance. Human Factors.
Lin, L., K. J. Vicente, and D. J. Doyle. 2001. Patient Safety, Potential Adverse Drug Events, and Medical Device Design: A Human Factors Engineering Approach. Journal of Biomedical Informatics, Vol. 34, No. 4, pp. 274–284.
Liu, Y. L., R. Fuld, and C. D. Wickens. 1993. Monitoring Behavior in Manual and Automated Scheduling Systems. International Journal of Man–Machine Studies, Vol. 39, No. 6, pp. 1015–1029.
Luce, R. D., and A. D. Perry. 1949. A Method of Matrix Analysis of Group Structure. Psychometrika, Vol. 14, pp. 95–116.
Maurino, D. C., N. Johnston, R. B. Lee, and J. Reason. 1998. Beyond Aviation Human Factors. Ashgate Publishing, London.
Mayhew, D. J. 1992. Principles and Guidelines in Software User Interface Design. Prentice Hall, Englewood Cliffs, N.J.
Molloy, R., and R. Parasuraman. 1996. Monitoring an Automated System for a Single Failure: Vigilance and Task Complexity Effects. Human Factors, Vol. 38, No. 2, pp. 311–322.
Moray, N. 2003. Monitoring, Complacency, Scepticism and Eutactic Behaviour. International Journal of Industrial Ergonomics, Vol. 31, No. 3, pp. 175–178.
NASA. 2002. Maintenance Human Factors. human-factors.arc.nasa.gov/projects/ihs/maintenance.html.
Nielsen, J., and J. Levy. 1994. Measuring Usability: Preference vs. Performance. Communications of the ACM, Vol. 37, No. 4, pp. 66–75.
Nielsen, J., and R. Molich. 1990. Heuristic Evaluation of User Interfaces. Proceedings of the ACM/SIGCHI’90 Conference, Seattle, Wash., April 1–5, pp. 249–256.
Norman, D. A. 1988. The Psychology of Everyday Things. Basic Books, New York.
Norman, D. A. 1990. The “Problem” with Automation: Inappropriate Feedback and Interaction, Not “Overautomation.” In Human Factors in Hazardous Situations (D. E. Broadbent, A. Baddeley, and J. J. Reason, eds.), Clarendon Press, Oxford, England, pp. 569–576.
Norman, D. A. 1991. Cognitive Science in the Cockpit. CSERIAC Gateway, Vol. 11, No. 2, pp. 1–6.
NTSB. 1990. Marine Accident Report: Grounding of the U.S. Tankship Exxon Valdez on Bligh Reef, Prince William Sound, Valdez, Alaska, March 24, 1989. NTSB/MAR90/04. Washington, D.C.
NTSB. 1997. Marine Accident Report: Grounding of the Panamanian Passenger Ship Royal Majesty on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995. NTSB/MAR97/01. Washington, D.C.
Pang, A. T., C. M. Wittenbrink, and S. K. Lodha. 1997. Approaches to Uncertainty Visualization. Visual Computer, Vol. 13, No. 8, pp. 370–390.
Parasuraman, R., R. Molloy, and I. Singh. 1993. Performance Consequences of Automation-Induced “Complacency.” International Journal of Aviation Psychology, Vol. 3, No. 1, pp. 1–23.
Parasuraman, R., and V. Riley. 1997. Humans and Automation: Use, Misuse, Disuse, Abuse. Human Factors, Vol. 39, No. 2, pp. 230–253.
Perrow, C. 1984. Normal Accidents. Basic Books, New York.
Rasmussen, J. 1986. Information Processing and Human–Machine Interaction: An Approach to Cognitive Engineering. North Holland, New York.
Reason, J., and M. Maddox. 1998. Human Error. In Human Factors Guide for Aviation Maintenance, Version 3.0 (M. Maddox, ed.), Galaxy Scientific Corporation, Atlanta, Ga. hfskyway.faa.gov/HFAMI/lpext.dll/FAA%20Research%201989%20-%202002/Infobase/1a4?fn=main-j-hfami.htm&f=templates.
Rossano, M. J., and D. H. Warren. 1989. Misaligned Maps Lead to Predictable Errors. Perception, Vol. 18, pp. 215–229.
Sarter, N. B., and D. D. Woods. 1994. Decomposing Automation: Autonomy, Authority, Observability and Perceived Animacy. In Human Performance in Automated Systems: Current Research and Trends (M. Mouloua and R. Parasuraman, eds.), Lawrence Erlbaum Associates, Hillsdale, N.J., pp. 22–27.
Sarter, N. B., and D. D. Woods. 1995. How in the World Did We Ever Get in That Mode? Mode Error and Awareness in Supervisory Control. Human Factors, Vol. 37, No. 1, pp. 5–19.
Schuffel, J. J., P. A. Boer, and L. van Breda. 1989. The Ship’s Wheelhouse of the Nineties: The Navigation Performance and Mental Workload of the Officer of the Watch. Journal of Navigation, Vol. 42, No. 1, pp. 60–72.
Segal, L. D. 1994. Actions Speak Louder Than Words: How Pilots Use Nonverbal Information for Crew Communications. Proc., 38th Annual Meeting, Vol. 1, Human Factors and Ergonomics Society, Santa Monica, Calif., pp. 21–25.
Sheridan, T. B. 1975. Considerations in Modeling the Human Supervisory Controller. Proc., IFAC 6th World Congress, Vol. 40, No. 2, pp. 1–6.
Shneiderman, B. 1998. Designing the User Interface: Strategies for Effective Human–Computer Interaction, 3rd ed. Addison-Wesley, Reading, Mass.
Sklar, A. E., and N. B. Sarter. 1999. Good Vibrations: Tactile Feedback in Support of Attention Allocation and Human–Automation Coordination in Event-Driven Domains. Human Factors, Vol. 41, No. 4, pp. 543–552.
Smith, S., and J. Mosier. 1986. Guidelines for Designing User Interface Software. ESD-TR-86-278. U.S. Department of Defense; Office of Management and Budget, Washington, D.C.
Sparaco, P. 1995. Airbus Seeks to Keep Pilot, New Technology in Harmony. Aviation Week and Space Technology, Jan. 30, pp. 62–63.
Spence, C., and J. Driver. 1997. Cross-Modal Links in Attention Between Audition, Vision, and Touch: Implications for Interface Design. International Journal of Cognitive Ergonomics, Vol. 1, No. 4, pp. 351–373.
Vicente, K. J. 1992. Memory Recall in a Process Control System: A Measure of Expertise and Display Effectiveness. Memory and Cognition, Vol. 20, No. 4, pp. 356–373.
Vicente, K. J. 1999. Cognitive Work Analysis: Towards Safe, Productive, and Healthy Computer-Based Work. Lawrence Erlbaum Associates, Mahwah, N.J.
Vicente, K. J., and J. Rasmussen. 1992. Ecological Interface Design: Theoretical Foundations. IEEE Transactions on Systems, Man, and Cybernetics, Vol. 22, No. 4, pp. 589–606.
Wasserman, S., and K. Faust. 1994. Social Network Analysis: Methods and Applications. Cambridge University Press, New York.
Wickens, C. D., S. E. Gordon, and Y. Liu. 1997. An Introduction to Human Factors Engineering. Longman, New York.
Wiener, E. L. 1989. Human Factors of Advanced Technology (“Glass Cockpit”) Transport Aircraft. Contractor Report 177528. NASA Ames Research Center, Moffett Field, Calif.
Wiener, E. L., and R. E. Curry. 1980. Flight-Deck Automation: Promises and Problems. Ergonomics, Vol. 23, No. 10.
Woods, D. D. 1991. Nosocomial Automation: Technology-Induced Complexity and Human Performance. Proc., International Conference on Systems, Man, and Cybernetics, pp. 1279–1282.
Woods, D. D. 1992. Are Guidelines on Human–Computer Interaction a Faustian Bargain? Computer Systems Technical Group Bulletin, Vol. 19, No. 2, Aug.
Woods, D. D., L. J. Johannesen, R. I. Cook, and N. B. Sarter. 1994. Behind Human Error: Cognitive Systems, Computers, and Hindsight. CSERIAC, Dayton, Ohio.
Woods, D. D., E. S. Patterson, J. Corban, and J. C. Watts. 1996. Bridging the Gap Between User-Centered Intentions and Actual Design Practice. Proc., 40th Annual Meeting, Vol. 2, Human Factors and Ergonomics Society, Santa Monica, Calif., pp. 967–971.
Woods, D. D., S. S. Potter, L. Johannesen, and M. Holloway. 1991. Human Interaction with Intelligent Systems: Trends, Problems, New Directions. CSEL Report 1991-001. Ohio State University, Columbus.
Zuboff, S. 1988. In the Age of the Smart Machine: The Future of Work and Power. Basic Books, New York.