Part II
HUMAN FACTORS AND AUTOMATION ISSUES




5
Cognitive Task Analysis of Air Traffic Control

So far we have focused on describing the tasks that air traffic controllers must perform in managing traffic, the physical facilities in which they do so, and the means by which controllers are selected and trained for those tasks. In this chapter, we describe the controller's task from a somewhat different, more psychological perspective, identifying the cognitive and information-processing steps demanded of the controller and, by extension, the sources of vulnerability in the controller's performance. Several task analyses of air traffic control have been carried out (e.g., Hopkin, 1988a; Ammerman et al., 1987; Murphy, 1989; Stager and Hameluck, 1990; Harwood et al., 1991; Seamster et al., 1993; Endsley, 1994). Our analysis draws heavily on the work completed by these investigators, particularly by Ammerman and colleagues. It also attempts to place their analyses in a more cognitive framework, by emphasizing the relationship between the tasks performed and the different cognitive or information-processing mechanisms employed by the controller (Wickens, 1992).

We begin by presenting a general cognitive model of the controller's task. We then describe the ways in which human cognitive processes are both an asset in air traffic control and vulnerable to environmental and system variables, and we discuss factors that moderate these vulnerabilities. Such an analysis is as relevant for training as it is for design. Our treatment in this chapter is closely related to the discussion in Chapter 6, which specifically links these cognitive elements to the concept of mental workload.

COGNITIVE MODEL OF THE CONTROLLER'S TASK

Controllers are generally successful and skilled in the performance of their tasks. Our proposed model of the cognitive processes by which these tasks are accomplished is shown in Figure 5.1. It is meant to be generic enough in form that it can accommodate equally the characteristics of tower, TRACON, and en route controllers. At a very global level of detail, we see the controller's task as one in which actions (at the right) are driven by events (at the left). The figure depicts five cognitive stages that intervene between events and actions: selective attention, perception, situation awareness, planning and decision making, and action execution.

FIGURE 5.1 Cognitive model of the controller's task.

To elaborate, the actions performed by the controller (such as communications and manual manipulations) are the result of following well-learned procedures and strategic plans, which are continuously formulated and updated on the basis of current awareness of the situation in the airspace and, in particular, the projection of that situation into the future. This awareness, referred to as the big picture (Hopkin, 1995), is based in turn on external events involving aircraft, weather, and equipment, as these events are selected for processing and then perceived by the controller via radar displays, radio messages, paper printouts, and (occasionally) telephone calls.

The five stages do not constitute a rigid sequence. Steps may be skipped; for example, planning and decision making may be unnecessary if the appropriate action is known on the basis of past experience. The processes can be iterative; for example, perception is the basis for situation awareness, but situation awareness can guide selective attention and influence subsequent perception. Finally, each of the five processes draws on knowledge stored in long-term memory, and each of them may modify or add to that knowledge.

External Events

External events that call for controller actions occur primarily in the airspace outside the tower or en route center. These include the filing of flight plans, pilot requests for clearance, changes in aircraft trajectories, handoffs from other controllers, and changes in weather. Other important events, however, may occasionally occur at an airport or in an air traffic control facility itself, such as blocked runways and instrument or power failures. The immediate manifestation of most of these events is the presentation of new information to the controller.

The information presentations that are spawned by the events are easily identified and categorized through task analysis. These categories include visual changes on the primary radar display, information contained on the flight strips, auditory input from voice communications by pilots and other controllers, visual and auditory alerts provided by automated handoffs or by projected or real loss of separation, as well as other input regarding weather conditions and runway status. Information delivered on all of these channels must first be selected. The skilled controller knows where to look (or what to listen to) in order to gain critical information at the time that the information is both needed and available (Moray, 1986). Breakdowns in this selective attention process may occur, however, if an event occurs in a visual channel where it is not expected or if the display space is so cluttered that the event cannot easily be seen.

Not only the location but also the very nature of the events may be characterized by the extent to which they are expected and anticipated. Expectations may be based on specific past events (a plane is expected to continue on a given heading; a pilot is expected to read back the clearance provided and to change the aircraft's speed, altitude, or heading accordingly; Monan, 1986), or they can be based on general scripts of how the air traffic control process operates (Schank and Abelson, 1977): for example, a pilot newly arriving in the sector is expected to initiate communications and exchange information according to a well-established protocol. In either case, perceptual processes are influenced by long-term knowledge. Psychologists speak of top-down processing in describing the influence that expectations and knowledge have on perception. The extent to which controllers, like experts in all fields, easily perceive what is expected cannot be overstated. Conversely, however, the vulnerability of the controller's perception of the unexpected is a fact of life.

Working Memory

Once information is perceived, it may be retained in working memory. The human working memory system represents the "workbench" at which most of the conscious cognitive activity takes place (Baddeley, 1986). Working memory may temporarily retain information that is either verbal or spatial. Verbal working memory is the "rehearsable" memory for sounds, typically digits and words, and is the memory system that the controller uses when receiving a request or readback from the pilot (Morrow et al., 1993). Hence, working memory represents a critical component of communications. It is also the mechanism used when the controller, after reading a data block or flight strip, must temporarily retain the written information prior to translating it into a spatial representation. Spatial working memory is used to maintain an analog representation of the airspace (Logie, 1995). The contents of spatial working memory replicate to some extent the controller's radar display, but they also incorporate, in three-dimensional spatial form, the critical altitude component that is represented only digitally on the radar display.

Information in working memory is further interpreted on the basis of knowledge stored in long-term memory. Information that matches stored "schemes" may result in the identification of familiar situations, predictions of future events, and retrieval of associated responses (e.g., weather problems at the airport require delays, which can be achieved by setting up a holding pattern). More effortful reasoning or computation may be required to identify other significant relationships and predictions (e.g., if two aircraft continue on their present trajectories, they will conflict). These processes of comprehension and prediction provide a mental picture of the situation confronting the controller (Seamster et al., 1993) and underlie the controller's situation awareness.

Situation awareness has been defined by Endsley as "the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future" (Endsley, 1995:36). Situation awareness includes, ideally, an understanding of the current and future trajectories of all aircraft within the sector; some representation of traffic about to flow into the sector; awareness of other relevant but possibly changing conditions, such as weather and equipment status; and an understanding of how all of these factors affect the achievement of air traffic control goals and constraints (such as permissible separations between aircraft, avoidance of terrain and restricted airspace, etc.). In future systems, it may also include awareness of the current operating modes of automated equipment (Sarter and Woods, 1995) and possibly of the momentary distribution of responsibility for traffic separation between ground and air in a free-flight regime.
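
The projection step just described (determining whether two aircraft that continue on their present trajectories will lose separation) can be made concrete with a short sketch. The code below is purely illustrative: it assumes straight-line flight in a flat local coordinate frame, uses hypothetical separation minima, and is not the algorithm of any operational conflict alert function.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """Simplified aircraft state in a flat local frame (illustrative only)."""
    x_nm: float      # east position, nautical miles
    y_nm: float      # north position, nautical miles
    vx_kt: float     # east velocity, knots
    vy_kt: float     # north velocity, knots
    alt_ft: float    # altitude, feet
    vs_fpm: float    # vertical speed, feet per minute

def project(t: Track, minutes: float) -> tuple[float, float, float]:
    """Project a track forward assuming constant velocity and vertical speed."""
    h = minutes / 60.0
    return (t.x_nm + t.vx_kt * h,
            t.y_nm + t.vy_kt * h,
            t.alt_ft + t.vs_fpm * minutes)

def minutes_to_loss_of_separation(a: Track, b: Track,
                                  lateral_nm: float = 5.0,
                                  vertical_ft: float = 1000.0,
                                  horizon_min: float = 20.0,
                                  step_min: float = 0.25) -> float | None:
    """Return minutes until the projected trajectories first violate the
    assumed separation minima within the look-ahead horizon, or None if
    they stay separated."""
    t = 0.0
    while t <= horizon_min:
        ax, ay, az = project(a, t)
        bx, by, bz = project(b, t)
        lateral = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
        if lateral < lateral_nm and abs(az - bz) < vertical_ft:
            return t
        t += step_min
    return None
```

A predictive display of the kind discussed later in the chapter would run such a check continuously and present the result (for example, the time remaining until loss of separation) as a salient cue, rather than leaving the projection entirely to the controller's spatial working memory.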

Long-Term Memory

The processes involved in comprehending the perceived information draw heavily on knowledge structures in long-term memory. Characterizing less dynamic aspects of the controller's environment, such structures include knowledge of the airspace, including geography, terrain, air routes, fixes, and air traffic control sector shapes around a particular facility (Redding et al., 1992); knowledge of radar and equipment characteristics and capabilities; knowledge of weather configurations; and knowledge of different aircraft performance and maneuvering capabilities. Experience in a domain often leads to long-term memory structures that permit more efficient and/or insightful encodings or "chunking" of multiple events (Chi et al., 1981). In the air traffic control domain, for example, experienced controllers may directly identify important types of events involving multiple aircraft (such as a conflict) rather than focusing on individual aircraft (Seamster et al., 1993).

On the basis of situation awareness, the controller must select an action. Typical actions include maintaining separation and coordinating traffic flow by requesting changes in the heading, altitude, or speed of one or more aircraft. In most familiar situations, the appropriate action may be immediately retrieved from an extensive repertoire of well-learned and well-documented procedures held in long-term memory. In other cases, determining the appropriate action may require greater cognitive effort (such as ordering heading changes to avoid a potential conflict situation). In an unfamiliar situation, the controller may verify the adequacy of a potential action by mentally simulating its consequences (Klein and Crandall, 1995), for example, trying to visualize them in spatial working memory.

The decision-making and planning processes also draw heavily on knowledge in long-term memory. Relevant long-term knowledge includes formalized procedures (e.g., for creating the appropriate separation) acquired through training and documented in manuals and texts; goals and constraints, such as the required degrees of separation under different conditions of weather and aircraft equipment; informal strategies or heuristics picked up through experience and the observation of others (Hopkin, 1988b); and knowledge of the risks and future uncertainties associated with particular situations or types of actions (Wickens and Flach, 1988). The product of the decision-making process may be an immediate action, a strategic plan for action at a later time, or a series of actions that must occur over a period of time (for example, as aircraft are scheduled to arrive at a particular point in the airspace in a particular sequence). Generally, the more strategic plans take place within the en route centers, particularly in the traffic management units therein.

The successful execution of planned actions at a later time depends on the reliability of prospective memory (Harris and Wilkins, 1982), that is, the ability of the controller to remember to take a particular action at a point in the future. Prospective memory is also required to confirm the effects of controller actions, such as checking the altitude of an aircraft that had previously been directed to a new flight level to determine that the goal has been attained. Such memory is aided by many reminders in the environment, in particular the annotation of flight strips.
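
Flight strip annotations act, in effect, as an external queue of pending checks. The sketch below shows one way a display could provide comparable support for prospective memory: a minimal list of reminders that surface when their check time arrives. It is a hypothetical illustration (the call sign and timing are invented), not a description of any fielded system.

```python
import heapq
import time

class ReminderQueue:
    """Minimal external aid for prospective memory: pending checks ordered by
    the time at which the controller (or the display) should revisit them.
    Illustrative sketch only."""

    def __init__(self) -> None:
        self._heap: list[tuple[float, str]] = []

    def add(self, due_in_s: float, description: str) -> None:
        """Schedule a follow-up check, e.g., after issuing a clearance."""
        heapq.heappush(self._heap, (time.monotonic() + due_in_s, description))

    def due(self) -> list[str]:
        """Pop and return every reminder whose check time has arrived."""
        now = time.monotonic()
        ready = []
        while self._heap and self._heap[0][0] <= now:
            ready.append(heapq.heappop(self._heap)[1])
        return ready

# Hypothetical usage: after clearing UAL123 to FL350, schedule a check that
# the climb has actually been completed.
reminders = ReminderQueue()
reminders.add(120.0, "Verify UAL123 level at FL350")
for note in reminders.due():   # polled periodically by the display loop
    print("REMINDER:", note)
```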

Another kind of long-term knowledge, involved in virtually every stage of cognition, consists of the strategies and heuristics that are developed over time for efficiently managing cognitive processes (Gopher, 1993; Huey and Wickens, 1993; Seamster et al., 1993). These strategies may improve the efficiency and validity of perception, situation awareness, planning, and action execution. For example, strategies for allocating perceptual attention among external events may help controllers handle situations with a high event rate (Stein, 1993; Gopher, 1993). Strategies for prioritizing tasks may also help when workload is high (Gopher et al., 1994). Strategies for remembering important information (for example, by rehearsing or using efficient encodings for aircraft identifications and trajectories, flight plans, and clearances) may help controllers construct a coherent picture of the situation and carry out planned actions. Other strategies apply to novel or complex situations and include judging the time available before some action must be taken (for example, to avoid a conflict) and using the available time effectively to verify the adequacy of situation understanding and the proposed plan (Cohen et al., 1996b; Raby and Wickens, 1994; Orasanu, 1993; Fischer and Orasanu, 1993). For example, controllers may quickly review their mental model or plan for completeness (have all aircraft been considered?), reliability (can the aircraft make the requested maneuver in time?), and consistency (are all the aircraft carrying out the requested procedures?).

Attentional Resources

In addition to the inputs to the controller's task provided by external events and by long-term knowledge, another input is the controller's own cognitive effort, or attentional resources, which are allocated to sustain task performance. As discussed in more detail in the following chapter on controller workload, events vary on four related dimensions that influence the resources the controller must allocate to deal with them. The first two dimensions are (1) the frequency with which the events occur in time and (2) the complexity of the individual events. Both a high event rate and high event complexity can increase cognitive workload. The third and fourth dimensions can mitigate these demands: events also vary in the extent to which they are (3) expected and (4) familiar or routine. Extensive experience with high-workload situations, or in handling a particular type of complex event, reduces the cognitive resources required to deal with the event. Even in the absence of such familiarity, if high-density or highly complex events are anticipated, advance preparation can mitigate the demands on resources if the events actually occur.

These dimensions interact in a variety of ways. The pilot who reads back a clearance incorrectly generates a simple but unexpected event. To notice such unexpected events, the controller must continuously and carefully monitor all channels of information to assess whether there is any change or conflict with expectations. This monitoring itself requires some cognitive effort. Once the problem is noticed, however, it can be handled with little or no cognitive effort, as long as the controller is reasonably experienced and the event rate is reasonably low; the formulation of intentions and actions remains fairly routine. Unexpected events that are somewhat more complex (e.g., the announcement of an unanticipated newly arriving aircraft, a request for diversion) require somewhat more cognitive effort to be incorporated into the controller's mental picture of the airspace, but if the events are familiar, this effort is not prolonged. As Rasmussen (1986; Rasmussen et al., 1995) describes it, these events call for rule-based behavior: with minimal problem-solving requirements, the controller can call up internally memorized rules to deal with routine situations.

If events are relatively unfamiliar and complex, then major cognitive effort is required. These events often trigger what Rasmussen describes as knowledge-based behavior, the need for creative problem solving. When such events are also unexpected (e.g., an aircraft is unable to taxi off an active runway because of a malfunction, or an aircraft mistakenly executes an inappropriate maneuver in a crowded airspace), knowledge-based behavior must be initiated on the spot, often under severe time pressure.

In contrast, when complex events are expected (e.g., a pilot in a crowded airspace requests a rerouting around weather), the required knowledge-based processing may be carried out at least in part in advance of the event itself, by preparing possible solutions ahead of time. As a result, less cognitive effort is required to handle the event when it does occur.
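
Rasmussen's distinction can be caricatured in software terms: events that match a stored pattern are dispatched to a prepared response (rule-based), while unmatched events fall through to a slower deliberative path (knowledge-based). The sketch below is only an analogy for the cognitive claim in the text; the event labels and responses are invented. In these terms, expertise enlarges the rule table, and anticipating a complex event amounts to working out its solution before it arrives.

```python
# Toy analogy for rule-based versus knowledge-based processing.
# Event labels and responses are invented for illustration.

RULE_TABLE = {
    "routine_handoff": "accept handoff and issue the standard crossing altitude",
    "readback_error": "repeat the clearance and request a correct readback",
    "familiar_weather_reroute": "apply the stored reroute around the cell",
}

def knowledge_based_solution(event: str) -> str:
    """Stand-in for effortful, creative problem solving: slow, and the part
    of the process most sensitive to time pressure and workload."""
    return f"construct a novel plan for '{event}' from first principles"

def handle(event: str, prepared: dict[str, str] | None = None) -> str:
    prepared = prepared or {}
    if event in RULE_TABLE:        # rule-based: pattern -> memorized response
        return RULE_TABLE[event]
    if event in prepared:          # anticipated: solution worked out in advance
        return prepared[event]
    return knowledge_based_solution(event)   # knowledge-based, on the spot

# An expected complex event can be prepared for ahead of time, so handling it
# later costs little extra effort; an unanticipated one cannot.
prepared_plans = {"forecast_thunderstorm_reroute": "use the pre-planned routing to the south"}
print(handle("readback_error"))
print(handle("forecast_thunderstorm_reroute", prepared_plans))
print(handle("aircraft_stuck_on_active_runway"))
```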

It is important to recognize that the extent to which events of varying levels of complexity generate rule-based versus knowledge-based behavior depends greatly on the skill level and experience of the controller. As in any profession, a relatively complex event may be handled by rule-based behavior by the expert but may require knowledge-based problem solving by the novice. It is also important to keep in mind that many aspects of expertise are facility and even sector specific (Redding et al., 1992): the same event that triggers rule-based behavior in a facility (or sector) at which a controller has worked for years may trigger knowledge-based behavior in a facility or sector new to that controller, where local procedures may differ, as may the nature of the equipment, the sector structure, the terrain, the traffic mix, and the air routes. The heavy impact of facility-specific learning in air traffic control has important implications for the difficulty of generic training (see Chapter 3).

As we discuss in the following chapter, it is assumed that controllers attempt to manage task demands at a reasonably constant level of cognitive effort (Hart and Wickens, 1990), by drawing on strategies stored in long-term memory. When task demands become excessive because of combinations of high event rates and complexity, controllers attempt to maintain adequate performance without an excessive expenditure of effort by amending the strategies by which they deal with aircraft (Sperandio, 1976), as well as by changing their criteria for dealing individually with pilot requests. They may also shed tasks of lower priority or offload tasks either to the pilot or to other controllers who may be less busy. Hence, task and workload management (dealt with in Chapter 6) is clearly linked to team issues (dealt with in Chapter 7).

COGNITIVE VULNERABILITIES IN THE CONTROLLER'S TASK

The cognitive task analysis has revealed a diverse array of cognitive skills that the controller must marshal to handle the complex, dynamic problems of managing multiple aircraft in an uncertain environment. For some of these skills, the human expert is uniquely qualified and so far has well exceeded the capabilities of even the most sophisticated forms of artificial intelligence. In the most general terms, we can characterize these strengths in terms of the controller's adaptability and flexibility in carrying out knowledge-based behavior. Although most control involves fairly routine following of procedures, the skilled controller is keenly attuned to subtle cues that may predict future unusual events and possesses in long-term memory a wide variety of adaptive strategies and plans to address these events if they do occur. And as the closed-loop and iterative nature of Figure 5.1 makes clear, the skilled controller is also able to monitor the implementation of the plan and to flexibly modify it in creative but adaptive ways should initially formulated plans appear to be unsuccessful. It is apparent that such adaptive flexibility (a) is not easily taught by formal training procedures but must be learned on the job and (b) becomes progressively more important as traffic becomes more complex and less predictable or routinized in its behavior.

Although skilled controllers may thus be characterized by their cognitive strengths, it is also the case that human information processing is subject to several forms of vulnerability, all of which have implications for controller performance. In this section we outline seven major categories of vulnerabilities, each inviting performance degradation. Each of these vulnerabilities may in turn be exacerbated or attenuated by certain design or environmental factors, which are outlined in the following section on moderating factors. Each vulnerability in the following discussion relates to a particular aspect of the controller model shown in Figure 5.1.

Visual Sampling and Selective Attention

Because much of human visual search and pattern recognition is serial, with event-filled displays the controller is vulnerable to missing critical events through breakdowns in the serial visual scanning process (Stein, 1993). This is particularly true to the extent that many of these events must be inferred from signals that are not particularly salient to the untrained eye (e.g., a future conflict, a change in the altitude field of a data block, a pilot's failure to implement a requested course alteration) rather than perceived from salient ones (e.g., a blinking data tag or an automated alert for loss of separation). The quality of visual sampling is further inhibited by the amount of information or clutter in the visual environment, whether this is the view of a radar scope or the view from the tower of a busy taxi and ramp area. More visual elements to be scanned increase the likelihood that critical ones will not be attended to (Moray, 1986). And if these elements are similar, the likelihood is increased that they will be confused. Yet in the case of the radar display, what is unwanted clutter at one instant may be valued information at the next, a dilemma that is not easily resolved.

Expectation-Driven Processing

Expectations influence perceptions. We see (or hear) what we expect to perceive, and this tendency allows the perception of expected and routine events to proceed rapidly and with minimal effort. Yet such expectation can be a source of vulnerability when events occur that are not expected, especially when these events are not perceptually salient or occur under conditions of high workload. Such perceptual errors, for example, lie at the root of the confirmatory "hear back" problem (Hawkins, 1987; Monan, 1986), when the controller incorrectly

Such circumstances may be envisioned with possible implementation of free flight.

Communications

Voice communication involves a two-way process of sending (speaking) and receiving (listening) information, and its vulnerabilities in the national airspace system have been well documented by Nagel (1988), who notes that the largest single cause of air traffic control incidents relates to breakdowns in information transfer (see also Kanki and Prinzo, 1995). Its successes and failures both depend on factors described previously: expectation-driven processing and working memory, whose limitations may hinder the understanding of long communications strings (Burke-Cohen, 1995) that must be retained for even a few seconds before being translated into action, or that contain unfamiliar material (e.g., strange names, nonnative language).

Communications effectiveness also depends on shared assumptions: a shared mental model or shared situation awareness between speaker and listener (Salas et al., 1995). For example, a pilot unaware of other traffic that influences a controller's decision to issue inconvenient instructions may be more resistant to following them in a timely fashion. If a controller is unaware of a pilot's momentary high level of workload or of the aircraft's current situation with regard to nearby weather, the controller may issue instructions with which compliance is more difficult. If one controller is unaware of the high workload (or low skill level) of a controller in an adjacent sector, the former may be more likely to take an action that directly raises the workload of the latter. It is clear that these communications issues directly affect the ability of controllers and pilots to function effectively as teams, and we discuss this further in Chapter 7.

Long-Term Memory

As we have seen, long-term memory is also relevant for maintaining situation awareness. Vulnerabilities in long-term memory are manifest in four kinds of activities. First, there may be breakdowns in what we call "transient knowledge" in long-term memory, which consists of immediate memory for events in the current situation and prospective memory for actions that the controller plans to perform within the next few minutes. These lapses or breakdowns occur when controllers fail to recall developing aspects of the current situation of which they were at one time aware. Self-generated activities, like writing an amendment on a flight strip, are less likely to lead to such forgetting than activities initiated by another (human or computer) agent, such as an automated control system updating the electronic strip (Hopkin, 1988a; Slamecka and Graf, 1978; Vortac and Gettys, 1990).

High levels of workload may also lead to breakdowns of prospective memory, causing controllers to forget to check on the status of certain aircraft. A drastic breakdown of this sort was partially responsible for the collision between two aircraft on a runway at Los Angeles International Airport in 1991 (National Transportation Safety Board, 1992) and again at St. Louis' Lambert Field in 1994 (Steenblik, 1996).

Second, breakdowns in enduring knowledge in long-term memory are in part a result of shortcomings in training: forgetting of procedures, regulations, and the like. But they may also reflect an inadequate mental model of the fixed features of the immediate airspace (flight routes, fixes, terrain, etc.). Given that the mental model of the expert controller depends heavily on precise knowledge of these spatial features, when the controller transfers to work in a different region (i.e., a different sector within a facility or a different facility), considerable time will be required to attain proficiency in the new area.

Third, breakdowns in procedural knowledge may result when different operating procedures or equipment are introduced. Negative transfer may then cause old habits to persist in the new situation (Singley and Anderson, 1989; Holding, 1987).

Fourth, breakdowns may occur because of inadequate knowledge or understanding of aircraft performance limitations and capabilities, inadequate strategies for dealing with future conflicts, and inadequate strategies for optimizing the deployment of the controller's attentional resources (i.e., knowing which aircraft need the most attention now and which can be deferred).

Judgment and Decision Making

Decision making may become difficult in novel or unusual situations in ways that have little to do with workload demands (that is, even when the number and complexity of events is limited). For example, in situation assessment the available data (e.g., regarding expected future traffic flow into the sector or expected changes in weather) may on occasion be incomplete, conflict with other evidence, or be unreliable and ambiguous. In planning and decision making, there may appear to be no feasible option that reliably achieves all of the controller's goals (e.g., maintaining separation while avoiding prolonged holding).

Still, the nature of most air traffic control decision making is relatively routine and enables controllers to select appropriate procedures to apply once they correctly identify and classify the existing situation. These types of decisions have been studied in many real-world situations by Klein et al. (1993). This research suggests that experienced decision makers learn a large set of patterns and associated responses in a domain. Rather than comparing options in terms of their predicted outcomes, proficient decision makers are more likely to recognize familiar types of situations and retrieve an appropriate response. Pattern recognition by itself, however, does not account for how decision makers handle uncertain or unfamiliar situations. Recent research (Cohen et al., in press; Pennington and Hastie, 1993) suggests that, in these situations, decision makers adopt strategies that build on but go beyond recognitional abilities.

Such strategies attempt to identify and correct the shortcomings in recognitional responses to the situation. For example, decision makers identify and try to fill gaps in a situation model; in addition, they may test the model by identifying predictions and collecting additional data. Decision makers may elaborate the model by means of assumptions to fill gaps when data are not available or to explain data that appear to conflict with the model. Finally, they evaluate the plausibility of the assumptions required by the elaborated model, and, if the assumptions seem implausible, they may explore alternative elaborations. Strategies of this kind enable decision makers to handle uncertainty and competing goals without the formal apparatus of probabilities and utilities based on normative theory. Instead of manipulating abstract symbols, they focus on concrete, visualizable representations. Decision errors may sometimes result, however. For example, decision makers can forget or fail to evaluate the assumptions that are embedded within the picture of their current situation.

Many decisions in air traffic control are collaborative. For example, controllers in adjacent sectors may need to develop a joint strategy for avoiding a future conflict that affects aircraft in both. A controller may issue an instruction to a pilot that the latter finds difficult to accept, or the pilot may make an urgent emergency request that the controller finds difficult or unsafe to grant. Hence, much decision making may be viewed as collaborative, and some of it as negotiated. As we noted before, the success or failure of such collaborative decision making may depend substantially on the extent to which common situation awareness is shared by the participants (Salas et al., 1995).

Errors

The concept of controller error has two somewhat different meanings. Operational errors have a formally defined meaning in terms of loss of separation, and their occurrence has serious safety and personal implications for the controller. In contrast, we refer to controller errors here as any of a much wider range of inappropriate behaviors that result from breakdowns in information processing. Many of these may have only minor safety implications (e.g., pressing the wrong key when accepting an automated handoff). For others, the safety implications may be severe, even if they do not contribute to a formally defined operational error (e.g., issuing an inappropriate instruction that creates a difficult and complex traffic situation for other controllers or pilots).

In discussing controller error, it is important to emphasize that humans make errors in working with complex systems (Reason, 1990). This fact is the inevitable downside of the highly advantageous human flexibility and adaptability that we discussed as a great cognitive strength for air traffic control; indeed, it is a strength that the human operator brings to any complex system (Rasmussen et al., 1995). Thus, aspects of design should focus less on the complete elimination of all human error (an unattainable goal) and more on error-tolerant design, either by incorporating (and preserving) redundancies or by implementing (or preserving) error recovery mechanisms.

Norman (1981) and Reason (1990) have defined similar error taxonomies within the framework of an information-processing model such as that shown in Figure 5.1, and other investigators have applied similar categorizations to the identification of controller errors (Stager and Hameluck, 1990; Rodgers, 1993; Redding et al., 1992). The models developed by Norman and Reason (see also Wickens, 1992) identify five categories of human error, outlined below.

Knowledge-based mistakes are errors in understanding the situation. For example, a controller may not realize that a conflict exists or is pending. Such errors generally result from a lack of knowledge or information regarding the situation, perhaps stemming from impoverished displays, from poor information sampling (scanning), or from a controller's inability to extract the appropriate information from the display or to interpret that information correctly.

Rule-based mistakes involve selecting an inappropriate rule of action to address a correctly diagnosed situation. For example, the controller may have correctly perceived the pending conflict but may choose to implement a solution that is inappropriate, perhaps requesting an aircraft to maneuver in a fashion that imposes limitations on its performance or that violates some other aspect of the airspace.

Lapses are a form of error relevant in the context of long-term memory. A lapse involves forgetting to take a planned action (a lapse of prospective memory).

Mode errors occur when the controller performs an action that would be appropriate in one mode without realizing that the system is in a different mode, so that the same action is no longer appropriate. For example, the controller may forget that certain separation standards have temporarily changed (e.g., because of weather conditions). Mode errors are increasingly prominent in more advanced automation systems that themselves have multimode functions. The crash of an Airbus A320 near Strasbourg, France, was apparently in part the result of a mode error: the pilot apparently believed that the autopilot was in a 3.3-degree flight path angle descent mode, when in fact the same "3" setting had triggered a 3,300 ft/minute descent mode.

Slips of action occur when the correct intention is formulated, but the incorrect action slips out of the controller's fingers (in the case of keyboard entry) or mouth (as when the controller delivers an instruction intended for one aircraft to a different one) (Norman, 1981). The cause of these slips remains poorly understood, although it is appreciated that they are as likely to occur with experts as with novices (Reason, 1990). One reason is that slips are more likely to occur when the operator is not fully paying attention to error-producing components of the task. For the novice, who must pay attention in order to accomplish the tasks at all, such a state of inattention is not really achievable. For the expert, however, for whom several tasks can be performed at an automatic (i.e., inattentive) level, it is easy to understand how slips can occur.

A major cause of slips is the situation in which the procedures or actions required to carry out an intention differ from one case to another, but the physical environments (and the physical signals triggering the action) are quite similar. These circumstances produce negative transfer from the old to the new and, as we saw in the section on long-term memory, can also be considered breakdowns in long-term memory. If feedback from the action is made readily visible or audible (as is the case when one hears one's own voice), then slips can often be self-detected and corrected before they lead to undesirable consequences. As a result, the system becomes more tolerant of errors, that is, "error tolerant."

MODERATING FACTORS

Environmental, design, and system factors may either exacerbate or attenuate the vulnerabilities of the human information-processing system. Some of these factors are summarized in the following sections, organized as above in terms of the major categories of vulnerability.

Visual Sampling

Difficulties with visual sampling can be exacerbated by the low arousal resulting from sleep loss, fatigue, and circadian rhythms; by cluttered displays or a cluttered visual environment; by display environments that contain many similar-appearing elements; and by the distraction of high workload. Many of these problems can be attenuated by automated assists that recognize critical events and translate them into salient abrupt-onset signals (e.g., conflict alerts, minimum safe altitude warnings), by decluttering options, by display technology that integrates (or brings close together) related items (Wickens and Carswell, 1995) and that distinguishes confusable items by physical properties (e.g., color coding), and by concern for fatigue and workload issues.

Expectation-Driven Processing

The problem resulting from the bias to perceive the expected event (and therefore to misperceive or fail to perceive the unexpected) is exacerbated for the perception of rare or atypical events. High workload often leads to less complete perceptual processing of all events and hence disproportionately increases the likelihood that the unexpected will be perceived inappropriately. Nonredundant channels of communication, poor data quality, and rapid communications via speech channels all tend to make this misperception more likely to occur.

The vulnerability of processing the unexpected can be attenuated by incorporating redundancy into any critical message that may be unexpected and by training that stresses the importance of clarity of communications, giving emphasis to key words and phrases that may be unexpected in the circumstances.

Working Memory

The limitations of verbal working memory are exacerbated to the extent that long messages are communicated solely by the voice channel, since working memory is heavily involved in the processing of speech (Burke-Cohen, 1995). The vulnerability of working memory is exacerbated still further by the presence of any concurrent verbal activity, whether this activity is in the environment (the controller is trying to listen while related verbal activity is heard in close proximity) or is carried out by the listener (trying to remember a communication while concurrently speaking or listening). In general, high workload and stress make working memory more vulnerable to information loss, as does the existence of confusable material (similar-sounding words, names, or acronyms; similar aircraft call signs).

The limitations of working memory can be partially addressed by redundancy. This may be accomplished by restating or repeating critical elements (Burke-Cohen, 1995) or possibly by designing redundant communications channels that back up (but do not replace) auditory communications with a visual "echo" of the spoken message, to be referred to if necessary. The data link system (Kerns, 1991) discussed in Chapters 7 and 12 can accomplish this function, although elimination of the auditory channel via datalink would destroy the redundancy. Attention to task analysis, minimizing unnecessary auditory stimulation, and minimizing potentially confusable (similar) auditory utterances also address problems of working memory.

Situation Awareness

Situation awareness is more vulnerable (and more difficult to achieve) in a crowded, complex, and heterogeneous airspace; when operating procedures are inconsistent; when the controller is handling a less familiar sector; when information must be translated from symbolic (verbal) formats into the spatial mental picture; and under conditions of high workload or distraction. All of these contributors to the loss of situation awareness exert even greater influence to the extent that a future state, rather than the current one, must be assessed. Situation awareness is also inhibited by the loss of data from poorly designed displays or from conditions that inhibit communications from other aircraft or other controllers.

Situation awareness may be better preserved by display formats that are compatible with the controller's mental model of the airspace and by the integration and easy accessibility of all necessary information.

Predictive displays help the controller anticipate future situations, and tools that guide the controller's attention to the right place at the right time support the maintenance of situation awareness (Sarter and Woods, 1995). It is not clear, however, that three-dimensional displays offer similar improvements (May et al., 1995).

Communications

We have already noted aspects of expectation-driven processing and of working memory that can exacerbate or reduce deficits in communications. There is also evidence that many aspects of the information exchanged are conveyed via nonlinguistic features: the tone of voice can convey urgency or uncertainty, and speakers can augment their voice message by pointing or gesturing (Segal, 1995). System design changes such as the datalink system, and those that physically isolate one controller from another, remove some of these vital channels. Furthermore, replacing the voice communications that the pilot receives from air traffic control with datalink may eliminate many important nonlinguistic cues available to the pilot, such as the degree of urgency of an instruction. In contrast, efforts to support shared situation awareness, perhaps through common displays or common training, can facilitate communications.

Long-Term Memory

Failures of transient long-term memory (prospective memory) may be induced by high workload and distraction and by the removal of the operator from the role of an active decision maker in choosing relevant actions whose impact must later be remembered. Failures to remember appropriate procedures are invited whenever procedures are suddenly changed. A corresponding invitation to memory failure occurs when a controller must move to a new sector or facility. Failures of memory are also exacerbated by poor training and by the absence of opportunities for recurrent training of infrequently used (but critically important) skills. These issues are relevant to the potential impact of automation, discussed in Chapter 12 and to be taken up further in the panel's Phase II report.

Many long-term memory problems are addressed by care given to training. Good displays, with reminders of pending actions, also address certain problems of forgetting. The limitations of long-term memory reflected in negative transfer from one environment to another may be mitigated if care is given to the consistency of operating rules and procedures whenever possible.

Decision Making

Decision making is vulnerable when information is incomplete, conflicting, or unreliable or when goals conflict. Deviations from optimal decision making, as noted above, have been demonstrated in experimental tasks in which workload, stress, and task complexity were minimal.

Vulnerability may be increased significantly, however, in real-world tasks with high workload (e.g., large numbers of aircraft) and time constraints (e.g., the time until separation minimums will be violated). Training in normative decision-making methods (e.g., decomposing decisions into options, outcomes, and goals; assessing probabilities and utilities; and mathematically combining the assessments) is unlikely to result in improved performance by air traffic controllers (Means et al., 1993). However, decision making may be improved by training and displays that are sensitive to strategies that do work well in real-world environments. Training, for example, may sensitize controllers to trade-offs among speed, accuracy, and task prioritization (Means et al., 1993; Cohen et al., 1996). In addition, it may foster techniques for identifying and correcting problems in situation understanding and plans, to the extent that time is available prior to taking action. For example, controllers may learn to recognize gaps in their knowledge of relevant information, conflicts in the data, or unreliable assumptions underlying their understanding of the data (Cohen et al., in press a, b). Similarly, displays may make explicit the time available before action must be taken and alert decision makers to other high-priority tasks. Displays may also highlight conflicts or sources of unreliability that deserve controller attention.

Errors

Many of the exacerbating factors discussed above are likely to induce errors of different kinds: poor displays and inadequate training lead to mistakes, high workload leads to lapses, and so on. In particular, however, changes in procedures, and poor design attention to compatibility and confusability, are invitations to both mode errors and slips. Mode errors are also induced by multimode automated systems, in which similar actions can have very different consequences.

Practically all of the attenuating factors discussed in the section on visual sampling will help to remediate controller errors. However, we place particular emphasis here on good interface design (adhering to basic human factors principles; Norman, 1988); on building adequate feedback into the system, so that the controller has a clear visible or auditory display of the actions taken (and implemented within the system) and their progress toward completion; and on incorporating an error-tolerant philosophy into system design, such that redundant elements can catch errors with greater reliability and such that there are recovery paths from errors that may be made but noticed later (Norman, 1988; Rouse and Morris, 1987).
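
One concrete way to build the kind of feedback and redundancy described above into a system is to compare what was issued against what was acknowledged and to flag mismatches before they propagate, which also bears on the "hear back" vulnerability noted earlier. The sketch below assumes a hypothetical structured clearance record, of the sort a datalink-style interface might provide; it is not modeled on any particular system.

```python
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class Clearance:
    """Hypothetical structured clearance, as a datalink-style interface
    might represent it."""
    callsign: str
    altitude_ft: Optional[int] = None
    heading_deg: Optional[int] = None
    speed_kt: Optional[int] = None

def readback_mismatches(issued: Clearance, readback: Clearance) -> list[str]:
    """Return the fields on which the acknowledged values disagree with the
    clearance that was issued; an empty list means the readback checks out."""
    problems = []
    for f in fields(Clearance):
        expected = getattr(issued, f.name)
        heard = getattr(readback, f.name)
        if expected is not None and heard != expected:
            problems.append(f"{f.name}: issued {expected}, read back {heard}")
    return problems

# Hypothetical example: the altitude is read back incorrectly.
issued = Clearance("N123AB", altitude_ft=11000, heading_deg=270)
readback = Clearance("N123AB", altitude_ft=10000, heading_deg=270)
for problem in readback_mismatches(issued, readback):
    print("READBACK MISMATCH:", problem)   # salient cue; the error is caught early
```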

Trade-Offs in Human Factors Solutions

It should be noted that certain proposed or considered system modifications may affect one or more of the vulnerabilities listed above. Sometimes these effects coincide, attenuating the influence of more than one factor. For example, proposed datalink interfaces (Corwin, 1991; Kerns, 1991), by virtue of their visual displays of communications, which may be coupled with redundant voice communications, should simultaneously reduce expectation-driven processing errors and phonetic errors in working memory. Attention to consistency of design and procedures will attenuate the undesirable effects of both long-term memory forgetting (negative transfer) and slips. At the same time, such technology may exert detrimental effects: for example, keyboard entries via datalink will increase the vulnerability to slips, the use of the keyboard may slow the transmission of information, and the removal of voice may eliminate important sources of nonverbal information.

In a more general sense, our task analysis reveals that there are often trade-offs between human factors solutions that benefit one aspect of processing even as they inhibit another. Decluttering a display to facilitate visual selective attention may hide information necessary to sustain situation awareness. Silencing auditory chatter to avoid interference with working memory may have a corresponding negative effect on situation awareness, by removing useful communications channels (Pritchett and Hansman, 1993). Making procedures consistent to avoid slips and negative transfer may inhibit controllers in their requirement to be flexible problem solvers in unusual circumstances.

Finally, we note the existence of such trade-offs between workload and situation awareness in the introduction of high levels of automation. Automated features, if they are restricted to alerts and display features based on computer computation, will thereby reduce the demands for cognitive spatial activity (i.e., will reduce workload) and will also be useful because they attenuate the vulnerabilities of visual sampling and spatial working memory. If, however, automation is extended to decision-making and action-taking activities with the desire to reduce workload still further, the advantage of reduced workload will be counteracted by reduced situation awareness, resulting from poorer transient knowledge (Vortac et al., 1993), as well as by a potential loss of skill.
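
The decluttering trade-off just described can be made concrete: a decluttering scheme is in effect a relevance filter over the traffic picture, and whatever the filter hides is exactly the information no longer available for maintaining situation awareness. The sketch below uses a hypothetical altitude-band filter purely to illustrate that point; the filtering criterion and the traffic are invented.

```python
from dataclasses import dataclass

@dataclass
class DataBlock:
    """Hypothetical data block as it might appear on a situation display."""
    callsign: str
    altitude_ft: int

def declutter(blocks: list[DataBlock],
              sector_floor_ft: int,
              sector_ceiling_ft: int,
              margin_ft: int = 2000) -> list[DataBlock]:
    """Keep only traffic within (or near) the sector's altitude band.
    Everything filtered out reduces clutter but is also removed from the
    controller's picture: the trade-off discussed in the text."""
    lo = sector_floor_ft - margin_ft
    hi = sector_ceiling_ft + margin_ft
    return [b for b in blocks if lo <= b.altitude_ft <= hi]

traffic = [DataBlock("AAL10", 35000), DataBlock("SWA22", 12000),
           DataBlock("DAL7", 24000)]
visible = declutter(traffic, sector_floor_ft=24000, sector_ceiling_ft=34000)
hidden = [b for b in traffic if b not in visible]
print([b.callsign for b in visible])   # retained on the display
print([b.callsign for b in hidden])    # decluttered away, and no longer seen
```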

CONCLUSIONS

Cognitive task analysis has provided a framework for understanding the implications of design and procedural changes and of training technologies for the performance of the individual controller. Our analysis reveals the strengths of the skilled controller: the ability to flexibly adapt to novel or unusual situations, drawing on long-term memory to find solutions and to monitor the success of their implementation. Some vulnerabilities in the controller's information processing can be addressed by careful consideration of design factors, such as ensuring that salient signals (abrupt onsets) characterize important and unexpected events, minimizing the confusability of information, using computer technology to provide a visual display of material to be retained in working memory, and providing reminders that augment the controller's prospective memory for tasks to be performed. However, it is important to note that design or procedural solutions implemented to address one vulnerability may exacerbate the influence of another.

The panel identified a number of limitations in human perception that could be addressed by research and design:

- The less efficient processing of unexpected (and therefore rare) events is an inherent aspect of the human perceptual system, and design implementations and procedural changes must acknowledge this fact.

- Human perceptual processing is inhibited by display clutter. But what is clutter at one time (or for one controller) may be relevant information at another. Furthermore, loss of situation awareness may result from the absence of information, an absence that could be created by decluttering schemes. Research on the trade-offs between clutter and information in air traffic control would be invaluable.

- Controllers are limited in their ability to predict future traffic states with multiple aircraft of heterogeneous performance capabilities. Intelligent displays that can automatically accomplish this prediction and explicitly display it are valuable tools to assist controller performance. Research should address the efficacy of analog predictors of aircraft altitude.

The panel identified a number of opportunities for improvements in the air traffic control environment:

- Communication is facilitated by shared knowledge or situation awareness between speaker and listener, and it is important to preserve this wherever possible, perhaps enhancing it through display technology or training.

- A large component of the permanent knowledge structures that a controller brings to the job is spatial and procedural knowledge about the particular characteristics of the facility and its sectors. This fact limits the effectiveness of generic (i.e., not sector-specific) controller training.

- Efforts at improving controller decision making should focus on strategies that are effective in time-stressed environments: training in task and goal management strategies; sensitization to gaps in knowledge, to conflicting evidence or goals, and to unreliability in situation understanding and plans; and information displays that promote appropriate trade-offs among speed, accuracy, and task allocation and that alert controllers to significant uncertainties and goal conflicts.

- The effect of high workload on cognitive vulnerabilities, such as error propensity and situation awareness, is complex and requires further research in an air traffic control context.

- Great care and caution must be exercised prior to implementing procedural and equipment changes, and current procedures must be carefully understood, because of the likelihood that new procedures can lead to negative transfer and slips.

- Although air traffic control errors are inevitable to some extent, they can be minimized by attention to good human factors design. Their impact on system performance can be minimized by adopting a design philosophy that preserves some redundancy of human information transmission (redundant displays, multiple operators) and by an error-tolerant philosophy that allows recovery from human errors before they propagate into major system errors.