The Future of Air Traffic Control: Human Operators and Automation

2 Emerging Technological Resources

Advances in hardware and software offer promising opportunities for automating a greater range of information processing, decision making, and control functions than has been possible in the past. Along with these advances comes the question of the degree to which emerging hardware and software systems can be trusted to perform functions in a reliable and valid manner. In this chapter we review and assess three technologies that relate to the functions of information acquisition, information distribution, the generation of alternative decision options, and option selection. These technologies are visualization, intelligent decision aiding and intent inferencing, and computer-supported cooperative work.

VISUALIZATION

Visualization, the process of using a visual mental model, is perhaps the most important cognitive function the controller performs. Visual mental models are what we usually think of when we speak of mental models—we "see" them in our "mind's eye" (although musicians surely have auditory mental models, professional tasters surely have olfactory and gustatory mental models, etc.). Computerized automation can enhance visualization in many ways, which is the point of revisiting the topic of visualization here. Computer graphic displays help visualization by combining variables into a single integrated display. For example, the old mechanical attitude (8-ball) display combined roll, pitch, and yaw, enabling the pilot to visualize the aircraft attitude much more easily than if such information had to be gleaned from three
separate indicators. The computer graphic display that shows three aircraft symbols (past, present, and predicted future) in roll, pitch, and yaw relative to glide slope, command heading, and altitude integrates even more information. The plan view or map display that shows waypoints, heading, other aircraft, predicted trajectories, and weather is another advanced visualization aid.

The digital representation of altitude on the radar display has remained a feature of the air traffic control workstation that is less than optimal. Although controllers adequately handle digital flight-level data, the fact remains that it is difficult to visualize vertical trends or the magnitude of altitude differences from such a representation. Designers have recognized the possible advantages for visualization of representing the vertical dimension in analog format. There are in fact two ways in which this might be accomplished (Wickens, 1997). One is through the addition of a vertical "profile" display, and the other is through a three-dimensional or "perspective" display. To date, most experimental research has compared conventional plan view displays with perspective displays. Such comparisons have generally not been favorable for the latter (Wickens, Miller, and Tham, 1996; May et al., 1996). Although a perspective view does indeed represent the vertical dimension, it also compresses the three-dimensional airspace onto a two-dimensional viewing surface, leaving a certain amount of perceptual ambiguity regarding the precise lateral and vertical distance separating a pair of aircraft (McGreevy and Ellis, 1986; Merwin et al., 1997). This ambiguity can disrupt the controller's judgments of predictive separation. Three solutions may be available. First, as noted, a profile display could be coupled with the plan view display to represent, without ambiguity, the vertical separations.
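The projection ambiguity can be illustrated numerically with a simple pinhole-projection sketch. The viewer position, distances, and units below are purely hypothetical and not taken from the studies cited; the point is only that two aircraft widely separated in space can map to the same point on a two-dimensional perspective display.

```python
import math

def perspective_project(pos, viewer=(0.0, -40.0, 10.0), d=5.0):
    """Pinhole projection of a 3-D position onto a 2-D display surface."""
    x, y, z = pos
    vx, vy, vz = viewer
    depth = y - vy  # distance along the viewing axis
    return d * (x - vx) / depth, d * (z - vz) / depth

# Hypothetical positions: (nm east, nm north, thousands of feet).
a = (2.0, 10.0, 11.0)
b = (3.0, 35.0, 11.5)

# The aircraft are about 25 nm apart laterally and 500 ft apart
# vertically, yet both project to screen point (0.2, 0.1).
print(perspective_project(a))
print(perspective_project(b))
print(math.hypot(b[0] - a[0], b[1] - a[1]))  # true lateral separation, nm
```

A coupled plan view plus profile display avoids this collapse by showing the lateral and vertical coordinates on separate, unprojected axes.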
This approach has proven quite successful for representing traffic separations and terrain awareness in cockpit displays (Merwin et al., 1997; Wickens, Liang, et al., 1996). Second, some designers have proposed using holographic or stereo techniques to create displays, in which the ambiguity is lessened (Wickens et al., 1989). Third, it is possible to provide a controller with interactive tools, whereby the three-dimensional viewpoint of the display can be altered, making the position of aircraft less ambiguous through the perceptual cue of motion parallax (Sollenberger and Milgram, 1993; Wickens et al., 1994); this can also be provided by holographic displays. A more radical form of interaction is created by allowing the controller to change the viewpoint position and "immerse" himself within the airspace, thereby approximating the technology of virtual reality (Durlach and Mavor, 1995; Wickens and Baker, 1995). Certain limitations of this technology for air traffic control, however, appear evident. First, by immersing oneself within the traffic volume, aircraft to the side and behind are "out of sight," a factor that is of considerable concern if the safe separation of all traffic is to be maintained. Second, such immersion can be disorienting to one who is not in active control of the viewpoint and hence would tend to be disruptive to efforts to coordinate a
view among multiple observers. As a consequence, the prospects of this technology for real-time air traffic control operations appear remote. However, the feasibility of both immersive and non-immersive three-dimensional displays in a training capacity appears more promising.

INTELLIGENT DECISION AIDING

The principal uses of intelligent computer-based decision making systems include diagnosis, planning, decision aiding, intent inferencing, and training. They can be developed from a variety of sources, including highly structured written documents, such as military doctrine; knowledge elicitation methods used to create expert emulations; and algorithms that provide structures and strategies for learning by example or through neural networking. Although these systems may vary in underlying logic or structure, most include both domain knowledge and procedures for operating on that knowledge. In this section we briefly review the technology of expert systems, intent inferencing systems, learning software, and blackboard systems. The current technology for expert, intent inferencing, and blackboard systems requires a programmer to make changes. Learning systems, in contrast, are designed to grow and add new knowledge through iterative operation. In the air traffic control environment, this technology continues to hold promise for equipment troubleshooting, simulation-based training, air traffic flow planning, decision aiding, and intent inferencing for the controller and the pilot. However, caution is needed. Although computer-based systems offer advantages of speed and capacity, the increased efficiency and power predicted by the combination of these systems with human decision makers have not been realized (Mosier and Skitka, 1996). Major issues concern validity and reliability and the ease and effectiveness with which human operators (controllers, pilots, and maintainers) can make use of them.
One concern is that novices who use expert systems to aid them in their decision making do not perform as well as experts. Novices and experts have different approaches or schemata for structuring and solving problems (Chi et al., 1981). Also, computer systems are limited in technical knowledge and are not as versatile as the human expert (Will, 1991). The most favorable results for the combination of novice and automated aid occur when the task is routine and covered by standard procedures. A second concern is that incorrect models of human decision making and automated decision aids may result in systems that are less effective than the human alone. Mosier and Skitka (1996) suggest that these incorrect models may create decision making environments that promote decision biases rather than enhance human capabilities. As an example, Adelman et al. (1993), in a study of real-time expert system interfaces for use by air defense officers in identifying aircraft as friendly or hostile, found that aids to focus the operator's
attention on the most critical events led to inferior performance for important but less critical cases that also required the operator's attention. Hockey (1986) has observed similar findings with regard to a tactical aid for managing air traffic in a combat environment. Perhaps the most promising work, directly relevant to air traffic control goals, is in intent inferencing, an approach developed to alleviate the need for an operator to directly input his or her intentions into the system (Geddes, 1985). Intent inferencing provides an intelligent interface to the operating system by informing the system about the plan the operator is intending to implement. This technology has the potential to present controllers with better predictor information, thus overcoming an important limitation in the current air traffic control system. Most of the work in this area has been done in the context of military aircraft (Geddes, 1985; Banks and Lizza, 1991; Andes, 1996); however, a preliminary study of shared intentions for free flight has recently been completed for the National Aeronautics and Space Administration (NASA) (Geddes et al., 1996). Free flight is discussed in detail in Chapter 9.

Blackboard systems provide the architecture for the integration of several knowledge sources (expert systems, case-based systems, neural network systems) to interactively solve a problem or develop a plan. These systems also have potential, particularly for long-range strategic planning applications. However, their effectiveness may be limited by the features of the component software programs representing the various knowledge sources (Corkill, 1991). What follows is a brief overview of the technologies used in developing intelligent decision aiding systems and a discussion of their applications.

Expert System Technology

Expert systems are computer programs designed to solve complex problems by emulating human expertise.
Work on the first expert system began in the mid-1960s and resulted in a computer-based system that could function as effectively as a human expert in determining molecular structures from chemical data. The basic structure of most expert systems includes a knowledge base and an inference procedure that operates on the knowledge base. The knowledge base contains the facts or declarative knowledge in a particular subject area and the rules of judgment developed by experts who use these facts. In most systems, this knowledge is represented in the form of production rules. Production rules are sets of condition-action pairs presented in the form of if-then statements. Once these pairs are formed, weights are assigned to show the relative strength of the relationship as seen by the expert. In addition to the knowledge base, an expert system has an inference procedure. The two principal forms of reasoning used are goal-directed backward chaining and forward chaining. Backward chaining involves working from a goal to the conditions required to reach that goal. Forward chaining infers the
goals from the conditions. A critical feature of all expert systems is that they can make explicit the reasoning used to reach a specific conclusion or recommendation. This is useful in validating the system and in assisting the user in assessing the value of the advice.

In the early days of expert systems development, the focus was on artificial intelligence and research in cognitive psychology that explored the nature of expert knowledge and the data structures and reasoning strategies necessary to create a software representation of the knowledge. Once it was demonstrated that usable expert systems could be developed, the focus moved from research to application. One major stumbling block in the development process is the acquisition of the appropriate knowledge from the expert and the validation of that knowledge. These processes are both labor-intensive and time-consuming and also require highly trained individuals to do the work. Another barrier is that expert systems work well only in well-known problem domains that can be described by procedural rules. Although there have been many applications of expert systems, including training, planning, diagnosing, and scheduling, most have not been developed for real-time uses. Early work focused on such problems as providing physicians with diagnostic aids, advising geologists working with rock formations, planning experiments in DNA synthesis, and assisting in electronic troubleshooting. More recently, the majority of applications have sprung up in areas such as training, counseling (for example, Chwelos and Oatley, 1994; Wilson and Zalewski, 1994; Hile et al., 1995), management planning (for example, Liang and Teo, 1994), and test development and interpretation (Frick, 1992). Production rule systems have also been used to emulate command decision making in military simulations. Regarding training, Chu et al.
(1995) have proposed requirements for an intelligent tutoring system that specifies the instructional content and procedures to teach novice operators to manage a complex dynamic system. This approach may also be useful in developing simulations for training air traffic controllers. One well-researched intelligent tutor is the system developed by John Anderson and his colleagues at Carnegie Mellon University in which the adaptive control of thought (ACT) theory of learning and problem solving was used to build the software. They have found that the early stages of learning are dominated by declarative or factual knowledge, whereas the later stages focus on procedural knowledge (Anderson et al., 1993). An important result of this research is that the developers shifted from their earlier model of a tutor that emulates an expert to a tutor as a learning environment in which helpful information can be provided and useful problems can be selected. This work may provide important guidance to the design of computer-based training systems for air traffic controllers.
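The production-rule scheme described earlier, with its two principal forms of reasoning, can be sketched in a few lines. The rule content below is invented for illustration, and the certainty weights that real systems attach to condition-action pairs are omitted for brevity.

```python
# Toy production-rule base: each rule maps a set of conditions (the
# "if" part) to a conclusion (the "then" part).
RULES = [
    ({"aircraft_descending", "below_glide_slope"}, "altitude_alert"),
    ({"altitude_alert", "no_pilot_response"}, "notify_supervisor"),
]

def forward_chain(facts):
    """Forward chaining: repeatedly infer conclusions from known conditions."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conds, conclusion in RULES:
            if conds <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def backward_chain(goal, facts):
    """Backward chaining: work from a goal back to the conditions
    required to establish it (assumes an acyclic rule base)."""
    if goal in facts:
        return True
    return any(all(backward_chain(c, facts) for c in conds)
               for conds, conclusion in RULES if conclusion == goal)

base = {"aircraft_descending", "below_glide_slope", "no_pilot_response"}
print(forward_chain(base))                      # includes "notify_supervisor"
print(backward_chain("notify_supervisor", base))  # True
```

Because the chain of fired rules can be recorded, such a system can also explain how it reached a conclusion, the validation feature noted above.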
Systems That Learn

During the past 20 years substantial progress has been made in the development and testing of category learning models such as exemplar or case-based models. In these models, past and current states serve as memory cues for retrieving scenarios that are similar to the current situation, and the actions that produced favorable outcomes in the past are retrieved for use as possible future actions. In essence, the decision problem corresponds to pattern recognition based on similarity to previously stored examples. This type of learning model has been extremely successful in a wide range of rigorous experimental tests, including more complex learning problems such as controlling dynamic systems (Dienes and Fahey, 1995). According to Pew and Mavor (1997), an appealing feature of this approach is that it bypasses the knowledge acquisition and knowledge representation bottleneck associated with expert systems by recording past behavior of real humans to form the memory base of exemplars that are retrieved. Neural network models can accomplish the same objective; however, exemplar models may be more compatible and thus more combinable with rule-based systems. Hybrid systems that result from such combinations have yet to be evaluated but hold some promise for being more useful than either pure rule-based or pure exemplar-based systems alone.

Bayles and Das (1993) examined the feasibility of using an exemplar-based approach to solving traffic flow management problems. In this study, prior cases of weather and traffic problems with their attendant solutions were documented, stored, and made available for retrieval as new situations arose. This approach, as contrasted with the rule-based approach, seemed more appropriate because of the variability in conditions from one situation to another.
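Exemplar-based retrieval of this kind can be sketched as a nearest-neighbor lookup over stored cases. The feature names, numbers, and flow-management actions below are invented for illustration, not drawn from the Bayles and Das study.

```python
import math

# Hypothetical exemplar memory: past situations paired with the
# flow-management action that produced a favorable outcome.
CASES = [
    ({"storm_cells": 3, "arrival_demand": 60}, "reroute_north"),
    ({"storm_cells": 0, "arrival_demand": 90}, "meter_arrivals"),
    ({"storm_cells": 4, "arrival_demand": 55}, "ground_delay"),
]

def similarity(a, b):
    """Inverse Euclidean distance over shared numeric features."""
    d = math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))
    return 1.0 / (1.0 + d)

def retrieve(current, k=2):
    """Return the k stored cases most similar to the current situation."""
    ranked = sorted(CASES, key=lambda c: similarity(current, c[0]),
                    reverse=True)
    return ranked[:k]

# A new situation resembling the first stored case retrieves its action.
print(retrieve({"storm_cells": 3, "arrival_demand": 58}, k=1))
```

Adding each newly generated solution back into `CASES` is what lets the memory grow through iterative operation, as described above for learning systems.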
The approach works by selecting cases from the past that are similar to the current situation and providing them to traffic flow managers as aids to developing solutions. New solutions are generated from prior cases and these new solutions are added to the database. Traffic flow managers found the exercise particularly useful because the system provided them with feedback linking solutions to both positive and negative outcomes.

Blackboard Systems

Blackboard technology (Nii and Aiello, 1986) is essentially an electronic emulation of a group of experts or specialists using a blackboard as the workplace for cooperatively developing a solution to a problem. It offers a problem solving architecture that is particularly useful when the following conditions are present:
- The problem is decomposable into a number of subproblems.
- Many diverse, specialized knowledge sources are needed to address a problem.
- An integrative framework is needed to manage heterogeneous problem solving representations.
- The development of an application involves numerous developers.
- Uncertain or limited data make it difficult to develop an absolute solution.
- Multilevel reasoning or dynamic control of problem solving activities is needed for the application.

Blackboard systems have three major components: a collection of knowledge sources, a control module that schedules the contributions of the knowledge sources, and a blackboard or database that saves the current state of the problem generated by the knowledge sources. The knowledge sources generate cooperative solutions on a blackboard using a variety of reasoning approaches such as expert systems, numerical analysis, exemplar systems, and neural networks. Each knowledge source represents a different area of expertise or a different perspective on the problem and can be developed independently (using different languages) from the other knowledge sources; however, each knowledge source must use an interaction language that is common to the blackboard. Also, if a knowledge source contains relevant expertise it can be used in more than one blackboard system. The heart of the system is the control module, which provides for the integration of the various knowledge sources and controls the flow of the activity in the application, much the way a moderator would control a group of human experts working collaboratively on a problem (Corkill, 1991). Blackboard control architecture has several benefits, including modularity and the flexibility to adapt to a wide range of complex heuristics and rules that may change in the course of problem solving.
Another strength is that the architecture places all the strategies and rules governing system behavior under system control. The primary drawback is the heavy computational and storage requirements (Hayes-Roth, 1985). In air traffic control, blackboard systems may be useful for addressing flow control and weather prediction problems. For example, in one research project, Craig (1989) used a blackboard system named Cassandra to monitor aircraft separation in a controlled airspace. This system contained separate experts for vertical, lateral, and horizontal separation. Although this technology may offer relevant assistance to the performance of a number of air traffic control tasks, it needs further development. As noted earlier, it combines a number of other technologies (e.g., knowledge based systems and case-based systems) that have their own set of limitations (discussed in previous sections).
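The three components described above, knowledge sources, a control module, and the shared blackboard, can be sketched in miniature. The separation "experts" below are loosely in the spirit of Cassandra's, but their names, thresholds, and logic are invented for illustration.

```python
class Blackboard:
    """Shared data store holding the evolving state of the problem."""
    def __init__(self, **initial):
        self.entries = dict(initial)

# Each knowledge source checks whether it can contribute, posts its
# result to the blackboard, and reports whether it changed anything.
def vertical_ks(bb):
    if "vertical_ok" not in bb.entries and "altitudes_ft" in bb.entries:
        a, b = bb.entries["altitudes_ft"]
        bb.entries["vertical_ok"] = abs(a - b) >= 1000  # illustrative
        return True
    return False

def lateral_ks(bb):
    if "lateral_ok" not in bb.entries and "lateral_nm" in bb.entries:
        bb.entries["lateral_ok"] = bb.entries["lateral_nm"] >= 5.0
        return True
    return False

def conflict_ks(bb):
    needed = {"vertical_ok", "lateral_ok"}
    if "conflict" not in bb.entries and needed <= bb.entries.keys():
        bb.entries["conflict"] = not (bb.entries["vertical_ok"]
                                      or bb.entries["lateral_ok"])
        return True
    return False

def control(bb, sources):
    """Control module: keep firing applicable sources until quiescence."""
    while any(ks(bb) for ks in sources):
        pass

bb = Blackboard(altitudes_ft=(31000, 31500), lateral_nm=3.0)
control(bb, [conflict_ks, vertical_ks, lateral_ks])
print(bb.entries["conflict"])  # True: neither separation criterion holds
```

Note that each source only reads and writes the blackboard, never calls another source directly; the control loop supplies the moderator role described above.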
Intent Inferencing

The fundamental principle underlying intent inferencing is to keep the operator in control, even though the system is able to carry out a series of tasks automatically. Thus, when using this technology, the tasks that the system executes are based on inferences made about the goal the operator is trying to achieve and actions that relate to implementing that goal. The operator does not directly tell the system what to do but rather continues to perform activities. The intelligent system analyzes these activities and makes inferences about the goal of the operator and the best plan to execute to reach that goal. Based on this inference, the system carries out the desired process automatically.

One intent inferencing model, described by Jones et al. (1990), was developed as part of the operator function model expert system. This model, the actions interpreter, dynamically builds a model of operator goals for the current system state and then works to interpret the user's actions in terms of these goals. Each goal is decomposed into a hierarchy of plans, tasks, and operator actions required to fulfill the goal. This representation evolves over time as new information is recorded. The operator function model developed by Mitchell (1987) provides the basis for the actions interpreter's knowledge about how events trigger likely operator goals. The actions interpreter has been evaluated in the Georgia Tech Multisatellite Operations Control Center simulation, an interactive simulation that supports simulated satellites and the computer and communications technology used for data capture.

Another framework used to represent intent inferencing is the plan and goal graph (Geddes, 1989; Rouse et al., 1990; Shalin et al., 1993). The plan and goal graph is a task analytic decomposition of the goals and plans for all operators interacting with the system.
The top-level nodes in the graph are goals. A goal represents a specific criterion on the state of the system that can be tested by observation. The next level of nodes are plans. Plans involve activities, time frames, the use of resources, and side effects. Plans are decomposed into subgoals, subplans, and actions. Several plans may share common actions and compete for resources. Thus, a key element of the program involves resolving conflicts. The formalism of the relationship between plans and goals guides the decomposition process. The idea is to develop the goal and plan structure for the set of missions the system is expected to perform. As noted earlier, much of the work in this area has concentrated on military aircraft, specifically the pilot associate program (Banks and Lizza, 1991) and the rotorcraft pilot associate (Andes, 1996). Currently, Geddes and his colleagues are working with NASA under the advanced air transportation technology program to demonstrate intent inferencing as an emerging technology that can be used to detect goal and plan conflicts among active participants in free flight scenarios (free flight is discussed in detail in Chapter 9). This research moves from interpreting the intent of one operator to interpreting the intent of several operators.
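The goal-plan-action decomposition just described can be sketched as a small tree. The node names and structure below are invented for illustration; a real plan and goal graph would also carry time frames, resource requirements, and conflict-resolution machinery.

```python
# Minimal plan-and-goal graph node; kind is "goal", "plan", or "action".
class Node:
    def __init__(self, kind, name, children=()):
        self.kind = kind
        self.name = name
        self.children = list(children)

graph = Node("goal", "reach_waypoint_on_time", [
    Node("plan", "direct_routing", [
        Node("goal", "maintain_separation", [
            Node("plan", "offset_track", [
                Node("action", "turn_left_10deg"),
            ]),
        ]),
        Node("action", "increase_speed"),
    ]),
])

def actions_supporting(node, goal, under=False):
    """Collect the primitive actions that sit beneath a named goal,
    the basis for matching observed operator actions to likely goals."""
    under = under or (node.kind == "goal" and node.name == goal)
    found = [node.name] if under and node.kind == "action" else []
    for child in node.children:
        found += actions_supporting(child, goal, under)
    return found

print(actions_supporting(graph, "maintain_separation"))  # ['turn_left_10deg']
```

Intent inferencing then runs this mapping in reverse: observing `turn_left_10deg` raises the system's belief that the operator is pursuing the separation subgoal.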
The heart of the project is the "shared model of intent" (using the OPAL software system) and its use in early conflict detection and resolution. In the shared model of intent, the goals and plans of all ground- and air-based participants are represented in a plan and goal graph. In addition, the system contains a knowledge base that specifies information about all objects in the system, including the conditions and time frames under which state changes may be expected to occur. The shared model of intent has been tested in 10 free flight scenarios focusing on the transition from en route to terminal airspace, with particular emphasis on the coordination between large commercial aircraft and general aviation aircraft. A simulation facility was established at Embry-Riddle Aeronautical University to run the test scenarios. As part of this facility, two manned air traffic control consoles were set up, one for the en route sector and the other for the terminal sector. The terminal airspace was a representation of the Orlando International Airport. Each scenario included 32 aircraft (2 manned and 30 digital). By the end of the evaluation scenarios, the shared model of intent was accurately accounting for over 90 percent of manned pilot actions, approximately 80 percent of digital aircraft actions, and 100 percent of the actions at the two manned air traffic control stations. The results of early testing appear to be positive. However, further evaluation is needed with a larger number of participants and different levels of airspace complexity. According to Geddes et al. (1996:3):

The potential scope of the shared model of intent includes conflicts in flow control between ground coordination activities as well as aircraft.
It will be possible, for example, to detect that the coordination plan to increase take-off rates at Dallas-Fort Worth as a plan to reduce taxiway wait time will have a more serious conflict with the plan to reduce the arrival rate at Atlanta due to high levels of thunderstorm activity in the approach area. … By detecting conflicts at a higher level, not only is conflict detection typically earlier, but it can also result in a more strategic resolution that redirects resources more efficiently.

COMPUTER-SUPPORTED COOPERATIVE WORK

Distributed networking capabilities plus advances in telecommunications, multiuser applications, shared virtual environment technologies, and the like have created opportunities for users in the same or different locations engaged in interdependent activities to work together in a common computer-based environment. These capabilities have given rise to a relatively new interdisciplinary field of study known as computer-supported cooperative work (CSCW). Its goal is to use groupware technologies to facilitate communication, collaboration, and coordination in accord with the users' organizational and social contexts. Research in this area takes into account situations, roles, social interactions, and task interdependencies among participants as a guide for CSCW system design, development,
implementation, and evaluation. It is easy to see how this work is relevant to the computer mediation of cooperative problem solving and scheduling among air traffic controllers, pilots, and dispatchers.

Distinctive Features

Cooperative work in this perspective is interpreted broadly to refer to work that is completed through the harmonization of acts carried out by multiple individuals (Malone and Crowston, 1990). The key issue is the use of computer support to manage effectively the interdependent activities of diverse actors so that the task goals affected by these interdependencies can be achieved. Although CSCW shares some concerns with the traditional human factors field, this orientation creates several distinctive foci. First, CSCW tools attempt to support multiple interdependent tasks rather than individual tasks that can be completed by people acting independently. Interdependencies may stem, for example, from the fact that tasks must be completed in a specified order, that the same resource(s) is needed by multiple activities, that actions must be synchronized, and so on. Second, existing interdependencies may be tacit or implicit, rather than explicit or articulated in task descriptions; often they are taken for granted by workers but not necessarily captured in the task analyses or needs assessments that provide input to traditional systems design. Surfacing such interdependencies has been a consistent contribution from this research. Third, this view of cooperative work recognizes that task interdependencies may often be managed through common "artifacts" or representations of the work (see Suchman, 1995).
Ethnographic studies of the use of flight strips by controllers, for example, suggest that the strips do far more than deliver information to an active controller; they also serve as transparent representations of interdependent tasks-in-progress to the incoming controller and others (Harper and Hughes, 1991). It should be noted that studies of the use of paper versus electronic flight strips conducted by Manning (1995) have shown that electronic flight strips have the benefit of reducing workload; these issues are discussed in more detail in Chapter 4. Similar findings about visible representations of work have come from studies of the role of job tickets in equipment repair facilities (Sachs, 1995) and of the way that representations function in shipboard navigation (Hutchins, 1990). Research that focuses exclusively on independent work is likely to overlook how media function as representations and artifacts that support tacit interdependencies among tasks (see also Kyng and others in the September 1995 special issue of Communications of the ACM on representations of work). Fourth, operators may be engaged in cooperative work in autonomous or semiautonomous roles, and such roles may be enacted by either individuals or computer programs (or both). Thus technology designs for CSCW do not assume
a binary choice (to automate or not to automate) but instead consider an array of options for allocating functions within cooperative work activity to humans and machines (Mankin et al., 1996). Finally, the key interdependencies to be managed may involve conflict (as in negotiations or debates) or competition (as in bidding or market-based systems) rather than conscious goal-oriented cooperation or teamwork in the usual sense (Ciborra, 1993). As with collaboration, competition may occur as an explicit or as a tacit form of task interdependence. CSCW technologies can be deployed to support either sort of interdependent activity.

Components of Applications

Different implementations may involve the use of such component technologies as electronic text or audio messaging, shared spatial views of activities and operators, formal representations of work processes, and the like to support multiple individuals engaged in interdependent work. Many kinds of cooperative action, for example, require members of the interacting group to communicate in some form.

Communication

Watts and his colleagues (Watts et al., 1996), for example, describe the use of voice loops as a method for space shuttle mission controllers to coordinate their activities. The study found that controllers monitor four voice loops: the conference loop, the support loop, the air/ground loop, and the flight director loop. The conference loop is constantly monitored to receive messages from other consoles; however, actions are taken only when a problem occurs. The support loop is used by front room controllers to communicate with the support staff in another location. The air/ground loop is used by astronauts and the flight controller; all other participants only monitor the loop to maintain awareness of the evolving situation.
Finally, the flight director loop contains communication between the flight controller and the front room controllers. Study results show that controllers are able to monitor all four loops and extract meaningful patterns of information. Typically a controller is active only on one loop. But passive awareness or preattentive reference to the other loops enables controllers to maintain shared awareness, to integrate new information about the mission, and to anticipate changes and dynamically shift their activities in response; in general it helps them to synchronize their work (see Dourish and Bellotti, 1992; Woods, 1995). Controllers can segregate loops by listening to them at different volumes, monitoring loops that are less relevant to their tasks and goals at lower volumes than loops that are providing more significant information. A primary factor in the success of voice loops in supporting coordination
appears to be the use of implicit protocols to govern which loops are monitored, as well as the reliance on highly coded language and immediate response on demand to directed messages in the active loop. Passive monitoring also facilitates directed communication by helping coworkers negotiate interruptions; controllers are able to listen to colleagues' loops to determine their current workloads before contacting them. Other kinds of situations that demand coordination will require rigorous protocols that are made fully explicit. One area in which the work on voice loops may be relevant to air traffic control is the communication and coordination required among tower controllers, airport managers, gate managers, pilots, and airline dispatchers in the surface movement advisor system.

Shared Spaces

In other applications, CSCW technologies may rely on spatial approaches to support interdependent tasks. Such approaches are particularly useful for cooperative work: they represent persistence and ongoing activity in a common spatial setting, enable peripheral awareness of what others are doing beyond a user's focal activity, permit navigation and chance encounters in a shared environment, and facilitate system usability through natural spatial metaphors. More generally, spatial approaches to CSCW can be regarded as focusing support on the contexts rather than the processes of work (Suchman, 1995; Winograd, 1994). A number of spatial techniques, including media spaces, spatial video conferencing, collaborative virtual environments, and telepresence, are reviewed by Benford et al. (1996). Media spaces, for example, employ integrated audio/video communications as a means of supporting social browsing and the development of long-term working relationships between physically separated individuals.
Services include views into other participants' offices and open connections with selected individuals. The main drawback to this approach has been a limited field of view and the inability to navigate freely through the shared space (Gaver, 1992). Spatial video conferencing is used to support more formal interactions (e.g., meetings). Some advanced systems include shared document editors. Limitations of this technology stem from the fact that participants find it difficult to visually determine where other members of the group are directing their attention at any given time. Collaborative virtual environments support shared work through networked virtual reality systems. In principle, such technologies provide computer-generated worlds in which participants are graphically represented to each other and each participant controls his or her own viewpoint. The shared space provides a common frame of reference for all participants. Collaborative virtual environments have been tested for use in such applications as education, training, and scientific visualization (Durlach and Mavor, 1995). These applications often integrate representations of users and their information in a common display
space (unlike multimedia systems, which typically display communication and data in separate windows). However, collaborative virtual environments are a less mature technology than either media spaces or spatial video conferencing. Finally, telepresence differs from collaborative virtual environments in that participants are given the experience of a real remote physical space rather than one that is computer-generated. Telepresence applications currently focus on the control of remote robots in hazardous or inaccessible environments (including telesurgery). One potential application of shared media spaces and collaborative virtual environments is to facilitate strategic planning activities among central flow control staff and staff at facilities across the country. Another is to allow controllers to remain at their own facilities and be trained interactively, in real time, with controllers located at other facilities.

Integration and Advanced Groupware

Although different types of applications are discussed separately above, it should be noted that advanced groupware systems may well integrate aspects of each type. A complex groupware application designed to support a flight crew in their interactions with one another and with the airline, both in the air and on the ground, is described, for example, in Benson et al. (1990). The system enables synchronous communication, both video- and audio-based, among training/technical managers, crew managers, and chief pilots, who are normally all located in different facilities. It also supports a highly structured asynchronous messaging system through which pilots can lodge bids for flights and for training periods.
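The asynchronous bidding flow just described can be sketched as a simple message store in which bids accumulate until an allocation step runs, so participants need not be connected at the same time. The sketch below is purely illustrative; the class, field, and flight names are hypothetical and are not taken from the system reported by Benson et al.:

```python
from dataclasses import dataclass
from collections import defaultdict
from typing import Optional

@dataclass
class Bid:
    pilot: str
    flight: str
    priority: int  # illustrative ranking, e.g., seniority-based

class BidBoard:
    """Asynchronous message store: bids accumulate over time and are
    only resolved when the allocation step is run."""
    def __init__(self):
        self._bids = defaultdict(list)

    def lodge(self, bid: Bid) -> None:
        # Pilots lodge bids independently, at any time.
        self._bids[bid.flight].append(bid)

    def allocate(self, flight: str) -> Optional[str]:
        """Award the flight to the highest-priority bidder, if any."""
        bids = self._bids.get(flight, [])
        if not bids:
            return None  # unfilled: falls back to direct assignment
        return max(bids, key=lambda b: b.priority).pilot

board = BidBoard()
board.lodge(Bid("pilot_a", "LHR-JFK", priority=3))
board.lodge(Bid("pilot_b", "LHR-JFK", priority=7))
print(board.allocate("LHR-JFK"))  # pilot_b
print(board.allocate("CDG-FRA"))  # None
```

The point of the sketch is the decoupling in time: lodging and allocating are separate operations, which is what lets a workflow system later reallocate unfilled flights.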
But because the airline has the right to allocate some flights directly and to draft pilots if a flight is still not covered a few days before departure, a built-in workflow system (which models the bidding process and flexibly reallocates tasks to resources) manages many aspects of the bidding logistics. The authors argue that distributed but coordinated work requires groupware systems that can communicate and execute concurrently. However, they conclude that what matters most to the performance of interdependent tasks is cohesion among participants; systems to support such activity must therefore be designed with a view to increasing trust, motivation, flexibility, and the like. Improved information is not the only essential ingredient of successful human collaboration.

Issues and Implications

A primary concern of the work in CSCW is the development of methodologies to describe roles, relationships, and shared work procedures for coordination, cooperation, and communication. A number of investigators (Hughes et al., 1994; Twidale et al., 1994; Harper and Hughes, 1991; Bikson and Eveland, 1990, 1996; Bikson and Law, 1993; Bikson, 1996; Eveland et al., 1995; Dubrovsky et al., 1991; Finholt et al., 1990) have employed a variety of social research methods (ethnography, field experiments, replicated case designs, unobtrusive measures, and realistic laboratory studies) in efforts to develop the required social knowledge and incorporate it into design and implementation processes. Less progress has been made toward developing methods for evaluation.

To date, there has been little systematic effort to apply CSCW to time-critical operations in air traffic control. However, there are several promising areas in which this approach might usefully be considered in the future. Among these are the strategic activities of air traffic management that involve coordination and communication with facilities across the country regarding local traffic and weather concerns. Such coordination of physically remote individuals is an ideal target for CSCW technology. Other examples are the interactions between pilots and controllers who are working with shared map displays (e.g., shared plan view displays, shared airport movement area safety system maps; see Chapter 5), and the interactions between local maintenance personnel and specialists at centralized maintenance control stations (see Chapter 7).

Some researchers hypothesize that interdependencies increase as the number of affected task participants increases and as the time-scale for task-relevant actions decreases (see, for example, Benson et al., 1990). If so, the projected growth of air traffic suggests that future needs for computer support of effective interdependency management in this domain will be pressing; consequently, the potential value of CSCW technologies for air traffic control merits serious investigation.
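The scaling hypothesis can be made concrete with a toy calculation: if coordination may be needed between any pair of n participants, the number of links grows quadratically with n, and shrinking the time window available for action raises the rate at which those links must be serviced. The functions below are illustrative only, not a model drawn from the cited work:

```python
def coordination_links(n: int) -> int:
    """Pairwise communication links among n task participants."""
    return n * (n - 1) // 2

def coordination_demand(n: int, action_window_s: float) -> float:
    """Toy index: links to be serviced per second, assuming every
    pair may need to coordinate once per action window."""
    return coordination_links(n) / action_window_s

# Doubling participants roughly quadruples the links...
print(coordination_links(5), coordination_links(10))   # 10 45
# ...and halving the time-scale doubles the demand again.
print(coordination_demand(10, 60.0), coordination_demand(10, 30.0))  # 0.75 1.5
```

Under these assumptions, coordination demand grows with both traffic volume and tempo, which is the intuition behind the projected pressure on interdependency management.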