levels of automation, when the automated system can express graded levels of certainty or uncertainty regarding the quality of the information it provides (e.g., confidence in resolution and reliability of radar position estimates).

  1. Integrity Checks. Ensuring the reliability of sensor data by cross-checking and comparing multiple sensor sources.

  2. User Request Enabling. User request enabling involves the automation's understanding of specific user requests for information to be displayed. If such requests can be understood only when expressed in a restricted syntax (e.g., a precisely ordered string of specific words or keystrokes), the automation is at a lower level; if requests can be understood in a less restricted syntax (e.g., natural language), it is at a higher level.

The level of automation in information acquisition and integration, represented on the left scale of Figure 1.1, can be characterized by the extent to which a system possesses high levels on each of the six features. A system with the highest level of automation would have high levels on all six features.

Decision and Action Selection and Action Implementation

Higher levels of automation of decision and action selection define progressively fewer degrees of freedom for humans to select from a wide variety of actions (Table 1.1 and the middle scale of Figure 1.1). At levels 2 to 4 on the scale, systems can be developed that allow the operator to execute the advised or recommended action manually (e.g., speaking a clearance) or via automation (e.g., relaying a suggested clearance via data link with a single computer input). The manual option is not available at the higher levels of automation of decision and action selection. Hence, the dichotomous action implementation scale applies only to the lower levels of automation of decision and action selection.
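The availability of manual versus automated implementation at each decision level can be sketched as follows. The level boundaries come from the passage (manual execution is available at levels 2 to 4 but not above); the function name and the treatment of levels below 2, which the passage does not address, are assumptions:

```python
def implementation_options(decision_level: int) -> set[str]:
    """Return the action-implementation options open to the operator at a
    given level of decision and action selection automation."""
    if decision_level > 4:
        # Higher levels: the automation implements the selected action.
        return {"automated"}
    if decision_level >= 2:
        # Levels 2-4: operator may speak the clearance or relay it via data link.
        return {"manual", "automated"}
    raise ValueError("levels below 2 are not covered in this passage")

print(sorted(implementation_options(3)))  # ['automated', 'manual']
print(sorted(implementation_options(7)))  # ['automated']
```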

Finally, we note that control actions can be taken in circumstances whose consequences carry more or less uncertainty or risk, as a result of more or less uncertainty in the environment. For example, the consequences of an automated decision to hand off an aircraft to another controller are easily predictable and of relatively low risk. In contrast, the consequences of an automation-transmitted clearance or instruction delivered to an aircraft are less certain; for example, the pilot may be unable to comply or may follow the instruction incorrectly. We draw an important distinction between lower-level decision actions in the former case (low uncertainty) and higher-level decision actions in the latter case (high uncertainty and risk). Tasks with higher levels of uncertainty should be constrained to lower levels of automation of decision and action selection.
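This closing guideline, that greater uncertainty in a task's consequences should lower the ceiling on decision and action selection automation, can be sketched numerically. The 10-point scale echoes classic levels-of-automation scales, but the linear mapping and cut-offs below are our assumptions, not the report's:

```python
def automation_ceiling(consequence_uncertainty: float) -> int:
    """Map uncertainty in a task's consequences (0 = fully predictable,
    1 = highly uncertain) to a maximum permissible automation level (1-10).
    The linear interpolation is an illustrative assumption."""
    if not 0.0 <= consequence_uncertainty <= 1.0:
        raise ValueError("uncertainty must lie in [0, 1]")
    # Zero uncertainty permits level 10; full uncertainty permits only level 1.
    return round(10 - 9 * consequence_uncertainty)

# Handing off an aircraft: consequences predictable, high ceiling.
print(automation_ceiling(0.1))
# Transmitting a clearance: pilot compliance uncertain, low ceiling.
print(automation_ceiling(0.9))
```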

The concluding chapter of the Phase I report examined the characteristics of automation in the current national airspace system. Several aspects of human



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.