behavior, so that if and when a failure occurs, it is not detected. Conversely, an automated tool that is highly accurate and useful may nevertheless not be used if the controller believes that it is untrustworthy.

Attributes of Trust

Trust has multiple determinants and varies over time. Clearly, one factor influencing trust is automation reliability, but other factors are also important. The most relevant determining factors are characterized below:

  1. Reliability refers to the repeated, consistent functioning of automation. It should also be noted that some automation technology may be reliably harmful, always performing as it was designed but designed poorly in terms of human or other factors; see the discussion of designer and management errors later in this chapter.

  2. Robustness of the automation refers to the demonstrated or promised ability to perform under a variety of circumstances. It should be noted that the automation may be able to do a variety of things, some of which need not or should not be done.

  3. Familiarity means that the system employs procedures, terms, and cultural norms that are familiar, friendly, and natural to the trusting person. But familiarity may lead the human operator into certain pitfalls.

  4. Understandability refers to the sense that the human supervisor or observer can form a mental model and predict future system behavior. But ease of understanding parts of an automated system may lead to overconfidence in the controller's understanding of other aspects.

  5. Explication of intention means that the system explicitly displays or says that it will act in a particular way—as contrasted to its future actions having to be predicted from a model. But knowing the computer's intention may also create a sense of overconfidence and a willingness to outwit the system and take inappropriate chances.

  6. Usefulness refers to the utility of the system to the trusting person in a formal theoretical sense. However, automation may be useful, but for unsafe purposes.

  7. Dependence of the trusting person on the automation could be measured by observing the controller's consistency of use, by a subjective rating scale, or both. But overdependence may be fatal if the system fails.
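Item 7's two measures of dependence can be combined into a single score. The sketch below is purely illustrative: the function names, the 0-to-1 scales, and the weighted-average combination are assumptions for demonstration, not measures proposed in this chapter.

```python
# Hypothetical sketch: quantifying dependence on automation (item 7 above)
# from the two sources the text mentions -- observed consistency of use and
# a subjective rating. The weighting scheme is an illustrative assumption.

def consistency_of_use(opportunities: int, uses: int) -> float:
    """Fraction of opportunities in which the operator actually engaged
    the automation (observed behavioral dependence)."""
    if opportunities == 0:
        return 0.0
    return uses / opportunities

def dependence_score(opportunities: int, uses: int,
                     subjective_rating: float, weight: float = 0.5) -> float:
    """Combine observed use with a subjective rating, both on a 0-1 scale.
    `weight` balances behavioral versus self-reported dependence."""
    behavioral = consistency_of_use(opportunities, uses)
    return weight * behavioral + (1 - weight) * subjective_rating

# Example: automation engaged in 45 of 50 opportunities; the controller
# rates their own reliance at 0.8 on a 0-1 scale.
print(round(dependence_score(50, 45, 0.8), 2))  # 0.85
```

A single score like this could be tracked over time; a score near 1.0 despite known automation failures would be one signal of the overdependence the item warns about.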

Overtrust and Complacency in Failure Detection

If automation works correctly most of the time, the circumstances in which the human will need to intervene because automation has failed are few in number. We can liken this process to a vigilance monitoring task with exceedingly

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.