tive engineering has grown up around the practical need to understand and improve quality, safety, and efficiency in high-impact, complex domains, such as aviation, medicine, and nuclear power control, where poorly designed systems can lead to major accidents. In fact, the field emerged largely in response to accidents in large, complex systems that had seemingly nothing to do with the design of those systems.
For example, suppose a highly trained cockpit crew flies a perfectly good plane into the side of a mountain. This loss of “situational awareness” has in fact happened often enough that the aviation industry has coined a term for the phenomenon: “controlled flight into terrain” (CFIT). In these cases, there was no mechanical failure, and the computers and other automation, processes, and so on worked as designed. A first response as to the cause of such accidents is often “human error.”
However, studies of the causes of accidents, and even of the day-to-day activities of people using systems designed for them, reveal that many systems are poorly designed to begin with. The computers, automation, and other engineered processes (such as procedures, handoffs during shift changes, logbooks, regulatory requirements, and other means of passing information among people and computers) have weak spots, and, if certain events co-occur at those points, they can, collectively, cause failure.
Ironically, people working day-to-day in such systems often see these failure modes (although they may not think of them in that way) and create “workarounds,” such as placing sticky notes to remind themselves what to do or not do, or they develop an almost “intuitive” sense of how to react if and when things start to go wrong. These workers are important sources of knowledge, yet they are often overlooked by engineers, who may not have been trained to gather such information well. This is where cognitive engineers excel.
Cognitive engineers not only interview and observe end users but also examine the intrinsic relationships and requirements of a task. For example, for air-traffic controllers, it is a fact that multiple planes are moving at varying (but constrained) speeds and altitudes, and these facts cannot be “simplified away.” But one can often take advantage of such constraints in system knowledge when designing workflow, representations, and other aspects of decision-support systems.
Cognitive engineering is a subspecialty of the broader field known as ergonomics. When most people think of ergonomics, they think of physical changes to a product or tool to make it “fit” better physically. An early example of this was the design of the Reach toothbrush (Hill and Kreifeldt, 1979), which spawned a whole new field of “toothbrush design”—“mouth-friendly” technology designed to improve the task of cleaning teeth as compared to what could be achieved with the straight-handled, rectangular-shaped toothbrush that was then the norm.
Both physical and cognitive ergonomics are important, and the same environment or system can be analyzed and improved upon from both perspectives. Table 1 provides a few examples of how a cognitive ergonomist and a physical