this in two parts: (1) a discussion of the nature of the Y2K risk and the relationship between that risk and the responses it engendered, and (2) the drawing of lessons from Y2K for ongoing management of ICT risk and vulnerability.

Understanding the Relationship Between Y2K Risk and Response

During Y2K, organizations had great difficulty clarifying the risks they faced, and this had a major impact on the nature of their response to those risks. One striking aspect of the Y2K response was a sharply reduced tolerance for risk.

Reduced Risk Tolerance

As discussed in Chapter 2 (in particular, see Section 2.2), much of the difficulty in clarifying the risk to ICT from Y2K stemmed from issues involving the complexity of ICT systems and their environments. This complexity contributed to uncertainties about the nature of the Y2K problem itself. “The problem wasn’t understood. We had to assume that we would be operating in uncertainty” (374th AW/LG). In turn, uncertainties surrounding the nature of Y2K led to uncertainties about the threat to critical systems and the operations they supported. “People couldn’t quantify the risk” (AFY2KO).

Faced with an uncertain threat to highly critical operations, organizations significantly reduced their tolerance for Y2K risk, in some cases to zero. This occurred “not just within the Air Force but DOD [Department of Defense] wide, and probably even government wide” (AMC/HQ). The difficulty of quantifying a complex, multifaceted problem with a fixed deadline, coupled with this sharply reduced risk tolerance, produced a broad effort to eliminate as much Y2K-related risk as possible, wherever it appeared. “If you could identify a problem you had to fix it. If you could theorize a problem you had to go after it” (AMC/SCA).

This broad and exhaustive effort led to frustration among those ICT managers who saw a need to distinguish specific Y2K threats by their likelihood and criticality. For example, the AMC/SCA “instituted a review board to have the programs in a technical sense try to defend why they should be certified. …[The board probed] the changes and how the program worked to determine the probability from the technical standpoint that they’d missed something or created some other problem…” (AMC/SCA). In other words, based upon an understanding of its programs and of the nature of the problem, the board could have tailored its response to the specific risks, but because senior leadership would accept nothing short of zero risk, it had to apply a blanket, zero-risk response. “I found it very difficult to explain [differing levels of risk] to more senior leadership. … I can’t tell you how much time I could have saved. They basically said ‘Any risk at all, forget it’” (AMC/SCA).

Most ICT managers had never operated under a policy of zero risk tolerance, and they saw it as inappropriate for their situation. “Because of the atmosphere of paranoia, any kind of information that appeared would generate [exaggerated responses]….” If


The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.