5
Application of Risk Analysis as a Basis for Prioritizing Needs

An essential problem faced by all Department of the Navy (DON) organizations with responsibilities for information assurance (IA) is how to make the inherently complex trade-offs between satisfying IA objectives and all other mission objectives. The committee believes that mission risk analysis is the appropriate foundation for IA trade-offs related to investment and system design choices. The committee has further found that the current naval efforts to apply mission risk analysis relevant to IA issues are limited and inadequate, given the magnitude of the challenge currently faced. The information assurance posture of the Navy and Marine Corps should be based on the need to maintain mission assurance at levels of risk commensurate with those accepted from other threat sources. That is almost certainly not the case today.

Risk is measured by the consequences of things that go wrong and the corresponding likelihoods of occurrence. When consequences can be extreme, the likelihood of occurrence needs to be virtually eliminated. A rigorous mission risk analysis of information assurance issues is likely to lead to a better understood and more rational set of investment and system design priorities, some of which are outlined below as recommendations. As the Navy moves to network-centric concepts of operations (CONOPS) for its fundamental missions, its overall level of mission assurance is increasingly determined by its level of information assurance and dependence. At the macro level, it is evident that electronic information system attacks can potentially provide a relatively low cost and efficient way for adversaries to reduce the effectiveness of naval warfighting capabilities. Thus, the information assurance posture and the architectural choices in DON systems should be exposed to thorough risk analysis, in the same manner that other



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
mission-critical elements of naval systems and CONOPS are regularly exposed to mission risk analysis for more conventional threats.[1]

The committee believes that the most important area of emphasis in risk analysis in the near term should be mission-level risk created by known vulnerabilities of the entire DON network system of systems. Navy personnel interviewed and threat documents reviewed as part of the committee’s deliberations indicate that the DON’s current network architecture has significant vulnerabilities.[2] Although many of the vulnerabilities have mitigations and backups, there appears to be limited evaluation of the threat posed to operational missions by the threats, vulnerabilities, and mitigations as a whole. The best Navy work in the application of information risk analysis appears not to have been shared outside the organization that sponsored it. Instead, risk analysis seems to be “stove-piped,” without any group forming a comprehensive picture for naval forces as a whole. As a consequence of the lack of mission-level understanding of risk, architectural choices are being made that, under certain scenarios, could even make the situation worse.

OVERVIEW AND BACKGROUND OF RISK ANALYSIS

The committee recognizes that information assurance goals and other goals for naval information systems will often be in conflict. Assurance is often expensive, and more strongly assured systems may require compromises in other areas. For example, technologies that provide opportunities for greater levels of integration and consolidation of information system functions also provide opportunities for individual exploits to have greater impact. The process of trading among competing goals is difficult because their linkages are typically complex. However, this is not the first time that such complex choices have been faced. Many other aspects of Navy system design and architecture are likewise complex and force

[1] General guidance and best practices on cybersecurity risk analysis and risk management for federal information technology systems are found in special publications by the Interagency Working Group, the Joint Task Force Transformation Initiative, and the Computer Security Division at the National Institute of Standards and Technology (NIST). See NIST special publications (1) Gary Stoneburner, Alice Goguen, and Alexis Feringa, 2002, Risk Management Guide for Information Technology Systems, No. 800-30, National Institute of Standards and Technology, Gaithersburg, Md., July; (2) Ron Ross, Marianne Swanson, Gary Stoneburner, Stu Katzke, and Arnold Johnson, 2004, Guide for the Security Certification and Accreditation of Federal Information Systems, No. 800-37, National Institute of Standards and Technology, Gaithersburg, Md., May; and (3) Ron Ross, Stu Katzke, Arnold Johnson, Marianne Swanson, and Gary Stoneburner, 2008, Managing Risk from Information Systems: An Organizational Perspective, No. 800-39, National Institute of Standards and Technology, Gaithersburg, Md., April.
[2] During the course of the study, the committee held discussions with a wide range of naval personnel, including not only those responsible for Navy and Marine Corps network defense, naval intelligence, and naval network architecture and system design, but also personnel responsible for network IA architecture and defense at the National Security Agency and the Defense Information Systems Agency.

trade-offs among attributes related in complex ways. Before other examples are provided showing where risk analysis in complex circumstances has been accomplished, consider the following limited subset of information assurance cases that introduce complex trade-offs between assurance and other desired system characteristics:

1. Ships are moving to consolidate their onboard networks physically onto a single, shared medium.[3] This consolidation promises to reduce costs, help reduce manning, and facilitate information sharing. However, it also introduces new IA vulnerabilities (especially with regard to denial of service) that do not exist in architectures where different networks are hosted on physically distinct communications media.

2. Resilience to denial-of-service attacks is facilitated by having a large number of Internet exchange points and by having large spare capacity. The current IA posture of the Navy, however, is to reduce the number of Internet exchange points (to facilitate monitoring), and it does not consider maintaining a greater number of exchange points for spare capacity to be cost-efficient on a day-to-day basis.

3. Guided by Department of Defense (DOD) directives, the Navy is moving strongly toward a “monoculture” of operating systems and applications by standardizing desktops. The greater the consistency of software configuration, the easier it is to patch and to provide assurance against the day-to-day vulnerabilities that appear on the Internet. However, a software monoculture is also at far greater risk for catastrophic collapse induced by an attack specifically crafted to that common configuration. Is the benefit of reduced day-to-day local disruption worth the increased global risk from a sophisticated adversary?

These three examples are typical of the information assurance trade-offs requiring risk analysis at the mission level. The committee has not been presented with evidence that such analyses have been conducted comprehensively or outside of individual DON organizations.

PAST NAVY MISSION RISK ANALYSIS CONSEQUENCES

While the information assurance area may be lacking with regard to mission risk analysis, this process is not new to the Navy, and its value has been well proven. The Navy’s standard practice is as follows:

• Drive system architectural choices based on desired mission capabilities,
• Recognize the threats that adversaries pose to those missions, and

[3] For example, the Navy’s Consolidated Afloat Networks and Enterprise Services (CANES) program, described in Chapter 2 of this report, is designed to reduce and consolidate naval afloat network systems.

• Flow-down the identified threats, with their corresponding risks, into decisions regarding individual system developments, including both technological and operational solutions.

Consider the following non-IA examples of this standard practice:

• The Aegis system is a response to the threat of air and missile attacks on the carrier battle group. Aegis was designed to carry out a basic missile defense CONOPS, but it was also designed with numerous backup operational concepts in recognition of the likelihood that threats would partially succeed and partially degrade the system. Individual components of Aegis are designed and tested against a wide range of defined threats, both kinetic and electronic. The trade-offs involved in the design of Aegis are complicated, and include cost, operability, and impact on seaworthiness as well as mission effectiveness, but the threat and operational scenarios are the foundation for system design trade-offs.

• The existing core of secure communication systems (e.g., Military Strategic and Tactical Relay satellite and data links) was designed to mitigate the risks of known electronic attack capabilities. Those secure systems are integrated with a mission CONOPS for operating in a degraded communications environment, and for operating on the basis of objectives for—and an analysis of—achieving a mission-level capability.

What is different today is the extent and speed with which the pace of change in day-to-day operations has combined with the pace of change in information technology (IT) to lead to wholly new systems and CONOPS that have not been exposed to traditional risk analysis from an IA perspective. Navy logistics has increasingly moved to Internet applications, along with critically important joint and coalition operations support activities (e.g., U.S. Air Force tanker operations). Supporting welfare and morale for Marines and sailors now means providing Internet access on workstations used for day-to-day work and onboard ships. The current operational threat environment contains extensive, active, low-end attacks encountered daily, but perhaps with much higher-end (but largely invisible) threats developing in the background.

RISK ANALYSIS AND INFORMATION ASSURANCE IN THE FIELD

Risk analysis is widely, but inconsistently, used in government and industry. During its deliberations, the committee was exposed to dramatically different levels of rigor and completeness in the descriptions of how different organizations use risk analysis to drive their architectural choices. All of the naval organizations interviewed by the committee were aware of risks and the application of risk analysis, but the degree to which they adopted risk analysis and used it to drive design and architectural choices was highly variable.
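The chapter’s framing of risk as consequence weighted by likelihood can be made concrete with a small illustrative sketch. Everything in it is hypothetical: the scenario names, likelihoods, and impact scores are invented for illustration and are not drawn from any DON system or from the committee’s data.

```python
# Illustrative sketch: ranking hypothetical threat scenarios by expected
# mission impact (consequence weighted by likelihood). All names and
# numbers below are invented solely for illustration.

scenarios = [
    # (name, annual likelihood estimate, mission-impact score 0-100)
    ("Commodity malware outbreak on unclassified hosts", 0.90, 10),
    ("Denial of service at consolidated exchange points", 0.30, 55),
    ("Tailored attack on standardized desktop image",     0.05, 95),
    ("Insider alteration of logistics data",              0.10, 70),
]

def expected_risk(likelihood: float, consequence: float) -> float:
    """Expected mission impact: consequence weighted by likelihood."""
    return likelihood * consequence

# Rank scenarios from highest to lowest expected mission impact.
ranked = sorted(scenarios, key=lambda s: expected_risk(s[1], s[2]), reverse=True)

for name, p, c in ranked:
    print(f"{expected_risk(p, c):6.2f}  {name}")
```

Note what the sketch exposes: a pure expected-value ranking places the catastrophic but unlikely monoculture attack last. The chapter’s caution that extreme consequences must have their likelihood driven toward zero argues for treating such scenarios separately rather than relying on expected value alone.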

The most notable deficit is the lack of a set of common threat and operational scenarios shared across the relevant DON organizations. While combatant commands are conducting mission risk analyses for their assigned operational plans, the same scenarios were not being used by type commands and acquisition organizations. In fact, the risk scenarios that appeared to receive the greatest attention at the type commands and research organizations were quite different from the combatant command scenarios.

It might be argued that rigorous, mission-focused risk analysis is too complex at the mission level for the Navy and Marine Corps. However, the industry presenters to the committee demonstrated that quite complex analyses, and implementation of their consequences, are carried out routinely in some segments of commercial industry. An especially good example was provided by Citigroup, Inc.[4] Citigroup’s information security program is driven by a regular review of all identified risk scenarios (a total of approximately 15,000 business processes that are supported by information infrastructures that expose the possibility for money losses). Information assurance investments in information infrastructure are made in response to the identified risks, and the money-loss use cases are used to verify and red-team each new application. The risk analysis approach was coupled with a set of IA principles, for example, rigorously isolating every Internet-facing application. The example of Citigroup demonstrates that thorough and ongoing risk analyses can be conducted at multiple levels of abstraction (the Citigroup method extends from business processes to network design) at complexity levels comparable to those needed in the DON.

Another commercial application of risk analysis that was presented to the committee involves Verizon’s use of forensic analysis to determine on a historical basis (for about 500 actual successful cyberattacks) the relative effectiveness of alternative solutions.[5] For example, the analysis presented to the committee indicated that additional improvements in system administration offered greater assurance than did additional advances in patch processing. The Navy did not have comparable efforts to understand the relative value of solutions based on actual historical case analyses.

While all of the Navy organizations that made presentations to the committee conducted some form of risk analysis, these analyses varied widely, were typically qualitative, and were limited by the scope of decision-making authority of the organization conducting the analysis. Because no substantial operations plan

[4] Mark Clancy, Executive Vice President, IT Risk and Program Management, Citigroup, “Information Assurance: Financial Institution Perspective,” presentation to the committee, July 17, 2008, Washington, D.C.
[5] Peter Tippett, Vice President, Research and Intelligence, Verizon Security Solutions, “2008 Data Breach Investigations Report,” presentation to the committee, July 18, 2008, Washington, D.C. A public copy of the report is available at . Accessed March 16, 2009.

(OPLAN) can be conducted within the command purview of a single command, there was a significant lack of mission-level risk analysis to guide decisions.

POSSIBLE NEW APPROACHES

While the problem of analyzing investment priorities for information assurance is difficult, it is no more complex than other Navy investment problems, although the threats and responses are prolific and less familiar to Navy leaders. As in other investment cases, threat analysis and mission scenarios (including the threat scenario) should be the foundation, and they should be coupled with metrics for mission effectiveness. Two nonexclusive approaches to mission risk analysis with respect to information systems were discussed above—operations analysis and incident analysis. In practice, both should be used. Both are reviewed briefly before discussing applications to DON missions.

Moderate extensions of conventional operations analysis methods should allow quantification of many (if not all) IA issues. The components of a conventional operations analysis model should include the following:

• An operations plan for representative conflict scenarios. Each combatant command has several such scenarios.
• A threat model, which for the purposes here should include electronic threats as well as cyberthreats to information capabilities. These should be standard threats, vetted through the naval intelligence and IA technical communities.
• Operational effectiveness metrics, whose associated models should include the dependencies on network-centric capabilities.

Such mission risk analyses are carried out by a variety of DON groups. A particular example discussed with the committee was a communications risk analysis carried out by the Pacific Fleet showing the impact of reduced communications capabilities on a particular operations plan, including a variety of possible threat scenarios.[6] However, the committee did not find that these scenarios and analyses are currently being used across multiple DON organizations. While a particular combatant command may have worked out risk analyses for its OPLANs, the threat scenarios and analyses of the consequences are often not used by the type commands and others who supply critical services on which the combatant commands depend for execution. For operations analysis to be effective, it must be shared across all of the supplying and consuming stakeholders.

An alternative form of risk analysis is incident analysis and system monitoring, in which the risk to an organization’s system is assessed by seeing what

[6] Robert Stephenson, Chief Technology Officer, C4I Operations, Space and Naval Warfare Systems Command, “Maritime Communication Systems (CS) Vulnerabilities Assessment,” presentation to the committee, July 18, 2008, Washington, D.C.

the organization already experiences in the threat environment in which it is immersed. Statistically based incident analysis is possible for information systems in a manner that is not feasible for platforms and weapons systems, because information systems are subjected to continuous cyberattacks; that is, while DON platforms are rarely attacked in normal operations, DON networks are continuously attacked. For incident analysis to be effective, it needs to have the following characteristics:

• It needs to be rigorous, in that incidents are followed up to discover root causes and their relationship to the efficacy of possible new preventions. In practice, it was not apparent to the committee that this is being systematically done for incidents on DON networks.
• It needs to be representative, in that the incidents tracked need to be similar to those that can potentially pose high-impact threats to mission success.

FINDINGS AND RECOMMENDATIONS

MAJOR FINDING: The Navy has not comprehensively translated adversary capabilities into risk analysis assumptions or into an operational threat, and it does not routinely share the risk analyses and threat models that exist across the various Navy and Marine Corps organizations that have responsibility for information assurance.

Based on the information briefed to the committee, there does not appear to be adequate emphasis on understanding how adversaries intend to or could use their capabilities and DOD network vulnerabilities to disrupt naval operations. As discussed previously, all of the risk analyses briefed to the committee were narrow in scope, were restricted to a single DON organization, and were not shared with other organizations with related responsibilities for information assurance. A few of the analyses did address mission risk, such as that provided by the Pacific Fleet, but they were restricted to limited threat types and did not include mission-level effectiveness metrics. The other risk analyses were technical, in that they analyzed the risks to a single system or platform but did not extend to include the mission impacts. The heart of mission risk analysis is an understanding of the adversary concept of operation and operational objectives and of the adversary capabilities available to achieve those objectives (a threat model). The DON understanding of adversaries’ doctrine, CONOPS, objectives, and capabilities with respect to information assurance appears to be very limited.

MAJOR RECOMMENDATION: The Director, Naval Intelligence, in collaboration with the Defense Intelligence Agency and national intelligence organizations, should support cyber risk analysis by collecting and analyzing all-source intelligence to improve the Department of the Navy’s understanding of

adversaries’ mission intent, strategy, and tactics and to illuminate how these could impact the ability of the Navy and Marine Corps to accomplish their missions and objectives.

Some of the consequences of this recommendation are discussed in the sections that follow. An additional finding, elaborating on the original finding from the committee’s letter report, concerns the need to make risk analysis not only realistic (an intelligence issue) but also shared. Multiple organizations have responsibility for IA, and the information assurance capabilities likely to be needed cannot be achieved without unity of effort. Achieving unity of effort requires that the risk picture be shared.

FINDING: The extent and fast rate of change in day-to-day operations have combined with the pace of change in information technology to lead to wholly new systems and concepts of operations that have not been exposed to IA risk analysis.

During its deliberations, the committee was exposed to dramatically different levels of rigor and completeness in the descriptions of how different organizations use risk analysis to drive their architectural choices. Navy and Marine Corps organizations generally recognize the importance and role of mission risk analysis, but they typically conduct such analyses only qualitatively, and the analyses are limited by the scope of each organization’s decision authorities.

RECOMMENDATION: Threat and risk analysis, specifically including adversary concepts of operations and operational capabilities, should be shared across the many Navy and Marine Corps organizations with significant dependencies on information assurance. Standard scenarios and measures of effectiveness should be used by organizations responsible for information assurance.

The consequences of the recommended risk analyses should be reconciled across the Navy and Marine Corps organizations responsible for information assurance. Responsible organizations should make trade-offs related to information assurance based on the shared risk analyses.

Information Assurance Risk Considerations

Ultimately, information assurance risks to individual systems and subsystems are relevant only if they project into important mission risks—that is, if the threat can potentially prevent the Navy and Marine Corps from accomplishing their assigned missions or cause casualties during the execution of those missions. Risk analysis for information assurance should, therefore, be founded on mission risk analysis and not on risk analyses tied to individual systems or technological components. The committee believes that a risk-based information assurance strategy, once developed from mission risk analysis, will lead to an integrated set of solutions, including the following:

• Development of resilient systems, with the ability to “fight through” disruptions as a core design characteristic. Just as Navy ships are designed for damage control and the ability to fight through damage, Navy networks and information systems should be designed to fight through disruptions, with graceful degradation. In furtherance of the fight-through strategy, war games should normally include risk-based disruption scenarios. Among those normally exercised should be these:
— Large-scale jamming or loss of satellite communications, eliminating dependable use of this channel of communications over various mission-critical time intervals;
— Complete denial of service to unprotected Internet/Non-Classified Internet Protocol Router Network (NIPRNet) systems over extended time intervals;
— Deceptive operations on Internet/NIPRNet-connected systems; and
— Insider-enabled attacks that either deny service or disrupt or alter information on what are otherwise protected networks.

• A risk-based determination of the degree of isolation of various information functions, so as to control the potential for attacks generated through one function to impact another integrated function. The current naval information system environment is very large and quite complex, and it comprises systems at every level of criticality, from recreational functions to real-time ship and weapons control. In general, the distinctions in criticality are recognized in the defensive posture. For example, networks hosting obviously critical functions are separated either logically or physically from those hosting less-critical functions. However, the degrees of separation and the levels of monitoring applied to each need to be determined by a rigorous process of mission-by-mission risk assessment. As an example, functions of greatly differing criticality (recreation, logistics, tanker operations, and coalition communications) are currently hosted on Internet/NIPRNet-connected networks and monitored at essentially the same level. Consequently, should problems occur, the mechanisms available for partial fallbacks are very limited and nonselective.

• Undertaking risk analyses of the mission impact of extensive denial and/or deception attacks on Internet-hosted applications. If the mission impact of losing or compromising certain Internet-hosted applications is much greater for some applications than for others (as is almost certain), then the Navy needs to take measures to provide assurance for those applications consistent with their mission risk. Such measures might include the strict separation of nonofficial functions to alternative infrastructure (e.g., laptops and wireless local area networks), pervasive use of secure protocols among applications, concentration of monitoring on subnetworks, and sandboxing[7] of the most exposed applications.

[7] “Sandboxing” is a term used to describe the use of security mechanisms to isolate and control the potential spill of exploits from an untrusted program or system.

106 INFORMATION ASSURANCE FOR NETWORK-CENTRIC NAVAL FORCES • Extension of counterintelligence-oriented and discovery-oriented monitor- ing on operational Navy networks. Current monitoring schemes strongly empha- size looking for known signatures rather than discovering previously unseen signatures. Since it is likely that high-impact attacks on Navy networks will be customized and will appear nowhere else, the Navy cannot rely solely on com - mercial signature databases to monitor for attacks. It needs an active discovery effort to identify attacks that may appear nowhere else. • More innovative and integrated approaches for bringing intelligence analysis to bear on critical near-term decisions. Intelligence support to informa- tion assurance assessment is similar to intelligence support for other areas of threat assessment in many ways, but in important respects it is different. The committee finds that current intelligence support must be expanded to support IA needs. To elaborate on the committee’s recommendation that intelligence collection and analysis must be expanded to support the impact of information assurance (and failures thereof) on mission success, the following conclusions about such intel- ligence collection and analysis activities are noted: — Intelligence assessments must address adversarial doctrines as well as capabilities, particularly with respect to the use of enemy information operations to steal or manipulate data versus posturing for disruption in conflicts. A large naval investment in pervasive encryption could exten- sively thwart adversary intentions in data theft, but it would have only limited effect on preventing disruptions. Since it is unlikely that any col- lection can be effected that will conclusively resolve adversary intentions, the approach needed must combine collection, analysis, and examination in war games. 
The outcome should be one, or several, adversary concepts of operation that can be shared across the organizations responsible for information assurance. — Since adversaries seek to conceal their capabilities, there is always uncer- tainty about these capabilities. As in other weapons system fields, esti - mates of capabilities must be made. The established and effective method for doing so is to invite teams of knowledgeable science and technology specialists to imagine their own approaches to the adversary concept of operation, adjusted by estimates of adversary technical capabilities. This type of activity should be conducted for threats to DON information sys- tems, and the results should be used to prepare a threat estimate that can be shared across the organizations responsible for information assurance. — An important difference between adversary information operations capa - bilities and kinetic capabilities is that the information capabilities are being exercised daily. The best signatures of information operations capabilities are not available through remote observation; they are on operational net- works. Thus, network security and operations should be integrated with intelligence collection. Also, sensors deployed on operational networks should be selected for intelligence value and not just for their ability to pro-

OCR for page 97
107 APPLICATION OF RISK ANALYSIS vide current security. Intelligence collectors and analysts should be using the backdrop of adversary activities to improve threat models and chal- lenge assumptions. As an example of an investigation activity that could be conducted, the committee found that little effort has been expended on estimating the scope of unobserved threats. It is impossible to measure the unobserved exactly, but a variety of methods could be used to estimate the extent of unobserved threats and bound their scope. Several examples of these methods are provided in Table 5.1. — Information assurance, at the mission level, can be provided with a mix- ture of defensive and active capabilities. In principle, active and offensive methods can significantly enhance network defense, but the scenarios in which they would be effective and ineffective are not known. The com- mittee found that no truly integrated approach to analyzing active and passive methods, or offensive and defensive methods, has been developed. Although the possibilities for synergy are real, there are also possibilities for antagonism, and there appears to be no comprehensive effort being made to disentangle the issues. In the absence of an integrated understand- ing, it is speculative to make investments in active or offensive methods in the hopes that the defensive posture will be improved. For active and/or offensive techniques to be incorporated into the defensive information assurance strategy, intelligence collection on cyberthreats must be greatly improved, shared, and deeply integrated into the operational plans. — A great deal of effort is being expended in defending against and clean - ing up after less sophisticated attack vectors. The volume of these attacks and their occasional effectiveness are a concern. The amount of effort expended on monitoring and cleanup of these attacks detracts from detect- ing more sophisticated, and likely more important, attacks. 
It also leads to a production-line attitude in which counts of unimportant attacks can serve to mask the lack of effort expended against sophisticated threats. The Navy should consider modifications to the Internet-exposed portions of the infrastructure that provide fundamental protections against common attack vectors (e.g., strong e-mail authentication to block many types of spear-phishing attacks).

— In developing the recommendations for organizational responses, the committee considered currently observed threats, threats that can reasonably be surmised and modeled from intelligence and technology, and unknown threats. With respect to the first two categories, it urges that risk analysis be conducted using a documented set of operational scenarios that include both known and estimated threats. With respect to unknown threats, the committee recommends ongoing science and technology research and an approach to monitoring that moves beyond the search for known signatures and uses techniques that can detect previously unknown attack vectors. With respect to each of these issues, it is of great importance that the results be shared across the DON and DOD organizations responsible for information assurance, and that those efforts be coupled with a strong intelligence collection and analysis activity targeted at helping to improve the ability to predict future threats.

TABLE 5.1 Example Use of Intelligence Collectors to Aid Network Defense

Use of honeypots^a
• Configure honeypots behind the network's standard protection, but with solidly patched configurations, and observe whether they are compromised (or how long compromise takes).
• Run a honeypot with a Web crawler configured to visit high-risk areas and observe what happens.
• Forward suspicious e-mail to a honeypot with a program that automatically follows the links in the e-mail and observe whether the honeypot is compromised.
• Embed some directories and files in the regular servers, placed and configured so that no legitimate user will have access to them. Observe whether the files are accessed, and if so, use careful monitoring to determine who accessed them.
• Run banks of honeypots with tools to detect anything that changes, to look for zero-day exploits. Consider running banks of honeypots in this way, with different protection levels, to see which are compromised or get "owned."

Use of deliberate attack
• Run attacks against the target systems that need to be defended. Originate the attacks from a location outside the network using known attack vectors. If the blockage rate is less than 100 percent, then the number of known attacks can be logged directly to estimate the number of known attack types that successfully penetrate the system defenses.
• Run attacks against the target systems that need to be defended. Again, originate the attacks from a location outside the network, but using methods that have not been observed being used against them previously. If the deliberate attacks get through the defenses 100 percent of the time, the system is demonstrably still vulnerable. This method will not prove whether or not these specific types of new attacks are actually occurring, but it does place bounds on the ability of the system to detect. If attacks get past system protections between zero and 100 percent of the time, the logs can be used to estimate the frequency of a specific attack.

Combined use of honeypots and controlled attacks
• Combine the use of honeypots and controlled attacks to see if the controlled external attack triggers the honeypot. Ideally, the attacks will cause activity in the honeypots, even when the operational detection system does not detect them. If the attacks get past system protections between zero and 100 percent of the time, the logs can be used to estimate the frequency of a specific attack.

NOTE: The "intelligence collector" examples in this table demonstrate methodology analogous to "defect seeding" and "tag-and-release" counting methods, sometimes used in quality-assurance protocols. None of these examples provides exact count measures, but they do provide estimates based on real data. Extensive guidance on the use of honeypots is provided in cyberdefense publications, such as those found at . Accessed November 14, 2008.

^a "Honeypots" are defined as "closely monitored network decoys serving several purposes: they can distract adversaries from more valuable machines on a network, they can provide early warning about new attack and exploitation trends and they allow in-depth examination of adversaries during and after exploitation of a honeypot." Definition source: . Accessed February 21, 2009.

Cost Issues

The potential cost of enhanced information assurance measures and the corresponding value provided can only be assessed when risk analysis comprehensively addresses mission risk. The DON justifies very large expenditures on platforms and weapons systems precisely because their absence is estimated to place missions of great national security importance at risk. The committee believes that, increasingly, failures of information assurance will have the same large impacts on mission performance and so will justify equivalent prioritization. However, such a conclusion must be based on comprehensive evaluations of mission risk that do not yet exist.
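The chapter's framing of risk as the product of likelihood and consequence, aggregated across a documented set of operational scenarios, can be sketched in a few lines of code. The sketch below is purely illustrative: the scenario names, likelihoods, consequence scores, and the kinetic-risk baseline are assumptions invented for the example, not figures from the committee's analysis.

```python
# Illustrative sketch of scenario-based mission risk aggregation.
# All scenario names, probabilities, consequence scores, and the
# baseline value are hypothetical assumptions, not report data.

def expected_risk(scenarios):
    """Aggregate mission risk as the sum of likelihood x consequence."""
    return sum(likelihood * consequence
               for _name, likelihood, consequence in scenarios)

# (scenario, assumed annual likelihood, mission consequence on a 0-100 scale)
ia_scenarios = [
    ("spear-phishing foothold on an operational network", 0.30, 40),
    ("supply-chain implant in a C2 system",               0.02, 95),
    ("insider exfiltration of operational plans",         0.05, 70),
]

# Assumed risk level accepted from other (kinetic) threat sources.
kinetic_baseline = 15.0

ia_risk = expected_risk(ia_scenarios)
print(f"IA mission risk: {ia_risk:.1f}")  # 0.30*40 + 0.02*95 + 0.05*70 = 17.4
if ia_risk > kinetic_baseline:
    print("exceeds accepted baseline: prioritize IA investment")
else:
    print("commensurate with risk accepted from other threat sources")
```

The comparison against a baseline mirrors the report's recommendation that information assurance be held to levels of risk commensurate with those accepted from other threat sources; in practice the inputs would come from the comprehensive mission-risk evaluations the committee notes do not yet exist.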
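The "tag-and-release" counting analogy noted under Table 5.1 can likewise be made concrete. One standard mark-recapture formulation is the Lincoln-Petersen estimator: treat attack types seen by the operational detection system and attack types that triggered honeypots as two overlapping samples, and use the size of the overlap to estimate the total population of attack types, observed or not. The sketch below is a minimal illustration under that analogy; the detection sets and labels are invented for the example.

```python
# Minimal Lincoln-Petersen (mark-recapture) sketch for bounding the
# number of unobserved attack types. Detection sets are hypothetical.

def lincoln_petersen(sample_a, sample_b):
    """Estimate total distinct attack types from two overlapping samples."""
    overlap = len(sample_a & sample_b)
    if overlap == 0:
        raise ValueError("no overlap: samples too small to estimate")
    return len(sample_a) * len(sample_b) / overlap

ids_detections      = {"A", "B", "C", "D", "E", "F"}  # 6 types flagged by sensors
honeypot_detections = {"D", "E", "F", "G"}            # 4 types that hit honeypots

total_estimate = lincoln_petersen(ids_detections, honeypot_detections)
observed = len(ids_detections | honeypot_detections)  # 7 types actually seen

print(f"estimated total attack types: {total_estimate:.0f}")        # 6*4/3 = 8
print(f"estimated unobserved types:   {total_estimate - observed:.0f}")  # ~1
```

As the table's note cautions, none of these methods gives an exact count; the value of the estimate is in bounding the scope of unobserved threats from real data rather than leaving it entirely unquantified.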