3
Investing in Information Technology Research

This chapter describes the shape of a strategic research and development (R&D) program with respect to information technology for counterterrorism. However, it should be noted that this program has broad applicability not only for efforts against terrorism and information warfare but also for reducing cybercrime and responding to natural disasters. While the scope and complexity of issues with respect to each of these areas may well vary (e.g., a program focused on cybercrime may place more emphasis on forensics useful in prosecution), the committee believes that there is enough overlap in the research problems and approaches to make it unwise to articulate a separate R&D program for each area.

Although many areas of information technology research are potentially valuable for counterterrorist purposes, the three areas described below are particularly important for helping to reduce the likelihood or impact of a terrorist attack:

1. Information and network security. Research in information and network security is critically relevant to the nation's counterterrorism efforts for several reasons.1 First, IT attacks can amplify the impact of physical attacks and lessen the effectiveness of emergency responses; reducing such vulnerabilities will require major advances in information and network security. IT attacks on supervisory control and data acquisition (SCADA) systems in the control infrastructure could also be highly damaging, and research to improve the security of such systems will be needed. Second, the increasing levels of social and economic damage caused by cybercrime and the tendency to rely on the Internet as the primary networking and communications channel both suggest that the likelihood of severe damage through a cyberattack is increasing. Finally, the evolution of the Internet and the systems connected to it demonstrates increasing homogeneity in hardware and software (Box 3.1), which makes the Internet more vulnerable at the same time that it becomes more critical. To address these problems, more researchers and trained professionals who are focused on information and network security will be needed.

1   Computer Science and Telecommunications Board (CSTB), National Research Council (NRC), 1991, Computers at Risk: Safe Computing in the Information Age, National Academy Press, Washington, D.C. (hereafter cited as CSTB, NRC, 1991, Computers at Risk); CSTB, NRC, 1999, Trust in Cyberspace, National Academy Press, Washington, D.C. (hereafter cited as CSTB, NRC, 1999, Trust in Cyberspace); CSTB, NRC, 2001, Embedded, Everywhere: A Research Agenda for Networked Systems of Embedded Computers, National Academy Press, Washington, D.C. (hereafter cited as CSTB, NRC, 2001, Embedded, Everywhere).




2. Systems for emergency response. "C3I" (command, control, communications, and intelligence) systems are critical to emergency responders for coordinating efforts and increasing the promptness and effectiveness of response (e.g., saving lives, treating the injured, and protecting property). While terrorist attacks and natural disasters have many similarities with respect to their consequences, the C3I issues raised by emergency response to terrorist attacks differ from those raised by natural disasters for several reasons. First, the number of responding agencies—from the local, regional, state, and federal levels, with possibly conflicting and overlapping areas of responsibility—increases the level of complexity. (For example, in a terrorist attack scenario, the Department of Defense [DOD] might be much more involved than it would be in a natural disaster.) Second, ongoing security and law-enforcement concerns are much stronger in the wake of a terrorist attack. While looting is often a threat to the community affected by a natural disaster (and may result in the deployment of a police presence in the midst of the recovery effort), the threat from a follow-on terrorist attack may well be much greater or more technologically sophisticated than that posed by looters. And, to the extent that an additional security or law-enforcement presence is required, the sometimes-conflicting needs of security and law-enforcement agencies and of others—for example, the fire and medical personnel on-site—mean that security and law enforcement may interfere with rescue and recovery operations.

3. Information-fusion systems for the prevention, detection, attribution, and remediation of attacks. "Information fusion" promises to play a central role in countering future terrorist efforts. Information fusion is an essential tool for the intelligence analysis needed if preemptive disruption of terrorist attacks is to be successful. Knowing that a biological attack is in progress (an issue of detection) or determining the perpetrators of an attack (an issue of attribution) may depend on the fusion of large amounts of information. And, in many cases, early warning of an attack increases the effectiveness of any counterresponse to it. In every case, information from many sources will have to be acquired, integrated, and appropriately interpreted to support decision makers (ranging from emergency-response units to intelligence organizations). Given the range of formats, the permanence and growing volume of information from each source, and the difficulty of accurately analyzing information from single, let alone multiple, sources, information fusion offers researchers a substantial challenge.

BOX 3.1 Monoculture and System Homogeneity

The existence of a "monoculture" on the Internet has both advantages and disadvantages. Its primary advantage is that increased standardization of systems generally allows for greater efficiencies (e.g., easier interoperability). On the other hand, monocultural environments are generally more vulnerable to a single well-designed attack—a fact greatly exacerbated by the extensive interconnections that the Internet provides. For example, a constant barrage of computer viruses has been designed to attack the weaknesses of the Windows operating system and its associated browser and office productivity programs. These viruses have had a negligible direct effect on computers running other operating systems; while such attacks can pass through servers running other operating systems, they do not take hold on those systems.

A monoculture is highly vulnerable to attack because once a successful attack on the underlying system is developed, it can be replicated at extremely low cost. Thus, for all practical purposes, a successful attack on one system means that all similar (and similarly configured) systems connected to it can be attacked as well. As a general rule, inhomogeneity of systems makes broad-based attacks more difficult, a point that should be considered carefully when designing primary or redundant critical network systems. (Natural living systems provide an interesting analogy to the importance of diversity. Heterogeneity plays an important role in preventing minor changes in climate, environment, or parasites from destroying an entire ecosystem. Different species demonstrate varying levels of vulnerability to the challenges encountered, which lends resilience to the system as a whole. The diversity of species also lessens the possibility of an infectious disease spreading across the entire system.)

It must also be noted that although technology is central to all of the areas listed above, it is not the sole element of concern. Research in these areas must be multidisciplinary, involving technologists, social scientists, and domain experts (e.g., people from the multiple agencies that need to work together in crisis situations). All technology deployed for operational purposes is subject to the realities of implementation and operation by humans. Thus, systems issues—including human, social, and organizational behavior—must be part of the research to develop the needed technology and of the system design to implement it. Technology cannot be studied in isolation from the ways in which it is deployed, and failure to attend to the human, political, social, and organizational aspects of solutions will doom a technology to failure. For this reason, Section 3.6 addresses social and organizational dimensions that must be incorporated into work in these areas.

To assist decision makers in the formulation of a research program, Table 3.1 presents the committee's rough assessment of the criticality of the various research areas identified, the difficulty of the research problems, and the likely time scale on which progress could be made. The criticality of a research area reflects an assessment of the vulnerabilities that might be reduced if significant advances in that area were accomplished and deployed; areas are ranked "High," "Medium," or "Low." How hard it will be to make significant progress is rated "Very Difficult," "Difficult," or "Easy." The time frame for progress is ranked as "1-4 years," "5-9 years," or "10+ years." Of course, the deployment of research results also presents obstacles, which may reduce effectiveness or lengthen the time until a research result can become a reality. These assessments are both subjective and subject to debate; they are intended to provide readers with a quick calibration of the issues rather than a definitive conclusion.

Finally, while R&D is an essential element of the nation's response to terrorism, it is not by itself sufficient. Indeed, the history of information security demonstrates that the availability of knowledge or technology about how to prevent certain problems does not necessarily translate into the application of that knowledge or the use of that technology. It is beyond the scope of this report to address such issues in detail, but policy makers should be cautioned that R&D is only the first step on a long road to widespread deployment and a genuinely stronger and more robust IT infrastructure.

TABLE 3.1 A Taxonomy of Priorities

Research Category | Criticality | Difficulty of R&D | Time Scale for Significant Progress and Deployment
Improved Information and Network Security | High | Difficult | 5-9 years
  Detection and identification | High | Difficult | 5-9 years
  Architecture and design for containment | High | Difficult | 5-9 years
  Large system backup and decontamination | High | Difficult | 5-9 years
  Less buggy code | High | Very difficult | 5-9 years
  Automated tools for system configuration | High | Difficult | 1-4 years
  Auditing functionality | Low | Difficult | 10+ years
  Trade-offs between usability and security | Medium | Difficult | 5-9 years
  Security metrics | Medium | Difficult | 1-4 years
  Field studies of security | High | Easy | 1-4 years
C3I for Emergency Response | High | Difficult | 1-4 years
  Ad hoc interoperability | High | Easy | 1-4 years
  Emergency deployment of communications capacity | High | Easy | 1-4 years
  Security of rapidly deployed ad hoc networks | Medium | Difficult | 5-9 years
  Information management and decision-support tools | Medium | Difficult | 5-9 years
  Communications with the public during an emergency | High | Difficult | 1-4 years
  Emergency sensor deployment | High | Easy | 1-4 years
  Precise location identification | Medium | Difficult | 5-9 years
  Mapping the physical telecommunications infrastructure | High | Easy | 1-4 years
  Characterizing the functionality of regional networks for emergency responders | High | Difficult | 1-4 years
Information Fusion | High | Difficult | 1-4 years
  Data mining | High | Difficult | 1-4 years
  Data integration | High | Difficult | 1-4 years
  Language technologies | High | Difficult | 1-4 years
  Image and video processing | High | Difficult | 5-9 years
  Evidence combination | Medium | Difficult | 1-4 years
  Interaction and visualization | Medium | Difficult | 1-4 years
Privacy and Confidentiality | High | Difficult | 1-4 years
Planning for the Future | Medium | Difficult | 10+ years

3.1 INFORMATION AND NETWORK SECURITY

A broad overview of some of the history of and major issues in information and network security is contained in the CSTB report Cybersecurity Today and Tomorrow: Pay Now or Pay Later.2 That report builds on a variety of earlier, more detailed CSTB studies related to information and network security.

Despite diligent efforts to create effective perimeter defenses, the penetration of defended computer and telecommunications systems by a determined adversary is highly likely. Software flaws, lax procedures for creating and guarding passwords, compromised insiders, and insecure entry points all lead to the conclusion that watertight perimeters cannot be assumed. Nevertheless, strengthening defensive perimeters is helpful, and this section deals with methodologies of today and tomorrow that can detect or confine an intruder and, if necessary, aid in recovery from an attack by taking corrective action. (Box 3.2 describes some of the fundamental principles of defensive strategy.)

As noted above, the technology discussed here is relevant for efforts in defensive information warfare and for fighting cybercrime. In addition, many advances in information and network security can improve the reliability and availability of computer systems, which are issues of importance to users even under ordinary, nonthreatening conditions. The fact that such advances serve dual purposes could help to generate broader interest in and support for R&D in this area, as well as motivate its incorporation into industry products.

Research and development in this area should be construed broadly to include R&D on defensive technology (including both underlying technologies and architectural issues), organizational and sociological dimensions of such security, forensic and recovery tools, and best policies and practices. Research in information and network security can be grouped into four generic areas: authentication, detection, containment, and recovery. A fifth set of topics (e.g., reducing buggy code, dealing with misconfigured systems, auditing functionality) is broadly applicable to more than one of these areas.

2   Computer Science and Telecommunications Board (CSTB), National Research Council (NRC), 2002, Cybersecurity Today and Tomorrow: Pay Now or Pay Later, National Academy Press, Washington, D.C. (hereafter cited as CSTB, NRC, 2002, Cybersecurity Today and Tomorrow).

BOX 3.2 Principles of Defensive Strategy

Computing and communications systems that contain sensitive information or whose functioning is critical to an enterprise's mission must be protected at higher levels of security than are adequate for other systems. Several policies should be mandatory for such critical systems:

The use of encryption for communication between system elements and the use of cryptographic protocols. These practices help to ensure data confidentiality against eavesdroppers and data integrity between major processing elements (e.g., host to host, site to site, element to element); prevent intrusion into the network between nodes (e.g., making "man-in-the-middle" attacks much more difficult); and provide strong authentication (e.g., through the use of public-key-based authentication systems that use encryption and random challenge to strengthen the authentication process or to bind other elements of the authentication, such as biometrics, to the identity of a user).

Minimal exposure to the Internet, which is inherently insecure. Firewalls provide a minimal level of protection, but they are often bypassed for convenience. (Balancing ease of use and security is an important research area discussed in the main text of this report.) Truly vital systems may require an "air gap" that separates them from public networks. Likewise, communication links that must remain secure and available should use a private network. (From a security perspective, an alternative to a private network may be a connection on a public network that is appropriately secured through encryption. However, depending on the precise characteristics of the private network in question, a private network may—or may not—provide higher availability.)

Strong authentication technology for authenticating users. Security tokens based on encryption (such as smart cards) are available for this purpose, and all entrances from a public data network (such as a network access provider or insecure dial-in) should use them. Furthermore, for highly critical systems, physical security must also be assured.

Robust configuration control. Such control is needed to ensure that only approved software can run on a system and that all of the security-relevant knobs and switches are correctly set. (A small illustrative sketch follows this box.)

Such measures are likely to affect ease of use and convenience, as well as cost. These are prices that must be paid, however, because hardening critical systems will greatly reduce their vulnerability to cyberattack.
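To make the last of these policies concrete, the sketch below gives a minimal illustration of configuration control. It is not drawn from the report: it assumes a hypothetical manifest file (approved_software.json) that maps approved executables to known SHA-256 digests, and it flags anything missing or modified.

import hashlib
import json
from pathlib import Path

# Hypothetical manifest: {"/usr/local/bin/pump_ctl": "<sha256 hex digest>", ...}
MANIFEST_PATH = Path("approved_software.json")

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def audit_configuration(manifest_path: Path = MANIFEST_PATH) -> list[str]:
    """Compare installed binaries against the approved manifest.

    Returns human-readable findings: files that are missing or whose
    digests no longer match the approved baseline.
    """
    manifest = json.loads(manifest_path.read_text())
    findings = []
    for file_name, approved_digest in manifest.items():
        target = Path(file_name)
        if not target.exists():
            findings.append(f"MISSING: {file_name}")
        elif sha256_of(target) != approved_digest:
            findings.append(f"MODIFIED: {file_name} (digest mismatch)")
    return findings

if __name__ == "__main__":
    for finding in audit_configuration():
        print(finding)

A real configuration-control system would also protect the manifest itself (for example, by signing it) and would check security-relevant settings, not just binaries.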

3.1.1 Authentication

A terrorist may seek to gain access to a computer system that he or she is not authorized to use. Once access is gained, many opportunities for causing harm are available, including the installation of hostile programs and the destruction or compromise of important data. In other instances, a terrorist might orchestrate the actions of multiple computers to undertake harmful acts, for example through denial-of-service attacks.3

To prevent a terrorist from gaining unauthorized access to a computer, it is necessary to prevent that person from successfully posing as an authorized user.4 In other words, a user must pass successfully through an authentication process that confirms his or her asserted identity as an authorized user. The same is true of computer-to-computer interactions—for some transactions, it is necessary for Computer A to determine whether Computer B is one of the computers authorized to interact with it. Here too, a computer-to-computer authentication process is necessary so that only authorized devices connecting to a network can receive services.

Today, the prevailing method of user authentication is the password. Passwords are easily compromised through weak password-choosing habits, the recording of passwords in open places, interception of the channel used for password entry or administration, and password-cracking techniques. Requiring several log-in procedures (usually with a different password for each) in order to map users to permissions or authorizations makes the system even more complex for both the user and the system administrator. Other devices, such as hardware tokens, are usually more secure than passwords, but a user must have them available when needed.

The ideal authentication system would be simple and easy to use, would verify identity reliably, could be managed in a distributed manner, would have the trustworthiness of cryptographically based systems without today's complexity, could be scaled to hundreds of thousands (or even millions) of users, and would have a cost of ownership competitive with passwords. In practice, these desirable attributes often entail trade-offs with one another; one way to focus a research effort in authentication would be to address the reduction of these trade-offs.5

3   Computer Science and Telecommunications Board (CSTB), National Research Council (NRC), 1999, Realizing the Potential of C4I: Fundamental Challenges, National Academy Press, Washington, D.C., pp. 144-152 (hereafter cited as CSTB, NRC, 1999, Realizing the Potential of C4I); CSTB, NRC, 1999, Trust in Cyberspace.

4   The case of a terrorist as an insider who has already been granted access is not within the scope of this particular problem. Insider attacks are addressed in other parts of this report.

5   An ongoing CSTB project examines in detail the technologies underlying authentication. See the Web site <http://www.cstb.org> for more information on this subject.
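As a simple illustration of the cryptographic alternatives to passwords discussed above, the following sketch shows a bare-bones challenge-response exchange built on a shared secret. It is illustrative only (the function names and the provisioning of the secret are hypothetical), but it shows how a secret can be proved without ever being transmitted, while still inheriting the key-management burdens that motivate the research agenda above.

import hmac
import hashlib
import secrets

SECRET_KEY = secrets.token_bytes(32)  # provisioned in advance to the authorized user or device

def issue_challenge() -> bytes:
    """Server side: generate a fresh random challenge (a nonce)."""
    return secrets.token_bytes(16)

def prove_identity(secret: bytes, challenge: bytes) -> bytes:
    """Client side: answer the challenge without revealing the secret itself."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def verify(expected_secret: bytes, challenge: bytes, response: bytes) -> bool:
    """Server side: recompute the expected response and compare in constant time."""
    expected = hmac.new(expected_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Example exchange: the response is useless to an eavesdropper once the nonce expires.
challenge = issue_challenge()
response = prove_identity(SECRET_KEY, challenge)
assert verify(SECRET_KEY, challenge, response)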

3.1.2 Detection

Even with apparently secure authentication processes and technologies, it might still be possible for an intruder to gain unauthorized access to a system, though with more effort if the system were more secure. This possibility suggests a need for detecting and identifying intruders. However, intruders are often indistinguishable from valid users and frequently take great care to hide their entry and make their behavior look innocuous.

Intrusion-detection systems are designed to monitor users and traffic in order to detect either anomalous users or unusual traffic patterns that might indicate an active attack. Of course, such monitoring requires good characterizations of what "normal" behavior is and knowledge of what various kinds of behavior mean in the context of specific applications. Today, the major deficiency in this approach is the occurrence of too many false positives. That is, the behavior of legitimate users is sufficiently diverse that some types of legitimate behavior are mischaracterized as anomalous (and hence hostile). Thus, research is needed on reducing the rate of false positives in intrusion-detection systems.

Another approach to intrusion is based on deceiving the cyberattacker. For example, traps (sometimes referred to as honeypots)—such as apparently interesting files—can be crafted to attract the attention of an intruder so that he or she spends extra time examining them. That extra time can provide warning of hostile intent, and it might help in forensic investigation while the hostile party is connected to the system. Alternatively, tools might be created that disguise the actual details of a network when it is probed. Tools of this nature, as well as forensic tools for use in attacker-deceiving environments, may be fruitful areas of research.

A related challenge is the development of intruder-detection methods that scale to function efficiently in large systems. Current approaches to intrusion detection generate enormous amounts of data; higher priority must be given to systems that can analyze rather than merely collect such data, while still retaining collections of essential forensic data. Moreover, the collection and analysis of such large amounts of data may degrade performance to unacceptable levels.

Intrusion-detection systems are also one element of technology that can be used to cope with the threat of insider attack. Because the trusted insider has legitimate access to system resources and information, his or her activities are subject to far less suspicion, which usually allows the insider to act with much greater freedom than would be permitted an outsider who had penetrated the system. Thus, new technologies specifically focused on the possibility of insider attack may be a particularly fruitful avenue of research (Box 3.3). In addition, research focused on understanding common patterns of insider attack (e.g., through the use of application-level audits to examine usage patterns) could be integrated with other kinds of audits to provide a more robust picture of system usage.
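Whether aimed at outside intruders or at insiders misusing legitimate access, anomaly-based detection compares observed activity against a learned baseline. The toy sketch below, with hypothetical event counts and an arbitrary threshold, illustrates both the basic approach and why false positives arise: a legitimate but unusual burst of activity would be flagged in exactly the same way as a hostile one.

from statistics import mean, stdev

def flag_anomalies(history: list[int], recent: list[int], z_threshold: float = 3.0) -> list[int]:
    """Flag recent hourly event counts that deviate strongly from a user's baseline.

    history: past hourly counts used to learn the baseline.
    recent:  new hourly counts to score.
    Returns the counts whose z-score exceeds the threshold.
    """
    baseline_mean = mean(history)
    baseline_sd = stdev(history) or 1.0  # avoid division by zero for a flat baseline
    return [x for x in recent if abs(x - baseline_mean) / baseline_sd > z_threshold]

# A user who normally issues 40-60 queries per hour suddenly issues 400.
history = [45, 52, 48, 55, 60, 41, 47, 53, 49, 58]
print(flag_anomalies(history, [51, 400, 62]))  # -> [400]

A production intrusion-detection system would of course model far richer features than a single hourly count, which is precisely where the false-positive problem becomes hard.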

BOX 3.3 Illustrative Technologies for Dealing with the Insider Attack

Authentication, access control, and audit trails are three well-understood technologies that can be used in combating the insider threat. Using these mechanisms to enforce strict accountability can improve protection against the insider threat, but in practice they are often not as successful as they might be. For example, many current tools for access control and audits are difficult to use, or they generate volumes of data so large that they are, for practical purposes, unreviewable. Other technology research areas that may be relevant to dealing with insider attack include the following:

Attack-specification languages. Programming languages designed for ease of modeling attacks and/or expressing attack behaviors and modalities.

Modeling and simulation of insider attacks. Better understanding of such attacks to help those seeking to validate technologies to counter the insider threat. Today, simulations of such attacks are difficult to perform and are personnel-intensive.

Authentication of roles, rights, and privileges. Approaches to finer-grained authentication strategies based on authenticating an individual not as an individual but as the holder of certain rights and privileges or as the embodiment of a certain role.

Semantics of authorized access. Development of the semantics of operations and authorization that enable more fine-grained authorization decisions or the flagging of potentially suspect audit trails.

Automated, dynamic revocation of privileges. Development of effective strategies for the automatic revocation of privileges based on policy-specified factors such as the timing of certain types of access or other user behavior.

Fingerprinting of documents. Strategies for embedding identifying information in a document so that its subsequent disposition can be traced more effectively. Such information would be analogous to a copyright notice in a document that can help to determine its origin, except that it would not be easily removable from the document itself. (A small illustrative sketch follows this box.)

Continuous authentication. Technologies for authenticating a user after the initial authentication challenge (to deal with the fact that people walk away from their computers without logging out).
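The document-fingerprinting idea above can be illustrated with a small sketch. The example below is hypothetical rather than a description of any deployed scheme: it derives a per-recipient tag with an HMAC and appends it as a visible trailer, whereas a realistic system would embed the mark less conspicuously and make it robust to reformatting.

import hmac
import hashlib

SIGNING_KEY = b"organization-held signing key"  # placeholder value

def fingerprint(document: str, recipient_id: str) -> str:
    """Append a per-recipient tag so a leaked copy can be traced back."""
    tag = hmac.new(SIGNING_KEY, f"{recipient_id}\n{document}".encode(), hashlib.sha256).hexdigest()[:16]
    return f"{document}\n<!-- dist:{recipient_id}:{tag} -->"

def identify_recipient(marked: str) -> str | None:
    """Return the recipient named in the trailing tag, or None if the copy was altered."""
    body, _, trailer = marked.rpartition("\n")
    if not trailer.startswith("<!-- dist:"):
        return None
    _, recipient_id, tag = trailer.strip("<!- >").split(":")
    expected = hmac.new(SIGNING_KEY, f"{recipient_id}\n{body}".encode(), hashlib.sha256).hexdigest()[:16]
    return recipient_id if hmac.compare_digest(tag, expected) else None

# Example: a copy issued to "analyst-17" can later be attributed to that recipient.
copy = fingerprint("Quarterly threat summary ...", "analyst-17")
assert identify_recipient(copy) == "analyst-17"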

3.1.3 Containment

Today's systems and networks often fail catastrophically. That is, a successful attack on one part of a system can result in an entire system or network's being compromised. (For example, the failure of a perimeter defense, such as a firewall surrounding otherwise unprotected systems, can result in an intruder's gaining full and complete access to all of those systems.) A system that degrades gracefully is more desirable—in this case, a successful attack on one part of a system results only in that part's being compromised, and the remainder of the system continues to function almost normally.6

The principle of graceful degradation under attack is well accepted, but system and network design for graceful degradation is not well understood, nor are tools available to help design systems and networks in such a manner. Even more difficult is the challenge of modifying existing legacy systems to fail gracefully.

In addition, the building blocks of today's systems are generally commercial off-the-shelf components.7 Despite the security limitations of such components, the economics of system development and the speed with which the IT environment changes inevitably require that systems be built this way. However, it is not known today how to integrate components safely, how to contain faults in them, and how to disaggregate them when necessary. While this lack of understanding applies to systems ranging from accounting and payroll systems to telephone switching systems, SCADA systems are a particularly important case.

Architectural containment as a system-design principle calls for the ability to maintain critical functionality (such as engine control on a ship) despite failures in other parts of a system.8 A sophisticated control system used during normal operations must be able to provide basic functionality even when parts of it have been damaged.9 Such an approach could be one of the most effective long-term methods for hardening IT targets that oversee critical operations.

For the most part, current approaches to system design involve either the independence of system components (which in modern large-scale systems leads to inefficiencies of operation) or the integration of system components (with the inherent vulnerabilities that this approach entails). Containment navigates between these two extremes; its essential element is the ability to "lock down" a system under attack—perhaps to suspend normal operation temporarily while preserving some basic functionality as the system finds and disables potential intruders, and to resume normal system operation afterward—with less disruption than might be caused by shutting down and rebooting.

Research is thus necessary in several areas: in understanding how to

6   CSTB, NRC, 1999, Realizing the Potential of C4I, pp. 144-152.

7   CSTB, NRC, 1999, Trust in Cyberspace.

8   It should be noted that an essential aspect of designing for containment is the ability to define and prioritize which functions count as essential. For systems used by multiple constituencies, this ability cannot be taken for granted.

9   As an example, a shipboard networking failure on the USS Yorktown left the ship without the ability to run its engines. (Gregory Slabodkin, 1998, "Software Glitches Leave Navy Smart Ship Dead in the Water," Government Computing News, July 13.)
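A highly simplified sketch of the "lock down while preserving essential functionality" idea is shown below. The service names and the intrusion signal are hypothetical, and real architectural containment would operate across an entire system rather than within a single process; the sketch only illustrates shedding nonessential functions and resuming normal operation without a full shutdown and reboot.

from enum import Enum, auto

class Mode(Enum):
    NORMAL = auto()
    LOCKDOWN = auto()  # suspend nonessential work but keep critical functions alive

class ControlSystem:
    """Toy control system that degrades gracefully instead of failing outright."""

    ESSENTIAL = {"engine_control", "fire_suppression"}
    NONESSENTIAL = {"crew_email", "maintenance_scheduler"}

    def __init__(self) -> None:
        self.mode = Mode.NORMAL
        self.running = self.ESSENTIAL | self.NONESSENTIAL

    def on_intrusion_detected(self) -> None:
        """Enter lockdown: shed nonessential services, keep essential ones running."""
        self.mode = Mode.LOCKDOWN
        self.running = set(self.ESSENTIAL)

    def on_all_clear(self) -> None:
        """Resume normal operation without shutting down and rebooting."""
        self.mode = Mode.NORMAL
        self.running = self.ESSENTIAL | self.NONESSENTIAL

system = ControlSystem()
system.on_intrusion_detected()
assert "engine_control" in system.running and "crew_email" not in system.running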

"Hey, this is Lt. John Hennessy from Firehouse 17. This is a Code 73 emergency. There are people screaming behind the doors, the building is going to collapse—what's the security code? Hurry, I'm losing my signal. Hello? Hello? Better hurry." [Sounds of screaming and other loud noises in background.]

Times of stress and emergency, when security is perhaps most important, are exactly the times when the strains are greatest and the need for normally nonauthorized people (such as firefighters, police, and rescue and health personnel) to gain access is acute. And this is when it is easiest for a terrorist to get in, using the same mechanisms. Countering social engineering by an adversary is an important counterterrorist technique. But whatever the counter, the solution must not be based on extinguishing the tendency of people to be helpful, because helpful people often play a key role in getting any work done at all. The research challenge is thus to develop effective techniques for countering social engineering that do not require wholesale attacks on tendencies to be helpful.

Understand Bystander Apathy

As more people are involved in checking a task, it is possible for safety to decrease. This is the "bystander apathy" problem, named after studies of a New York City crime in which numerous people witnessed an incident but no one helped and no one reported it. Laboratory studies showed that the greater the number of people watching, the lower the likelihood that anyone would help—the major reason being that each individual assumes that if an incident is serious, someone else out of all those watching will be doing something, so the fact that nobody is doing anything means that it must not be a real issue. After all, in New York, anything might be happening: it might be a movie shoot.

Similarly, if I am asked to check the meter readings of a technician, and I know that the immediate supervisor has already checked them and that someone else will check my report, I don't take the check all that seriously: after all, how could a mistake get through with so many people involved? I don't have to worry. But what if everyone feels that way?

The commercial aviation community has done an excellent job of fighting this tendency with its program of crew resource management (CRM). In CRM, the pilot not flying is required to be an active critic of the actions taken by the pilot who is flying, and the pilot flying is supposed to thank the other for the criticism, even when it is incorrect. Obviously, getting this process in place was difficult, for it involved major changes in the culture, especially when one pilot was junior and the other very senior. But the result has been increased safety in the cockpit.

Account for Cognitive and Perceptual Biases

The research literature in cognitive and social psychology clearly demonstrates that most people are particularly bad at understanding low-probability events. This might be called the "one-in-a-million" problem. An airplane pilot might well decide that a situation in which three different oil pressure indicators all read zero is likely fallacious, because it is a "one-in-a-million" chance that all three engines would fail at the same time. However, since there are roughly 10 million commercial flights a year in the United States, one-in-a-million means that 10 flights a year will suffer from this problem.

On the other hand, the frequency of salient events is overestimated. Thus, aviation is considered more dangerous than automobile driving by many people, despite data that show just the opposite. What about terrorist acts? These are truly unlikely, deadly though each may be, but if airline passengers overreact, they are apt to attack and possibly seriously injure an innocent passenger who meets some of their preconceptions of a terrorist. Indeed, each successful encounter between passengers and potential harrowers increases the likelihood of a future false encounter.

The "boy who cried wolf" is a third perceptual bias—potential threats are often ignored because of a history of false alarms. An effective criminal or terrorist approach is to trigger an alarm system repeatedly so that the security personnel, in frustration over the repeated false alarms, either disable or ignore it—which is when the terrorist sneaks in.

Probe and Test the System Independently

The terms "red team" and "tiger team" refer to efforts undertaken by an organization to test its security from an operational perspective, using teams that simulate what a determined attacker might do. Tiger teams develop expertise relevant to their intended targets, conduct reconnaissance to search for security weaknesses, and then launch attacks that exploit those weaknesses. Under most circumstances, the attack is not intended to be disruptive but rather to indicate what damage could have been done. Properly conducted tiger-team testing has the following characteristics:

•  It is conducted on an unscheduled basis, without the knowledge of the installation being probed, so that a realistic security posture can be tested.

•  It does not function under unrealistic constraints about what it can or cannot do, so its attack can realistically simulate what a real attacker might do.

•  It reports its results to individuals who are not directly responsible for an installation's security posture, so that negative results cannot be suppressed.

•  It probes and tests the fundamental assumptions on which security planning is based and seeks to violate them in order to create unexpected attacks.

Why are tiger teams a "people and organization" issue? The essential reason is that an attacker has the opportunity to attack any vulnerable point in a system's defenses, whether that point of vulnerability is the result of an unknown software bug, a misconfigured access control list, a password taped to a terminal, lax guards at the entrance to a building, or a system operator trying to be helpful. Over the years, tiger teams have been an essential aspect of any security program, and tiger-team tests are essential for several reasons:

•  Recognized vulnerabilities are not always corrected, and known fixes are frequently found not to have been applied as a result of poor configuration management.

•  Security features are often turned off in an effort to improve operational efficiency. Such actions may improve operational efficiency, but at the potentially high cost of compromising security, sometimes with the primary damage occurring in some distant part of the system.

•  Some security measures rely on procedural steps and thus depend on proper training and ongoing vigilance on the part of commanders and system managers.

•  Security flaws that are not apparent to the defender undergoing an inspection may be uncovered by a committed attacker (as they would be uncovered in an actual attack).89

In order to maximize the impact of these tests, reports should be disseminated widely. The release of such information may embarrass certain parties or identify paths through which adversaries may attack, but especially in the case of uncovered vulnerabilities for which fixes are available, the benefits of releasing such information—allowing others to learn from it and motivating fixes to be installed—generally outweigh these costs. Furthermore, actions can be taken to minimize the possibility that adversaries might be able to obtain or use such information. For example, passing the information to the tested installation by nonelectronic means would eliminate the possibility that an adversary monitoring electronic channels could obtain it, and delaying the public release of such information for a period of time could allow the vulnerable party to fix the problems identified.

89   CSTB, NRC, 1999, Realizing the Potential of C4I, p. 147.

Finally, tiger-team testing launched without the knowledge of the attacked systems also allows estimates to be made of the frequency of attacks. Specifically, the fraction of tiger-team attacks that are detected is a reasonable estimate of the fraction of adversary attacks that are detected. Thus, the overall frequency of adversary attacks can be estimated from the number of adversary attacks that are detected.
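A small worked example, with entirely made-up numbers, shows how such an estimate could be computed: the detection rate observed against unannounced tiger-team attacks is used to scale up the count of detected real attacks.

def estimate_total_attacks(tiger_attacks: int, tiger_detected: int, real_detected: int) -> float:
    """Estimate total adversary attacks from the tiger-team detection rate.

    The estimate is only as good as the assumption that real attacks are
    detected at roughly the same rate as unannounced tiger-team attacks.
    """
    detection_rate = tiger_detected / tiger_attacks
    return real_detected / detection_rate

# Hypothetical numbers: 20 tiger-team attacks, 5 detected (a 25 percent rate);
# 8 detected real attacks then imply roughly 32 real attacks overall.
print(estimate_total_attacks(20, 5, 8))  # -> 32.0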

3.6.2 Organizational Practices in IT-Enabled Companies and Agencies

An organization's practices play an important role in countering external threats. The discussion below is not meant to be exhaustive but rather to motivate examination of yet another nontechnological dimension of security.

Outsourcing of Product Development and Support

For entirely understandable reasons, many companies outsource IT work to parties whose interests may not be fully aligned with their own. Companies outsource for many reasons, ranging from the availability of skilled human resources that are not indigenous to them to the often-lower cost of doing so (especially when the parties doing the outsourced work have access to cheaper sources of labor).

The practice of outsourcing has security implications. On the one hand, outsourced work represents a potential vulnerability to the company that uses it, unless that company has the expertise to audit and inspect the work for security flaws. By assumption, a company that outsources work has less control over how the work is done, and the possibility of deliberately introduced security vulnerabilities in outsourced work must be taken seriously. On the other hand, one reason for outsourcing work is that those doing the work may have greater expertise than the company hiring them—and if security is a special expertise of the former, its capability for maintaining security may be greater than that of the latter. Outsourcing is not in and of itself a practice that leads to insecure IT systems and networks. Nevertheless, prudence dictates that a company understand the potential risks and benefits of outsourcing from a security perspective. If it does outsource work, a company should undertake careful and informed inspection of the work on system components that provide critical functionality.

Personnel Screening

Behavioral and psychological profiles of typical outside "hackers" have been available for a long time, providing insight into their motivations and techniques. However, similar information about persons likely to present an insider threat is not available today. One challenge to assembling such information is the fact that insider adversaries can be characterized in many different ways. For example, the behavior of the insider will likely vary depending on a wide variety of factors, including whether the person is unwitting, incompetent, coerced, vengeful, and so on. Such factors imply that simply relying on externally observable traits and behaviors to identify potential insiders may not prove useful, and so integration with background information on individual employees may be necessary to identify potential risks from insiders. Note also that all screening techniques run the risk of incorrectly labeling problematic behavior as acceptable or of judging benign behavioral patterns to be indicative of inappropriate behavior or intent.

Managing Personnel in a Security-Oriented Environment

Managing employees in a security-oriented environment is complex. The practices that characterize the handling of classified information often impede the sharing of information among people. Long-term compliance with security procedures is often difficult to obtain, as employees develop ways to circumvent these procedures in order to achieve greater efficiency or effectiveness. Personnel matters that are routine in nonsecure environments become difficult. For example, from a security perspective, termination of the access privileges of employees found to be improperly hired or retained must happen without warning them of such termination; on the other hand, due process may prevent rapid action from being taken. The temptations are strong to relax the requirements of due process for the sake of security, but not observing due process often has detrimental effects on organizational morale and esprit de corps,90 not to mention the possibility that almost any pretext will suffice for some individual supervisors to eliminate workers they do not personally like.

90   Illustrations of the conflict between applying due process and managing the requirements of security can be found in the cases of Wen Ho Lee and Felix Bloch. In investigating the alleged passing of nuclear weapons design information to the People's Republic of China, Lee was charged on multiple counts of mishandling material containing restricted data with the intent to injure the United States and with the intent to secure an advantage to a foreign nation. After being charged, Lee was held in custody under conditions described by the cognizant federal judge as draconian, because it was believed that his pretrial release would pose a grave threat to the nation's security. The case ended with the dismissal of all but one charge of mishandling classified information and with the judge's apology for the conduct of the government in the prosecution of the case. Felix Bloch was a Foreign Service officer in the State Department, investigated in 1989 by the FBI for spying for the Soviet Union. Bloch was eventually fired and stripped of his pension in 1990 on grounds that he lied to FBI investigators, but he was never charged. It is alleged that Robert Hanssen gave information to the Soviets revealing that Bloch was under suspicion, which might account for the fact that sufficient grounds for charging Bloch were never found.

Many issues arise when an employee is merely under suspicion of wrongdoing or malicious intent—before action is taken to rescind access privileges, the company may run a significant risk of suffering serious damage. This suggests that under these circumstances the work of the employee must be monitored and controlled, but at the same time the requirements of due process must be observed.

3.6.3 Dealing with Organizational Resistance to Interagency Cooperation

An effective response to a serious terrorist incident will inevitably require multiple emergency-response agencies to cooperate. Section 3.2.1 describes technical barriers to effective cooperation, but technological limitations by no means fully explain why agencies might fail to cooperate effectively. In particular, the character and traditions of agencies have a profound impact on their ability and willingness to cooperate. Agencies exhibit many differences in internal culture (e.g., in philosophies of staff reward and punishment, in traditions among disciplines in research and implementation, in ethical criteria of staff in terms of private versus public interest, in performance criteria between public and profit-making enterprises, and in degree of participatory decision making). Turf battles and jurisdictional warfare between agencies with overlapping responsibilities are also common, with each agency having its own beliefs about what is best for the citizenry.

For the public record, the rhetoric of every emergency-response agency acknowledges the need for cooperation with other agencies. But the reality in practice is often quite different from the rhetoric, and almost every disaster (whether natural or attack-related) reveals shortcomings in the extent and nature of interagency cooperation.

For example, the emergency response to the September 11, 2001, attacks on the World Trade Center revealed a number of cultural barriers to cooperation between the New York City Police and Fire Departments:91

•  Police helicopters, with the ability to provide firefighters with close-up information on the progress of the fire in the upper parts of the buildings as well as some capability for aerial rescue of those gathered on the roofs, were never used for those purposes. An on-site Fire Department chief tried to request police helicopters for such roles but was unable to reach the police dispatcher for the helicopters either by phone or by radio.

•  The Fire Department had established its command post in the building lobbies, while the police had established their command post three blocks away, and the police did not report to the Fire Department commanders on-site. Said one senior Fire Department official, "They [the police] report to nobody and they go and do whatever they want."

•  The Police and Fire Departments have a formal agreement (in place since 1993) to share police helicopters during high-rise fires and to practice together. However, neither agency has any records of joint drills, although some less formal "familiarization flights" may have been conducted for the Fire Department a year or so before September 11.

•  While most states and the federal government have forged agreements among emergency-response agencies that specify in advance who will be in overall charge of a crisis response, New York City has no such agreement, which left its Police and Fire Departments with no guidance about how to proceed with overall command arrangements on September 11.

•  Police fault firefighters and firefighters fault police for unwillingness to cooperate. Some police believe that sharing command with the Fire Department is difficult because firefighters lack the paramilitary discipline characteristic of the police force, while some firefighters thought that the police felt they could and should do everything.

•  In the aftermath, senior Fire Department and Police Department officials disagreed over the extent to which the departments were able to coordinate. A senior Fire Department official said that "there is no question there were communications problems [between the Fire Department and the Police Department] at this catastrophic incident," while a senior Police Department official said, "I was not made aware that day that we were having any difficulty coordinating."

91   See Jim Dwyer, Kevin Flynn, and Ford Fessenden, 2002, "9/11 Exposed Deadly Flaws in Rescue Plan," New York Times, July 7. Available online at <http://www.nytimes.com/2002/07/07/nyregion/07EMER.html?pagewanted=1>.

Cultural barriers separating the CIA and the FBI have also been revealed in the postmortems conducted since September 11. In particular, the essential mission of the CIA is intelligence collection and analysis, while the essential mission of the FBI has been directed toward law enforcement. As a broad generalization, these missions differ in that intelligence is more focused on anticipating and predicting bad events, while law enforcement is more focused on prosecutions and on holding the perpetrators of dangerous events accountable to a criminal justice system. To illustrate, intelligence analysts place a high value on protecting sources and methods for gathering intelligence so that they will be able to continue obtaining information from those channels, while law-enforcement officials place a high value on the ability to use information in open court to gain convictions. (The fact that the FBI and the CIA also operate under very different legal regimes governing their domestic activities is quite relevant as well, but it is beyond the scope of this report. Suffice it to say that these different legal regimes impose explicit behavioral constraints and serve to shape the environment in which the cultural attitudes within each agency develop.)

The desire of agencies to preserve their autonomy also contributes to an active (if subterranean) resistance to interoperability. Personnel of one agency who lack the capability of communicating with another agency are not easily directed by that other agency. Furthermore, an agency may fear (often justifiably) that communications overheard by another agency will lead to criticism and second-guessing about actions that it took in the heat of an emergency.

There are no easy answers for bridging the cultural gulfs between agencies that do not interact very much during normal operations. Different agencies with different histories, different missions, and different day-to-day work would be expected to develop different policies, procedures, and philosophies about what is or is not appropriate under a given set of circumstances. For the most part and under most circumstances, an agency's culture serves it well. But in a crisis, interagency differences impede interagency cooperation, and they cannot be overcome by fiat at the scene of the crisis. For example, a policy directive requiring that agencies adopt and use common communications protocols does not by itself ensure that emergency responders from different agencies will actually interact with one another while an emergency response is occurring.

3.6.4 Principles into Practice

Putting these principles into practice requires that human requirements be considered equally with technical and security requirements. The most secure and reliable systems will be those developed with behavioral scientists from the user-interface community who use an iterative design-test-design implementation strategy. The major point is to recognize that security and reliability are systems problems: the needs and standard working practices of the people involved are as important as the technical requirements. The very virtues of people are often turned against them when intruders seek to breach security: the willingness to help others in distress is perhaps the weakest link of all in any defensive system, but it would be preferable to design security systems that detract minimally from this valuable human attribute. Many failures are due to security requirements that are unreasonable from the point of view of human cognition (e.g., requiring frequent memorization of long, complex passwords) or that severely impair the ability to get the required work done because they conflict with organizational structures and requirements. It is therefore essential that the needs of individuals, workgroups, and the organization all be taken into account. Conscientious workers will do whatever is necessary to get the work done, often at the cost of compromising security. But through proper design, it should be possible to build systems that are both more efficient and more secure.

To achieve effective interagency cooperation in a crisis, many things must happen prior to the occurrence of the crisis, taking into account the realities of organizational resistance to interoperability. Such cooperation is likely to require the following:

•  Strong, sustained leadership. When a strong leader places a high priority on interagency cooperation, is willing to expend resources and political capital in support of such cooperation, and can sustain that expenditure long enough that the agencies in question cannot "outwait" his or her efforts, organizational change that moves in the direction of that priority is more likely.

•  Activities that promote interagency understanding and cooperation. One example of such activities is the temporary detailing of personnel from one agency to another (e.g., firefighters and police officers or FBI and CIA analysts on temporary duty at each other's agencies). Prior exposure to one another's operational culture generally helps to reduce the frictions caused by lack of familiarity during a crisis. Of course, to be genuinely helpful, this practice must be carried out on a sufficiently large scale that the personnel so exposed are likely to be those participating in a response to a crisis. Another useful activity is joint exercises that simulate crisis response. As a rule, exercises that involve most or all of the agencies likely to be responding in a disaster—and that use the IT infrastructure that they are expected to use—are an essential preparation for effective interagency cooperation.92 Exercises help to identify and solve some social, organizational, and technical problems, and they help to reveal the rivalries and infighting between agencies whose resolution is important to real progress in this area. To the extent that the agencies and personnel involved in an exercise are the same as those involved in the response to a real incident, exercises can help the response to be less ad hoc and more systematic. Other, less formal activities can also be conducted to improve interagency understanding. For example, personnel from one agency can be detailed to work in another agency in emergency-response situations. As part of pre-service and in-service training, personnel from one agency can be posted to other agencies for short periods to develop contacts and to understand the operating procedures of those other agencies.

•  Budgets that support interagency cooperation. For many agencies, battles over the budget are as important as their day-to-day operational responsibilities. This is not inappropriate, as adequate budget resources are a prerequisite for an agency's success. Thus, it is simply unrealistic to demand cooperation from agencies without providing budget resources dedicated to that end. Note that budget resources support both operations (e.g., personnel and training matters) and the procurement of systems, and both are relevant to interagency cooperation.

3.6.5 Implications for Research

The discussion in the previous sections has two purposes. One is to describe the operational milieu into which technology is deployed—a warning that human beings are an essential part of any operational system and that system design must incorporate sophisticated knowledge of human and organizational issues as well as technical knowledge. A second purpose is to develop a rationale for research into human and organizational issues relating to technology in a counterterrorism context.

Research in this area will be more applied than basic. The social sciences (used broadly to include psychology, anthropology, sociology, organizational behavior, human factors, and so on) have developed a significant base of knowledge that is relevant to the deployment of IT-based systems. But in practice, social scientists with the relevant domain expertise often lack applied skills and the requisite technical IT knowledge.

92   Despite the creation of New York City's Office of Emergency Management in 1996 and expenditures of nearly $25 million to coordinate emergency response, the city had not conducted an emergency exercise at the World Trade Center—which had been bombed in 1993—between 1996 and September 11, 2001, that included the Fire Department, the police, and the Port Authority's emergency staff.

Similarly, information technologists often lack the appropriate domain knowledge and often use a system development process that makes it difficult to incorporate human and organizational considerations. This point suggests that research is relevant in at least four different areas:

•  The formulation of system development methods that are more amenable to the incorporation of domain knowledge and social science expertise. The "spiral development" methodology for software development is an example of how user inputs and concerns can be used to drive the development process, but the method is hard to generalize to incorporate knowledge about the organizational context of use.

•  The translation of social science research findings into guidelines and methods that are readily applied by the technical community. The results of this research effort might very well be software toolboxes as well as a "Handbook of Applied Social Science" or a "Cognitive Engineering Handbook" containing useful principles for system development and design derived from the social science research base.

•  The development of reliable security measures that do not interfere with what legitimate workers must do. These measures must minimize the loads on human memory and attention, and the interference with tasks, while providing appropriate levels of security in the face of adversaries who use sophisticated technologies as well as social engineering techniques to penetrate the security.

•  Understanding of the IT issues related to the disparate organizational cultures of the agencies to be fused under the Department of Homeland Security. This is a complex task, with difficult technical issues interspersed with complex procedural, permissions, and organizational issues that require a mix of technical and social skills to manage. Operationally, the question is how to allow for the sharing of communications and data among different organizations that have different needs to know, differing requirements, and different cultural and organizational structures, in a way that advances the desired goals while maintaining the required security.