Cybersecurity Today and Tomorrow: Pay Now or Pay Later

1 Cybersecurity Today and Tomorrow

BACKGROUND AND INTRODUCTION

In the wake of the horrific events of September 11, 2001, the nation's attention has focused heavily on various dimensions of security. Security for nuclear power plants, shopping malls, sports stadiums, and airports, to name just a few entities, received additional scrutiny in the weeks that followed. Computer and telecommunication systems, too, have received significant attention, and because the Computer Science and Telecommunications Board (CSTB, described further on the final pages of this report) of the National Research Council (NRC) has examined various dimensions of computer and network security and vulnerability, it decided to revisit reports relevant to cybersecurity issued over the last decade. In some instances, security issues were the primary focus of a report from the start (see, for example, (1) Computers at Risk, 1991;1 (2) Cryptography's Role in Securing the Information Society, 1996;2 (3) For the Record: Protecting Electronic Health Information, 1997;3 and (4) Trust in Cyberspace, 19994). In other instances, security issues emerged as a prominent element of a study as the study unfolded (see, for example, (5) Continued Review of the Tax Systems Modernization of the Internal Revenue Service, 1996;5 (6) Realizing the Potential of C4I, 1999;6 and (7) Embedded, Everywhere, 20017). (Hereinafter, these reports are referenced by number.) Security issues continue to be an important part of CSTB's portfolio, and CSTB recently held workshops that explored how to deal with the insider threat to security (2000) and various legal issues associated with protecting critical infrastructure (2001).

Though the most recent of the comprehensive reports was issued 2 years ago and the earliest 11 years ago, not much has changed with respect to security as it is practiced, notwithstanding further evolution of the public policy framework and an increase in our perception of the risks involved. The unfortunate reality is that, relative to the magnitude of the threat, our ability and willingness to deal with threats have, on balance, changed for the worse (6), making many of the analyses, findings, and recommendations of these reports all the more relevant, timely, and applicable today. This document presents the enduring findings and recommendations from that body of work, and it includes excerpts from three of the reports listed above.

1 Computer Science and Telecommunications Board, National Research Council. 1991. Computers at Risk: Safe Computing in the Information Age. National Academy Press, Washington, D.C.

2 Kenneth W. Dam and Herbert S. Lin (eds.), Computer Science and Telecommunications Board, National Research Council. 1996. Cryptography's Role in Securing the Information Society. National Academy Press, Washington, D.C.

3 Computer Science and Telecommunications Board, National Research Council. 1997. For the Record: Protecting Electronic Health Information. National Academy Press, Washington, D.C.

4 Computer Science and Telecommunications Board, National Research Council. 1999. Trust in Cyberspace. National Academy Press, Washington, D.C.

THE NATURE OF CYBERTHREATS

Much of modern life depends on computers and computer networks. For many people, the most visible interaction they have with computers is typing at a keyboard. But computers and networks are critical for key functions such as managing and operating nuclear power plants, dams, the electric power grid, the air traffic control system, and the financial infrastructure. Computers are also instrumental in the day-to-day operations of companies, organizations, and government.
Companies large and small rely on computers to manage payroll, to track inventory and sales, and to perform research and development. Distribution of food and energy from producer to retail consumer relies on computers and networks at every stage. Nearly everyone in business or government relies on electronic communications—whether telephone, fax, e-mail, or instant messages—all of which are enabled by computers.

5 Computer Science and Telecommunications Board, National Research Council. 1996. Continued Review of the Tax Systems Modernization of the Internal Revenue Service: Final Report. National Academy Press, Washington, D.C.

6 Computer Science and Telecommunications Board, National Research Council. 1999. Realizing the Potential of C4I: Fundamental Challenges. National Academy Press, Washington, D.C.

7 Computer Science and Telecommunications Board, National Research Council. 2001. Embedded, Everywhere: A Research Agenda for Networked Systems of Embedded Computers. National Academy Press, Washington, D.C.

Many (perhaps even most) computer systems are networked in some fashion today (4), most visibly and commonly via the vast collection of globally interconnected computer networks known as the Internet. A more recent trend is toward embedding computing capability in all kinds of devices and environments and networking embedded systems into larger systems (7). These trends make many computing and communications systems critical infrastructure in themselves and components of other kinds of critical infrastructure, from energy to transportation systems.8

What can go wrong with a computer system or network?

- It can become unavailable or very slow (1,4,6). That is, using the system or network at all becomes impossible, or nearly so. The e-mail does not go through, or the computer simply freezes, with the result that somebody is unable to get his or her job done in a timely way, be it servicing a customer or reacting to a crisis.
- It can become corrupted, so that it does the wrong thing or gives wrong answers (1-7). For example, data stored on the computer may become different from what it should be, as would be the case if medical or financial records were improperly modified. Or, freight manifests might be altered so that the wrong material is shipped, an obvious problem for any rapid military deployment.
- It can become leaky (1-7). That is, someone who should not have access to some or all of the information available through the network obtains such access. For example, a spy who gains access to files stored in an intelligence agency information system may be able to view very sensitive data.

CAUSES OF SYSTEM AND NETWORK PROBLEMS

What can cause something to go wrong in a computer system or network? It is useful to distinguish between accidental causes and deliberate causes.
In general, accidental causes are natural (e.g., a lightning surge that destroys a power supply, causing part of a network to fail) or human but nondeliberate (e.g., an accidental programming error that causes a computer to crash under certain circumstances, or the unintended cutting of a communications cable during excavation). Accidental causes figure prominently in many aspects of trustworthiness besides security, such as safety or reliability (1,3,4,7).

8 The President's Commission on Critical Infrastructure Protection included under the rubric of "critical infrastructure" telecommunications, electric power systems, gas and oil production and storage, banking and finance, transportation, water supply systems, government services, and emergency services. See President's Commission on Critical Infrastructure Protection. 1997. Critical Foundations. Washington, D.C.

Deliberate problems are the result of conscious human choice. In the context of seeking to understand the laws of physics, Einstein once said that while nature may be subtle, it is not malicious. But in dealing with deliberate problems, one is faced with malicious intent. A malicious human may seek to hide his or her tracks, making it difficult to identify the nature of the problem caused (or even to identify that a problem has been caused).9 A malicious human can, in principle, tailor actions to produce a desired effect beyond the damage to the actual system attacked—unlike an accidental problem, whose effects are randomly determined. Security experts often refer to the efforts of these malicious people as "attacks."10

A central challenge in responding to an information system attack is identifying who the attacker is and distinguishing whether the motive is mischief, terrorism, or attack on the nation. A related challenge is determining whether events that are distant in time or space are related—parts of a given attack (1,4,6).

Note also that an attacker—who seeks to cause damage deliberately—may be able to exploit a flaw accidentally introduced into a system. System design and/or implementation that is poor by accident can result in serious security problems that can be deliberately targeted in a penetration attempt by an attacker.11

There are many ways to cause problems deliberately. One way—which receives a great deal of attention in the media—is through an attack that arrives "through the wires." The Internet makes it possible to mount such an attack remotely, anonymously, and on a large scale (4). One example of such "cyber-only" attacks is the computer virus, which infects a user's computer, takes some destructive action such as deleting files on the network or local hard drive, and propagates itself further, for example by e-mailing copies of itself. A second example is a distributed denial-of-service attack, described in footnote 9.

9 Tracing attacks is generally difficult, because serious attackers are likely to launder their connections to the target. That is, an attacker will compromise some intermediate targets whose vulnerabilities are easy to find and exploit, and use them to launch more serious attacks on the ultimate intended target. This, of course, is what has happened in a number of distributed denial-of-service attacks against Web servers of certain U.S. companies and government agencies, in which a number of computers flooded their targets with bogus requests for service, thus making them unavailable to provide service to legitimate users.

10 "Attack" is a word that has seemed excessive to some (particularly because many attacks have been traced to individuals with motivation more akin to that of a joyrider than to a state-supported, well-organized attacker). Recent events suggest that "attack" is increasingly appropriate, insofar as it is analogous to more familiar or conventional forms of attacks on resources of different kinds, in either the military or civilian sector.

11 A particularly insidious "accidental" problem arises because the precise software configuration on any operational system (including applications, device drivers, and system patches) has almost certainly not been tested for security—there are simply too many possible configurations to test more than a small fraction explicitly. As new applications and device drivers are installed over time, an operational system is more likely to exhibit additional vulnerabilities that an attacker might exploit.

The damage that a cyber-only attack causes may not be immediately (or ever) apparent. A successful attack may lay a foundation for later attacks, be set to cause damage well after the initial penetration, or enable the clandestine transmission of sensitive information stored on the attacked system (1,4,6). For example, a number of recent incidents have compromised the computers of unsuspecting home users by implanting unauthorized code; these computers were subsequently used as launch points in a coordinated, distributed denial-of-service attack.

Finally, the fact that the Internet connects many of the world's computers implies that a cyber-only attack can be launched from locations around the world, routed through other countries (perhaps clandestinely and unknown to anyone in those countries), and directed against any U.S. computer on the Internet. The availability of a plethora of launch points and routes for cyberattack greatly complicates the ability to stop an attack before it reaches the security barriers of the U.S. computers in question, as well as to identify its source. The Internet's various intentional or inadvertent links to other communications networks make them potentially vulnerable to worldwide attacks as well (1,2,4,6).

A cyber-only attack is only one way to cause problems in a computer system or network.
Other ways include the following:

- The compromise of a trusted insider, who can provide system or network access to outsiders or use his or her access for improper purposes (e.g., providing passwords that permit outsiders to gain entry) (1,3-6). This trusted insider may be recruited covertly by hostile parties, planted well in advance of any action associated with an actual attack (the so-called "sleeper" problem), or tricked into taking some action that breaches system security (e.g., tricked into disclosing a password or installing software that permits access by malicious outsiders).
- Physical destruction of some key element of the system or network, such as critical data centers or communications links (4,6,7). Examples of physical vulnerabilities are various backhoe incidents in which accidental cutting of fiber-optic cables (both primary and backup!) resulted in major network outages, and the severe damage to a Verizon central office in the World Trade Center attack on September 11, 2001.
It is useful to distinguish among three important concepts of cybersecurity. A vulnerability is an error or a weakness in the design, implementation, or operation of a system. A threat is an adversary that is motivated to exploit a system vulnerability and is capable of doing so. Risk refers to the likelihood that a vulnerability will be exploited, or that a threat may become harmful. In this lexicon, a system that allows computer viruses to replicate or unauthorized users to gain access exhibits vulnerabilities. The creator of the virus or the unauthorized user is the threat to the system. Operating a system with known vulnerabilities in the presence of possible threats entails some risk that harm or damage will result.

THE HARM FROM BREACHES OF CYBERSECURITY

How do potential cyberdisasters compare with disasters in the physical world? As the catastrophic events of September 11, 2001, demonstrate, disasters in the physical world can involve massive loss of life and damage to physical infrastructure over a very short period of time. The damage from most cyberattacks is unlikely to be manifested in such a manner—although interference with medical information systems and devices could affect lives. If undertaken by themselves, cyberattacks could compromise systems and networks in ways that could render communications and electric power distribution difficult or impossible, disrupt transportation and shipping, disable financial transactions, and result in the theft of large amounts of money (1,2,4). Economic and associated social harm is a likely consequence of a successful large-scale cyberattack. That harm would involve at least opportunity costs—interruption of business, forgoing of various activities and associated benefits, and so on.
While such results would qualify on any scale as disastrous, additional harm can come from the interactions of cyber- and physical systems under attack that endanger human life directly and affect physical safety and well-being (3,4,7). In particular, a large-scale coordinated cyberattack could occur at the same time as an attack on the physical infrastructure. For example, a successful cyberattack launched on the air traffic control system in coordination with airliner hijackings could result in a much more catastrophic disaster scenario than was seen on September 11, 2001. Or compromising communications channels during a physical attack of that magnitude could prevent government officials from responding to the attack, coordinating emergency response efforts, or even knowing whether the attack was still ongoing.
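The vulnerability-threat-risk lexicon above lends itself to a simple quantitative sketch. The scoring rule, names, and numbers below are illustrative assumptions for exposition; the CSTB reports prescribe no such formula:

```python
# Toy illustration of the vulnerability / threat / risk lexicon.
# All names, likelihoods, and impact figures are invented for this example.

from dataclasses import dataclass

@dataclass
class Vulnerability:
    name: str
    exploit_likelihood: float  # 0.0-1.0: chance an attempt succeeds

@dataclass
class Threat:
    name: str
    capable: bool    # adversary is able to exploit the vulnerability
    motivated: bool  # adversary wants to exploit it

def risk_score(vuln: Vulnerability, threat: Threat, impact: float) -> float:
    """Risk ~ likelihood of exploitation times impact; zero without a real threat."""
    if not (threat.capable and threat.motivated):
        return 0.0
    return vuln.exploit_likelihood * impact

weak_password = Vulnerability("guessable admin password", 0.5)
virus_author = Threat("virus writer", capable=True, motivated=True)

print(risk_score(weak_password, virus_author, impact=100.0))  # 50.0
```

The point of the sketch is the report's distinction: the same vulnerability carries no risk when no capable, motivated threat exists, and the risk of a known vulnerability scales with the harm an exploitation would cause.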
WHAT DO WE KNOW ABOUT CYBERSECURITY?

With the above perspective in mind, here are some of the main messages about cybersecurity that emerge from a review of CSTB reports.

General Observations

In the United States, information system vulnerabilities, from the standpoint of both operations and technology, are growing faster than the country's ability (and willingness) to respond (1,2,4,6,7).

Security is expensive, not only in dollars but especially in interference with daily work (1,2,4-6). It has no value when there is no attack (or natural/accidental disruption in the system environment).12 Consequently, people tend to use as little of it as they think they can get away with. Exhortations to be more careful may work for a short time, but operational security can be maintained only by systematic, independently conducted "red team" attacks and correction of the defects that they reveal. Moreover, there are no widely accepted metrics for characterizing security, so it is difficult for a decision maker to know how much security a given investment buys or whether that investment is enough.

The overall security of a system is only as strong as its weakest link (1-7). System security is a holistic problem, in which technological, managerial, organizational, regulatory, economic, and social aspects interact. Weaknesses in any of these aspects can be very damaging, since competent attackers seek out weak points in the security of a network or system.

The best is the enemy of the good. Risk management is an essential element of any realistic strategy for dealing with security issues (2-6). Experience has demonstrated—sadly—that the quest for perfection is the enemy of concrete, actionable steps that can provide improved but not perfect security. It is true that given enough time and effort, almost any security system can be breached.
But that does not diminish the value of steps that can increase the difficulty of breaching security.

Security is a game of action and reaction (1,4,6). When old vulnerabilities are corrected, attackers look for new paths of attack. On the other hand, it takes time to find those new paths, and during that time the system is more secure. Systems have many potential points of vulnerability, and an attacker is free to choose any one of them. For example, as antivirus software came to protect against conventional viruses, virus writers exploited a new channel—the macro capabilities of word processors—that was never intended to provide the capability for implementing viruses. Thus, security must be approached on a system level rather than on a piecemeal basis (1-7).

12 More precisely, designs and architectures and implementation methods that can be used to make a system more secure are often the same as those that enhance system reliability and trustworthiness. However, specific security features often are not valuable against natural or accidental disruption.

Because cyberattacks can be conducted without leaving publicly visible evidence (unlike, for example, a plane crash), it is easy to cover them up (1,3,4,6). Reporting of attempts, successful and unsuccessful, to breach security—the where, when, and how of attacks—is essential both for forensics (to determine who is responsible and whether incidents in different places are part of the same attack) and for prevention (to defend against future attacks). Researchers, developers, and operators need this information to redesign systems and procedures to avoid future incidents, and national security and law enforcement agencies need it to defend the nation as a whole. Organizations that are attacked prefer to conceal attacks, because publicity may undermine public confidence, disclose adverse information, and make managers look bad. Weighing these costs and benefits should be a public policy issue, but so far the commercial and face-saving concerns of targets have dominated, and there is no effective reporting. The airline industry might be a good model to copy (in the sense that both accidents and near misses are reported), and the information-sharing problem is being explored in the context of critical infrastructure protection, a perspective that emerged in the late 1990s.

Management

From an operational standpoint, cybersecurity today is far worse than what known best practices can provide (1-6). Even without any new security technologies, much better security would be possible today if technology producers, operators of critical systems, and users took appropriate steps.
But new technologies and new operating procedures—which would require additional investment for research and development—could make things even better.

Because a secure system doesn't allow users to do any more than an insecure system, system and network operators in the private sector spend only as much on security as they can justify on business grounds—and this may be much less than the nation needs as a whole (1,3-6). (The same is true of government agencies that must work within budget constraints, though the detailed cost-benefit calculus may be different.) Further, because serious cyberattacks are rare, the payoff from security investments is uncertain (and in many cases, it is society rather than any individual firm that will capture the benefit of improved security). As a result, system and network operators tend to underinvest in security. Changing market incentives—for example, by adjusting the liability to which business users of technology might be subject or the insurance implications of good security—could have a dramatic impact on the market for security features.13

For economic reasons, systems are generally built out of commercial off-the-shelf components. These are not very secure because there isn't much market demand: customers buy features and performance rather than security. The failure of the U.S. government's Orange Book14 program even within the federal marketplace is a striking example. The government demanded secure systems, industry produced them, and then government agencies refused to buy them because they were slower and less functional than other, nonsecure systems available on the open market (1,4).15

Because security measures are disaster-preventing rather than payoff-producing, a central aspect of security must be accountability. That is, users and operators must be held responsible by management for taking all appropriate security measures—one cannot count on financial and market incentives alone to drive appropriate action (1,3-6). Many security problems exist not because a fix is unknown but because some responsible party has not implemented a known fix. Of course, appropriate security measures are not free. Management must be willing to pay the costs and must demand from vendors the tools needed to minimize those costs. (Note that costs can include the costs of testing a fix to see if it ruins the production environment.) Management must also resolve the tension between holding people responsible and obtaining full reporting of problems, which is easier in an environment in which individuals are not fearful of reporting problems.
13 For example, under today's practices, a party that makes investments to prevent its own facilities from being used as part of a distributed denial-of-service (DDOS) attack will reap essentially no benefits from such investments, because such an attack is most likely to be launched against a different party. But today's Internet-using society would clearly benefit if many firms made such investments. Making parties liable for not securing their facilities against being illicitly used as part of a DDOS attack (today there is zero liability) would change the incentives for making such investments. In a current project on critical infrastructure protection and the law, CSTB is exploring this issue (among others) in greater depth.

14 "The Orange Book" is the nickname for the Trusted Computer System Evaluation Criteria, which were intended to guide commercial system production generally and thereby improve the security of systems in use.

15 It did not help that systems compliant with Orange Book criteria also came later to market and often had less functionality (e.g., in some cases, a certified system was unable to connect to a network, because a network connection was not part of the certified configuration).
Operational Considerations

To promote accountability, frequent and unannounced penetration testing (so-called red-teaming) is essential to understand the actual operational vulnerabilities of deployed systems and networks (6). No other method is as effective at pointing to security problems that must be solved. Information about vulnerabilities thus gathered must be made available to those who are in a position to fix them—or to upper management, who can force them to be fixed. Note that effective red-teaming is undertaken independently of the system or network being tested—those being tested must not know when the test will occur or what aspects of security will be tested, while those doing the testing must be technically savvy and not constrained by operating orders that limit what they are permitted to do.

Many compromises of an information system or network result from improper configuration (1,3,4,6). For example, a given system on a network may have a modem attached to it that is not known to the network administrator, even if it was attached by a legitimate user for legitimate purposes. An installed operating system may lack critical "bug" fixes, because they were not applied or because the system was restored from a backup tape that did not include those fixes. A system firewall may be improperly configured in a way that allows Web access when, in fact, the system should only be able to transmit and receive e-mail. Or, a group of users may be given privileges that should, in fact, be restricted to one member of that group. Because checking operational configurations is very labor-intensive if done manually, it is essential to have configuration management tools for both systems and networks that can automatically enforce a desired configuration or alert administrators when variances from the known configuration are detected. Such tools are miserably inadequate today.
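The core of such a configuration-checking tool is simple in miniature: compare a machine's actual settings against a desired baseline and report every variance. The dictionary-based model and setting names below are illustrative assumptions, not a description of any real product:

```python
# Minimal sketch of automatic configuration checking: diff actual settings
# against a desired baseline. Setting names and values are invented.

def config_variances(desired: dict, actual: dict) -> list:
    """Return human-readable differences between desired and actual config."""
    problems = []
    for key, want in desired.items():
        have = actual.get(key)
        if have is None:
            problems.append(f"{key}: missing (expected {want!r})")
        elif have != want:
            problems.append(f"{key}: {have!r} (expected {want!r})")
    for key in actual.keys() - desired.keys():  # settings nobody approved
        problems.append(f"{key}: unexpected setting {actual[key]!r}")
    return sorted(problems)

desired = {"firewall_web_access": False, "modem_attached": False,
           "patch_level": "2002-01"}
actual = {"firewall_web_access": True, "modem_attached": False,
          "patch_level": "2001-06", "guest_account": True}

for line in config_variances(desired, actual):
    print(line)
```

A real tool would gather the "actual" side from the running system and either alert on or automatically correct each variance; the hard part, as the text notes, is careful engineering of that gathering and enforcement, not the comparison itself.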
Building such tools does not require research, but it does require a considerable amount of careful engineering.

Since perfect security is impossible, secure configurations need to be updated when new attacks are discovered. These updates need to be delivered automatically to millions of systems (4,7).16

16 On the other hand, there is a nontrivial chance that updates will diminish existing and needed functionality, and people are sometimes reluctant to apply updates because they do not want to risk system instability. Thus, the trustworthiness of the updates themselves, as well as of the updating process, becomes an issue of concern.

Organizations must have concrete fallback action plans that instruct users and administrators about what they should do under conditions of cyberattack (6). Admonitions to "be more careful" are not actionable, especially since the effects of a cyberattack may not be obvious, nor will they be constant over time. Instead, the tradeoffs between vulnerability and functionality must be understood, and appropriate responses to attacks must be defined. Usually these responses involve making the systems do less in order to make them less vulnerable: fewer authorized users, less software running, and less communication between systems. For example, software architects might design a system so that operators could close off certain routes of access to it when under attack, thereby losing the useful functionality associated with those routes but preserving critical functions that do not need them.

Design and Architectural Considerations

There are often tensions between security and other good things, such as features, ease of use, and interoperability (1,3,4,6). For example, if users have different passwords for different systems, it is harder for an unauthorized party to gain access to all of those systems, but users must bear the burden of remembering multiple passwords.

Because the benefits of successful security can be seen only in events that usually do not happen, resources devoted to security are "wasted" in the same sense that resources devoted to insurance are "wasted." In both cases, the system user (or the insured party) does not gain additional functionality as a result of its expenditures. But that does not make investments in security worthless—rather, it changes the terms on which such investments should be evaluated, which include the value of being able to continue operation in the face of hostile attacks (and often natural or accidental disruptions as well).17

Human error is usually not a useful explanation for security problems.
Usually either operational or management practice is at fault: operational practice that requires people to get too many details right or that does too little red-team testing, and management practice that allows too little time for security procedures or fails to ensure that problems uncovered by testing are fixed (1,3-5).

17 Note that one fundamental difference between risks in the physical world and risks in cyberspace is the existence of an extensive actuarial database for the former, which enables organizations to assess the payoff from investments to deal with those risks. By comparison, operations in cyberspace are new and continually evolving, and risks in cyberspace are not well understood by the insurance industry. That industry has recently increased its activity in this domain, but progress has been slow.

While cryptography is not a magic bullet for security problems, it does play key roles in system and network security. Cryptography has three primary uses: authentication, integrity checks, and confidentiality (1,2,4). Cryptographic authentication is an important aspect of security techniques that deny access or system privileges to unauthorized users. Cryptographic integrity checks ensure that data cannot be modified without revealing the fact of modification. Cryptographic confidentiality can be used to keep unauthorized parties from reading data stored in systems and sent over networks. Of course, adversaries can also use cryptography, to the detriment of certain national security and law enforcement purposes, but its widespread use can promote and enhance crime prevention and national security efforts as well.

User authentication is essential for access control and for auditing (1-6). The most common method used today to authenticate users is the password, which is known to be insecure compared with other authentication methods. A hardware token (e.g., a smart card), supplemented by a personal identification number or biometrics (assuming good implementation), is much better. The user doesn't have to keep track of passwords, and a lost token is physically obvious and cannot be broadcast to a myriad of unauthorized parties (but the user does have to remember to bring the token to the computer access point).

A common approach to network security is to surround an insecure network with a defensive perimeter that controls access to the network (1,4,6). Once past the perimeter, a user is left unconstrained. A perimeter defense is good as part of a defense in depth, especially because the security burden is placed primarily on those who manage the perimeter rather than on those who manage systems inside it. However, it is entirely vulnerable if a hostile party gains access to a system inside the perimeter or compromises a single authorized user.
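The integrity-check and authentication roles of cryptography described above can be illustrated with the standard HMAC construction from Python's standard library; the key and the freight-manifest messages below are invented for the example:

```python
# Sketch of a cryptographic integrity check using HMAC (a standard
# keyed-hash construction). Key and messages are illustrative only.

import hashlib
import hmac

key = b"shared-secret-key"  # assumed known only to sender and receiver
message = b"freight manifest: ship 40 crates to depot 7"

# The sender computes a tag over the data and transmits both.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    """Recompute the tag and compare; any modification changes the tag."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)  # constant-time comparison

assert verify(key, message, tag)  # unmodified data passes the check
tampered = b"freight manifest: ship 40 crates to depot 9"
assert not verify(key, tampered, tag)  # modification is revealed
```

Because only a holder of the key can produce a valid tag, the same mechanism also authenticates the sender, which is why the report lists authentication and integrity as distinct but related uses. The `compare_digest` call is the library's timing-safe comparison, used instead of `==` so that verification time leaks nothing about the expected tag.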
Another approach to network security is mutual suspicion: every system within a critical network regards every other system as a potential source of threat. Thus, a hostile party who gains access to one system does not automatically gain access to the whole network. Mutual suspicion can provide significantly higher levels of security, but it requires all system operators to pay attention to security, not just those at the network perimeter. Perimeter defense and mutual suspicion can be used together to increase network security.

WHAT CAN BE DONE?

It is helpful to distinguish among actions that can be taken by individual organizations, by vendors, and by makers of public policy. However, the best results for cybersecurity will be obtained through actions by all parties.
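Returning to the mutual-suspicion model above, one minimal sketch, assuming each pair of systems shares an authentication key (the peer names and keys here are invented for illustration): every system verifies each incoming request itself, so being "inside the network" confers no trust by itself.

```python
import hmac
import hashlib

# Hypothetical per-peer keys held by one system in the network. Under
# mutual suspicion, every system keeps its own table and does its own checks.
peer_keys = {"billing": b"key-billing", "inventory": b"key-inventory"}

def accept_request(claimed_peer: str, payload: bytes, tag: bytes) -> bool:
    """Accept a request only if the sender proves knowledge of its key;
    network location is never used as evidence of trustworthiness."""
    key = peer_keys.get(claimed_peer)
    if key is None:
        return False  # unknown peers are treated as hostile
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

good_tag = hmac.new(b"key-billing", b"ship order 7", hashlib.sha256).digest()
assert accept_request("billing", b"ship order 7", good_tag)
assert not accept_request("inventory", b"ship order 7", good_tag)  # wrong key
assert not accept_request("intruder", b"ship order 7", good_tag)   # unknown peer
```

A compromised "billing" host still cannot impersonate "inventory" to a third system, which is exactly the containment property the text describes.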
Individual Organizations

Individual organizations should:

• Establish and provide adequate resources to an internal entity with responsibility for providing direct defensive operational support to system administrators throughout the organization (3,5,6). To serve as the focal point for operational change, such an entity must have the authority, as well as a person in charge, to force corrective action.

• Ensure that adequate information security tools are available, that everyone is properly trained in their use, and that enough time is available to use them properly. Then hold all personnel accountable for their information system security practices (3,5,6).

• Conduct frequent, unannounced red-team penetration testing of deployed systems and report the results to responsible management (6).

• Promptly fix problems and vulnerabilities that are known or that are discovered to exist (3,5,6).

• Mandate the organization-wide use of currently available network/configuration management tools, and demand better tools from vendors (3,5,6).

• Mandate the use of strong authentication mechanisms to protect sensitive or critical information and systems (3,5,6).

• Use defense in depth. In particular, design systems under the assumption that they will be connected to a compromised network or a network that is under attack, and practice operating these systems under this assumption (1,4,6).

• Define a fallback plan for more secure operation when under attack and rehearse it regularly. Complement that plan with a disaster recovery program (1,6).

Vendors of Computer Systems

Vendors of computer systems should:

• Drastically improve the user interface to security, which is totally incomprehensible in nearly all of today's systems (1,4,6). Users and administrators must be able to see the current security state of their systems easily; this means that the state must be expressible in simple terms.
• Develop tools to monitor systems automatically for consistency with defined secure configurations, and enforce these configurations (1,4,6,7). Extensive automation is essential to reduce the amount of human labor that goes into security. The tools must promptly and automatically respond to changes that result from new attacks.
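The core of such a configuration-monitoring tool can be sketched in a few lines. This is a toy illustration under stated assumptions: the baseline settings below are invented, and a real tool would load its baseline from a signed policy file and act on (not merely report) the drift it finds.

```python
# Hypothetical secure baseline of required settings (names are invented).
SECURE_BASELINE = {
    "password_min_length": 12,
    "remote_root_login": "disabled",
    "audit_logging": "enabled",
}

def find_drift(current: dict) -> dict:
    """Return {setting: (expected, actual)} for every nonconforming or
    missing setting; an empty result means the system matches the baseline."""
    drift = {}
    for setting, expected in SECURE_BASELINE.items():
        actual = current.get(setting)
        if actual != expected:
            drift[setting] = (expected, actual)
    return drift

# A live configuration with one weakened setting and one missing setting:
live = {"password_min_length": 8, "remote_root_login": "disabled"}
assert find_drift(live) == {
    "password_min_length": (12, 8),
    "audit_logging": ("enabled", None),
}
```

Running such a check on a schedule, and alarming or auto-remediating on nonempty drift, is one way to make secure configurations self-enforcing rather than dependent on administrator vigilance.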
• Provide well-engineered schemes for user authentication based on hardware tokens (3,4,6). These systems should be both more secure and more convenient for users than current password systems are.

• Develop a few simple and clear blueprints for secure operation that users can follow, since most organizations lack the expertise to do this properly on their own. For example, systems should be shipped with security features turned on, so that a conscious effort is needed to disable them, and with default identifications and passwords turned off, so that a conscious effort is needed to select them (1,3).

• Strengthen software development processes and conduct more rigorous testing of software and systems for security flaws, doing so before releasing products rather than using customers as implicit beta testers to shake out security flaws (4).18 Changing this mind-set is one necessary element of an improved cybersecurity posture.

Policy Makers

Policy makers should:

• Consider legislative responses to the failure of existing incentives to cause the market to respond adequately to the security challenge. Possible options include steps that would increase the exposure of software and system vendors and system operators to liability for system breaches, and mandated reporting of security breaches that could threaten critical societal functions (1).

• Position the federal government as a leader in technology use and practice by requiring agencies to adhere to the practices recommended above and to report on their progress in implementing those measures (1,2,5).19 Such a step would also help to grow the market for security technology, training, and other services.

• Provide adequate support for research and development on information systems security (1,4,7).
Research and development on information systems security should be construed broadly to include R&D on defensive technology (including both underlying technologies and architectural issues), organizational and sociological dimensions of such security, forensic and recovery tools, and best policies and practices. Given the failure of the market to address security challenges adequately, government support for such research is especially important.

18 Note that security-specific testing of software goes beyond looking at flaws that emerge in the course of ordinary usage in an Internet-connected production environment. For example, security-specific testing may involve very sophisticated attacks that are not widely known in the broader Internet hacker community.

19 This concept has been implicit in a series of laws, beginning with the Computer Security Act of 1987, and in administrative guidance (e.g., from the Office of Management and Budget and, more recently, from the Federal Chief Information Officers Council). Although it has been an elusive goal, movements toward e-government have provided practical, legal, and administrative impetus.