
2
Excerpts from Earlier CSTB Reports

This chapter contains excerpts from three CSTB reports: Computers at Risk (1991), Realizing the Potential of C4I (1999), and Trust in Cyberspace (1999). While this synthesis report draws on all of the references in footnotes 1-7 (Chapter 1), these three reports are the most general and broadly applicable; to keep this report to a reasonable length, nothing was excerpted from the other four. Readers are encouraged to read all of these reports, which can be found online at <http://www.nap.edu>. For the sake of simplicity and organizational clarity, footnotes appearing in the original text have been omitted from the reprinted material that follows. A gray bar in the margin, rather than indentation, is used to indicate extracted text. Subsection heads have been added to show the topics addressed.


COMPUTERS AT RISK: SAFE COMPUTING IN THE INFORMATION AGE (1991)

CITATION: Computer Science and Telecommunications Board (CSTB), National Research Council. 1991. Computers at Risk: Safe Computing in the Information Age. National Academy Press, Washington, D.C.

The Cybersecurity Challenge

(From pp. 7-8): We are at risk. Increasingly, America depends on computers. They control power delivery, communications, aviation, and financial services. They are used to store vital information, from medical records to business plans to criminal records. Although we trust them, they are vulnerable—to the effects of poor design and insufficient quality control, to accident, and perhaps most alarmingly, to deliberate attack. The modern thief can steal more with a computer than with a gun. Tomorrow’s terrorist may be able to do more damage with a keyboard than with a bomb.

To date, we have been remarkably lucky. Yes, there has been theft of money and information, although how much has been stolen is impossible to know. Yes, lives have been lost because of computer errors. Yes, computer failures have disrupted communication and financial systems. But, as far as we can tell, there has been no successful systematic attempt to subvert any of our critical computing systems. Unfortunately, there is reason to believe that our luck will soon run out. Thus far we have relied on the absence of malicious people who are both capable and motivated. We can no longer do so. We must instead attempt to build computer systems that are secure and trustworthy.

. . .[T]he degree to which a computer system and the information it holds can be protected and preserved . . . , which is referred to here as computer security, is a broad concept; security can be compromised by bad system design, imperfect implementation, weak administration of procedures, or through accidents, which can facilitate attacks. Of course, if we are to trust our systems, they must survive accidents as well as attack. Security supports overall trustworthiness, and vice versa.

Fundamentals of Cybersecurity

(From p. 2): Security refers to protection against unwanted disclosure, modification, or destruction of data in a system and also to the safeguarding of systems themselves. Security, safety, and reliability together are elements of system trustworthiness—which inspires the confidence that a system will do what it is expected to do.


(From pp. 49-50): Organizations and people that use computers can describe their needs for information security and trust in systems in terms of three major requirements:

  • Confidentiality: controlling who gets to read information;

  • Integrity: assuring that information and programs are changed only in a specified and authorized manner; and

  • Availability: assuring that authorized users have continued access to information and resources.

These three requirements may be emphasized differently in various applications. For a national defense system, the chief concern may be ensuring the confidentiality of classified information, whereas a funds transfer system may require strong integrity controls. The requirements for applications that are connected to external systems will differ from those for applications without such interconnection. Thus the specific requirements and controls for information security can vary.

The framework within which an organization strives to meet its needs for information security is codified as security policy. A security policy is a concise statement, by those responsible for a system (e.g., senior management), of information values, protection responsibilities, and organizational commitment. One can implement that policy by taking specific actions guided by management control principles and utilizing specific security standards, procedures, and mechanisms. Conversely, the selection of standards, procedures, and mechanisms should be guided by policy to be most effective.

To be useful, a security policy must not only state the security need (e.g., for confidentiality—that data shall be disclosed only to authorized individuals), but also address the range of circumstances under which that need must be met and the associated operating standards. Without this second part, a security policy is so general as to be useless (although the second part may be realized through procedures and standards set to implement the policy). In any particular circumstance, some threats are more probable than others, and a prudent policy setter must assess the threats, assign a level of concern to each, and state a policy in terms of which threats are to be resisted. For example, until recently most policies for security did not require that security needs be met in the face of a virus attack, because that form of attack was uncommon and not widely understood. As viruses have escalated from a hypothetical to a commonplace threat, it has become necessary to rethink such policies in regard to methods of distribution and acquisition of software. Implicit in this process is management’s choice of a level of residual risk that it will live with, a level that varies among organizations.
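
To make that structure concrete, here is a minimal sketch, not drawn from the report itself, in which a policy is captured as data: every stated need is paired with the threats it must be met against and the residual risk management has accepted. All statements, threats, and risk levels are illustrative.

```python
# A minimal sketch (all names and values hypothetical) of a security policy
# as data: each stated need carries the threats it must resist and the
# residual risk management has agreed to live with.

from dataclasses import dataclass

@dataclass
class PolicyStatement:
    need: str                # the security need being stated
    resisted_threats: list   # circumstances under which it must be met
    residual_risk: str       # level management has chosen to accept

policy = [
    PolicyStatement(
        need="confidentiality: data disclosed only to authorized individuals",
        resisted_threats=["insider browsing", "stolen credentials", "virus infection"],
        residual_risk="low",
    ),
    PolicyStatement(
        need="availability: order entry reachable during business hours",
        resisted_threats=["single-server failure"],
        residual_risk="medium",  # larger outages are an accepted risk here
    ),
]

for statement in policy:
    print(statement.need)
    print("  must be met against:", ", ".join(statement.resisted_threats))
    print("  accepted residual risk:", statement.residual_risk)
```

In this form, the second part the text insists on, the range of circumstances under which each need must be met, is explicit rather than implied.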


The Security Experience: Vulnerability, Threat, and Countermeasure

(From pp. 13-14): The field of security has its own language and mode of thought, which focus on the processes of attack and on preventing, detecting, and recovering from attacks. In practice, similar thinking is accorded to the possibility of accidents that, like attacks, could result in disclosure, modification, or destruction of information or systems or a delay in system use. Security is traditionally discussed in terms of vulnerabilities, threats, and countermeasures. A vulnerability is an aspect of some system that leaves it open to attack. A threat is a hostile party with the potential to exploit that vulnerability and cause damage. A countermeasure or safeguard is an added step or improved design that eliminates the vulnerability and renders the threat impotent.

A safe containing valuables, for example, may have a noisy combination lock—a vulnerability—whose clicking can be recorded and analyzed to recover the combination. It is surmised that safecrackers can make contact with experts in illegal eavesdropping—a threat. A policy is therefore instituted that recordings of random clicking must be played at loud volume when the safe is opened—a countermeasure.

Threats and countermeasures interact in intricate and often counterintuitive ways: a threat leads to a countermeasure, and the countermeasure spawns a new threat. Few countermeasures are so effective that they actually eliminate a threat. New means of attack are devised (e.g., computerized signal processing to separate “live” clicks from recorded ones), and the result is a more sophisticated threat.

The Asymmetry Between Offense and Defense

(From p. 14): The interaction of threat and countermeasure poses distinctive problems for security specialists: the attacker must find but one of possibly multiple vulnerabilities in order to succeed; the security specialist must develop countermeasures for all. The advantage is therefore heavily to the attacker until very late in the mutual evolution of threat and countermeasure.

If one waits until a threat is manifest through a successful attack, then significant damage can be done before an effective countermeasure can be developed and deployed. Therefore countermeasure engineering must be based on speculation. Effort may be expended in countering attacks that are never attempted. The need to speculate and to budget resources for countermeasures also implies a need to understand what it is that should be protected, and why; such understanding should drive the choice of a protection strategy and countermeasures. This thinking should be captured in security policies generated by management; poor security often reflects both weak policy and inadequate forethought.
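
The asymmetry lends itself to a toy calculation, with purely illustrative probabilities: the attacker succeeds if any single vulnerability remains open, so the defender's overall exposure stays high even when each individual vulnerability is very likely to be countered.

```python
# Toy illustration of the offense/defense asymmetry: the attacker succeeds by
# exploiting any one unguarded vulnerability, so the defender's burden grows
# with every vulnerability left open. Probabilities are purely illustrative.

def p_attacker_succeeds(p_open: list) -> float:
    """p_open[i] = probability vulnerability i is still exploitable."""
    p_all_closed = 1.0
    for p in p_open:
        p_all_closed *= (1.0 - p)
    return 1.0 - p_all_closed

# Even when each of ten vulnerabilities is 90% likely to be countered,
# the attacker still gets in about 65% of the time.
print(p_attacker_succeeds([0.1] * 10))   # ~0.651
```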


Confidence in Countermeasures

(From p. 15): Confidence in countermeasures is generally achieved by submitting them for evaluation by an independent team; this process increases the lead times and costs of producing secure systems. The existence of a successful attack can be demonstrated by an experiment, but the adequacy of a set of countermeasures cannot. Security specialists must resort to analysis, yet mathematical proofs in the face of constantly changing systems are impossible.

In practice, the effectiveness of a countermeasure often depends on how it is used; the best safe in the world is worthless if no one remembers to close the door. The possibility of legitimate users being hoodwinked into doing what an attacker cannot do for himself cautions against placing too much faith in purely technological countermeasures.

The evolution of countermeasures is a dynamic process. Security requires ongoing attention and planning, because yesterday’s safeguards may not be effective tomorrow, or even today.

On Network Vulnerabilities

(From p. 17): Interconnection gives an almost ecological flavor to security; it creates dependencies that can harm as well as benefit the community of those who are interconnected. An analogy can be made to pollution: the pollution generated as a byproduct of legitimate activity causes damage external to the polluter. A recognized public interest in eliminating the damage may compel the installation of pollution control equipment for the benefit of the community, although the installation may not be justified by the narrow self-interest of the polluter. Just as average citizens have only a limited technical understanding of their vulnerability to pollution, so also individuals and organizations today have little understanding of the extent to which their computer systems are put at risk by those systems to which they are connected, or vice versa. The public interest in the safety of networks may require some assurances about the quality of security as a prerequisite for some kinds of network connection.

(From p. 8): The threats to U.S. computer systems are international, and sometimes also political. The international nature of military and intelligence threats has always been recognized and addressed by the U.S. government. But a broader international threat to U.S. information resources is emerging with the proliferation of international computer networking—involving systems for researchers, companies, and other organizations and individuals—and a shift from conventional military conflict to economic competition. The concentration of information and economic activity in computer systems makes those systems an attractive target to hostile entities. This prospect raises questions about the intersection of economic and national security interests and the design of appropriate security strategies for the public and private sectors. Finally, politically motivated attacks may also target a new class of system that is neither commercial nor military: computerized voting systems.

Market Influences on Cybersecurity

(From pp. 159-161): Even the best product will not be sold if the consumer does not see a need for it. Consumer awareness and willingness to pay are limited because people simply do not know enough about the likelihood or the consequences of attacks on computer systems or about more benign factors that can result in system failure or compromise. Consumer appreciation of system quality focuses on features that affect normal operations—speed, ease of use, functionality, and so on. This situation feeds a market for inappropriate or incomplete security solutions, such as antiviral software that is effective only against certain viruses but may be believed to provide broader protection, or password identification systems that are easily subverted in ordinary use. . . .

Enhancing security requires changes in attitudes and behavior that are difficult because most people consider computer security to be abstract and concerned more with hypothetical than with likely events. Very few individuals not professionally concerned with security, from top management through the lowest-level employee, have ever been directly involved in or affected by a computer security incident. Such incidents are reported infrequently, often only in specialized media, and they are comprehensible only in broadest outline. Further, most people have difficulty relating to the intricacies of malicious computer actions. Yet it is understood that installing computer security safeguards has negative aspects such as added cost, diminished performance (e.g., slower response times), inconvenience in use, and the awkwardness of monitoring and enforcement, not to mention objections from the work force to any of the above. The Internet worm experience showed that even individuals and organizations that understand the threats may not act to protect against them.

Nontechnical Dimensions of Cybersecurity

(From p. 17): Computer security does not stop or start at the computer. It is not a single feature, like memory size, nor can it be guaranteed by a single feature or even a set of features. It comprises at a minimum computer hardware, software, networks, and other equipment to which the computers are connected, facilities in which the computer is housed, and persons who use or otherwise come into contact with the computer. Serious security exposures may result from any weak technical or human link in the entire complex. For this reason, security is only partly a technical problem: it has significant procedural, administrative, physical facility, and personnel components as well.

(From pp. 50-51): Technical measures alone cannot prevent violations of the trust people place in individuals, violations that have been the source of much of the computer security problem in industry to date . . . . Technical measures may prevent people from doing unauthorized things but cannot prevent them from doing things that their job functions entitle them to do. Thus, to prevent violations of trust rather than just repair the damage that results, one must depend primarily on human awareness of what other human beings in an organization are doing. But even a technically sound system with informed and watchful management and users cannot be free of all possible vulnerabilities. The residual risk must be managed by auditing, backup, and recovery procedures supported by general alertness and creative responses. Moreover, an organization must have administrative procedures in place to bring peculiar actions to the attention of someone who can legitimately inquire into the appropriateness of such actions, and that person must actually make the inquiry. In many organizations, these administrative provisions are far less satisfactory than are the technical provisions for security.

(From p. 10): It is important to balance technical and nontechnical approaches to enhancing system security and trust. Accordingly, the committee is concerned that the development of legislation and case law is being outpaced by the growth of technology and changes in our society. In particular, although law can be used to encourage good practice, it is difficult to match law to the circumstances of computer system use. Nevertheless, attacks on computer and communication systems are coming to be seen as punishable and often criminal acts . . . within countries, and there is a movement toward international coordination of investigation and prosecution. However, there is by no means a consensus about what uses of computers are legitimate and socially acceptable. Free speech questions have been raised in connection with recent criminal investigations into dissemination of certain computer-related information. There are also controversies surrounding the privacy impacts of new and proposed computer systems, including some proposed security safeguards. Disagreement on these fundamental questions exists not only within society at large but also within the community of computer specialists.


REALIZING THE POTENTIAL OF C4I: FUNDAMENTAL CHALLENGES (1999)

CITATION: Computer Science and Telecommunications Board (CSTB), National Research Council. 1999. Realizing the Potential of C4I: Fundamental Challenges. National Academy Press, Washington, D.C.

C4I is a Department of Defense (DOD) acronym for command, control, communications, computers, and intelligence. While many of the information systems described in Realizing the Potential of C4I: Fundamental Challenges are owned or operated by the Department of Defense, essentially all of the implications and lessons for DOD systems are valid for non-DOD government systems, and for systems in the private sector.20 Furthermore, the description of DOD practices in the field should not be taken as an exoneration of practices in non-DOD systems—indeed, it is highly likely that the same observations would apply to most such systems.

On What a Defense Must Do

(From pp. 144-152): Effective information systems security is based on a number of functions described below. This list of functions is not complete; nevertheless, evidence that all these functions are being performed in an effective and coordinated fashion will be evidence that information systems security is being taken seriously and conducted effectively.

20. In 1991, Computers at Risk (at pp. 18-19) cast this point in the following terms:

There has been much debate about the difference between military and commercial needs in the security area. . . . This distinction is both superficial and misleading. National security activities, such as military operations, rely heavily on the integrity of data in such contexts as intelligence reports, targeting information, and command and control systems, as well as in more mundane applications such as payroll systems. Private sector organizations are concerned about protecting the confidentiality of merger and divestiture plans, personnel data, trade secrets, sales and marketing data and plans, and so on. Thus there are many common needs in the defense and civilian worlds.

Commonalities are especially strong when one compares the military to what could be called infrastructural industries—banking, the telephone system, power generation and distribution, airline scheduling and maintenance, and securities and commodities exchanges. Such industries both rely on computers and have strong security programs because of the linkage between security and reliability. Nonsecure systems are also potentially unreliable systems, and unreliability is anathema to infrastructure.


Some of these functions were also noted in the military context by the Defense Science Board, and some by the President’s Commission on Critical Infrastructure Protection in its report. These functions are listed here because they are important, and because the committee believes that they have not yet been addressed by the DOD in an effective fashion (as described in the committee’s findings below).

Function 1. Collect, analyze, and disseminate strategic intelligence about threats to systems.

Any good defense attempts to learn as much as possible about the threats that it may face, both the tools that an adversary may use and the identity and motivations of likely attackers. In the information systems security world, it is difficult to collect information about attackers (though such intelligence information should be sought). It is, however, much easier to collect and analyze information on technical and procedural vulnerabilities, to characterize both the nature of these vulnerabilities and their frequency at different installations. Dissemination of information about these vulnerabilities enables administrators of the information systems that may be affected to take remedial action.

Function 2. Monitor indications and warnings.

All defenses—physical and cyber—rely to some extent on indications and warning of impending attack. The reason is that if it is known that attack is impending, the defense can take actions to reduce its vulnerability and to increase the effectiveness of its response. This function calls for:

  • Monitoring of threat indicators. For example, near-simultaneous penetration attempts on hundreds of military information systems might reasonably be considered an indication of an orchestrated attack. Mobilization of a foreign nation’s key personnel known to have responsibility for information attacks might be another indicator. The notion of an “information condition” or “INFOCON,” analogous to the defense condition (DEFCON) indicator, would be a useful summary device to indicate to commanders the state of the cyber-threat at any given time. This concept is being developed by various DOD elements but is as yet immature.

  • Assessment and characterization of the information attack (if any). Knowledge of the techniques used in an attack on one information system may facilitate a judgment of the seriousness of the attack. For example, an attack that involves techniques that are not widely known may indicate that the perpetrators have a high degree of technical sophistication.

  • Dissemination of information about the target(s) of threat. Knowledge of the techniques used in an attack on one information system may enable administrators responsible for other systems to take preventive actions tailored to that type of attack. This is true even if the first attack is unsuccessful, because security features that may have thwarted the first attack may not necessarily be installed or operational on other systems.

Note that dissemination of information about attacks and their targets is required on two distinct time scales. The first time scale is seconds or minutes after the attack is known; such knowledge enables operators of other systems not (yet) under attack to take immediate preventive action (such as severing some network connections). In this instance, alternative means of secure communication may be necessary to disseminate such information. The second time scale is days after the attack is understood; such knowledge allows operators throughout the entire system of systems to implement fixes and patches that they may not yet have applied, and to request fixes that are needed but not yet developed. . . .

Function 3. Be able to identify intruders.

Electronic intruders into a system are admittedly hard to identify. Attacks are conducted remotely, and a chain of linkages from the attacker’s system to an intermediate node to another and another to the attacked system can easily obscure the identity of the intruder. Nevertheless, certain types of information—if collected—may shed some light on the intruder’s identity. For example, some attackers may preferentially use certain tools or techniques (e.g., the same dictionary to test for passwords), or use certain sites to gain access. Attacks that go on over an extended period of time may provide further opportunities to trace the origin of the attack.
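
As one hedged illustration of this idea, the sketch below groups hypothetical incident records by the reused password dictionary and access site mentioned above, flagging incidents that may share an intruder. The record fields and values are invented for the example.

```python
# A minimal sketch (hypothetical data and field names) of correlating
# incidents by reused tools and entry sites to suggest a common intruder.

from collections import defaultdict

incidents = [
    {"id": 1, "password_dictionary": "wordlist-A", "entry_site": "relay.example.net"},
    {"id": 2, "password_dictionary": "wordlist-A", "entry_site": "relay.example.net"},
    {"id": 3, "password_dictionary": "wordlist-B", "entry_site": "other.example.org"},
]

by_signature = defaultdict(list)
for incident in incidents:
    signature = (incident["password_dictionary"], incident["entry_site"])
    by_signature[signature].append(incident["id"])

for signature, ids in by_signature.items():
    if len(ids) > 1:
        print(f"incidents {ids} share signature {signature}: possibly one intruder")
```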

Function 4. Test for security weaknesses in fielded and operational systems.

An essential part of a security program is searching for technical and operational or procedural vulnerabilities. Ongoing tests (conducted by groups often known as “red teams” or “tiger teams”) are essential for several reasons:

  • Recognized vulnerabilities are not always corrected, and known fixes are frequently found not to have been applied as a result of poor configuration management.

  • Security features are often turned off in an effort to improve operational efficiency. Such actions may succeed in that aim, but at the potentially high cost of compromising security, sometimes with the primary damage occurring in some distant part of the system.

  • Some security measures rely on procedural measures and thus depend on proper training and ongoing vigilance on the part of commanders and system managers.

  • Security flaws that are not apparent to the defender undergoing an inspection may be uncovered by a committed attacker (as they would be uncovered in an actual attack).

Thus, it is essential to use available tools and conduct red team or tiger team probes often and without warning to test security defenses. In order to maximize the impact of these tests, reports should be disseminated widely. Release of such information may risk embarrassment of certain parties or possible release of information that can be used by adversaries to attack, but especially in the case of vulnerabilities uncovered for which fixes are available, the benefits of releasing such information—allowing others to learn from it and motivating fixes to be installed—outweigh these costs.

Tiger team attacks launched without the knowledge of the attacked systems also allow estimates to be made of the frequency of attacks. Specifically, the fraction of tiger team attacks that are detected is a reasonable estimate of the fraction of adversary attacks that are detected. Thus, the total frequency of adversary attacks can be estimated from the number of adversary attacks that are actually detected.
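
A worked version of that estimate, with illustrative counts only:

```python
# Illustrative arithmetic for the estimate above: the tiger team's detection
# rate is known exactly (their attack count is known), and it calibrates the
# count of detected adversary attacks into an estimate of the total.

tiger_team_attacks = 40
tiger_team_detected = 10
detection_rate = tiger_team_detected / tiger_team_attacks   # 0.25

adversary_attacks_detected = 5
estimated_total_adversary_attacks = adversary_attacks_detected / detection_rate

print(f"estimated detection rate: {detection_rate:.0%}")                        # 25%
print(f"estimated adversary attacks: {estimated_total_adversary_attacks:.0f}")  # 20
```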

Function 5. Plan a range of responses.

Any organization relying on information systems should have a number of routine information systems security activities (e.g., security features that are turned on, security procedures that are followed). But when attack is imminent (or in process), an organization could prudently adopt additional security measures that during times of non-attack might not be in effect because of their negative impact on operations. Tailoring in advance a range of information systems security actions to be taken under different threat conditions would help an organization plan its response to any given attack.

For example, a common response under attack is to drop non-essential functions from a system connected to the network so as to reduce the number of routes for penetration. A determination in advance of what functions count as non-essential and under what circumstances such a determination is valid would help facilitate an orderly transition to different threat conditions, and would be much better than an approach that calls for dropping all functionality and restoring only those functions that people using the system at the time complain about losing. Note that such determinations can be made only from an operational perspective rather than a technical one, a fact that points to the essential need for an operational architecture in the design of C4I systems.

The principle underlying response planning should be that of “graceful degradation”; that is, the system or network should lose functionality gradually, as a function of the severity of the attack compared to its ability to defend against it. This principle stands in contrast to a different principle that might call for the maintenance of all functionality until the attack simply overwhelms the defense and the system or network collapses. The latter principle is tempting because reductions in functionality necessitated for security reasons may interfere with operational ease of use, but its adoption risks catastrophic failure.
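
A minimal sketch of such pre-planned degradation follows; the threat conditions and function names are hypothetical. Each condition maps to a progressively smaller, pre-agreed set of functions, so functionality is shed gradually rather than all at once.

```python
# Sketch (hypothetical condition and service names) of pre-planned graceful
# degradation: each threat condition keeps a progressively smaller, pre-agreed
# set of functions, instead of dropping everything and restoring on complaint.

ESSENTIAL_BY_CONDITION = {
    "normal":   {"messaging", "logistics", "web_browsing", "file_sharing"},
    "elevated": {"messaging", "logistics", "file_sharing"},
    "attack":   {"messaging", "logistics"},
    "severe":   {"messaging"},
}

def functions_to_keep(threat_condition: str) -> set:
    # Determined in advance from an operational, not purely technical, view.
    return ESSENTIAL_BY_CONDITION[threat_condition]

for condition in ("normal", "elevated", "attack", "severe"):
    print(condition, "->", sorted(functions_to_keep(condition)))
```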

It is particularly important to note that designing a system for graceful degradation depends on system architects who take into account the needs of security (and more generally, the needs of coping with possible component failures) from the start. For example, the principle of graceful degradation would forbid a system whose continued operation depended entirely on a single component remaining functional, or on the absence of a security threat.

This principle is often violated in the development of prototypes. It is often said that “it is necessary for one to crawl before one can run,” i.e., that it is acceptable to ignore security or reliability considerations when one is attempting to demonstrate the feasibility of a particular concept. This argument is superficially plausible, but in practice it does not hold water. It is reasonable for a prototype to focus only on concept feasibility, ignoring considerations of reliability or security, only if the prototype will be thrown away and a new architecture is designed and developed from scratch to implement the concept. Budget and schedule constraints usually prevent such new beginnings, and so in practice the prototype’s architecture is never abandoned, and security or reliability considerations must be addressed in the face of an architecture that was never designed or intended to support them.

Function 6. Coordinate defensive activities throughout the enterprise.

Any large, distributed organization has many information systems and subnetworks that must be defended. The activities taken to defend each of these systems and networks must be coordinated because the distributed parts have interconnections and the security of the whole organization depends on the weakest link. Furthermore, it is important for different parts of organizations to be able to learn from each other about vulnerabilities, threats, and effective countermeasures.


Function 7. Ensure the adequacy, availability, and functioning of public infrastructure used in systems (a step that will require cooperation with commercial providers and civilian authorities).

Few networks are built entirely from systems controlled by the organization that relies on them. Therefore organizations (including DOD) must work cooperatively with the owners of the infrastructure they rely on, and with the relevant authorities, to protect that infrastructure.

Function 8. Include security requirements in any specification of system or network requirements that is used in the acquisition process.

Providing information systems security for a network or system that has not had security features built into it is enormously problematic. Retrofits of security features into systems not designed for security invariably leave security holes, and procedural fixes for inherent technical vulnerabilities only go so far.

Perhaps more importantly, security requirements must be given prominence from the beginning in any system conceptualization. The reason is that security considerations may affect the design of a system in quite fundamental ways, and a designer who decides on a design that works against security should at least be cognizant of the implications of such a choice. This function thus calls for information systems security expertise to be integrally represented on design teams, rather than added later.

Note that specification of the “Orange Book” security criteria would be an insufficient response to this function. “Orange Book” criteria typically drive up development times significantly, and more importantly, are not inherently part of an initial requirements process and do not address the security of networked or distributed systems.

Function 9. Monitor, assess, and understand offensive and defensive information technologies.

Good information systems security requires an understanding of the types of threats and defenses that might be relevant. Thus, those responsible for information systems security need a vigorous ongoing program to monitor, assess, and understand offensive and defensive information technologies. Such a program would address the technical details of these technologies, their capability to threaten or protect friendly systems, and their availability.


Function 10. Advance the state of the art in defensive information technology (and processes) with research.

Although much can usually be done to improve information systems security simply through the use of known and available technologies, “bug fixes,” and procedures, better tools to support the information systems security mission are always needed. In general, such improvements fall into two classes (which may overlap). One class consists of improvements that let tools deal more effectively with a broader threat spectrum. A second class, equally important, consists of tools that provide better automation and thus can solve problems at lower cost (costs that include direct outlays for personnel and equipment as well as the operational burdens imposed by providing security).

Similar considerations apply to processes for security as well. It is reasonable to conduct organizational research into better processes and organizations that provide more effective support against information attacks and/or reduce the impediments to using or implementing good security practices.

Function 11. Promote information systems security awareness.

Just as it is dangerous to rely on a defensive system or network architecture that is hard on the outside and soft on the inside, it is also dangerous if any member of an organization fails to take information systems security seriously. Because the carelessness of a single individual can seriously compromise the security of an entire organization, education and training for information systems security must be required for all members of the organization. Moreover, such education and training must be systematic, regarded as important by the organization (and demonstrated with proper support for such education and training), and undertaken on a regular basis (both to remind people of its importance and to update their knowledge in light of new developments in the area).

Function 12. Set security standards (both technical and procedural).

Security standards should articulate in well-defined and actionable terms what an organization expects to do in the area of security. They are therefore prescriptive. For example, a technical standard might be “all passwords must be eight or more characters long, contain both letters and numbers, be pronounceable, and not be contained in any dictionary,” or “all electronic communications containing classified information must be encrypted with a certain algorithm and key length.” A standard involving both technical and procedural measures might specify how to revoke cryptographic keys known to have been compromised. Furthermore, security standards should be expected to apply to all those within the organization. (For example, generals should not be allowed to exercise poorer information systems security discipline than do captains, as they might be tempted to do in order to make their use of C4I systems easier.)
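
The first sample standard quoted above is mechanically checkable. The sketch below is one possible checker rather than anything prescribed by the report: the word list is a tiny stand-in for a real dictionary, and the “pronounceable” clause is left unchecked.

```python
# A sketch of enforcing the sample password standard quoted above: at least
# eight characters, both letters and digits, and not in any dictionary. The
# word list here is a tiny stand-in, and pronounceability is not checked.

FORBIDDEN_WORDS = {"password", "letmein", "qwerty123"}  # stand-in dictionary

def meets_standard(password: str) -> bool:
    long_enough = len(password) >= 8
    has_letter = any(c.isalpha() for c in password)
    has_digit = any(c.isdigit() for c in password)
    not_in_dictionary = password.lower() not in FORBIDDEN_WORDS
    return long_enough and has_letter and has_digit and not_in_dictionary

print(meets_standard("qwerty123"))   # False: found in the dictionary
print(meets_standard("tg7velam2"))   # True under this sketch
```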

Function 13. Develop and use criteria for assessing security status.

Information security is not a one-shot problem, but a continuing one. Threats, technology, and organizations are constantly changing in a spiral of measures and countermeasures. Organizations must have ways of measuring and evaluating whether they have effective defensive measures in place. Thus, once standards are put in place, the organization must periodically assess the extent to which members of the organization comply with those standards, and characterize the nature of the compliance that does exist.

Metrics for security could include number of attacks of different types, fraction of attacks detected, fraction of attacks repelled, damage incurred, and time needed to detect and respond to attacks. Note that making measurements on such parameters depends on understanding the attacks that do occur—because many attacks are not detected today, continual penetration testing is required to establish such a baseline.
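
Given an incident log, these metrics reduce to simple arithmetic, as in the sketch below with hypothetical records. As the text cautions, undetected attacks never appear in real logs, so such figures must be calibrated by penetration testing.

```python
# Sketch of computing the metrics named above from a hypothetical incident log.

attacks = [
    {"detected": True,  "repelled": True,  "hours_to_detect": 2},
    {"detected": True,  "repelled": False, "hours_to_detect": 30},
    {"detected": False, "repelled": False, "hours_to_detect": None},
]

total = len(attacks)
detected = [a for a in attacks if a["detected"]]
repelled = [a for a in attacks if a["repelled"]]

print(f"fraction detected: {len(detected) / total:.2f}")
print(f"fraction repelled: {len(repelled) / total:.2f}")
print(f"mean hours to detect (of detected): "
      f"{sum(a['hours_to_detect'] for a in detected) / len(detected):.1f}")
```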

On Practice in the Field

(From pp. 156-157): . . .The security in today’s fielded military systems is weak, and weaker than it need be, as illustrated by the following examples of behavior and practices that the committee observed or heard:

  • Individual nodes are running commercial software with many known security problems. Operators use little in the way of tools for finding these problems, to say nothing of fixing them.

  • Computers attached to sensitive command and control systems are also used by personnel to surf Web sites worldwide, raising the possibility that rogue applets and the like could be introduced into the system.

  • Units are being blinded by denial-of-service attacks, made possible because individual nodes were running commercial software with many known security problems.

  • IP addresses and other important data about C2 [command and control] systems can be found on POST-IT notes attached to computers in unsecured areas, making denial of service and other attacks much easier.

  • Some of the networks used by DOD to carry classified information are protected by a perimeter defense. As a result, they exhibit all of the vulnerabilities that characterize networks protected by perimeter defenses.

(From p. 158): Many field commanders told the committee that “cyberspace is part of the battlespace,” and several organizations within the DOD assert that they are training “C2/cyber warriors.” But good intentions have not been matched by serious attention to cyberspace protection. Soldiers in the field do not take the protection of their C4I systems nearly as seriously as they do other aspects of defense. For example, information attack red teams were a part of some exercises observed by the committee, but their efforts were usually highly constrained for fear that unconstrained efforts would bring the exercise to a complete halt. While all red teams operate under certain rules of engagement established by the “white team” that oversees each exercise, the information attack red teams appeared to the committee to be much more constrained than appropriate. In one exercise, personnel in an operations center laughed and mistakenly took as a joke a graphic demonstration by the red team that their operations center systems had been penetrated.


TRUST IN CYBERSPACE (1999)

CITATION: Computer Science and Telecommunications Board (CSTB), National Research Council. 1999. Trust in Cyberspace. National Academy Press, Washington, D.C.

Cybersecurity and Other Trustworthiness Qualities Interact

(From p. 14): The trustworthiness of [a networked information system] encompasses correctness, reliability, security (conventionally including secrecy, confidentiality, integrity, and availability), privacy, safety, and survivability . . . . These dimensions are not independent, and care must be taken so that one is not obtained at the expense of another. For example, protection of confidentiality or integrity by denying all access trades one aspect of security—availability—for others. As another example, replication of components enhances reliability but may increase exposure to attack owing to the larger number of sites and the vulnerabilities implicit in the protocols to coordinate them. Integrating the diverse dimensions of trustworthiness and understanding how they interact are central challenges in building a trustworthy [networked information system].
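
The replication example lends itself to a toy quantification, with illustrative figures: adding replicas raises availability, because service survives unless every replica fails, while simultaneously raising exposure, because a secret is lost if any single replica is compromised.

```python
# Toy numbers for the replication trade-off: reliability improves with the
# number of replicas while exposure to compromise worsens. Both probabilities
# are illustrative, and replicas are assumed to fail independently.

def availability(n_replicas: int, p_site_fails: float) -> float:
    # Service survives unless all replicas fail at once.
    return 1.0 - p_site_fails ** n_replicas

def exposure(n_replicas: int, p_site_compromised: float) -> float:
    # Confidentiality is lost if any one replica is compromised.
    return 1.0 - (1.0 - p_site_compromised) ** n_replicas

for n in (1, 3, 5):
    print(f"{n} replicas: availability={availability(n, 0.10):.5f}, "
          f"exposure={exposure(n, 0.02):.5f}")
```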

On Managing Risk

(From pp. 175-176): A discussion about consequences must also address the questions of who is affected by the consequences and to what extent. While catastrophic failure garners the most popular attention, there are many dimensions to trustworthiness and consequences may involve various subsets of them with varying degrees of severity. . . . Understanding consequences is essential to forming baseline expectations of private action and what incentives may be effective for changing private action, but that understanding is often hampered by the difficulty of quantifying or otherwise specifying the costs and consequences associated with risks.

(From p. 175): It is the nature of [a networked information system] that outages and disruptions of service in local areas may have very uneven consequences, even within the area of disruption. Failure of a single Internet service provider (ISP) may or may not affect transfer of information outside the area of disruption, depending on how the ISP has configured its communications. For example, caching practices intended to reduce network congestion problems helped to limit the scope of a Domain Name Service (DNS) outage. Corporations that manage their own interconnection (so-called intranets) may be wholly unaffected. Even widespread or catastrophic failures may not harm some users, if they have intentionally or unconsciously provided redundant storage or backup facilities. The inability to accurately predict consequences seriously complicates the process of calculating risk and makes it tempting to assume “best case” behavior in response to failure.

(From pp. 177-178): . . . [T]he costs associated with avoiding all risks are prohibitive. Thus, risk mitigation is more typical and is generally encountered when many factors, including security and reliability, determine the success of a system. Risk mitigation is especially popular in market-driven environments where an attempt is made to provide “good enough” security or reliability or other qualities without severely affecting economic factors such as price and time to market. Risk mitigation should be interpreted not as a license to do a shoddy job in implementing trustworthiness, but instead as a pragmatic recognition that trade-offs between the dimensions of trustworthiness, economic realities, and other constraints will be the norm, not the exception. The risk mitigation strategies that are most relevant to trustworthiness can generally be characterized according to two similar models:

  • The insurance model. In this model, the cost of countermeasures is viewed as an “insurance premium” paid to prevent (or at least mitigate) loss. The value of the information being protected, or the service being provided, is assessed and mechanisms and assurance steps are incorporated up to, but not exceeding, that value.

  • The work factor model. A definition in cryptology for the term “work factor” is the amount of computation required to break a cipher through a brute-force search of all possible key values. Recently, the term has been broadened to mean the amount of effort required to locate and exploit a residual vulnerability. That effort may involve more efficient procedures rather than exhaustive searches. In the case of fault tolerance, the assumptions made about the types of failures (benign or arbitrary) that could arise are analogous to the concept of work factor.
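
As a rough illustration of the insurance model, with hypothetical figures, countermeasure spending is capped by the expected loss it prevents; the sketch also previews the valuation mismatch discussed in the next paragraph.

```python
# Insurance-model budgeting (hypothetical figures): the "premium" spent on
# countermeasures should not exceed the expected loss being prevented.

def justified_spend(asset_value: float, p_loss: float) -> float:
    # Expected loss, used as the ceiling on countermeasure spending.
    return asset_value * p_loss

owner_value = 100_000.0        # value of the data or service to its owner
print(justified_spend(owner_value, p_loss=0.02))    # 2000.0

# The pitfall: if the same target is worth far more to an attacker, say as a
# stepping stone to a higher-value system, defenses sized to the owner's
# valuation can be rationally overwhelmed by a "high value" attack.
attacker_value = 1_000_000.0
print(attacker_value > owner_value)                 # True: premium sized too low
```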

The two models are subject to pitfalls distinctive to each and some that are common to both. In the insurance model, it is possible that the value of information (or disruption of service) to an outsider is substantially greater than the value of that information or service to its owners. Thus, a “high value” attack could be mounted, succeed, and the “insurance premium” lost along with the target data or service. Such circumstances often arise in an interconnected or networked world. For example, a local telephone switch might be protected against deliberate interruption of service to the degree that is justified by the revenue that might be lost from such an interruption. But such an analysis ignores the attacker whose aim is to prevent a physical alarm system from notifying the police that an intrusion has been detected into an area containing valuable items. Another example is an instance in which a hacker expends great effort to take over an innocuous machine, not because it contains interesting data but because it provides computing resources and network connectivity that can be used to mount attacks on higher-value targets. In the case of the work factor model, it is notoriously difficult to assess the capabilities of a potential adversary in a field as unstructured as that of discovering vulnerabilities, which involves seeing aspects of a system that were overlooked by its designers.

Vulnerabilities in the Public Telephone Network and the Internet

(From p. 27): The vulnerabilities of the PTN [public telephone network] and Internet are exacerbated by the dependence of each network on the other. Much of the Internet uses leased telephone lines as its physical transport medium. Conversely, telephone companies rely on networked computers to manage their own facilities, increasingly employing Internet technology, although not necessarily the Internet itself. Thus, vulnerabilities in the PTN can affect the Internet, and vulnerabilities in Internet technology can affect the telephone network.

(From p. 58): . . . [W]hile in one sense the Internet poses no new challenges—a system that can be accessed from outside only through a cryptographically protected channel on the Internet is at least as secure as the same system reached through a conventional leased line—new dangers arise precisely because of pervasive interconnectivity. The capability to interconnect networks gives the Internet much of its power; by the same token, it opens up serious new risks. An attacker who may be deflected by cryptographic protection of the front door can often attack a less protected administrative system and use its connectivity through internal networks to bypass the encryption unit protecting the real target. This often makes a mockery of firewall-based protection.

(From p. 50): The general accessibility of the Internet makes it a highly visible target that lies within easy reach of attackers. The widespread availability of documentation and actual implementations for Internet protocols means that devising attacks on this system can be viewed as an intellectual puzzle (where launching the attacks validates the puzzle’s solution).


On Building Secure Systems and Networks

(From p. 2): Laudable as a goal, ab initio building of trustworthiness into an NIS [networked information system] has proved to be impractical. It is neither technically nor economically feasible for designers and builders to manage the complexity of such large artifacts or to anticipate all of the problems that an NIS will confront over its lifetime. Experts now recognize steps that can be taken to enhance trustworthiness after a system has been deployed. It is no accident that the market for virus detectors and firewalls is thriving. Virus detectors identify and eradicate attacks embedded in exchanged files, and firewalls hinder attacks by filtering messages between a trusted enclave of networked computers and its environment (from which attacks might originate). Both of these mechanisms work in specific contexts and address problems contemplated by their designers; but both are imperfect, with user expectations often exceeding what is prudent.

(From pp. 13-14): Networked information systems (NISs) integrate computing systems, communications systems, and people (both as users and operators). The defining elements are interfaces to other systems along with algorithms to coordinate those systems. Economics dictates the use of commercial off-the-shelf (COTS) components wherever possible, which means that developers of an NIS have neither control over nor detailed information about many system components. The use of system components whose functionality can be changed remotely and while the system is running is increasing. Users and designers of an NIS built from such extensible system components thus cannot know with any certainty what software has entered system components or what actions those components might take.

(From p. 3): Today’s climate of deregulation will further increase [networked information system] vulnerability in several ways. The most obvious is the new cost pressures on what had been regulated monopolies in the electric power and telecommunications industries. One easy way to cut costs is to reduce reserve capacity and eliminate rarely needed emergency systems; a related way is to reduce diversity (a potential contributor to trustworthiness) in the technology or facilities used. Producers in these sectors are now competing on the basis of features, too. New features invariably lead to more complex systems, which are liable to behave in unexpected and undesirable ways. Finally, deregulation leads to new interconnections, as some services are more cost-effectively imported from other providers into what once were monolithic systems. Apart from the obvious dangers of the increased complexity, the interconnections themselves create new weak points and interdependencies. Problems could grow beyond the annoyance level that characterizes infrastructure outages today, and the possibility of catastrophic incidents is growing.

(From p. 15): To be labeled as trustworthy, a system not only must behave as expected but also must reinforce the belief that it will continue to produce expected behavior and will not be susceptible to subversion. The question of how to achieve assurance has been the target of several research programs sponsored by the Department of Defense and others. Yet currently practiced and proposed approaches for establishing assurance are still imperfect and/or impractical. Testing can demonstrate only that a flaw exists, not that all flaws have been found; deductive and analytical methods are practical only for certain small systems or specific properties. Moreover, all existing assurance methods are predicated on an unrealistic assumption—that system designers and implementors know what it means for a system to be “correct” before and during development. The study committee believes that progress in assurance for the foreseeable future will most likely come from figuring out (1) how to combine multiple approaches and (2) how best to leverage add-on technologies and other approaches to enhance existing imperfect systems. Improved assurance, without any pretense of establishing a certain or a quantifiable level of assurance, should be the aim.

(From p. 247): Security research during the past few decades has been based on formal policy models that focus on protecting information from unauthorized access by specifying which users should have access to data or other system objects. It is time to challenge this paradigm of “absolute security” and move toward a model built on three axioms of insecurity: insecurity exists; insecurity cannot be destroyed; and insecurity can be moved around.

(From p. 250): Improved trustworthiness may be achieved by the careful organization of untrustworthy components. There are a number of promising ideas, but few have been vigorously pursued. “Trustworthiness from untrustworthy components” is a research area that deserves greater attention.

On the Impact of System Homogeneity (“Monoculture”)

(From pp. 191-192): The similarity intrinsic in the component systems of a homogeneous collection implies that these component systems share vulnerabilities. A successful attack on one system is then likely to succeed on other systems as well—the antithesis of what is desired for implementing trustworthiness. Moreover, today’s dominant computing and communications environments are based on hardware and software that were not designed with security in mind; consequently, these systems are not difficult to compromise, as discussed in previous chapters.

There is, therefore, some tension between homogeneity and trustworthiness. Powerful forces make technological homogeneity compelling . . ., but some attributes of trustworthiness benefit from diversity. . . . On the other hand, a widely used trustworthy operating system might be superior to a variety of nontrustworthy operating systems; diversity, per se, is not equivalent to increased trustworthiness.
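
A toy calculation, with illustrative numbers, makes the shared-vulnerability point concrete: under a monoculture one working exploit reaches the entire fleet, whereas under diversity the attacker needs a working exploit for every platform before the whole fleet can fall.

```python
# Illustrative odds that an attacker can compromise an entire fleet: with a
# single platform one exploit suffices; with several, one is needed per
# platform. The per-platform probability is invented for the example.

P_EXPLOIT_PER_PLATFORM = 0.3   # chance the attacker holds a working exploit

def p_whole_fleet_falls(n_platforms: int) -> float:
    return P_EXPLOIT_PER_PLATFORM ** n_platforms

print("monoculture :", p_whole_fleet_falls(1))   # 0.3
print("4 platforms :", p_whole_fleet_falls(4))   # 0.0081
```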

Technological convergence may also be realized through the market dominance of a few suppliers of key components, with monopoly as the limit case when technological homogeneity is dictated by the monopolist. However, the number of suppliers could grow as a result of the diffusion of computing into embedded, ubiquitous environments; the diversification and interoperability of communications services; and the continued integration of computing and communications into organizations within various market niches.
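
The shared-vulnerability argument lends itself to a toy model, sketched here as our own illustration rather than anything from the report: if every host runs the one flawed platform, a single exploit compromises them all, while spreading hosts across several platforms caps the damage at the vulnerable fraction.

```python
# Toy model (our illustration): how homogeneity amplifies one exploit.
import random

def compromised_fraction(population: int, n_platforms: int, seed: int = 0) -> float:
    """Fraction of hosts lost when one platform, chosen at random,
    turns out to have an exploitable flaw."""
    rng = random.Random(seed)
    platforms = [i % n_platforms for i in range(population)]  # even spread
    flawed = rng.randrange(n_platforms)
    return sum(1 for p in platforms if p == flawed) / population

print(compromised_fraction(10_000, 1))  # monoculture: 1.0, everything falls
print(compromised_fraction(10_000, 5))  # five platforms: 0.2
```

As the excerpt cautions, the model overstates the case for diversity: five weak platforms may still fare worse overall than one strong, widely used trustworthy one.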
