6

The Economic and Public Policy Context

Factors that cause networked information systems (NISs) to be less trustworthy than they might be (environmental disruption, human user and operator errors, attacks by hostile parties, and design and implementation errors) are examined in this report. In a number of instances, research and development efforts have yielded state-of-the-art technological solutions that could be deployed to enhance NIS trustworthiness. Why are such technological solutions not used more widely in practice? Some experts posit that the benefits from increased trustworthiness are difficult to estimate or trade off, and consumers therefore direct their expenditures toward other investments that they perceive will have more definitive returns. Similarly, producers tend to be reluctant to invest in products, features, and services that further trustworthiness when their resources can be directed (e.g., toward increasing functionality) where the likelihood of profit appears greater. Thus, there seems to be a market failure for trustworthiness. Other factors, such as aspects of public policy, also tend to inhibit the use of existing solutions.

As this report makes clear, while the deployment of extant technologies can improve the trustworthiness of NISs, in many critical areas answers are not known. Research is needed. Most of the research activity related to trustworthiness involves federal government funding. (Although the private sector conducts "research," most of this effort is development that is directed toward specific products.) Inasmuch as the federal government is the major funder of basic and applied research in computing and communications, this chapter examines its interests and research emphases related to trustworthiness.

Certain aspects of trustworthiness (e.g., security) are historically critical areas for federal agencies responsible for national security interests. The National Security Agency (NSA) and the Defense Advanced Research Projects Agency (DARPA), both part of the Department of Defense (DOD), have particularly influential roles in shaping research priorities and funding for trustworthiness.

In this chapter, there is a greater emphasis on security than on other dimensions of trustworthiness, because the federal government has placed tremendous emphasis on computer and communications security, consistent with the importance of this technology in supporting national security activities. As the broader concept of trustworthiness becomes increasingly important, especially in light of the recent concern for protection of critical infrastructures, increased attention to the nonsecurity dimensions of trustworthiness by the federal government may be warranted. This is not to say that attention to security is or will become unimportant; indeed, security vulnerabilities are expected to increase in both number and severity in the future. Additionally, the success of security in the marketplace is mixed at best, so a discussion of the reasons for this situation merits some attention here.

This chapter begins with a discussion of risk management, which provides the analytical framework to assess rationales for people's investment in trustworthiness or their failure to do so. The risk management discussion leads to an analysis of the costs that consumers encounter in their decisions regarding trustworthiness. These first two sections articulate reasons that there is a disincentive for consumers to invest in trustworthiness. Producers also face disincentives (but different ones) to invest in trustworthiness, as discussed in the third section. Then there is a discussion of standards and criteria and the possible roles that they may play to address the market-failure problem. The important role of cryptography is explicated in Chapters 2 and 4; here, the focus is on the question of why cryptography is not more widely used. The federal government's many interests in trustworthiness include facilitating the use of technology to improve trustworthiness today and fostering research to support advances in trustworthiness. This chapter concludes with a discussion of the federal agencies involved with conducting and/or sponsoring research in trustworthiness. Two agencies with central roles in this arena, the NSA and DARPA, are examined in some detail.

RISK MANAGEMENT

The motivation to invest in trustworthiness is to manage risks. While it is conceivable to envision positive benefits deriving from trustworthiness,1 the primary rationale for investment in trustworthiness is to help ensure that an NIS does what people expect it to do and not something else.2

The study of risk management involves the assessment of risk and its consequences, a framework for analyzing alternatives to prevent or mitigate risks, and a basis for making decisions and implementing strategies. Although there are a number of analytical tools available to assist in risk management, each step in the process is subject to uncertainty and judgment.

Risk Assessment

Risk assessment differs depending on whether the emphasis is on security or on safety and reliability. Threat, for example, is a concept most commonly associated with security. Threat assessment is both speculative and subjective, as it necessitates an evaluation of attacker intent.3 Speculation is also associated with vulnerability assessment, because the existence of a vulnerability can be shown by experiment, but the absence of vulnerabilities cannot be shown by experiment or any other definitive means. There always exists the possibility that some aspect of the system can be exploited in some unexpected way. Whereas security-critical information systems have to defend against such malicious attacks, safety-critical systems typically do not.

In the security arena, risk is the combination of two probabilities: first, the probability that a threat exists that will attempt to locate and exploit a vulnerability; and second, the probability that the attempt will succeed. Security risk assessment thus compounds two uncertainties, one human and one technical. The human uncertainty centers on the question, Would anybody attack? The technical uncertainty centers on the question, If they did, would they locate and exploit a residual vulnerability? A vulnerability, once discovered, may be exploited again and again. In the Internet era, a vulnerability may even be publicized to the world in the convenient form of an "attack script" that enables the vulnerability to be easily exploited, even by those who are unable to understand it.4 Such behavior means that the probabilities are nonindependent in a statistical sense.

1 A hypothetical example could entail the use of trustworthiness as a marketing advantage, akin to the Federal Express creed of "when it absolutely, positively has to be there."

2 There is also the notion that some forms of business activities require or are facilitated by a particular level of trustworthiness (e.g., security as an enabler). In the electronic commerce area, as an example, the availability of secure socket layer (SSL) encryption for Web traffic has caused consumers to feel more comfortable about sending credit card numbers across the Internet, even though the real risk of credit card theft is on the merchants' servers, and that is not addressed by SSL.

3 The example of residential burglary may help to clarify this point. One may suspect through a series of observations that one's neighborhood has been targeted by burglars: strange cars driving slowly by, noises in the night, phone callers who hang up immediately when the telephone is answered, and so on. One is only sure that burglars are operating when a burglary happens, too late for any practical preventive steps to be taken.
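The compound character of security risk lends itself to a short worked illustration. The following sketch is not from the report; the probability values and the scenario are invented solely to show the arithmetic of combining the human and technical uncertainties, and how publication of an attack script undermines the independence of the two estimates.

```python
# Minimal sketch of the two-probability view of security risk described
# above. All probability values are hypothetical.

def security_risk(p_threat: float, p_success: float) -> float:
    """Probability that someone attacks AND that the attempt succeeds."""
    return p_threat * p_success

# Before a vulnerability is publicized: few capable attackers.
print(security_risk(p_threat=0.05, p_success=0.10))  # ~0.005

# After an "attack script" circulates, the two estimates are no longer
# independent: publication raises both the chance that someone attacks
# and the chance that any attempt succeeds.
print(security_risk(p_threat=0.60, p_success=0.90))  # ~0.54
```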

By contrast, risk assessment in the context of safety or reliability is significantly different. Risk in safety or reliability analysis is a function of the probability that a hazard arises and the consequences (e.g., cost) of the hazard. The most common function is the product of the two numbers, yielding an expected value. Informally, risk can be thought of as the expected damage done per unit of time that results from the operation of a system. Because the probability of failure per unit of time is nonzero, the risk is nonzero, and damage must be expected. If the estimated risk5 is unacceptably high, then either design or implementation changes must be made to reduce it, or consideration has to be given to withholding deployment. But if a safety incident should occur (e.g., an accident), the probability of a second accident remains unchanged, or may even decrease as a consequence.6

A major challenge for risk management with regard to trustworthiness is the growing difficulty of differentiating attacks from incompetence, failure, or lack of reliability. It is one of several factors that raise the question of whether comprehensive probability estimation or hazard analysis is possible.

4 A simple example is a one-line command that may allow an individual to steal passwords. Access the URL, substituting xxx.xxx.xxx with the target site of interest. For some Web sites, the encrypted passwords will be returned to you. If this one-line command works, it is because there is a flawed version of PHF in the /cgi-bin directory. PHF allows users to gain remote access to files (including the /etc/passwd file) over the Web. One can then run a password-cracking program on the encrypted passwords obtained.

5 Risk estimation is a systems engineering issue, and it involves careful, extensive, and thorough analysis of all aspects of a safety-critical system by systems engineers, safety engineers, domain experts, and others. An important initial activity in the process is hazard analysis, an attempt to determine the hazards that would be manifested if the system were to fail. A hazard is a condition with the potential for causing an undesired consequence. A hazard of operating a nuclear plant, for example, would be the release of radiation into the environment. A hazard of using a medical device might be patient injury. Various guidelines, procedures, and standards for carrying out hazard analyses have been developed. The central issue with hazard analysis is completeness: it is very important that all hazards be identified if at all possible.

6 For example, because of greater operator diligence.
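The expected-value form of safety and reliability risk described above can be made concrete with a small worked example. The hazards and figures below are hypothetical, chosen only to illustrate the arithmetic of combining per-hazard probabilities and consequences; they are not drawn from the report.

```python
# Expected-value risk for a safety-critical system: the probability that
# each hazard arises per unit of time, multiplied by its consequence
# (cost), summed over the hazards identified by hazard analysis.
# The hazards and numbers are invented for illustration.

hazards = [
    # (hazard, probability per year, cost in dollars if it occurs)
    ("coolant pump failure",   1e-3,  50_000_000),
    ("control software fault", 1e-4, 200_000_000),
]

annual_risk = sum(p * cost for _, p, cost in hazards)
print(f"Expected damage per year: ${annual_risk:,.0f}")  # $70,000

# If this figure is unacceptably high, the design is changed (e.g., a
# backup pump is added, lowering the failure probability) or deployment
# is withheld. Note that, unlike a security exploit, one accident does
# not raise the probability of the next.
```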

Nature of Consequences

Attitudes and behavior depend on the nature of consequences. Safety-critical information systems often control physical systems, where the consequences of failure include the possibility that lives will be threatened and/or valuable equipment may be damaged (e.g., an air traffic control system). The consequences of failure of non-safety-related systems include the possibility that data will be corrupted or stolen, or that essential services will be unavailable. While the latter are serious outcomes, these consequences are not perceived to be as serious as those associated with safety-critical systems. Financial consequences, especially within the private sector, have also attracted considerable attention because these consequences can be reasonably quantified and the implications for the financial bottom line are readily understood.7

Consequences are not static. Consequences that are currently tolerable may become intolerable in the future. For example, as the speed of communications channels continues to increase and applications are designed to rely on this speed, the availability8 of a connection may not be sufficient for those applications that depend on high bandwidth and low delay. Moreover, as applications become more dependent on quality-of-service guarantees from networks, a degradation in service may disrupt future applications more than current ones.

It is the nature of an NIS that outages and disruptions of service in local areas may have very uneven consequences, even within the area of disruption. Failure of a single Internet service provider (ISP) may or may not affect transfer of information outside the area of disruption, depending on how the ISP has configured its communications. For example, caching practices intended to reduce network congestion problems helped to limit the scope of a Domain Name Service (DNS) outage.9 Corporations that manage their own interconnection (so-called intranets) may be wholly unaffected. Even widespread or catastrophic failures may not harm some users, if they have intentionally or unconsciously provided redundant storage or backup facilities. The inability to accurately predict consequences seriously complicates the process of calculating risk and makes it tempting to assume "best case" behavior in response to failure.

A discussion about consequences must also address the questions of who is affected by the consequences and to what extent.

7 In contrast to privacy, for example.

8 Increased dependence on connections promotes attention not only to the number of outages but also to the length of outages. For example, a one-second outage in a voice connection may require redialing to reestablish the connection; in a client/server application over a wide-area network, it could require rebooting computers, restarting applications, and considerable other delays that yield a multiplier as compared to voice.

9 The master file for ".COM," a major address domain, was corrupted; however, most sites only queried the master file for entries not in their caches. Entries that were cached (and those generally included all the usual peers of any given site) were used, despite their apparent deletion from the master file.

While catastrophic failure garners the most popular attention, there are many dimensions to trustworthiness, and consequences may involve various subsets of them with varying degrees of severity. For example, cellular telephony fraud has two principal variants, approximately equal in size: credit fraud, whereby the cellular telephone owner transfers the account to a second provider and does not pay the first; and cloning, the transfer to a new device of the numbers that identify a radio and customer account. In both cases, the service provider loses revenue. Under some circumstances, a legitimate caller may be denied service if illegitimate users saturate the network.10 In the case of telephone cloning, if the clone user does not saturate the network, the provider loses revenue but users do not incur an immediate cost.11 Understanding consequences is essential to forming baseline expectations of private action and what incentives may be effective for changing private action, but that understanding is often hampered by the difficulty of quantifying or otherwise specifying the costs and consequences associated with risks.

Risk Management Strategies

Risk management strategies are approaches to managing trade-offs.12 These strategies address questions about whether it is better, for example, to add a small degree of security to a large number of products or substantial security to a smaller number of specific products, to use high-security/low-availability solutions or low-security/high-availability ones, or to increase assurance or the ability to identify and quarantine attackers. Trade-offs can be made in system design and engineering; they can also be made in deciding whether to invest in technology, procedure, insurance, or inaction.

10 Note that the cost of denied service to the legitimate caller may far exceed the price of the telephone call itself. For example, a delay in requesting emergency services (e.g., a call to the fire department) may carry catastrophic costs.

11 However, to the extent that the cellular carrier is responsible for the resulting wireline and long-distance charges from the telephone clone, a rise in the cellular carrier's rates may be forthcoming.

12 It is essential (1) that the actual system match the model underlying the analysis as closely as possible, and (2) that the failure rates achieved by system components match the estimates used in the model. The former is a systems/safety engineering issue, whereas the latter involves all the engineering disciplines engaged in preparing the components. The process usually followed to achieve these two goals has two parts: the first is careful management of the development process; the second is iterative evaluation of the system design as it is developed. If changes are made for any reason, the risk estimation might be repeated. If necessary, elements of the system design can be modified to reduce the risk. For example, if a nuclear plant's cooling system is shown to be unable to meet its dependability requirements because a particular type of pump tends to fail more often than is acceptable, then the design can be modified to include a backup pump.

Risk avoidance is a strategy that seeks to reduce risk to the lowest possible value. Reducing risk takes precedence over cost or effect on the operational characteristics of the system in question. Risk avoidance strategies arose in the context of high-consequence systems, such as nuclear weapon command and control or the protection of nuclear weapon stockpiles. At the time these systems were developed, there was a clear boundary between high-consequence applications and "ordinary" software, whose malfunctions could be expensive and annoying but did not threaten human life or significant assets. With the increasing use of Internet technology, this boundary is becoming blurred.

The underlying assumption of risk avoidance strategies, when security is emphasized, is that there exists a highly capable threat that will expend great effort to achieve its goals. The achievement of those goals will involve such extreme consequences (e.g., uncommanded nuclear weapon release) that all possible effort should be devoted to preventing such consequences from being realized. Risk avoidance strategies, in general, incorporate every protection mechanism and invoke every possible assurance step. Many of these assurance steps, which are discussed in detail in Chapter 3, can handle only certain classes of designs or implementation technologies. When these limitations are imposed in addition to those of the rigid design guidance, the result is very often a system that is expensive, slow to deploy, and cumbersome and inefficient to use. Experience with risk avoidance strategies indicates that residual vulnerabilities will remain irrespective of the number of assurance steps taken. These vulnerabilities will often require quite exotic techniques to exploit; exotic, that is, until they are discovered by a threat or (worse yet) published on the Internet.13 However, the costs associated with avoiding all risks are prohibitive.

Thus, risk mitigation is more typical and is generally encountered when many factors, including security and reliability, determine the success of a system. Risk mitigation is especially popular in market-driven environments where an attempt is made to provide "good enough" security or reliability or other qualities without severely affecting economic factors such as price and time to market. Risk mitigation should be interpreted not as a license to do a shoddy job in implementing trustworthiness, but instead as a pragmatic recognition that trade-offs among the dimensions of trustworthiness, economic realities, and other constraints will be the norm, not the exception.

13 Some exotic strategies require specialized hardware or physical access to certain systems, whereas other exotic strategies may require only remote access and appropriate software to be executed. It is this latter class of strategies that is particularly susceptible to dissemination via the Internet.

The risk mitigation strategies that are most relevant to trustworthiness can generally be characterized according to two similar models:

- The insurance model. In this model, the cost of countermeasures is viewed as an "insurance premium" paid to prevent (or at least mitigate) loss. The value of the information being protected, or the service being provided, is assessed, and mechanisms and assurance steps are incorporated up to, but not exceeding, that value.

- The work factor model. One definition in cryptology for the term "work factor" is the amount of computation required to break a cipher through a brute-force search of all possible key values.14 Recently, the term has been broadened to mean the amount of effort required to locate and exploit a residual vulnerability. That effort may involve more efficient procedures rather than exhaustive searches. In the case of fault tolerance, the assumptions made about the types of failures (benign or arbitrary) that could arise are analogous to the concept of work factor.

The two models are subject to pitfalls, some distinctive to each and some common to both. In the insurance model, it is possible that the value of information (or disruption of service) to an outsider is substantially greater than the value of that information or service to its owners. Thus, a "high value" attack could be mounted, succeed, and the "insurance premium" lost along with the target data or service. Such circumstances often arise in an interconnected or networked world. For example, a local telephone switch might be protected against deliberate interruption of service to the degree that is justified by the revenue that might be lost from such an interruption. But such an analysis ignores the attacker whose aim is to prevent a physical alarm system from notifying the police that an intrusion has been detected in an area containing valuable items. Another example is an instance in which a hacker expends great effort to take over an innocuous machine, not because it contains interesting data but because it provides computing resources and network connectivity that can be used to mount attacks on higher-value targets.15 In the case of the work factor model, it is notoriously difficult to assess the capabilities of a potential adversary in a field as unstructured as that of discovering vulnerabilities, which involves seeing aspects of a system that were overlooked by its designers.

14 If the cryptography is easily broken (e.g., because the keys are stored in shared memory), the work factor may be almost irrelevant.

15 A specific example of this comes from the early days of electromechanical cryptosystems. At that time, governments typically deployed an array of different cryptosystems of different strengths: simple (and easier to break) cryptosystems for less sensitive data, and elaborate electromechanical devices to encipher highly sensitive data (called, respectively, "low-grade" and "high-grade" systems). This approach can be looked at as a risk mitigation strategy, on either the insurance or the work factor model, depending on how the decision of which system protected which data was made. Only security that was "good enough" was imposed. What the designers of these systems were slow to realize, however, was that the high-grade systems (e.g., the German Enigma machine) were vulnerable to "known plaintext" attacks, whereby the cryptanalyst was able to match unenciphered and enciphered characters and thereby recover the key that deciphered other, previously unknown, messages. The nature of military and diplomatic communication is such that much text is "cut and pasted" from innocuous messages to more sensitive ones. Breaking the low-grade ciphers then provided the "known plaintext" that facilitated attacks on the high-grade ciphers.
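The original, cryptologic sense of work factor is readily quantified. The sketch below is a hypothetical calculation, not taken from the report; the attacker's trial rate is an assumed figure used only to show the arithmetic of a brute-force key search.

```python
# Work factor as the computation needed for a brute-force search of a
# cipher's key space. On average, half of the 2**key_bits keys must be
# tried before the right one is found. The trial rate is a hypothetical
# attacker capability, not a measured one.

SECONDS_PER_YEAR = 3600 * 24 * 365

def expected_search_years(key_bits: int, trials_per_second: float) -> float:
    expected_trials = 2 ** (key_bits - 1)  # half the key space, on average
    return expected_trials / trials_per_second / SECONDS_PER_YEAR

RATE = 1e9  # assume one billion key trials per second

print(expected_search_years(56, RATE))   # 56-bit key (e.g., DES): ~1.1 years
print(expected_search_years(128, RATE))  # 128-bit key: ~5.4e21 years
```

The broadened sense of the term (the effort to locate and exploit a residual vulnerability) has no such closed form, and, as footnote 15 illustrates, cheaper-than-brute-force procedures such as known-plaintext attacks can collapse a nominal work factor dramatically.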

Selecting a Strategy

Risk management seeks to provide an analytical framework for deciding how close to the edge one dares to go. Risk avoidance carries with it the danger of overengineering to the point at which the system is never used. Risk mitigation carries with it the danger of underengineering to the point at which the system is defeated, very possibly over and over again. The compound uncertainties of risk management preclude any rigorous method, but it is possible to articulate a few guidelines:

- Understand how long the system will be used in harm's way. Threats are not static; they become more capable over time, through the release of once-secret information from disgruntled former employees and other sources, through access to once-esoteric equipment, and through other means.16

- Assess how much work is needed to exploit a known residual vulnerability. Does the attack require specialized equipment? Is this the sort of equipment that will drop drastically in cost over the next few years? Is it the sort of equipment that is freely accessible in open environments such as universities? Does the attack require a level of physical access that can be made hard to achieve?

- Context is extremely important. It is necessary to understand how the system might be used, how it is connected to or interacts with other systems, and how it might be exploited in the course of attacking something else.

- Can the system-support infrastructure react to vulnerabilities? Are system updates possible, and if so, at what cost? How many instances of the system will be deployed, and how widely are they dispersed? Is there a mechanism for security recalls?17 Can the infrastructure continue critical operations at a reduced and trusted level if attacked?

16 The so-called "cloning" attack, which is responsible for a large percentage of cellular fraud today, was at one time understandable only by a small handful of electronic engineers and required expensive, custom-made equipment. Today that attack is embodied in clandestine consumer products and can be mounted by any individual with the will and a few hundred dollars. The will has increased for many because there are more targets: high-use areas make listening for identification numbers more feasible.

17 For example, in GSM cellular phones, the security algorithms are embedded in per-subscriber smart cards and in a small number of authentication stations. This permits the relatively easy phaseout of an algorithm that has been cracked, although it remains to be seen whether providers will indeed replace the COMP128 algorithm. See for details.

The difficulties of anticipating and avoiding most risks can lead to strategies that emphasize compensatory action: detecting problems and responding to minimize damage, recovering, and seeking redress in some circumstances. The difficulty with this approach is the implicit assumption that all attacks can be identified. Anecdotal reports of success by "tiger teams" seeking to compromise systems suggest that detection may continue to be a weak vehicle for the future.18

Findings

1. Security risks are more difficult to identify and quantify than those that arise from safety or reliability concerns. Safety and reliability risks do not involve malice, and their tangible and often severe consequences can be easily articulated. These considerations facilitate the assessment of risk and the measurement of consequences for safety- and reliability-related risks.

2. Although a risk avoidance strategy may maximize trustworthiness, the prohibitive cost of that strategy suggests that risk mitigation is the pragmatic strategy for most situations.

3. Consequences may be uneven and unpredictable, especially for security risks, and may affect people with varying levels of severity. Safety-related consequences are generally perceived to be more serious than other consequences.

CONSUMERS AND TRUSTWORTHINESS

The spending decisions made by consumers have a profound impact on the trustworthiness of NISs. The consumers of trustworthiness may be partitioned into two groups: information system professionals, who act on behalf of groups of relatively unsophisticated users, and the general public.

18 For example, consider the success of the "Eligible Receiver" exercise, in which a team of "hackers" posing as paid surrogates for North Korea could have disabled the networked information systems that control the U.S. power grid (Gertz, 1998).

Information system professionals often have only a modest understanding of trustworthiness because of the limited attention devoted to trustworthiness within college curricula and professional seminars. Even information system professionals who concentrate on security issues vary greatly in their understanding of issues associated with trustworthiness.19 The larger group of consumers is the general public, mostly unsophisticated with respect to trustworthiness despite a growing familiarity with information technology in general. The rise of an information systems mass market during the last two decades, and the concomitant influx of unsophisticated users, exacerbates the asymmetric distribution of understanding of trustworthiness concerns.

Consumer Costs

Consumer costs include all costs associated with trustworthiness that are borne by the user. Some of these costs are associated with the prevention or detection of breaches in trustworthiness; other costs are related to recovery from the effects of inadequate trustworthiness. Consumer costs include expenditures for the acquisition and use of technology, the development and implementation of policies and practices, insurance, legal action, and other activities. Consumer costs may be divided into direct costs, indirect costs, and failure costs.

Direct Costs

Direct costs are those expenditures that can be associated unambiguously with trustworthiness. This category includes purchases of products such as firewalls or anti-virus software. Sometimes, direct costs may represent the incremental cost of products that offer superior trustworthiness compared with alternatives (e.g., fault-tolerant computers). Services may also be categorized as direct costs, as in the case of maintaining hot sites,20 consulting and training to improve operational practices, analyzing system audit data, or upgrading hardware to improve reliability.

Direct costs vary widely, depending on the requirements of the consumer. Historically, specialized users have had the most demanding requirements and incurred the most costs; the canonical example is the military, but other institutions such as banking, air traffic control systems, and nuclear power facilities also have exacting requirements for security, safety, and reliability. The direct costs relative to trustworthiness are

19 This conclusion was derived from discussions at several committee meetings.

20 Hot sites are physical locations where an organization may continue computer operations in the case of a major disruption, such as an earthquake that renders the normal operating site largely unusable. Organizations may maintain their own hot sites or may contract for this service with specialty firms.

In a recent study (Anderson et al., 1998), 45 NSA-funded projects in the area of information system security and survivability were identified. Although the enumeration may not be comprehensive, it does indicate the nature and scope of the research funded by NSA (see Appendix T).

Of R2's contract funds, a significant portion goes to support nonresearch activities such as participation in standards-setting organizations (e.g., the Internet Engineering Task Force, where R2 contributed the ISAKMP protocol to the IPsec standards effort), consortia membership (e.g., the ATM Forum, where R2 also contributed to security protocol standards), and support for infosec education (e.g., the Biometrics Consortium, the Network Security Management Forum, and support for infosec studies at the Naval Postgraduate School and the University of Maryland). Numerous activities, both external and contract funded, are focused on understanding and assessing various products and technologies (e.g., hacker tools, cryptography for electronic cash). R2 also supports several efforts to modify COTS products to incorporate new or expanded security functionality (e.g., biometrics access controls and intrusion detection for Windows NT).

Issues for the Future

The committee reviewed a draft of R2's "Information System Security Research Program Plan," which was revised multiple times in 1996-1997.98 This plan calls for greater interaction with the entire infosec community and a more open but focused R2 research program, which would be based on input from an infosec research council (sponsored by NSA and including participants from the relevant agencies and the military services), a national infosec technical baseline (established by NSA, DOE, and DOE's national laboratories), and an infosec science and technology study group (composed of leading experts who would provide an infosec perspective from the private sector). By design, the draft plan would support technology R&D "consistent with the fundamental security principles and concepts articulated in the DOD Goal Security Architecture" (Burnham, 1997). To ensure a supply of knowledgeable experts in the future, the draft plan calls for the establishment of academic centers for infosec studies and research. The plan also emphasizes technology transfer to the infosec side of NSA, to the military services, and to industry.

The committee believes that R2 faces two related challenges. One challenge is its research portfolio. Because NSA both funds external infosec research and performs internal infosec research, questions arise as to the appropriate allocation of effort (internal and external) and its coordination. Decisions about internal effort, like decisions about external effort, should recognize where the parties have comparative advantage. Highly classified cryptographic research is a natural choice for internal research; NSA has widely recognized strength in that area and has better access to mathematical talent, in terms of both the caliber and the number of researchers. Other areas of trustworthiness, less constrained by classification requirements, seem more appropriate for R2 to pursue externally.

The second critical issue is the recruitment, retention, and continuing education of high-quality talent to pursue noncryptographic trustworthiness research areas. In these areas, especially those that depend on computer science, highly skilled researchers available in many academic and commercial organizations can make significant contributions to infosec technology. R2 will have to compete for that talent with other agencies that have established relationships with top researchers.

98 Authored by Blaine Burnham, NSA. This document was provided to the committee by R2 when the committee asked for insight into R2's thinking about future directions. The committee examined this document not as a formal plan for NSA, but as a white paper, a source of possibilities for the future.

Furthermore, top-tier talent with security expertise is scarce, and nongovernment employers would appear to offer more rewards, from recognition to pay (Lardner, 1998). Skills developed in an infosec research group, especially those relating to network security, cryptography, and COTS software, are easily marketable in the commercial sector, a fact that constrains both hiring and retention in R2. Finally, there is the perception that the "cloak and dagger image" that once attracted some people to NSA is no longer as strong, because of a smaller defense budget and rapidly growing private-sector alternatives (Lardner, 1998).

As previously indicated, senior management at NSA and NSA advisory groups have stated that it is difficult to obtain and retain highly qualified technical research staff with computer-related expertise for the R2 organization.99 Within R2, staff is spread thinly, and the loss of an individual can have a significant impact on organizational coverage. Further, the ability of a technologist to do research is reportedly limited by administrative and other obligations. The adoption of a rotation program, comparable to those at the NSF and DARPA for program managers, could be considered as a complement to hiring regular staff members. To be effective, such a program would have to be carefully designed to attract the desired researchers to the NSA.

R2 may be at a disadvantage within NSA inasmuch as its work is removed from the fielded results that constitute NSA successes and is not as directly linked to NSA's mission as that of other units. These circumstances can constrain internal communication, and anecdotal evidence suggests that R2 may not always benefit from knowledge of relevant work done by sister units. By contrast, program managers pursuing trustworthiness topics at DARPA and NSF have more visibility, and they and the researchers they fund are free to publish their results. Although R2 funds and performs unclassified work, it shares the NSA environment and mind-set of tightly controlled information. This environment presents a real conflict with the need for access to open research information. It can encourage a closed community of workers who do not communicate with others in the community either to seek or to contribute information. Although R2 has increased its outreach, the conferences in which it seems most active as an organization, the NSA-NIST-sponsored National Information Systems Security Conference and its own Tech Fest, tend to attract a small community of researchers with long-standing connections to NSA. These audiences have only limited interaction with the larger community of computer science researchers with whom other HCS agency program managers have regular contact.

99 They note that R2 has not recruited from the academic researchers it supports.

Findings

1. Some government customers have particularly high needs for security, and there are a handful of systems (e.g., "The President's Laptop") that face levels of threat and require a strength of mechanism that is not available in commercial products and that would have insufficient demand to support a product in the marketplace. The NSA is particularly well situated to develop such mechanisms. Classified cryptographic research is also a natural fit for the NSA internal research program.

2. The R2 university research program emphasizes relatively short-term and small projects. Such projects do not tend to attract the interest of the best industrial and academic researchers and institutions.

3. Rotation of R2 researchers with researchers in industry and academia could help to broaden and invigorate the R2 program. Such rotation would be most effective with institutions that have large numbers of leading researchers.

4. Inadequate incentives currently exist in R2 to attract and retain highly skilled researchers. Improved incentives might be financial (e.g., a different salary scale) and/or nonfinancial (e.g., special recognition, greater public visibility). R2 faces formidable challenges in the recruitment and retention of the very best researchers.

5. R2 has initiated several outreach efforts, but these efforts have not significantly broadened the community of researchers who work with R2. Effective outreach efforts are those that are designed to be compatible with the interests, perspectives, and real needs of potential partners.

Defense Advanced Research Projects Agency

DARPA's charter is to fund research that is likely to advance the mission of the DOD.100 The DOD has requirements, such as the need for high reliability, accommodation of hostile physical environments, and adaptation to varying contexts of use (e.g., whether and what kind of wireline communications are possible; the nature of the wireless infrastructure available), that are unique to its mission, as well as requirements that are common to other segments of society.

Trustworthiness is an issue that cuts across DARPA's portfolio to varying degrees.101 Relevant work is concentrated in the Information Survivability program (with an approximate budget of $40 million per year) within DARPA's Information Technology Office (ITO) (with a budget of $300 million to $350 million per year), which supports research directly applicable to NIS trustworthiness.

100 Information about DARPA is available online at .

101 Based on examination of publicly available project descriptions.

As noted above, this program is coordinated with NSA's R2 program using the MOU established between the two agencies (and DISA) for that purpose. Universities and industrial research establishments are supported, with a program that in 1997 was divided into four subareas: high-confidence computing, high-confidence networking, survivability of large-scale systems, and wrappers and composition.

A reasonably broad set of topics is covered (see Appendix I), with some emphasis on fault tolerance and intrusion detection, at least as measured by the number of funded projects in these areas. Research in other areas important for NIS trustworthiness, as articulated in previous chapters (containment, denial-of-service attacks, and cryptographic infrastructures, for instance), although present, is not treated as prominently as it should be. To support greater use of COTS products, the DARPA Information Survivability program has sponsored research in wrappers and other technologies for retrofitting trustworthiness properties to existing components.

Other programs within ITO also support research that impinges on NIS trustworthiness in areas such as software engineering, programming languages, computer networks, and mobile communications. For example, encryption, reliability, and various aspects of information security are all concerns in the mobile communications (Global-Mobile) program. Other DARPA offices, including the Information Systems Office, support some work in electronics and other areas related to NIS trustworthiness. Finally, DARPA has provided funding to NSF to support smaller-scale and more theoretically oriented research projects in trustworthiness and software assurance.

DARPA funds research based on proposals that it receives from investigators. These proposals are written in response to published broad area announcements (BAAs), which outline general areas of research interest based on interactions among program managers, operating units of the DOD with specific technology needs, and members of the research community. Proposals are evaluated by DARPA staff as well as others within the federal government, and competition for the funding is keen. Funding levels are high relative to other government sources of research support, reflecting the emphasis on systems that often require research teams and significant periods of time to develop, allowing DARPA-funded projects to undertake nontrivial implementation efforts as well as long-range research.

The ITO's culture and its practice of organizing office- and program-wide principal investigator meetings have fostered contact between DARPA program managers and the researchers that they support.

This contact enables the research community to contribute to future DARPA-funded research directions, and it helps program managers to catalyze research communities. DARPA principal investigator meetings also facilitate interchange among those involved in DARPA-funded projects. Longer-term issues and planning are considered annually at a special, retreat-style information science and technology (ISAT) activity organized around specific topics. ISAT enables program managers to interact intensively with small groups of researchers to better understand research areas (potential BAAs) for which research funding is timely.

DARPA program managers typically are employed on temporary assignments, although there is a small cadre of longer-term staff. The ranks are populated by academics on leave from their universities, as well as scientists and developers from other branches of the government and from industry. Limited-term appointments mean that DARPA's direction and priorities are not static, with obvious advantages and disadvantages. Most problematic is that longer-term research agendas may suffer from changes in personnel, as newer program managers seek funding for research programs they wish to create, which can be achieved only by reallocating resources at the expense of existing programs. Another concern is the ability to attract top researchers for brief government stints. Those academics with well-developed research programs are reluctant to leave them for 2 to 3 years, while those researchers who have been unable to develop such programs are probably not the candidates that DARPA would like to recruit.102 On the other hand, top researchers who serve for brief government stints bring state-of-the-art thinking to DARPA and may be more willing than career employees to abandon less promising streams of research. Because the existence of effective research programs in trustworthiness and survivability is essential, whatever challenges exist in attracting topflight academics must be overcome.

The types of research undertaken have varied over the years, depending on priorities within the DOD and DARPA as well as outside influences (e.g., the NSA, Congress). Historically, DARPA projects have been high risk, pushing the envelope of technological capabilities to achieve potentially high payoffs. For example, in the early to mid-1970s, there was strong interest in DARPA security research, sparked in part by a Defense Science Board task force established to address the security problems of multiaccess, resource-sharing computer systems.

102 Interview conducted by Jean E. Smith for the Computing Research Association on March 25, 1998. Data is available online at .

In an effort to attain the widely shared goal of creating a multilevel secure operating system, the DOD aggressively funded an external research program that yielded many fundamental advances in computer security. As one view of DARPA in the 1970s put it: "The route to a solution implementing a reference monitor in a security kernel was widely agreed upon" (Mackenzie and Pottinger, 1997). By reducing some of the research and development risks, the DARPA-funded research stimulated the market to develop enhanced security capabilities (CSTB, 1991) at the same time that, not coincidentally, the United States led the computer security field and agreement emerged about the nature and role of an organization that would certify the security of actual systems.

Not every project was successful. Some were canceled, others exceeded budgets, and yet others outlived their practicality. These experiences illustrate some of the difficulties inherent in research. Some "failures" are a positive sign, as indicators that challenging ideas are being pursued (which entails some risk) and that spin-offs and learning take place, which may be applied to future successful projects.

Issues for the Future

A few university computer science departments have several faculty members who emphasize computer security research, but many departments have none who do. In any event, the number of computer security researchers is small compared to the number in other specialties, such as operating systems or networks. Among the consequences are a paucity of educational programs in security and a dearth of security experts. In recent years, DARPA funding for computer security research has been primarily incremental and short term. Longer-range research projects need to be funded, particularly those that address fundamental questions, to develop the basic research that is needed for the long-term vitality of the field. Even fewer faculty conduct research programs in some other areas of trustworthiness, such as operational vulnerabilities. Increased funding is imperative to enable reasonable progress in the critical research areas needed to improve the trustworthiness of NISs.

Although the DOD-support mission does not seem to restrict what research areas DARPA pursues, pressures to demonstrate the relevance of their research investments have generally led DARPA program managers to encourage their investigators to produce short-term results and make rapid transitions to industry. This approach can discourage investigation of more fundamental questions and experimental efforts, and thus affect which research topics are explored. Some of the research problems outlined in this report require long-term efforts (e.g., achieving trustworthiness from untrustworthy components); expecting short-term payoff may well have the effect of diverting effort from what may be the more critical problems or the most effective solutions.

The need for an increased emphasis in research on improving the trustworthiness of NISs in the long term is not consistent with the stated emphases of the current ITO direction. The current director, in a recent interview,103 articulates three main thrusts for ITO: "Let's get physical" refers to moving beyond the metaphor of a human directly interacting with a computer system to one that places greater attention on the physical world. The second main theme, "Let's get real," suggests an increased focus on real-time applications; the third theme is "Let's get mobile," referring to mobile code research. The committee believes that while some part of this focus is relevant to the research agenda needed to advance the trustworthiness of NISs (e.g., refer to the discussion of mobile code in Chapters 3 and 4), the three themes do not embrace the large majority of the most important topics.

The PCCIP calls for an increase in federal spending on information assurance R&D from an estimated $250 million currently to $500 million in FY 1999 and $1 billion in FY 2004 (PCCIP, 1997). While the study committee certainly endorses the need to increase federal spending on trustworthiness R&D, it has not seen any published rationale for this magnitude of increase. The study committee observes that for the next several years, the population of experts who are qualified to conduct trustworthiness-related research is relatively fixed, because of the lead time needed to recruit and educate new researchers. Thus, increased activity in trustworthiness-related research must be conducted by extant researchers who are already engaged in other work. The study committee believes that a quadrupling of the level of activity in the proposed time frame is therefore unnecessary. Instead, a lower rate of growth that is sustained over a greater number of years would probably be more effective, especially if it is coupled with programs to increase the number of university training programs in trustworthiness.

Findings

1. DARPA funds some research in important areas for NIS trustworthiness. However, other critical topics, including containment, denial-of-service attacks, and cryptographic infrastructures, are not emphasized to the extent that they should be.

2. The use of academics on temporary assignment as program managers has both advantages and disadvantages.

103 Interview conducted by Jean E. Smith for the Computing Research Association on March 25, 1998. Data is available online at .

This rotation of program managers ensures that state-of-the-art thinking is constantly being infused into DARPA (assuming that the leading researchers in the field are appointed). On the other hand, such rotation does not promote long-term research agendas, because a program manager's tenure typically lasts for only 2 to 3 years.

3. DARPA uses a number of mechanisms to communicate with the research community, including principal investigator meetings, ISATs, and broad area announcements. These mechanisms seem to be generally effective in facilitating the exchange of ideas between DARPA and the research community.

4. The nature and scope of major DARPA projects funded in the 1970s, in which security work was an integral part of a large, integrated effort, seem to characterize DARPA's greatest successes in the security domain. Not all of these efforts were entirely successful, as is characteristic of high-risk, high-payoff research. Some level of failure is therefore acceptable.

5. The committee believes that increased funding is warranted for both information security research in particular and NIS trustworthiness research in general. The appropriate level of increased funding should be based on a realistic assessment of the size and availability of the current population of researchers in relevant disciplines and on projections of how this population of researchers may be increased in the coming years.

REFERENCES

Anderson, Robert H., Phillip M. Feldman, Scott Gerwehr, Brian Houghton, Richard Mesic, John D. Pinder, and Jeff Rothenberg. 1998. A "Minimum Essential Information Infrastructure" for U.S. Defense Systems: Meaningful? Feasible? Useful? Santa Monica, CA: RAND National Defense Research Institute, in press.

Board on Telecommunications and Computer Applications, National Research Council. 1989. The Growing Vulnerability of the Public Switched Networks. Washington, DC: National Academy Press.

Boehm, Barry. 1981. Software Engineering Economics. Englewood Cliffs, NJ: Prentice-Hall.

Burnham, Blaine W. 1997. Information System Security Research Program Plan Version 4.0. Ft. Meade, MD: National Security Agency (R2) INFOSEC Research and Technology Office, January.

Canadian System Security Centre. 1993. The Canadian Trusted Computer Product Evaluation Criteria Version 3.0e. Ottawa, Canada: The Communications Security Establishment, Government of Canada, January.

Carpenter, Brian E., and Fred Baker. 1996. Informational Cryptographic Technology. RFC 1984. August.

Clausing, Jeri. 1998. "Federal Reserve Official Warns of Year 2000 Bug," New York Times, April 29.

Computer Science and Telecommunications Board (CSTB), National Research Council. 1991. Computers at Risk: Safe Computing in the Information Age. Washington, DC: National Academy Press.

Computer Science and Telecommunications Board (CSTB), National Research Council. 1994. Information Technology in the Service Society: A Twenty-First Century Lever. Washington, DC: National Academy Press.

Computer Science and Telecommunications Board (CSTB), National Research Council. 1996. Cryptography's Role in Securing the Information Society, Kenneth W. Dam and Herbert S. Lin, eds. Washington, DC: National Academy Press.

Cummins, Arthur J. 1998. "Investors Are Scratching Their Heads Over Details of Converting to Euros," Wall Street Journal, August 14, p. B8.

de Jager, Peter. 1993. "Doomsday 2000," ComputerWorld, 27(36):105.

Denning, Dorothy E., and Giovanni M. Sacco. 1981. "Timestamps in Key Distribution Protocols," Communications of the ACM, 24(8):533-536.

Diffie, Whitfield, and Susan Landau. 1998. Privacy on the Line: The Politics of Wiretapping and Encryption. Cambridge, MA: MIT Press.

Edmondson, Gail, Stephen Baker, and Amy Cortese. 1997. "Silicon Valley on the Rhine," Business Week, November 3, p. 162. Available online at .

Electronic Frontier Foundation. 1998. Cracking DES: Secrets of Encryption Research, Wiretap Politics & Chip Design. Sebastopol, CA: O'Reilly and Associates.

Executive Office of the President, Office of Science and Technology Policy. 1997. Cybernation: The American Infrastructure in the Information Age, A Technical Primer on Risks and Reliability. Washington, DC: Executive Office of the President.

Gertz, Bill. 1998. "Infowar Game Shut Down U.S. Power Grid, Disabled Pacific Command," Washington Times, April 17, p. A1.

Harreld, Heather. 1997. "Group Says Few Fed Sites Protect Privacy: Lack of Policies and Mechanisms Puts Web Visitors at Risk," Federal Computer Week, September 1, p. 10.

Hellman, Martin E. 1979. "DES Will Be Totally Insecure Within Ten Years," IEEE Spectrum, 32(7).

Lardner, Richard. 1998. "The Secret's Out," Government Executive, August. Available online at .

Lemos, Robert. 1998. "Lloyds to Offer Firms Insurance Against Hackers," ZDNN, April 23. Available online at .

Mackenzie, Donald, and Garrel Pottinger. 1997. "Mathematics, Technology, and Trust: Formal Verification, Computer Security, and the U.S. Military," IEEE Annals of the History of Computing, 19(3):41-59.

Masters, Brooke A. 1998. "Laptop Thefts Growing: Businesses Losing Computers, Secrets," Washington Post, March 30, p. B1.

Mayfield, William T., Ron S. Ross, Stephen R. Welke, and Bill R. Brykczynski. 1997. Commercial Perspectives on Information Assurance Research. Alexandria, VA: Institute for Defense Analyses, October.

Meissner, P. 1976. Report of the Workshop on Estimation of Significant Advances in Computer Technology. Washington, DC: National Bureau of Standards, December.

Needham, R.M., and Michael D. Schroeder. 1978. "Using Encryption for Authentication in Large Networks of Computers," Communications of the ACM, 21(12):993-999.

Needham, R.M., and Michael D. Schroeder. 1987. "Authentication Revisited," Operating Systems Review, 21(1):1.

Neumann, Peter G. 1990. "Rainbows and Arrows: How the Security Criteria Address Computer Misuse," pp. 414-422 in Proceedings of the Thirteenth National Computer Security Conference. Washington, DC: NIST/NCSC.

Noll, Roger G. 1996. Reforming Risk Regulation. Washington, DC: Brookings Institution, April.

Office of the President. 1997. A Framework for Global Electronic Commerce. Washington, DC: The White House, July 1.

President's Commission on Critical Infrastructure Protection (PCCIP). 1997. Critical Foundations: Protecting America's Infrastructures. Washington, DC: PCCIP, October.

Senior Officials Group. 1991. Information Technology Security Evaluation Criteria. London: European Community Information Systems Security, Department of Trade and Industry.

U.S. Department of Defense (DOD). 1985. Trusted Computer System Evaluation Criteria, Department of Defense 5200.28-STD, the "Orange Book." Ft. Meade, MD: National Computer Security Center, December.

Ware, Willis H. 1995. "A Retrospective of the Criteria Movement," pp. 582-588 in Proceedings of the Eighteenth National Information Systems Security Conference. Baltimore, MD: National Institute of Standards and Technology/National Computer Security Center.

Wiener, Michael J. 1994. "Efficient DES Key Search," paper presented at the Rump Session of Crypto '93, School of Computer Science, Carleton University, Ottawa, Ontario, Canada, May.

Wilson, Janet. 1998. "The IETF: Laying the Net's Asphalt," Computer, 31(8):116-117.