The Need to Establish an Information Security Foundation
In the preceding chapters, this report identifies factors contributing to low levels of computer security in commercial or nonmilitary systems, and it recommends a variety of actions intended to promote security in the design, selection, and use of computer systems. This chapter argues that a new organization should carry out many of those actions. In the discussion below, the proposed organization is called the Information Security Foundation, or ISF. Because U.S. efforts have been fragmented and inadequate while efforts in Europe are gaining momentum and cohesion, this recommendation is intended to fill a troubling void. After reviewing the requirements and options for such an organization, the committee concluded that the ISF should essentially be a private, not-for-profit organization, largely outside the government once it is launched. It would need the highest level of support from government as well as industry; the strongest expression of such support would be a congressional charter.
ACTIONS NEEDED TO IMPROVE COMPUTER SECURITY
As documented in other chapters, several actions are necessary to improve computer security. These actions form the basis for the mission of the ISF:
Defining requirements and evaluation criteria for users of commercial systems, including private sector users and government processors of sensitive but unclassified information. A major part of this effort is the development and promulgation of the Generally Accepted System Security Principles (GSSP), which would provide a set of requirements and guidelines for trustworthy computer and communications system design and use.
Conducting research and development, especially into criteria and evaluation procedures, in support of the above.
Evaluating the quality of security measures in industry-developed products during their development and throughout their life cycle, and publishing evaluation results. In particular, evaluating products for conformance to GSSP. Eventually evaluations should also consider other aspects of system trustworthiness, such as safety. (See "Assurance Evaluation" in Chapter 5.)
Developing and maintaining a system for tracking and reporting security and safety incidents, threats, and vulnerabilities.
Promoting effective use of security and safety tools, techniques, and management practices through education for commercial organizations and users.
Brokering and enhancing communications between industry and government where commercial and national security interests may conflict.
Focusing efforts to achieve standardization and harmonization of commercial security practice and system safety in the U.S. and internationally.
These actions are complementary and would be pursued most effectively and economically by a single organization. At present, some of these actions are attempted by the National Security Agency (NSA), the National Institute of Standards and Technology (NIST), and other organizations. However, current efforts fall short of what is needed to accomplish the tasks at hand, and the dominant missions of existing agencies and organizations limit the scope of their involvement in addressing the issues of computer security and trustworthiness. In particular, relevant government agencies are poorly suited to represent the needs of nongovernmental system users (although they may take some input from major system users and generate publications of interest to users).
ATTRIBUTES AND FUNCTIONS OF THE PROPOSED NEW INSTITUTION
The ISF should have the following attributes and functions:
It should be free from control by the computer and communication vendors, but it must communicate and work effectively with them. This quality is important to prevent the appearance or reality
of bias or conflict of interest. Vendors can be expected to be responsive to consistent and credible user demand, but they have not shown (and cannot be expected to show) leadership in defining and bringing to market systems with enhanced security. Thus trade associations and conventional industry consortia are not credible vehicles for the needed activities, although they would be a valuable conduit for inputs and for dissemination of outputs such as GSSP.
It should have a strong user presence, through membership and participation in its governance.
It must have defined relationships to existing governmental organizations, particularly NIST and NSA, but also other organizations relevant to its missions, such as the Defense Advanced Research Projects Agency (DARPA) and the National Science Foundation (NSF). By charter and by action, it must command the respect of both government and industry and must seek open personal and institutional communications with both. It must have ready access to technical assistance from government agencies. Most importantly, because of existing agency activities there would have to be a delineation of where the ISF would have lead responsibility in the above areas. Industry, for example, would not tolerate a situation calling for evaluations by both NSA and a new entity—but it should find tolerable a situation involving NSA evaluations for military-critical systems and ISF evaluations for other, GSSP-compliant systems, with coordination between ISF and NSA to minimize any duplication of effort.
It must serve more than just a single industry or just the governmental sector, to ensure the broad relevance of GSSP and of the evaluations that would be performed to ensure conformance to GSSP.
It must strive to be at the forefront of the computer security field, attracting top-notch people. Staffing would take time, but the opportunity to do research is necessary to attract the most talented candidates.
It should address the broader problem of how to make computer systems trustworthy, integrating security with related requirements such as reliability and safety. Implementing these related requirements can benefit from similar techniques and mechanisms in many instances. While the ISF should focus initially on security, it should consider related areas such as safety and reliability from the start. Although a security constituency seems to be emerging outside of government, there is nothing analogous for computer system reliability and safety. The ISF could lead in helping to establish a constituency for system trustworthiness.
It should have a strong, diversified funding base. In particular, it must not depend on government funding, although federal seed
money would be appropriate. Although government has much in common with the rest of the economy in terms of the kinds of computer systems and applications it chooses, governmental priorities in system design, use, and management may differ from those found elsewhere, even for systems processing sensitive but unclassified information. Perhaps most importantly, government funding is unlikely to reach the levels or have the stability necessary to sustain the ISF. Finally, policy independence may be necessary in some cases, such as when the ISF is called on to seek a middle ground between commercial and defense perspectives.
The development and dissemination of GSSP would be central functions of the ISF. These activities would build on research and on consensus across a variety of stakeholding communities (vendors, commercial users, the general public, and government). The goal is to achieve universal recognition along the lines that the Financial Accounting Standards Board (FASB) has for what have been called Generally Accepted Accounting Principles (GAAP). Although the analogy to FASB is not perfect, it presents some notable parallels:
The FASB plays a unique role in our society. It is a [de facto] regulator that is not a government agency. It is an independent private foundation financed by contributions and by revenues from the sale of its publications. Contributions are primarily from corporations and public accounting firms, but the FASB is independent of the contributors by virtue of a carefully drawn charter. By the same token, the FASB is independent of both the American Institute of CPAs and the Securities and Exchange Commission, even though its "clout" comes from the fact that both institutions accept FASB pronouncements as the prime authority for purposes of preparing financial statements in accordance with generally accepted accounting principles.…
The FASB is the latest in a line of accounting standard-setting bodies that go back to the stock market crash of 1929 and the consequent Securities Acts of 1933 and 1934. The stock market crash drove home the point that the U.S. economy depends greatly on a smoothly functioning capital market.… (Mosso, 1987)
While FASB's GAAP are intended to assure fair disclosure by companies to investors and creditors, GSSP are intended to protect companies and individuals both inside and outside a computer-system-using entity. However, similar motivations inform the proposed ISF and FASB. If industry does not pursue such an effort to protect itself and the public, there is a possibility of greater government regulation (see "Regulation as a Market Influence" in Chapter 6).
OTHER ORGANIZATIONS CANNOT FULFILL ISF'S MISSION
As noted above, the beginnings of the ISF's mission can be found in government. The history of government involvement in computer and communications security is outlined in Chapter Appendix 7.1. The forebear closest to the proposed ISF is the National Computer Security Center (NCSC), which has supported the development of the Orange Book and performed evaluations of products against its criteria (see Appendix A of this report). As is discussed in preceding chapters, the Orange Book criteria and the associated evaluation process fall short of what vendors, users, and a wide range of security experts consider necessary. Perhaps most important, the NCSC has undergone a reorganization and downsizing that may severely limit its ability to meet its old mission, let alone an expanded mission.
A number of significant events have shaped the role of the NCSC in civilian computing. The promulgation of National Security Decision Directive (NSDD) 145 in 1984 expanded the NCSC's scope to include civilian government and some aspects of the private sector's concerns for protection of sensitive unclassified information. Subsequent passage of the Computer Security Act of 1987 (P.L. 100–235) and the July 1990 issuance of NSD 42, revising NSDD 145, substantially limited that scope to classified, national-security-related activities. As a result, the NCSC's influence on commercial and civilian government use of computers has been greatly reduced.
Starting in 1985, internal reorganizations within the NSA have merged the separate and distinct charter of the NCSC with NSA's traditional communications security role. Most recently, the NCSC was reduced to a small organization that provides an external interface to product developers. The actual evaluations will be performed by NSA staff, sometimes assisted by specific outsiders (e.g., MITRE Corporation and Aerospace Corporation), in direct response to requirements of the national security community. Although outsourcing evaluation work is a practical solution to NSA's limited resources, it raises questions about the accountability of and incentives facing the evaluators. These questions are of great concern to industry, which has complained about the duration of evaluations and the late placement of the evaluation process within the product cycle. Another issue raised by the reorganization is the extent to which NSA will remain concerned with evaluation of systems at the lower levels of the Orange Book, such as C2.1
The other major government player in this area is NIST, which through the National Computer Systems Laboratory (NCSL) is concerned with computer and communications security. At present NIST lacks the technical and financial resources to execute the agenda defined here for ISF, and it also lacks the necessary charter and organizational support. NIST's recent move to coordinate with industry a clearinghouse focused on protection against viruses illustrates NIST's opportunities for expansion, but it also illustrates NIST's limited resources: this is a small-scale, limited-focus effort (Danca, 1990e).
In the computer security arena, NIST has traditionally focused on supporting technical standards (e.g., those related to Open Systems Interconnection (OSI) and Integrated Services Digital Networking) and developing guidelines for system management and use. These activities are more straightforward than articulating GSSP and developing guidelines for associated evaluations. Evaluating the security functionality and assurance of a computer system, for example, is more difficult than evaluating conformance to interoperability standards. Although NIST has been involved with standards conformance testing (and has begun a program to establish testing for conformance to certain DES standards), it has so far not undertaken to specify evaluation criteria for the civil government, to evaluate commercial products against any criteria, or to offer guidelines for system-level evaluation.2 Such guidelines would have to describe how to judge the effectiveness of security safeguards against an anticipated threat.
Finally, its relations with NSA, on which it relies for technical assistance and with which it has an agreement not to compete with the Orange Book process, have not given NIST the scope to act with substantial independence. The committee has doubts that NIST's National Computer Systems Laboratory could play the role that is required, given its present charter and in particular the difficulty it has in achieving satisfactory and consistent funding.
As banks, insurance companies, and business in general have become increasingly interested in computer security, these organizations have found that their interests are not well served by the present activities of NCSC or NIST. This situation is evidenced by either ignorance of or resistance to the Orange Book (see Chapter 6) and by observations on the inadequate budget and program of NIST.
But existing private organizations are also poorly suited to undertake the actions needed to improve computer security. Currently, much activity in the private sector is driven by vendors, regulated
industries, and large computer and communications system users. They affect the overall state of commercial security through the marketplace, trade associations, and relevant standards-setting ventures. As discussed in Chapter 6, their influence is uneven and tends to be reactive rather than proactive.
Largely (but not exclusively) in the private sector are security specialists or practitioners and their relatively new professional societies (discussed in Chapter Appendix 7.2). Security practitioners are the principal force promoting computer and system security within organizations, but they operate under a variety of constraints. In particular, the voluntary nature of professional societies for security practitioners limits their reach. Also, professional societies tend to focus exclusively on security and show no signs of addressing broader issues of system trustworthiness (in particular, safety).
WHY ISF'S MISSION SHOULD BE PURSUED OUTSIDE OF THE GOVERNMENT
Apart from the specific limitations of NIST and the NCSC, there are more general concerns about a governmental basis for the ISF.
The government has difficulty attracting and keeping skilled computer professionals. The NCSC, for example, appears to have been largely staffed by young, recently graduated computer scientists who have little practical experience in developing complex computer systems. Issues that constrain federal hiring include salary ceilings and limitations on the capitalization available to technical personnel.
The defense budget is shrinking. Department of Defense resources have supported the activities in the NCSC and relevant activities elsewhere in NSA, DARPA, and research units of the armed services (e.g., the Naval Research Laboratory). As noted in Chapter 8, defense resources will continue to be valuable for supporting relevant research and development.
The international standards arena may become a forum for the negotiation of standards for security and safety and for evaluation criteria. The American National Standards Institute (ANSI) and other private U.S. standards organizations depend on voluntary contributions of time and talent, and the role that NIST and other agencies can play in contributing to international efforts is limited. The United States needs a strong presence in these commercial standards-setting processes, complementing the existing military standards process that to date has been a major impetus to development of trusted systems.
Government's necessary concern for national security sometimes
obscures legitimate commercial interests, occasionally handicapping technology and market development that may be in the country's long-term economic security interests.
The realities of the government environment suggest that accelerating the development and deployment of computer and communications security requires a greater role for the commercial sector.3
A NEW NOT-FOR-PROFIT ORGANIZATION
Given the limitations of private and public organizations, the committee concludes that the proposed Information Security Foundation will be most likely to succeed as a private not-for-profit organization. To ensure that its viability does not depend on special-interest funding, the ISF would need multiple sources of support.
The ISF would need the highest level of governmental support, and the strongest expression of such support would be a congressional charter that would define its scope and, in particular, set parameters that would permit it to work with NSA, NIST, and other agencies as appropriate. There are general precedents for government establishment of organizations acting in the public interest, including organizations that perform tasks previously performed by public or private entities.4 In all of these organizations, effective working relationships with government and operational flexibility, which would be critical for the ISF, have been key.
Good working relationships with relevant agencies would be necessary so that ISF could contribute to satisfying government needs, especially in developing GSSP and associated evaluations, and to avoid unnecessary duplication of effort. For example, as noted above, there should be one recognized source of evaluations for a given type of system. Government recognition of evaluations conducted by the ISF would also be necessary to support international reciprocity in handling the results of evaluations in different countries (see Chapter 5).
One relatively new government initiative in computer security, the establishment of Computer Emergency Response Teams (CERTs) to deal with threatened or actual attacks in networks and systems, presents a specific opportunity for coordination between agencies and the ISF. The ISF could, building from the base already provided by DARPA, provide a common point for collecting reports of security problems in vendor products and passing these back to the vendor in a coordinated way. This function could be a part of the larger action of providing an incident database (which would not be limited to emergency situations in large networked systems); the ISF should be
able to devote more resources to this important activity than does DARPA or NIST, although DARPA-funded CERT activities could be an input into the ISF.
Success for the ISF would depend on strong participation by users and vendors. The appeal to users is that ISF would provide, through the GSSP and related evaluation processes, a mechanism for making vendors more responsive to users' needs for systems that are more trustworthy and a forum designed to identify and alleviate user problems. Vendors would get a more responsive evaluation mechanism and broader guidance for developing trusted systems than they have had in the NCSC. Both vendors and users would gain from having a single, well-endowed focal point for system security and trustworthiness.
Critical Aspects of an ISF Charter
If the concept of establishing the ISF is accepted, the details of the ISF's form and function will be discussed extensively. This report deliberately refrains from offering too detailed a vision of the ISF, lest it prematurely overconstrain the approach. However, certain aspects of the ISF seem critical. Summarized here, they should be reflected in any legislation that might bring the ISF into existence.
The board of directors of the ISF must include government, vendor, and user representatives.
The ISF must be permitted to receive private funds as its major source of income. As discussed below, such funds would most likely take the form of subscription fees and charges to vendors for product evaluations.
The ISF must not have the salary levels of its employees tied to government scales but must be able to pay competitive rates. The nature of its work means that its most significant asset and the largest source of expense will be technical personnel.
The ISF must be able to solicit support from the government for specific activities, such as research. It should be able to regrant such funds, under appropriate controls.
The legal liability that the ISF might incur by performing an evaluation must be recognized and managed, given the necessarily subjective nature of evaluations. The goal is to facilitate evaluations to protect users and vendors; of course, the ISF must be accountable in the event of negligence. This problem, which has been addressed for product-testing organizations, might in ISF's case best be handled by careful explanation
of what an evaluation does and does not signify; for example, it might signify a given probability of resistance to certain types of attack, although no amount of testing and evaluation can ever guarantee that a system will be impervious to all attacks. It might be necessary for the ISF to set up operating procedures to resolve disputes arising from evaluations; one option would be arbitration, which, unlike litigation, would avoid introducing details of product design and strategy into the public record.
The NCSC experience shows how difficult it can be to launch an effective evaluation program, in which success includes widespread industry awareness and support as well as reasonable cost and time for evaluation. Consequently, the committee believes it might take longer to inaugurate an effective ISF evaluation program than to undertake other ISF activities. The committee believes that GSSP is a vital foundation for increasing customer awareness and vendor accountability, and by extension for building an effective evaluation program. A critical pacing factor would be vendor demand for evaluations. This might be a function of true general acceptance of GSSP, coupled with case law trends that might increase vendors' perceived liability for software and system defects. If prudent customers were to specify GSSP, and vendors then used compliance with GSSP in marketing, independent evaluation of GSSP compliance would protect both vendors and users. Evaluation provides for truth in advertising from the customer's point of view, and it provides a mechanism for the vendor to demonstrate good faith. Note as a precedent that recently proposed legislation would ease the liability burden for vendors of products evaluated by the Food and Drug Administration (FDA) and the Federal Aviation Administration (Crenshaw, 1990).
Selection of an appropriate initial leader for the organization would be a critical step; that person's job would involve not only developing a business plan but also securing commitment from key stakeholders and recruiting a strong core staff. A parent organization should be designated to shelter the ISF during this first stage. Although using a government agency would expose the ISF to government politics during this first critical period, no obvious private group could play this role. A suitable "launch site" would have to be sought while the details of a charter, operating plan, and budget were being developed.
Funding the ISF
This committee recommends a not-for-profit consortium funded by consumers and procurers of secure systems and functioning as a foundation. The most difficult aspect is to establish stable long-term
funding to ensure the ISF's effectiveness, enabling such a foundation to be a credible source for requirements and evaluation and to attract and keep a first-class staff. The committee suggests that funding be derived from two sources: basic subscription fees, and usage fees from the computer manufacturers and commercial users.5 Also, the committee urges that the federal government provide seed money to launch the operation and sustain it in the early stages. The overall budget for this kind of organization would likely be about $15 million to $20 million. This assumes a budget devoted largely to costs for technical personnel, plus essential plant, equipment, and software tools. While evaluations, which are labor-intensive, might be the most expensive activity, they would be paid for by vendors.
Membership fees paid by private sector consumers of computer security products should be the basic source of funds, since consumers rather than vendors would be the main beneficiaries and would need a guarantee that their interests are paramount. For example, the first increment of funds could derive from basic subscription fees paid by all members. This funding would be used to establish the base of research and criteria development needed for the foundation to function efficiently. Note that subscription fees for Fortune 500 companies of, for example, $50,000 per year per company would generate $10 million annually if 200 participated. This seems to be a modest amount for a $5 billion organization to spend. Successful fund-raising would likely hinge on obtaining commitments from industry clusters (i.e., multiple organizations in each industry); this pattern has been observed in other consortia.
System manufacturers might be asked to pay a subscription fee ranging from $50,000 to $500,000 based on their overall revenue. Twenty vendors contributing an average of $250,000 each would generate an additional $5 million for the base fund. The basic subscription would entitle an organization to participate in the foundation's research, evaluation, and education programs. As a reference point, note that membership in the Corporation for Open Systems, which promotes development of systems that comply with open systems standards and conducts or supplies tools for conformance testing, costs $200,000 for vendors and $25,000 for users.
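The subscription arithmetic above can be checked with a simple model. The sketch below uses only the report's illustrative figures (200 user subscribers at $50,000, 20 vendor subscribers averaging $250,000); these are hypothetical assumptions, not estimates, and the total they yield sits at the low end of the $15 million to $20 million budget discussed above.

```python
# Illustrative sketch of the ISF base-funding model described above.
# All figures are the report's hypothetical assumptions, not estimates.

def base_funding(user_subscribers: int, user_fee: int,
                 vendor_subscribers: int, avg_vendor_fee: int) -> int:
    """Annual base funding from user and vendor subscription fees."""
    return user_subscribers * user_fee + vendor_subscribers * avg_vendor_fee

# 200 Fortune 500 companies at $50,000/year, plus 20 vendors averaging
# $250,000/year (within the suggested $50,000-$500,000 range).
total = base_funding(200, 50_000, 20, 250_000)
print(f"${total / 1_000_000:.0f} million per year")  # $15 million per year
```

Evaluation fees billed to vendors would come on top of this base and are deliberately excluded here, since the report treats them as cost recovery rather than general funding.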
Contributions that range into six figures are difficult to obtain, especially at a time when computer-related research and standards consortia have proliferated (e.g., Open Software Foundation, Corporation for Open Systems, Microelectronics and Computer Technology Corporation, Sematech, X/Open) and when competitive considerations and the prospect of a recession prompt budget cutting. The mission of the proposed ISF differs from that of any other entity, but the
combination of a government charter and an assured role in product evaluations will be central for gaining the necessary corporate commitments. As noted above, the impact of GAAP comes not merely because a FASB exists but because the government, through the Securities and Exchange Commission and other vehicles, has endorsed GAAP (while industry has a strong voice in GAAP development).
The second source of funds could be fees for the evaluation of industry-developed products. This is analogous to other kinds of product testing, from drug testing (for which producers incur costs directly) to testing requested by vendors but carried out by independent laboratories (e.g., Underwriters Laboratories, Inc.). The actual cost incurred by the foundation for each evaluation would be billed to the vendor. Because the base of research and criteria development activities would be funded by subscription fees, the foundation could maintain a core staff to conduct evaluations and thus could establish its independence from vendors. The special nature of the ISF would eliminate any prospect of competition with vendors and would be consistent with the necessary protection of proprietary information. Furthermore, the stability of the foundation would mean that evaluation fees could be held to a minimum. Without the pool of subscription funds as general base funding, the cost of an evaluation might be prohibitive.
It is critical that the evaluations be charged to the producer of the product. Although it would be nice to imagine the government paying for this service, the committee concludes that this option (which is provided by the NCSC today) is unrealistic. If the government pays, there is no way to adjust the level of effort to meet vendor demands. If the vendor were to pay, the ISF could allocate funds to meet the product cycle of the vendor, and in this way the evaluation process could be more responsive to vendor needs. Vendor funding would permit the organization to respond quickly with appropriate levels of qualified individuals and would provide a critical incentive to complete the evaluation process expeditiously yet thoroughly by working with vendors throughout the entire development process. The evaluations could be completed and available as the products enter the marketplace (instead of years later). The government could use the results of the ISF directly in its own evaluation of particular systems.
ALTERNATIVES TO THE ISF
A number of alternatives to the ISF, ranging from government centers to industry facilities, must at least be considered. The base against which alternatives should be measured is the present situation
wherein the NCSC does detailed technical evaluations for the classified national security community and NIST serves in a limited advisory role to the civilian government. The limitations of this situation have been discussed.
One alternative is that NIST develop its own computer security evaluation facility comparable to the NCSC. NIST's current course of (at least limited) endorsement of the Orange Book, combined with no direct involvement in actual evaluations, argues against this alternative. Without a significant change in operational orientation and funding for NIST, successful implementation of this alternative is highly unlikely.
An alternative considered in 1980, prior to the formation of the NCSC, was the establishment of a single federal computer security evaluation center for all of government, separate from the NSA but involving NSA, NIST, and other personnel representing other parts of government. The 1980 proposal would have been funded jointly by the Department of Defense (DOD) and the Department of Commerce (DOC), and it would have resulted in a center located at the National Bureau of Standards (now NIST) and thus capable of operating in an open, unclassified environment, but with the ability to deal with highly sensitive or classified issues as necessary.
Taking such an approach now would require major changes in management philosophy and funding by DOD and DOC and would most certainly require legislative action crossing many firmly established jurisdictional boundaries. For these reasons and because this alternative echoes the weaknesses of the NIST alternative, the second alternative described is unlikely to succeed. However, if industry were to resist a nongovernmental entity, then a single federal computer security evaluation organization would offer improvements over what is currently available, and it could fulfill the additional missions (development of GSSP or broader educational efforts) proposed above.
A third alternative that might avoid the staffing problems faced by government agencies would be an independent laboratory involved in computer security technology development and funded by the government at a federally funded research and development center (FFRDC) such as MITRE Corporation, Aerospace Corporation, or the Institute for Defense Analyses. Such organizations already participate in NCSC evaluations on a limited basis and can pay higher salaries and retain a core of knowledgeable experts, perhaps even rotating experts from industry. Unfortunately, the experience gained to date with these organizations assisting the NCSC and the nature of the contractual arrangement between them and NCSC have not provided opportunities for improving the existing process or for conducting research and development on the process of evaluation. Also, the
involvement of these groups in developing systems for the government might cause vendors to perceive them as potential or actual competitors, thereby inspiring reluctance to divulge the proprietary information essential for thorough evaluation. This concern has been raised by U.S. vendors in response to the U.K. plans to establish commercial licensed evaluation facilities (CLEFs).
Another approach is that taken by the FDA, a government organization that reviews testing done in-house by the producer of the product. In the case of computer and communications systems, for which evaluation is of necessity rather subjective and the quality of assessments not easily quantified, it seems unreasonable to expect that using vendor staff as evaluators could yield an unbiased result. There is no effective way for a government agency to control the process of evaluating computers and systems if it is limited to review of the results of a vendor's evaluation.
Finally, note that the mission envisioned for the ISF is not one that current independent testing laboratories can fill. Evaluating trusted systems is much more difficult and time-consuming than evaluating the performance of various forms of hardware or conformance to existing technical standards.
A HISTORY OF GOVERNMENT INVOLVEMENT
The dominant public institutions affecting computer and communications security in the United States are government agencies—in particular, but far from exclusively, agencies within the Department of Defense (DOD). Driven by national security concerns, the U.S. government has actively supported and directed the advance of computer security since the dawn of computer development; its involvement with communications security dates back to the Revolutionary War. The government's long history of involvement in computer and communications security illustrates how public institutions can nurture new technology and stimulate associated markets; it also shows where work remains to be done.
The National Security Agency and the DOD Perspective
The government's involvement with computer security grew out of the evolving field of communications security in the early 1950s, when it was deemed necessary in the United States to establish a single organization, the then very secret National Security Agency (NSA), to deal with communications security and related matters (e.g., signals intelligence) (Kahn, 1967). The historical role of the DOD, and in particular of the NSA, has produced a longstanding tension between the DOD, which seeks to fulfill its mission of protecting national security, and the civilian agencies concerned with computer security, notably the National Institute of Standards and Technology, together with the general vendor community.
The overall policy responsibility for communications security matters was originally assigned to the U.S. Communications Security (COMSEC) Board, a body of cabinet-level officials from across the government that dealt with classified government information. This structure, and NSA's highly classified responsibilities under that board, existed from the early 1950s until the mid-1970s, when the issue of using encryption to protect other than classified information caused a division within the government. The publication of the Data Encryption Standard (DES) in 1977 (NBS, 1977) (see discussion below) was a major triumph for both the civilian government and commercial communities (IBM contributed substantially to the development of DES) but has been regarded by some in the national security community as a major disaster.6 Up to that time, cryptography had remained largely a dark science, hidden in government secrecy. Encryption systems were designed by and for the government and were built and distributed under strict and highly classified government control, although some open research had begun, particularly in public-key cryptography.
Computer security does not have as extensive a history as does communications security. It has been recognized as a difficult issue needing attention for at least the past two decades. In the early 1970s, the DOD funded research into how to build computer systems that could be relied on to control access to sensitive information in accordance with a set of rules. In the mid-1970s, several research projects (e.g., secure Multics) were initiated to demonstrate such systems, and in 1978, the DOD Computer Security Initiative was formed both to promote the development of such systems by industry and to explore how to evaluate them so that they could become widely available for both government and commercial use. Perhaps the most important result of the work during the 1970s was the formulation of a computer-relevant model of multilevel security, known as the Bell and La Padula Model (Bell and La Padula, 1976), which became the focal point of DOD computer security research and development. That model (discussed in Chapter 3) formalized decades of DOD policies regarding how information could be accessed, and by whom, in manual paper-based systems.
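The essence of the Bell and La Padula Model can be conveyed in a brief sketch. The two rules shown, the simple security property ("no read up") and the *-property ("no write down"), are the model's published core; the level names, their ordering, and the Python rendering are assumptions made here for illustration only:

```python
# Illustrative sketch of the Bell and La Padula multilevel security rules.
# The clearance hierarchy below is an assumed example, not actual DOD policy.
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def can_read(subject: str, obj: str) -> bool:
    """Simple security property ("no read up"): a subject may read only
    objects at or below its own clearance level."""
    return LEVELS[subject] >= LEVELS[obj]

def can_write(subject: str, obj: str) -> bool:
    """*-property ("no write down"): a subject may write only to objects
    at or above its own level, so information cannot leak downward."""
    return LEVELS[subject] <= LEVELS[obj]

# A SECRET-cleared subject may read CONFIDENTIAL data but not write to it,
# which prevents copying sensitive information into less protected objects.
```

Together the two rules permit information to flow only upward in the classification hierarchy, mirroring the manual paper-handling policies that the model formalized.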
In 1981, the DOD Computer Security Evaluation Center was established at NSA as an entity separate from the communications security structure already in place. The reasons for this separation included the recognition that while communications security had been largely a government-owned function in which NSA developed encryption algorithms, contracted for their production, and fully controlled their distribution and use throughout the government, computers were far more widely deployed even in the early 1980s and could not be developed, produced, and controlled in the same way as encryption systems. A separate organization capable of working with industry, instead of directing it through procurement contracts, was needed.
The DOD Computer Security Center, as it came to be called, published the Trusted Computer System Evaluation Criteria (TCSEC, or Orange Book) in 1983 (superseded in 1985 by DOD 5200.28-STD; U.S. DOD, 1985d) and began working with industry to evaluate how well vendors' products met the various levels of those criteria. It should be noted that the establishment of the Computer Security Center as a separate function at NSA was opposed both within and outside the agency at the time. The internal opposition stemmed from the perception that computer security was merely a subset of communications security and should be handled in the same way by the same organization. The opposite view was that communications security was becoming increasingly dependent on computers, computer networks, and network protocols, and required a new technology base managed by a new organization. The external opposition derived from the concerns of many in the defense community, including other parts of DOD and defense contractors, that NSA's slowness to respond and its dictatorial authority in the communications security arena would hamper the development of products needed to solve current problems. These opposing forces, both within and outside NSA, continue today to influence the evolution of both computer security and communications security.
Up until the establishment of the Computer Security Center, the preceding U.S. COMSEC Board and another key policy group, the National Communications Security Committee, largely ignored the computer security problem, lumping it, if considering it at all, into the communications security arena. The 1977 Presidential Directive 24 (PD 24), which created the National Communications Security Committee, split the responsibility for communications security, giving NSA authority over the protection of classified and national security-related information and giving the National Telecommunications and Information Administration, a part of the Department of Commerce separate from the National Bureau of Standards (NBS), responsibility for protecting unclassified and non-national security information. This
split in responsibility resulted in much confusion and was opposed by many in the national security community.
Growing controversy over computer security led to intense pressure during the early days of the Reagan Administration to correct the situation. Those efforts resulted in the publication in September 1984 of National Security Decision Directive 145 (NSDD 145), the National Policy on Telecommunications and Automated Information Systems Security, which expanded NSA's role in both communications and computer security and extended its influence to the national level, to the civilian government, and to a limited extent, to the commercial world. NSDD 145 required federal agencies to establish policies, procedures, and practices to protect both classified and unclassified information in computer systems. It established the National Telecommunications and Information Systems Security Committee (NTISSC) to develop and issue national system security operating policies.
When NSDD 145 was emerging in 1983–1984, computer security had come into its own with a separate organization at NSA. NSDD 145 swept the two functions together and elevated the DOD Computer Security Center to the National Computer Security Center (NCSC), giving it and NSA's COMSEC organization roles in the civilian government as well as in the commercial world.
In late 1985 a reorganization at NSA created the Deputy Directorate for Information Security, merging the COMSEC and computer security functions and encompassing the NCSC. Since the technologies needed to develop communications security systems and computer security systems were becoming inextricably linked, this merger was viewed by many as a positive force. Others, however, viewed the expansion of NSA's role beyond the defense and intelligence communities in a highly negative way, and efforts began in Congress to redefine roles and limit the scope of NSA to its traditional communities of interest. The Computer Security Act of 1987 (U.S. Congress, 1987, P.L. 100-235) defined the role of NBS (now NIST) in protecting sensitive information (see below), and limited NSA to its traditional responsibilities for the protection of classified information.
Two recent developments have continued the withdrawal of NSA from direct and active involvement in the nondefense marketplace and its refocusing on the defense community and the protection of classified information and systems generally. First, in mid-1990, NCSC research and evaluation functions were integrated with the NSA's communications security functions. Officially, the restructuring was done to more effectively address network and system security issues and was prompted by "increasing recognition that current user applications virtually eliminate traditional distinctions between telecommunications and information systems" (NSA, 1990a).
Second, NSDD 145 was revised in July 1990, resulting in NSD 42, so that NSA no longer had responsibility for sensitive but unclassified information. In compliance with the Computer Security Act of 1987, that responsibility was assigned solely to NIST, and all references to the private sector were removed. The NTISSC became the National Security Telecommunications and Information Systems Security Committee (NSTISSC), under the new National Security Council Policy Coordinating Committee for National Security Telecommunications and Information Systems.
The National Institute of Standards and Technology
The other government agency with a longstanding interest in enhancing computer and communications security is the National Institute of Standards and Technology (NIST; formerly the National Bureau of Standards (NBS)), which serves all government unclassified, non-Warner Amendment interests. Involvement in computer and communications security began in the late 1970s and early 1980s in what is now known as the National Computer Systems Laboratory (NCSL) (formerly the Institute for Computer Sciences and Technology).
The National Institute of Standards and Technology's involvement in computer security has most often resulted in the publication of federal standards or guidelines on topics such as password protection, audit, risk analysis, and others that are important to the use of computers but do not necessarily relate to the technical aspects of protection within computer systems. These documents, formally known as Federal Information Processing Standards (FIPS) publications, are widely used within the civilian government as the basis for computer processing and computer system procurement. NIST has also issued other, tutorial publications to enhance awareness in government, in particular, of issues such as computer viruses. The FIPS publications provide valuable information to government computer managers who have little time to study the detailed technical issues concerning computer systems, but who are responsible for their proper use. FIPS publications may also be valuable to industry, but they are not widely known outside the government (although they are recognized by many security practitioners).
In 1972–1973 interest in the establishment of an encryption algorithm suitable for use by the nonclassified portions of the government and, potentially, the private sector, led to the DES project at NBS. The
issue of what constitutes "information related to national security" arose, perhaps not for the first time and definitely not for the last time, during this period. The DES controversy triggered the first in a series of actions intended to ensure that public policy addressed the broader public interest in computer and communications security, not just the military interest. In particular, it helped to motivate PD 24, discussed above. It is worth noting here that the number of people involved in cryptography and its related activities at NBS during this time frame never approached 1 percent of the number involved at NSA, and NBS's activities were continuously and substantially constrained by NSA. NBS got by with few resources by leveraging investments by IBM, which was responsible for the technical development of the cryptographic algorithm that became the DES.
As noted above, the implementation of PD 24 contributed to the issuance of NSDD 145, and concern about the associated expansion of NSA's role led to the passage of the Computer Security Act of 1987 (P.L. 100-235), which defined specific information-protection roles for NBS and thereby limited NSA's responsibilities. Shortly thereafter, NBS was renamed the National Institute of Standards and Technology (NIST). Although the renamed organization has yet to be funded at a level commensurate with its current or anticipated mission, the intent was to strengthen the organization as a vehicle for stimulating nondefense technology development. Under P.L. 100-235, NIST is primarily responsible for establishment and dissemination of standards and guidelines for federal computer systems, including those needed "to assure the cost-effective security and privacy of sensitive information in federal computer systems." NIST is also involved with other objectives of P.L. 100-235 intended to raise security awareness in the federal computing community: the establishment of security plans by operators of federal computer systems containing sensitive information, and training of all persons associated with such systems.
The complementary nature of the respective computer security missions of NSA and NIST as well as NSA's larger role in its national security arena necessitates cooperation between the two. That cooperation has recently been shaped by a Memorandum of Understanding (MOU) developed to help implement P.L. 100-235 and to assure national security review of areas of mutual interest (NIST/NSA, 1989). The Computer Security Act of 1987 calls for NIST to draw on NSA for technical assistance (e.g., research, development, evaluation, or endorsement) in certain areas. The MOU calls for NIST to draw on NSA's expertise and products "to the greatest extent possible" in developing telecommunications security standards for protecting sensitive but unclassified computer data, and to draw on NSA's guidelines for
computer system security to the extent that they are "consistent with the requirements for protecting sensitive information in federal computer systems." Under the MOU, a joint NSA-NIST technical working group was established "to review and analyze issues of mutual interest" regarding the protection of systems processing sensitive information, especially those issues relating to cryptography.
Both NSA and NIST personnel are also involved with the NIST Computer and Telecommunications Security Council and with the Computer System Security and Privacy Advisory Board organized by NIST under P.L. 100-235.
According to the MOU, NIST is prevented from developing a competing set of ratings for security product evaluation.7 It plans instead to issue a management guide, aimed at civilian government, that will explain what trusted and evaluated systems are, and will point agencies toward evaluated systems as appropriate (this topic has already been treated in an NCSL Bulletin). Although NIST does not give specific product ratings or endorsements, it is involved with developing tests of products for conformance to its standards, and it has plans to accredit other organizations to validate products for conformance to certain FIPS. NIST does not appear likely to follow the NSA in publishing lists of evaluated products such as NCSC's Evaluated Products List.
Unlike the NSA, NIST has had only a small program in security-related research. In particular, it has sponsored none of the fundamental operating system research needed to develop or evaluate trusted computer systems, although NBS monitored the research and development activities of the 1970s and held an invitational Rancho Santa Fe workshop on access control in 1972. NIST continues to participate in the DOD Computer Security Initiative through joint sponsorship of what began as the NBS Computer Security Conference (now the National Computer Security Conference), and NIST has recently held a series of workshops aimed at generating guidelines for integrity.
Observers suggest that NSA, drawing on its national security mission, continues to have a substantial, although not always direct, influence on NIST's activities. While NIST's computer security responsibilities grew as a result of P.L. 100-235, NIST was denied several budget increases requested by the Administration, and it remains funded in this area at essentially the level in place prior to the passage of the law (after taking into account growth in expenses such as salaries). Out of an appropriated NIST budget of approximately $160 million (a level almost matched by externally sponsored research), the appropriated FY 1990 NIST security program was $2.5 million; the NSA budget, the details of which are classified, is on the order of $10 billion (Lardner, 1990b). Accordingly, the number of people involved in computer
security at NBS/NIST has always been relatively small compared with the number at NSA.
Other Government Agency Involvement
The historic emphasis on the roles of NSA and NIST makes it easy to overlook the fact that other government agencies and groups are also involved in promoting computer and communications security. As discussed in Chapter 8, other DOD agencies and the Department of Energy engage in security-related research and development, although, with the exception of DARPA, much of this work is tied to the operating mission of the relevant organization; the National Science Foundation (NSF) funds basic research in mathematics and computer science that is relevant to the development of secure and trusted systems. Note that while the DOD's research and procurement have emphasized a specific area of computer security—namely access control, which has a long-established basis in manual systems—it took almost two decades to transform research concepts into commercially produced, government-evaluated products, which are only now beginning to satisfy DOD application needs. This lengthy gestation reflected the need to develop, and achieve some consensus on, complex technology and an associated vocabulary.
As recognized by P.L. 100-235, the computerization of government activities creates a need for computer and communications security in all government agencies and organizations. For example, in an informal committee survey of 1989 government requests for proposals (RFPs), some of the highest computer security requirements were stipulated for systems being procured by the Treasury Department, the Federal Aviation Administration, and the Senate. Across the government, security is one of many concerns captured in Federal Information Resources Management Regulations (President's Council on Integrity and Efficiency, 1988; GSA, 1988), and P.L. 100-235 mandates computer security planning and precautions for federal organizations. However, merely having a plan on paper is no guarantee that sound or effective precautions have been taken. The GAO has repeatedly raised this concern in connection with government computer systems (GAO, 1990c).
Two agencies, the General Services Administration (GSA; which coordinates government procurement) and the Office of Management and Budget (OMB; which influences government procurement and has a general interest in the efficient use of information and systems), set the operating climate for computer and communications security
within civil government through circulars (e.g., A-130) and other directives. Despite this nominal breadth, defense agencies, which operate under a security-oriented culture and with a strong system of information classification, have been more active than most civilian agencies in seeking greater security. They have a relatively high degree of concern about unauthorized disclosure and access control, and they have been prodded by military standards (e.g., the Orange Book, which was made into a military standard) and by procurement requirements for specific types of systems in certain applications (e.g., Tempest units that have shielding to minimize electronic emanations).
Federal concerns regarding protection of unclassified systems and data include protection against improper disclosure of personal data, as required by the Privacy Act of 1974 (P.L. 93-579), protection against fraud, and protection of the availability and integrity of government systems (on which millions depend for a variety of payments and other services).
Although the scale of and public interest in government systems may be unique, the government shares many of the same problems found in commercial and other organizations, including inadequate awareness and inadequate precautions. Because of these commonalities, many of NIST's activities, while nominally aimed at meeting civilian government needs, are relevant to industry.
A third group of government entities involved with computer and communications security comprises the investigating and prosecuting agencies, including the Federal Bureau of Investigation (responsible for major federal law enforcement and also for counterintelligence), the Secret Service (responsible for investigating computer crimes involving finance and communications fraud), the Department of Justice and the U.S. Attorneys (both responsible for prosecuting federal cases), agencies with specialized law enforcement responsibilities (e.g., the U.S. Customs Service), and state and local law enforcement entities (Conly, 1989; Cook, 1989). These agencies are concerned with deterring and prosecuting computer crimes, which may result from inadequate computer and communications security. Among the challenges they have faced are encouraging the development of laws that fit emerging and anticipated patterns of crime, and applying laws developed under different technological regimes (e.g., laws against wire fraud) to computer crimes. (See Box 7.1 for a list of relevant laws.) These agencies report difficulties in achieving support from the public (computer-related crimes often go unreported), difficulties in obtaining the necessary technical expertise, and difficulties in obtaining management support for investigations of crimes that require a relatively large expenditure of investigative resources relative to the nominal losses8 involved (Conly, 1989; Cook, 1989).
BOX 7.1 LEGISLATIVE TOOLS
Congress has responded to the computer and telecommunications threat by providing federal investigators and prosecutors with impressive tools.
Many organizations rely on a security specialist or practitioner for guidance on computer and communications security problems and practices. Most such individuals are associated with information systems planning and operation units; others may be involved with the security of larger corporate functions (including physical facilities security as well as computer system concerns), with internal or external auditing responsibilities, or with an internal or external consulting service. As this range of roles suggests, security practitioners have a
variety of backgrounds and tend to be in staff positions. Informal communication with such individuals revealed a shared perception that their job is often made difficult by management's resistance to recommendations for greater security-related controls. Nevertheless, while much of the debate about technology development has been dominated by technical (research, development, and evaluation) experts, security practitioners are a more prominent influence on the ever-growing system-using community. These are the individuals responsible for selecting, recommending, and implementing security technology and procedures.
Several professional societies provide guidelines, continuing education, and other tools and techniques to computer and communications security practitioners. They include, for example, the Information Systems Security Association (ISSA), the Computer Security Institute (CSI), the Special Interest Group for Computer Security (SIG-CS) of the Data Processing Management Association (DPMA), the American Society for Industrial Security (ASIS), and the EDP Auditors Association. Another such group has been organized by SRI International, which offers a "continuing multiclient service" called the International Information Integrity Institute (I-4). The membership of I-4 is limited, by membership decision, to approximately 50 firms that are typically represented by security practitioners (SRI International, 1989). Other groups include large-scale user groups like GUIDE and SHARE for IBM system users and industry-specific associations like the Bank Administration Institute.
The need for professional certification has been a growing concern among security practitioners. By the mid-1980s professional societies recognized that certification programs attesting to the qualifications of information security officers would enhance the credibility of the computer security profession. After attempting without success to associate with existing accredited certification programs, the Information Systems Security Association (ISSA) decided to develop its own. Committees were formed to develop the common body of knowledge, criteria for grandfathering (to accommodate the transition to the new regime of certification), and test questions. The common body of knowledge refers to the knowledge deemed necessary to accomplish the tasks or activities performed by members in the field.
Elements of the common body of knowledge identified by a committee of a new consortium of professional societies described below include the following:
Access control—capabilities used by system management to achieve the desired levels of integrity and confidentiality by preventing unauthorized access to system resources.
Cryptography—use of encryption techniques to achieve data confidentiality.
Risk management—minimizing the effects of threats and exposures through the use of assessment or analysis, implementation of cost-effective countermeasures, risk acceptance and assignment, and so on.
Business continuity planning—preparation for actions to ensure that programs critical to preserving a business are run.
Data classification—implementation of rules for handling data in accordance with its sensitivity or importance.
Security awareness—consciousness of the reality and significance of threats and risks to information resources.
Computer and systems security—understanding computers, systems, and security architectures so as to be able to determine the type and amount of security appropriate for the operation.
Telecommunications security—protection of information in transit via telecommunications media and control of the use of telecommunications resources.
Organization architecture—structure for organizing employees to achieve information security goals.
Legal/regulatory expertise—knowledge of applicable laws and regulations relative to the security of information resources.
Investigation—collection of evidence related to information security incidents while maintaining the integrity of evidence for legal action.
Application program security—the controls contained in application programs to protect the integrity and confidentiality of application data and programs.
Systems program security—those mechanisms that maintain the security of a system's programs.
Physical security—methods of providing a safe facility to support data processing operations, including provision to limit (physical) access to authorized personnel.
Operations security—the controls over hardware, media, and the operators with access privileges to the hardware and media.
Information ethics—the elements of socially acceptable conduct with respect to information resources.
Security policy development—methods of advising employees of management's intentions with respect to the use and protection of information resources.
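As a toy illustration of the cryptography element in the list above (confidentiality through encryption), the following sketch applies a one-time-pad XOR. It is not a production cipher, and the message and variable names are invented for the example:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte; applying
    # the same operation with the same key reverses the encryption.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"quarterly payroll totals"      # hypothetical sensitive data
key = secrets.token_bytes(len(message))    # random key as long as the message
ciphertext = xor_bytes(message, key)       # unintelligible without the key
recovered = xor_bytes(ciphertext, key)
assert recovered == message
```

A one-time pad achieves confidentiality only if the key is truly random, as long as the message, and never reused; practical ciphers such as the DES trade those requirements for short, reusable keys.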
In November 1988 a consortium of organizations interested in the certification of information security practitioners began to forge a joint certification program. In mid-1989, the International Information Systems Security Certification Consortium or (ISC)2 was established
as a nonprofit corporation (under the provisions of the General Laws, Chapter 180, of the Commonwealth of Massachusetts) to develop a certification program for information systems security practitioners. Participating organizations include the Information Systems Security Association (ISSA), the Computer Security Institute (CSI), the Special Interest Group for Computer Security (SIG-CS) of the Data Processing Management Association (DPMA), the Canadian Information Processing Society (CIPS), the International Federation for Information Processing (IFIP), agencies of the U.S. and Canadian governments, and Idaho State University (which has developed computer security education modules). Committees of volunteers from the various founding organizations are currently developing the products needed to implement the certification program, such as a code of ethics, the common body of knowledge, an RFP for obtaining a testing service, a marketing brochure for fund raising, and preliminary grandfathering criteria. Funds are being sought from major computer-using and computer-producing organizations.
According to (ISC)2 literature, certification will be open to all who "qualify ethically" and pass the examination—no particular affiliation with any professional organization is a prerequisite for taking the test. The examination will be a measure of professional competence and may be a useful element in the selection process when personnel are being considered for the information security function.9 Recertification requirements will be established to ensure that individual certifications remain current in a rapidly changing field, in which technological advances make certain measures obsolete and provide more effective solutions to security problems.
The growth of security practitioner groups and activities is a positive force, one that can help to stimulate demand for trust technology. Because this profession is new, still evolving, and diverse in composition, it is not clear that it can have the impact on security that, say, certified public accountants have on accounting. That assessment is based in part on the absence to date of generally accepted computer and communications security principles and mature standards of practice in this arena, as well as the absence of the kind of legal accountability that other professions have achieved.