Computers at Risk: Safe Computing in the Information Age

Appendix F
Glossary

Access
A subject's right to use an object. Examples include read and write access for data objects, execute access for programs, or create and delete access for directory objects.

Access control
The granting or denying to a subject (principal) of certain permissions to access an object, usually done according to a particular security model.

Access control list
A list of the subjects that are permitted to access an object, and the access rights of each subject.

Access label
See Label.

Access level
A level associated with a subject (e.g., a clearance level) or with an object (e.g., a classification level).

Accountability
The concept that individual subjects can be held responsible for actions that occur within a system.

Accreditation
1. The administrative act of approving a computer system for use in a particular application. See Certification. 2. The act of approving an organization as, for example, an evaluation facility.
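The access control list entry above can be illustrated with a minimal sketch: per object, a table of subjects and their rights. The subjects, object names, and helper function here are hypothetical, chosen only for illustration.

```python
# Minimal sketch of an access control list (ACL): for each object,
# the set of subjects permitted to access it and each subject's rights.
# All names are illustrative, not from the source text.

acl = {
    "payroll.dat": {"alice": {"read", "write"}, "bob": {"read"}},
    "backup.sh":   {"alice": {"read", "execute"}},
}

def is_permitted(subject, obj, right):
    """Grant access only if the subject appears on the object's ACL
    with the requested right; deny everything else by default."""
    return right in acl.get(obj, {}).get(subject, set())

print(is_permitted("bob", "payroll.dat", "read"))   # True
print(is_permitted("bob", "payroll.dat", "write"))  # False
```

Note the default-deny design: a subject absent from the list has no access, which matches the definition of an ACL as an enumeration of permitted subjects.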
Administratively directed access control (ADAC)
Access control in which administrators control who can access which objects. Contrast with user-directed access control (UDAC). See Mandatory access control.

Assurance
Confidence that a system design meets its requirements, or that its implementation meets its specification, or that some specific property is satisfied.

Auditing
The process of making and keeping the records necessary to support accountability. See Audit trail analysis.

Audit trail
The results of monitoring each operation of subjects on objects; for example, an audit trail might be a record of all actions taken on a particularly sensitive file.

Audit trail analysis
Examination of an audit trail, either manually or automatically, possibly in real time (Lunt, 1988).

Authentication
Providing assurance regarding the identity of a subject or object, for example, ensuring that a particular user is who he claims to be.

Authentication sequence
A sequence used to authenticate the identity of a subject or object.

Authorization
Determining whether a subject (a user or system) is trusted to act for a given purpose, for example, allowed to read a particular file.

Availability
The property that a given resource will be usable during a given time period.

Bell and La Padula model
An information-flow security model couched in terms of subjects and objects and based on the concept that information shall not flow to an object of lesser or noncomparable classification (Bell and La Padula, 1976).
Beta testing
Use of a product by selected users before formal release.

Biba model
An integrity model in which no subject may depend on a less trusted object (including another subject) (Biba, 1975).

Capability
An authenticating entity acceptable as evidence of the right to perform some operation on some object.

Certification
The administrative act of approving a computer system for use in a particular application. See Accreditation.

CESG
The Communications-Electronics Security Group of the U.K. Government Communications Headquarters (GCHQ).

Challenge-response
An authentication procedure that requires calculating a correct response to an unpredictable challenge.

Checksum
Digits or bits summed according to arbitrary rules and used to verify the integrity of data.

Ciphertext
The result of transforming plaintext with an encryption algorithm. Also known as cryptotext.

Claims language
In the ITSEC, the language that describes the desired security features of a "target of evaluation" (a product or system), and against which the product or system can be evaluated.

Clark-Wilson integrity model
An approach to providing data integrity for common commercial activities, including software engineering concepts of abstract data types, separation of privilege, allocation of least privilege, and nondiscretionary access control (Clark and Wilson, 1987).
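The challenge-response entry can be sketched concretely: the verifier issues an unpredictable challenge, and the claimant proves knowledge of a shared secret by returning a keyed hash over it. This is one common modern realization, not the only one; the secret value and function names are illustrative.

```python
# Sketch of a challenge-response exchange using a keyed hash (HMAC).
# The shared secret and names are hypothetical, for illustration only.
import hashlib
import hmac
import secrets

shared_secret = b"illustrative-shared-secret"

def make_challenge():
    """Verifier side: an unpredictable, single-use challenge."""
    return secrets.token_bytes(16)

def respond(secret, challenge):
    """Claimant side: prove knowledge of the secret without sending it."""
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def verify(secret, challenge, response):
    """Verifier side: recompute the expected response and compare."""
    return hmac.compare_digest(respond(secret, challenge), response)

challenge = make_challenge()
response = respond(shared_secret, challenge)
print(verify(shared_secret, challenge, response))                           # True
print(verify(shared_secret, challenge, respond(b"wrong key", challenge)))   # False
```

Because each challenge is fresh and unpredictable, a recorded response is useless for a later session, which is the property that distinguishes challenge-response from a reusable password.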
Classification level
The security level of an object. See Sensitivity label.

Cleanroom approach
A software development process designed to reduce errors and increase productivity (Poore and Mills, 1989).

Clear text
Unencrypted text. Also known as plaintext. Contrast with ciphertext, cryptotext.

Clearance level
The security level of a subject.

CLEF
In the ITSEC, a Commercial Licensed Evaluation Facility.

CoCom
Coordinating Committee for Multilateral Export Controls, which began operations in 1950 to control export of strategic materials and technology to communist countries; participants include Australia, Belgium, Canada, Denmark, France, Germany, Greece, Italy, Japan, Luxembourg, the Netherlands, Norway, Portugal, Spain, Turkey, the United Kingdom, and the United States.

COMPUSEC
Computer security.

COMSEC
Communications security.

Confidentiality
Ensuring that data is disclosed only to authorized subjects.

Correctness
1. The property of being consistent with a correctness criterion, such as a program being correct with respect to its system specification, or a specification being consistent with its requirements. 2. In ITSEC, a component of assurance (together with effectiveness).

Countermeasure
A mechanism that reduces a system's vulnerability to a threat.
Covert channel
A communications channel that allows two cooperating processes to transfer information in a manner that violates a security policy, but without violating the access control.

Criteria
Definitions of properties and constraints to be met by system functionality and assurance. See TCSEC, ITSEC.

Criticality
The condition in which nonsatisfaction of a critical requirement can result in serious consequences, such as damage to national security or loss of life. A system is critical if any of its requirements are critical.

Crypto-key
An input to an encryption device that results in cryptotext.

Cryptotext
See Ciphertext.

Data
A sequence of symbols to which meaning may be assigned. Uninterpreted information. Data can be interpreted as representing numerical bits, literal characters, programs, and so on. (The term is used often throughout this report as a collective, singular noun.) See Information.

Data Encryption Standard (DES)
A popular secret-key encryption algorithm originally released in 1977 by the National Bureau of Standards.

Delegate
To authorize one subject to exercise some of the authority of another.

Denial of service
Reducing the availability of an object below the level needed to support critical processing or communication, as can happen, for example, in a system crash.

Dependability
The facet of reliability that relates to the degree of certainty that a system will operate correctly.
Dependence
The existence of a relationship in which the subject may not work properly unless the object (possibly another subject) behaves properly. One system may depend on another system.

Digital signature
Data that can be generated only by an agent that knows some secret, and hence is evidence that such an agent must have generated it.

Discretionary access control (DAC)
An access-control mechanism that permits subjects to specify the access controls, subject to constraints such as changes permitted to the owner of an object. (DAC is usually equivalent to IBAC and UDAC, although hybrid DAC policies might be IBAC and ADAC.)

DTI
Department of Trade and Industry, U.K.

Dual-use system
A system with both military and civilian applications.

Effectiveness
1. The extent to which a system satisfies its criteria. 2. In ITSEC, a component of assurance (together with correctness).

Emanation
A signal emitted by a system that is not explicitly allowed by its specification.

Evaluation
1. The process of examining a computer product or system with respect to certain criteria. 2. The results of that process.

Feature
1. An advantage attributed to a system. 2. A euphemism for a fundamental flaw that cannot or will not be fixed.

Firmware
The programmable information used to control the low-level operations of hardware. Firmware is commonly stored in Read-Only Memories (ROMs), which are initially installed in the factory and may be replaced in the field to fix mistakes or to improve system capabilities.
Formal
Having a rigorous respect for form, that is, a mathematical or logical basis.

FTLS
Formal top-level specification. (See "Security Characteristics" in Chapter 5.)

Functionality
As distinct from assurance, the functional behavior of a system. Functionality requirements include, for example, confidentiality, integrity, availability, authentication, and safety.

Gateway
A system connected to different computer networks that mediates transfer of information between them.

GCHQ
Government Communications Headquarters, U.K.

Group
A set of subjects.

Identity-based access control (IBAC)
An access control mechanism based only on the identity of the subject and object. Contrast with rule-based access control. See Discretionary access control.

Implementation
The mechanism that (supposedly) realizes a specified design.

Information
Data to which meaning is assigned, according to context and assumed conventions.

Information-flow control
Access control based on restricting the flow of information into an object. See, for example, Bell and La Padula model.

INFOSEC
Information security. See also COMPUSEC and COMSEC.
Integrity
The property that an object is changed only in a specified and authorized manner. Data integrity, program integrity, system integrity, and network integrity are all relevant to consideration of computer and system security.

Integrity level
A level of trustworthiness associated with a subject or object.

Integrity policy
See Policy.

ITAR
International Traffic in Arms Regulations (Office of the Federal Register, 1990).

ITSEC
The Information Technology Security Evaluation Criteria, the harmonized criteria of France, Germany, the Netherlands, and the United Kingdom (Federal Republic of Germany, 1990).

Kernel
A most trusted portion of a system that enforces a fundamental property, and on which the other portions of the system depend.

Key
An input that controls the transformation of data by an encryption algorithm.

Label
A level associated with a subject or object and defining its clearance or classification, respectively. In TCSEC usage, the security label consists of a hierarchical security level and a nonhierarchical security category. An integrity label may also exist, consisting of a hierarchical integrity level and a nonhierarchical integrity category (Biba, 1975).

Letter bomb
A logic bomb, contained in electronic mail, that is triggered when the mail is read.
Level
1. The combination of hierarchical and nonhierarchical components (TCSEC usage). See Security level, Integrity level. 2. The hierarchical component of a label, more precisely referred to as "hierarchical level" to avoid confusion. In the absence of nonhierarchical categories, the two definitions are identical.

Logic bomb
A Trojan horse set to trigger upon the occurrence of a particular logical event.

Mandatory access control (MAC)
1. Access controls that cannot be made more permissive by users or subjects (general usage, roughly ADAC). 2. Access controls based on information sensitivity represented, for example, by security labels for clearance and classification (TCSEC usage, roughly RBAC and ADAC). Often based on information flow rules.

Model
An expression of a policy in a form that a system can enforce, or that analysis can use for reasoning about the policy and its enforcement.

Monitoring
Recording of relevant information about each operation by a subject on an object, maintained in an audit trail for subsequent analysis.

Mutual authentication
Providing mutual assurance regarding the identity of subjects and/or objects. For example, a system needs to authenticate a user, and the user needs to authenticate that the system is genuine.

NCSC
The National Computer Security Center, part of the National Security Agency, which is part of the Department of Defense.

Node
A computer system that is connected to a communications network and participates in the routing of messages within that network. Networks are usually described as a collection of nodes that are connected by communications links.
Nondiscretionary
Equivalent to mandatory in TCSEC usage, otherwise equivalent to administratively directed access controls.

Nonrepudiation
An authentication that with high assurance can be asserted to be genuine, and that cannot subsequently be refuted.

Object
Something to which access is controlled. An object may be, for example, a system, subsystem, resource, or another subject.

Operating system
A collection of software programs intended to directly control the hardware of a computer (e.g., input/output requests, resource allocation, data management), and on which all the other programs running on the computer generally depend. UNIX, VAX/VMS, and DOS are all examples of operating systems.

Orange Book
Common name for the Department of Defense document that is the basic definition of the TCSEC, derived from the color of its cover (U.S. DOD, 1985d). The Orange Book provides criteria for the evaluation of different classes of trusted systems and is supplemented by many documents relating to its extension and interpretation. See Red Book, Yellow Book.

OSI
Open Systems Interconnection. A seven-layer networking model.

Outsourcing
The practice of procuring from external sources rather than producing within an organization.

Password
A sequence that a subject presents to a system for purposes of authentication.

Patch
A section of software code that is inserted into a program to correct mistakes or to alter the program.
Perimeter
A boundary within which security controls are applied to protect assets. A security perimeter typically includes a security kernel, some trusted-code facilities, hardware, and possibly some communications channels.

PIN
Personal identification number. Typically used in connection with automated teller machines to authenticate a user.

Plaintext
See Clear text.

Policy
An informal, generally natural-language description of desired system behavior. Policies may be defined for particular requirements, such as security, integrity, and availability.

Principal
A person or system that can be authorized to access objects or can make statements affecting access control decisions. See the equivalent, Subject.

Private key
See Secret key.

Protected subsystem
A program or subsystem that can act as a subject.

Public key
A key that is made available without concern for secrecy. Contrast with private key, secret key.

Public-key encryption
An encryption algorithm that uses a public key to encrypt data and a corresponding secret key to decrypt data.

RAMP
Rating Maintenance Phase. Part of the National Computer Security Center's product evaluation process.

Receivers
Subjects reading from a communication channel.
Red Book
The Trusted Network Interpretation of the Trusted Computer System Evaluation Criteria, or TNI (U.S. DOD, 1987).

Reference monitor
A system component that enforces access controls on an object.

Requirement
A statement of the system behavior needed to enforce a given policy. Requirements are used to derive the technical specification of a system.

Risk
The likelihood that a vulnerability may be exploited, or that a threat may become harmful.

RSA
The Rivest-Shamir-Adleman public-key encryption algorithm (Rivest et al., 1978).

Rule-based access control (RBAC)
Access control based on specific rules relating to the nature of the subject and object, beyond just their identities, such as security labels. Contrast with identity-based access control. See Mandatory access control.

Safety
The property that a system will satisfy certain criteria related to the preservation of personal and collective safety.

Secrecy
See Confidentiality.

Secret
Known at most to an authorized set of subjects. (A real secret is possible only when the size of the set is one or less.)

Secret key
A key that is kept secret. Also known as a private key.

Secret-key encryption
An encryption algorithm that uses only secret keys. Also known as private-key encryption.
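The RSA and public-key encryption entries can be illustrated with the textbook scheme at toy scale. The primes below are tiny and the sketch omits padding, so it is emphatically not secure; real RSA keys use primes hundreds of digits long. The numbers are a standard worked example, not from the source text.

```python
# Toy illustration of textbook RSA. Insecure by construction:
# the primes are tiny and no padding is used. Illustration only.
p, q = 61, 53                  # secret primes
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent: e*d ≡ 1 (mod phi)

message = 65                   # any integer smaller than n
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the secret key (d, n)
print(ciphertext, recovered)       # 2790 65
```

This shows the asymmetry defined in the glossary entry: anyone holding the public pair (e, n) can encrypt, but only the holder of d can decrypt.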
Secure channel
An information path in which the set of all possible senders can be known to the receivers, or the set of all possible receivers can be known to the senders, or both.

Security
1. Freedom from danger; safety. 2. Computer security is protection of data in a system against disclosure, modification, or destruction, and protection of the computer systems themselves. Safeguards can be both technical and administrative. 3. The property that a particular security policy is enforced, with some degree of assurance. 4. Often used in a restricted sense to signify confidentiality, particularly in the case of multilevel security.

Security level
A clearance level associated with a subject, or a classification level (or sensitivity label) associated with an object.

Security policy
See Policy.

Sender
A subject writing to a channel.

Sensitivity label
A security level (i.e., a classification level) associated with an object.

Separation of duty
A principle of design that separates functions with differing requirements for security or integrity into separate protection domains. Separation of duty is sometimes implemented as an authorization rule specifying that two or more subjects are required to authorize an operation.

Shareware
Software offered publicly and shared rather than sold.

Signature
See Digital signature.

Simple security property
An information-flow rule stating that a subject at a given security level can read only from an object with a security label that is the same or lower (Bell and La Padula, 1976).
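The simple security property ("no read up") reduces to a comparison of levels, and can be sketched in a few lines. The level names and their ordering below are illustrative of the usual military-style hierarchy; nonhierarchical categories are omitted for simplicity.

```python
# Sketch of the simple security property from the Bell and La Padula
# model: a subject may read an object only if the subject's level is
# the same as or higher than the object's. Levels are illustrative.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def may_read(subject_level, object_level):
    """Allow 'read down' and 'read equal'; deny 'read up'."""
    return LEVELS[subject_level] >= LEVELS[object_level]

print(may_read("secret", "confidential"))  # True  (read down: allowed)
print(may_read("secret", "top secret"))    # False (read up: denied)
```

A full Bell and La Padula enforcement would pair this with the *-property governing writes and would compare categories as well as hierarchical levels; only the read rule defined in this entry is shown.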
Smart card
A small computer in the shape of a credit card. Typically used to identify and authenticate its bearer, although it may have other computational functions.

Source code
The textual form in which a program is entered into a computer (e.g., FORTRAN).

Specification
A technical description of the desired behavior of a system, as derived from its requirements. A specification is used to develop and test an implementation of a system.

Spoofing
Assuming the characteristics of another computer system or user, for purposes of deception.

State
An abstraction of the total history of a system, usually in terms of state variables. The representation can be explicit or implicit.

State machine
In the classical model of a state machine, the outputs and the next state of the machine are functionally dependent on the inputs and the present state. This model is the basis for all computer systems.

STU-III
A secure telephone system using end-to-end private-key encryption.

Stub
An artifact, usually software, that can be used to simulate the behavior of parts of a system. It is usually used in testing software that relies on those parts of the system simulated by the stub. Stubs make it possible to test a system before all parts of it have been completed.

Subject
An active entity (e.g., a process or device acting on behalf of a user, or in some cases the actual user) that can make a request to perform an operation on an object. See the equivalent, Principal.
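The classical state machine definition above, where output and next state are functions of the current state and input, can be sketched as a transition table. The states and events here (a hypothetical login monitor) are invented purely for illustration.

```python
# Minimal classical state machine: (state, input) -> (next state, output).
# The login-monitor states and events are hypothetical examples.
TRANSITIONS = {
    ("logged_out", "login_ok"):   ("logged_in",  "grant"),
    ("logged_out", "login_fail"): ("logged_out", "deny"),
    ("logged_in",  "logout"):     ("logged_out", "bye"),
}

def step(state, event):
    """Return (next_state, output); unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), (state, "ignored"))

state = "logged_out"
state, output = step(state, "login_ok")
print(state, output)  # logged_in grant
```

Everything the machine "remembers" is carried in the single state value, matching the glossary's definition of state as an abstraction of the system's total history.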
System
1. A state machine, that is, a device that, given the current state and inputs, yields a set of outputs and a new state (see State machine). 2. An interdependent collection of components that can be considered as a unified whole, for example, a networked collection of computer systems, a distributed system, a compiler or editor, a memory unit, and so on.

TCB
See Trusted computing base.

TCSEC
The Department of Defense Trusted Computer System Evaluation Criteria (U.S. DOD, 1985d). See Orange Book.

Tempest
U.S. government rules for limiting compromising signals (emanations) from electrical equipment.

Threat
The potential for exploitation of a vulnerability.

Time bomb
A Trojan horse set to trigger at a particular time.

Token
When used in the context of authentication, a physical device necessary for user identification.

Token authenticator
A pocket-sized computer that can participate in a challenge-response authentication scheme. The authentication sequences are called tokens.

Trapdoor
A hidden flaw in a system mechanism that can be triggered to circumvent the system's security.

Trojan horse
A computer program whose execution would result in undesired side effects, generally unanticipated by the user. A Trojan horse program may otherwise give the appearance of providing normal functionality.
Trust
Belief that a system meets its specifications.

Trusted computing base (TCB)
A portion of a system that enforces a particular policy. The TCB must be resistant to tampering and circumvention. Under the TCSEC, it must also be small enough to be analyzed systematically. A TCB for security is part of the security perimeter.

Trusted system
A system believed to enforce a given set of attributes to a stated degree of assurance (confidence).

Trustworthiness
Assurance that a system deserves to be trusted.

Tunneling attack
An attack that attempts to exploit a weakness in a system at a low level of abstraction.

User authentication
Assuring the identity of a user. See Authorization.

User-directed access control (UDAC)
Access control in which users (or subjects generally) may alter the access rights. Such alterations may be restricted by the access controls to certain individuals, for example, limited to the owner of an object. Contrast with administratively directed access control. See Discretionary access control.

Vaccine
A program that attempts to detect and disable viruses.

Virus
A program, typically hidden, that attaches itself to other programs and has the ability to replicate. In personal computers, "viruses" are generally Trojan horse programs that are replicated by inadvertent human action. In general computer usage, viruses are more likely to be self-replicating Trojan horses.

Vulnerability
A weakness in a system that can be exploited to violate the system's intended behavior. There may be security, integrity, availability, and other vulnerabilities. The act of exploiting a vulnerability represents a threat, which has an associated risk of being exploited.
Worm attack
A worm is a program that distributes itself in multiple copies within a system or across a distributed system. A worm attack is a worm that may act beyond normally permitted behavior, perhaps exploiting security vulnerabilities or causing denial of service.

Yellow Book
The Department of Defense Technical Rationale Behind CSC-STD-003-85 (U.S. DOD, 1985b). Guidance for applying the TCSEC to specific environments.

ZSI
Zentralstelle für Sicherheit in der Informationstechnik. The German Information Security Agency (GISA).