
Who Goes There?: Authentication Through the Lens of Privacy (2003)

Chapter: 5 Authentication Technologies

Suggested Citation:"5 Authentication Technologies." National Research Council. 2003. Who Goes There?: Authentication Through the Lens of Privacy. Washington, DC: The National Academies Press. doi: 10.17226/10656.


5 Authentication Technologies

This chapter describes the basic technologies used as building blocks for authentication systems, especially those employed in computer and network environments. First, it describes technological choices that determine a dimension of authentication separate from the different kinds of authentication described in Chapter 2. Then, technological instantiations of the three main authentication mechanisms (something you know, something you have, something you are) are described. Multifactor authentication is considered, and decentralized and centralized systems are compared. Finally, security and cost considerations for individual authentication technologies are discussed. Throughout, this chapter also touches on the privacy implications of specific technologies in the context of authentication systems, as appropriate.

TECHNOLOGICAL FLAVORS OF AUTHENTICATION

"Individual authentication" is defined in Chapter 1 as the process of establishing an understood level of confidence that an identifier refers to a specific individual. In an information systems context, it often is useful to distinguish among several types or modes of authentication, both for individuals (often referred to as "users") and for devices (such as computers). This is a dimension distinct from the individual/attribute/identity authentication types discussed in Chapter 2. In the security literature, these modes are often referred to as one-way as opposed to two-way authentication, initial versus continuous authentication, and data origin versus peer-entity authentication.

Much individual authentication in an information system takes place in a client/server context, in which the individual user is the client (a "presenter" in the terminology introduced in Chapter 2) and some computer is a form of server (the "verifier"). A user is required to authenticate his or her identity to a computer, usually as a prerequisite for gaining access to resources (access control or authorization). This is typically an explicit one-way authentication process; that is, the user authenticates himself or herself to the computer. If the user is authenticating to a computer directly (for example, when sitting at a desktop or laptop computer), there is an implicit two-way authentication; the user sees the computer with which he or she is interacting and presumably knows that it is the one he or she wishes to use.1

1. Looks can be deceiving, and even visual inspection of a proximate device is not always sufficient to authenticate it.

However, if the user is authenticating to a computer accessed via a communication network, there is often no way to verify that the computer at the other end of the communication path is the one that the user is trying to contact. The user typically relies on the communication infrastructure operating properly and thus connecting him or her to the intended computer. This assumption may be violated by any of a number of attacks against the communication path, starting with the computer that the user is employing locally. This lack of explicit, secure, two-way authentication can subvert many types of individual authentication mechanisms. If a presenter provides an identifier and authenticator to the wrong verifier, both security and privacy are adversely affected. Thus, two-way authentication is preferred, so that a presenter can verify the identity of the verifier to which a secret may be disclosed.

Initial authentication takes place when an individual first establishes a connection of some sort to a system. This may be a direct, very local connection, such as logging in to a desktop or laptop computer, or it may be a remote connection to a computer via a communication network. In either case, there is an assumption that future communication, for some period of time, is taking place between the two parties who were initially authenticated. For a direct connection, as defined here, this assumption usually relies on physical and procedural security measures; there is an assumption that the user will log out when leaving the computer unattended in a place where others might access it. This is a form of implicit, continuous authentication. This assumption may not always be valid, and sometimes users are required to reauthenticate themselves explicitly to the computer periodically, to verify that they are still present. This periodic reauthentication requirement is an explicit attempt at continuous authentication, although it is not really continuous. Periodic reauthentication is also burdensome for the user and thus not commonly employed.

When the connection between the user and a computer is through a network, there are many more opportunities for the connection to be "hijacked"; that is, for an attacker to inject traffic into the connection or to seize the connection from the legitimate user. In remote-access contexts, it is appropriate to employ explicit measures to ensure continuous authentication. Typically, this continuity is effected using cryptographic means, based on a secret (a cryptographic key) shared, for the life of the connection, between a local computer employed by the user and a remote computer being accessed by the user. In this latter context, the technical term for the security service being provided is "data origin authentication." Continuous authentication is generally a result of a transition from initial, individual authentication to data origin authentication. It is the source (origin) of the data sent between two systems (for example, between a user's desktop and a server) that is being authenticated, rather than the user per se. A further technical distinction is sometimes applied: if the authentication mechanism ensures the timeliness of the communication and thus provides protection against attacks that replay old messages, the service is referred to as "peer-entity authentication."

Individual authentication increasingly takes place in the context of information systems, and thus all of the flavors of authentication described above are relevant to this discussion of individual authentication technologies.
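The transition from initial authentication to continuous, data-origin authentication can be illustrated with a keyed message authentication code over a shared session key. This is a minimal sketch under assumed conventions (the function names and the eight-byte sequence-number format are invented for illustration), not a description of any particular deployed protocol.

```python
# Sketch: data-origin authentication of each message using a session key
# shared at initial authentication. Names and formats are illustrative.
import hashlib
import hmac
import os

session_key = os.urandom(32)  # secret established during initial authentication


def tag_message(key: bytes, seq: int, payload: bytes) -> bytes:
    # Binding a sequence number into the tag is what gives replay
    # protection, i.e., peer-entity rather than bare data-origin
    # authentication.
    data = seq.to_bytes(8, "big") + payload
    return hmac.new(key, data, hashlib.sha256).digest()


def verify_message(key: bytes, seq: int, payload: bytes, tag: bytes) -> bool:
    expected = tag_message(key, seq, payload)
    return hmac.compare_digest(expected, tag)


tag = tag_message(session_key, 1, b"transfer $100")
assert verify_message(session_key, 1, b"transfer $100", tag)
# The same tag replayed at a different point in the stream is rejected:
assert not verify_message(session_key, 2, b"transfer $100", tag)
```

Only the holder of the session key can produce valid tags, so the remote system can attribute each message to its peer for the life of the connection.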
BASIC TYPES OF AUTHENTICATION MECHANISMS

By the mid-1970s, three basic classes of authentication technologies for use with information systems had been identified.2 They are colloquially characterized as "something you know, something you have, and something you are" and were discussed abstractly in Chapter 2. This section focuses on specific technological examples of each of these basic classes. In the first class are authentication technologies based on what an individual can memorize (know). Passwords and personal identification numbers (PINs) are the canonical examples of such technology. In the "something you have" class are physical objects that are (assumed to be) hard to forge or to alter, such as magnetic-stripe cards, smart cards, SecurID cards, and so on. The object is issued to an identified individual and retained by the individual, so that possession of the object serves to identify the individual. In the last class are biometric authentication technologies, which measure physical and behavioral characteristics of an individual. Each of these classes of authentication technologies has advantages and limitations with regard to security, usability, and cost.

2. D.E. Raphael and J.R. Young. Automated Personal Identification. Stanford Research Institute International, 1974; National Bureau of Standards. "Evaluation Techniques for Human Identification." FIPS PUB 48, April 1977.

Something You Know

Simple, password-based authentication is the most common form of initial, one-way authentication used in information systems. A user remembers a short string of characters (typically six to eight) and presents the character string to a system for verification when requested. The string of characters is reused many times in authenticating to the same system; hence the passwords are usually referred to as "static." This sort of system is susceptible to many forms of attack. Because users have trouble choosing and remembering values with a significant number of "random" bits, passwords generally are vulnerable to guessing attacks. Unless passwords are protected (usually this means encrypted) for transmission over communication paths, they are subject to interception and subsequent use by a wiretapper. The lack of two-way authentication means that a user can be tricked into revealing a password if he or she connects to an attacker instead of to the desired system.

Password-based authentication is cheap to implement; it may not require any explicit software purchases. It is easy for users and developers to understand, so training costs are low.
But, on a life-cycle basis, passwords are expensive for an organization to administer, largely because of the costs of help-desk support for users who forget passwords. Users find passwords difficult to manage as they deal with a growing number of systems that require them. This leads to password reuse (that is, using the same password for multiple systems) and insecure storage of passwords (for example, in unencrypted files on computers). Both of these practices undermine the security of password-based systems. In the former case, if one system is compromised and the passwords used in it become known, other systems in which the user employs the same passwords could be compromised. In the latter case, compromise of the user's desktop or laptop or personal digital assistant (PDA) compromises all of the passwords employed by that user to access many other systems.

Many of the common recommendations for improving the security of passwords without changing the fundamental mechanisms involved trade one form of insecurity for another. For example, if users are encouraged to choose passwords that are hard to guess, they will probably have to record the passwords somewhere (because these passwords are not easily remembered), making them vulnerable to attacks against the stored passwords. Users are encouraged to change passwords periodically, which also increases the likelihood of recording the passwords in vulnerable locations.

Passwords are easily shared. A user can tell others his or her password; in many situations this is common and even encouraged as a sign of trust.3 Passwords can also be shared inadvertently, as they are often written down in semipublic places.4 This is not always a serious problem if the threat model focuses primarily on outsiders, but insiders represent threats in many contexts, and users often do not consider this type of threat when sharing passwords or recording them.

In principle, passwords can offer a great deal of anonymity. In practice, however, most people cannot remember many different passwords, and they tend to reuse the same passwords for different purposes. Moreover, if allowed to select an identifier as well as a password, the user may choose to use the same values for multiple systems. This makes it potentially easy to link multiple accounts to the same user across system boundaries, even though the base technology does not necessarily impose such linkages. Additionally, the recovery mechanisms for lost passwords generally require one's mother's maiden name, an e-mail address, or some other form of personal information. Thus, the infrastructure for password maintenance often requires sharing other forms of information that is personal and so presumed to be less likely to be forgotten (see Box 5.1). This, too, potentially undermines privacy.
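Several of the weaknesses above stem from how password secrets are stored by the verifier. A common mitigation, sketched below, is to store only a salted, iterated hash of each password so that a stolen verifier file does not directly reveal the passwords (guessing attacks remain possible, only slower). The function names and parameter choices here are illustrative assumptions, not recommendations from the text.

```python
# Illustrative sketch: salted, iterated password hashing. The stored
# verifier is (salt, digest); the password itself is never stored.
import hashlib
import hmac
import os


def enroll(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # unique per user, defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest


def check(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)


salt, digest = enroll("correct horse")
assert check("correct horse", salt, digest)
assert not check("wrong guess", salt, digest)
```

The iteration count slows each guess; the per-user salt prevents an attacker from amortizing one guessing run across all users in the file.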
Finding 5.1: Static passwords are the most commonly used form of user authentication, but they are also the source of many system security weaknesses, especially because they are often used inappropriately.

Recommendation 5.1: Users should be educated with respect to the weaknesses of static passwords. System designers must consider trade-offs between usability and security when deploying authentication systems that rely on static passwords to ensure that the protections provided are commensurate with the risk and harm from a potential compromise of such an authentication solution. Great care should be taken in the design of systems that rely on static passwords.

3. D. Weirich and M.A. Sasse. "Persuasive Password Security." Proceedings of CHI 2001 Conference on Human Factors in Computing Systems, Seattle, Wash., April 2001.
4. J. Nielsen. "Security and Human Factors." Useit.com's Alertbox, November 26, 2000. Accessed on March 26, 2002, at <http://www.useit.com/alertbox/20001126.html>.

More secure authentication technologies can be based on password technology at some levels. For example, schemes such as encrypted key exchange (EKE)5 and Kerberos (a network authentication protocol)6 also make use of static passwords. These schemes employ sophisticated cryptographic mechanisms and protocols to counter many of the attacks that are effective against static passwords. They typically provide one-way, initial authentication, which may transition to two-way, data-origin and peer-entity authentication for subsequent communication. These are not, per se, password-based authentication technologies. The section "Multifactor Authentication" discusses in more detail authentication protocols of this sort.

5. S. Bellovin and M. Merritt. "Encrypted Key Exchange: Password-Based Protocols Secure Against Dictionary Attacks." Proceedings of the IEEE Symposium on Security and Privacy, Oakland, Calif., May 1992, pp. 72-84. Available online at <http://citeseer.nj.nec.com/bellovin92encrypted.html>.
6. More information on Kerberos is available online at <http://web.mit.edu/kerberos/www/>.

Something You Have

The "something you have" class of authentication technologies is based on the possession of some form of physical token that is presumed to be hard to forge or alter. Many forms of physical tokens are used for authentication and for authorization outside the context of information systems, and they exhibit varying degrees of resistance to forgery and alteration. For example, many driver's licenses and credit cards make use of holograms as a deterrent to forgery, relying on visual verification by a human being when they are presented. Yet credit cards are now used extensively for purchases by mail or telephone or over the Web. In these contexts, there is no visual verification of the credential, so the antitamper security mechanisms are ineffective.

In information systems, the security of hardware tokens, to first order, is usually based on the ability of these devices to store, and maybe make direct use of, one or more secret values. Each of these secrets can be much larger and more random than typical passwords, so physical tokens address some of the vulnerabilities, such as guessability, cited above for passwords. Nonetheless, the simplest forms of tokens share some of the same vulnerabilities as passwords; that is, they both deal with static, secret values.

A magnetic-stripe card is an example of a simple physical authentication token. Tokens of this sort offer the primary benefit of storing larger secrets, but they offer almost no protection if the token is lost or stolen, because readers are readily available and can extract all data (secrets) from the magnetic stripe. After a secret is read from the card (even in the context of a legitimate authentication process), the secret is vulnerable in the same ways that a password is (for example, it can be intercepted if transmitted via an insecure communication channel or compromised while held in storage in a computer7). When a magnetic-stripe card is swiped, all the data can be read from the card and become accessible to malicious software in the system. This possible misuse argues against storing the secrets used to authenticate multiple, distinct identities on one card. (The storage space on cards of this sort also is very limited.) Conversely, requiring a user to carry multiple physical cards to maintain multiple secrets is inconvenient for the user and adds to overall costs. Although magnetic-stripe cards and their readers are not very expensive, computer systems (other than in retail sales contexts) generally do not offer readers as standard equipment, so there are cost barriers to the use of such cards for individual authentication in home or corporate environments. This is a good example of trade-offs among competing goals of security, user convenience, and cost. Sometimes applications for magnetic cards focus on authorization rather than authentication, as in the case of transit fare cards (see Box 5.2).

7. In a well-designed system, a secret read from a card would not be retained in storage on the system for very long, and the vulnerability here would be much less than when passwords are stored in a file.

A not-so-obvious form of "something you have" authentication is a Web cookie. The need for cookies arises because the protocol used for Web access, hypertext transfer protocol (HTTP), is "stateless"; HTTP does not provide a reliable way for a server to know that a particular request is coming from the same user as a previous request.8 Pure stateless operation would make it difficult to provide functions such as browsing through an online catalog and collecting a "shopping cart" full of items that the user has decided to purchase. Also, if the user is browsing through information that needs authentication (such as information about the user's bank account), it would be inconvenient if the user had to type a name and password each time a different page was viewed. The solution to this problem, designed by Netscape, is called a "cookie." A cookie is data given by a Web server to the client to maintain state. Each time a client makes a request to a server, any cookies provided to the client by that server are sent to the server along with the request.

Thus, for example, the identifier and password provided by a user for initial authentication may be transformed into a cookie to facilitate continuous authentication of the HTTP session. Sometimes a cookie is a bigger secret than an individual could remember. It may be like a secret stored in a token; in this case, the token is the user's computer, with all the attendant security problems that arise from storing secrets in a file on a computer and the problems that arise if the secret is transmitted across a communication network without encryption. Sometimes a cookie is used to track an individual's authorization in the context of an HTTP session. In such cases, the cookie itself may be a cryptographically protected value in order to prevent a user from tampering with it and thus fooling the Web server.
The use of cookies is often criticized as a mechanism that violates privacy, but it depends on how they are used. If they are used solely to effect session continuity, overcoming the limitations of HTTP, and if the server does not maintain information about the user, they can be a pri- vacy-neutral or even privacy-enhancing technology. But cookies are sometimes used to track a user's movements through multiple sites. Many sites that do not require authentication will set a cookie on the first visit. This lets the site track return visits by presumably the same user, even though the site operators do not know who that person is in a larger context. Often, this technique is employed by third-party advertising 8The statelessness of the HTTP protocol implies that each request sent for a page is completely independent of any requests that came before. Thus preserving information from one click to the next requires additional technology (D. Kristol and L. Montulli, "HTTP State Management Mechanism," Request for Comments (RFC) 2965~.

AUTHENTICATION TECHNOLOGIES 113 sites; this use of cookies permits tracking users (and their interests) across multiple Web sites. This is a form of covert identification (see Chapter 1~; the user's identity as a Web site visitor and a dossier of his or her activity are compiled and retained, and an identifier in the form of the cookie is assigned. It is not necessary to use cookies to track user activity, however. Even if cookies were banned, it would still be possible to track a user's Web history through other mechanisms such as log files, browser caches, and browser history files.9 Smart cards are credit-card-size tokens that contain memory and, of- ten, a processor. Smart cards that act only as memory devices are essen- tially as vulnerable as magnetic-stripe cards in terms of extracting the secrets stored on the cards, because readers are widely available, and malicious software can extract stored values from the card. The costs for these cards is somewhat higher than those for magnetic-stripe cards, and smart card readers are more expensive as well, but smart storage cards offer more data storage than magnetic-stripe cards do, and they resist wear better.l° Universal Serial Bus (USB) storage tokens are another hardware storage token format. They have a potential advantage in that many PCs offer USB interfaces, thus eliminating reader cost and availabil- ity as barriers to deployment. Tokens that act only as storage devices may be used to provide initial, one-way authentication analogous to static passwords. However, be- cause these devices can hold larger, "more random" secret values (that is, an arbitrary collection of bits as opposed to something meaningful or mnemonic to a person), they can provide somewhat better security. In- creasingly, tokens of this sort are being used to bootstrap continuous data-origin authentication schemes that are implemented using the pro- cessing capabilities of a computer to which the token is (locally) con- nected. 
(Recall that the authentication taking place here is authenticating a local computer to a remote computer, not a person to a remote com- puter.) These schemes are often challenge/response protocols, as de- scribed below. Since these protocols are executed in the computer, not the 9So-called "Web bugs" are another mechanism used to surreptitiously observe an individual's actions online. They are objects, usually one-pixel-square graphic images, em- bedded within the HTML source on a Web site that cause part of the displayed Web page to be retrieved from another Web site, thereby transmitting information about the requester to a third party. Web bugs are used on a surprisingly large number of sites, primarily for statistical purposes and to gauge the effectiveness of advertising. The information trans- mitted to the "bugger" includes an IF address and the last site visited and may be linked to cookies to collect individual Web surfing profiles. Web bugs are also embedded in e-mail messages by shammers, who use them to validate live addresses. 1OThe magnetic stripe can abrade, and the data records on it may be degraded by expo- sure to magnetic fields.

4 WHO GOES THERE? token, they also can make use of secrets stored in the computer rather than on separate hardware storage tokens. The term "software token" has been coined to refer to the use of secrets stored on a computer and employed in conjunction with an authentication protocol. Software tokens are not as secure as hardware storage tokens, since the secrets used by the software are held in files in a computer on a long-term basis. At best, these secrets typically are protected by a password. Thus, any attack against the com- puter that compromises these files allows an attacker to retrieve the stored secrets through password-guessing attacks. In contrast, a well-designed authentication technology that uses a hardware storage token would read the secretes) stored on the token, use them, then erase them from the com- puter memory as quickly as possible. These actions present a smaller win- dow of opportunity for the compromise of the secretes), making the use of hardware storage tokens potentially more secure. The main attraction of software tokens is the low cost; the software may be free or inexpensive, and there is no need to buy token readers. Some of the earliest hardware authentication tokensll and some of the most popular ones employed today, such as SecurID (see Box 5.3), do not interface directly with an authentication system. Instead, the user is required to act as an interface, relaying information between an informa- tion system and the token. Tokens of this sort typically implement a type of authentication known as algorithmic challenge/response, or just chal- lenge/response. Challenge/response schemes operate much like human- enacted authentication scenarios. Most movie goers would recognize the words "Halt! Who goes there?" as the beginning of a challenge/response exchange between a guard and an individual approaching a guarded area. 
The password in such a scenario would usually change daily, consistent with human limitations for adapting to new passwords. In an online authentication technology, the challenge can change every time, making the corresponding response unique in order to thwart eavesdropping attacks.

Challenge/response schemes are a generic technique to prove knowledge of a secret, sometimes even without disclosing it to the party performing the authentication check.12 Challenge/response schemes are analogous to identification friend or foe (IFF) systems originally developed

11 J. Herman, S. Kent, and P. Sevcik, "Personal Authentication System for Access Control to the Defense Data Network," Proceedings of the 15th Annual IEEE Electronics and Aerospace Systems Conference (EASCON), September 1982.

12 Research into a class of algorithms known as "zero-knowledge algorithms" is moving work forward in this area. As a starting point for what this work involves, see S. Goldwasser, S. Micali, and C. Rackoff, "The Knowledge Complexity of Interactive Proof-Systems," in Proceedings of the Seventeenth Annual ACM Symposium on Theory of Computing,

by the military for automated authentication of aircraft by ground personnel operating antiaircraft batteries. Although challenge/response systems for information systems were originally implemented using hardware tokens, software tokens now are employed frequently for this purpose, and there are manual analogs. Imagine a large sheet of paper with many different numbered passwords. The verifier sends the number; the presenter sends back the corresponding password (which illustrates why these systems are sometimes called one-time password schemes). In practice, the verifier sends some string of characters to the presenter (user); the presenter computes a response value based on that string and on a secret known to the user. This response value is checked by the verifier and serves as the "password" for only one transaction or session. As typically employed in a user-to-system authentication exchange, this is an example of a one-way initial authentication scheme, but

New York, ACM Press, 1985; O. Goldreich and H. Krawczyk, "On the Composition of Zero-Knowledge Proof Systems," Proceedings of the 17th International Colloquium on Automata, Languages and Programming (ICALP), Coventry, U.K., July 16-20, 1990; U. Fiege, A. Fiat, and A. Shamir, "Zero Knowledge Proofs of Identity," Proceedings of the Nineteenth Annual ACM Conference on Theory of Computing, New York, ACM Press, 1987; and J. J. Quisquater and L. Guillou, "How to Explain Zero-Knowledge Protocols to Your Children," Advances in Cryptology (Crypto '89), Springer-Verlag, pp. 628-631, 1990.

it is much more secure, relative to a variety of threats, than are static passwords. There are many variations on this scheme: Often a shared secret known to both the presenter and the verifier can be used to generate and verify the response. In schemes based on public key cryptosystems, a presenter may employ a private key to generate a response that the verifier checks using the corresponding public key associated with the presenter. These operations can be carried out using software (such as S/Key13), or the user may employ a hardware token to perform the calculation. (See Box 5.4 on how such a technology might be used.)

Hardware tokens that contain processors (for example, cryptographic processor smart cards, PC cards, some proximity cards,14 or USB processor tokens) are qualitatively different from all of the previous token types. They can be much more secure than hardware storage tokens or software tokens, because they can maintain secret values within the card and never export them (i.e., transmit secrets off the card). A smart card typically performs cryptographic operations in the card, using stored secret values to execute parts of an authentication protocol, such as a challenge/response protocol, on behalf of the cardholder. With a token capable of cryptographic operations, the secrets contained in the token are not exposed as a result of inserting the token into a reader, and no secrets are released to the computers to which the readers are attached or transmitted across a communication path. Typically, a user must enter a PIN to enable a smart card token, and the entry of a wrong PIN value multiple times in succession (logically) disables the token. This provides some protection in case the card is lost or stolen. Nonetheless, capable adversaries with access to sophisticated technology (of the sort that might commonly be found in a college physics lab) can extract secret values stored on smart cards.15

Processor tokens are noticeably more expensive than magnetic-stripe cards or other storage tokens. The cost of readers varies, depending on which token technology is employed. Often a token of this sort is used only for initial one- or two-way authentication, based on the execution of the cryptographic operations of a challenge/response protocol within the token. This provides a secure foundation for two-way, continuous authentication schemes based on this initial exchange. Responsibility for continuous authentication often is borne by the computer to which the smart card is attached, because smart card interfaces are too slow to deal with all the data transmitted or received on a connection. However, the continuous authentication bootstrapped from a smart card offers the opportunity for better security overall.

Hardware tokens have the desirable security property of not being able to be readily replicated by users, although the difficulty of unauthorized replication varies widely, as noted above. In principle, if one user chooses to share his or her token with another, the first user relinquishes the ability to authenticate himself or herself as long as the hardware token is loaned. This guarantee is diminished with some forms of hardware storage tokens and with software tokens, since one can copy the files that personalize them. The extent to which tokens preclude sharing represents a significant improvement over static passwords. A necessary corollary to this observation is that the loss of any token results in a replacement cost, and the replacement process is more difficult than for a password. A help desk cannot remotely replace a token in the way that it can reset a password.

13 See L. Lamport, "Password Authentication with Insecure Communication," Communications of the ACM 24(11) (November 1981):770-772, and Phil Karn's reference implementation of S/Key described in Neil Haller, "The S/Key One-Time Password System," RFC 1760, February 1995; available online at <http://www.faqs.org/rfcs/rfc1760.html>.

14 A proximity card contains information stored electronically within the card. The information is transmitted via radio over a short distance (typically less than 10 centimeters) after the card is queried. Users like these cards because they require very few steps to perform authentication and therefore are quite fast. These cards are vulnerable to physical attacks that extract data from them. They also may be susceptible to interception of the transmitted data (over a short distance) and to spoofing attacks, in which the attacker transmits the same sort of query as a legitimate verifier and records the response. For disabled users, proximity cards may be attractive alternatives to magnetic-stripe cards, since card readers for the latter have instructions that are typically visual, are not always located in positions accessible to those in wheelchairs, and are hard to insert for those whose manual dexterity is poor. For more, see John Gill and J.N. Slater, "Nightmare on Smart Street," Tiresias: International Information on Visual Disability, 2002 (updated), available online at <http://www.tiresias.org/reports/tidecon2.htm>.
Hardware cryptographic tokens and software to- kens entail costs for integration into applications because they execute authentication protocols rather than act just as repositories for secrets. Something You Are The final class of technologies "something you are" refers to the use of biometrics to authenticate individuals. Biometric authentication, which has received much attention in the media of late, is the automatic 15See, for example, R. Anderson and M. Kuhn, "Tamper Resistance A Cautionary Note," The Second USENIX Workshop on Electronic Commerce Proceedings, Oakland, Calif., November 18-21, 1996. 16Typically, two-way authentication relies on the use of public key cryptography and certificates.

identification or identity verification of human individuals on the basis of behavioral and physiological characteristics.17 Biometric authentication is fundamentally different from the other two classes because it does not rely on secrets. Biometrics themselves are not secrets; people commonly leave fingerprints on everything they touch. Our voice, handwriting, and facial image can be captured without our knowledge. Rather, biometric authentication relies on registering and later matching what are believed to be distinguishing physical or behavioral characteristics of individuals. There are many different examples of biometric authentication: it can be based on fingerprints, iris scanning, voice analysis, handwriting dynamics, keystroke dynamics, and so on.

Biometric authentication operates by matching measured physical characteristics of a person against a template or a generating model of these characteristics that was created when the person was registered with an authentication system. The match between a captured biometric and the template is never exact, because of the "noise" associated with the measurement processes, the way the characteristic is presented to the sensor, and changes in the underlying biometric characteristic itself. Thus, these technologies require an administrator to set threshold values and a decision policy that controls how close the match must be and how many attempts to be authenticated the user will be allowed to make.

The scoring aspect of biometrics is a major departure from other classes of individual authentication technologies, which provide a simple, binary determination of whether an authentication attempt was successful.
The scoring aspect of biometric authentication technologies means that they exhibit Type I (false negative) and Type II (false positive) errors. Type I errors run the risk of inconveniencing or even alienating individuals whose authentication attempts are erroneously rejected. Type II errors are security and privacy failures, as they represent authentication decisions that might allow unauthorized access. For any specific biometric technology implementation, there is a trade-off between these two types of errors: Changing the scoring to reduce Type I errors increases Type II errors, and vice versa. Some implementations of biometric authentication technologies exhibit relatively poor trade-offs between these two error types, forcing an administrator to choose between inconveniencing legitimate users while rejecting (appropriately) almost all impostor attempts, or minimizing inconvenience to legitimate users while accepting a higher rate of successful impostors.

17 J. L. Wayman, "Fundamentals of Biometric Authentication Technologies," International Journal of Image and Graphics 1(1), 2001; B. Miller, "Everything You Need to Know About Biometrics," PIN Industry Sourcebook, Warfel and Miller, 1989.
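The trade-off can be made concrete by sweeping the decision threshold over match scores and computing both error rates. This is a toy illustration with invented scores, not data from any real biometric system:

```python
def error_rates(genuine_scores, impostor_scores, threshold):
    """A higher score means a better match; an attempt is accepted
    when its score is at or above the threshold."""
    # Type I (false negative): a genuine user is rejected.
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    # Type II (false positive): an impostor is accepted.
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return frr, far

genuine = [0.91, 0.85, 0.78, 0.88, 0.60]   # scores from legitimate users
impostor = [0.30, 0.45, 0.62, 0.20, 0.55]  # scores from impostor attempts

# Raising the threshold lowers Type II errors but raises Type I errors.
for t in (0.5, 0.7, 0.9):
    frr, far = error_rates(genuine, impostor, t)
    print(f"threshold={t}: Type I (FRR)={frr:.2f}, Type II (FAR)={far:.2f}")
```

An administrator choosing the threshold is choosing a point on exactly this curve.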

Biometric values that are captured for authentication and transmitted to a remote location for verification must be protected in transit. They are vulnerable to interception and replay, just like (static) passwords, unless suitably protected during transmission across communication networks. It also is important to ensure that the transmitted value represents a legitimate, digitized sample of a user biometric. Otherwise, an attacker might inject a string of bits that purports to be a biometric sample in an effort to subvert the system. Typically (though not always), the range of possible values for a biometric sample is so large that guessing is not a viable means of attack. But since biometric values are not secrets per se, it is conceivable that an attacker has gained access to a user's fingerprint, for example, and has digitized it in an effort to masquerade as the user.

Moreover, if unencrypted (or weakly encrypted) biometric templates are stored in centralized authentication servers, an attack against one of these servers could result in the disclosure of the templates for all the users registered with the compromised server. With access to the templates and knowledge of the scoring algorithm, an attacker could engage in off-line analysis to synthesize bit strings that would pass as legitimate biometric samples for specific users. Today biometric authentication systems are not widely deployed, and there are many implementation variants for the same type of biometric (for example, a plethora of fingerprint systems). However, if biometric authentication were to become widely deployed and if there were significant consolidation and standardization in the industry (resulting in fewer variants), the compromise of an authentication server could have a very significant impact owing to the special characteristics of biometrics: namely, that they are not secret and cannot be easily modified.
As with hardware tokens, the deployment of biometric authentication sensors entails hardware-acquisition costs, although the cost here is typically for each access point of a system rather than for each user of the system. Sensors for biometric authentication have been expensive, and this has been a barrier to adoption. However, the cost of some biometric sensors, especially fingerprint-scanning sensors, has declined, making them affordable for use in individual computers and laptops. Although all biometric measures change over time, an individual cannot forget his or her biometric values, unlike passwords and PINs, nor can they be lost, like hardware tokens. Thus, life-cycle costs can, in principle, be lower for biometric authentication technologies, primarily because of reduced help-desk costs. However, in practice, most biometric authentication systems require the use of a password or PIN to improve security, and this eliminates the cost advantage that would have accrued from fewer help-desk calls. In fact, one can improve the performance of biometric authentication systems in some contexts by offering an alternative authentication

mechanism for some individuals who are not compatible with a specific technology. (For example, some percentage of the general population has fingerprints that are not well recognized by fingerprint scanners.)

Finding 5.2: Biometric user-authentication technologies hold the promise of improved user convenience. Vendors of these technologies also promise reduced system management costs, but this has yet to be demonstrated in practice. Moreover, these technologies can pose serious privacy and security concerns if employed in systems that make use of servers to compare biometric samples against stored templates (as is the case in many large-scale systems). Their use in very local contexts (for example, to control access to a laptop or smart card) generally poses fewer security and privacy concerns.

Recommendation 5.2: Biometric technologies should not be used to authenticate users via remote authentication servers because of the potential for large-scale privacy and security compromises in the event of a successful attack (either internal or external) against such servers. The use of biometrics for local authentication (for example, to control access to a private key on a smart card) is a more appropriate type of use for biometrics.

Biometric authentication offers only one-way initial authentication. As noted above, biometric authentication does not provide direct protection for secrets, so it does not provide a basis for bootstrapping from initial to continuous authentication, nor does it support two-way authentication, unlike many of the "something you have" technologies described above. Thus, biometric authentication is not an appropriate replacement for other authentication technologies, specifically for cryptographic technologies used to provide two-way initial authentication and continuous authentication. Box 5.5 provides some commonsense guidelines for the uses of biometric authentication systems.
MULTIFACTOR AUTHENTICATION

It is often asserted that individual authentication can be improved by employing multiple "factors." Generally this translates into using authentication technologies from two of the classes described above. Examples include a PIN plus a hardware token (something you know and something you have) or a PIN and a biometric (something you know and something you are). There is a reasonable basis for this strategy, but it is

not foolproof. The assumption underlying the perceived security benefits of multifactor authentication is that the failure modes for different factors are largely independent. So, for example, a hardware token might be lost or stolen, but the PIN required for use with the token would not be lost or stolen at the same time. This assumption is not always true, however. For example, a PIN attached to a hardware token is compromised at the same time that the token is lost or stolen. If a fingerprint is used to activate a hardware token, is it not likely that copies of the fingerprint will appear on the token itself? As noted earlier, one cannot evaluate the relative security of mechanisms without reference to a threat model, and some threat models undermine the perceived security of multifactor authentication. Nonetheless, multifactor authentication can improve the security of authentication under many circumstances.

CENTRALIZED VERSUS DECENTRALIZED AUTHENTICATION SYSTEMS

A crucial issue for authentication technologies is whether they are inherently centralized or decentralized. This distinction affects both their deployability and their privacy implications.

Some technologies require little or no infrastructure; any system can make use of such technologies without relying on additional systems for support. This is one of the major drivers of the use of static passwords: They are extremely easy to set up in a highly localized fashion. An application can create and maintain its own password database with about the same effort as that needed for maintaining a database of authorized users. (In fact, doing so properly is rather more complex, but there are numerous poorly implemented password systems.) Some public key authentication technologies have similar decentralized properties. Some challenge/response protocols are designed for local use and require minimal infrastructure. The one-time passwords18 and Secure Shell (SSH)19 protocols are good examples. The latter makes use of public key cryptography but not a public key infrastructure (described later in this section). Both arguably provide a more secure authentication capability than passwords, but they are still intended for use in local contexts, such as a single computer or at most a single organization.

Some types of authentication technologies require some degree of centralization, for example, to help amortize the costs associated with deployment to gain security benefits. Kerberos and public key infrastructure (PKI) are good examples of such systems. In Kerberos, the key distribution center (KDC) is a centralized infrastructure component that stores the passwords of all users, preventing them from having to be shared with each system to which a user connects.
The KDC is aware of all sites with which the user interacts, because the KDC is invoked the first time that a user establishes a connection in any given log-in session. The content sent over the connections is not necessarily revealed; nevertheless, the central site operates as the verifier, acting as an intermediary between the user (presenter) and the sites that rely on Kerberos for authentication. The scope of a Kerberos system deployment is typically limited, which in practice mitigates some of the privacy concerns. Although Kerberos systems can be interconnected, most usage of Kerberos is within an individual organization. When cross-realm authentication is employed, only those transactions that involve multiple realms are known

18 See RFC 1760 at <http://www.faqs.org/rfcs/rfc1760.html>.

19 See <http://www.ietf.org/internet-drafts/draft-ietf-secsh-userauth-16.txt>.

outside a user's home realm.20 This limits the adverse privacy aspects of using such a system. However, if a single Kerberos realm was used to authenticate individuals to systems across organizational boundaries, the privacy implications would be much worse. Thus, the same technology can be used in different contexts with vastly different privacy implications. For more information about Kerberos, see Figure 5.1.

The Passport and Liberty systems, though very different in detail, are centralized systems designed expressly to authenticate large user populations to a wide range of disparate systems, with attendant privacy implications. While their designs differ slightly, both offer users the same basic feature: the convenience of single sign-on to a variety of Web services. From a privacy perspective, the obvious drawback to centralized authentication systems is that all Web clients cannot be expected to trust the same authentication service with what could be personally identifying information. Passport and Liberty both address this fundamental obstacle by allowing what they call a federated topology. "Federated," in this context, means that peer authentication services can interoperate with different subsets of service providers. For example, a car rental company could rely on four different airlines' authentication services. Theoretically, a single user could navigate seamlessly between multiply affiliated sites after authenticating only once. Even in their federated form, however, there are two types of privacy risk inherent in these single sign-on systems: exposure of what we call identity data (the set of all information associated with an individual within this identity system) by the authentication service and the aggregation of an entity's (or his or her identifier's) downstream behavior.
While the committee did not undertake a detailed analysis of these two systems (one of which is proprietary and one of which has a specification developed and licensed by a private consortium), as with any authentication system the privacy implications will ultimately depend on choices made at the design, implementation, and use stages.21 Detailed analysis of a particular product is beyond the scope of this report.

Public key infrastructure has often been touted as a universal authentication technology, one that might have national or even global scope.

20 A Kerberos "realm" is a local administrative domain. Typically an organization will have its own realm. Different realms can be configured to interoperate with each other, but this requires explicit action from each organization.

21 Recently, the FTC stepped in (at the request of privacy advocates) to assure that Microsoft's security and privacy policy is correctly represented to consumers. Many observers commented that this move should be considered a warning that governments internationally will scrutinize how centralized authentication services collect, protect, and use consumer data.

FIGURE 5.1 Kerberos:

1. User provides a principal (user name) and password to the client system.
2. Client queries the Initial Ticket Service of the Kerberos key distribution center (KDC) for a ticket-granting ticket (TGT), which will allow the client to request tickets for specific services later on. The client's request includes a derivative of the user's password, which the Initial Ticket Service verifies.
3. The KDC's Initial Ticket Service provides the client with a dual-encrypted initial TGT containing a log-in session key. The client system converts the user's password into an encryption key and attempts to decrypt the TGT.
4. The client uses the TGT and the log-in session key to request tickets to specific services from the KDC's Ticket-Granting Service.
5. The Ticket-Granting Service decrypts the TGT with its own key, and then decrypts the service request using the TGT's session key. If decryption is successful on both counts, the Ticket-Granting Service accepts the user's authentication and returns a service ticket and a service-session key (encrypted with the log-in session key) for the targeted service. This result can be cached and reused by the client.
6. The client uses the log-in session key provided in step 3 to decrypt the service ticket, gaining access to the service-session key. This key is then used to request access to the target service. This request is accompanied by an encrypted time stamp as an authenticator.
7. Access to the target service is granted.

Steps 4 through 7 can be repeated when access to other services is needed; service messages can be encrypted with the service-session key. A time limit is built into the log-in session in steps 3 and 5; the user will need to enter the password again when the log-in session has timed out.
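The first three steps of Figure 5.1 can be caricatured in a few lines of code. Everything here is invented for illustration: the toy XOR-keystream cipher, the key sizes, and the message layout. Real Kerberos uses proper authenticated encryption and a much richer message format:

```python
import hashlib
import os

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy XOR-keystream 'cipher' for illustration ONLY; real Kerberos
    uses a proper authenticated cipher such as AES."""
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR keystream is its own inverse

# Steps 1-2: the user supplies a password; a key is derived from it.
user_key = hashlib.sha256(b"user password").digest()
kdc_key = os.urandom(32)          # known only to the KDC

# Step 3: the KDC mints a log-in session key and returns it twice:
# once sealed under its own key (the TGT, opaque to the client) and
# once sealed under the user's password-derived key.
login_session_key = os.urandom(32)
tgt = toy_encrypt(kdc_key, b"user@REALM|" + login_session_key)
for_client = toy_encrypt(user_key, login_session_key)

# The client recovers the session key only if the password was right.
recovered = toy_decrypt(user_key, for_client)
assert recovered == login_session_key
# Later (step 5), only the KDC can open the TGT:
assert toy_decrypt(kdc_key, tgt).endswith(login_session_key)
```

The point the figure makes is visible here: the user's long-term password never crosses the network, and the services never see it at all; they see only session keys brokered by the KDC.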

Certainly a very large-scale PKI would have very serious privacy implications, as it might provide a single, uniform identifier that an individual would employ in transactions with many different organizations. (See Box 5.6 for a brief description of public key cryptography.) Since each public key certificate carries a clearly visible identifier for the person represented by the certificate, it is easy to link different uses of the same certificate to that person's identity.

The General Services Administration's Access Certificates for Electronic Services (ACES) program, described more fully in Chapter 6 of this report,22 has this flavor for citizen interactions with the U.S. government. In Japan, plans call for the creation of a national-level PKI that would be used not only for individual interactions with the government but also for a wide range of private sector interactions. VeriSign and other so-called trusted third party (TTP) certificate authorities (CAs) in both the United States and Europe promote the notion of using a single public key certificate as the universal personal authenticator for a wide range of transactions.

For example, if citizens were issued a single "interact with the government" public key certificate, it might be relatively easy to determine if, say, the individual who had a reservation to visit Yosemite National Park was the same person who had sought treatment in a Department of Veterans Affairs (VA) hospital for a sexually transmitted disease. By contrast, if the VA and the National Park Service each issued their own certificates, or if they relied on some other decentralized authentication mechanism, such linkage would be harder to establish.
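Box 5.6 describes public key cryptography in general terms; the asymmetry it relies on can be shown with a deliberately tiny RSA example. The primes here are far too small for real use and are chosen only so the arithmetic is visible:

```python
# Tiny RSA key pair: n = p*q, with e*d ≡ 1 (mod lcm(p-1, q-1)).
p, q = 61, 53
n = p * q            # 3233, the public modulus
e = 17               # public exponent
d = 413              # private exponent: 17 * 413 = 7021 ≡ 1 (mod 780)

def sign(message_digest: int, priv: int) -> int:
    """Only the holder of the private exponent d can produce this value."""
    return pow(message_digest, priv, n)

def verify(message_digest: int, signature: int, pub: int) -> bool:
    """Anyone holding the public key (n, e) can check the signature."""
    return pow(signature, pub, n) == message_digest

digest = 1234                      # stand-in for a hash of the challenge
sig = sign(digest, d)
assert verify(digest, sig, e)      # verifier accepts
assert not verify(digest + 1, sig, e)  # any change is detected
```

This is the asymmetry the chapter trades on: checking a response requires only the public key, so the verifier holds nothing whose disclosure would let an attacker impersonate the user.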
Thus, it is not the use of PKI per se (except as it is an authentication system, with all of the privacy implications intrinsic to authentication itself; see Chapters 2, 3, and 7 in this report) but rather the scope of the PKI that influences the privacy of the authentication system.

PKI technology does not intrinsically require large scale or use across multiple domains in order to be useful or cost-effective to deploy. This report has already argued that individuals typically have multiple identities and that most identities are meaningful only in limited contexts, which suggests that many PKIs could arise, each issuing certificates to individuals in a limited context, with an identifier that is meaningful only in that context.23 PKIs of this sort can be privacy-preserving, in contrast to very large-scale PKIs. Proposals have been made to use PKIs in a highly decentralized fashion24,25 that supports this notion of multiple identities for an individual and thus supports privacy. However, multiple PKIs might impose burdens on users, who would be required to manage the multitude of certificates that would result. In a sense, this is not too different from the common, current situation in which an individual may hold many physical credentials and has to manage their use. If individuals are going to accept and make use of a multitude of PKIs, software needs to provide a user interface that minimizes the burden on users.

22 See <http://www.gsa.gov/aces/>.

23 For another view of PKI, digital certificates, and privacy, see Stefan Brands, Rethinking Public Key Infrastructures and Digital Certificates: Building in Privacy, Cambridge, Mass., MIT Press, 2000.

24 S. Kent, "How Many Certification Authorities Are Enough?" Proceedings of MILCOM (unclassified papers) 97(1) (November 1997):61-68.

25 S. Kent, "Security Issues in PKI and Certification Authority Design," Advanced Security Technologies in Networking, NATO Science Series, Burke, Va., IOS Press, pp. 33-52, 2001.

Finding 5.3: Public certificate authorities and trusted third parties present significant potential privacy and security concerns.

Finding 5.4: Public key infrastructures have a reputation for being difficult to use and hard to deploy. Current products do little to dispel this notion.

Finding 5.5: Many of the problems that appear to be intrinsic to public key infrastructures (as opposed to specific public key infrastructure products) seem to derive from the scope of the public key infrastructures.

Recommendation 5.3: Public key infrastructures should be limited in scope in order to simplify their deployment and to limit adverse privacy effects. Software such as browsers should provide better support for private (versus public) certificate authorities and for the use of private keys and certificates among multiple computers associated with the same user to facilitate the use of private certificate authorities.

This analysis suggests that authentication technologies that imply some degree of centralization can be operated over a range of scales with vastly differing privacy implications. Thus, neither Kerberos nor PKI intrinsically undermines privacy (beyond the fact that they are authentication systems and as such can affect privacy), although each could be used in a way that would do so. In general, decentralized systems tend to be more preserving of privacy: No single party has access to more than its own transaction records. An individual may use the same password for two different Web sites; for a third party to verify this, the party would need at least the cooperation of both sites and (depending on the precise password storage technology being used) perhaps special-purpose monitoring software on both sites. But if users employ the same identifiers at each site, the potential for privacy violations is significantly increased. This same observation applies to any form of decentralized authentication system.
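One way to obtain distinct per-site identifiers cheaply is to derive them deterministically from a single master secret, so the user keeps one secret while each site sees an unrelated name. The sketch below illustrates the general idea of pairwise, context-specific identifiers; the names and parameters are our own, not a description of any deployed scheme:

```python
import hashlib
import hmac

MASTER_SECRET = b"users-long-term-secret"  # held only by the user (or the user's device)

def identifier_for(context: str) -> str:
    """Derive a stable, context-specific identifier. Different contexts
    yield unrelated identifiers, so sites cannot correlate them by
    inspecting the identifiers alone."""
    tag = hmac.new(MASTER_SECRET, context.encode(), hashlib.sha256).hexdigest()
    return tag[:16]

id_parks = identifier_for("parks.example.gov")
id_health = identifier_for("hospital.example.org")

assert id_parks != id_health                             # not linkable by inspection
assert id_parks == identifier_for("parks.example.gov")  # but stable per context
```

Because the derivation is one-way, even two colluding sites cannot work backward from their respective identifiers to the master secret or to each other's identifier.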
An essential requirement for preserving privacy in authentication systems is allowing an individual to employ a different identifier when he or she asserts a different identity, for example, in different organizational contexts. The use of different identifiers makes it harder to correlate the individual's activities across systems, which helps preserve privacy. This goal can be achieved with technologies ranging from passwords to PKIs. Also, if each system collects less personal information

on its users (only what is required to satisfy the requirements of that system), this, too, is privacy-preserving.

Finding 5.6: Core authentication technologies are generally more neutral with respect to privacy than is usually believed. How these technologies are designed, developed, and deployed in systems is what most critically determines their privacy implications.

SECURITY CONSIDERATIONS FOR INDIVIDUAL AUTHENTICATION TECHNOLOGIES

Authentication technologies are often characterized by the security that they offer, specifically in terms of resistance to various types of attack. Many authentication technologies rely on the use of secret values such as passwords, PINs, cryptographic keys, and so on. Secrets may be vulnerable to guessing attacks if they are selected from a set of values that is too small or predictable. Passwords, when selected by individuals and not subject to screening, often exhibit this vulnerability.26 Secrets also may be compromised by computational attacks, even when the secrets are chosen from large sets of values. For example, a large, randomly chosen cryptographic key would generally be immune to guessing attacks. But this key could be used in an authentication protocol in a manner that permits an attacker to perform computations that reveal the value of the key.

If secrets are transmitted across a communication network, from presenter to verifier, as part of an authentication process, they are vulnerable to interception unless otherwise protected (e.g., by encryption). An encrypted communication path is often necessary, but it is not sufficient to protect secrets that are being transmitted. An attacker might masquerade as a system that a user wants to access and thus trick the user into revealing an authentication secret, even though the secret was encrypted en route.27 A secret need not be transmitted across a network to be subject to attacks of this sort.

26. In this context, the user (presenter) also is acting as the issuer, and as an issuer is doing a poor job.

27. This is an example of why two-way authentication is important. A form of this attack sometimes takes place when a user employs Secure Sockets Layer to encrypt communication between a browser and a Web server. The user may reveal credit card account information (account number, expiration date, shipping address, and so on) to a sham merchant, who then can use this information to carry out unauthorized transactions.

Several years ago, thieves installed a fake ATM in a shopping mall.28 Unsuspecting individuals inserted ATM cards and entered PINs, which were collected by the thieves and used to make unauthorized withdrawals from the users' accounts. Thus, even physical proximity and an ability to see a verifier do not ensure that it is the device it appears to be and one to which authentication information should be presented!

Secret values that are too big for individuals to remember must be stored. The way in which the secrets are stored may make them vulnerable. For example, passwords written on a note stuck on a monitor in the workplace may be observed by other employees, custodial staff, or even visitors. Secret values stored in a file on a computer can be compromised by a wide variety of attacks against the computer, ranging from physical theft to network intrusions. Even secret values stored in hardware dedicated to authentication can be extracted illicitly, with varying degrees of difficulty, depending on the technology used to store the secrets.

Often there is a requirement to prevent individuals from sharing authentication data in support of individual accountability. If authentication data are known to individuals or are easily extracted from storage, then individuals may voluntarily make copies and thus circumvent this system goal (see Chapter 4). Even when secrets are stored in physical tokens, the tokens may be loaned to others, in violation of procedural aspects of a security policy.

Sometimes authentication is based not on the possession of a secret value but on the possession of a physical item that is presumed to be resistant to tampering and forgery. An authentication system may be attacked successfully if the assumptions about its tamper- or forgery-resistance prove to be false. In many cases, the security of the credential is derived from the integrity of the data associated with the credential, rather than from the physical characteristics of the credential.
For example, a physical credential might contain digitally signed data attesting to a name and employee ID number. Verification of the credential, and thus authentication of the individual possessing the credential, would be based on successful validation of the digital signature associated with the data. Careful use of public key cryptography can make the digital signature highly secure, protecting against modification of the signed data or creation of new, fake signed data. However, it may be quite feasible to copy the data to additional physical credentials. These duplicate credentials represent a form of forgery. Unless the signed data are linked directly to the holder of the credential (for example, by means of biometrics), this sort of forgery by duplication is a security concern.

Biometric authentication also relies on the possession of a physical item that is presumed to be resistant to tampering and forgery, namely some measurable part of an individual's body or behavior. Examples include fingerprints, voiceprints, hand geometry, iris patterns, and so on. Biometric values are not secrets; we leave fingerprints on many items that we touch, our voices and facial images may be recorded, and so on.29 Thus, the security of biometric authentication systems relies extensively on the integrity of the process used to capture the biometric values and on the initial, accurate binding of those values to an identifier. It is critical that later instances of the biometric capture process ensure that it is a real person whose biometric features are being captured; this may mean requiring biometric sensors to be continuously monitored by humans. Biometric authentication systems may be fooled by fake body parts or photographs created to mimic the body parts of real individuals.30 They also may be attacked by capturing the digitized representation of a biometric feature for an individual and injecting it into the system, claiming that the data are a real scan of some biometric feature.

28. In 1993 in Connecticut, a fraudulent ATM was installed in a shopping center. See the RISKS digest for more details; available online at <http://catless.ncl.ac.uk/Risks/14.60.html#subj3>.

29. "The physical characteristics of a person's voice, its tone and manner, as opposed to the content of a specific conversation, are constantly exposed to the public. Like a man's facial characteristics, or handwriting, his voice is repeatedly produced for others to hear. No person can have a reasonable expectation that others will not know the sound of his voice, any more than he can reasonably expect that his face will be a mystery to the world." Justice Potter Stewart for the majority in U.S. v. Dionisio, 410 U.S. 1 (1973).

30. See T. Matsumoto, H. Matsumoto, K. Yamada, and S. Hoshino, "Impact of Artificial Gummy Fingers on Fingerprint Systems," Proceedings of the International Society for Optical Engineering (SPIE) 4677 (January 2002), available online at <http://research.nii.ac.jp/kaken-johogaku/reports/H13_overview/A04-00-1.pdf>; L. Thalheim, J. Krissler, and P. Ziegler, "Biometric Access Protection Devices and Their Programs Put to the Test," c't Magazine 11 (May 21, 2002):114, available online at <http://www.heise.de/ct/english/02/11/114>; T. van der Putte and J. Keuning, "Biometrical Fingerprint Recognition: Don't Get Your Fingers Burned," Proceedings of the IFIP TC8/WG8.8 Fourth Working Conference on Smart Card Research and Advanced Applications, Kluwer Academic Publishers, Dordrecht, The Netherlands, 2000, pp. 289-303, available online at <http://www.keuning.com/biometry/Biometrical_Fingerprint_Recognition.pdf>; and D. Blackburn, M. Bone, P. Grother, and J. Phillips, Facial Recognition Vendor Test 2000: Evaluation Report, U.S. Department of Defense, January 2001, available online at <www.frvt.org>.

The preceding analysis of the security vulnerabilities of classes of authentication technologies, while accurate, does not determine whether any of these technologies is suitable for use in any specific context. Instead, a candidate technology must be evaluated relative to a perceived threat in order to determine whether the technology is adequately secure. Nonetheless, it is important to understand these vulnerabilities

when evaluating the security characteristics of individual authentication technologies.

COST CONSIDERATIONS FOR INDIVIDUAL AUTHENTICATION TECHNOLOGIES

Costs are an important factor in the selection of authentication technologies. These costs take many forms. Capital costs are associated with the acquisition of any hardware or software needed for an authentication technology. The hardware and software costs may be a function of the number of individuals being authenticated, or of the number of points at which authentication takes place, or both. For example, an authentication system that makes use of hardware tokens has a per-user cost, since each user must have his or her own token, and each device that will authenticate the user (for example, each desktop or laptop computer) must be equipped with a reader for the token. A biometric authentication system might typically require readers at each point where individuals are authenticated, and there would be a per-device, not a per-person, cost. A software-based authentication system may impose costs only for each computer, not each individual, although licensing terms dictated by a vendor might translate into per-user costs as well.

Many authentication systems also make use of some common infrastructure, which also has associated hardware and software acquisition costs. The infrastructure may be offline and infrequently used, or it may be online and require constant availability. In the online case, it may be necessary to acquire replicated components of the infrastructure, to geographically disperse these components, and to arrange for uninterruptible power supplies, in order to ensure high availability. The key distribution center component of a Kerberos system (see Box 5.7) and the ACE/Server used by the SecurID system are examples of the latter sort of infrastructure. A certificate authority in a PKI is an example of the offline type of infrastructure component.
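The case for replicating online infrastructure components can be made concrete with a back-of-the-envelope calculation (a hypothetical sketch; the figures are illustrative and not drawn from this report). If replicas fail independently, the service is unavailable only when every replica is down at once:

```python
def system_availability(single_node_availability: float, replicas: int) -> float:
    """Availability of a service that stays up as long as at least one
    replica is up, assuming replicas fail independently."""
    return 1.0 - (1.0 - single_node_availability) ** replicas

# Illustrative figures: a single verification server that is up 99% of the time.
for n in (1, 2, 3):
    print(f"{n} replica(s): {system_availability(0.99, n):.6f}")
```

Under these assumptions, two replicas already push availability from 99 percent to 99.99 percent, which is why key distribution centers and similar always-on components are typically deployed in replicated, geographically dispersed configurations.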
Operation of an authentication system involves labor costs of various types. Help desks must be manned to respond to users' questions and problems. If the system relies on secret values that users are required to remember, the help desk will have to interact with users to reset forgotten secret values. If the system makes use of hardware tokens, provisions will have to be made to replace lost or stolen tokens. Users and system administrators must be trained to work with an authentication technology and with that technology's interaction with varying operating systems and applications. Application developers must learn how to make use of an authentication technology and to integrate it into their applications.
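One rough way to compare these cost structures is to total one-time capital outlays and recurring labor costs over the system's expected life (a hypothetical model; all parameter names and dollar figures below are illustrative assumptions, not figures from this report):

```python
def total_cost(users: int, devices: int, years: int,
               per_user_capital: float, per_device_capital: float,
               infrastructure_capital: float, annual_support: float) -> float:
    """Lifetime cost: one-time capital outlays plus recurring support labor."""
    capital = (users * per_user_capital
               + devices * per_device_capital
               + infrastructure_capital)
    return capital + years * annual_support

# Hypothetical comparison over five years for 1,000 users and 200 devices:
# hardware tokens (per-user cost dominates) versus biometric readers
# (per-device cost dominates, but lower password-reset load at the help desk).
tokens = total_cost(1000, 200, 5, per_user_capital=50, per_device_capital=20,
                    infrastructure_capital=30_000, annual_support=40_000)
biometric = total_cost(1000, 200, 5, per_user_capital=0, per_device_capital=400,
                       infrastructure_capital=30_000, annual_support=25_000)
print(tokens, biometric)  # prints 284000.0 235000.0
```

The point of the sketch is not the particular numbers but the structure: which term dominates depends on the ratio of users to devices and on how much recurring support the technology demands.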

This brief discussion illustrates how complex it can be to evaluate the cost of an individual authentication system. Initial capital outlays are greater for some types of systems; ongoing costs of other types of systems may eventually outweigh these capital outlays. Different contexts merit different levels of security and will tolerate different costs for authentication technology. Thus, there can be no single right answer to the question of how much authentication technology should cost.

CONCLUDING REMARKS

The preceding chapters describe three different conceptual types of authentication (identity, attribute, and individual), and this chapter focuses on the technologies that go into building an authentication system and some of the technology-related decisions that must be made. Some of these decisions will bear on the privacy implications of the overall system. In general, decentralized systems tend to be more preserving of privacy, but the core authentication technologies that make up authentication systems tend to be privacy-neutral. What matters most in terms of privacy are design, implementation, and policy choices, as described elsewhere in this report.
