

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




7
A Toolkit for Privacy in the Context of Authentication

The preceding chapters provide an in-depth look at authentication, abstractly and with respect to particular technologies, as well as an overview of privacy and a look at government-specific issues related to authentication and privacy. This concluding chapter provides a toolkit that can aid in designing an authentication system that is sensitive to privacy concerns. It focuses on the three types of authentication identified in Chapter 1:

Individual authentication is the process of establishing an understood level of confidence that an identifier refers to a specific individual.

Identity authentication is the process of establishing an understood level of confidence that an identifier refers to an identity. The authenticated identity may or may not be linkable to an individual.

Attribute authentication is the process of establishing an understood level of confidence that an attribute applies to a specific individual.

Authentication systems using one or more of these techniques are generally deployed to meet one of two goals:

Limiting access. Authentication may be used to limit who enters or accesses a given area/resource and/or to control what they do once granted entrance or access.

Monitoring. Authentication may be used to enable monitoring of system use. This may occur regardless of whether decisions about access to or use of resources are being made on the basis of authentication. Such authentication is conducted to support real-time and retrospective uses, including audits to assess liability or blame, to mete out rewards and praise, to provide for accountability, or to make behavior-based decisions such as those made by marketers.

Privacy issues arise in all systems that exercise control over access or monitor behavior, regardless of the method of authentication used. As described in Chapter 3, the decision to authenticate, whatever the reason, may affect decisional privacy, bodily integrity privacy, information privacy, and communications privacy. As noted earlier, affecting privacy is not always equivalent to violating privacy. Without delving into the normative decisions surrounding what is an appropriate level of sensitivity to privacy, this chapter describes how choices made at the outset of system design and deployment can have baseline privacy implications that should be taken into account.

The choice of attribute, identity, or individual authentication is a substantial determinant of how large an effect on privacy the authentication system will have. However, for cases in which the resource to be protected is itself private information or something else to which access must be controlled in order to protect privacy, a privacy-invasive authentication system may be necessary and appropriate. Such an authentication system also may be warranted in other contexts for other reasons. Thus, examining the privacy consequences of authentication technology is best done in tandem with evaluating the nature of the resource that the authentication system is deployed to protect.

As mentioned above, access control can be supported by proving that one is allowed to do something or by proving that one is not on a list of those prohibited from doing something.
This proof can be provided using attribute, identity, or individual authentication methods. For example, sensitive areas in a workplace are frequently limited to those individuals who can prove that they are on the list of those permitted access. This proof may come in the form of an attribute authentication system (the employee has a property that permits access), an identity authentication system (the identification number given the employee permits access), or an individual authentication system (the individuals on this list are permitted access). In contrast, it is common for bars and nightclubs to have rules about those individuals who may not enter. When individuals present themselves for entry, those who possess certain traits (such as being underage or being someone who is not welcome by the owner) may not enter. The under-21 criterion uses an attribute authentication system, and a driver's license or other age-verification documents are used to make age-based decisions about entry. An individual authentication system, albeit usually a low-tech one, is used to prohibit from entering those whom the owner has indicated are not welcome.

An attribute authentication system deployed in either of the contexts described above (employment or bar) need not maintain a database. In each situation, the decision to permit or deny entry is based on an attribute that, as far as the authentication system is concerned, any individual may possess. In contrast, an identity or individual authentication system in these two contexts potentially is implemented quite differently.1 In the employment context, the identity or individual authentication system must contain a record on everyone who is allowed to enter. In the bar context, the identity or individual authentication system will only contain information on those who cannot enter (such as the list of people the owner has indicated are unwelcome and the fact that those under 21 are not allowed in). An important consequence flows from this limitation. In the employment scenario, the system easily allows for additional controls over the individual once he or she enters the building, and it potentially supports monitoring of those within the system even where such monitoring is unrelated to decisions about access to or use of a resource. In the bar scenario, on the other hand, the system put in place generally will not provide any means for controlling or monitoring those who enter based on the way they authenticated themselves. An authentication system designed to limit the access of a specific group of individuals has no further privacy consequences for those not on the initial list if the system is designed so as to limit its function to its goal.
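The structural contrast between these deployments can be sketched in code. This is a minimal illustration of the bar and workplace examples; the names, ages, and data structures are hypothetical, not drawn from the report.

```python
from datetime import date

MINIMUM_AGE = 21  # hypothetical entry rule for the bar example

def attribute_check(date_of_birth: date) -> bool:
    """Attribute authentication: any individual may satisfy the
    predicate, and no per-person database is consulted or kept."""
    today = date.today()
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day))
    return age >= MINIMUM_AGE

# Individual authentication requires records about specific people.
BANNED_PATRONS = {"P. Troublemaker"}          # bar: who may NOT enter
AUTHORIZED_EMPLOYEES = {"E-1001", "E-1002"}   # workplace: who MAY enter

def bar_entry(name: str, date_of_birth: date) -> bool:
    # The bar holds data only on those it excludes.
    return attribute_check(date_of_birth) and name not in BANNED_PATRONS

def workplace_entry(employee_id: str) -> bool:
    # The workplace must hold a record on everyone it admits,
    # which is what later enables monitoring of those inside.
    return employee_id in AUTHORIZED_EMPLOYEES
```

The asymmetry in what each list must contain mirrors the asymmetry in monitoring potential described above.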
These examples illustrate that the privacy implications of authentication systems stem from implementation and system design choices and not necessarily from the reasons for which the authentication system is needed or the form of authentication technology employed. In the next section, a detailed toolkit is presented for thinking through how different choices in the design of an authentication system can have an impact on privacy.

PRIVACY-IMPACT TOOLKIT

The choice among an attribute authentication system, an identity authentication system, and an individual authentication system bears substantially on the privacy consequences of the system. Viewed independently of the resource they are designed to protect, attribute authentication systems present the fewest privacy problems and individual authentication systems the most. Nevertheless, in some instances individual authentication systems may be appropriate for privacy, security, or other reasons.

1 In considering the distinction between identity and attribute authentication, note that identity authentication, which assumes the existence of an identity and a unique identifier, allows for the creation of longitudinal records. In attribute authentication, if the attribute chosen is sufficiently distinctive it is functionally equivalent to an identity authentication system, in which case the attribute may be more accurately labeled an identifier, thereby eroding the protections that might otherwise be provided by an attribute authentication system.

Separate from the type of authentication, the overall scope of the system will have obvious implications for user privacy. To limit effects on the privacy of users, systems should collect information on the fewest individuals possible. Not all access-control decisions contemplate or require auditing. While many access-control systems, particularly those that control access to sensitive or valuable information or resources, explicitly call for auditing, it is possible to design a system that supports access control but not auditing.2 Where auditing is not contemplated or necessary, the scope of the system should be narrowed. For example, if auditing is not needed, then once a decision to permit access or action is rendered, there may be no reason to store and maintain data about the decision and many privacy reasons to destroy it.

In general, when developing an authentication system, several questions must be answered that go beyond the scope of the system and what type of authentication will be used. Decisions will need to be made about which attributes to use, which identifiers will be needed, which identity will be associated with the identifier, and how the level of confidence needed for authentication will be reached. The answers to each of these questions will have implications for privacy. Below, the four types of privacy described in Chapter 3 (information, decisional, bodily integrity, and communications) are discussed in the context of each of the above questions.
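The narrowing of scope discussed above, retaining nothing once an access decision is rendered unless auditing is genuinely needed, can be sketched as follows. The class and function names are illustrative assumptions, not a design from the report.

```python
def admit_once(ticket_is_valid: bool) -> bool:
    """'One-time pay to enter' access control with no audit trail:
    the decision is returned and nothing about it is stored."""
    return ticket_is_valid

class AuditedGate:
    """Access control that does contemplate auditing: each decision
    is recorded, so retention and destruction policies are needed."""

    def __init__(self, retention_limit: int):
        self.retention_limit = retention_limit
        self.log = []

    def admit(self, subject_id: str, authorized: bool) -> bool:
        self.log.append((subject_id, authorized))
        # Destruction policy: discard all but the most recent entries.
        del self.log[:-self.retention_limit]
        return authorized
```

The first function corresponds to the theater or amusement-park case; the second shows why a system that keeps records must also decide how long to keep them.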
The analysis proposed here is technology-independent, for the most part, and can be applied to almost any proposed authentication system.

2 For example, auditing may not be necessary when controlling access to theaters, amusement parks, or other "one-time pay to enter" locales.

Attribute Choice

Attribute authentication and, frequently, identity authentication and individual authentication require the collection or creation of attributes that the system uses to determine whether to grant an individual access during the authentication phase. In an attribute authentication system, the attribute alone will be the thing being authenticated. In an identity authentication system, the identifier will correlate to some collection of information that the system considers to be an identity. The identity may be nothing more than an e-mail account that bears no obvious relation to the underlying individual and the password that accesses it (in other words, not john.mulligan@example.com, but abracadabra@example.com). In an individual authentication system, however, the identity, which potentially includes attributes as well as personal information, is distinctive to a given individual. For example, when a driver's license is initially issued, an effort is made to bind the driver's license number (nothing necessarily individual about it at this point) to an identity that is distinct enough to be linked, in theory, to the individual who requested the license. Part of the identity comprises attributes such as eye and hair color, height, weight, a photographic image of the individual, and so on.

Information Privacy

To analyze how the choice of attribute(s) may implicate information privacy, it is useful to consider the fair information principles detailed in Table 3.1 in Chapter 3.

Several characteristics of an attribute may be related to collecting the minimum amount of information needed for authentication. For example, the more distinctive the attribute is in relation to the individual, the easier it will be to establish the necessary level of confidence that the attribute applies to a specific individual; conversely, this may increase the potential for privacy problems. When a large group of individuals is allowed to access a resource, the selection of a unique attribute may inappropriately create opportunities for revelation of the individual to whom the attribute pertains. The selection of an overly distinctive attribute in such a situation would violate the minimization principle.
However, the selection of a unique attribute may be appropriate where attribute authentication is being used to limit access to an individual or a small set of individuals. For example, the use of a highly distinctive attribute to control access to personal information about an individual maintained by a third party may meet the minimization principle and be necessary to protect against inappropriate access to the personal information in question.

Regardless of whether the choice of a highly distinctive attribute is appropriate, the more sensitive or revealing the attribute is, the greater the information privacy problems raised. Thus, greater attention must be paid to protecting against misuse and disclosure. Similarly, the more attributes collected for authentication (regardless of whether they are appropriate), the greater the information privacy problems raised. Clearly there are trade-offs between the privacy implications an attribute poses and that attribute's security value. Ideally, attributes should be selected that minimize the privacy effect and maximize the security potential.

In selecting an attribute, the quality of the data represented should also be examined. The attribute should be relevant to the system. For example, organizational role might be an appropriate attribute in an employment context but not in a retail context; eye color might be appropriate for physical access systems but not online systems. If an attribute is subject to change, then in some circumstances it may not be a good attribute to select because its quality may be compromised. For example, hair color may be a poor choice of attributes if the goal is to limit access to individuals whose hair is naturally a given shade. In other circumstances the changeable nature of the attribute may improve its value as an authentication attribute. For example, the use of last-deposit or last-withdrawal information in a financial context as an attribute may meet the data-quality standard despite its variable nature. The fact that the value of these attributes changes frequently means that ongoing system compromise is less likely if the value is guessed or stolen.

The accuracy of an attribute should also be considered. Different systems may tolerate different levels of accuracy. In general, to enhance privacy protection, a system should select an attribute that is relevant, accurate, and fresh. If the levels of accuracy, relevance, and reliability of an attribute are high, the number of attributes can be minimized.

In selecting an attribute, system designers should also consider how widely it is used in other systems. If an attribute is widely used, it can more easily facilitate secondary uses and record linkages from and to other systems. A less widely used attribute (ideally an attribute unique to the system) is less likely to serve as a link between the records of disparate systems.
In addition, if an attribute is unique to a system, and the attribute space is sufficiently large and the attributes are randomly distributed in that space, then the system is less vulnerable to outside attacks based on attribute guessing. To limit the likelihood that its value will be compromised, an attribute used for authentication purposes should not be used for other system purposes. For example, any attribute used as an identifier in a system (perhaps an account number) is likely to be exposed to a wide range of individuals and/or system elements and thus is a poor choice as an authentication attribute.

In order to protect privacy, the security level that the system accords an authentication attribute should be consistent with the value of the attribute, as well as the value of the data that can be accessed on the basis of knowledge of the attribute. If an attribute is sensitive or unique, its value to the individual may go well beyond its value to the system as an authenticator. The data subject's valuation of the attribute and the consequent security that it should be afforded may not be immediately obvious to system developers or users.

To better protect information privacy (and in accordance with fair information principles), once an attribute is selected, individuals should receive clear notice about whether information regarding that attribute will be retained in a separate authentication system of records, what the uses of that system are, who has access to it, and what rights the individual has with respect to accessing the system. The system should also specify how controls on the attribute authentication system will be enforced and to whom the system is accountable.

Decisional Privacy

The choice of attributes may affect decisional privacy. In addition to raising information privacy problems, the choice of a sensitive or revealing attribute(s) may also affect the individual's willingness to participate in the system for which authentication is sought and to engage in activities that might result in the collection or generation of additional sensitive or revealing information. Examples of attributes that are themselves revealing or sensitive are political party, religion, and weight.

Bodily Integrity Privacy

Bodily integrity privacy may also be affected by the choice of attributes. For example, the collection of blood in order to ascertain blood type as an attribute, or of DNA in order to screen for a genetic attribute, raises two types of privacy issues that have implications for bodily integrity. First, the collection of the attribute may be physically intrusive or invasive. Second, once collected, the attribute may reveal additional information about an individual's physiological or psychological condition (such as a predisposition to certain diseases), as well as information about an individual's recent activities (such as pregnancy or drug use).

Communications Privacy

If identifiers such as network or communication system addresses (or even phone numbers) are mislabeled and used as authentication attributes, communications privacy can be implicated.
These addresses can facilitate collection and analysis of information about the individual that can be correlated with other records.

Summary of Attribute Choice Discussion

This analysis indicates that an attribute selected for an authentication system that minimizes privacy implications should:

Not be unique to an individual unless tightly controlled access is required,
Not be widely used in other systems,
Not be sensitive or revealing,
Be relevant to the system, accurate, and fresh,
Require no physical contact,
Entail obvious (as opposed to covert) collection, and
Not be related to communication activities.

Identifier Selection

Identity authentication and individual authentication systems both use identifiers to tie individuals to an identity within the system. Both systems require the selection or construction of an identifier, such as a name, a random number, or a tax ID number. The choice of or creation of an identifier raises privacy concerns.

Information Privacy

To analyze how the choice or creation of an identifier may implicate information privacy, consider once again the fair information principles in Table 3.1 in Chapter 3.

The principle of limiting the collection of information is raised by the selection or construction of an identifier. First, the minimization aspect of the collection-limitation principle requires that efforts be made to limit the collection of information to what is necessary to support the transaction. The selection of an identifier that in itself has meaning, is important, or is revealing (if unnecessary to the underlying purpose of the transaction) would violate this principle. An effort should be made to use identifiers that are not themselves personal information. Thus, randomness and system exclusivity are valuable traits in an identifier. As discussed above, these traits are valuable from the perspective of system security as well. An identifier that is created or constructed for the purpose of authentication in that one system will offer more protection for both privacy and security than will an identifier selected from or based upon existing identifiers.
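A random, system-exclusive identifier of the kind favored here can be produced with a cryptographically strong source. The following is a minimal sketch under the assumption that a 128-bit value is acceptable for the system; it is an illustration, not a prescribed mechanism.

```python
import secrets

def new_system_identifier(nbits: int = 128) -> str:
    """Create an identifier that is random, carries no personal
    meaning, and exists only within this system. At 128 bits, both
    guessing attacks and accidental collisions with identifiers in
    other systems are negligible concerns."""
    return secrets.token_hex(nbits // 8)

# Each call yields an identifier unrelated to any existing one,
# so it cannot serve as a link to records in other systems.
alias = new_system_identifier()
```

Because the value is meaningless outside the system, disclosing it reveals nothing about the individual it designates.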
Because the identifier is being selected for its capacity to link to the individual in the context of an individual authentication system, the information privacy concerns are greater than they are in attribute and identity authentication. To best protect privacy, identifiable information should be collected only when critical to the relationship or transaction that is being authenticated. The individual should consent to the collection, and the minimum amount of identifiable information should be collected and retained. The relevance, accuracy, and timeliness of the identifier should be maintained and, when necessary, updated. Restrictions on secondary uses of the identifier are important in order to safeguard the privacy of the individual and to preserve the security of the authentication system. The individual should have clear rights to access information about how data are protected and used by the authentication system, and the individual should have the right to challenge, correct, and amend any information related to the identifier or its uses.

The privacy question related to how involved an individual should be in the selection or creation of an identifier is an interesting one. It would appear at first that allowing the individual to select an identifier would maximize the individual's control and involvement and allow that person to establish a desired level of privacy. Yet studies on users indicate that individuals are likely to select an identifier that they can easily remember and, in most cases, an identifier that they use elsewhere or that is related to some personal information (see Chapter 4 for more on usability). A random identifier, created exclusively for use in that system, will provide more protection from an information privacy perspective but will be more cumbersome for the individual to use and remember. However, technical mechanisms can be employed to minimize these inconveniences.3

Decisional Privacy

An identifier that is randomly created and used exclusively for a particular authentication system will pose fewer adverse implications for decisional privacy than an identifier that reflects or contains personal information. The selection of an identifier that can be linked to the individual is likely to pose greater risks to decisional privacy than the selection of an attribute or identifier that cannot be linked.
Such a system would not provide for anonymous or pseudonymous participation. Instead, it will be possible to associate a particular individual's involvement with the activity. Depending on the activity, it is possible that the selection of an identifier linked to the individual will cause some individuals not to participate.

3 For example, in Web access contexts, a different public key certificate can be created for use with each Web site, and browser software can automatically interact with the site to select the right certificate when the site is visited. This affords a high degree of privacy relative to linkage concerns, and it can provide a very convenient individual authentication interface.
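The per-site certificates of footnote 3 are one instance of a broader linkage-limiting idea: give each system a distinct, stable pseudonym derived from a single user-held secret, so that records at different sites cannot be correlated. The HMAC-based sketch below is an illustration of that idea, not the certificate mechanism the footnote describes; the secret shown is a hypothetical placeholder.

```python
import hashlib
import hmac

def per_site_pseudonym(user_secret: bytes, site: str) -> str:
    """Derive a site-specific identifier from one user-held secret.
    Each site sees a stable identifier on every visit, but the
    identifiers at different sites are computationally unlinkable
    without the secret."""
    return hmac.new(user_secret, site.encode(), hashlib.sha256).hexdigest()

secret = b"example-user-secret"  # illustrative only; generate randomly in practice
id_a = per_site_pseudonym(secret, "site-a.example.com")
id_b = per_site_pseudonym(secret, "site-b.example.com")
```

The user bears only the cost of safeguarding one secret, while each system receives an identifier that is unique to it, random-looking, and free of personal meaning, matching the criteria summarized below.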

Bodily Integrity Privacy

Identifiers, unlike attributes, generally do not represent a characteristic of an individual and thus are not likely to affect bodily integrity. The selection of an identifier that could be associated with physical characteristics or physical activities of an individual may affect bodily integrity if the collection of the identifier was physically intrusive, invasive, or intimidating.

Communications Privacy

Communications privacy is affected if the identifier is the individual's network or communication system address or number (telephone number, e-mail address, IP address, and so on). If the identifier is related to the communication activities of an individual, its collection raises questions of communication privacy, because it would enable linking between authentication and communication activities. For example, if the identifier could be linked both to the individual and to the communications activities of the individual (phone number or e-mail address), it could significantly compromise communications privacy. This information would be valuable both to the system collecting the information and to those outside the system, especially for law enforcement and other investigative purposes. To minimize privacy implications and enhance security, it would be best if the identifier used in an authentication system is not related to communications. However, if the system for which access is being authenticated is a communications system, then use of a communications identifier would be appropriate, as it would be information that is germane to the system.
Summary of Identifier Selection Discussion

This analysis indicates that an identifier selected for an authentication system that also minimizes privacy implications should:

Be unique to the system (possibly random),
Not be widely used,
Not be sensitive or revealing,
Require little or no physical contact,
Entail obvious (as opposed to covert) collection/assignment, and
Not be related to communication activities.

Identity Selection

In both identity and individual authentication systems, identifiers are often associated with other data records. Even in an individual authentication system that does not have as a primary goal the creation of an identity record, these data records constitute a de facto identity of the entity pointed to by the identifier. There are, accordingly, three types of individual/identity authentication systems, each with different privacy concerns:

1. Purely individual authentication systems. In these systems, the identifier itself is the only information about the entity available to the system; no additional data records are associated with the entity's identifier. In this case, the privacy analysis above regarding the selection of an identifier applies directly.

2. Self-contained identity authentication systems. In these systems, an entity's identifier is linked to data records that are held within the system and contain identity information about the entity; this information may include the history of the entity's access to the system. In these systems, an entity's identifier is not linked to information about the entity outside the system. For example, a system-specific number might be assigned to each entity in the system. In this case, no new privacy issues are introduced.

3. Non-self-contained identity authentication systems. In these systems, the identifier used by the system to refer to an entity is also linked to data records that are held outside the system and contain identity information about the entity. For example, a system of this type might maintain an entity's credit card number, which is linked by credit agencies' external systems to the entity's transaction history and credit rating. In this case, new privacy issues arise; these issues are explored below.
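The three categories can be pictured as data layouts. All identifiers and field names below are hypothetical; the decisive question in each case is whether the stored key is meaningful to any system other than this one.

```python
# 1. Purely individual authentication: the identifier is all the
#    system holds; there are no associated records.
purely_individual = {"id-7f3a"}  # a bare set of identifiers

# 2. Self-contained identity authentication: the identifier maps to
#    records held entirely inside the system, using a key that is
#    meaningless anywhere else.
self_contained = {
    "member-0042": {"access_history": ["2003-01-15"], "role": "staff"},
}

# 3. Non-self-contained identity authentication: the key (here a
#    credit card number) is also meaningful to external systems,
#    which can link it to transaction histories and credit ratings
#    outside this system's control.
non_self_contained = {
    "4111111111111111": {"access_history": ["2003-01-15"]},
}
```

Only the third layout creates linkages that the system's own controls cannot govern, which is why it raises the new privacy issues discussed next.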
Information Privacy

If the system identity is associated with a particular individual, all the fair information principles should be honored in order to best protect privacy. An authentication system that is organized in such a way that a particular individual's privacy may be compromised requires the following: the individual's knowledge and consent; collection of the minimum amount of information and publication, including specific examples, of the level of collection required; that the information be relevant, accurate, timely, and complete; that information collected be used only for the specific purpose for which it was collected, unless the individual consents or a valid court order is issued; that the individual have the right to access, challenge, correct, and amend information; and that the system be maintained with the highest level of security. All of the issues related to identifier selection and information privacy remain present in the context of identity selection.

Decisional Privacy

Viewed independently of context, individual authentication systems pose the greatest risk to decisional privacy. The creation of transactional information about authentication events that is closely tied to a given individual has the greatest potential to have a chilling effect on individuals who do not want their identity associated with an activity or organization. Similarly, individually identified transactional records about authentication events are more likely to be reused by third parties (law enforcement, private litigants, hackers, and so on). Individually identified records are more easily repurposed and reused.

Bodily Integrity and Communications Privacy

The discussions in the "Identifier Selection" section above about issues related to bodily integrity and communications privacy also apply here.

The Authentication Phase

This phase determines whether the attribute, identifier, or identity refers to the individual being authenticated at the level of confidence required by the system. This determination is usually accomplished by observation of the individual or by challenging the individual to produce something supporting the claim. (For example, requiring a card and PIN at an ATM, requiring a badge to be swiped as the bearer enters a building, and requiring a password at an e-commerce Web site are all authentication phases within their respective systems.)

Information Privacy

Whether records of the act of authentication are kept, including, for example, time and date logs, implicates information privacy most directly.
If such transactional records are kept (for example, to provide an audit trail for security purposes), then the system should minimize the amount of information collected. The individual should also be notified that the information is being kept as part of a system, how long it will be kept, who has access to the system, and what other uses may be made of the information. The individual should be able to access the record and correct or amend it if necessary. The system should limit the retention of records containing information about authentication acts as well as secondary uses of or access to such records.

Decisional Privacy

The intrusiveness and visibility of the authentication phase may also affect decisional privacy. The attribute or identifier itself may not be particularly revealing or sensitive, but the process for verification may be so revealing as to inhibit someone from participating. If the individual is challenged to produce something supporting the attribute or identity claimed, decisional privacy may be affected if the challenge seems to be intimidating or if the supporting evidence is revealing or sensitive. Decisional privacy is directly affected by the creation of transactional records of authentication events to support auditing.

Bodily Integrity

The authentication phase may also affect bodily integrity if the observation of the attribute requires close or direct contact with the individual or observation that appears intrusive. If the authentication phase requires the individual to produce physical evidence of an attribute, the individual's bodily integrity may be compromised. For example, a casual observation that someone is 5 feet 6 inches tall is not likely to affect someone's sense of bodily integrity, while actually measuring someone is likely to affect that sense.

Communications Privacy

If the authentication phase requires the use of a communications system, communications privacy may be implicated. Any authentication that occurs on the Internet, for example, involves communications privacy. The question of who has access to the content of the authentication and to the transactional information generated during the communication should be addressed before the authentication system is implemented.
Again, the creation and maintenance of transactional records of authentication events (especially of authentication events that are unrelated to the need to control system access) may raise particularly troubling issues of communications privacy. If the monitoring reveals information about whom an individual associates or communicates with, directly or indirectly, the system will infringe on communications privacy. Finally, if the authentication phase entails use of a communications system that can be associated with a particular individual, then communications privacy may be affected because the system will generate content and transactional information linked to the individual.

Summary of Authentication Phase Discussion

This analysis indicates that, in order to minimize privacy consequences, the following goals should be kept in mind when designing the authentication phase of an authentication system:

• Choose the minimum level of confidence in the authentication that supports the system's needs. These needs could be satisfied in a range of ways: from self-reported, to verified by the second party to the transaction, to verified by a third party, to verified by multiple third parties, to polling government sources, and so on.
• Establish processes to achieve this level of confidence, and make sure that the individual being authenticated is involved in the authentication.
• Ensure that the system does only what it sets out to do (e.g., access control, monitoring, or some combination of these).
• Limit the maintenance and storage of transactional records to the minimum amount necessary. Set destruction policies for those records that need to be kept for limited periods.
• Segregate authentication information from transactional data subsequent to authentication events.
• Create technical and procedural strategies that limit the ability to connect authentication information with specific authentication events.
• Understand and consider the security risks of storing authentication activity data, including risks of unauthorized access, unauthorized use by those with authorized access, and legally compelled access.

CONCLUDING REMARKS

The development, implementation, and broad deployment of authentication systems require thinking carefully about the role of identity and privacy in a free, open, and democratic society.
Privacy, including control over the disclosure of one's identity and the ability to remain anonymous, is an essential ingredient of a functioning democracy. It is a precondition for the exercise of constitutionally protected freedoms, such as the freedom of association. It supports the robust exercise of freedom of expression by, for example, creating psychological space for political dissent. It maintains social norms that protect human dignity and autonomy by enabling expressions of respect and intimacy and the establishment of boundaries between oneself and one's community.

Information collected in or generated by authentication systems can be valuable and revealing. It may document where individuals have been, what resources they have sought, what individuals or institutions they have chosen to associate with, and so on. It is likely to be sought by law enforcement, commercial entities, and private parties. If individuals fear unchecked scrutiny, they will be less likely to participate vigorously in the political process and in society in general. If individuals are denied physical and mental privacy from the government, corporations, and other individuals, they are less able to explore ideas, formulate personal opinions, and express and act on their beliefs. In the context of systems that mediate access to political, cultural, or artistic information and/or provide state and private parties with access to personal information, identity and individual authentication mechanisms chill the free flow of information and free association. At the same time, "privacy" is often used as a pretext to hide illegal activities, and society has, at times, a legitimate interest in requiring authentication or identification. This requirement may stem from the need to validate claims to rights and privileges or the need to hold individuals responsible for their activities.

The decision about where to deploy identity authentication systems (be it only where confirmation of identity is already required today, or in a greater range of circumstances) will shape society in both obvious and subtle ways. Because many privacy breaches are easy to conceal and/or go unreported, failing to protect privacy may cost less in the short run than the initial outlay required to establish sound procedural and technical privacy protections.
In addition, establishing practices and technical measures that protect privacy costs money at the outset. If the individuals whose information was compromised and the agencies responsible for enforcing privacy laws were informed of privacy breaches, there would be greater incentive to proactively implement technologies and policies that protect privacy. Even if the choice is made to institute authentication systems only where people today attempt to discern identity, the creation of reliable, inexpensive systems will inevitably invite function creep and unplanned-for secondary uses unless action is taken to avoid these problems. The role of attribute authentication in protecting privacy is underexplored and may be a way to mitigate some of these concerns.

It is critical that the intended context and usage models be analyzed and that decisions about system requirements be made thoughtfully. To best protect privacy, the privacy consequences of both the intended design and deployment and the potential secondary uses of authentication systems must be taken into consideration by vendors, users, policy makers, and the general public.
