Privacy Challenges in Authentication Systems

In principle, authentication technologies can both advance and undermine privacy interests. In practice, however, a combination of forces, including

· The influence of the prevalent security paradigm of fully mediated access,
· The desire of businesses to collect personal information cheaply and unobtrusively,
· The pressure on governments and businesses to streamline their interactions and reduce costs, and
· The resiliency of digital information,

is more likely to lead to authentication systems that

· Increase requests for identification,
· Increase the collection of personal information,
· Decrease the ability of individuals to understand and participate in data collection decisions,
· Facilitate record linkage and profiling, and
· Decrease the likelihood that individuals will receive notice of or have the right to object to third-party access to personal information.

While authentication systems can undermine privacy in these ways, they can also be used in privacy-enhancing or privacy-preserving ways,
primarily by securing personal data and preventing unauthorized access to the data. The privacy-enhancing benefits of authentication systems are derived from the security features of the overall systems in which they are deployed and are not intrinsic to the authentication components themselves. As with any technology, the privacy risks, benefits, and trade-offs involved must be carefully considered before authentication systems are designed and deployed. To some extent, tension between authentication and privacy is inherent, because the act of authentication often requires some revelation and confirmation of personal information.

PRIVACY IMPACT OF THE DECISION TO AUTHENTICATE

First, let us look in broad terms at what an authentication system requires and examine how the collection, retention, reuse, and linkage of personal information might affect privacy interests:

· Establishing an initial identifier or attribute for use within the system may require an individual to reveal personal facts or information (such as name, address, fingerprints). A requirement to reveal identifying personal information may inhibit participation in certain activities (such as medical tests).
· The act of authentication itself may cause the creation of records of individuals' actions (such as where they shop, what they read, and when they come and go) that are linkable to one of three entities: a specific individual (individual authentication); a (possibly pseudonymous) identity that may or may not be linked to an individual (identity authentication); or an attribute that applies to a specific individual (attribute authentication).
· In addition, transactional information revealing details of an event (purchase, building entry) may be created as a result of or subsequent to authentication and can then be linked back to the identity or individual and be retained in the relevant record.
· The requirements of the authentication or initial identity-establishment process may impose objectionable requirements (for example, they might conflict with religious beliefs2 or impose on bodily integrity).

1In fact, some private sector and public sector policies impose requirements on those who collect data related to the protection of those data.
2In June 2002, CNN reported "Muslim Woman to Challenge Ban on Veil in Driver's License Photo," available online at <http://www.cnn.com/2002/LAW/06/27/license.veil.ap/index.html>. For religious reasons, a woman wanted to wear a veil for her driver's license photo in spite of objections from the State of Florida that allowing it would jeopardize public safety.
· Personal information or data may be exposed at multiple points and to multiple entities during the operation of an authentication system: They may be revealed during the authentication process, created during the authentication process, and/or retained as a result of the authentication process, all of which affect privacy. Personal information may also remain within a device possessed by the individual, reside in a system run by a single entity, or enable many entities to observe and/or collect personal information.
· Authentication may require the use of an identifier that, even if not personally identifiable per se, can be used to compile a dossier of facts (records of use of the identifier) that otherwise would be difficult or impossible to correlate. This collection of discrete facts may lead to a revelation of the individual's identity.
· Depending on where the user's identity and other authentication-related data are stored, they may be accessible to a variety of individuals within one or more institutions, and they may be more or less susceptible to access by hostile third parties through technical exploits or legal processes.

This general examination of authentication systems and the personal information practices that result from such systems harks back to the several general privacy risks created or increased by authentication systems, as described in Chapter 1 of this report: covert identification, excessive use of authentication technology, excessive aggregation of personal information, and chilling effects. Given this categorization of privacy risks, an examination of relevant privacy interests will provide a better understanding of the foundations and contours of such interests, the values they protect, and the challenges that authentication technologies pose to privacy interests.

ACCESS CONTROL AND INFORMATION SYSTEMS

Access policies are a defining aspect of information systems.
In a networked environment, the mediation of absolutely every user interaction with the system and its resources is a first step in enforcing access-control policies, identifying misuse, and investigating breaches. The Internet, perhaps the canonical example of a large, networked information system and a vast network of networks, while in many respects "open," is a highly mediated environment. Standards and protocols establish the who, what, when, where, and how of information exchanges.3

3As Larry Lessig (in Code and Other Laws of Cyberspace, New York, Basic Books, 1999) and Joel Reidenberg (in "Lex Informatica: The Formulation of Privacy Rules Through Technology," Texas Law Review 76 (1998): 553-593) argue, these standards establish the code by which online behavior is regulated.
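The paradigm of fully mediated access described above is often formalized as a reference monitor: a single choke point that every request must traverse and that can observe everything it checks. The sketch below is illustrative only; the policy, user names, and resource names are invented for the example:

```python
# Minimal caricature of fully mediated access: every request passes
# through one check, and the check can record everything it sees.
# The policy table and names here are invented for illustration.

access_log = []  # mediation makes record keeping a trivial side effect

POLICY = {
    ("alice", "payroll-db"): True,
    ("bob", "payroll-db"): False,
}

def mediate(user: str, resource: str) -> bool:
    """Reference-monitor check: decide the request, and record the attempt."""
    allowed = POLICY.get((user, resource), False)  # default deny
    access_log.append((user, resource, allowed))   # the dossier grows either way
    return allowed

mediate("alice", "payroll-db")  # permitted
mediate("bob", "payroll-db")    # denied
```

Note that even a denied request leaves an entry in the log: mediation and record creation come bundled together, which is precisely the privacy-relevant byproduct this chapter describes.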
Decisions about whether a given user may communicate with a resource, whether a given computer may communicate with another, whether a given network may communicate with another, and what extent of information exchange is allowed in each instance dominate the Internet. This is in part because the Internet exists at the collective will of individuals, private parties, and government entities to allow information to flow across their systems. Without these agreements to support the exchange of bits, there would be no Internet.4

These agreements also conceal the organizational boundaries and institutional rules that users traverse when they access a site. Users are generally unaware of the intricacies established by their Internet service provider (ISP) or of the communication requirements for moving around on the Internet. The reality is that what users experience as a library or a public space is in fact a mixture of public and private networks. Not only are users generally ignorant of the jurisdictional boundaries they cross, but they are also usually oblivious of the presence of other users. One commentator said that being on the Internet is "like being in a movie theater without a view of the other seats . . . [where] masses of silent, shuffling consumers . . . register their presence only by the fact of a turnstile-like 'hit' upon each web page they visit. . . ."5

These characteristics of the online world are in stark contrast with the physical world in three important respects:

1. In the physical world there are many clearly defined public spaces and many privately owned spaces in which access control is nonexistent or minimal;
2. In physical space, relatively few actions are mediated; and
3. In the off-line world, if mediation occurs it is typically evidenced by a physical sign.

In the off-line world, individuals and institutions make decisions about whether or not to mediate interactions between individuals and resources.
For example, a university may decide not to control who walks across the campus but to control who enters certain buildings. Similarly, libraries and bookstores generally do not exert control over who enters the premises or what materials they access, but they do exert control over

4For a detailed look at the technological underpinnings of the Internet, see Computer Science and Telecommunications Board, National Research Council, The Internet's Coming of Age, Washington, D.C., National Academy Press, 2001.
5Jonathan Zittrain, "The Rise and Fall of Sysopdom," Harvard Journal of Law and Technology 10 (1997): 495. Available online at <http://jolt.law.harvard.edu/low/articles/10hjolt495.html>.
the terms on which individuals may remove things from the premises. In contrast, in a networked environment (that is, an engineered system), the answer to the question Should we mediate access? is almost always yes; the inquiry begins with the questions How much do we mediate? and With what mechanism?

With increasing frequency, authentication systems are being deployed to control access and movement in physical spaces as well as to control access to networked systems themselves. The increase in the scope of authentication and identification supported by networked systems is extending the scope of recorded interactions. The systems and the hardware that interacts with them are changing the information that can be collected during interactions and the extent to which it can be reused. As discussed below, these changes challenge the privacy of individuals in four significant respects.

1. Computer technology reduces the costs of record keeping. The reduction in costs has escalated the data collection and retention associated with authentication events. Increased data collection and retention exacerbate the privacy consequences of authentication events. Flashing one's driver's license in a corner store is qualitatively different from providing a digital copy of one's driver's license to an online merchant. In the latter case, the driver's license information is provided to the merchant in a format that encourages capture and allows for retention and reuse. One potential outcome of this change is that identity authentication (or the authentication of a relatively unique attribute or set of attributes with the same effect) is more likely to result in a personally identifiable stored record than was the case in earlier environments. A recent example illustrates this point.
Using a scanner that allows him to read and capture data from the magnetic stripes on the back of Massachusetts driver's licenses, a barkeep in Boston has built a database of his patrons' personal information, including driver's license number, height, weight, date of birth, eye and hair color, address, and, in some instances, Social Security number.6 Without the state-issued driver's license, collecting such data on individuals would be expensive and cumbersome and would meet with privacy objections. The introduction of machine-readable cards and the market availability of readers have increased the chances that personal information will be captured, reused, and potentially sold. The introduction of technology without any change in policy has led to practices that are more invasive of privacy.

6Jennifer Lee, "Finding Pay Dirt in Scannable Driver's Licenses," New York Times, March 21, 2002.
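The shift from "flashing" a card to handing over machine-readable data can be made concrete with a short sketch. The track layout, delimiter, field names, and sample data below are all hypothetical (they do not reproduce any actual driver's license standard); the point is only that once the card is machine-readable, capture and retention become the path of least resistance:

```python
# Sketch of how a single card swipe becomes a stored, reusable record.
# The caret-delimited "track" format and its fields are hypothetical,
# not the real magnetic-stripe standard; they illustrate the mechanics only.

import sqlite3

def parse_track(raw: str) -> dict:
    """Split a (hypothetical) caret-delimited track into named fields."""
    fields = raw.split("^")
    return {
        "license_no": fields[0],
        "name": fields[1],
        "date_of_birth": fields[2],
        "address": fields[3],
    }

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE patrons
                (license_no TEXT PRIMARY KEY, name TEXT,
                 date_of_birth TEXT, address TEXT)""")

swipe = "S123-4567^DOE,JANE^1970-01-01^12 MAIN ST BOSTON MA"
record = parse_track(swipe)

# One swipe, one permanent row: retention is the default, not an extra step.
conn.execute(
    "INSERT OR REPLACE INTO patrons "
    "VALUES (:license_no, :name, :date_of_birth, :address)",
    record,
)
conn.commit()
```

A paper check of the same card leaves nothing behind; here the structured row persists, and can be queried, exported, or sold, with no further effort by the collector.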
2. Once data are collected, computerized record keeping facilitates record linkage.7 Distributed relational databases allow diverse records with a common attribute or attributes to be more readily combined. This ability to link and profile record subjects supports the secondary use of information. To build on the driver's license example above, stores across the country are making similar use of scannable driver's license data.8 As customer records across various sectors of the economy become tied to driver's license data, it becomes markedly easier to share and merge for different purposes the data collected by different establishments. And it is not only the private sector that makes use of the scannable licenses to control access. Some government buildings are also using these scannable licenses to record information about visitors.

3. Rules codified for use in computerized systems are generally less flexible (for both good and bad uses) than policies implemented by humans. Businesses and other entities often treat long-time customers and first-time customers differently.9 A long-time customer may not need to provide the same level of authentication before engaging in an interaction or transaction. Information systems, while they can be programmed to treat different people differently, generally apply authentication rules designed for the worst-case scenario (in this instance, the new customer). In other words, unless otherwise directed, the system will demand the same information from a repeat visitor as from a newcomer and will retain that information. Therefore, the baseline data collected in information systems transactions tends to be richer than that collected in manual systems.

4. Information technology enables covert identification and possibly overt identity authentication on a large scale.
The covert nature of some information systems used for identification and identity authentication (such as the driver's license scanners discussed above) denies individuals full information about the transaction and impedes oversight and accountability through the political process.

7See the 1993 report of the Committee on National Statistics, Private Lives and Public Policies: Confidentiality and Accessibility of Government Statistics, Washington, D.C., National Academy Press, 1993, as well as the same committee's 2000 workshop report Improving Access to and Confidentiality of Research Data, Washington, D.C., National Academy Press, 2000, for more on issues surrounding data collection, linkage, and confidentiality. Available online at <http://www7.nationalacademies.org/cnstat/>.
8The Driver's Privacy Protection Act of 1994 prohibits states from disclosing this information, except in limited circumstances, without individual consent. While the law does not prohibit the creation of such databases by the private sector, it is clear that scannable licenses undermine congressional policy to limit the use of driver's license data for non-driving-related purposes.
9The downside of this practice is discrimination. Without accurate data, rules about who is a risky customer are more likely to be influenced by the biases of the business or individual. Accurate data can check these tendencies.

While individuals are aware that the license is being scanned, they are not necessarily informed that information from it may be retained, reused, exchanged, or used to link with other systems. Indeed, individuals are unlikely to know what information can actually be retrieved from scanning the back of the license. Even if people were to learn over time the data collection possibilities inherent in a driver's license, there will always be circumstances in which nondisclosure of those possibilities can cause problems.

There are other systems that, while discussed prior to implementation or debated by the public after the fact, nevertheless provide little signal to the individual at the time that identification occurs. For example, many cities have installed cameras to detect drivers running red lights. In locations where such cameras have been proposed or implemented, initial opposition has often generated community discussion about what information is collected, what decisions can be made on the basis of it, and what recourse is available to individuals.10 While this public debate increases the general awareness of the individuals who reside in an area (but not necessarily those who pass through), the collecting of information in this way is more covert than the scanning of driver's licenses described above. An individual gives over a driver's license; here, an individual drives through an intersection, hardly an activity that signals an identification or authentication event. While the cameras are more easily understood by individuals as identification (surveillance) tools than is the driver's license reader, it is less likely that the presence of a camera will be noticed.
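The record-linkage risk described in point 2 above can also be sketched concretely. The two record sets below are fabricated, and the join helper is a toy stand-in for what any relational database does natively; the mechanics, combining independently collected records on a shared identifier such as a driver's license number, are the whole point:

```python
# Two independently collected record sets, fabricated for illustration.
# Neither is especially revealing alone; a shared identifier links them.

bar_patrons = [
    {"license_no": "S123-4567", "name": "DOE,JANE", "visits": 14},
]
pharmacy_sales = [
    {"license_no": "S123-4567", "purchase": "pseudoephedrine"},
]

def link(left: list, right: list, key: str) -> list:
    """Join two record lists on a common attribute (a toy relational join)."""
    index = {r[key]: r for r in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

profile = link(bar_patrons, pharmacy_sales, "license_no")
# profile[0] now combines bar attendance and pharmacy purchases under one
# identity: a secondary use that neither collector announced at the counter.
```

Because the license number is stable across contexts, each new establishment that scans it adds another joinable table to the same dossier, with no cooperation from the individual required.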
The increasing use of the Internet and other networked systems to support access to information, deliver services, and communicate raises questions about the access-control policies governing these interactions and their impact on individual privacy. Similarly, the use of information systems and networking to control access to and movement in physical spaces and to support attribute- and identity-based service and sales decisions off-line raises questions about the authentication systems that support these interactions and their privacy implications. Ubiquitous computing, sensor-equipped buildings, and smart highways are the direction of the future. They raise important questions about what kind of authentication occurs, how the data used and generated during authentication events are handled, and how the answers to these questions support or

10William Matthews, "Battle Lines Form over Red-Light Cameras," Federal Computer Week, September 3, 2001. Available online at <http://www.fcw.com/geb/articles/2001/sep/geb-comm2-09-Ol.asp>.
undermine individual privacy, access to information, freedom of association, and other democratic values.

A highly mediated environment of networked systems requires system owners to choose between attribute authentication and identity authentication. This choice and the decisions about retention, reuse, and disclosure that flow from it influence the degree of privacy that individuals using the system enjoy. To the extent that individuals are aware of the chosen policies and their implications, the privacy provided by the system will in turn influence individuals' decisions about how and in what circumstances to interact with it.

THE LEGAL FOUNDATIONS OF PRIVACY

Privacy is a fundamental tenet of legal systems and political philosophies that value individual freedom, autonomy, and political participation. Privacy has many and varied definitions and is evoked in many contexts to achieve differing results. It has important political, emotional, social, and legal dimensions. It protects against intrusions in physical places, interference with personal decisions, misuse of personal information, and various interests similar to property interests. The underlying values that privacy protects include individuality and autonomy; intimacy; fairness; and limited, tolerant government.

Early legal definitions of privacy center on the notion of being left alone. Phrases such as "a man's home is his castle"11 and "the right to be let alone"12 capture this notion of privacy, which encompasses the ability of individuals to retreat to the safety of home, pull the shades, and lock the doors, freeing themselves from prying neighbors and state surveillance. While a powerful and important element of privacy, this right to seclusion became increasingly incapable of protecting individuals as society became more interdependent and as interactions became more information-rich.
Social and technological changes in the 1960s and 1970s generated renewed interest on the part of philosophers and lawyers in defining and conceptualizing privacy.13 From their analyses and writings emerged an appreciation for a more complex and multifaceted concept of privacy and its legal foundations.

11". . . [T]he house of every one is to him as his castle and fortress." Semayne's Case, 5 C. Rep. 91a, 77 Eng. Rep. 194 (K.B. 1603).
12"They conferred, as against the Government, the right to be let alone, the most comprehensive of rights, and the right most valued by civilized men." Justice Brandeis dissenting in Olmstead v. United States, 277 U.S. 438, 478 (1928).
13See, for example, Edward J. Bloustein, "Privacy as an Aspect of Human Dignity," New York University Law Review 39 (December 1964): 962-1007; Charles Fried, "Privacy," Yale Law
Privacy law in the United States derives from many sources, including common law, the U.S. Constitution and state constitutions, and state and federal statutes. As the values that it protects suggest, privacy law comprises several branches. This report examines the potential privacy impact of authentication technologies on four areas of privacy, each of which has a constitutional basis in the United States:

1. Bodily integrity, which protects the individual from intrusive searches and seizures;
2. Decisional privacy, which protects the individual from interference with decisions about self and family;
3. Information privacy, which protects the individual's interest in controlling the flow of information about the self to others; and
4. Communications privacy, a subset of information privacy that protects the confidentiality of individuals' communications.

As discussed above, authentication technology can intrude on each of these privacy interests. Authentication methods may require contact with or close proximity to the body, potentially raising concerns under the "bodily integrity" branch of privacy law. Authentication may introduce new opportunities to collect and reuse personal information, intruding on "information privacy." Authentication systems may be deployed in a manner that interferes with individuals' "decisional privacy" by creating opportunities for others to monitor and interfere with important expressive or other personal activities. Authentication methods may raise new opportunities to intercept or monitor a specific individual's communications, revealing the person's thoughts and the identities of the individuals with whom he or she communicates. This section provides some historical context for the privacy interests listed above.

Constitutional Roots of Privacy

The word "privacy" is notably absent from the U.S. Constitution.
However, the values and interests that privacy protects are explicitly expressed in various amendments and have been held by the U.S. Supreme Court to be implicit in other amendments. For example, the Fourth Amendment prohibition against unreasonable searches and seizures and the Fifth Amendment prohibition of compelled self-incrimination explicitly protect privacy interests in personal papers and effects and in personal thoughts and beliefs, respectively,14 while the First Amendment prohibition against the suppression of speech and assembly has been found to implicitly include the right to speak and to assemble anonymously. The Supreme Court has interpreted the First, Third, Fourth, Fifth, Ninth, and Fourteenth Amendments as providing protection for different aspects of personal privacy. Although it is important to note that constitutional claims arise only in cases in which some state action interferes with privacy, the values represented by these constitutional claims resonate broadly throughout society.

First Amendment Interest in Privacy and Anonymity

The First Amendment guarantees the freedoms of speech, association, and access to information. Numerous Supreme Court cases document the right of individuals to speak, associate, and receive information without having their identities revealed. The ability to speak anonymously is rooted not only in the Constitution but also in the actions forging a consensus for its ratification. The Federalist Papers were penned under several noms de plume. The Supreme Court has affirmed the right of anonymity in political speech and the right to solicit door to door without registering or identifying oneself.15 Similarly, the Court has recognized the chilling effect that the disclosure of membership lists would have on the freedom to associate, and therefore it has shielded such lists from government scrutiny.16 The ability to receive information anonymously, the corollary of the right to speak anonymously, while less clearly

Journal (January 1968): 475-493; Judith Jarvis Thompson, "The Right to Privacy," Philosophy and Public Affairs 4 (Summer 1975): 303; James Rachels, "Why Privacy Is Important," Philosophy and Public Affairs 4 (Summer 1975): 323-333; William M. Beaney, "The Right to Privacy and American Law," Law and Contemporary Problems 31 (1966): 357.
14"When the Fourth and Fifth Amendments were adopted, 'the form that evil had theretofore taken' had been necessarily simple. Force and violence were then the only means known to man by which a Government could directly effect self-incrimination.
It could compel the individual to testify, a compulsion effected, if need be, by torture. It could secure possession of his papers and other articles incident to his private life, a seizure effected, if need be, by breaking and entry. Protection against such invasion of 'the sanctities of a man's home and the privacies of life' was provided in the Fourth and Fifth Amendments by specific language." Justice Brandeis dissenting in Olmstead v. United States, 277 U.S. 473, quoting Boyd v. United States, 116 U.S. 616, 630.
15See McIntyre v. Ohio Elections Commission, 514 U.S. 334 (striking down a state statute requiring political leafleteers to identify themselves on their leaflets). Recently the Supreme Court upheld a similar challenge to a local ordinance requiring all individuals petitioning door to door to register and identify themselves (Watchtower Bible and Tract Society, Inc. v. Village of Stratton, 00-1737). Also see Watchtower Bible and Tract Society of New York, Inc., et al. v. Village of Stratton, et al. (00-1737) 240 F.3d 553, reversed and remanded; available online at <http://supct.law.cornell.edu/supct/html/00-1737.ZS.html>.
16NAACP v. Alabama, 357 U.S. 449 (1958) (striking down a state statute that required organizations to disclose their membership to the state).
articulated by the Court, can be found in cases forbidding the government from requiring individuals to affirmatively register to receive certain kinds of information17 and affirming the right of individuals to possess for in-home consumption "obscene" materials that could not legally be sold.18 Recently the Colorado Supreme Court held that the First Amendment to the U.S. Constitution and the state constitution "protect the individual's right to purchase books anonymously, free from governmental interference."19

Third Amendment Privacy Protection

The Court has found protection of a right to privacy against unreasonable surveillance and compulsory disclosure in the Third Amendment's protection against quartering soldiers. This protection has generally been viewed as secondary to the broader protection of the Fourth Amendment.

Fourth Amendment Roots of Privacy Law

The Fourth Amendment to the U.S. Constitution protects individuals against unreasonable searches of their persons and places and against unreasonable seizures of their property. Fourth Amendment jurisprudence articulates limits on government searches of individuals, residences and other private places, and communications. The principle on which the Fourth Amendment is based derives from an even older tradition in British common law. As early as the 17th century, British courts were placing limits on the power of the Crown to enter anyone's home. Though the power of the monarch was still substantial, Semayne's Case20 in 1603 says that "the house of every one is to him as his castle and fortress." Over time, this basic limitation on entry into the sanctity of one's home has been stated with more precision. The state may not enter

17See Lamont v. Postmaster General, 381 U.S. 301 (1965) (striking down a postal regulation requiring individuals to register a desire to receive communist propaganda).
18See Stanley v. Georgia, 394 U.S.
557 (1969) (striking down a state statute criminalizing in-home possession of obscene material); Denver Area Educational Telecommunications Consortium, Inc. v. FCC, 518 U.S. 727 (striking down cable statute requiring individuals to request in writing segregated, patently offensive cable programming as overly restrictive in light of alternatives that protected the anonymity of viewers).
19Tattered Cover, Inc. v. City of Thorton, Colo. Sup. Ct., 2002 Colo. LEXIS 269, April 8, 2002; see also Julie E. Cohen, "A Right to Read Anonymously: A Closer Look at 'Copyright Management' in Cyberspace," 28 Conn. L. Rev. 981 (1996) (arguing that the right to read anonymously is protected by the First Amendment).
20See Semayne's Case, 5 C. Rep. 91a, 77 Eng. Rep. 194 (K.B. 1603).
without a reason and a warrant issued by a court; in addition, the state must "knock and announce" the search. Announcing the search and presenting the target of the search with a copy of the warrant for inspection is critical to assure that the state does not enter without a warrant and that the reasons for which the warrant was issued can be challenged, at least after the fact. These procedural safeguards have been found necessary to guard against abuse of the invasive searching power granted to the state.

Searches conducted without simultaneous notice are considered secret searches and are generally prohibited under U.S. constitutional law. For obvious reasons, wiretapping and other types of electronic surveillance are, by definition, secret. A telephone wiretap that first announces to the parties being tapped that their voices are being recorded is not likely to yield any useful evidence. Yet courts have held that wiretapping, though generally violating the rule against secret searches, may be allowed in limited circumstances. Historically, electronic surveillance was allowed only for a limited class of serious crimes, and only after other investigative means had failed. In recent years the list of crimes has grown. In addition, the statutory protections for electronic communications such as e-mail do not directly parallel those established for voice communications in the wake of Supreme Court rulings, not to mention that the effects of the USA PATRIOT Act of 2001 (Public Law 107-56, Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act of 2001) on opportunities for surveillance and accountability are still to be determined.
(See the sections below entitled "Statutory Privacy Protection" and "Privacy of Communications.")

Fifth Amendment Protection of Privacy

The protection against self-incrimination also serves as a basis for a type of privacy protection, including primarily decisional privacy and, somewhat more weakly, bodily integrity. Although the principle of the Fifth Amendment that no person shall be compelled to be a witness against himself or herself may be relevant in many contexts, its application is limited to criminal cases or other government proceedings. The Court has adopted a rather narrow view of the coverage of the Fifth Amendment by making a distinction between testimonial evidence, involving communication by the individual and thus falling under the Fifth Amendment, and physical evidence, entailing the taking of something

21Recent developments may be changing this baseline, however. For a general discussion of the law, see Computer Science and Telecommunications Board, National Research Council, Cryptography's Role in Securing the Information Society, Washington, D.C., National Academy Press, 1996.
PRIVACY CHALLENGES IN AUTHENTICATION SYSTEMS 67 from an individual and thus falling outside the protection of the Fifth Amendment. This distinction was made most clearly in Schmerber v. California,22 in which the Court ruled that there was no Fifth Amendment protection against blood tests, viewed as physical evidence, to determine blood alcohol content following a car accident. The Court distinguished between situations in which a defendant was forced verbally to incrimi- nate himself or herself and situations in which marks or material were taken from him or her for identification purposes (fingerprints, photo- graphs) or for purposes of preventing the dissipation of evidence (blood test). Although the latter situations would not be covered by the Fifth Amendment, the Court indicated that the Sixth Amendment protection of counsel, the Fourth Amendment protection against unreasonable searches and seizures, and the due process clause23 would provide protection against the state's overreaching in such situations. Ninth Amendment Penumbras, Fourteenth Amendment Due Process Clause, and Decisional and Informational Privacy As mentioned above, privacy has been invoked to protect the individual's right to make decisions about important aspects of life with- out government interference. A line of Supreme Court cases starting with Griswold v. Connecticut24 in 1965 began to establish such a right, although various justices viewed the source of the right differently. Justice Dou- glas believed the privacy right emanated from the First, Third, Fourth, Fifth, and Ninth amendments, which created "penumbras" of privacy protection. Other justices preferred to lodge the right in the Ninth Amend- ment. In Roe v. Wade,25 the Court held that the right to privacy was founded in the Fourteenth Amendment's liberty clause and restrictions on state action. 
The right to privacy protected in this line of cases has been primarily limited to reproductive and family interests, including the individual's right to make choices with respect to childbearing, child rearing, and the use of contraceptives.26 In Whalen v. Roe,27 the Court articulated a constitutional basis for a right of information privacy, arguing that the constitutionally protected "zone of privacy" protects both an interest in avoiding disclosure of personal matters and an interest in independent decision making. Although recognizing an expanded privacy interest, the Court unanimously found that the New York law in question, which required the maintenance of computerized records of prescriptions for certain drugs, did not pose a significant constitutional threat to either privacy interest, in part because of the security of the computer system and the restrictions on disclosure. In subsequent cases, the Court has not expanded constitutional protections for information privacy.

22. Schmerber v. California, 384 U.S. 757 (1966).
23. In Rochin v. California, 342 U.S. 165 (1952), Justice Frankfurter, writing for the majority, said that the forced regurgitation of stomach contents was conduct that "shocks the conscience" and violates the due process clause of the Fourteenth Amendment.
24. Griswold v. Connecticut, 381 U.S. 479 (1965).
25. Roe v. Wade, 410 U.S. 113 (1973).
26. In Paul v. Davis (424 U.S. 693 (1976)), the Supreme Court refused to expand the areas of personal privacy considered "fundamental" to include erroneous information in a flyer listing active shoplifters. The Court limited these fundamental privacy areas to "matters relating to marriage, procreation, contraception, family relationships, and child rearing and education" (713).
27. Whalen v. Roe, 429 U.S. 589 (1977).

The Common Law Roots of Privacy Law

As mentioned above, constitutional privacy protections limit state action; they do not protect against intrusion by private individuals or entities. Historically, tort law has provided protection for some aspects of personal privacy. English and early American case law provides examples of the use of tort law to protect against trespass into private spaces, unwanted knowledge of private events, and unwanted publicity of private matters. In 1890, concerned with tabloid journalists' and photographers' intrusion on private matters, Samuel D. Warren and Louis D. Brandeis, in "The Right to Privacy,"28 set forth the "right to an inviolate personality." American courts and legislatures adopted various expressions of the new privacy tort throughout the early 20th century. In 1960, William L. Prosser structured and defined these various tort law privacy protections into four separate privacy torts:

1. Intrusion upon seclusion: objectionable intrusion into the private affairs or seclusion of an individual,
2. Public disclosure of private facts: publication of private information that a reasonable person would object to having made public,
3. False light: publication of objectionable, false information about an individual, and
4. Misappropriation of name or likeness: unauthorized use of an individual's picture or name for commercial advantage.29

The 1964 Restatement of Torts (a clarification and compilation of the law by the American Law Institute) adopted the Prosser framework.30

28. Samuel D. Warren and Louis D. Brandeis, "The Right to Privacy," Harvard Law Review (December 1890):195.
29. William L. Prosser, "Privacy," 48 Cal. L. Rev. 383 (1960).
30. Restatement of Torts (2d) 1964.
Together, these torts provide a basis for privacy suits against those who publish embarrassing false information or intimate information about an individual, peep or spy on an individual, or commercially exploit an individual's picture, name, or reputation. Today privacy torts provide limited protection for individuals. As torts, they are unlikely to directly shape the design and use of authentication systems. However, the principles behind the intrusion-upon-seclusion, public-disclosure-of-private-facts, false-light, and misappropriation-of-name-or-likeness torts are useful reminders of some of the things that privacy is designed to protect against: intrusion into personal affairs, disclosure of sensitive personal information, and improper assignment of actions to individuals. Each of these is relevant to the discussion of authentication systems.

Statutory Privacy Protections

In recent years, the Federal Trade Commission (FTC) Act of 191431 has become a tool for enforcing privacy statements, whatever they may be, made by commercial actors to the public. Section 5 of the FTC Act gives the FTC jurisdiction over "unfair and deceptive trade practices." Importantly, while the statute clearly provides an enforcement opportunity where statements about data collection practices are made, it alone provides no independent basis for compelling such statements, or for driving their contents.32 A series of workshops, industry-developed self-regulatory guidelines, and enforcement actions by the FTC and the offices of the state attorneys general have provided some check on objectionable or questionable private sector practices.

Over the years, Congress has enacted a number of privacy statutes. Most have come in response to changes in technology, to market failures, or to narrow interpretations of the Fourth Amendment. Market failures have led, as one would suspect, to statutes that primarily regulate private sector behavior.
Narrow rulings on the protections afforded by the Fourth Amendment have led to statutes regulating government access to information. Finally, statutes that address both market failures and narrow constitutional interpretations have most often resulted from advances in technology that cause civil libertarians and industry to push for new privacy protections against the expansion of governmental and private sector authority to collect and use private information.

31. 15 U.S.C. §§ 41-51.
32. Jeff Sovern has articulated the position that the FTC actually has the authority to go after various unsavory data practices under its current legislation and mandate. See Jeff Sovern, "Protecting Privacy with Deceptive Trade Practices Legislation," Fordham Law Review 69(4):1305.
The existing federal and state statutory privacy protections are often described as piecemeal or patchwork.33 Personal information contained in "systems of records" held by the federal government is covered by the Privacy Act of 1974,34 the Freedom of Information Act of 1967,35 and other federal statutes dealing with particular records or record keepers.36 Statutes of many states on access to information contain privacy exceptions, and some states have "mini" privacy acts. In general, rules governing access to and use of state and local records containing personal information are less stringent. Personal information held by the private sector is afforded the weakest statutory protections. While 11 federal statutes currently provide some form of privacy protection for records held by specific private sector entities37 and a set of statutory-like regulatory protections applies to health information,38 much detailed personal information in the hands of businesses is available for reuse and resale to private third parties and available to the government with little in the way of legal standards or procedural protections. (Chapter 6 in this report goes into more detail about some of these statutes and the roles that government plays in the privacy and authentication sense.)

33. See Colin J. Bennett, Regulating Privacy: Data Protection and Public Policy in Europe and the United States, Ithaca, N.Y., Cornell University Press, 1992; David Flaherty, Protecting Privacy in Surveillance Societies, Chapel Hill, University of North Carolina Press, 1989; Priscilla M. Regan, Legislating Privacy: Technology, Social Values, and Public Policy, Chapel Hill, University of North Carolina Press, 1995; and Paul Schwartz and Joel Reidenberg, Data Privacy Law, Charlottesville, Va., Michie, 1996.
34. 5 U.S.C. § 552a.
35. 5 U.S.C. § 552.
36. Driver's Privacy Protection Act of 1994, 18 U.S.C. § 2721 (1994); Family Educational Rights and Privacy Act of 1974, 20 U.S.C. § 1232g.
37. Right to Financial Privacy Act of 1978, 12 U.S.C. § 3401; Electronic Communications Privacy Act of 1986, 18 U.S.C. § 2510 (1995); Communications Assistance for Law Enforcement Act of 1994, PL 103-414, 108 Stat. 4279 (1994) (providing heightened protections for transactional data); Cable Communications Act of 1984, PL 98-549, 98 Stat. 2779 (1984) (codified as amended in scattered sections of 47 U.S.C.); Video Privacy Protection Act of 1988, 18 U.S.C. § 2710 (1994); Consumer Credit Reporting Reform Act of 1996, 15 U.S.C. 1681 § 2 (1997); Telemarketing and Consumer Fraud and Abuse Prevention Act of 1994, 15 U.S.C. §§ 6101, 6108; Privacy of Customer Information (Customer Proprietary Network Information Rules of the Telecommunications Reform Act of 1996), 47 U.S.C. § 222 (c), (d) (1996); Fair Credit Reporting Act of 1970, 15 U.S.C. § 1681 et seq.; Children's Online Privacy Protection Act (1998), 15 U.S.C. §§ 6501 et seq.; Financial Services Modernization Act (1999), 15 U.S.C. § 6801 et seq.
38. On April 14, 2001, privacy regulations were issued by the Department of Health and Human Services by authority granted under the Health Insurance Portability and Accountability Act of 1996 (see Chapter 6 for more information on HIPAA).

Business records are subject to few privacy regulations. While recent statutes have increased the privacy regulations in the private sector, the
U.S. legal and regulatory approach continues to be driven by concerns about a given sector or a narrow class of information (see Chapter 6). In addition to piecemeal rules governing private sector use of personal information, the general rule established in two 1970s cases leaves personal information "voluntarily" provided to businesses without Fourth Amendment protection.39 The rationale espoused in these two cases dramatically shaped privacy case law and led to statutory protections for privacy. The principle that in general individuals have no constitutionally based privacy interest in information about them contained in the routine records of a business has specific consequences for individual privacy in authentication systems that routinely collect information about an individual during the course of an authentication event that precedes a transaction.

INFORMATION PRIVACY AND FAIR INFORMATION PRACTICES

Statutory protections for personal information all rest on the same core set of "fair information practices," which were developed in response to the move from paper to computerized records. The first "code of fair information practices," developed in 1973 by an advisory committee in the then-Department of Health, Education, and Welfare (HEW), provided a core statement of principles that may be enforced either by statute or voluntarily.40 These principles set out basic rules designed to minimize the collection of information, ensure due-process-like protections where personal information is relied upon, protect against secret data collection, provide security, and ensure accountability. In general, the principles emphasized individual knowledge, consent, and correction, as well as the responsibility of organizations to publicize the existence of a record system, to assure the reliability of data, and to prevent misuse of data.
Although the practices cited in the HEW code have been broadly accepted, slightly different iterations of fair information practices have been offered by different bodies.41,42 Because of the broad recognition accorded the fair information practice principles, they are explained in detail in Table 3.1 and used later in this report for analyzing the privacy impact of different authentication systems. In general, though, the individual principles have not been implemented with uniform rigor. Limitations on the collection of information have not been widely adopted, consent has been largely renounced in favor of choice, and access has been harder to achieve.

The concept of notice is in some respects a simple idea: people are to be informed about how personally identifiable information is collected, used internally, and disclosed or exchanged. An organization's information practices should be, in theory, transparent. In practice, there are questions about how complete notices need to be without either compromising the proprietary interests of the organization or confusing people. Additionally, what really constitutes effective notice?43

39. In 1976, in United States v. Miller, the Supreme Court held that individuals had no constitutionally protected privacy interest in checks held by a bank. Shortly thereafter, in 1979, in Smith v. Maryland, the Court ruled that because the numbers dialed by a telephone subscriber were routinely collected business records of phone companies, subscribers had no Fourth Amendment privacy interest in them and therefore no right to receive notice of or to object to their disclosure to the government.
40. Secretary's Advisory Committee on Automated Personal Data Systems, U.S. Department of Health, Education, and Welfare. Records, Computers and the Rights of Citizens, Washington, D.C., 1973. Available online at <http://aspe.os.dhhs.gov/datacncl/1973privacy/tocprefacemembers.htm>.
41. When discussions of online privacy began in the early 1990s, the concept and principles of "fair information practices" provided the foundation for policy discussions. Two executive branch study commissions, the Information Infrastructure Task Force (IITF) and the National Information Infrastructure Advisory Council (NIIAC), developed privacy principles for the National Information Infrastructure (NII). In both cases, these study commissions echoed many of the traditional principles developed earlier, often modifying, and in some cases weakening, some of the core principles, such as consent and redress. But both commissions also struggled with questions about fair information practice that are new in the online environment. The IITF and the NIIAC recognized emergent principles, including the need to provide some opportunity for individuals to use technical controls, such as encryption, to protect the confidentiality and integrity of personally identifiable information. Both acknowledged that individuals should be able to remain anonymous as they conduct some online activities.
The importance of educating the public about the privacy implications of online activities was highlighted in the codes developed by the IITF and the NIIAC. Although these early online privacy study commissions advocated a fairly detailed list of fair information practices, by 2000 the various iterations of fair information practices for online privacy discussed by the Federal Trade Commission and others largely focused on four: notice, choice, access, and security. Efforts to articulate more clearly the essence of information privacy were not limited to the United States. Indeed, the most comprehensive of these codes of fair information practices is the one crafted by the Organization for Economic Cooperation and Development (OECD) in 1980. The OECD code emphasized eight principles: collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability.
42. Different countries have adopted these principles to varying extents. Canada, for example, has developed a national privacy code, the Model Code for the Protection of Personal Information. This code was developed through a consensus process that included representation from Canada's Direct Marketing Association. More information is available online at <http://www.csa.ca/standards/privacy/>.
43. Other problems with the effectiveness of notices are illustrated by experience with the Financial Services Modernization Act of 1999 (commonly referred to as the Gramm-Leach-Bliley Act), discussed in more detail in Chapter 6, which requires financial institutions to give notice to customers regarding the sharing of personal information with a third party. Financial institutions have complained about the expense incurred in sending notices. Consumers have complained that notices are incomprehensible and unhelpful. See Mark Hochhauser, Lost in the Fine Print: Readability of Financial Privacy Notices, July 2001. Available online at <http://www.privacyrights.org/ar/GLB-Reading.htm>.
TABLE 3.1 Fair Information Principles and Practices

Collection limitation: Collect the minimum amount of information that is needed for the relationship or transaction at issue, by lawful and fair means, and with the knowledge and consent of the individual.

Data quality: Information should be relevant, accurate, timely, and complete.

Purpose specification: Use of data should be specified at the time that data are collected.

Use limitation (restriction on secondary uses): Data should only be used for the specific purpose for which they are collected and for which the individual understands they will be used, except under two conditions: with the prior consent of the individual, and with the appropriate legal authority.

Security: The integrity of the information and the system should be maintained to ensure against loss, destruction, unauthorized access, modification, unauthorized use, or disclosure.

Openness/notice: There should be no secret data systems. People should be able to ascertain the existence of data systems and their purposes and uses.

Individual participation: An individual has rights to know if he or she is a subject of a system, access information about him- or herself, challenge the quality of that information, and correct and amend that information.

Accountability: The organization collecting and using information can be held responsible for abiding by these principles through enforcement and/or redress.
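Because the report uses these principles as an analytic checklist for authentication systems, it can help to see them carried into a design review in executable form. The sketch below is a rough illustration only: the dictionary keys, summaries, and the `review` function are invented for this example, not taken verbatim from the HEW or OECD codes.

```python
# Hypothetical sketch: the fair information practice principles of Table 3.1
# expressed as a checklist a design review could walk through.
FAIR_INFORMATION_PRACTICES = {
    "collection_limitation": "Collect the minimum information needed, lawfully and with consent.",
    "data_quality": "Information is relevant, accurate, timely, and complete.",
    "purpose_specification": "Uses are specified at the time of collection.",
    "use_limitation": "No secondary use without consent or legal authority.",
    "security": "Integrity is maintained against loss, modification, and disclosure.",
    "openness": "No secret data systems; existence and purposes are ascertainable.",
    "individual_participation": "Subjects can know, access, challenge, and correct their data.",
    "accountability": "The organization is answerable through enforcement or redress.",
}

def review(design: dict) -> list:
    """Return the principles a proposed design does not yet address."""
    return [p for p in FAIR_INFORMATION_PRACTICES if not design.get(p, False)]

# A proposal that has addressed only security and openness so far:
proposal = {"security": True, "openness": True}
gaps = review(proposal)  # the six principles still to be addressed
```

Such a checklist does not, of course, decide whether a given measure actually satisfies a principle; it only keeps each principle visible during design, which is the role Table 3.1 plays later in the report.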
Federal agencies comply with notice provisions of the Privacy Act of 1974 by publishing requests for comments in the Federal Register when they plan to create "systems of records": those information systems that contain personally identifiable information such as a name or Social Security number. Few individuals read the Federal Register to see whether a federal agency that maintains data on them in a system of records has announced, in a routine use notice, changes in the way that the agency intends to use those data.

The concept of "consent," or the less stringent "choice," is a more complex idea. In theory, individuals are to be given some power or control over how personally identifiable information about them is used. In practice, the primary question is whether such control comes from giving individuals the opportunity to opt in by giving prior permission or the opportunity to opt out by allowing them to say no. Privacy advocates argue that "opt in" is more consistent with the idea of consent, while "opt out" erroneously assumes that individuals tacitly give consent to secondary uses. Organizations argue that "opt out" gives individuals adequate opportunity to choose and does not overburden consumers or industry.

Recognizing the complexity and importance of access and security in the online environment, the FTC convened an advisory committee to examine and advise on these subjects.44 With regard to access, the committee addressed four questions: (1) What is the meaning of access (merely view, or view and modify)? (2) Access to what? (3) Who provides access? and (4) How easy should access be? The Advisory Committee on Online Access and Security was unable to agree on a clear recommendation and instead presented a range of access options.
In part, the committee recognized that the dilemmas presented by the need to authenticate for access purposes complicated access options and necessitated an evaluation of the particular circumstances. The Advisory Committee on Online Access and Security recognized that security likewise is contextual, that costs and inconveniences affect the level of security that administrators are willing to set and users are willing to bear, and that the establishment of a security system should begin with a risk assessment. The committee outlined five options for achieving security and recommended a solution including these three principles: (1) every Web site should have a security program, (2) the

44. Federal Trade Commission (FTC). Final Report of the FTC Advisory Committee on Online Access and Security. Washington, D.C., May 15, 2000. Available online at <http://www.ftc.gov/acoas/papers/finalreport.htm>.
(e-mail, the World Wide Web, and so on), wireless phones, and other devices complement and in some cases replace telephone communications, the United States as a nation has generally recognized the need to create privacy protections similar to those established for voice communications by the Supreme Court.45 From telegraph to telephone, wireline phone to cell phone, e-mail to the World Wide Web, users of the major new communication technologies have acquired privacy protections for their communications. Thus far in the history of electronic communications, policy makers, commercial providers, and even those in the field of law enforcement have come to agree that new technologies demand privacy protections,46 both out of faithfulness to basic constitutional values and to assure the commercial viability and acceptance of the latest communications technologies. However, the scope of such protections has consistently fallen short of the standards, based on the Fourth Amendment, that govern real-time voice communications. At the same time, the range of information and communications flowing through these new communications technologies has dramatically increased. Thus today, many kinds of information are potentially accessible under the secret searches of wiretap law. In addition, in light of recent events, there is an expanding sense of what government may legitimately need to access to meet national security and law enforcement requirements.

Most recently, Congress has struggled with the question of the protection of online transactional records such as logs tracking the Web pages viewed by individual users and records of electronic mail messages sent and received. Though law enforcement argued that these logs revealed little information and should be easily available for any investigative purpose at all, the legislature found that this information is sufficiently sensitive to warrant extra protection.
Electronic communications have required the expansion of privacy protection commensurate with new technology capabilities (see Box 3.1). The Electronic Communications Privacy Act (ECPA) of 1986 was supported by a coalition of businesses and privacy advocates who understood that protections similar to those for first-class mail were a necessary precursor to business and individual adoption of e-mail as a communications tool for sensitive information.47 Similarly, the privacy amendments to ECPA in 1994, creating a higher level of protection for transactional information generated in Web-based interactions, recognized that this information was more sensitive than the numbers dialed on a phone, and consequently that public use of the Web would be aided by creating more stringent protections against access.

45. See Katz v. United States, 389 U.S. 347 (1967). Available online at <http://laws.findlaw.com/us/389/347.html>.
46. In Kyllo v. United States, Justice Scalia, writing for the majority, noted "We think that obtaining by sense-enhancing technology any information regarding the interior of the home that could not otherwise have been obtained without physical 'intrusion into a constitutionally protected area' (Silverman, 365 U.S., at 512) constitutes a search at least where (as here) the technology in question is not in general public use"; see <http://supct.law.cornell.edu/supct/html/99-8508.ZO.html>.
47. See Electronic Communications Privacy Act of 1986 (PL 99-508). Available online at <http://www.cpsr.org/cpsr/privacy/wiretap/ecpa86html>.
CONCLUDING REMARKS

Authentication technologies, like other technical advances, renew the debate about how much privacy protection should be provided to personal information generated in the authentication process. As with other advances, in order to speed adoption, policy makers, industry, law enforcement, and privacy advocates should identify the privacy-sensitive features of these technologies and develop appropriate protections.

Finding 3.1: Authentication can affect decisional privacy, information privacy, communications privacy, and bodily integrity privacy interests. The broader the scope of an authentication system, the greater its potential impact on privacy.

Recommendation 3.1: Authentication systems should not infringe upon individual autonomy and the legal exercise of expressive activities. Systems that facilitate the maintenance and assertion of separate identities in separate contexts aid in this endeavor, consistent with existing practices in which individuals assert distinct identities for the many different roles they assume. Designers and implementers of such systems should respect informational, communications, and other privacy interests as they seek to support requirements for authentication actions.

In terms of developing an actual system, and considering fair information principles and practices as described in this chapter, as well as how authentication works in the abstract (as discussed in Chapter 2), the following guidelines are offered for the development of authentication systems that would protect privacy interests as much as possible.
Recommendation 3.2: When designing an authentication system or selecting an authentication system for use, one should:

· Authenticate only for necessary, well-defined purposes;
· Minimize the scope of the data collected;
· Minimize the retention interval for data collected;
· Articulate what entities will have access to the collected data;
· Articulate what kinds of access to and use of the data will be allowed;
· Minimize the intrusiveness of the process;
· Overtly involve the individual to be authenticated in the process;
· Minimize the intimacy of the data collected;
· Ensure that the use of the system is audited and that the audit record is protected against modification and destruction; and
· Provide means for individuals to check on and correct the information held about them that is used for authentication.
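The guidelines above can be made concrete with a short sketch. This is illustrative only and not drawn from the report: all names and fields below (the event fields, `purge_after`, and so on) are invented. The sketch records the minimum data needed for a well-defined purpose, attaches a retention deadline to each record, and keeps a hash-chained audit log so that modification or deletion of entries is detectable.

```python
import hashlib
import json
import time

class AuditLog:
    """Append-only log; each entry hashes the previous one, so any
    modification or deletion of a recorded event breaks the chain."""

    def __init__(self):
        self.entries = []

    def append(self, event: dict) -> None:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(event, sort_keys=True)  # canonical serialization
        entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        self.entries.append({"event": event, "prev": prev_hash, "hash": entry_hash})

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            payload = json.dumps(e["event"], sort_keys=True)
            expected = hashlib.sha256((prev + payload).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

def auth_event(subject_id: str, purpose: str, retention_days: int) -> dict:
    # Collect only what the stated purpose requires, and record when
    # the data must be purged (minimized retention interval).
    now = int(time.time())
    return {
        "subject": subject_id,   # pseudonymous identifier, not a full profile
        "purpose": purpose,      # necessary, well-defined purpose
        "timestamp": now,
        "purge_after": now + retention_days * 86400,
    }

log = AuditLog()
log.append(auth_event("user-7f3a", "building-entry", retention_days=30))
assert log.verify()

# Tampering with a recorded event is detected on verification.
log.entries[0]["event"]["purpose"] = "marketing"
assert not log.verify()
```

The hash chain is one simple way to address the guideline that the audit record be protected against modification; protection against wholesale destruction would additionally require replicating the log and safeguarding the chain head outside the system being audited.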