4
Cultural, Social, and Legal Considerations

Biometric systems assume and require an intimate relationship between people and technologies that collect and record the biological and behavioral characteristics of their bodies. It is therefore incumbent upon those who conceive, design, and deploy biometric systems to consider the cultural, social, and legal contexts of these systems. Failing to attend to these considerations and their social impacts diminishes the systems’ efficacy and can bring serious unintended consequences.

The key social issue surrounding biometrics is the seemingly irrevocable link between biometric traits and a persistent information record about a person. Unlike most other forms of recognition, biometric techniques are firmly tied to our physical bodies. The tight link between personal records and biometrics can have both positive and negative consequences for individuals and for society at large. Convenience, improved security, and fraud reduction are some of the benefits often associated with the use of biometrics. Those benefits may flow to particular individuals, corporations, and societies but are sometimes realized only at the expense of others. Who benefits at whose expense and the relative balance between benefits and costs can influence the success of biometric deployments.

The efficacy of a biometric system can be affected by the cultural, social, and legal considerations that shape the way in which people engage and interact with these systems. People’s deliberate choices about whether and how to engage and their inadvertent actions both affect system performance. For example, some people may choose not to place their fingers on a fingerprint scanner for fear of contracting a disease or may be unable
to do so because long fingernails are highly valued by their social group. Similarly, some people may avoid having their photographs taken for a face recognition system because of concerns over how the images will be used; others will avoid this owing to concerns about the absence of customary adornments to the face (for example, scarves). In both cases system performance may be compromised.

The proportionality of a biometric system—that is, its suitability, necessity, and appropriateness—in a given context will have a significant effect on the acceptability of that system.1 The societal impact of such systems will vary significantly depending on their type and purpose. For example, the use of iris scanning to control access to a local gym and the use of finger imaging to recognize suspected terrorists at international borders are likely to differ in their effects, both for the individuals being scanned and for the broader community. The potential impacts on particular social groups, and thus the systems’ reception by those groups, may also vary dramatically owing to differences in the groups’ cultural beliefs, values, and behaviors. Imposing facial recognition requirements to enter a store or workplace may limit the shopping and work options available to individuals who consider photographs of faces inappropriate, creating barriers to social activities.

This chapter explores such considerations in four areas: biometric systems and individual participation, potential impacts of biometric systems on society, legal considerations with respect to biometrics, and data collection and use policies.

INTERACTION BETWEEN BIOMETRIC SYSTEMS AND INDIVIDUALS

System performance may be degraded if social factors are not adequately taken into consideration. These factors are of two types: those that motivate and those that facilitate participant engagement with the system.
Motivating Participation by Individuals

As a rule, people’s willingness to participate in a system and their commitment to it depend on their understanding of its benefits. For example, a biometric system that allows convenient access to a worksite might be perceived as beneficial to individuals by relieving them of the necessity to carry an ID card. On the other hand, a biometric system that tracks daytime movement of employees might be perceived as primarily beneficial to the employer and as undermining the employee’s personal freedom. In some instances, ancillary inducements, such as monetary rewards, may be required to obtain a desired level of participation.

Participation may also be motivated by the possibility of negative consequences for nonparticipation—for instance, restrictions on access to locations or services (perhaps entry to the United States), requirements to use a much lengthier process for a routine activity (for example, to open a bank account), and even the threat of legal action (for example, the requirement to enroll in a biometric system in order to maintain legal alien status). Nonparticipation may also subject individuals to social pressure and/or prevent them from joining some collective activities.

Willingness to participate may also be influenced by concern that system uses will change over time (often referred to as “mission creep”), perhaps becoming less benign. For example, a system initially deployed to allow employees easy access to a worksite might later also be used to track attendance, hours worked, or even movement at the worksite. Such concerns argue for clear documentation both of how the system will be used and of the protections ensuring that it will not be used for other, unacknowledged purposes.

More broadly, the social and cultural factors that influence willingness to participate in biometric systems run the gamut from trust in government and employers, to views about privacy and physical contact, to social involvement vs. isolation. Because biometric systems depend on physical connections with individuals and because they are used for national security, law enforcement, social services, and so on, a host of societal issues should be addressed before they are deployed.

1. The European Commission’s Article 29 Data Protection Working Party observes that proportionality has been a significant criterion in decisions taken by European Data Protection Authorities on the processing of biometric data. Available at http://ec.europa.eu/justice_home/fsj/privacy/docs/wpdocs/2003/wp80_en.pdf.
Facilitating Individual Participation

The adoption of biometric systems depends on the ease with which people can use them. In systems design it is critical to consider training in use of the system, ease of use (for example, are multiple steps, awkward actions, or complicated procedures required?), and management of errors (for example, how does the system recover from a mistake?). Designing usable systems also requires that the designers have some knowledge of the human users and operators, the context in which they will use the system, and their motivations and expectations.2 Users of biometric systems might include travelers, people whose first language is not English, employees of a particular company, shoppers, and so on. Contexts of use might include empty office buildings vs. busy airports, indoor vs. open air, lone individuals vs. groups, daily vs. semiannual, and so on. Motivations of operators might range from speeding up the check-in process to protecting personal information to preventing terrorism.

Discussions of usability tend to focus on narrow technical considerations such as the adequacy of the instructions for where and how to place the hand or finger to successfully engage with a biometric system (some of this was discussed briefly in the section “Operational Context” in Chapter 2). But broader considerations also affect usability. For example, providing a table where users can place their bags, purses, and other paraphernalia before interacting with the system may improve usability. Interfaces should take into account physical differences among people (height, girth, agility). If an interface is difficult for a particular person (tall or short, say) to use, then users are implicitly sorted into categories and may be treated differently for reasons unrelated to system goals.

Dealing with user diversity also leads to the challenge of providing user assistance. The presence of knowledgeable people providing help has been shown time and again to ease the pain of learning to use new systems and of managing errors. For a biometric system, it is important to have ways to mediate a variety of potential problems, ranging from individuals uncertain how to use the system to individuals who cannot present the trait needed (for example, if their fingerprints are hard to image).

2. For more on usability and biometric systems, see “Usability and Biometrics: Ensuring Successful Biometric Systems.” Available at http://zing.ncsl.nist.gov/biousa/docs/Usability_and_Biometrics_final2.pdf.
Earlier chapters considered failures to enroll and failures to acquire from technical and statistical perspectives, but how such failures are handled from the perspective of the user (who has “failed” in some sense) also has an impact on system effectiveness. If they are not handled carefully, some users may be less ready to participate, or even disenfranchised altogether; see below for the broader societal implications of disenfranchisement.

Expected frequency of use by a given individual is an important consideration in system design, since familiarity comes with repeated use. Typically, systems designed for infrequent use should be easy to learn, with readily interpreted instructions and help at hand, as usage procedures may not be remembered. On the other hand, people can learn to use even the most difficult systems if they have a chance to practice and learning is reinforced by frequent use and feedback. Thus, a system to be used by vacationers once or twice a year will have different requirements than one used daily by employees. While systems used frequently should avoid time-consuming operations to accomplish routine tasks such as entering a work area or logging on to the computer, they need not minimize initial training.

SOCIETAL IMPACT

The increasing use of biometric systems has broad social ramifications, and one overarching consideration is proportionality. While the technical and engineering aspects of a system that contribute to its effectiveness are important, it is also useful to examine whether a proposed solution is proportional and appropriate to the problem it is aimed at solving.3 Biometric systems’ close connection to an individual, as described in the preceding section, means that even extremely effective technical solutions may turn out to be inappropriate because of perceived or actual side effects. Proportionality—both how the system will be perceived in its user communities and what side effects it may have, even if the system is accurate and robust—must therefore be considered when first examining the solution space. The rest of this section explores some of those potential side effects, including potential disenfranchisement of nonparticipants, privacy issues, and the impact of varying cultural perspectives on individuality and identity.

Universality and Potential Disenfranchisement

Where biometric systems are used extensively, some members of the community may be deprived of their rights. Some individuals may not be able to enroll in a system or be recognized by it as a consequence of physical constraints, and still others may have characteristics that are not distinctive enough for the system to recognize. There will also be those who decline to participate on the basis of religious values, cultural norms, or even an aversion to the process. Religious beliefs about the body and sectarian jurisdiction over personal characteristics (for example, beards, headscarves) or interpersonal contact (for example, taking photographs, touching, exposing parts of the body) may make a biometric system an unacceptable intrusion.
Mandatory or strongly encouraged use of such a system may undermine religious authority and create de facto discrimination against certain groups whose members are not allowed to travel freely, take certain jobs, or obtain certain services without violating their religious beliefs. Another category of people who may choose not to participate are those concerned about misuse or compromise of the system or its data—and the implications for privacy and personal liberty. Although a decision to participate or not may be an individual one, biometric systems can inadvertently affect groups whose shared characteristics make them less inclined to use the systems, assuming that participation is voluntary. Where use is mandatory—for example, in some military applications such as the U.S. Army’s Biometric Automated Toolset (BAT) system described in Appendix D—even more consideration of these issues may be needed.

Thus, while disenfranchisement in such cases may seem to affect only individuals, broad use of systems that are known to have these consequences can adversely affect the broader community if no appropriate relief is put in place. The community, including those not affected directly, may come to distrust the systems or the motivations of those deploying them. A system deployed in a community in which certain members are consistently unable to participate in the de facto methods of recognition without significant inconvenience may acquire an unwelcoming reputation no matter how benign the purposes for which it is deployed. Similar kinds of potential ostracism have been seen when formerly pedestrian-friendly or bus-friendly communities are transformed to focus on automobiles; those without driver’s licenses or their own automobiles can become effectively disenfranchised in particular geographic locations.4 Other technologies, such as the telephone and the Internet in the communications arena, have had similar effects.

Privacy as a Cultural Consideration

Biometric systems have the potential to collect and aggregate large amounts of information about individuals. Almost no popular discussion of biometric technologies and systems takes place without reference to privacy concerns, surveillance potential, and concerns about large databases of personal information being put to unknown uses. Privacy issues arise in a cultural context and have implications for individuals and society even apart from those that arise in legal and regulatory contexts.

3. An example that received some media attention was a proposal to use fingerprint scanners to speed up a school lunch line. For reasons described elsewhere in this chapter, many in that community felt that such a use of the technology was disproportionate to the problem, regardless of its effectiveness.
The problems arising from aggregating information records about individuals in various information systems and the potential for linking those records through a common identifier go well beyond biometrics, and the challenges raised have been addressed extensively elsewhere. For example, a 2007 NRC report5 that examined privacy in the digital age had a host of citations to other important work in this area. A thorough treatment of authentication technologies and privacy, with references to a host of sources, appears in the NRC report Who Goes There? Authentication Through the Lens of Privacy (2003), which treats the constitutional, statutory, and common law protections of privacy and their intersection with modern authentication technologies, including biometrics. A 2002 NRC report from the same project explored large-scale identity systems and potential technical and social challenges.6 Almost all of the issues raised in these three NRC reports on technology and identity systems, with or without biometric components, also apply to biometric systems.7 In addition, a recent symposium on privacy and the technologies of identity produced a series of scholarly papers that refer to a wide range of sources.8 This chapter does not seek to recapitulate this extensive literature; instead, it briefly examines some ways biometric systems can contribute to the privacy challenges inherent in systems storing information about individuals.

4. Langdon Winner, “Do artifacts have politics?” Daedalus 109(1) (1980), and “How technology reweaves the fabric of society,” The Chronicle of Higher Education 39(48) (1993); Donald A. MacKenzie and Judy Wajcman, eds., The Social Shaping of Technology, London: Open University Press (1985; 2nd ed., 1999).
5. See NRC, Engaging Privacy and Information Technology in a Digital Age, Washington, D.C.: The National Academies Press (2007).

Record Linkage and Compromise of Anonymity

Information of various kinds about individuals is routinely stored in a variety of databases. Linking such information—however imperfectly—in order to form profiles of individuals is also routinely done for purposes ranging from commercial marketing to law enforcement. The biometric data stored in information systems have the potential of becoming yet another avenue through which records within a system or across systems might be linked. This potential raises several questions: Under what circumstances is such linkage possible? If undesirable linkages are technically feasible, what technological and/or policy mechanisms would impede or prevent them? How could compliance with those mechanisms be monitored by those whose data are stored? What criteria should be used for deciding whether these mechanisms are needed?
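The linkage questions above can be made concrete with a toy sketch. This is a hypothetical illustration, not a real matcher: templates are modeled as short feature vectors, and the databases, names, and matching threshold are all invented. It shows how two databases that share no common identifier can nevertheless be joined through similar biometric templates.

```python
# Toy illustration of cross-database record linkage via biometric
# templates. All records, names, vectors, and the threshold below
# are invented for illustration; real matchers are far more complex.

def distance(a, b):
    # Euclidean distance between two template vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Database A: a gym's access-control system (no legal names stored).
gym_db = [
    {"member_id": "G-104", "template": [0.11, 0.52, 0.33]},
    {"member_id": "G-377", "template": [0.90, 0.14, 0.61]},
]

# Database B: an employer's attendance system (names stored).
hr_db = [
    {"name": "A. Sample", "template": [0.12, 0.50, 0.34]},
    {"name": "B. Example", "template": [0.45, 0.77, 0.08]},
]

THRESHOLD = 0.05  # invented matching threshold

# Neither database holds the other's identifiers, yet similar
# templates let an outside party join records across them.
links = [
    (a["member_id"], b["name"])
    for a in gym_db
    for b in hr_db
    if distance(a["template"], b["template"]) < THRESHOLD
]
print(links)  # → [('G-104', 'A. Sample')]: the gym member is re-identified
```

The sketch suggests why policy or technical mechanisms (such as template protection or storage separation) may be needed even when each system, viewed alone, stores no obviously identifying link.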
Depending on the anticipated uses of the personal data, policy and technical mechanisms may have to be put in place to prevent their unauthorized linking.

A challenge related to record linkage is the potential for erosion or compromise of anonymity. As discussed previously, in contrast to the wide choice of passwords available to an individual, there is a fairly limited number of biometric identifiers that a person can present, even when all possible combinations (for example, multiple fingers, or face recognition coupled with hand geometry) are considered. Thus, even a biometric system that does not internally link an individual’s biometric data with other identifying information may fail to preserve anonymity if it were linked, using biometric data, to another system that does connect biometric data to identity data. This means that even a well-designed biometric system with significant privacy and security protections may still compromise privacy when considered in a larger context.

A related challenge is secondary use of data—that is, using data in ways other than originally specified or anticipated. The 2003 NRC report Who Goes There? examined secondary use in an authentication context. The challenge to privacy posed by secondary use of data in information systems generally, and particularly in data-intensive systems even without biometrics, is widely known.

Although it may seem that these concerns are specific to individuals, privacy considerations can have broad social effects beyond the individual,9 as the discussion above on universality makes clear. Privacy breaches, however well contained, can erode trust not only in the technological systems but also in the institutions that require their use. The potential for abuse of personal information can be sufficient to make certain segments of society reluctant to engage with particular technologies, systems, and institutions. Biometric systems carry their own particular challenges with respect to privacy in addition to many of those that have been identified for other information systems.

Covert Surveillance

Some recognition systems may function at a distance, making it possible to associate actions or data with a person without that person’s explicit participation. Such tracking and collection of data have privacy implications not only for the person involved but for society as a whole.

6. See NRC, IDs—Not That Easy: Questions About Nationwide Identity Systems, Washington, D.C.: The National Academies Press (2002).
7. See NRC, Who Goes There? Authentication Through the Lens of Privacy, Washington, D.C.: The National Academies Press (2003).
8. See Katherine Strandburg and Daniela Stan Raicu, eds., Privacy and Technologies of Identity: A Cross-Disciplinary Conversation, pp. 115-188 (2006).
If these capabilities were to be broadly deployed, with their existence becoming widely known and concern about their use becoming common, there would be potential distrust of the institutions that had deployed the technology. Even if knowledge of a capability is not widespread, the power that flows to those who control it may have unanticipated effects.

To date, widespread use of covert identification appears to be confined to movie plots. Contrary to popular belief, for example, the surveillance cameras used to investigate the 2005 London bombings did not perform biometric recognition as described in this report, because the cameras produced video searched by humans, not by machine. Nonetheless, several programs now pursuing recognition at a distance presage such applications. Concerns about the choice to participate are often dismissed by the biometrics industry, but they must be addressed, taking into account the target community’s cultural values, if such systems are to gain acceptance and become broadly effective, especially as they become more pervasive or if covert biometric surveillance systems mature and become widely deployed.

Individuality and Identity

Because a biometric system recognizes the body, its applications may assume and embed particular Western notions of individuality and personal identity—namely, that the individual acts in self-interest and has autonomy over his or her actions. However, in some non-Western contexts there are different views on the primacy of the individual, and more collectivist views of identity prevail. In these contexts agency and authority are not presumed to reside with individuals, and autonomous action is not assumed. An individual’s identity, in such cases, cannot easily be separated from that of the larger group of which the individual is a member. This undermines the assumptions of systems (biometric and otherwise) whose design expects that the actor is an individual (for example, accessing a bank account, doing a job, or making political decisions).10

Biometric systems are used to recognize individuals, but depending on the application and the cultural context, the broader system integrating biometric technologies may also need to recognize the position of individuals within the family, workplace, or community.11 That is, an understanding of the relational dimensions of individual action, whereby a person or group acts on behalf of another, may be required.

9. See, for instance, Priscilla Regan, Legislating Privacy: Technology, Social Values, and Public Policy, University of North Carolina Press Enduring Editions (2009).
For example, in a workplace context an assistant may be required to take action on behalf of another to check in for a flight or post personnel evaluations, or in a community context a neighbor may be enlisted to pick up prescription medication for someone unable to do so.12

10. There are microexamples of this sort of thing even within Western cultures. For example, administrative assistants are often given authority to make decisions and even sign documents for their bosses when their bosses are not present. Parents may act on behalf of their children, and spouses often are able to speak for each other.
11. Note that a well-designed biometric system will integrate an appropriate notion of the individual. The Walt Disney World entrance gate application of biometrics described in Appendix D creates an affinity group for the people entering the park using a set of tickets purchased at the same time. This association of the tickets to the group avoids unnecessary complication at the admission gate and reflects the common social context for ticket use without unduly weakening the value of biometric recognition.
12. These examples are more properly accommodated by features in the broader system in which the biometric components are integrated. The authorization system may allow delegation rather than recognize multiple individuals as the same entity.
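The point made in footnote 12, that delegation is better handled in the authorization layer around the biometric component than by enrolling multiple people as one identity, can be sketched as follows. This is a minimal illustration; all names, records, and data structures are invented, and the biometric matcher is reduced to a stub so the sketch stays self-contained.

```python
# Minimal sketch of delegation in an authorization layer wrapped
# around a biometric matcher. All names and records are invented.

# Rights granted directly to recognized individuals.
permissions = {
    "alice": {"post_evaluations"},
    "bob": {"pick_up_prescription"},
}

# Explicit delegations: (delegator, delegate, action).
delegations = {
    ("alice", "assistant_carol", "post_evaluations"),
    ("bob", "neighbor_dan", "pick_up_prescription"),
}

def biometric_match(sample):
    # Stand-in for a real matcher: here the "sample" is already the
    # enrolled identity label.
    return sample

def authorized(sample, action):
    person = biometric_match(sample)
    # Direct permission: the recognized person holds the right.
    if action in permissions.get(person, set()):
        return True
    # Delegated permission: someone who holds the right has
    # explicitly delegated this action to the recognized person.
    return any(
        action in permissions.get(delegator, set())
        for delegator, delegate, delegated_action in delegations
        if delegate == person and delegated_action == action
    )

print(authorized("assistant_carol", "post_evaluations"))      # True
print(authorized("assistant_carol", "pick_up_prescription"))  # False
```

Because each delegation is recorded explicitly, the matcher still recognizes exactly one person per sample, and the record shows who acted on whose behalf rather than treating two people as the same entity.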

Another way the identity of an individual can be conflated with that of a larger group is when biometric data are used for research. While this report urges extensive empirical research (of necessity on large groups of individuals) to achieve a stronger empirical foundation for biometric systems, such research is not without complications. When a group (such as an ethnic or racial group) is studied, whether for a medical purpose, say, or a biometric one, the associated findings about that group can raise issues. Perhaps the individuals who were studied had not consented to the use of their individual data for research and the drawing of generalizations about their group, or perhaps consent was obtained but the results of the research are not welcomed by the individuals who participated. Another complication may arise when the results of a group study are made public and have an effect on individuals who are part of that group, whether or not they participated in the study. Even though individual enrollees in the database have given their consent, does the group qua group have the power to withhold consent for the conduct of the research itself or the publication of the findings? These issues apply beyond biometrics research, but biometric recognition’s close association with individual bodies and notions of identity will inevitably heighten participants’ sensitivity to the issues and necessitate that they be addressed with special care.

In addition to the identity issues raised by cultural considerations and role-based agency and the challenges of research on socially identifiable groups, biometric technologies explore the boundary between public and private information about an individual’s body. The ability of these systems to categorize, monitor, and scrutinize persons through behavioral or biological characteristics raises the issue of the integrity of the person.
The gathering of biometric data of all kinds (for example, fingerprint images, iris scans, brain scans, DNA, face imaging) that are associated with and define the individual raises issues of the “informatized” body—a body that is represented not by human-observable anatomical and physical features but by the digital information about the body housed in databases. This has implications for how we ultimately perceive and conceive of the individual.13

Biometric data contribute to new ways of knowing and defining persons as digitized information. This information is gathered during both routine and exceptional activities such as medical examinations, performance testing in sports, and users’ interactions with biometric systems deployed in the various applications described elsewhere in this report. For some purposes, the observable physical body becomes less definitive and exclusive with respect to connoting who we are and is increasingly augmented (or even supplanted) by digital information about us. This information may ultimately figure into such decisions as who we date, mate, and hire and—conversely—who dates, mates, and hires us.

The ability to recognize people by how they look or walk or talk is a human skill critical to social order and human survival. Some individuals are able to quickly recognize people they have not seen for years. Biometric systems that are able to perform these historically human acts of recognition at high rates of speed and on a massive scale may alter underlying assumptions about the uniqueness of these human capabilities. They may also blur the previously clear boundaries between, on the one hand, the human skills and social processes that control access to social spaces and bestow rights and duties and, on the other, the technological capabilities of biometric systems that recognize faces, gaits, and voices.

LEGAL ISSUES

Comprehensive discussion of the legal issues associated with biometrics is well beyond the scope of this report. However, as with any scientific or technical issue, the assumptions made by engineers are very different from those made in the legal system. Understanding the broader context, including the legal context, within which biometric systems will operate is important to achieving effectiveness. The use of biometrics brings with it important legal issues, especially the following: remediation, reliability, and, of course, privacy. Legal precedent on the use of biometrics technology is growing, with key cases stretching back decades,14 and some recent

13. For more on this notion, see Irma van der Ploeg, “Genetics, biometrics and the informatization of the body,” Ann. Ist. Super. Sanità 43(1): 44-50 (2007), and Emilio Mordini and Sonia Massari, “Body, biometrics and identity,” Bioethics 22(9): 488-498 (2008). The former notes:

   The digital rendering of bodies allows forms of processing, of scrolling through, of datamining peoples’ informational body in a way that resembles a bodily search. Beyond mere data privacy issues, integrity of the person, of the body itself is at stake here. Legal and ethical measures and protections should therefore perhaps be modelled analogous to bodily searches, and physical integrity issues. This issue is of particular relevance with regard to a curious aspect of this new body, namely that it has become (re-)searchable at a distance. The digitized body can be transported to places far removed, both in time and space, from the person belonging to the body concerned. Databases can be remotely accessed through network connections; they are built to save information and allow retrieval over extended periods of time. A bodily search or examination used to require the presence of the person involved—a premise so self-evident that to question it would be quite ridiculous. Moreover, this requirement rendered the idea of consenting to any bodily search at least a practicable possibility. Today, however, these matters are not so obvious any more.

14. Cases include U.S. v. Dionisio (U.S. Supreme Court, 1973) and Perkey v. Department of Motor Vehicles (California Supreme Court, 1986).

Larry Hiibel’s insistence that he did not identify himself because his name was “none of the officer’s business.”38

It bears repeating that there is nothing intrinsic to biometrics that automatically aggravates Larry Hiibel’s problem. Police officers relying on their own judgment unaided by technology can easily make politically or racially motivated decisions that improperly invade privacy. In some settings, biometric systems could alleviate rather than worsen these problems, by relieving the individual law enforcement agent of the recognition task and assigning it to a common automated system that presumably performs equally and repeatedly for all agents. Such a system would, of course, be subject to all of the usual technological and environmental factors discussed elsewhere in this report that might degrade its effectiveness.

In the end, many aspects of the intersection between the Hiibel decision and biometrics come down to control over the uses of databases. Consider again a facial recognition system employed by an officer on patrol. A legislature might understandably want to reap the benefits of such a system while eliminating its misuse. Suppose the state curtailed the use of the system by saying that images of those under suspicion should be compared only to a database of convicted felons. Is there any constitutional requirement that this limit be abided by? The Supreme Court has not answered this question, although it has suggested that a state might violate due process if it does not take reasonable steps to stop unwarranted disclosures of data.

The suggestion came in the Court’s decision in Whalen v. Roe, 429 U.S. 589 (1977). A New York statute required that prescriptions for legitimate but addictive drugs be recorded on a computer database to prevent abuses such as users obtaining prescriptions from more than one doctor. Although the computer system was set up to prevent leaks and public disclosure of the identity of patients was made a crime, the system was challenged by those who feared that the information could get out, stigmatizing patients as addicts in violation of their privacy rights. The statute was upheld, with the Court noting that no evidence of information falling into the wrong hands had been presented.39 In his opinion for the Court, Justice Stevens, in an oft-quoted passage (idem at 605-606), left open the possibility that some future database might not be constitutionally acceptable if it were not adequately protected against improper use:

We are not unaware of the threat to privacy implicit in the accumulation of vast amounts of personal information in computerized data banks or other massive government files. . . . The right to collect and use such data for public purposes is typically accompanied by a concomitant statutory or regulatory duty to avoid unwarranted disclosures. Recognizing that in some circumstances that duty arguably has its roots in the Constitution, nevertheless New York’s statutory scheme . . . evidences a proper concern with, and protection of, the individual’s interest in privacy. We therefore need not, and do not, decide any question which might be presented by the unwarranted disclosure of accumulated private data whether intentional or unintentional or by a system that did not contain comparable security provisions. We simply hold that this record does not establish an invasion of any right or liberty protected by the Fourteenth Amendment.

This recognition that the due process clause of the Fourteenth Amendment might protect informational privacy is clearly important. It suggests that a government biometric database with inadequate safeguards could be successfully challenged by an individual in that database on the ground that the government had violated his or her liberty. Moreover, regardless of whether a challenge in court would succeed, it is clear that the public desires protection from unwarranted disclosures from databases of all kinds. Biometric systems will be judged by that standard.

38. Hiibel v. Sixth Judicial District Court, 542 U.S. 177, 190 (2004).
39. 429 U.S., at 601.

Kyllo v. United States

The other recent Supreme Court decision that casts light on privacy and biometrics is Kyllo v. United States.40 Federal Agent William Elliott suspected that marijuana was being grown in the home of Danny Kyllo, but he lacked the probable cause necessary to obtain a search warrant. Accordingly, Agent Elliott sat in a car in the street next to Kyllo’s home and used a thermal imager to scan the residence. The imager detected infrared radiation coming from Kyllo’s house. The pattern revealed that portions of the house were hotter than the rest of the house and the neighboring homes. Agent Elliott concluded that Kyllo was using halide lights to grow marijuana. Using this and other information, Elliott obtained a warrant for a search. Once inside Kyllo’s home, federal agents found more than a hundred marijuana plants. Kyllo, who was eventually convicted of manufacturing marijuana, appealed on the ground that using a thermal imager without probable cause constituted an unreasonable search under the Fourth Amendment. The Supreme Court, in another 5-4 decision, agreed with Kyllo. See Box 4.2 for an overview of the arguments used.

Two main lessons for biometrics emerge from Kyllo. The first is that as a technology advances and becomes widespread, our zone of constitutionally guaranteed privacy shrinks. The Court recognizes, moreover, that our society’s expectations of privacy set the baseline for laws governing searches. In California v. Ciraolo, noted in Box 4.2, the Court, in upholding aerial surveillance of a fenced backyard, explicitly said that “in an age where private and commercial flight in the public airways is routine, it is unreasonable to expect” that one’s backyard is private.41 In Kyllo the Court implied that if thermal imagers had been in common use its decision would have been different. Thus if the day comes when a biometric device that analyzes voices can function from a hundred feet or so and extend its range into a private home, and if use of that device becomes widespread, the Fourth Amendment will have little application.

If the first lesson offers a boost for biometrics, the second lesson does the opposite. All nine justices in Kyllo expressed concern about future technologies that impinge on privacy. The majority wanted to step in now; the dissent wanted to let the legislatures have the first crack at controlling future developments. On this question, the nine justices on the Supreme Court are representative of a wide span of public opinion. The ability of the government to use biometrics to, say, track people’s movements around their neighborhood or even inside their own homes will raise red flags for those concerned about privacy and government intrusion. Public assent, so crucial, is likely to be lacking unless the application of biometrics is highly justifiable and carefully circumscribed.

Finally, going from protections provided by the Constitution to the less lofty ones provided by statute, we find a series of federal and state laws that tend to control certain government databases on a sector-by-sector basis with varying success. Many of these statutes incorporate the Fair Information Practices code developed by the then-Department of Health, Education and Welfare in 1973 and incorporated into the federal Privacy Act of 1974.

40. 533 U.S. 27 (2001).
BOX 4.2
The Legal Arguments in Kyllo

Although Justice Scalia’s opinion for the Court emphasized the Constitution’s traditional protection for the privacy of the home, he recognized that visual surveillance of the home without probable cause has long been allowed. In addition, his opinion for the Court conceded that “it would be foolish to contend that the degree of privacy secured to citizens by the Fourth Amendment has been entirely unaffected by the advance of technology” (533 U.S., at 33-34). For example, the Court had permitted aerial surveillance of the backyard of a private house, even though a fence shielded the yard from the street (California v. Ciraolo, 476 U.S. 207 (1986)).

Why did the Court disallow the use of the evidence adduced by the thermal imager? The Court relied on a test that stemmed from the case of Katz v. United States (389 U.S. 347 (1967)). Katz upheld a challenge to warrantless eavesdropping by an electronic listening device placed on the outside of a telephone booth because Katz had justifiably relied on the privacy of the booth. The test held that “a Fourth Amendment search occurs when the government violates a subjective expectation of privacy that society recognizes as reasonable” (idem, at 33).

The Court recognized that the Katz test had been criticized as circular and unpredictable (idem, at 34), but it declined to revisit Katz in the setting of the thermal imaging of a private home. The Court also recognized that changing societal expectations of privacy affect Fourth Amendment rights under the Katz approach. In its opinion, the Court twice noted that the search of Kyllo’s home was being set aside in part because the thermal imaging device was “not in general public use” (idem, at 34 and 40).

A final feature of the majority’s opinion was the evident concern that more advanced technologies could reach into the privacy of the home. The Court said that “while the technology used in the present case was relatively crude, the rule we adopt must take account of more sophisticated systems that are already in use or in development” (p. 36). The Court then detailed what it had in mind (idem, at 36, note 3):

The ability to “see” through walls and other opaque barriers is a clear, and scientifically feasible, goal of law enforcement research and development. The National Law Enforcement and Corrections Technology Center, a program within the United States Department of Justice, features on its Internet Website projects that include a “Radar-Based Through-the-Wall Surveillance System,” “Handheld Ultrasound Through the Wall Surveillance,” and a “Radar Flashlight” that “will enable law enforcement officers to detect individuals through interior building walls.”

The four dissenting justices believed that Danny Kyllo had no reasonable expectation of privacy in heat emissions that were being sensed after they had left his house. In the dissenters’ view, Agent Elliott’s use of a “fairly primitive thermal imager” was no different than if he had noticed that Kyllo’s house was warmer than a nearby building because “snow melts at different rates across its surfaces” (pp. 41 and 43, Stevens, dissenting). But the dissent was not prepared to say that advanced technology should always be absolved from Fourth Amendment scrutiny because it is nothing more than an enhancement of our senses. On that subject, Justice Stevens’s dissent on p. 51 took a wait-and-see attitude:

Although the Court is properly and commendably concerned about the threats to privacy that may flow from advances in the technology available to the law enforcement profession, it has unfortunately failed to heed the tried and true counsel of judicial restraint. Instead of concentrating on the rather mundane issue that is actually presented by the case before it, the Court has endeavored to craft an all-encompassing rule for the future. It would be far wiser to give legislators an unimpeded opportunity to grapple with these emerging issues rather than to shackle them with prematurely devised constitutional constraints.

The strengths and weaknesses of these statutes have been exhaustively analyzed.42 As Hiibel, Kyllo, and the other cases mentioned above demonstrate, in many sensitive areas an individual has no statutory protections and must rely on constitutional arguments.

41. 476 U.S., at 215.
42. See, for example, Daniel J. Solove, Marc Rotenberg, and Paul M. Schwartz, Information Privacy Law, 2nd edition, pp. 523-622, New York: Aspen Publishers (2006).

The Private Sector and Privacy Rights

In Whalen v. Roe, Justice Stevens spoke of the threat to privacy posed by “massive government files.”43 But what about information held in private hands? Many Americans are just as concerned or even more so about the loss of privacy when personal information is given to their employer or demanded in an online private transaction. Suppose a bank gives its customers the option of using fingerprints rather than a password to access an ATM. The bank may believe it is enhancing privacy because a password is easily stolen. But if the bank’s fingerprint database is not adequately secured, a purposeful or accidental disclosure of that data could lead to identity theft. Suppose an employer conducts a biometric scan of its workers to facilitate access to the secure workplace when a badge is lost. If the biometric modality chosen happened to also reveal information about a worker’s health, that information could be misused by the employer, by insurance companies, or others.

43. 429 U.S., at 605.

Justice Stevens’s reference to “government” files was not inadvertent. The individual freedoms guaranteed by the U.S. Constitution are virtually all protections against government overreach. The Bill of Rights protects us from government suppression of free speech and religion, government establishment of religion, improper searches by government officials, deprivations of due process by the government, and so on. Similarly, Articles I, II, and III of the Constitution protect us from oppression by dividing government power between the federal and state governments and by dividing the federal government’s power among the legislative, executive, and judicial branches.

The infringement of privacy by a private entity, including privacy of biometric information, can be protected against by legislation. But some limitations to this approach should be noted at the outset. While federal laws can be drafted that preempt state action, most such laws leave room for complementary state regulation, leading to debates over coverage. Moreover, when the federal government does not preempt or when it is silent, state laws typically differ from state to state, raising problems for businesses seeking to comply with the law and for enforcement efforts. Finally, only constitutional protections extend to minorities who lose out in legislative battles. For example, if the private sector depriving you of a

job is in compliance with all relevant legislation, you have no recourse in court even if you believe you have been treated unfairly.44

Biometric information held by private companies is subject to constraints beyond those imposed by legislation. One such constraint might be self-regulation, which a company might impose to gain a market edge. Another might be common-law protections. An employee or a consumer might enter into a contract with a company that promises to protect biometric data and would risk breach of contract if it did not do so. Similarly, a company that failed to meet accepted standards in protecting information might be liable for negligence in a tort suit. However, continuing public concern about privacy suggests that market failures and the limits of relief achievable with retrospective common law make further legislative action likely.

There are many possible ways to regulate biometric technologies and systems that might provide needed protections for the public and build its confidence in the private sector. Some would be relevant to government databases as well. In response to the continuing concern over identity theft and fraud, some jurisdictions are considering enacting laws to prohibit the selling and sharing of an individual’s biometric data, absent consent or compelling circumstances.45

Privacy concerns should be attended to when a biometric deployment is being implemented. For example, in deciding whether to store biometric data as processed references or as source samples or images, the approach that best protects privacy should be considered. The same is true of the choice between local and centralized storage of biometric data—a choice that has significant security and privacy implications. Encryption of biometric data is often vital. Perhaps most important, the use of biometric systems should be defined and limited at the outset of a program, by legislation when appropriate. The temptation to use information for new purposes never justified to the public should be resisted. In the end, working as hard on privacy as on technical success will help assure that biometric programs maintain or even enhance individual autonomy as they achieve social goals. Failure to do so will result in biometric programs that undermine American values while potentially bringing about their own failure due to public resistance.

44. The NRC report Who Goes There? Authentication Through the Lens of Privacy (2003) found that personal information held by the private sector is afforded weaker statutory protections than information held by the federal or state governments and that much detailed personal information in the hands of businesses is available for reuse and resale to private third parties or to the government, with little in the way of legal standards or procedural protections.
45. Illinois passed such a law in 2008 (740 ILCS 14/), the Biometric Information Privacy Act. Available at http://www.ilga.gov/.
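The deployment choice discussed above—storing processed references rather than source samples or images—can be sketched in a few lines of Python. This is an illustrative sketch only: the block-averaging "feature extraction" and all function names (`extract_reference`, `enroll`) are invented for the example, not drawn from the report or from any deployed system.

```python
# Sketch: enrollment that keeps only a compact, processed reference
# and discards the raw sample, limiting what a database breach exposes.
# The feature extraction here (block averaging) is a toy stand-in.

from statistics import mean

def extract_reference(raw_sample: list, block: int = 4) -> list:
    """Reduce a raw sample (e.g., pixel intensities) to a small
    template by averaging fixed-size blocks of values."""
    return [mean(raw_sample[i:i + block])
            for i in range(0, len(raw_sample), block)]

def enroll(database: dict, user_id: str, raw_sample: list) -> None:
    # Store only the derived reference; the raw image/sample is not
    # retained anywhere in the system.
    database[user_id] = extract_reference(raw_sample)

db = {}
enroll(db, "user-001", [float(i % 7) for i in range(16)])
print(db["user-001"])  # 4-element template, not the 16-value raw sample
```

Even a processed reference remains personally identifying information, so this choice reduces, but does not eliminate, the privacy stakes of the stored data.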

DATA POLICIES

Biometric data are personally identifying information.46 Thus biometric systems have the potential to collect not only pattern recognition information captured by sensors, but also other information that can be associated with the biometric data themselves or with data records already contained within the system. Depending on the biometric system, this information could include time and location of use, identification data (for example, name or Social Security number, and so on), and, in some cases, medical measurements (for example, glucose levels).47 Additional data may be created when a decision is generated by the system (positive or negative recognition) that may be stored or shared with another system. Given the increasing volumes and kinds of data associated with a biometric system, data policies are important for answering a variety of questions that arise regarding sharing, storage, integrity, and confidentiality of the biometric system data.

Biometric systems are often associated with an identity system. Biometric data may be correlated across identity systems to recognize individuals. The data associated with an individual collected by different organizations using the same biometric modality may be similar but almost certainly not identical, because the sample acquisition for enrollment will vary (see Chapter 1 for elaboration on sources of uncertainty and variation in biometric systems). An earlier NRC report addressed a set of questions and issues that arise particularly in the context of an identity system. For the most part, they apply to biometric recognition systems. The questions are reprinted for reference.48

• What is the purpose of the system? Possible purposes of an identity system include expediting and/or tracking travel; prospectively monitoring individuals’ activities in order to detect suspicious acts; retrospectively identifying perpetrators of crimes.

• What is the scope of the population to whom an “ID” would be issued and, presumably, recorded in the system? How would the identities of these individuals be authenticated?

46. The Data Protection Working Party is the independent European Union advisory body on data protection and privacy, established under Article 29 of Directive 95/46/EC. It determined that in most cases biometric data are personal data and can in all cases be considered as “information relating to a natural person.” Available at http://ec.europa.eu/justice_home/fsj/privacy/docs/wpdocs/2003/wp80_en.pdf.
47. While biometric data captured by systems using the most common modalities do not contain medical information, some emerging technologies capture traits such as heartbeat patterns, which directly convey medical data.
48. NRC, IDs—Not That Easy: Questions About Nationwide Identity Systems, Washington, D.C.: The National Academies Press, pp. 9-11 (2002).

• What is the scope of the data that would be gathered about individuals participating in the system and correlated with their system identity? “Identification systems,” despite the name, often do much more than just identify individuals; many identity systems use IDs as keys to a much larger collection of data. Are these data identity data only (and what is meant by identity data)? Or are other data collected, stored, and/or analyzed as well? With what confidence would the accuracy and quality of these data be established and subsequently determined?

• Who would be the user(s) of the system (as opposed to those who would participate in the system by having an ID)? If the public sector or government will be the primary user, what parts of the government will be users, in what contexts, and with what constraints? In what setting(s) in the public sphere would such a system be used? Would state and local governments have access to the system? Would the private sector be allowed to use the system? What entities in the private sector would be allowed to use the system? Who could contribute, view, and/or edit data in the system?

• What types of use would be allowed? Who would be able to ask for an ID, and under what circumstances? Assuming that there are datasets associated with an individual’s identity, what types of queries would be permitted (e.g., “Is this person allowed to travel?” “Does this person have a criminal record?”)? Beyond simple queries, would analysis and data mining of the information collected be permitted? If so, who would be allowed to do such analysis and for what purpose(s)?

• Would participation in and/or identification by the system be voluntary or mandatory? In addition, would participants have to be aware of or consent to having their IDs checked (as opposed to, for example, being subjected to surreptitious facial recognition)?

• What legal structures protect the system’s integrity as well as the data subject’s privacy and due process rights, and which structures determine the liability of the government and relying parties for system misuse or failure?

Information-Sharing Issues

With increased use of biometrics, there is legitimate concern about how information stored in biometric databases might be shared. Sharing can extend the administrative reach of biometric findings. It could also afford valuable facts for management and research studies. However, such sharing presents significant privacy and confidentiality challenges. Systematic approaches in law, regulation, or technology to resolve the tension between the evident demand to share more biometric information and the cautions of privacy and technology are lacking. In particular, no comprehensive federal policy exists to guide sharing information from biometric databases.49

The sharing of information from biometric databases occurs when the records in one database are integrated with those in other databases and when data are disseminated directly to users. Most biometric systems today involve databases to which biometric samples captured from a population of individuals have been submitted and are later searched to find matching enrolled individuals with the same biometric characteristics. In some systems, a sample is compared to the reference biometric data associated with the claimed identity (which means maintaining a database of reference data, even if it is not searched at each use). A national system of ID cards or passports based on biometric data would rely on a database.

The costs of electronically capturing biometric samples and storing the data continue to drop, as do the costs of data integration and dissemination, and the technical ability to do so is expanding as well, serving to increase interest in storing biometric information in databases. The benefits of storage capacity and data integration and sharing could include the following:

• Administrative efficiencies. One of the many applications might be giving a homeless shelter the ability to check whether an applicant has a criminal record; another might be allowing law enforcement to check whether a particular individual used a certain facility at a certain time.

• Business purposes. Sports teams could collaborate to examine the usage patterns and demographics of their season ticket holders, perhaps to avoid issuing more tickets than they have seats or to look for joint advertising opportunities.

• Research uses. Biometric data are needed for testing biometric system performance and for developing new systems and features.50

49. In addition to the lack of policy guidance, there can be logistical and practical challenges to information sharing. For example, even within the DOD structure, the GAO has found gaps in biometrics sharing across missions. See http://www.gao.gov/products/GAO-09-49. For a more thorough treatment of information sharing and privacy, not just with respect to biometrics but also generally, see Peter Swire, Privacy and information sharing in the war on terrorism, Villanova Law Review 51:260 (2006).
50. A. Ross, S. Crihalmeanu, L. Hornak, and S. Schuckers, A centralized web-enabled multimodal biometric database, Proceedings of the 2004 Biometric Consortium Conference (BCC), September 2004.

Although processing and sharing biometric information can bring many benefits, there are also concerns that stem from the ease with which biometrics technology integrates with database technology, increasing the likelihood of privacy violations. For this reason, it has been suggested

that privacy must be designed into the systems rather than added on at a later time.51

Currently there does not seem to be much of an ingrained culture of privacy protection for biometric databases, beyond that which exists for information systems generally or state and local efforts such as the Illinois statute mentioned previously. With the exception of some agencies, mainly statistical agencies, there is little historical tradition of maintaining the confidentiality of biometric databases. This is in spite of the fact that if biometric data associated with an individual fall into the wrong hands, that individual could be at risk of identity theft.52 Moreover, sharing of information from biometric databases raises questions of (1) whether the information would be used for purposes not intended or inconsistent with the purposes of the original biometric application and (2) what information about intended uses of the system should be disclosed to users and how that information should be presented.

51. “Biometric technology is inherently individuating and interfaces easily to database technology, making privacy violations easier and more damaging. If we are to deploy such systems, privacy must be designed into them from the beginning, as it is hard to retrofit complex systems for privacy.” Available at http://www.eff.org/Privacy/Surveillance/biometrics/.
52. As noted in Chapter 1, access to sensitive systems should rely not just on the presentation of the correct biometric sample but rather on the security of the full process. Concerns about identity theft arise because not all biometric systems offer adequate security, and some could be vulnerable to attack by impersonators.

Protection of Biometric Data

The protection of personal information is not the only reason for protecting biometric data. Another is the desire to prevent third parties from linking records between systems, determining the enrolled users in a system, or discovering a doppelganger (an individual who is a close match for an enrolled user). The encryption of biometric data stored in centralized databases or on a personal device such as a smart card, coupled with appropriate security measures to limit probing of the database, can be effective in countering these threats. Encryption and database protection, however, are insufficient to protect against identity theft by an attacker impersonating an individual by mimicking his or her biometric traits.

It is natural to draw a parallel between password-based authentication and biometric verification of identity. In a password-based system, a secret password is presented to confirm a claimed identity; in a biometric verification system, the trait is presented to confirm the claimed identity. It would appear that exposure of an individual’s biometric data is comparable to disclosure of a secret password, with the added complication that while it is easy to replace a password, the same is not true for a biometric trait. Further, biometric data are exposed not only when data leak from unencrypted or poorly protected databases—they can, at least in principle, be derived from publicly observable human traits.

The submission of a password and the presentation of a biometric trait are not, however, analogous. As discussed in Chapter 1, the security value of a biometric verification system stems from measures surrounding the presentation and capture of the biometric trait. These measures cope with public disclosure of an individual’s biometric data by verifying that a presented trait is genuine and not an artifact employed by an attacker. However, when the sample capture is remote and unattended, as would be the case for most systems associated with computer access, there are few technical safeguards and minimal protection against the use of artifacts. In these circumstances, one would not expect a biometric recognition system to provide reliable protection against a premeditated attack.

SUMMARY

Although biometric systems can be beneficial, the potentially lifelong association of biometric traits with an individual, their potential use for remote detection, and their connection with identity records may raise social, cultural, and legal concerns. Such issues can affect a system’s acceptance by users, its performance, or the decision on whether to use it in the first place. Biometric recognition also raises important legal issues of remediation, authority, and reliability, and, of course, privacy. Ultimately, social, cultural, and legal factors are critical and should be taken into account in the design, development, and deployment of biometric recognition systems.
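The parallel drawn in the section on protecting biometric data between submitting a password and presenting a biometric trait can be made concrete. The sketch below is illustrative only: the distance measure, threshold value, and sample data are invented for the example. It captures the two properties emphasized in the text: password checking is an exact comparison of a replaceable secret, while biometric verification is an approximate comparison of noisy measurements, and a compromised trait cannot be reissued the way a password can.

```python
# Sketch: exact password matching vs. threshold-based biometric matching.
# All values and the threshold are hypothetical, for illustration only.

import hashlib
import math

def password_ok(stored_hash: str, presented: str) -> bool:
    # Exact match: a single differing character fails, and a leaked
    # password can simply be replaced with a new one.
    return hashlib.sha256(presented.encode()).hexdigest() == stored_hash

def biometric_ok(reference: list, sample: list, threshold: float = 1.0) -> bool:
    # Approximate match: two captures of the same trait are similar but
    # almost never identical, so the decision uses a distance threshold.
    # A disclosed trait, unlike a password, cannot be revoked and reissued.
    return math.dist(reference, sample) <= threshold

stored = hashlib.sha256(b"s3cret").hexdigest()
print(password_ok(stored, "s3cret"))     # exact secret: accepted
print(password_ok(stored, "s3creT"))     # one character off: rejected

reference = [0.2, 0.8, 0.5]              # enrolled template (toy values)
fresh_capture = [0.25, 0.78, 0.52]       # same person, slightly different reading
print(biometric_ok(reference, fresh_capture))
```

The threshold is the system designer's lever: lowering it rejects more impostors but also more genuine users, which is why the uncertainty and variation discussed in Chapter 1 have no counterpart in password systems.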