Engaging Privacy and Information Technology in a Digital Age

Part II
The Backdrop for Privacy

Chapter 2 (“Intellectual Approaches and Conceptual Underpinnings”) provides a primer on privacy as an intellectual concept from the perspective of three disciplines—philosophy, economics, and sociology. Philosophical approaches to the study of privacy have centered on the elucidation of the basic concept and the normative questions around whether privacy is a right, a good in itself, or an instrumental good. Economic approaches to the question have centered on the value, in economic terms, of privacy, both in its role in the information needed for efficient markets and in the value of information as a piece of property. Sociological approaches to the study of privacy have emphasized the ways in which the collection and use of personal information have reflected and reinforced the relations of power and influence between individuals, groups, and institutions within society. That there is such a multiplicity of legitimate intellectual approaches to the study of privacy suggests that no one discipline captures, or can capture, the richness and texture of its various nuances, and what appear at first to be very slight or subtle differences turn out to have deep implications in practice. Chapter 3 (“Technological Drivers”) examines the vast changes enabled by information technology by exploring the ramifications of increased connectivity and ubiquity, data gathering, ever-growing computational power and storage capacity, and more-sophisticated sensors; what architecture choices mean for social values such as privacy; and what kind of control (or lack of control) technology enables for individuals. Such change has dramatically altered the on-the-ground realities within which
2
Intellectual Approaches and Conceptual Underpinnings

The concept of privacy has a long intellectual history. Many have written attempting to characterize privacy philosophically, sociologically, psychologically, and legally. This chapter provides a brief sampling of some major intellectual perspectives on privacy, along with some analysis of how these different perspectives relate to one another. These perspectives illustrate some common themes while demonstrating the difficulty inherent in characterizing privacy, no matter what intellectual frameworks or tools are used. Note also that this chapter—as well as this report—focuses on privacy as it relates to information. The informational dimensions of privacy are clearly central, but at the same time some have argued that the concept of privacy must be broader than that; for example, the U.S. Supreme Court has held that a right to choose an abortion or to receive information about contraceptives is founded on privacy protections implied in the Constitution. The discussion below is not intended to address these non-informational dimensions of privacy and mentions them only in passing as they may help to illuminate some of the issues surrounding the notion of privacy and the ethical and moral dimensions of the general privacy debate.
2.1 PHILOSOPHICAL THEORIES OF PRIVACY

2.1.1 A Philosophical Perspective

Philosophical works on privacy generally focus on two central tasks.1 The first is to attempt to answer the question, What is privacy?, by giving a precise definition or analysis of the concepts involved. The second is to explain why we value privacy and why privacy should be respected by others, whether those others are individuals, organizations, or governments. It is useful to distinguish these two tasks by calling the first a descriptive account of privacy and the second a prescriptive or normative account of privacy. These tasks are conceptually distinct, and maintaining the distinction between them allows a rich set of questions to be asked. For example, a descriptive analysis does not need to justify the ethical questions surrounding privacy as part of the analysis of the concept. Once a descriptive analysis of privacy has been accomplished, the normative aspects of the concept can then be discussed based on that description, and the discussion may well—and properly—include ethical questions. Further, normative accounts of privacy may depend on subtle differences in the descriptive analysis that are either stated or presumed, and that can be masked if the two tasks are intertwined. So, for example, a descriptive account of privacy may show that there are cases where privacy conflicts with other values. Such a conflict may lead to the decision that not all violations of privacy are to be avoided. But if descriptive and prescriptive accounts of privacy were merged, such an analysis might be precluded from the outset since our prescriptive account might hold that all reductions of privacy count as moral violations. Any descriptive account of privacy will have to correspond to the perceptions and intuitions of most people about clear cases of the concept.
So, for example, an analysis of the concept that held that privacy is a binary property that an individual either has or has totally lost would not be acceptable, as it violates the commonly held intuition about degrees of privacy and the loss of some privacy. A descriptive account that adequately deals with clear cases can then be used to elucidate less clear cases, and can then be used as a base for prescriptive discussions about privacy. So, for example, starting from a descriptive analysis of privacy that acknowledges that there are levels or degrees to privacy, it is then possible to address the prescriptive question of where a particular loss or

1 Note that the discussion in Section 2.1.1 draws not only on the writings of professional philosophers, but also on other work that undertakes explicit conceptual analysis devoted to exploring what privacy is, what a right to privacy consists in, and why we ought to protect it.
gain of privacy is good or bad, problematic or not, and for whom and for what time period and under what conditions. Much of the philosophical work on privacy has been stimulated by contentious activities in realms such as law, policy, institutional practices, and specific or novel applications of technology. In such a context, the prescriptive aspects of privacy are the most commonly discussed. What descriptive work has been done is often in the context of clarifying the basic concepts as part of a discussion of a particular normative view.

2.1.2 Privacy as Control Versus Privacy as Restricted Access

The most common descriptive accounts of privacy reflect two basic views: (1) privacy as restrictions on the access of other people to an individual’s personal information and (2) privacy as an individual’s control over personal information,2 such as information on health status. While related, these two views can also be seen as distinct. Political science professor emeritus Alan Westin’s take on privacy was (and remains) an influential example from the group that defines privacy in terms of control. Indeed, Westin can be credited with developing one of the very first formulations of so-called informational privacy in the late 1960s, and his definition of privacy has proven quite useful to scholars as society has moved more fully into the information age: “Privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.”3 In Privacy and Freedom, Westin took an interdisciplinary approach to analyzing the nature and functions of privacy, its roles in society, and new technologies for surveillance, as well as the push for new privacy standards and protections.
As part of the overall theory put forth in the book, Westin defines four distinct functions of (or reasons for wanting) privacy—personal autonomy, emotional release, self-evaluation, and limited/protected communication—as well as four distinct states of privacy—solitude, freedom from observation; intimacy, closeness among a small group of people; anonymity, freedom from being identified in public settings; and reserve, the freedom to withdraw from communication. As he describes it, these states are subject to constant change, depending on one’s personal needs and choices about what to reveal and what not to reveal at a given time. Indeed, for Westin, the importance of this control over information disclosure, “both to [an] individual’s self-development

2 A second use of “control” refers to an external agency with the power to compel a person to disclose personal information. “Control” in this section is not used in this sense.

3 See Alan Westin, Privacy and Freedom, Atheneum, New York, 1967, p. 7.
and to the exercise of responsible citizenship, makes the claim to privacy a fundamental part of civil liberty in democratic society.”4 Westin’s model of informational privacy and his ideas regarding privacy more generally have informed most subsequent scholarly discussions of privacy and have often acted as a crucial jumping-off point for the work of other researchers.5 In more recent years, much of Westin’s work has been somewhat less philosophical in nature, involving surveys to assess the attitudes of U.S. consumers and Internet users toward privacy, often in association with Harris Interactive (previously Louis Harris and Associates)—work that has earned him both praise and criticism. In contrast to traditional university-based research, this work has often been done in cooperation with commercial interests. In his survey research, Westin has suggested that Americans hold differing views regarding the value of certain aspects of privacy. For example, based on his analysis of surveys over a number of years, he groups consumers into one of three categories:

- Privacy fundamentalists—those who reject trading information for special offers, who prefer only opt-in approaches, and who would prefer to see more legislative approaches to privacy protection;
- Privacy unconcerned—consumers who are comfortable with trading their information for almost any consumer value; and
- Privacy pragmatists—people who take time to weigh the benefits and risks associated with providing their personal information.

Westin goes on to suggest that the “privacy pragmatists” are the largest group, at over 60 percent of U.S. consumers, and are thus a group deserving of focused attention from businesses and policy makers. His survey work has also shown that of the four states of privacy he identified in his earlier research, intimacy is the most important state to Americans, followed by solitude, reserve, and anonymity (in that order).
Westin’s empirical research also led him to identify three phases in the state of U.S. consumer attitudes toward privacy: 1961 to 1979, 1980 to 1989, and 1990 to 2002.6 He notes that privacy has gone from “a modest matter for a minority of consumers in the 1980s to an issue of high

4 Alan F. Westin, “Social and Political Dimensions of Privacy,” Journal of Social Issues 59(2):431-453, 2003.

5 For an additional perspective on the impact of Westin’s privacy work, see Stephen Margulis, “On the Status and Contributions of Westin’s and Altman’s Theories of Privacy,” Journal of Social Issues 59(2):411-429, 2003.

6 These three phases, as well as the baseline period leading up to them (1945 to 1960), are described in detail in Westin, “Social and Political Dimensions of Privacy,” 2003.
intensity expressed by more than [75 percent] of American consumers in 2001,”7 citing driving factors like increasing distrust of institutions and fears surrounding the potential abuse of technology. Analyses of privacy in terms of individuals’ control over their personal information are far more common than those based on access; perhaps because the control view is so widely assumed, it is rarely backed by systematic argument. Arguments based on the “privacy as control” perspective tend to concentrate on the extent of what must be controlled for privacy to be maintained. At one extreme is the position that says that all that needs to be controlled is information about the individual per se (e.g., health status). More general analyses include other aspects of an individual’s life, such as control over the receipt of information (such as information concerning birth control, argued in Griswold v. Connecticut, 381 U.S. 479 (1965)) or control over one’s body (the crux of Roe v. Wade, 410 U.S. 113 (1973)). Theorists supporting the access-based definition of privacy have offered explicit explanations for their analysis, based on the ability of that analysis to explain situations that cannot be explained by a control-based theory. One such class of situations is exemplified by a person choosing to reveal intimate details of his life on national television. On the basis of the “privacy as control” theory, he could not reasonably claim that he had less privacy as a result of doing so, because he chose to reveal those details. However, on the basis of the “privacy as restricted access” theory, he would have less privacy, because the information given on the show had become accessible to the entire audience. American University philosopher Jeffrey Reiman has presented a case in which the control theory would say that privacy had been violated but our intuitions say that no such violation has occurred.
He points out that societies often regulate certain public activities by requiring, for example, that bathrooms be used or that clothing be worn in public. Such requirements diminish individuals’ control over the information about themselves that they can make available to others, yet the laws are also seen as privacy enhancing. Thus control over information cannot be the exclusive defining characteristic of privacy. However, laws and related expectations regarding disclosure and non-disclosure of personal information do limit the access to information by others, which is just the sort of thing that the access-based models of privacy would predict. Although the issue of whether privacy is best seen as a question of access or a question of control is the primary disagreement in much of the philosophical literature, it is hardly the only point of dispute. Another

7 Alan Westin, 2001, Testimony before the House Committee on Energy and Commerce’s Subcommittee on Commerce, Trade, and Consumer Protection, May 28, available at http://energycommerce.house.gov/107/hearings/05082001Hearing209/Westin309.htm.
basic question has to do with what aspects of a person’s life are relevant to the question of privacy. Ruth Gavison, professor of human rights at the Hebrew University in Jerusalem, defines privacy along three axes: the first has to do with access to information about the person (secrecy), the second has to do with knowledge of the person’s identity (anonymity), and the third has to do with access to the physical proximity of the person (solitude).8 University of Pennsylvania professor of law Anita Allen’s early work distinguished among informational privacy (limited access to information, confidentiality, secrecy, anonymity, and data protection), physical privacy (limited access to persons, possessions, and personal property), and decisional privacy (limited intrusion into decision making about sex, families, religion, and health care).9

2.1.3 Coherence in the Concept of Privacy

The wide variation in the accounts of privacy has led some commentators to question the whole endeavor of giving a descriptive account of privacy. For example, Yale professor of law Robert Post writes, “Privacy is a value so complex, so entangled in competing and contradictory dimensions, so engorged with various and distinct meanings, that I sometimes despair whether it can be usefully addressed at all.”10 Some of the commentators who question the descriptive accounts of privacy have argued for a general skepticism toward the coherence of the concept of privacy, while others have claimed that the concept is best understood not as a single concept but rather as a combination of other, more basic rights.
A radical example of this second approach is the analysis of MIT philosopher Judith Jarvis Thomson, who argues that privacy is neither distinctive nor useful.11 In fact, says Thomson, privacy is not a coherent concept in itself, but rather a catchall that reduces, in various cases, to more primitive concepts that are more easily understood, such as property, contracts, and bodily rights. Treating privacy as a concept distinct from these others simply confuses the issues surrounding the more basic concepts. A less radical approach admits that privacy is a distinct concept but argues that it is impossible to analyze clearly because of the excess baggage that the concept has accumulated over time. Indeed, this baggage is seen to make the overall concept of privacy incoherent. This approach

8 Ruth Gavison, “Privacy and the Limits of Law,” Yale Law Journal 89:421-471, 1980.

9 Anita Allen, “Constitutional Law and Privacy,” in Dennis Patterson, ed., A Companion to Philosophy of Law and Legal Theory, Blackwell, Oxford, England, 1996.

10 Robert C. Post, “Three Concepts of Privacy,” Georgetown Law Journal 89(6):2087, 2001.

11 Judith Jarvis Thomson, “The Right to Privacy,” Philosophy & Public Affairs 4:295-314, 1975.
suggests reducing and simplifying the distinct claims, interests, and values that common usage covers in the term “privacy” to a few basic concepts (or a single, much reduced, concept). Some aspects of the general concept of privacy are reducible to more fundamental concepts, and some aspects do not belong within the rubric of privacy and should be dropped altogether. In this approach, what remains should be a coherent and useful concept of privacy, even if it does not reflect current use of the term. An even weaker form of reductionism is willing to accept a multidimensional concept of privacy, made up of several non-reducible parts that relate to each other in some fundamental way. For example, Judith DeCew argues that privacy covers information, physical and mental access to oneself, and decision-making autonomy.12 These concepts are distinct, and therefore the concept of privacy is in some way made up of others that are more basic. However, DeCew argues that there is a coherence to these concepts that makes the notion of privacy important in its own way; the whole is greater than the sum of the parts. New York University professor of culture and communication Helen Nissenbaum’s approach to privacy is based on the idea that social norms governing information flow depend on context.13 A judgment that a given action or practice violates privacy is a function of the context in which the activity takes place, what type of information is in question, and the social roles of the people involved. Social contexts, such as health care, education, commerce, religion, and so on, are governed by complex social norms, including informational norms that specify the principles of transmission governing the flow of information. These norms prescribe how certain types of information about specific individuals acting in specific roles ought to flow from party to party.
In a health care context, for example, one such norm might specify that patients are obliged to share health-related information with physicians who are treating them; another norm for that context specifies that physicians may not release that information to anyone else. In a context of friendship, friends share information not out of any obligation but through choice, and the flow of information is generally reciprocal. The term “contextual integrity” is applied to those circumstances in which informational norms are respected. According to Nissenbaum’s theory of contextual integrity, these informational norms establish an expectation against which certain actions and practices are evaluated. In particular, they provide a guide to evaluating

12 Judith DeCew, In Pursuit of Privacy: Law, Ethics, and the Rise of Technology, Cornell University Press, Ithaca, N.Y., 1997.

13 See, for example, Helen Nissenbaum, “Privacy as Contextual Integrity,” Washington Law Review 79(1):119-158, 2004.
new socio-technical practices, which are judged to respect or violate contextual integrity on the basis of several key factors:

- The governing context;
- Whether the new practice changes the types of information at issue;
- Whether the new practice causes a shift in who is involved as senders, receivers, and subjects of this information; and
- Whether new patterns of information flow fit with the relevant principles of transmission.

When new technologies or socio-technical practices are disturbing from a privacy standpoint, the reason is that they are seen as violating standing informational norms. Under certain circumstances, norms may be revisited and revised: critical events, such as the September 11 attacks, may demand a revision of informational norms governing public spaces; the outbreak of an epidemic may demand that norms of information flow in the medical health care context be revisited; the emergence of online dating might also result in a shift of the norms governing information flow. Nevertheless, a systematic and comprehensive strategy for evaluating whether change should be resisted or embraced starts with the important first step of revealing how, if at all, standing norms have been breached or are threatened. The theory of contextual integrity augments dominant approaches to privacy by introducing a middle layer of social analysis missing from theories analyzing privacy as a fundamental human right or value and also from theories placing privacy interests among a whole set of moral and non-moral interests to be weighed and traded in the course of political decision making.
By bringing social norms to the foreground through contexts, roles, and transmission principles, this social approach adds a dimension of thought that can help address some of the critical challenges posed by new practices, and can help illuminate many of the intractable puzzles and stand-offs regularly faced in traditional approaches to privacy, for example, cultural and historical differences. Gary T. Marx, MIT professor emeritus, offers a related approach emphasizing the importance of defining terms and identifying contextual variation.14 Examining context can guide empirical inquiries and help identify assumptions often buried within normative arguments. Among

14 Gary T. Marx, Windows into the Soul: Surveillance and Society in an Age of High Technology, University of Chicago Press, forthcoming, and various articles by Gary Marx on privacy, equality, soft surveillance, borders, the public and the private, ethics, varieties of personal information and anonymity, available at http://garymarx.net.
the most relevant factors in his situational or contextual approach are the following:

- Keeping distinct (yet noting relationships among) the family of concepts encompassing personal information—e.g., privacy and publicity, public and private, personal and impersonal information, surveillance, secrecy, confidentiality, anonymity, pseudo-anonymity, and identifiability, and being clear about which concepts (whether as factors to be explained or to be approached as social issues) are being discussed;
- The nature of the means or techniques used (contrast the unaided and aided senses—e.g., directly overhearing a conversation with intercepting electronic communication on the Internet);
- The goals (contrast collecting information to prevent a health epidemic with the spying of the voyeur);
- The location (contrast personal information obtained from the home with that gathered on a public street);
- The type of information-protective border that is crossed (contrast crossing the borders of the physical person, as in a body cavity search, with crossing, via aggregation, the de facto borders resulting from the disaggregation of information in scattered databases);
- The direction of a personal border crossing (compare taking information from a person, as with drug testing or a photo, with imposing information on a person, as with spam or subliminal sounds);
- The type of personal information involved (contrast a general characteristic such as gender or age with information that may be stigmatizing, intimate, and/or offer a unique and locatable identity);
- The form of the data itself (contrast a third party’s written account of an event with the same event revealed by a hidden video camera);
- The roles and relationships among those involved (contrast parents gathering personal information on children or an ill family member with an employer or merchant gathering information on employees or customers); and
- The conditions of information collection, involving, e.g., informed consent, adequate security, reciprocity, sharing in the benefits, and so on.

This approach yields a number of hypotheses about the patterning of normative expectations with respect to privacy behavior and helps give structure to the intuitive understandings of privacy mentioned below in this chapter. The multidimensional nature of personal information and the related contextual and situational variation prevent reaching any simple conclusions about how crossing personal informational borders will, or should, be judged. Thus, the intellectual approach is one of contingency rather than absolutism.
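Nissenbaum's account above has a natural structural reading: an informational norm fixes a context, a type of information, the roles of sender and receiver, and a principle of transmission, and a concrete flow respects contextual integrity only if some standing norm licenses exactly that pattern. The sketch below is a hypothetical illustration of that reading; the class and function names are invented for this example and are not a formalism from the literature.

```python
from dataclasses import dataclass

# Illustrative sketch only; the names here are invented, not Nissenbaum's
# own formalism.

@dataclass(frozen=True)
class Norm:
    """An informational norm: in a given social context, a sender in one
    role may transmit a type of information about a subject to a receiver
    in another role, under a stated principle of transmission."""
    context: str        # e.g., "health care", "friendship"
    info_type: str      # e.g., "health status"
    sender: str         # role of the transmitting party
    receiver: str       # role of the receiving party
    transmission: str   # e.g., "confidentiality", "reciprocity"

def respects_contextual_integrity(flow: Norm, norms: list[Norm]) -> bool:
    """A concrete flow respects contextual integrity if some standing
    norm licenses exactly this pattern of transmission."""
    return flow in norms

# Standing norms for the health care context described in the text.
norms = [
    Norm("health care", "health status", "patient", "physician", "confidentiality"),
]

ok = Norm("health care", "health status", "patient", "physician", "confidentiality")
bad = Norm("health care", "health status", "physician", "marketer", "sale")

print(respects_contextual_integrity(ok, norms))   # True
print(respects_contextual_integrity(bad, norms))  # False
```

On this reading, judging a new socio-technical practice amounts to asking whether it introduces flows, like `bad` above, that no standing norm licenses; revising norms after a critical event corresponds to editing the `norms` list.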
and are disproportionate (i.e., perception of risks will vary for near-term and longer-term consequences). In relating this to an individual’s online behavior, he suggests that individuals want to protect their privacy in principle but put off to some time in the future the effort required, rather than taking immediate action. Combining these two sets of factors reveals broader consequences for individual privacy protection. Acquisti suggests that individuals tend to dismiss possible future consequences of revealing personal information for an immediate reward, but also lack complete information to grasp the magnitude of the risk—because each instance of revealing personal information can be linked together, resulting in “a whole that is more than the sum of its parts.” Acquisti concludes that more attention will have to be paid to behavioral responses to privacy protections, rather than focusing on protecting privacy solely through informational awareness and industry self-regulation. Acquisti’s conclusions have deep privacy implications. For example, one principle of fair information practice (see Box 1.3) is that of choice and consent. But the principle itself is silent on whether the appropriate choice should be opt-in or opt-out. Under the canons of traditional economic analysis and the rational actor model, these regimes are essentially equivalent (under the assumption that there are no transaction costs associated with either choice). But there are impassioned arguments about whether opt-in or opt-out consent better reflects the willingness of data subjects to provide information without limitations on its secondary use—and these arguments are rooted in a realization that in the real world, the default choice makes a huge difference in the regime that will end up governing most people.
Those who advocate opt-out regimes know that most people will not take the trouble to opt out, and thus they can be presumed to “want” to allow information to be collected. Those who advocate opt-in regimes know that most people will not take the trouble to opt in, and that their privacy (in this case, their immunity to having information collected) will thus be protected. Behavioral economics also calls into question how to determine the value that consumers place on their personal information. Hui and Png suggest that one important factor is that information owners are unlikely to take fully into account the benefit of their information to the parties seeking it.35 This has both a societal consequence (in that overall welfare is reduced as these parties are unable to exploit that information) and personal consequences (in that they may thus exclude

35 Kai-Lung Hui and I.P.L. Png, “The Economics of Privacy,” in Terry Hendershott, ed., Handbook of Information Systems and Economics, Elsevier, forthcoming.
themselves from being offered certain goods or services that they might desire). In addition, information owners are likely to attach too high a price to their personal information, which might excessively raise the barrier to potential buyers of the information, and there is often a significant discrepancy between what consumers report being willing to pay to protect their personal information and what they are willing to accept to allow the use of their personal information. Hui and Png go on to suggest that consumers often attach a high price to their personal information when discussing privacy and personal information, but often readily part with their personal information “in exchange for even small rewards or incentives.”36 Finally, the findings of behavioral economics have implications for the various critiques of how fair information principles have been implemented in the United States.37 At the heart of these critiques is the oft-expressed concern that the Federal Trade Commission (FTC), the government agency with responsibility for the protection of certain privacy rights, at least for consumers in the United States, has compressed these fair information practices into a limited construct referred to as “notice and choice.” Most often, the provision of notice is satisfied by largely unintelligible industrial sector “boilerplate” language that is not subject to review by the FTC, and the default choice is framed in terms of consumers’ opting out of some subcomponent as standard business practice, unless specific legislation establishes informed affirmative consent as the default.38 Behavioral economics thus implies that a default opt-out choice will not result in a regime that would be affirmatively chosen under a default opt-in choice.

36 See Section 6 in Hui and Png, “The Economics of Privacy,” forthcoming.
37 See, for example, Marc Rotenberg, “Fair Information Practices and the Architecture of Privacy (What Larry Doesn’t Get),” Stanford Technology Law Review, Volume 1, 2001 (online journal available at http://stlr.stanford.edu/STLR/Articles/01_STLR_1/index.htm); Robert Gellman, “Does Privacy Law Work?,” in P. Agre and M. Rotenberg, eds., Technology and Privacy: The New Landscape, MIT Press, Cambridge, Mass., 1997; and R. Clarke, “Beyond the OECD Guidelines,” Xamax Consultancy Pty Ltd., 2000.

38 The special and highly contested case of telecommunications policy, in which customer proprietary network information (CPNI) could be released only with explicit consumer permission, is one such example. The Federal Communications Commission (FCC) issued an order interpreting the “approval” requirements in February 1998 (available at http://www.fcc.gov/Bureaus/Common_Carrier/Orders/1998/fcc98027.txt). Under the FCC’s rule, telephone companies must give customers explicit notice of their right to control the use of their CPNI and obtain express written, oral, or electronic approval for its use. The rule was unsuccessfully challenged by a potential competitor, U.S. West, 182 F.3d 1223 (10th Cir. 1999).
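The difference the default makes can be seen in a deliberately crude toy model. Assume, purely for illustration, that a fixed fraction of people never act on whatever default they are given; the collected share of the population is then determined almost entirely by the regime rather than by anyone's underlying preferences. The 95 percent inertia figure below is an invented assumption for the sketch, not a survey result.

```python
def collected_fraction(regime: str, inertia: float = 0.95) -> float:
    """Fraction of a population whose information ends up collected,
    assuming only the non-inert minority ever changes the default.

    The 0.95 default for `inertia` is an illustrative assumption.
    """
    if regime == "opt-out":
        return inertia        # the inert majority never opts out
    if regime == "opt-in":
        return 1.0 - inertia  # only the non-inert minority opts in
    raise ValueError(f"unknown regime: {regime}")

for regime in ("opt-out", "opt-in"):
    print(regime, round(collected_fraction(regime), 2))
# prints:
#   opt-out 0.95
#   opt-in 0.05
```

Under the rational-actor assumption of zero transaction costs the two regimes would be equivalent; the only thing this toy model adds is the inertia term, which is exactly the wedge the behavioral-economics argument above describes.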
2.3 SOCIOLOGICAL APPROACHES

Sociological approaches to the study of privacy have emphasized the ways in which the collection and use of personal information may reflect and reinforce the relationships of power and influence between individuals, groups, and institutions within society. This emphasis on power relationships is an important factor characterizing the behavior of institutional actors in modern societies.39 There are also important distinctions to be drawn between the work of those scholars who focus on structural or institutional relationships and the work of those who focus on the cognitive, affective, behavioral, and social process responses of individuals. Surveillance from this perspective is generally understood to refer to the systematic observation of individuals undertaken in an effort to facilitate their management and control. Some scholars concerned with surveillance emphasize the ways in which surveillance technology has changed over time, while others focus on the ways in which old and new surveillance techniques are used in the exercise of control over individuals in their roles as employees, consumers, and citizens. Still others emphasize the ways in which the use of surveillance techniques becomes a commonplace activity within more and more varied organizational and institutional contexts. Still others focus on interpersonal uses among families, friends, and strangers and on the role of the mass media.40 An important but distinct body of work within this scholarly tradition is one that focuses on a variety of surveillance techniques used by 39 Within sociology there are different perspectives on surveillance. More technologically oriented observers might prefer to talk about the “capture” of transaction-generated information (see, for example, Philip E. Agre, “Surveillance and Capture: Two Models of Privacy,” The Information Society 10(2):101-127, 1995).
On the other hand, David Lyon (in Surveillance Society: Monitoring Everyday Life, Open University Press, 2001) argues that “rather late in the day sociology started to recognize surveillance as a central dimension of modernity, an institution in its own right, not reducible to capitalism, the nation-state, or even bureaucracy.” Gary T. Marx (“Seeing Hazily, But Not Darkly, Through the Lens: Some Recent Empirical Studies of Surveillance Technologies,” Law and Social Inquiry 30(2):339-399, 2005) notes limitations of an exclusive focus on power and control as the defining elements. There are also goals involving protection, documentation, planning, strategy, and pleasure (e.g., as entertainment and to satisfy curiosity). 40 One type that remains surprisingly unstudied and unregulated, yet arouses strong social concern, is the voyeur secretly gathering data. See, for example, C. Calvert, Voyeur Nation: Media, Privacy and Peering in Modern Culture, Westview, Boulder, Colo., 2000; and a “true fiction” case in which the protagonist Tom Voire engages in, or is the victim of, more than 100 kinds of questionable personal information practices, as described in Gary T. Marx, “Forget Big Brother and Big Corporation: What About the Personal Uses of Surveillance Technology as Seen in Cases Such as Tom I. Voire?” Rutgers Journal of Law & Urban Policy 3(4):219-286, 2006.
the police and security forces.41 However, debates continue among scholars of surveillance who contest evidence and claims about the extent to which there is a meaningful possibility of limiting, resisting, or reversing the trend toward more complete social control and domination through surveillance. Another important distinction within sociology is between a focus on the macro level of grand theory examining structural or institutional relationships and a focus on the cognitive, affective, and behavioral responses of individuals who are subject to surveillance. This latter approach makes it easier to test theories empirically. Among those taking a macro-level approach, David Lyon also brings a long historical view to his assessment of the role of surveillance in society. He integrates a number of insights from an evolving cultural studies tradition to study what is referred to as the postmodern condition. He provides examples to illustrate the ways in which “dataveillance,” or the analysis of transaction-generated data, reduces the need for access to a physical or material body in order to gather information and intelligence about the past, present, or future behavior of data subjects.42 Priscilla M. Regan has proposed framing privacy as a collective concern rather than as an individualistic one.43 She gives three reasons for arguing that privacy has value not only to individuals but also to society in general. First, she believes that all individuals value some degree of privacy and have some common perceptions about privacy. Second, she notes that privacy is a value that supports democratic political systems. Third, she asserts that technology and market forces make it hard for any one person to have privacy without all persons having a similar minimum level of privacy.
She also conceptualizes personal information as a resource that is available to multiple parties, is difficult to prevent others from using, and is subject to degradation as a result of overuse. Thus, she argues, personal information as a resource is subject to some of the same kinds of excessive-use pressures as are resources such as clean air and edible ocean fish. Privacy can be framed as preventing the overuse of personal information, and thus she argues that public policies to support privacy would have much in common with policies that address the issue of common-pool resources such as air and fish. Scholars working from within a Marxist or neo-Marxist tradition 41 Richard Ericson and Kevin Haggerty, Policing the Risk Society, University of Toronto Press, 1997; and Gary T. Marx, Undercover: Police Surveillance in America, University of California Press, 1988. 42 Lyon, Surveillance Society, 2001. 43 P.M. Regan, “Privacy as a Common Good in the Digital World,” Information, Communication and Society 5(3, September 1):382-405, 2002.
engage the development of surveillance techniques as a response of capitalists to continuing crises of over- and underproduction. As the logic of capital is extended to more and more activities, surveillance facilitates coordination and control.44 Anthony Giddens, who extends the analyses of surveillance, combines the grand theories begun by Michel Foucault and Max Weber with empirical data in an effort to cross these distinctions.45 Scholars influenced by Giddens have focused on surveillance as an aspect of rationalization within government46 and corporate bureaucracies.47 In terms of understanding how individuals perceive surveillance processes, Irwin Altman’s work reflects a psychological emphasis and contributes not only concepts and measures of desired and realized levels of privacy, but also behavioral insights that are useful in cataloging the resources available to individuals that allow them to exercise more or less control over the boundaries between themselves and others.48 Finally, and in addition to his identification and assessment of important trends, critical events, and important distinctions between segments of the population (Section 2.1.2), Westin’s analyses have provided insights into the ways in which privacy policies emerge in response to public concerns. But critics have suggested that for a number of reasons, including the influence of corporate sponsors and a concern with assessing the public’s response to the issue of the day, Westin’s empiricism has stretched its theoretical foundations beyond its useful limits.49 The work within sociology on surveillance (and, by extension, its relationship to privacy) considers the effect of surveillance on individuals and society.
These effects can occur even in the absence of actual surveillance if the individuals believe that they are being observed—these are intangible effects in the sense that they affect individuals’ states of mind, but are no less real. This feeds into the worries about the impact of tech- 44 Frank Webster and Kevin Robins, Information Technology: A Luddite Analysis, Ablex, 1986. 45 See, for example, Anthony Giddens, The Nation State and Violence: Volume Two of a Contemporary Critique of Historical Materialism, Polity Press, Cambridge, Mass., 1985. This work integrates an understanding of bureaucracy from Max Weber (see Reinhard Bendix, Max Weber, an Intellectual Portrait, Doubleday, 1960) and panoptic surveillance from Michel Foucault (Discipline and Punish: The Birth of the Prison, Vintage Books, 1979). 46 Christopher Dandeker, Surveillance Power and Modernity, Polity Press, Cambridge, Mass., 1990. 47 Oscar H. Gandy, Jr., The Panoptic Sort: A Political Economy of Personal Information, Westview, 1993. 48 Stephen T. Margulis, “On the Status and Contribution of Westin’s and Altman’s Theories of Privacy,” Journal of Social Issues 59(2):411-429, 2003. 49 Oscar H. Gandy, Jr., “The Role of Theory in the Policy Process: A Response to Professor Westin,” pp. 99-106 in Charles Firestone and Jorge Reina Schement, eds., Towards an Information Bill of Rights and Responsibilities, The Aspen Institute, 1995.
Descriptive identification plays an important role in the identification of groups, and such identification can enable and justify discriminatory actions that reduce the space for autonomous action that people might otherwise enjoy. Such issues are often discussed under the heading of group privacy. Group privacy is not a new concept,52 although most of the attention that has been paid to the concept in the past has been focused on the freedom of association. Privacy scholars have begun to argue that group privacy should also be understood as a right of self-determination that is increasingly limited by the use of transaction-generated information to define group membership on the basis of behavioral rather than biological attributes. Developments in genetic research may or may not establish linkages between biology and behavior, but the emerging concern with behavioral identification is different. The reason is that behavioral identification or characterization of groups can be done covertly. Persons who are members of groups defined on the basis of biological markers (age, race, gender) have some opportunity to mobilize and petition for rights on the basis of a common identity. Persons who are members of groups that have been identified and defined on the basis of statistical analyses are less likely to be aware of the identities of others in their group, even if they manage to discover the nature of their own classification. These ascriptive groups often have names that are used only by the organizations that produce them.53 Because the existence of such groups is rarely made public, and little effort is made to inform group members of their status, they are less likely to organize politically to press for their rights.
To the extent that members of social groups that have traditionally been victims of discrimination are also members of statistically derived groups of “disfavored others,” it seems likely that their social position will reflect the impact of cumulative disadvantage.54 An important cluster of concerns is inherent in an information-based ability to discriminate against persons as individuals or as members of groups in the name of increasing efficiency and economic competitiveness 52 Edward J. Bloustein, Individual and Group Privacy, Transaction Publications, 1978. Also relevant is S. Alpert, “Protecting Medical Privacy: Challenges in the Age of Genetic Information,” Journal of Social Issues 59(2):301-322, 2003. 53 The names for communities produced by users of geo-demographic targeting techniques may be interpreted by the general public, but those names are rarely made public because people so identified are often angered when they learn of their identification. Examples might be a group characterized as “shotgun and pickups” or “Volvo-driving Gucci lovers.” 54 The concept of cumulative disadvantage is examined in considerable detail in R.M. Blank, M. Dabady, and C.F. Citro, eds., Measuring Racial Discrimination, The National Academies Press, Washington, D.C., 2004.
or meeting other socially desirable goals.55 The ability to discriminate has the potential to reinforce differences in power and privilege within social systems. Discrimination is an exercise in social sorting, and such sorting relies heavily on personal information. Sorting occurs for all kinds of reasons—some benign and some insidious—but social sorting can mean that individuals in some categories will face more limitations on their opportunities to choose than will individuals in other categories. On the other hand, the protection of privacy means that some personal information will not be available for such sorting. As a matter of national policy, it is illegal to make certain decisions (e.g., on employment, housing) on the basis of categorical distinctions regarding race, gender, religion, and certain aspects of lifestyle. At the same time, many believe that attention to these very factors is appropriate as a tool to overcome historic and persistent disadvantage. But the conflicts are clear, and the sociological and cultural challenge is thus to determine what kinds of information about persons are and are not appropriate to use and under what conditions. New contested means of classification and identification are likely to continually appear and to be challenged as values conflict. There is no simple answer to such questions, but discussion of and openness with respect to the factors used in social sorting for public policy purposes are of the utmost importance.

2.4 AN INTEGRATING PERSPECTIVE

The discussion above of philosophical, economic, and sociological perspectives on privacy indicates that understanding privacy in the information age requires consideration of a variety of approaches, methods, and ideas. Taken as a whole, the privacy literature is a cacophony, suggesting that trying to define privacy in the abstract is not likely to be a fruitful exercise.
Indeed, a number of varied and sometimes incommensurate perspectives on privacy were reflected within the committee. But the committee also found common ground on several points among its members, its witnesses, and the literature. The first point is that privacy touches a very broad set of social concerns related to the control of, access to, and uses of information—this report emphasizes privacy as access to and control over information about individuals. An interesting question is whether privacy is a concept relevant to information about groups of people, although of course for many 55 See, for example, Frederick Schauer, Profiles, Probabilities, and Stereotypes, Harvard University Press, Cambridge, Mass., 2003, and Bernard E. Harcourt, “Rethinking Racial Profiling: A Critique of the Economics, Civil Liberties, and Constitutional Literature, and of Criminal Profiling More Generally,” University of Chicago Law Review 71(Fall):1275-1381, 2004.
purposes a group can be treated as an individual (e.g., corporations are given the legal right to make contracts and to sue and be sued as legal persons). Second, to the extent that privacy is a “good thing to have,” it sometimes conflicts with other good things to have. Thus, needs for privacy must sometimes be balanced with other considerations—complaints of some privacy advocates or privacy foes about excessive or inappropriate balancing notwithstanding. How this balance is to be achieved is often the center of the controversy around privacy. Complicating the effort to find the appropriate balance are the tendency to confuse the needs of privacy with other values that might be tied to privacy but are, in fact, distinct from it, and the differential impact of policies on various social groups and over time. For example, the privacy of personal health information is often related to concerns about discriminatory access to health care; this point is discussed further in Box 7.1 in Chapter 7. Third, privacy in context is much more understandable than privacy in the abstract. As the literature illustrates, agreement on a broad analytical definition of privacy in the abstract is difficult if not impossible. But discussions of the privacy implications of specific events and practices are easier to understand and discuss. One approach to grounding a discussion of privacy in context is the use of anchoring vignettes (Box 2.2). The anchoring vignette technique can be useful for understanding the impact of any given technology on privacy by posing a set of grounded, scenario-specific questions that can be answered with and without the presence of that technology. It can also be helpful for understanding public perceptions of privacy in various contexts. In this report, the technique is used in multiple ways to illustrate and to help unpack intuitions about privacy in different contexts.
Knowing a context for privacy discussions does not result in an overarching theoretical definition of privacy. Nor does it represent an agreement about the level of privacy that is appropriate in any given situation. However, knowing the relevant dimensions of privacy and what “more” or “less” privacy might mean in the specific context of each dimension does clarify the discussion, and the anchoring vignette technique is one useful approach to obtaining such knowledge. The context-sensitive nature of privacy makes it clear that questions about privacy necessarily imply specifying privacy “from whom,” “about what,” “for what reasons,” and “under what conditions.” For example, a set of possible privacy violators might include one’s employer; family; friends, acquaintances, and neighbors; researchers; businesses; and government. Associated with each of these potential violators is an “about what”—a (different) set of information types that might arise with any given possible privacy violator. For example, in the context of an employer
BOX 2.2 The Anchoring Vignette Approach to Grounding Discussions of Privacy

Developed by committee member Gary King and others, an anchoring vignette is a brief description of a specific situation involving personal information.1 Organized into related sets in which a range of privacy considerations are manifest, the vignettes help to collect, articulate, and organize intuitions about privacy in a more precise and empirical fashion; clarify assumptions about privacy; empirically document views on privacy; and serve as a good tool for illustrating, expressing, and communicating existing concepts of privacy. Vignettes have been used extensively in conducting actual surveys and in helping to develop survey instruments, but in the context of this report they help to define concepts so that all participants in a privacy discussion have the same frame of reference. Although they are not intended to suggest a particular policy to adopt, anchoring vignettes help to provide a lingua franca for privacy, and so they may be of use to citizens in attaining a better understanding of public policy regarding privacy. The vignettes form a continuum along which various policy scenarios can be placed and in that sense can help to frame questions that might be asked about any given policy. To illustrate, consider the issue of privacy with respect to criminal charges. A set of useful vignettes might be as follows:

[Jonathan] was arrested on charges of assault and battery last year. He lives in a county that stores records of criminal charges at the police headquarters, where there is no public access.

1 Gary King, Christopher J.L. Murray, Joshua A. Salomon, and Ajay Tandon, “Enhancing the Validity and Cross-cultural Comparability of Survey Research,” American Political Science Review 98(1):191-207, 2004, available at http://gking.harvard.edu/files/abs/vign-abs.shtml.
See also Gary King and Jonathan Wand, “Comparing Incomparable Survey Responses: New Tools for Anchoring Vignettes,” Political Analysis, forthcoming, 2007, available at http://gking.harvard.edu/files/abs/c-abs.shtml. Extensive examples and other information can be found at the Anchoring Vignettes Web site, at http://gking.harvard.edu/vign/. The committee thanks Dan Ho and Matthew Knowles, who assisted in the development of material on anchoring vignettes presented to the committee during its open data-gathering sessions. as a possible privacy violator, one might be concerned about surveillance of work or about drug testing. By contrast, in the context of friends, acquaintances, and neighbors as possible privacy violators, one might be concerned about personal secrets, nudity, sex, medical information, and invasiveness.56 The kinds of social roles and relationships involved are as central as 56 In thinking through who might be a possible privacy violator, it also helps to consider parties with whom one might be willing to share information. Although in some sense, one is the complement of the other, in practice the complement is more likely to be fuzzy, with zones of more gray and less gray rather than sharp boundaries between black and white.
[Monali] was arrested on charges of assault and battery last year. She lives in a county that maintains all records of criminal charges for public inspection at the county courthouse.

[David] was arrested on charges of assault and battery last year. He lives in a county that maintains all records of criminal charges at the county courthouse for public inspection and in an electronic database, to which any police officer or county official has access.

[Andrea] was arrested on charges of assault and battery last year. She lives in a county that posts all criminal charges on the Internet. The Web page includes pictures and detailed profiles of all arrested.

Our intuitions about privacy in each of these situations reflect our answers to questions such as: How much privacy does the individual have in this situation? Does David have more privacy than Andrea? And so on. We can also ask how much privacy the individual should be granted in the situation. One way to think about these vignettes is to imagine being asked a survey question about each vignette or even about yourself: How much privacy [does “Name”/do you] have? (a) unlimited, (b) a lot, (c) moderate, (d) some, (e) none? The imagined survey context helps to make the examples concrete and clarifies how they are to be read. Although such vignettes are often used for survey research, defining privacy from the bottom up does not involve administering a survey or necessarily asking these questions of others. For each set of anchoring vignettes (denoting privacy in one specific context), different people will have different views about what thresholds delineate levels of privacy below which violations should be considered undesirable, unethical, illegal, or immoral. Agreement on normative issues like these will always be difficult to achieve.
The anchoring vignette-based approach to privacy thus does not resolve all normative issues, but it helps to clearly define the playing field. Note also that vignettes can be modified to illustrate different scenarios. For example, the above scenario can be modified by substituting “convicted” for “arrested on charges” and “convictions” for “charges.” Such changes might well cause at least some people to reevaluate their answers. the goals, location, type of technology, and data involved, and the conditions under which personal information is collected and used. Indeed, what constitutes privacy, what information should be private, and what individuals or institutions are posing potential threats to that privacy are all questions subject to considerable debate. A related set of questions involves the circumstances under which privacy can be seen to go too far. Under some conditions the failure to discover or reveal personal information can be harmful socially (e.g., in the case of potential exposure to deadly contagious diseases or a person with a history of violent and abusive behavior).
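To make the anchoring idea in Box 2.2 concrete, the sketch below encodes the five-category response scale and locates a respondent’s self-assessment relative to that respondent’s ratings of the four vignettes. This is only a hypothetical illustration of the simplest nonparametric intuition behind anchoring vignettes; King and Wand’s actual estimators are statistical models, and the names, function, and example data here are the author’s assumptions, not taken from the committee’s work.

```python
# Hypothetical sketch of the nonparametric intuition behind anchoring
# vignettes: a respondent rates each vignette and his or her own
# situation on the same ordinal scale; the self-rating is then located
# relative to the vignette anchors, which makes responses comparable
# across people who use the scale differently.

SCALE = ["unlimited", "a lot", "moderate", "some", "none"]  # ordinal privacy scale

def score(label):
    """Ordinal score for a response category (0 = most privacy)."""
    return SCALE.index(label)

def anchored_position(self_label, anchor_labels):
    """Count how many vignette anchors the respondent rates as having
    at least as much privacy as his or her own situation; a higher
    count places the respondent lower on the privacy continuum."""
    s = score(self_label)
    return sum(1 for a in anchor_labels if score(a) <= s)

# One respondent's ratings of the Jonathan/Monali/David/Andrea
# vignettes, ordered from most to least privacy, plus a
# self-assessment of "some" privacy:
anchors = ["a lot", "moderate", "some", "none"]
print(anchored_position("some", anchors))  # -> 3 (below three of the four anchors)
```

Because the position is defined relative to the respondent’s own anchor ratings, two people who use the verbal scale very differently (one calling everything “a lot,” another calling everything “some”) can still be compared on the same footing.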