
Part II
The Backdrop for Privacy

Chapter 2 (“Intellectual Approaches and Conceptual Underpinnings”) provides a primer on privacy as an intellectual concept from the perspective of three disciplines—philosophy, economics, and sociology. Philosophical approaches to the study of privacy have centered on the elucidation of the basic concept and the normative questions around whether privacy is a right, a good in itself, or an instrumental good. Economic approaches to the question have centered around the value, in economic terms, of privacy, both in its role in the information needed for efficient markets and in the value of information as a piece of property. Sociological approaches to the study of privacy have emphasized the ways in which the collection and use of personal information have reflected and reinforced the relations of power and influence between individuals, groups, and institutions within society. That there is such a multiplicity of legitimate intellectual approaches to the study of privacy suggests that no one discipline captures, or can capture, the richness and texture of its various nuances, and what appear at first to be very slight or subtle differences turn out to have deep implications in practice.

Chapter 3 (“Technological Drivers”) examines the vast changes enabled by information technology by exploring the ramifications of increased connectivity and ubiquity, data gathering, ever-growing computational power and storage capacity, and more-sophisticated sensors; what architecture choices mean for social values such as privacy; and what kind of control (or lack of control) technology enables for individuals. Such change has dramatically altered the on-the-ground realities within which the concept of privacy is necessarily embedded. The net result is that new kinds of data are being collected and stored in vast quantities and over long periods of time, and obscurity or difficulty of access are increasingly less practical as ways of protecting privacy. Finally, because information technologies are continually dropping in cost, technologies for collecting and analyzing personal information from multiple, disparate sources are increasingly available to individuals, corporations, and governments.

Chapter 4 (“The Legal Landscape in the United States”) provides a detailed overview of the legal landscape in the United States. The foundation of privacy law in the United States is, of course, the U.S. Constitution, and the First, Fourth, and Ninth Amendments of the Constitution have important implications for privacy, and specifically constrain the applicability of federal and state law regarding certain privacy-related issues. In addition, there are a large number of federal laws (and regulations and executive orders) that address privacy in one form or another, so many in fact that what emerges is an ad hoc patchwork of law that lacks strong coherence or unifying themes. State laws regarding privacy and common law and private causes of action (privacy torts) add to this patchwork but do not rationalize it. Finally, in a global economy, the need to conduct commerce across international borders suggests that the United States cannot ignore foreign law regarding privacy—and foreign law regarding privacy is often much more comprehensive than domestic law.

Chapter 5 (“The Politics of Privacy Policy in the United States”) addresses the question of how privacy policy is made. Privacy advocates use public opinion as a lever to generate concern and action. In addition, a number of reports over the past several decades have served to catalyze public action. Judicial decisions—important in interpreting existing law—also provide the most important, and perhaps the only, forum in which competing goals and values are explicitly weighed and balanced against each other. Finally, corporate policy regarding privacy—even if it is established by default or inattention—has potentially enormous impact on the privacy actually enjoyed by individuals, because such policies often delve into areas of privacy that are minimally addressed by existing law.

Suggested Citation:"Part II The Backdrop for Privacy, 2 Intellectual Approaches and Conceptual Underpinnings." National Research Council. 2007. Engaging Privacy and Information Technology in a Digital Age. Washington, DC: The National Academies Press. doi: 10.17226/11896.
×

2
Intellectual Approaches and Conceptual Underpinnings

The concept of privacy has a long intellectual history. Many have written attempting to characterize privacy philosophically, sociologically, psychologically, and legally. This chapter provides a brief sampling of some major intellectual perspectives on privacy, along with some analysis of how these different perspectives relate to one another. These perspectives illustrate some common themes while demonstrating the difficulty inherent in characterizing privacy, no matter what intellectual frameworks or tools are used.

Note also that this chapter—as well as this report—focuses on privacy as it relates to information. The informational dimensions of privacy are clearly central, but at the same time some have argued that the concept of privacy must be broader than that; for example, the U.S. Supreme Court has held that a right to choose an abortion or to receive information about contraceptives is founded on privacy protections implied in the Constitution. The discussion below is not intended to address these non-informational dimensions of privacy and mentions them only in passing as they may help to illuminate some of the issues surrounding the notion of privacy and the ethical and moral dimensions of the general privacy debate.


2.1
PHILOSOPHICAL THEORIES OF PRIVACY

2.1.1
A Philosophical Perspective

Philosophical works on privacy generally focus on two central tasks.1 The first is to attempt to answer the question, What is privacy?, by giving a precise definition or analysis of the concepts involved. The second is to explain why we value privacy and why privacy should be respected by others, whether those others are individuals, organizations, or governments.

It is useful to distinguish these two tasks by calling the first a descriptive account of privacy and the second a prescriptive or normative account of privacy. These tasks are conceptually distinct, and maintaining the distinction between them allows a rich set of questions to be asked.

For example, a descriptive analysis need not address the ethical questions surrounding privacy as part of the analysis of the concept. Once a descriptive analysis of privacy has been accomplished, the normative aspects of the concept can then be discussed based on that description, and the discussion may well—and properly—include ethical questions.

Further, normative accounts of privacy may depend on subtle differences in the descriptive analysis that are either stated or presumed, and that can be masked if the two tasks are intertwined. So, for example, a descriptive account of privacy may show that there are cases where privacy conflicts with other values. Such a conflict may lead to the decision that not all violations of privacy are to be avoided. But if descriptive and prescriptive accounts of privacy were merged, such an analysis might be precluded from the outset since our prescriptive account might hold that all reductions of privacy count as moral violations.

Any descriptive account of privacy will have to correspond to the perceptions and intuitions of most people about clear cases of the concept. So, for example, an analysis of the concept that held that privacy is a binary property that an individual either has or has totally lost would not be acceptable, as it violates the commonly held intuition about degrees of privacy and the loss of some privacy. A descriptive account that adequately deals with clear cases can then be used to elucidate less clear cases and to serve as a base for prescriptive discussions about privacy. So, for example, starting from a descriptive analysis of privacy that acknowledges that there are levels or degrees to privacy, it is then possible to address the prescriptive question of whether a particular loss or gain of privacy is good or bad, problematic or not, and for whom and for what time period and under what conditions.

1

Note that the discussion in Section 2.1.1 draws not only on the writings of professional philosophers, but also on other work that undertakes explicit conceptual analysis devoted to exploring what privacy is, what a right to privacy consists in, and why we ought to protect it.

Much of the philosophical work on privacy has been stimulated by contentious activities in realms such as law, policy, institutional practices, and specific or novel applications of technology. In such a context, the prescriptive aspects of privacy are the most commonly discussed. What descriptive work has been done is often in the context of clarifying the basic concepts as part of a discussion of a particular normative view.

2.1.2
Privacy as Control Versus Privacy as Restricted Access

The most common descriptive accounts of privacy reflect two basic views: (1) privacy as restrictions on the access of other people to an individual’s personal information and (2) privacy as an individual’s control over personal information2 such as information on health status. While related, these two views can also be seen as distinct.

Political science professor emeritus Alan Westin’s take on privacy was (and remains) an influential example from the group that defines privacy in terms of control. Indeed, Westin can be credited with developing one of the very first formulations of so-called informational privacy in the late 1960s, and his definition of privacy has proven quite useful to scholars as society has moved more fully into the information age: “Privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.”3

In Privacy and Freedom, Westin took an interdisciplinary approach to analyzing the nature and functions of privacy, its roles in society, and new technologies for surveillance, as well as the push for new privacy standards and protections. As part of the overall theory put forth in the book, Westin defines four distinct functions of (or reasons for wanting) privacy—personal autonomy, emotional release, self-evaluation, and limited/protected communication—as well as four distinct states of privacy—solitude, freedom from observation; intimacy, closeness among a small group of people; anonymity, freedom from being identified in public settings; and reserve, the freedom to withdraw from communication. As he describes it, these states are subject to constant change, depending on one’s personal needs and choices about what to reveal and what not to reveal at a given time. Indeed, for Westin, the importance of this control over information disclosure, “both to [an] individual’s self-development and to the exercise of responsible citizenship, makes the claim to privacy a fundamental part of civil liberty in democratic society.”4 Westin’s model of informational privacy and his ideas regarding privacy more generally have informed most subsequent scholarly discussions of privacy and have often acted as a crucial jumping-off point for the work of other researchers.5

2

A second use of “control” refers to an external agency with the power to compel a person to disclose personal information. “Control” in this section is not used in this sense.

3

See Alan Westin, Privacy and Freedom, Atheneum, New York, 1967, p. 7.

In more recent years, much of Westin’s work has been somewhat less philosophical in nature, involving surveys to assess the attitudes of U.S. consumers and Internet users toward privacy, often in association with Harris Interactive (previously Louis Harris and Associates)—work that has earned him both praise and criticism. In contrast to traditional university-based research, this work has often been done in cooperation with commercial interests.

In his survey research, Westin has suggested that Americans hold differing views regarding the value of certain aspects of privacy. For example, based on his analysis of surveys over a number of years, he groups consumers into one of three categories:

  • Privacy fundamentalists—those who reject trading information for special offers, who prefer only opt-in approaches, and who would prefer to see more legislative approaches to privacy protection;

  • Privacy unconcerned—consumers who are comfortable with trading their information for almost any consumer value; and

  • Privacy pragmatists—people who take time to weigh the benefits and risks associated with providing their personal information.

Westin goes on to suggest that the “privacy pragmatists” are the largest group, at over 60 percent of U.S. consumers, and are thus a group deserving of focused attention from businesses and policy makers. His survey work has also shown that of the four states of privacy he identified in his earlier research, intimacy is the most important state to Americans, followed by solitude, reserve, and anonymity (in that order).

Westin’s empirical research also led him to identify three phases in the state of U.S. consumer attitudes toward privacy: 1961 to 1979, 1980 to 1989, and 1990 to 2002.6 He notes that privacy has gone from “a modest matter for a minority of consumers in the 1980s to an issue of high intensity expressed by more than [75 percent] of American consumers in 2001,”7 citing driving factors like increasing distrust of institutions and fears surrounding the potential abuse of technology.

4

Alan F. Westin, “Social and Political Dimensions of Privacy,” Journal of Social Issues 59(2):431-453, 2003.

5

For an additional perspective on the impact of Westin’s privacy work, see Stephen Margulis, “On the Status and Contributions of Westin’s and Altman’s Theories of Privacy,” Journal of Social Issues 59(2):411-429, 2003.

6

These three phases, as well as the baseline period leading up to them (1945 to 1960), are described in detail in Westin, “Social and Political Dimensions of Privacy,” 2003.

Analyses of privacy in terms of individuals’ control over their personal information are far more common than those based on access, and, perhaps for that reason, are rarely backed by systematic argument. Arguments based on the “privacy as control” perspective tend to concentrate on the extent of what must be controlled for privacy to be maintained. At one extreme is the position that says that all that needs to be controlled is information about the individual per se (e.g., health status). More general analyses include other aspects of an individual’s life, such as control over the receipt of information (such as information concerning birth control, argued in Griswold v. Connecticut, 381 U.S. 479 (1965)) or control over one’s body (the crux of Roe v. Wade, 410 U.S. 113 (1973)).

Theorists supporting the access-based definition of privacy have offered explicit explanations for their analysis, based on the ability of that analysis to explain situations that cannot be explained by a control-based theory. One such class of situations is exemplified by a person choosing to reveal intimate details of his life on national television. On the basis of the “privacy as control” theory, he could not reasonably claim that he had less privacy as the result of doing so because he chose to reveal those details. However, on the basis of the “privacy as restricted access” theory, he would have less privacy, because the information given on the show had become accessible to the entire audience.

American University philosopher Jeffrey Reiman has presented a case in which the control theory would say that privacy had been violated but our intuitions say that no such violation has occurred. He points out that societies often regulate certain public activities by requiring, for example, that bathrooms be used and that clothing be worn in public. Such requirements diminish individuals’ control over what information about themselves they make available to others, yet the laws are also seen as privacy enhancing. Thus control over information cannot be the exclusive defining characteristic of privacy. Laws and related expectations regarding disclosure and non-disclosure of personal information do, however, limit others’ access to that information, which is just what access-based models of privacy would predict.

Although the issue of whether privacy is best seen as a question of access or a question of control is the primary disagreement in much of the philosophical literature, it is hardly the only point of dispute. Another basic question has to do with what aspects of a person’s life are relevant to the question of privacy. Ruth Gavison, professor of human rights at the Hebrew University of Jerusalem, defines privacy along three axes: the first has to do with access to information about the person (secrecy), the second with knowledge of the person’s identity (anonymity), and the third with access to the physical proximity of the person (solitude).8 University of Pennsylvania professor of law Anita Allen’s early work distinguished among informational privacy (limited access to information, confidentiality, secrecy, anonymity, and data protection), physical privacy (limited access to persons, possessions, and personal property), and decisional privacy (limited intrusion into decision making about sex, families, religion, and health care).9

7

Alan Westin, 2001, Testimony before the House Committee on Energy and Commerce’s Subcommittee on Commerce, Trade, and Consumer Protection, May 8, available at http://energycommerce.house.gov/107/hearings/05082001Hearing209/Westin309.htm.

2.1.3
Coherence in the Concept of Privacy

The wide variation in the accounts of privacy has led some commentators to question the whole endeavor of giving a descriptive account of privacy. For example, Yale professor of law Robert Post writes, “Privacy is a value so complex, so entangled in competing and contradictory dimensions, so engorged with various and distinct meanings, that I sometimes despair whether it can be usefully addressed at all.”10 Some of the commentators who question the descriptive accounts of privacy have argued for a general skepticism toward the coherence of the concept of privacy, while others have claimed that the concept is best understood not as a single concept but rather as a combination of other, more basic rights.

A radical example of this second approach is the analysis of MIT philosopher Judith Jarvis Thomson, who argues that privacy is neither distinctive nor useful.11 In fact, says Thomson, privacy is not a coherent concept in itself, but rather a catchall that reduces, in various cases, to more primitive concepts that are more easily understood, such as property, contracts, and bodily rights. Treating privacy as a concept distinct from these others simply confuses the issues surrounding the more basic concepts.

A less radical approach admits that privacy is a distinct concept but argues that it is impossible to analyze clearly because of the excess baggage that the concept has accumulated over time. Indeed, this baggage is seen to make the overall concept of privacy incoherent. This approach suggests reducing and simplifying the distinct claims, interests, and values that common usage covers in the term “privacy” to a few basic concepts (or a single, much reduced, concept). Some aspects of the general concept of privacy are reducible to more fundamental concepts, and some aspects do not belong within the rubric of privacy and should be dropped altogether. In this approach, what remains should be a coherent and useful concept of privacy, even if it does not reflect current use of the term.

8

Ruth Gavison, “Privacy and the Limits of Law,” Yale Law Journal 89:421-471, 1980.

9

Anita Allen, “Constitutional Law and Privacy,” in Dennis Patterson, ed., A Companion to Philosophy of Law and Legal Theory, Blackwell, Oxford, England, 1996.

10

Robert C. Post, “Three Concepts of Privacy,” Georgetown Law Journal 89(6):2087, 2001.

11

Judith Jarvis Thomson, “The Right to Privacy,” Philosophy & Public Affairs 4:295-314, 1975.

An even weaker form of reductionism is willing to accept a multidimensional concept of privacy, made up of several non-reducible parts that relate to each other in some fundamental way. For example, Judith DeCew argues that privacy covers information, physical and mental access to oneself, and decision-making autonomy.12 These concepts are distinct, and therefore the concept of privacy is in some way made up of others that are more basic. However, DeCew argues that there is a coherence to these concepts that makes the notion of privacy important in its own way; the whole is greater than the sum of the parts.

New York University professor of culture and communication Helen Nissenbaum’s approach to privacy is based on the idea that social norms governing information flow depend on context.13 A judgment that a given action or practice violates privacy is a function of the context in which the activity takes place, what type of information is in question, and the social roles of the people involved. Social contexts, such as health care, education, commerce, religion, and so on, are governed by complex social norms, including informational norms that specify the principles of transmission governing the flow of information.

These norms prescribe how certain types of information about specific individuals acting in specific roles ought to flow from party to party. In a health care context, for example, one such norm might specify that patients are obliged to share health-related information with physicians who are treating them; another norm for that context specifies that physicians may not release that information to anyone else. In a context of friendship, friends share information not out of any obligation but through choice, and the flow of information is generally reciprocal. The term “contextual integrity” is applied to those circumstances in which informational norms are respected.

12

Judith DeCew, In Pursuit of Privacy: Law, Ethics, and the Rise of Technology, Cornell University Press, Ithaca, N.Y., 1997.

13

See, for example, Helen Nissenbaum, “Privacy as Contextual Integrity,” Washington Law Review 79(1):119-158, 2004.

According to Nissenbaum’s theory of contextual integrity, these informational norms establish an expectation against which certain actions and practices are evaluated. In particular, they provide a guide to evaluating new socio-technical practices, which are judged to respect or violate contextual integrity on the basis of several key factors:

  • The governing context;

  • Whether the new practice changes the types of information at issue;

  • Whether the new practice causes a shift in who is involved as senders, receivers, and subjects of this information; and

  • Whether new patterns of information flow fit with the relevant principles of transmission.

When new technologies or socio-technical practices are disturbing from a privacy standpoint, the reason is that they are seen as violating standing informational norms. Under certain circumstances, norms may be revisited and revised: critical events, such as the September 11 attacks, may demand a revision of informational norms governing public spaces; the outbreak of an epidemic may demand that norms of information flow in the medical health care context be revisited; emergence of online dating might also result in a shift of the norms governing information flow. Nevertheless, a systematic and comprehensive strategy for evaluating whether change should be resisted or embraced starts with the important first step of revealing how, if at all, standing norms have been breached or are threatened.
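To make this evaluation concrete, the following minimal sketch (our illustration, not Nissenbaum’s own formalism; the contexts, roles, and transmission principles are hypothetical) represents an informational norm by the parameters the theory names (context, information type, sender and receiver roles, and transmission principle) and flags a flow that matches no standing norm:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """One transfer of personal information, described by the parameters
    of contextual integrity: context, information type, the social roles
    of sender and receiver, and the transmission principle in force."""
    context: str
    info_type: str
    sender_role: str
    receiver_role: str
    transmission_principle: str

# A standing informational norm licenses flows with exactly these parameters.
Norm = Flow

def respects_contextual_integrity(flow: Flow, norms: list[Norm]) -> bool:
    """A flow respects contextual integrity if some standing norm of its
    context matches it on every parameter; otherwise it is a violation."""
    return any(flow == norm for norm in norms)

# Hypothetical norms for the health care context described in the text.
norms = [
    Norm("health care", "health status", "patient", "physician", "required by treatment"),
    Norm("health care", "health status", "physician", "specialist", "patient consent"),
]

# A physician passing health information to a marketer for commercial gain
# matches no standing norm, so the practice is flagged as a violation.
flow = Flow("health care", "health status", "physician", "marketer", "commercial gain")
print(respects_contextual_integrity(flow, norms))  # False
```

Real norms are of course richer than this toy version (they may constrain only some parameters, or permit whole families of flows), but even the sketch captures the theory’s key move: a privacy judgment is relative to a context’s standing norms rather than to secrecy as such.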

The theory of contextual integrity augments dominant approaches to privacy by introducing a middle layer of social analysis missing from theories analyzing privacy as a fundamental human right or value and also from theories placing privacy interests among a whole set of moral and non-moral interests to be weighed and traded in the course of political decision making. By bringing social norms to the foreground through contexts, roles, and transmission principles, this social approach adds a dimension of thought that can help address some of the critical challenges posed by new practices, and can help illuminate many of the intractable puzzles and stand-offs regularly faced in traditional approaches to privacy, for example, cultural and historical differences.

14

Gary T. Marx, Windows into the Soul: Surveillance and Society in an Age of High Technology, University of Chicago Press, forthcoming, and various articles by Gary Marx on privacy, equality, soft surveillance, borders, the public and the private, ethics, varieties of personal information and anonymity, available at http://garymarx.net.

Gary T. Marx, MIT professor emeritus, offers a related approach emphasizing the importance of defining terms and identifying contextual variation.14 Examining context can guide empirical inquiries and help identify assumptions often buried within normative arguments. Among the most relevant factors in his situational or contextual approach are the following:

  • Keeping distinct (yet noting relationships among) the family of concepts encompassing personal information—e.g., privacy and publicity, public and private, personal and impersonal information, surveillance, secrecy, confidentiality, anonymity, pseudo-anonymity, and identifiability—and being clear about which concepts (whether as factors to be explained or to be approached as social issues) are being discussed;

  • The nature of the means or techniques used (contrast the unaided and aided senses—e.g., directly overhearing a conversation with intercepting electronic communication on the Internet);

  • The goals (contrast collecting information to prevent a health epidemic with the spying of the voyeur);

  • The location (contrast personal information obtained from the home with that gathered on a public street);

  • The type of information-protective border that is crossed (contrast crossing the borders of the physical person as in a body cavity search, with crossing via aggregation the de facto borders resulting from the disaggregation of information in scattered databases);

  • The direction of a personal border crossing (compare taking information from a person, as with drug testing or a photo, with imposing information on a person, as with spam or subliminal sounds);

  • The type of personal information involved (contrast a general characteristic such as gender or age with information that may be stigmatizing, intimate, and/or offer a unique and locatable identity);

  • The form of the data itself (contrast a third party’s written account of an event with the same event revealed by a hidden video camera);

  • The roles and relationships among those involved (contrast parents gathering personal information on children or an ill family member with an employer or merchant gathering information on employees or customers); and

  • The conditions of information collection, involving, e.g., informed consent, adequate security, reciprocity, sharing in the benefits, and so on.

This approach yields a number of hypotheses about the patterning of normative expectations with respect to privacy behavior and helps give structure to the intuitive understandings of privacy mentioned below in this chapter.

The multidimensional nature of personal information and the related contextual and situational variation prevent reaching any simple conclusions about how crossing personal informational borders will, or should, be judged. Thus, the intellectual approach is one of contingency rather than absolutism.


Although the conceptual questions surrounding the notion of privacy are important, it is not necessary to resolve these matters here. It is sufficient to observe that in all of these various perspectives, personal information—information about us—plays a central role, and questions of access to and use of such information are important. The challenges posed for privacy in the information age—by technological advancement, societal shifts, and critical or signal events—fall squarely within the scope of most dominant accounts of privacy and do not require resolution of some of the more difficult conceptual questions concerning the full scope or borders of the concept.

2.1.4
Normative Theories of Privacy

The philosophical works that attempt to characterize the concept of privacy see that activity as necessary for addressing the important normative questions surrounding privacy. These normative questions concern the value of privacy and include such issues as why privacy is important, both to the individual and to society; why we should individually and as a society protect privacy; and how, to what extent, in what ways, and with what costs and benefits privacy should be protected.

This last issue arises because of the need to consider privacy in relation to other values that may be in conflict with it. For example, maximizing privacy will constrain the information available to others, and in so doing may decrease efficiency, or security, or other things that society or various subgroups value. Deciding how much privacy to allow or require sometimes entails a tradeoff with respect to other values, and understanding the nature of those tradeoffs is necessary before one can think in a systematic fashion about decisions involving tradeoffs.

One position on the value of privacy is that it is a fundamental human right, like the right to liberty or life. In this view, privacy is an intrinsic good in itself, and a life shorn of privacy is less meaningful than one that has some measure of privacy. The fundamentalist position holds that privacy is tied to a cluster of rights, such as autonomy and dignity. These are tied together in such a way that the combination allows a human life to be more essentially human than if they are missing. Carried to its logical extreme, if privacy is an intrinsic (and absolute) good, then there are no cases in which any lack of privacy can be justified.

A more common view holds privacy to be of instrumental rather than intrinsic value; that is, the value of privacy derives from the value of other, more fundamental values that privacy allows. In the instrumentalist view, the value of privacy comes because it sustains, promotes, and protects other things that we value. In this view, privacy can be traded off or limited because doing so will promote other values that we hold dear.


One example of the instrumentalist view holds that the value of privacy derives from the need for privacy to allow autonomy in the individual. Unlike the fundamentalist, who claims that privacy is a basic right on the same level as autonomy, the instrumentalist will claim that autonomy (the ability to control our actions and make free choices) is a fundamental value that is aided by privacy. Without privacy, in this view, the individual could be manipulated or coerced in such a way as to lose autonomy. People with no sense of privacy are less able to define and pursue the goals and ends that are meaningful to them. The actions of such an individual are more likely to be dictated by what others think than by his or her own decisions. But privacy, in this view, is not a fundamental right; if the autonomy of the individual could be guaranteed without a guarantee of privacy, there would be no need (in this view) to ensure privacy.

Another instrumentalist view holds that the value of privacy is derived from the fact that privacy contributes to fairness. It is because of privacy that we can ensure a level playing field in the information that is known by each of the two parties in an interaction. Without privacy, it would be easy for the more powerful (rich, devious) of the parties to hold extra information about the other party, disadvantaging that party in the interactions. Without privacy, the party with the greatest resources can get an unfair information advantage over other parties, ensuring the maintenance of the power or resource disparity.

Privacy has also been identified as an instrument needed for other, less tangible, goods. Arguments have been presented that tie privacy with such things as the ability to define relationships with others and the ability to sustain intimacy. Respect for privacy, in such views, is needed to demonstrate to individuals that they have control of their minds and their bodies. Without privacy there could be no such demonstration of respect, and without such respect the intimacy needed for personal relationships would be impossible.

All of this chapter’s previous discussions of the value of privacy concentrate on that value to the individual. There have been other approaches that try to see privacy through the lens of the society or group of which the individual is a part. Some discussions, for example those by communitarians, call attention to possible negative consequences of privacy.

For example, Amitai Etzioni contrasts certain privacy interests of individuals against what he identifies as the common good, or the well-being of the society as a whole.15 In most cases in which the interests of an individual are assessed against the interests of the collective, Etzioni insists that the collective interests must prevail. Protecting the privacy of individuals makes it harder for the society to enforce laws and ensure good public health, and it makes the overall economy less efficient. In this view, privacy has a negative value to the overall community in many settings, even if it has some value to individuals within that society. Because people tend to see only their own point of view, privacy has historically been seen to be valuable. Etzioni’s view holds that if the price society sometimes pays for individual privacy were clearer, privacy would be given less importance by society.

15

Amitai Etzioni, The Limits of Privacy, Basic Books, New York, 1999.

Similar arguments are set forth by Anita Allen-Castellito, who suggests that individuals are “accountable” to a number of “others” including employers, family members, and in some instances, members of a racial or ethnic group.16 Accountability means that we may reasonably be expected to “answer for” our behavior to others with whom we have a meaningful relationship. In her view, we are not entitled to say “it is none of your business” when some people inquire into our reasons for acting in some way that might place others, or the relationship or the person we care about, at risk.

There are also communitarians who hold that privacy is actually of value to society as a whole. While it is true that the lack of information about the individual required by privacy may have drawbacks in the areas of public health, law enforcement, and the economy, it is argued that privacy is needed to ensure the best results to society in all of these areas. Without privacy, for example, public health authorities would obtain less accurate reporting of disease and be less sure that those who have communicable diseases would seek treatment. While privacy may impede law enforcement, it is also required to insulate citizens from governmental tyranny and to ensure the general health of liberal democracy. Citizens with faith in government and law enforcement are more likely to be cooperative when they perceive that power is limited by decent rules. While aspects of the economy might be more efficient if there were no privacy, such a state of affairs would favor those able to obtain the most information, tending to ensure that unfair distributions of wealth and privilege would be perpetuated.

As is often the case with ethical and philosophical discussions, the value of these debates over privacy is not so much that we can find an answer to our questions, but rather that the issues become clearer and more precisely identified.

For example, the descriptive debate concerning the nature of privacy shows the difficulty of saying just what privacy is in a single simplistic definition. While we can be reasonably sure that privacy is a matter of individuals’ control over information about themselves, it is less clear whether the emphasis should be on control over the gathering of that information, the access to that information after it has been gathered, the use of the information that has been gathered, or on all equally.

16

Anita L. Allen-Castellito, Why Privacy Isn’t Everything: Feminist Reflections on Personal Accountability, Rowman and Littlefield, Oxford, 2003.

In addition, the debate about the normative status of privacy shows that it is sometimes unclear why we should value privacy, and what sort of value privacy has. It does seem clear that privacy must be balanced with other values that we have (at least some of the time), but the mechanisms for establishing such a balance are far from clear. Indeed, as the debates about the value of privacy for the individual or the group show, different circumstances can lead to very different decisions about the value of privacy. The debate has brought forward examples where claims to privacy are used to protect behavior that we find of great value, and examples where claims to privacy are used to protect behavior we abhor. The advantages and disadvantages will also have differential impact on the group or individual in question.

2.2
ECONOMIC PERSPECTIVES ON PRIVACY17

2.2.1
The Rationale for an Economic Perspective on Privacy

Normative discussions of privacy emphasize the notion of privacy as something of value, which has led some to attempt to look at privacy through the lens of economic theory. Rather than philosophizing about what societal values are being balanced, an economic perspective on privacy regards privacy as something that people value in varying degrees under varying circumstances (Box 2.1). To the extent that the value of privacy can be expressed in meaningful quantitative terms, an economic approach provides a framework for specifying some of the various costs and tradeoffs that privacy is presumed to involve.

Thus, one starting point is the idea brought forth in Section 1.2 that privacy inherently involves tradeoffs, and understanding the nature and scope of tradeoffs is squarely in the domain of economics. A second starting point is a growing awareness that personal information about individuals, their interests, their buying tendencies, and so on has commercial value. Indeed, as Culnan and Bies suggest, consumers’ personal information is at the center of the tension between many corporate and consumer interests.18

17

This section is based largely on Kai-Lung Hui and I.P.L. Png, 2006, “The Economics of Privacy,” in Terry Hendershott, ed., Handbook of Information Systems and Economics, Elsevier, forthcoming; and Alessandro Acquisti, The Economics of Privacy, available at http://www.heinz.cmu.edu/~acquisti/papers/acquisti_privacy_economics.ppt.

BOX 2.1

Personal Information as an Economic Good

Consider the passing of personal information in some kind of transaction between the subject of the information and the recipient.

  • The amount of personal information can be increased by recombination and analysis of existing data.

  • The individual generally does not know how, how often, or for how long the information provided will be used.

  • Other parties who may gain access to the information are not known to the individual.

  • The value of personal information to the individual is highly uncertain and is often determined by events or circumstances that arise after the transaction.

  • Individuals often place different values on the protection and on the sale of the same piece of information.

  • The information, as information, is generally non-rivalrous (use of the information by one party generally does not prevent its use by another party). Personal information is also often non-excludable (other parties often cannot be prevented from using the information).

SOURCE: Adapted from Alessandro Acquisti, The Economics of Privacy, available at http://www.heinz.cmu.edu/~acquisti/papers/acquisti_privacy_economics.ppt.

Finally, from a policy standpoint, economics is relevant because much of the public policy debate about privacy involves a consideration of the positive or negative market effects (whether real or potential) of government privacy regulation. For example, one long-running debate concerns using “opt-in” or “opt-out” approaches for permitting the sharing of consumer information among organizations. (Opt-in regimes do not collect information unless the individual explicitly takes steps to allow it; opt-out regimes collect information unless the individual explicitly takes steps to disallow it.) Opt-in is largely seen as an undue burden by the business world, whereas many privacy advocates see consumer opt-in (which, in a sense, places consumers in the position of owner of their own information) as the best approach for protecting consumers’ privacy.

18

Mary J. Culnan and Robert J. Bies, “Consumer Privacy: Balancing Economic and Justice Considerations,” Journal of Social Issues 59(2):323-342, 2003.


In considering the economics of privacy, Alessandro Acquisti notes that privacy issues actually originate from two different markets: a market for personal information and a market for privacy. The market for personal information focuses on economically valuable uses for personal information and how such information can be bought, sold, and used in ways that generate value. Consider, for example, companies that may need to make decisions about extending credit to individuals. These companies may engage the services of credit bureaus that provide personal financial information regarding these individuals that is relevant to determining their creditworthiness. Or a company may use personal information about the tastes and buying preferences of its customers so that it can tailor its products more precisely to customer needs or market its products with greater efficiency. In general, personal information can be regarded as an economic good.

The market for privacy can be conceptualized as the market for goods and services that help consumers enhance or protect the privacy of their personal information. For example, they may buy products based on privacy-enhancing technologies, or adopt privacy-enhancing strategies, or patronize companies that promise to keep their customers’ personal information protected and private.

The sections below describe four economic perspectives on privacy: a “privacy as fraud” perspective, a perspective based on assigning to individuals the property rights to their personal information, a perspective based on regulation, and a perspective based on behavioral economics.

2.2.2
Privacy as Fraud

First appearing in the late 1970s, one school of thought about the economics of privacy asserts that government should facilitate the free flow of personal information for commercial uses in the interests of promoting and maximizing market efficiency.19 In this view, both consumers and sellers benefit: consumers benefit when sellers have access to useful information about them, and sellers benefit from being able to get the best return on their advertising or marketing approaches and ultimately make more sales. For example, having information about a given consumer’s interest in golf might help a travel agency tailor vacation offerings or packages that would interest or benefit that consumer, as well as allow the agency to save money by not wasting resources in reaching out to people with no interest in such things.

19

Among the pioneering economic and legal studies with a focus on privacy are Richard Posner, “An Economic Theory of Privacy,” Regulation (May/June):19-26, 1978; Richard A. Posner, “The Economics of Privacy,” American Economic Review 71(2):405-409, 1981; and George J. Stigler, “An Introduction to Privacy in Economics and Politics,” Journal of Legal Studies 9:623-644, 1980.

However, as Hui and Png have noted, this approach “may not work efficiently” as it is predicated on, among other things, sellers having perfect information about consumers.20 Such is rarely the case, as relevant consumer information is often inaccurate, too costly, or simply too difficult to obtain. Moreover, access to information about buyers (especially transaction information) can also allow sellers to engage in so-called price discrimination, whereby consumers willing to pay a higher price for a given good or service will be charged a higher price. For example, Odlyzko describes how one computer manufacturer offered the same system for sale to small businesses, health care companies, and state and local governments at respectively lower prices.21
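A small, purely hypothetical calculation makes the economics of such segmentation concrete (the willingness-to-pay figures below are invented for illustration): a seller who can identify each buyer segment charges each its own reservation price and earns more than under the best single posted price.

```python
# Hypothetical willingness to pay (WTP) for the same system by segment,
# loosely echoing Odlyzko's example of segment-specific pricing.
wtp = {"small business": 900, "health care company": 1100, "state/local government": 700}

# Best single posted price: the optimal uniform price is always one of the
# WTP values, so search over those; revenue = price x number of buyers who
# value the good at or above that price.
best_uniform = max(p * sum(v >= p for v in wtp.values()) for p in wtp.values())

# Perfect price discrimination: each segment is charged exactly its WTP.
discriminating = sum(wtp.values())

print(best_uniform, discriminating)  # 2100 2700: buyer information raises revenue
```

The revenue gap (2,700 versus 2,100 in this toy case) is what gives transaction information its commercial value to sellers, whether or not buyers regard the resulting price differences as fair.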

In addition, this approach can lead to external effects that consumers often view as undesirable. Indeed, many consumers perceive unsolicited marketing appeals of a type enabled by the sharing of information (e.g., an unsolicited phone call about a tailored golfing vacation package when one has no interest whatsoever in a golfing vacation) as intrusions into their privacy and hence as costs that they must bear.22 Generally, research suggests that this approach tends to favor sellers over consumers and often results in undesirable externalities for consumers.

In the context of this approach, privacy involves the “withholding or concealment of information”23—the ability of buyers to keep personal information away from sellers. Advocates of this approach assert that because efficient markets depend on the free flow of information, privacy renders markets less efficient.24 For instance, a buyer would choose to hide discrediting or negative information out of self-interest rather than allowing that information to be used in the decision-making process of the seller, particularly if it would raise the cost of the good or service for the buyer.25 In this view, privacy would create inefficiencies and impose restraints on businesses and, ultimately, would lower the general welfare. Furthermore, in this view, market forces would create the necessary balance between the opposing interests of buyers and sellers for the most efficient allocation of personal information for maximum benefit.

20

Kai-Lung Hui and I.P.L. Png, “The Economics of Privacy,” in Terry Hendershott, ed., Handbook of Information Systems and Economics, Elsevier, forthcoming.

21

Andrew Odlyzko, “Privacy, Economics, and Price Discrimination on the Internet,” 2003, available at http://www.dtc.umn.edu/~odlyzko/doc/privacy.economics.pdf.

22

As a general matter, it is not evident whether privacy leads to more or fewer intrusions such as telemarketing calls. Increased information allows firms to better target their marketing efforts. On the one hand, marketing efforts become more effective, so that firms engage in more of them. On the other hand, because firms target, a consumer is less likely to get a worthless call.

23

Posner, “An Economic Theory of Privacy,” 1978, p. 19.

24

Stigler, “An Introduction to Privacy in Economics and Politics,” 1980.

2.2.3
Privacy and the Assignment of Property Rights to Individuals

After the initial economic analyses in the late 1970s and 1980s, privacy reappeared in the economic literature in the mid-1990s,26 as the “dot-com” IT sector expanded and markets for personal information grew. In these analyses, assignment of property rights to information was proposed as one way to control or improve the use of personal information:27 consumers would “own” their personal information and would be free to share it in whatever manner they chose. For example, some might choose to restrict commercial access to their personal information entirely, whereas others might choose to sell some or all of their personal information or make it available in exchange for some other benefit.

From the “privacy as fraud” perspective, the assignment of property rights to information would distort the free market for information, shifting society away from an economically efficient equilibrium and reducing overall societal welfare. For example, granting workers property rights to their personal information might allow them to conceal information from employers. Individual workers might benefit in the short term, but employers—being denied valuable information—would make less efficient employment decisions and in the long run might be able to offer fewer jobs, resulting in fewer opportunities for these workers overall.

Assigning property rights to personal information would result largely in an opt-in market for information sharing, whereby sellers would have access to the information only of those consumers who chose to make it available. This approach would arguably free consumers from some of the negative effects permitted by a more free market approach (e.g., costs/intrusions from unsolicited marketing appeals), but it might also mean that consumers would not benefit as much from tailored goods and services from sellers or that sellers would be forced to pass along higher costs associated with less efficient marketing.

25

It is this point that makes the “privacy as fraud” school of thought significantly different from a free market school. In a completely free market, individuals would be free to spend money to conceal information. Rather, the “privacy as fraud” school stipulates that the government acts to prevent such concealment.

26

Economic literature in the mid-1990s emphasizing privacy aspects includes Roger Clarke, “Computer Matching by Government Agencies: The Failure of Cost/Benefit Analysis as a Control Mechanism,” Information Infrastructure & Policy 4(1):29-65, 1995; Eli Noam, “Privacy and Self-Regulation: Markets for Electronic Privacy,” and Hal Varian, “Economic Aspects of Personal Privacy,” both in Privacy and Self-Regulation in the Information Age, U.S. Department of Commerce, 1997; and Kenneth Laudon, “Markets and Privacy,” Communications of the ACM 39(9), 1996.

27

Varian, “Economic Aspects of Personal Privacy,” 1997.

Hermalin and Katz have found that privacy can be efficient in certain circumstances but that privacy property rights—personal control over one’s personal information—are often worthless.28 Suppose, for example, that a job candidate has the right to withhold health information from a potential employer. Such silence would likely be inferred by the employer to mean that the candidate’s health is poor. The same holds true for a job candidate who declines to answer a question about whether he was in prison: The employer would likely assume that the candidate has served a prison term. If an employer assumes the worst about a potential employee who chooses to exercise his or her privacy rights, then the right to remain silent can be completely valueless, and anyone other than those with the most to hide will “voluntarily” reveal their personal information. To protect privacy in such settings, it may be necessary that public policy make it mandatory for everyone to remain silent about their personal information.
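The unraveling logic behind this result can be illustrated with a small simulation (our sketch, not a calculation from Hermalin and Katz; the five health scores are invented): if the employer rates a silent candidate at the average of the remaining silent pool, everyone above that average prefers to disclose, and silence ends up signaling the worst type.

```python
# Voluntary-disclosure "unraveling": candidates have a verifiable health
# score from 1 (worst) to 5 (best) and may disclose it or stay silent.
scores = [1, 2, 3, 4, 5]
silent = set(scores)

while True:
    inferred = sum(silent) / len(silent)              # employer's belief about a silent candidate
    disclosers = {s for s in silent if s > inferred}  # these types gain by revealing
    if not disclosers:                                # no one left who benefits from disclosure
        break
    silent -= disclosers

print(sorted(silent))  # [1]: only the worst type stays silent, so the
                       # "right" to withhold the information is worthless
```

This is the sense in which a bare property right to withhold information can be valueless unless, as the text notes, policy makes silence about such information mandatory for everyone.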

2.2.4
The Economic Impact of Privacy Regulation

Whereas the assignment of property rights can have value in resolving privacy issues in contexts where the collectors and users of personal information and the information owners can enter into contractual arrangements, there are many situations in which such contractual arrangements are difficult to manage. Such situations are generally handled through regulatory and tort law, and economic analyses of such laws make it possible to understand some of their economic impact.

Hui and Png argue that privacy regulation is most appropriate when many information providers (consumers) are highly sensitive to the gathering of their personal information. (In this context, regulation refers to restrictions, and possibly prohibitions, on the sale or use of personal information for commercial purposes.) When this is so, regulation works efficiently and overall welfare is maximized because consumers are spared the cost of understanding the privacy policy of each individual information gatherer. By contrast, regulation is highly inefficient when most consumers do not care very much about protecting their personal information; under these circumstances, it is more efficient to let information collectors avoid the cost of protecting the personal information they gather.
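A stylized numerical comparison illustrates the claim. All parameter values in the sketch below are invented for exposition and are not drawn from Hui and Png’s analysis; only the qualitative flip in the ranking of the two regimes matters.

    N_FIRMS = 50
    READ_COST = 0.02      # consumer's cost to vet one firm's privacy policy
    PROTECT_COST = 10.0   # per-firm cost of complying with the regulation
    PRIVACY_VALUE = 2.0   # benefit a sensitive consumer gets from protection

    def net_welfare(n_sensitive, regulated):
        if regulated:
            # Every sensitive consumer is protected without vetting anyone,
            # but every firm bears the compliance cost.
            return n_sensitive * PRIVACY_VALUE - N_FIRMS * PROTECT_COST
        # Without regulation, each sensitive consumer must vet every firm
        # to obtain comparable protection.
        return n_sensitive * (PRIVACY_VALUE - N_FIRMS * READ_COST)

    for n_sensitive in (10_000, 100):
        print(f"{n_sensitive:>6} sensitive consumers -> "
              f"regulated: {net_welfare(n_sensitive, True):>8.0f}, "
              f"unregulated: {net_welfare(n_sensitive, False):>8.0f}")

With many sensitive consumers the savings in policy-reading costs swamp the firms’ compliance costs; with few, the same compliance costs are a dead loss.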

28. Ben Hermalin and Michael Katz, “Privacy, Property Rights & Efficiency: The Economics of Privacy as Secrecy,” Quantitative Marketing and Economics 4(3):209-239, September 2006.


One argument in favor of regulation is that it may be a more effective form of commitment than contractual arrangements.29 Hui and Png argue that under some circumstances, sellers benefit from privacy guarantees provided to consumers. A privacy guarantee assures the consumer that his information will not be further shared with a third party, thus making a privacy-sensitive consumer more likely to make a purchase from a seller. In this setting, the efficiency of regulation emerges from eliminating uncertainty on the consumer’s part about whether his personal information will be shared with other parties. An example is the privacy of patient health information. Because candor is required for effective provision of health care services, privacy guarantees for patient health information promote healthier individuals—and healthier individuals enhance overall community health.

Yet there are contexts in which privacy guarantees detract from overall welfare. Hui and Png suggest that although the “do not call” list has helped to reduce the volume of telephone solicitations, it may not be possible to control spam e-mail in the same way. One reason is that the majority of spam e-mail likely comes from illicit spammers in any event, and these individuals are unlikely to obey a law that requires senders of spam to consult a “do not spam” list. Even worse, spammers might find ways of obtaining the e-mail addresses on the “do not spam” list, thus rendering the law counterproductive.

Hui and Png conclude that the key issue is how to balance the interests of sellers and consumers, and note that a sweeping “okay to use” or “do not use” solution will not work across all contexts. When it is feasible to determine the benefits and costs of information use, one approach is industry-specific regulation.

2.2.5
Privacy and Behavioral Economics

In 2004, some of the first work on a behavioral economic perspective on privacy appeared. Behavioral economics seeks to integrate insights from psychology with neoclassical economic theory and to understand the economic implications of behavior that does not conform to the calculating, unemotional, utility-maximizing characteristics of Homo economicus.

In a privacy context, it has been observed that despite consumer concern about privacy, survey results point to broad discrepancies between the attitudes of individuals and their actual behavior toward privacy protection.30 Given the lack of consumer demand, markets for privacy-protecting goods and services (e.g., anonymizers) will likely remain small, and other self-regulating markets relying on consumer behavior for input may not sufficiently protect consumer privacy.31

29. Although private contracts can represent a stronger commitment than does public policy (which can be unilaterally changed), a regulatory approach to privacy protection has the dual advantages of greater enforceability and broad applicability and relevance across the entire population.

Moreover, research in social psychology and behavioral economics indicates that the “default condition” of many choices is very “sticky”—that is, most people find it easier to accept a default choice made on their behalf than to change that choice, even when the default is less advantageous to them.32

Recent work by Acquisti is illustrative, offering an explanation for the well-known discrepancy between individuals’ attitudes and their behavior when it comes to online privacy.33 Acquisti argues that contrary to traditional economic analyses that assume that individuals are fully rational, forward-looking, Bayesian updaters who take into account how current behavior will influence their future well-being and preferences, individuals instead demonstrate various forms of psychological inconsistency (self-control problems, hyperbolic discounting, present bias). Furthermore, making fully informed decisions regarding one’s privacy is extremely difficult once personal information has been transmitted to a third party, because that information can continue to change hands, without the individual’s knowledge, for any length of time.

To provide further insight on individual decision making, Acquisti relies on the concept of immediate gratification,34 an individual’s preference for well-being earlier rather than later and the tendency to engage in desirable activities over undesirable ones, even if the choice may result in negative future consequences. Furthermore, an individual’s preferences are inconsistent over time (i.e., preferences for future activities change as the date to undertake the activity approaches) and are disproportionate (i.e., perception of risks varies between near-term and longer-term consequences). Relating this to an individual’s online behavior, Acquisti suggests that individuals want to protect their privacy in principle but put off to some time in the future the effort required, rather than taking immediate action.

30. See, for example, studies cited in Section 8 in Hui and Png, “The Economics of Privacy,” forthcoming.

31. Alessandro Acquisti, “Privacy, Economics, and Immediate Gratification: Why Protecting Privacy Is Easy, But Selling It Is Not,” in Proceedings of the 2004 BLACKHAT Conference, Las Vegas, Nev., July 2004.

32. See, for example, William Samuelson and Richard Zeckhauser, “Status Quo Bias in Decision Making,” Journal of Risk & Uncertainty 1:7-59, 1988; B.C. Madrian and D.F. Shea, “The Power of Suggestion: Inertia in 401(k) Participation and Savings Behavior,” Quarterly Journal of Economics 116(4):1149-1187, 2001.

33. Alessandro Acquisti, “Privacy in Electronic Commerce and Economics of Immediate Gratification,” pp. 21-29 in Proceedings of the ACM Electronic Commerce Conference (EC 04), ACM Press, New York, 2004.

34. Immediate gratification is related to other types of psychological distortion described in the economic and psychological literature, including time inconsistency, hyperbolic discounting, and self-control bias.

Combining these two sets of factors reveals broader consequences for individual privacy protection. Acquisti suggests that individuals tend to dismiss possible future consequences of revealing personal information for an immediate reward, but also lack complete information to grasp the magnitude of the risk—because each instance of revealing personal information can be linked together, resulting in “a whole that is more than the sum of its parts.” Acquisti concludes that more attention will have to be paid to behavioral responses to privacy protections, rather than focusing on protecting privacy solely through informational awareness and industry self-regulation.
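The preference reversal described above can be seen in a short worked example. The sketch below uses a common hyperbolic discount function, 1/(1 + kt), with payoff numbers invented for illustration (they are not Acquisti’s): a privacy-protecting chore looks worthwhile ten days in advance but is deferred when the day arrives.

    def hyperbolic_discount(delay_days, k=1.0):
        """Present value of a unit payoff delayed by `delay_days`,
        discounted hyperbolically as 1 / (1 + k * t)."""
        return 1.0 / (1.0 + k * delay_days)

    EFFORT = 1.0          # immediate cost of, say, configuring privacy settings
    FUTURE_BENEFIT = 5.0  # benefit of avoided harm, realized 30 days later

    # Planned 10 days ahead: both the effort (10 days away) and the
    # benefit (40 days away) are discounted.
    utility_planned = (FUTURE_BENEFIT * hyperbolic_discount(40)
                       - EFFORT * hyperbolic_discount(10))
    # On the day itself: the effort is immediate and undiscounted, while
    # the benefit is still 30 days away.
    utility_today = FUTURE_BENEFIT * hyperbolic_discount(30) - EFFORT

    print(f"Planned 10 days ahead: net utility {utility_planned:+.3f}")  # > 0
    print(f"On the day itself:     net utility {utility_today:+.3f}")    # < 0
    # The sign flips: the individual intends to protect privacy "later"
    # but defers once later becomes now.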

Acquisti’s conclusions have deep privacy implications. For example, one principle of fair information practice (see Box 1.3) is that of choice and consent. But the principle itself is silent on whether the appropriate choice should be opt-in or opt-out. Under the canons of traditional economic analysis and the rational actor model, these regimes are essentially equivalent (under the assumption that there are no transaction costs associated with either choice). But there are impassioned arguments about whether opt-in or opt-out consent better reflects the willingness of data subjects to provide information without limitations on its secondary use—and these arguments are rooted in a realization that in the real world, the default choice makes a huge difference in the regime that will end up governing most people. Those who advocate opt-out regimes know that most people will not take the trouble to opt out, and thus they can be presumed to “want” to allow information to be collected. Those who advocate opt-in regimes know that most people will not take the trouble to opt in, and that their privacy (in this case, their immunity to having information collected) will thus be protected.

Behavioral economics calls into question how to determine the value that consumers place on their personal information. Hui and Png suggest that one important factor is that information owners are unlikely to take fully into account the benefit of their information to the parties wanting it.35 This has both a societal consequence (in that overall welfare is reduced because these parties are unable to exploit that information) and personal consequences (in that individuals may thus exclude themselves from being offered certain goods or services that they might desire). In addition, information owners are likely to attach too high a price to their personal information, which might excessively raise the barrier to potential buyers of the information, and there is often a significant discrepancy between what consumers report being willing to pay to protect their personal information and what they are willing to accept to allow the use of their personal information. Hui and Png go on to suggest that consumers often attach a high price to their personal information when discussing privacy, but readily part with it “in exchange for even small rewards or incentives.”36

35. Kai-Lung Hui and I.P.L. Png, “The Economics of Privacy,” in Terry Hendershott, ed., Handbook of Information Systems and Economics, Elsevier, forthcoming.

Finally, the findings of behavioral economics have implications for the various critiques of how fair information principles have been implemented in the United States.37 At the heart of these critiques is the oft-expressed concern that the Federal Trade Commission (FTC), the government agency with responsibility for the protection of certain privacy rights, at least for consumers in the United States, has compressed these fair information practices into a limited construct referred to as “notice and choice.” Most often, the provision of notice is satisfied by largely unintelligible industrial sector “boilerplate” language that is not subject to review by the FTC, and the default choice is framed in terms of consumers’ opting out of some subcomponent as standard business practice, unless specific legislation establishes informed affirmative consent as the default.38 Behavioral economics thus implies that a default opt-out choice will not result in a regime that would be affirmatively chosen under a default opt-in choice.

36. See Section 6 in Hui and Png, “The Economics of Privacy,” forthcoming.

37. See, for example, Marc Rotenberg, “Fair Information Practices and the Architecture of Privacy (What Larry Doesn’t Get),” Stanford Technology Law Review 1, 2001 (online journal available at http://stlr.stanford.edu/STLR/Articles/01_STLR_1/index.htm); Robert Gellman, “Does Privacy Law Work?,” in P. Agre and M. Rotenberg, eds., Technology and Privacy: The New Landscape, MIT Press, Cambridge, Mass., 1997; and R. Clarke, “Beyond the OECD Guidelines,” Xamax Consultancy Pty Ltd., 2000.

38. The special and highly contested case of telecommunications policy, in which customer proprietary network information (CPNI) could be released only with explicit consumer permission, is one such example. The Federal Communications Commission (FCC) issued an order interpreting the “approval” requirements in February 1998 (available at http://www.fcc.gov/Bureaus/Common_Carrier/Orders/1998/fcc98027.txt). Under the FCC’s rule, telephone companies must give customers explicit notice of their right to control the use of their CPNI and obtain express written, oral, or electronic approval for its use. The rule was unsuccessfully challenged by a potential competitor, U.S. West, 182 F.3d 1223 (10th Cir. 1999).


2.3
SOCIOLOGICAL APPROACHES

Sociological approaches to the study of privacy have emphasized the ways in which the collection and use of personal information may reflect and reinforce the relationships of power and influence between individuals, groups, and institutions within society. This emphasis on power relationships is an important factor characterizing the behavior of institutional actors in modern societies.39 There are also important distinctions to be drawn between the work of scholars who focus on structural or institutional relationships and that of scholars who focus on the cognitive, affective, behavioral, and social-process responses of individuals.

Surveillance from this perspective is generally understood to refer to the systematic observation of individuals undertaken in an effort to facilitate their management and control. Some scholars concerned with surveillance emphasize the ways in which surveillance technology has changed over time, while others focus on the ways in which old and new surveillance techniques are used in the exercise of control over individuals in their roles as employees, consumers, and citizens. Still others emphasize the ways in which the use of surveillance techniques becomes a commonplace activity within more and more varied organizational and institutional contexts. Still others focus on interpersonal uses among families, friends, and strangers and the role of the mass media.40

An important but distinct body of work within this scholarly tradition focuses on the variety of surveillance techniques used by the police and security forces.41 However, debates continue among scholars of surveillance, who contest evidence and claims about the extent to which there is a meaningful possibility of limiting, resisting, or reversing the trend toward more complete social control and domination through surveillance.

39. Within sociology there are different perspectives on surveillance. More technologically oriented observers might prefer to talk about the “capture” of transaction-generated information (see, for example, Philip E. Agre, “Surveillance and Capture: Two Models of Privacy,” The Information Society 10(2):101-127, 1995). On the other hand, David Lyon (in Surveillance Society: Monitoring Everyday Life, Open University Press, 2001) argues that “rather late in the day sociology started to recognize surveillance as a central dimension of modernity, an institution in its own right, not reducible to capitalism, the nation-state, or even bureaucracy.” Gary T. Marx (“Seeing Hazily, But Not Darkly, Through the Lens: Some Recent Empirical Studies of Surveillance Technologies,” Law and Social Inquiry 30(2):339-399, 2005) notes the limitations of an exclusive focus on power and control as the defining elements; surveillance also serves goals involving protection, documentation, planning, strategy, and pleasure (e.g., as entertainment and to satisfy curiosity).

40. One type of surveillance that remains surprisingly understudied and unregulated, despite arousing strong social concern, is the voyeur secretly gathering data. See, for example, C. Calvert, Voyeur Nation: Media, Privacy and Peering in Modern Culture, Westview, Boulder, Colo., 2000; and a “true fiction” case in which the protagonist Tom Voire engages in, or is the victim of, more than 100 kinds of questionable personal information practices, as described in Gary T. Marx, “Forget Big Brother and Big Corporation: What About the Personal Uses of Surveillance Technology as Seen in Cases Such as Tom I. Voire?” Rutgers Journal of Law & Urban Policy 3(4):219-286, 2006.

Another important distinction within sociology is between a focus at the macro level of grand theory, examining structural or institutional relationships, and a focus on the cognitive, affective, and behavioral responses of individuals who are subject to surveillance; the latter approach makes it easier to test theories empirically. Among those taking a macro-level approach, David Lyon also brings a long historical view to his assessment of the role of surveillance in society. He integrates a number of insights from an evolving cultural studies tradition to study what is referred to as the postmodern condition. He provides examples to illustrate the ways in which “dataveillance,” or the analysis of transaction-generated data, reduces the need for access to a physical or material body in order to gather information and intelligence about the past, present, or future behavior of data subjects.42

Priscilla M. Regan has proposed framing privacy as a collective concern rather than as an individualistic one.43 She gives three reasons for arguing that privacy has value not only to individuals but also to society in general. First, she believes that all individuals value some degree of privacy and have some common perceptions about privacy. Second, she notes that privacy is a value that supports democratic political systems. Third, she asserts that technology and market forces make it hard for any one person to have privacy without all persons having a similar minimum level of privacy. She also conceptualizes personal information as a resource that is available to multiple parties, is difficult to prevent others from using, and is subject to degradation as a result of overuse. Thus, she argues, personal information as a resource is subject to some of the same kinds of excessive-use pressures as are resources such as clean air and edible ocean fish. Privacy can be framed as preventing the overuse of personal information, and thus she argues that public policies to support privacy would have much in common with policies that address the issue of common-pool resources such as air and fish.

Scholars working from within a Marxist or neo-Marxist tradition engage the development of surveillance techniques as a response of capitalists to continuing crises of over- and underproduction. As the logic of capital is extended to more and more activities, surveillance facilitates coordination and control.44 Anthony Giddens, extending these analyses of surveillance, combines the grand theories begun by Michel Foucault and Max Weber with empirical data in an effort to bridge these distinctions.45 Scholars influenced by Giddens have focused on surveillance as an aspect of rationalization within government46 and corporate bureaucracies.47

41. Richard Ericson and Kevin Haggerty, Policing the Risk Society, University of Toronto Press, 1997; and Gary T. Marx, Undercover: Police Surveillance in America, University of California Press, 1988.

42. Lyon, Surveillance Society, 2001.

43. P.M. Regan, “Privacy as a Common Good in the Digital World,” Information, Communication and Society 5(3):382-405, September 2002.

In terms of understanding how individuals perceive surveillance processes, Irwin Altman’s work reflects a psychological emphasis and contributes not only concepts and measures of desired and realized levels of privacy, but also behavioral insights that are useful in cataloging the resources available to individuals that allow them to exercise more or less control over the boundaries between themselves and others.48

Finally, and in addition to his identification and assessment of important trends, critical events, and important distinctions between segments of the population (Section 2.1.2), Westin’s analyses have provided insights into the ways in which privacy policies emerge in response to public concerns. But critics have suggested that for a number of reasons, including the influence of corporate sponsors and a preoccupation with assessing the public’s response to the issue of the day, Westin’s empiricism has stretched its theoretical foundations beyond their useful limits.49

The work within sociology on surveillance (and, by extension, its relationship to privacy) considers the effect of surveillance on individuals and society. These effects can occur even in the absence of actual surveillance if individuals believe that they are being observed—they are intangible in the sense that they operate on individuals’ states of mind, but they are no less real. This feeds into the worries about the impact of technology advances, since many of those advances enable the observation of an individual without the knowledge of the person being observed. Even if these beliefs are groundless, they can change the behavior of the individual. Sociological studies attempt to understand these effects.

44. Frank Webster and Kevin Robins, Information Technology: A Luddite Analysis, Ablex, 1986.

45. See, for example, Anthony Giddens, The Nation State and Violence: Volume Two of a Contemporary Critique of Historical Materialism, Polity Press, Cambridge, Mass., 1985. This work integrates an understanding of bureaucracy from Max Weber (see Reinhard Bendix, Max Weber, an Intellectual Portrait, Doubleday, 1960) and panoptic surveillance from Michel Foucault (Discipline and Punish: The Birth of the Prison, Vintage Books, 1979).

46. Christopher Dandeker, Surveillance Power and Modernity, Polity Press, Cambridge, Mass., 1990.

47. Oscar H. Gandy, Jr., The Panoptic Sort: A Political Economy of Personal Information, Westview, 1993.

48. Stephen T. Margulis, “On the Status and Contribution of Westin’s and Altman’s Theories of Privacy,” Journal of Social Issues 59(2):411-429, 2003.

49. Oscar H. Gandy, Jr., “The Role of Theory in the Policy Process: A Response to Professor Westin,” pp. 99-106 in Charles Firestone and Jorge Reina Schement, eds., Towards an Information Bill of Rights and Responsibilities, The Aspen Institute, 1995.

Sociological perspectives on privacy also examine the consequences that flow from the lack of privacy. For example, although one could define privacy as “that which can be lost under excessive surveillance,” sociological analyses would highlight the point that under excessive surveillance far more is lost than just privacy. Human dignity, equal opportunity, social justice, and equality for groups with historical disadvantages—among other values—can suffer as a result of excessive and inappropriate surveillance. On the other hand, and somewhat ironically, surveillance may also be a factor in documenting wrongs. (For example, a video surveillance tape may inadvertently capture an incident of police misconduct.) To approach the topic using only the language of privacy is to miss some of the pressing social, ethical, and political issues raised by contemporary surveillance. Alternatively, one might formulate issues of human dignity, equal opportunity, social justice, and racial parity as some of the areas in which societal or group harms may result from a loss of (individual) privacy.

It is helpful to consider some of the distinctions between personal identity and externally constructed identification.50 For example, David Phillips accepts the critical distinction between identity and identification,51 noting the differences in agency that typically place within bureaucratic organizations the power to impose identification on someone. Furthermore, Phillips clarifies the distinctions among nominal, indexical, and descriptive forms of identification (a brief sketch following the list illustrates the three forms):

  • Nominal identification refers to the names people have been given. Such names often do not provide unique identification in that more than one person can have the same name. Additional data can reduce the number of persons to whom these data apply. Biometric data are assumed to reduce the range of associations rather dramatically.

  • Indexical identification associates information about place and time in order to enable a particular person to be “identified” in some way.

  • Descriptive identification refers to the ways in which identification can be based on an association of attributes, behaviors, and locations with other markers.
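As a rough illustration of these distinctions, the sketch below renders the three forms as simple data structures; the class names and fields are hypothetical and are not drawn from Phillips’s article.

    from dataclasses import dataclass, field
    from typing import Any, Dict

    @dataclass
    class NominalID:
        """Identification by a given name; rarely unique on its own."""
        name: str                     # e.g., "J. Smith" -- many people share it

    @dataclass
    class IndexicalID:
        """Identification by presence at a particular place and time."""
        location: str                 # e.g., a camera position or cell tower
        timestamp: str                # when the person was observed there

    @dataclass
    class DescriptiveID:
        """Identification by an association of attributes, behaviors,
        and locations with other markers."""
        attributes: Dict[str, Any] = field(default_factory=dict)

    # Only the nominal form involves a name at all; the other two can
    # single a person out, or assign a person to a group, without one.
    profile = DescriptiveID(attributes={"zip": "02138", "car": "Volvo"})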

50. Oscar H. Gandy, Jr., “Exploring Identity and Identification in Cyberspace,” Notre Dame Journal of Law, Ethics and Public Policy 14(2):1085-1111, 2000.

51. David J. Phillips, “Privacy Policy and PETs: The Influence of Policy Regimes on the Development and Social Implications of Privacy Enhancing Technologies,” New Media & Society 6(6):691-706, 2004.


Descriptive identification plays an important role in the identification of groups, and such identification can enable and justify discriminatory actions that reduce the space for autonomous action that people might otherwise enjoy. Such issues are often discussed under the heading of group privacy.

Group privacy is not a new concept,52 although most of the attention that has been paid to the concept in the past has been focused on the freedom of association. Privacy scholars have begun to argue that group privacy should also be understood as a right of self-determination that is increasingly limited by the use of transaction-generated information to define group membership on the basis of behavioral rather than biological attributes. Developments in genetic research may or may not establish linkages between biology and behavior, but the emerging concern with behavioral identification is different.

The reason is that behavioral identification or characterization of groups can be done covertly. Persons who are members of groups defined on the basis of biological markers (age, race, gender) have some opportunity to mobilize and petition for rights on the basis of a common identity. Persons who are members of groups that have been identified and defined on the basis of statistical analyses are less likely to be aware of the identities of others in their group, even if they manage to discover the nature of their own classification. These ascriptive groups often have names that are used only by the organizations that produce them.53

Because the existence of such groups is rarely made public, and little effort is made to inform group members of their status, they are less likely to organize politically to press for their rights. To the extent that members of social groups that have traditionally been victims of discrimination are also members of statistically derived groups of “disfavored others,” it seems likely that their social position will reflect the impact of cumulative disadvantage.54

An important cluster of concerns is inherent in an information-based ability to discriminate against persons as individuals or as members of groups in the name of increasing efficiency and economic competitiveness or of meeting other socially desirable goals.55 The ability to discriminate has the potential to reinforce differences in power and privilege within social systems. Discrimination is an exercise in social sorting, and such sorting relies heavily on personal information. Sorting occurs for all kinds of reasons—some benign and some insidious—but social sorting can mean that individuals in some categories will face more limitations on their opportunities to choose than will individuals in other categories. On the other hand, the protection of privacy means that some personal information will not be available for such sorting.

52. Edward J. Bloustein, Individual and Group Privacy, Transaction Publications, 1978. Also relevant is S. Alpert, “Protecting Medical Privacy: Challenges in the Age of Genetic Information,” Journal of Social Issues 59(2):301-322, 2003.

53. The names for communities produced by users of geo-demographic targeting techniques may be interpretable by the general public, but those names are rarely made public because people so identified are often angered when they learn of their identification. Examples might be a group characterized as “shotgun and pickups” or “Volvo-driving Gucci lovers.”

54. The concept of cumulative disadvantage is examined in considerable detail in R.M. Blank, M. Dabady, and C.F. Citro, eds., Measuring Racial Discrimination, The National Academies Press, Washington, D.C., 2004.

55. See, for example, Frederick Schauer, Profiles, Probabilities, and Stereotypes, Harvard University Press, Cambridge, Mass., 2003, and Bernard E. Harcourt, “Rethinking Racial Profiling: A Critique of the Economics, Civil Liberties, and Constitutional Literature, and of Criminal Profiling More Generally,” University of Chicago Law Review 71(Fall):1275-1381, 2004.

As a matter of national policy, it is illegal to make certain decisions (e.g., on employment, housing) on the basis of categorical distinctions regarding race, gender, religion, and certain aspects of lifestyle. At the same time, many believe that attention to these very factors is appropriate as a tool to overcome historic and persistent disadvantage. But the conflicts are clear, and the sociological and cultural challenge is thus to determine what kinds of information about persons are and are not appropriate to use and under what conditions. New contested means of classification and identification are likely to continually appear and to be challenged as values conflict. There is no simple answer to such questions, but discussion of and openness with respect to the factors used in social sorting for public policy purposes are of the utmost importance.

2.4
AN INTEGRATING PERSPECTIVE

The discussion above of philosophical, economic, and sociological perspectives on privacy indicates that understanding privacy in the information age requires consideration of a variety of approaches, methods, and ideas. Taken as a whole, the privacy literature is a cacophony, suggesting that trying to define privacy in the abstract is not likely to be a fruitful exercise. Indeed, a number of varied and sometimes incommensurate perspectives on privacy were reflected in the committee. But the committee also found common ground on several points among its members, witnesses, and in the literature.

The first point is that privacy touches a very broad set of social concerns related to the control of, access to, and uses of information—this report emphasizes privacy as access to and control over information about individuals. An interesting question is whether privacy is a concept relevant to information about groups of people, although of course for many purposes a group can be treated as an individual (e.g., corporations are given the legal right to make contracts and to sue and be sued as legal persons).

Second, to the extent that privacy is a “good thing to have,” it sometimes conflicts with other good things to have. Thus, needs for privacy must sometimes be balanced with other considerations—complaints of some privacy advocates or privacy foes about excessive or inappropriate balancing notwithstanding. How this balance is to be achieved is often the center of the controversy around privacy. Complicating the effort to find the appropriate balance are the tendency to confuse the needs of privacy with other values that might be tied to privacy but are, in fact, distinct from it, and the differential impact of policies on various social groups and over time. For example, the privacy of personal health information is often related to concerns about discriminatory access to health care; this point is discussed further in Box 7.1 in Chapter 7.

Third, privacy in context is much more understandable than privacy in the abstract. As the literature illustrates, agreement on a broad analytical definition of privacy in the abstract is difficult if not impossible. But discussions of the privacy implications of specific events and practices are easier to understand and discuss. One approach to grounding a discussion of privacy in context is the use of anchoring vignettes (Box 2.2). The anchoring vignette technique can be useful for understanding the impact of any given technology on privacy by posing a set of grounded, scenario-specific questions that can be answered with and without the presence of that technology. It can also be helpful for understanding public perceptions of privacy in various contexts. In this report, the technique is used in multiple ways to illustrate and to help unpack intuitions about privacy in different contexts.

Knowing a context for privacy discussions does not result in an overarching theoretical definition of privacy. Nor does it represent an agreement about the level of privacy that is appropriate in any given situation. However, knowing the relevant dimensions of privacy and what “more” or “less” privacy might mean in the specific context of each dimension does clarify the discussion, and the anchoring vignette technique is one useful approach to obtain such knowledge.

The context-sensitive nature of privacy makes it clear that questions about privacy necessarily imply specifying privacy “from whom,” “about what,” “for what reasons,” and “under what conditions.” For example, a set of possible privacy violators might include one’s employer; family; friends, acquaintances, and neighbors; researchers; businesses; and government. Associated with each of these potential violators is an “about what”—a (different) set of information types that might arise with any given possible privacy violator. For example, in the context of an employer

as a possible privacy violator, one might be concerned about surveillance of work or about drug testing. By contrast, in the context of friends, acquaintances, and neighbors as possible privacy violators, one might be concerned about personal secrets, nudity, sex, medical information, and invasiveness.56

56. In thinking through who might be a possible privacy violator, it also helps to consider parties with whom one might be willing to share information. Although in some sense one is the complement of the other, in practice the complement is more likely to be fuzzy, with zones of more gray and less gray rather than sharp boundaries between black and white.

BOX 2.2
The Anchoring Vignette Approach to Grounding Discussions of Privacy

Developed by committee member Gary King and others, an anchoring vignette is a brief description of a specific situation involving personal information.1 Organized into related sets in which a range of privacy considerations are manifest, the vignettes help to collect, articulate, and organize intuitions about privacy in a more precise and empirical fashion; clarify assumptions about privacy; empirically document views on privacy; and serve as a good tool for illustrating, expressing, and communicating existing concepts of privacy.

Vignettes have been used extensively for conducting actual surveys and in helping develop actual survey instruments, but in the context of this report they help to define the concepts so that all participants in a privacy discussion have the same frame of reference. Although they are not intended to suggest a particular policy to adopt, anchoring vignettes help to provide a lingua franca for privacy, and so they may be of use to citizens in attaining a better understanding of public policy regarding privacy. The vignettes form a continuum along which various policy scenarios can be placed and in that sense can help to frame questions that might be asked about any given policy.

To illustrate, consider the issue of privacy with respect to criminal charges. A set of useful vignettes might be as follows:

  1. [Jonathan] was arrested on charges of assault and battery last year. He lives in a county that stores records of criminal charges at the police headquarters, where there is no public access.

  2. [Monali] was arrested on charges of assault and battery last year. She lives in a county that maintains all records of criminal charges for public inspection at the county courthouse.

  3. [David] was arrested on charges of assault and battery last year. He lives in a county that maintains all records of criminal charges at the county courthouse for public inspection and in an electronic database, to which any police officer or county official has access.

  4. [Andrea] was arrested on charges of assault and battery last year. She lives in a county that posts all criminal charges on the Internet. The Web page includes pictures and detailed profiles of all arrested.

Our intuitions about privacy in each of these situations reflect our answers to questions such as, How much privacy does the individual have in this situation?, Does David have more privacy than Andrea?, and so on. We can also ask how much privacy the individual should be granted in the situation.

One way to think about these vignettes is to imagine being asked a survey question about each vignette or even about yourself: How much privacy [does “Name”/do you] have? (a) unlimited, (b) a lot, (c) moderate, (d) some, (e) none? The imagined survey context helps to make the examples concrete and clarifies how they are to be read. Although such vignettes are often used for survey research, defining privacy from the bottom up does not involve administering a survey or necessarily asking these questions of others.

For each set of anchoring vignettes (denoting privacy in one specific context), different people will have different views about what thresholds delineate levels of privacy below which violations should be considered undesirable, unethical, illegal, or immoral. Agreement on normative issues like these will always be difficult to achieve. The anchoring vignette-based approach to privacy thus does not resolve all normative issues, but it helps to clearly define the playing field.

Note also that vignettes can be modified to illustrate different scenarios. For example, the above scenario can be modified by substituting “convicted” for “arrested on charges” and “convictions” for “charges.” It is likely that such changes might cause at least some people to reevaluate their answers.

1. Gary King, Christopher J.L. Murray, Joshua A. Salomon, and Ajay Tandon, “Enhancing the Validity and Cross-cultural Comparability of Survey Research,” American Political Science Review 98(1):191-207, 2004, available at http://gking.harvard.edu/files/abs/vign-abs.shtml. See also Gary King and Jonathan Wand, “Comparing Incomparable Survey Responses: New Tools for Anchoring Vignettes,” Political Analysis, forthcoming, 2007, available at http://gking.harvard.edu/files/abs/c-abs.shtml. Extensive examples and other information can be found at the Anchoring Vignettes Web site, at http://gking.harvard.edu/vign/. The committee thanks Dan Ho and Matthew Knowles, who assisted in the development of material on anchoring vignettes presented to the committee during its open data-gathering sessions.

The kinds of social roles and relationships involved are as central as

the goals, location, type of technology, and data involved, and the conditions under which personal information is collected and used. Indeed, what constitutes privacy, what information should be private, and what individuals or institutions are posing potential threats to that privacy are all questions subject to considerable debate. A related set of questions involves the circumstances under which privacy can be seen to go too far. Under some conditions the failure to discover or reveal personal information can be harmful socially (e.g., in the case of potential for exposure to deadly contagious diseases or a person with a history of violent and abusive behavior).
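Returning to Box 2.2, the way vignettes “anchor” otherwise incomparable survey answers can be suggested with a simple relative-position computation. The sketch below is a simplified illustration of the general idea behind nonparametric anchoring (see the citations in the box), not the exact estimator used in that literature; the respondents and their ratings are invented.

    def anchored_position(self_rating, vignette_ratings):
        """Count the vignettes a respondent rates as having more privacy
        than himself or herself. vignette_ratings run from the most
        private scenario (Jonathan) to the least (Andrea), on the box's
        5-point scale (5 = unlimited privacy, 1 = none)."""
        return sum(1 for v in vignette_ratings if v > self_rating)

    # Two hypothetical respondents use the 5-point scale differently but
    # occupy the same relative position: each rates himself or herself
    # below the first three vignettes, i.e., between David and Andrea.
    respondent_a = anchored_position(2, [5, 4, 3, 1])
    respondent_b = anchored_position(3, [5, 5, 4, 2])
    print(respondent_a, respondent_b)  # -> 3 3

Because each self-assessment is located relative to the respondent’s own ratings of the fixed scenarios, differences in how individuals use the response scale wash out, which is what makes the vignettes useful as anchors.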
