Engaging Privacy and Information Technology in a Digital Age (2007)

Part IV
Findings and Recommendations


10
Findings and Recommendations

10.1
COMING TO TERMS

Finding 1. The meaning of privacy is highly contextual, and it can vary depending on the specific circumstances at hand, such as the situation and relationships at issue, the intentions of the parties involved, and the historical context, technology, and political environment.


Chapters 1 and 2 of this report note that in both everyday discourse and the scholarly literature, a commonly agreed-upon abstract definition of privacy is elusive (Section 1.2). For example, the “privacy” under discussion may involve protecting the confidentiality of information; enabling a sense of autonomy, independence, and freedom to foster creativity; wanting to be left alone; or establishing enough trust that individuals within a given community are willing to disclose data under the assumption that it will not be misused.

Nevertheless, it is often possible to find agreement on the meaning of privacy in specific contexts (Section 2.4). In other words, the meaning of privacy depends on many specifics about the situation at hand, e.g., the situation and relationships at issue, the intentions of the parties involved, and the historical context, technology, and political environment. For example, informational privacy involving political and religious beliefs raises different issues than does health information with respect to a contagious disease. A conversation with one’s attorney is different from a speech in a public park or a posting on an Internet bulletin board. Agreement on the meaning of “privacy” outside the specified context is not necessary, but for making progress in a specific context, a common understanding is essential. In many cases, simply clarifying the terms constitutes progress in itself, and indeed may on occasion be sufficient to reduce the need for further argument.

Because the committee found that common to almost all notions of privacy is a privileged status for personal information (privileged in the sense of information that is not immediately known or accessible to others), this report has focused on the meaning and implications of privacy as it relates to the gathering, aggregation, analysis, distribution, and use of personal information. A successful discussion about privacy policies requires the clear identification of both the nature of the personal information in question and the relevant contextual factors.

Regarding the nature of the personal information, it is important to probe in several areas discussed in Section 2.1.3:

  • Data capture, which includes the type(s) of personal information in question (e.g., Social Security number, medical information, publicly available information) and the circumstance and means of its capture;

  • Data storage, which includes the time period for which data will be retained and available for use and the circumstances and means of storage (e.g., media used), and the protections for personal information while it is available for a specific use;

  • Data analysis and integration, which includes the nature of the process through which the information is analyzed and the links that might be made to other data; and

  • Data dissemination, which includes the parties who will have access to the information, the form(s) in which the information is presented, the type of harm that might result from unwelcome disclosure or dissemination, and the extent to which this information has privacy implications for other individuals.

Regarding the relevant contextual factors, it might be useful to probe about the following:

  • What is the relevant and applicable social and institutional context? For example, are rewards or benefits offered for sharing personal information? Is coercion used in the form of withholding benefits when personal information is not shared? Does the individual retain control over the initial and potential future uses of her information? Does she have the opportunity to review and correct personal information?

  • Who are the actors and institutions involved? These might include the subject of the information, the provider of the information (which may not be the subject), the original recipients of the information, subsequent recipients, and other individuals who might be affected without their active involvement—and the relationships among them.

  • What are the stated and unstated motivations, goals, or purposes of the actors? Why do the recipients of the information want it? How might the information be repurposed—used for a purpose other than that for which it was originally collected—in the future?

  • How are decisions made when there are competing interests regarding personal information, for example, public health needs versus individual privacy or national security versus civil rights interests?

  • What are the informational norms in question? As noted in Chapter 2, informational norms specify how different kinds of information about various actors in the given context can flow. These norms can be illuminated in many instances through the technique of applying anchoring vignettes as described in Chapter 2. Relevant issues concerning these norms might include:

    • The extent to which information is provided voluntarily (e.g., is providing the information required by law; is the information acquired covertly or deceptively);

    • The extent to which information can be passed along to third parties and the circumstances of such passing (e.g., is it part of a financial transaction);

    • The extent to which reciprocity exists (is the subject entitled to receive information or other benefits from the recipient);

    • The extent to which the gathering of information is apparent and obvious to those to whom the information pertains;

    • Limitations on the use of the information that are implied or explicitly noted;

    • Whether or not the act of subsequently providing information is known to the subject; and

    • The extent to which collected information can/might be used for or against others (e.g., relatives, other members of a class).
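
Taken together, these probes amount to a structured checklist. The hypothetical sketch below shows one way an analyst might record them; every field name is our own illustrative shorthand for the bullets above, not terminology drawn from the report itself.

```python
# A hypothetical sketch of the probes above as a structured checklist.
# All field names are our own shorthand for this section's bullets.
from dataclasses import dataclass, field

@dataclass
class PersonalInformationProfile:
    data_captured: str                 # what is collected, and how
    storage: str                       # retention period and protections
    analysis_and_links: str            # how analyzed; links to other data
    disseminated_to: list[str] = field(default_factory=list)  # who gets it

@dataclass
class ContextProfile:
    social_setting: str                # rewards, coercion, subject control
    actors: list[str]                  # subject, provider, recipients, others
    purposes: list[str]                # stated/unstated goals, repurposing
    informational_norms: str           # voluntariness, reciprocity, notice

# Example: a grocery loyalty-card program, sketched for discussion.
info = PersonalInformationProfile(
    data_captured="purchase history captured at checkout via loyalty card",
    storage="retained indefinitely on merchant servers",
    analysis_and_links="purchase patterns linked to name and address",
    disseminated_to=["merchant", "marketing partners"],
)
ctx = ContextProfile(
    social_setting="discount offered in exchange for the information",
    actors=["customer", "merchant", "marketing partners"],
    purposes=["tailored offers", "unknown future repurposing"],
    informational_norms="voluntary but incentivized; reuse terms unclear",
)
print(info, ctx, sep="\n")
```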

One important corollary of Finding 1 is that policy debates are likely to be sterile and disconnected if they are couched simply in abstract terms. It should thus be expected that policy debates involving privacy will be couched in the language of the specific context involved—and such context-dependent formulations are desirable. The reason is that even if the issues themselves seem to carry over from one context to another, the weighting of each issue and hence the relationships of issues to each other are likely to depend on the specific context.

A second corollary is that because privacy has meaning only in context, the incidence of privacy problems (e.g., violations of privacy) is poorly defined outside specific contexts, and overall quantitative measures of privacy are not particularly meaningful. What may be more meaningful is careful delimitation of the claims made on the basis of domain-specific data. In the identity theft domain, for example, one might report the number of individuals per year whose names and Social Security numbers were potentially compromised by a security breach, rather than asserting those numbers as a direct measure of identity theft.

A third corollary is that privacy is not primarily a technological issue—technology cannot violate or guarantee privacy. Technology can enhance or detract from the secrecy of information or the anonymity of an actor, but these are not the same as privacy. The nature and extent of privacy in any given context are tied to many factors, including the way in which information is accessed, the intentions of those accessing the information, and the trust relationships between the user of the information and the subject of the information.

10.2
THE VALUE OF PRIVACY

Finding 2. Privacy is an important value to be maintained and protected, although it is not an absolute good in itself.


As noted in Chapter 2, privacy is an important value to be maintained and protected. Certain types of privacy (e.g., those involving religious beliefs and political ideas or certain aspects of the body) approach the status of fundamental human rights. They are related to our most cherished ideals of the dignity of the person, the family, liberty, and democracy.

At the same time, the committee does not view privacy as an intrinsic and absolute good independent of particular situations. There are times when crossing the informational borders of the person is appropriate, and failing to do so would be irresponsible. That is, the committee recognizes situations and contexts in which society negotiates appropriate tradeoffs between privacy and other values (as discussed below), such as public health and safety. To note this is not to deny the centrality of privacy to human dignity, candor, and intimacy, as well as to a democratic society. Privacy is thus a means as well as an end, and the committee recognizes considerable instrumental value in privacy—privacy in the service of other important goals. Beyond instrumentality, privacy has important symbolic value in demonstrating societal respect for the individual.


Finding 3. Loss of privacy often results in significant tangible and intangible harm to individuals and to groups.


In one obvious example, protecting the privacy of one’s personal information helps to make one safer from crimes such as fraud, identity theft, and stalking. (When undertaken on a large scale, identity theft can also have important negative effects on society, as suggested by the use of identity theft as an element in the financing of terrorist groups and their operations; see Box 4.1.) But such tangible harms, striking though they are, affect far fewer people than the less tangible harms suggested in Section 1.3. These intangible harms can be regarded as the consequential damages, to individuals and to society, that result from the loss or compromise of privacy, and they are no less real or significant for being intangible. Consider:

  • A person whose personal information (name, address, Social Security number, and so on) may have fallen into the hands of identity thieves may not in fact suffer from an actual fraudulent purchase made in her name. But if the breach is identified and the subject learns of it, she will likely worry about being victimized and thus must look over her shoulder for a very long period of time. She may have to scrutinize her credit card statements more carefully; she may have to subscribe to a credit-monitoring service; she may have to put a freeze on her credit report and thereby deny herself the convenience of obtaining instant credit at a store. She may live in fear of assault, public embarrassment, or defamation, not knowing who has the information or how it might be used. Thus, absent the protection of her information, she stands to lose real benefits and the intangible peace of mind that she would otherwise enjoy, even if no actual direct harm occurs, not to mention the many dozens of hours needed to repair her records and relationships. Furthermore, it takes only a few such well-publicized incidents (i.e., a small number compared with the number of possible instances where it could happen) to cause a very large number of people to lose trust in electronic commerce and related matters—and thus to refrain from engaging in such commerce. Such broader impacts have larger consequences for the economy as a whole than simply the impact on the individuals directly affected by identity theft.

  • Under public surveillance, many people change their behavior so that they are not seen as acting anomalously in any way, even if their behavior absent surveillance would be perfectly legal and ethical. For example, an interracial couple may walk down the road holding hands and even sneak a kiss. With surveillance cameras visibly trained on the road, they may not kiss, they may not hold hands, and they may even change their route so that they are not under video surveillance. Public surveillance may reduce the likelihood that someone would attend a public demonstration in which he might otherwise participate. In short, surveillance often has the effect of influencing the behavior of people in the direction of greater conformity and homogeneity. Greater conformity is sometimes defensible, as might be the case when safe driving can be linked to automatic traffic camera surveillance. But surveillance in some instances has negative consequences, and in a culture and society that celebrate diversity and embrace tolerance, such chilling effects are not at all positive. More broadly, privacy supports many democratic societal values, such as the right to freely associate, the embrace of social diversity, and even the use of secret ballots in support of free elections, and consequently the loss of privacy can affect the entire society.

  • Through the analysis of a variety of personal information, the U.S. government has placed many individuals on “watch lists” as suspected terrorists who should be denied airplane boarding privileges or entry into the United States. Individuals on these watch lists cannot know of their status until they are denied boarding or entry—and if they are in fact not terrorists, they suffer the consequences of mistaken identity. Further, they have no recourse—no way to be made whole—for the consequences they suffer.

  • Workplace surveillance changes the workplace environment, almost by definition. But unlike most unfocused public surveillance, the very purpose of workplace surveillance is to change the behavior of everyone within its purview. From the standpoint of employees, poorly explained surveillance (which is often simply presented as a fait accompli) can deaden the work environment, making it feel hostile and restrictive, a place where workers are not trusted and “are treated like children.” Ironically, workers are likely to respond to monitoring they see as unreasonable in ways that undermine the goals of the organization, and such surveillance may raise the level of stress among workers in ways that limit their productivity.

  • A voter without privacy is subject to coercion in casting his or her vote. Indeed, it was for just this reason that the secret ballot was gradually introduced in the United States in the late 19th century. With a secret ballot, there is no way to prove how an individual voted, and thus a voter can cast his or her vote freely without fear of later retribution. Secret ballots also impede vote buying, since voters can vote one way and tell their paymasters that they voted the way they were paid to vote.

  • The availability of personal information about an individual enables various organizations to provide him or her with information or product and service offerings customized to the interests and patterns reflected in such information. While such information and offerings do have benefit for many people who receive them, they can have negative effects as well. For example, personal medical information made available to drug manufacturers may result in drug advertisements being targeted to individuals with certain diseases. Receipt of such advertisements at one’s family home can compromise the privacy of the individual’s medical information if the diseases associated with such drugs are socially stigmatizing.

  • People who lose control of their personal information can be subject to discrimination of various kinds (Section 2.3). As a society, we have made a choice that discrimination based on race and religion (among other things) should be illegal as a matter of public policy. But there are many other distinctions that can be made when detailed personal information is available that facilitates the classification and assignment of people to groups—groups defined not by race or religion but by some nameless statistical sorting on multiple dimensions. Members of groups so defined can be denied services, information, opportunities, or employment to which they would otherwise be entitled if that personal information had been kept private. For example, political campaigns can use collections of personal information to tailor different messages to members of different groups that are designed to appeal to their particular views and attitudes. Such practices work against full disclosure and a community-wide consideration of the issues.

These examples underscore the committee’s categorical rejection of the notion that if you have done nothing wrong, you have nothing to fear from a loss of privacy.

It should also be noted that the ability to put individuals under surveillance is often as significant in changing behavior as the reality of such surveillance. From dummy surveillance cameras intended to deter crime to fellow diners in a cafeteria who might be listening to a private conversation, there are many ways in which potential surveillance can affect behavior.


Finding 4. Privacy is particularly important to people when they believe that the entity receiving their personal information is not trustworthy and that they may be harmed by sharing that information.


Trust is an important issue in framing concerns regarding privacy. In the context of an individual providing personal information to another, the sensitivities involved will depend on the degree to which the individual trusts the receiving party to refrain from acting in a manner contrary to his or her interests (e.g., to refrain from passing the information along to someone else, or from using it as the basis for a decision with inappropriately adverse consequences). As an extreme case, consider the act of handing a complete dossier of personal information, printed on a stack of paper, to a person who will destroy it. If the destruction is verifiable by the person providing the dossier (and if there is no way for the destroyer to read it), it would be hard to assert the existence of any privacy concern at all.

But for most situations in which one provides personal information, the basis for trust is less clear. Children routinely assert privacy rights to their personal information against their parents when they do not trust that parents will not criticize them or punish them or think ill of them as a result of accessing that information. (They also assert privacy rights in many other situations.) Adults who purchase health insurance often assert privacy rights in their medical information because they are concerned that insurers might not insure them or might charge high prices on the basis of some information in their medical record. Many citizens assert privacy rights against government, although few would object to the gathering of personal information within the borders of the United States and about U.S. citizens if they could be assured that such information was being used only for genuine national security purposes and that any information that had been gathered about them was accurate and appropriately interpreted and treated (as discussed in Section 9.2.5). Perversely, many people hold contradictory views about their own privacy and other people’s privacy—that is, they support curtailing the privacy of some demographic groups at the same time that they believe that their own should not be similarly curtailed. This dichotomy almost certainly reflects their views about the trustworthiness of certain groups versus their own.

In short, the act of providing personal information is almost always accompanied to varying degrees by a perceived risk of negative consequences flowing from an abuse of trust. The perception may or may not be justified by the objective facts of the situation, but trust has an important subjective element. If the entity receiving the information is not seen as trustworthy, it is likely that the individuals involved will be much more hesitant to provide that information (or to provide it accurately) than they would be under other circumstances involving a greater degree of trust.

10.3
PRESSURES ON PRIVACY

The discussion in earlier chapters suggests that many pressures are increasingly limiting privacy. Among them are advancing information technologies, the proliferation of mechanisms for obtaining personal information, the value of personal information to business and government, and changing social norms and needs.


Finding 5. Although some developments in information technology (IT) and other technologies do have considerable potential to enhance privacy, the overall impact of advancing technology, including IT, has been to compromise privacy.


One obvious pressure on privacy is the evolution of information technology writ large, an evolution that has resulted in greater capability to invade and compromise privacy more deeply and more easily than ever before. One might ask whether this result was inevitable—whether under a different set of societal structures and different notions of power and privilege the evolution of IT might have done more to enhance privacy. But even though some developments in IT do indeed have the potential to enhance privacy, there is little doubt that the overall impact of advancing IT has been to compromise privacy in important ways.

For example, the rapidly decreasing cost of storing information has meant that personal information on an individual, once collected, may generally be available for potential use forever unless special measures are taken to destroy it (Chapter 3). Even when there is no particular need to keep information for a long time, it is often kept by default, because deciding what to destroy or delete is more expensive than simply maintaining it in storage. Such information is easily, if not routinely, added to existing databases on the individual, which means that the volume of information about an individual only grows over time.1

1. An example is a person’s medical history, much of which is irrelevant to an individual’s current medical status. Information regarding major medical events (surgeries, major diseases) and associated significant data such as reports on operations, X rays, and pathology reports continue to be useful, but much of the medical record over time becomes filled with data that may be maintained for medical-legal purposes but has little value to the treating physician long after the fact. Such data might, for example, include lab work taken during a critical event or during routine care many years in the past.

A second example is the proliferation of smaller, less expensive, and more easily deployed sensors that can readily obtain information in their ambient environment, information that is sometimes personal information about individuals.

Technology has also facilitated greater access to information (Section 3.4). Nominally public records stored on paper are far less accessible than records whose contents are posted on a Web site or otherwise available online, and in that sense they are more private, apart from any rules regulating access to them. For example, property tax records have been available to the public in most municipalities for decades. The inconvenience of access has prevented widespread knowledge of neighbors’ property values, but when such information is available via the Internet, it is disseminated much more broadly.

More generally, information technology is a rapidly changing field. New information technologies—and new sensor, biometric, and life science technologies, too—often offer capabilities that are poorly understood and poorly accounted for in public debates or in individuals’ expectations of privacy. Traditional expectations about information are in a sense under continuous bombardment from such changes, and prior beliefs, understandings, and practices are not necessarily an adequate guide or control with respect to the torrent of new developments. The net result is that the appearance of new technologies rekindles debates and arguments that might otherwise have been regarded as settled.


Finding 6. Businesses, researchers, and government agencies increasingly find value in the exploitation of personal information, which leads to many pressures for repurposing of collected data.


A second pressure is the fact that the IT-enabled exploitation of personal information has many benefits for modern business in enhancing the economic bottom line (Chapters 6 through 8). Activities such as increasing consumer choice, reducing economic risks, directing customized product offerings to consumers, and hiring and placing employees in the most cost-effective fashion become feasible as business strategies only when there is personal information available to support them. Researchers rely on collections of personal information to derive statistical trends of importance to public policy makers. Government authorities increasingly seek more and better ways of using personal information to provide services (Section 6.8), administer benefits, and enhance security (Chapter 9).

Furthermore, the ways in which personal information can be exploited to benefit the bottom line of businesses or the mission capabilities of government agencies seem to be limited only by the creativity of the human mind (Section 6.1). This is significant because data, once collected for a given purpose, can easily be used for a different purpose in the future—this is especially true because the data can be retained indefinitely at little cost. Today’s databases for the most part have not been designed to restrict the use of the data they contain to any specific purpose. Many of the privacy concerns discussed in this report center on cases in which information gathered for one purpose is reused for a different purpose with neither notice nor consent. Whether it is the use of medical information for the marketing of pharmaceuticals, the conversion of timestamps on toll-road tickets for the calculation of average speed (and the subsequent issuing of speeding tickets), or the reuse of information collected from patients in a long-term epidemiological research study, repurposing of previously collected data creates myriad privacy concerns.

A particularly interesting kind of personal information is information that today is not easily personally identifiable but may be more identifiable in the future. For example, surveillance cameras in public places take many pictures of people. Today, the automated recognition of facial images taken by such cameras is difficult and unreliable—but the technology of facial recognition will almost certainly improve in the future. As the technology improves, and databases of facial images are populated, it is entirely possible that businesses and government will develop new ways of exploiting personal information that is not identifiable today.


Finding 7. Privacy considerations are relevant throughout the life cycle of personal information that is collected, and not just at the beginning of the collection process.


The collection and use of personal information is typically not a single event, but a process with considerable duration and organizational support. Finding new ways to exploit already-collected information extends the life cycle even further. Thus, privacy considerations need to be taken into account on a continuing basis.


Finding 8. Businesses and government agencies have developed many mechanisms—both voluntary and intrusive—for obtaining personal information.


Because institutions both public and private find high value in the IT-enabled large-scale availability of personal information, they continually find ways to maintain and expand the availability of such information from individuals. Though there are many variants, these mechanisms can be grouped into a few categories:

  • Mandated disclosure is, by definition, coercive. For example, taxpayers must provide detailed financial information on income tax returns. Convicted felons in a number of states must provide DNA information for entry into a database to which law enforcement officials have access. Convicted sex offenders must register with law enforcement authorities, and communities must be notified about the presence of such individuals living in them. In each case, failure to provide the required information is punishable by law.

  • Incentivized disclosure is arguably voluntary. Individuals are persuaded to provide personal information by the offer of some incentive—an offer that may be difficult to refuse. For example, merchants often offer customers “loyalty” cards, the presentation of which at the cash register entitles the customer to a discount on the merchandise being bought. In exchange, the merchant obtains a record of all purchases (and patterns over time) made using this loyalty card. The use of this data may enable the merchant to better tailor product offerings to its customers’ revealed preferences. Customers who prefer that no record be made of their purchases need not present a loyalty card, and in some cases they may request that a generic loyalty card be used to provide the discount.

  • Conditioned disclosure lies between incentivized disclosure and mandatory disclosure. Obtaining a certain good or service is conditioned on the recipient providing personal information. Furthermore, the good or service in question is arguably very important—perhaps nearly essential—to the activities of one’s daily life or the obligations of citizenship. Driving a car, traveling on an airplane, voting, and being employed are voluntary activities in some sense, but one must provide personal information in order to engage in them. Walking in a public plaza watched by surveillance cameras is voluntary, but even if notices of surveillance are posted (which may not be the case), avoiding the plaza may not be particularly convenient—and thus surveillance photos of the walker may be taken. Taking a drug test may be a requirement of keeping one’s job—and the results of a urine test may be stored in an employee’s personnel file. Thus, the disclosure of personal information is not quite mandated, because one may indeed make a choice to not obtain the good or service. But it is not quite voluntary either, because doing without the good or service in question would constitute a hardship or a substantial inconvenience for the individual.

  • Entirely voluntary disclosure. People engaged in social interactions with others often exchange information about themselves, but they themselves decide what they will share. A person may have a sense of the other person involved, or of the cultural norms that suggest the nature of the exchange, but for the most part people still decide if and how much information to provide. In other situations, people sometimes voluntarily provide information about themselves as a gesture of affiliation or as evidence of competence, understanding, or empathy (“Yes, that happened to me too; I understand just how you feel”). To the extent that these interactions do not reflect differential power relationships, they can be regarded as entirely voluntary disclosures, and they need not be governed by the expectation of tangible or direct personal benefit or exchange.

  • Unannounced acquisition of information. In such situations, information is not even “disclosed,” since disclosure implies that the individual realizes that he or she is revealing information. But there are many situations in which people disclose personal information without realizing it. Most individuals who make toll-free calls do not realize that the numbers from which they are calling are provided to the called party, and caller-ID services operate without notice to callers. Web bugs and cookies covertly provide information about the Web surfing behavior of individuals. Building entry is often recorded as individuals swipe their electronic ID cards into an access control system. Surveillance photos are often taken at a distance. In some of these cases, individuals subject to these acquisitions of information are in some sense given notice of that fact, but these notices are often provided in such a way that they are easy to ignore or forget.
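
The covert mechanisms named in this last category are technically trivial. As a hedged illustration, the following standard-library Python sketch implements a hypothetical “web bug”: any page that embeds the invisible 1×1 image causes the visitor’s browser to report, without visible notice, who the visitor is and which page was being read. The port, cookie value, and GIF payload bytes are illustrative assumptions, not any real tracking network’s code.

```python
# Hypothetical sketch of a tracking pixel ("web bug"). Serves a 1x1
# GIF and logs who fetched it; payload approximates a minimal GIF.
from http.server import BaseHTTPRequestHandler, HTTPServer

PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00"
         b",\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02D\x01\x00;")

class WebBug(BaseHTTPRequestHandler):
    def do_GET(self):
        # The fetch itself discloses information the visitor never
        # deliberately provided: source address, the page that embedded
        # the pixel (Referer), and any previously set tracking cookie.
        print("hit from", self.client_address[0],
              "| page:", self.headers.get("Referer"),
              "| cookie:", self.headers.get("Cookie"))
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        # Tag the visitor so later hits can be linked into a profile.
        self.send_header("Set-Cookie", "uid=visitor-12345")
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    # Any page embedding <img src="http://localhost:8000/pixel.gif">
    # reports its readers here without visible notice to them.
    HTTPServer(("localhost", 8000), WebBug).serve_forever()
```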


Finding 9. Changing social trends and sentinel events put strong pressures on privacy.


Some forms of privacy invasion that are technically possible may in practice not take place in certain social contexts. Beyond formal law, for example, various professional codes of ethics require practitioners to preserve certain kinds of privacy, and in many cases these codes are sufficient to reassure individuals that personal information revealed to these practitioners will remain private and confidential. Social and cultural norms regarding propriety and civility have also tended to keep certain kinds of personal information that were nominally public from being widely circulated (e.g., information about a public figure’s divorce or extramarital affairs). Manners and common sense can also be important in limiting disclosure and notice.

Nevertheless, a number of social trends have significantly eroded much of the privacy protection that may have resulted from such norms (Section 1.4.3). Once information becomes public, it is virtually impossible to fully expunge it, no matter how privacy invasive, offensive, or incorrect it may be. Personal information that is available is likely to be exploited by those who see economic, political, or other strategic value in its use, independent of societal approval or disapproval. DNA evidence has led to the freeing of some imprisoned individuals and the conviction of others, creating pressure to collect it. Sexual offender notices have led to the harassment and murder of convicted offenders who have served their sentences.2

2. Emily Bazar, “Suspected Shooter Found Sex Offenders’ Homes on Website,” USA Today, April 18, 2006, available at http://www.usatoday.com/news/nation/2006-04-16-maine-shootings_x.htm.

Sentinel events (i.e., dramatic changes in circumstance such as terrorist events and public health crises) often change the privacy environment (Section 1.4.4). Furthermore, the resulting media coverage and political rhetoric often lead to a political environment in which privacy can be reduced or curtailed in ways not previously accepted by the public. This was dramatically illustrated by the speed with which the USA PATRIOT Act was passed in the wake of the September 11, 2001, terrorist attacks.


Finding 10. The power of the state and large private organizations to gather personal information and to act on that information is much greater than the power of the individual to withhold personal information from the state or those organizations or to prevent improper or unjustified actions from being taken.


As noted in Section 9.1.6, there is almost always a substantial imbalance between the power of the state and that of the individual regarding the gathering of information about individuals. Some regard this imbalance as dangerous and improper, and infer that external limits are thus necessary to constrain the ability of government officials to act improperly, even if such constraints complicate or impede the task of law enforcement agencies. Others trust that government officials will use such power only in the interest of the citizenry and thus do not believe that such constraints are necessary, especially if these constraints complicate or impede the task of law enforcement agencies. Whatever one’s views on this matter, it is a reality that must be factored strongly into the debate over privacy.

Similar comments apply to the balance between large private organizations and individuals, although the texture of that balance is different. It is difficult to withhold information from both government and the many large private organizations with which individuals choose to do business. Government and private organizations are subject to some degree of oversight, by the independent branches of government in the former case and by boards and ombudsmen in the latter case. But individuals have no choice in the government laws and regulations under which they live short of moving away,3 and they often have only limited choices in the private organizations with which they interact. Moreover, private organizations do not directly hold coercive powers such as imprisonment, which are reserved for the state. Concerns regarding private organizations nevertheless parallel those regarding government. That is, some people are highly concerned about the imbalance between individuals and private organizations and infer that regulation is thus necessary to constrain organizations’ ability to act in ways that harm individuals, even if such constraints complicate or impede their business and operational missions. Others believe that the power of the marketplace is sufficient to constrain organizational behavior and reject external constraints because they would complicate or impede those same business and operational missions.

3. Of course, citizens in a democracy can vote to support candidates who favor changing laws and regulations that they regard as objectionable. However, this does not change the fact that citizens are obligated to obey all applicable laws and regulations on the books at any given moment, and their only choices at that moment are to accept that obligation or to move to a location beyond the reach of those laws or regulations.

10.4
MAKING TRADEOFFS

Finding 11. Privacy is a value that must often be traded off against some other desirable societal value or good.


While the committee strongly believes that privacy is central to notions of the dignity of the person and is a requisite for a decent and democratic society, it also recognizes the complexity of society and the existence of competing values (Section 1.2). For example, the protection of privacy is sometimes detrimental to economic efficiency—in the absence of certain kinds of information, two otherwise equal parties may not make the most economically efficient decision. Privacy claims can be used to shield criminal acts, and they can also be used to limit scrutiny of acts and behavior that—though not technically illegal—are arguably antisocial or otherwise disapproved.

Depending on one’s weighting of the values at stake, such tradeoffs may mean more privacy and less of X (X being the other value or values at stake), or vice versa. Deciding on the right mix of privacy and X in any given situation thus requires understanding the nature of the tradeoff before one can think systematically about the decisions it involves.

A central feature of many policy tradeoffs involving privacy is the fact that privacy—in the terms usually most relevant to the policy debate—relates to the privacy of individuals, whereas the other X at stake relates to a collective value or good. That is, some individual members of society are asked to accept reductions in privacy in order to benefit the entire society. If these individual members are politically marginalized (e.g., because they are few in number or have no vocal advocates), the political process will similarly marginalize their privacy concerns. (In the past, such groups have included Japanese-Americans during World War II and the U.S. citizens and organizations subjected to National Security Agency communications surveillance in the decade beginning in the early 1960s, many of whom were active in the antiwar and civil rights movements.4) Whether the actions taken are viewed as desirable or undesirable, or as necessary or unnecessary, will vary depending on the conditions, but they should be recognized for what they are.

4. Warrantless FBI Electronic Surveillance, Book III of the Final Report of the Select Committee to Study Governmental Operations with Respect to Intelligence Activities, United States Senate, April 23, 1976.

A similar tradeoff occurs when researchers seek to obtain statistical information from large aggregations of personal information about many individuals. For example, epidemiologists often use the personal health information of many individuals to understand patterns of disease propagation. Although personal health information is usually collected for the benefit of individual patients, epidemiological research generally does not require the identities of those patients. Society as a whole benefits from epidemiological research, but the potential costs of using putatively anonymized personal health information are borne by the individuals whose identities might be compromised inadvertently. It is for this reason that the nature and the scope of privacy assurance regarding personal health information are so important from a policy perspective.
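
To make the aggregation point concrete, here is a toy sketch under assumptions of our own (the record schema, the data, and the small-cell suppression threshold are all illustrative): the epidemiological question is answered with counts alone, identities are dropped before analysis, and cells small enough to point back to a single patient are withheld. Real de-identification regimes involve far more than this.

```python
# Toy sketch: epidemiological analysis over de-identified records.
from collections import Counter

# Raw clinical records: (patient_id, 3-digit ZIP prefix, diagnosis).
records = [
    ("p001", "021", "influenza"), ("p002", "021", "influenza"),
    ("p003", "021", "influenza"), ("p004", "100", "influenza"),
    ("p005", "100", "asthma"),
]

# The research question needs counts by area, never identities,
# so identifiers are dropped before analysis.
flu_by_area = Counter(zip3 for _, zip3, dx in records if dx == "influenza")

# Suppress small cells: a count of 1 in a small area can point back
# to a single individual -- exactly the inadvertent re-identification
# risk the text describes.
THRESHOLD = 2
published = {area: n for area, n in flu_by_area.items() if n >= THRESHOLD}
print(published)  # {'021': 3} -- the single case in '100' is withheld
```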

The fact that tradeoffs are made is not new in public policy. But one implication of the information age—in which information is collected and disseminated with increasing ease and individuals are more and more interconnected—is that routine administrative and bureaucratic processes (one might even call these processes autonomic) for making many such decisions are no longer sufficient, and that engaged, reflective decision making, perhaps at higher levels than before, will become increasingly necessary.

The reason is that in the absence of countervailing forces, privacy will tend to be eroded when considered in relation to some more tangible and immediate objective that is also desirable. Indeed, privacy is something that is noticed mostly when it is gone or missing or compromised. People are much more likely to notice that their privacy is being violated than they are to notice that their privacy is being respected. Thus, it is easy for autonomic decision making to sacrifice a good—privacy—that is noticeable mostly in its absence in favor of the more tangible and visible benefit promised by those seeking another legitimate end. We must become aware of this tendency and be sure that decision makers give adequate weight to values having different characteristics.


Finding 12. In the public debate about balancing privacy and other societal interests, there is often a lack of clarity about the privacy interests involved and too often a tendency to downplay or dismiss the privacy issues at stake.


Because policy makers recognize the political risks involved in appearing to compromise citizen privacy, they often offer assurances that legitimate citizen privacy will in fact be protected. That is, they assert that they have guarded against sacrificing privacy needlessly or inappropriately—whether or not they have actually done so.

The committee believes that for public policy purposes, a vital element of making tradeoffs is the enhancement of transparency in the process. This strategy consists of two components. The first is the provision of a clear statement of what meaning of “privacy” is being used and why. The goal is to ensure that everyone in the discussion means the same thing by “privacy” and/or by “X” (whatever privacy is being traded against). (By clarifying the meaning of privacy rather than obfuscating it, such debate could also educate a public that often does not appreciate the issues attendant to privacy.) From an analytical perspective, Sections 10.1 and 2.4 (see Box 2.2) describe one process—the use of anchoring vignettes—for coming to terms about meaning, and there are of course other ways of doing so as well.

Note that debates seeming to be about privacy can in fact involve very different matters. For example, the debate about access to DNA information by insurance companies is actually a debate about access to health care, employment, or mobility, and about whether or not insurance companies should be allowed to deny coverage on the basis of one’s genetic profile. Policy makers should take care that debates seeming to center on privacy do not, in fact, use privacy as a screen for other fundamental disagreements. This does not mean that privacy is irrelevant to such a discussion—for example, individuals asserting a right to privacy may be using the only weapon they have to protect their self-interest in such a situation. But privacy may not be the only or even the primary issue at stake.

Indeed, the point about identifying the source of disagreement is worth expanding. Although privacy considerations are generally important in policy discussions (both public and private), there are a number of logically prior antecedents. The most important of these antecedents is the desirability of any particular policy goal in the first place. Some particular contemplated action may have many and deep privacy implications, but if the goal to be served by that action is an inappropriate one—however that may be decided—it may not be necessary to address the privacy implications at all. If health insurance were a right to be enjoyed by all citizens, the debate over access to DNA information by insurance companies would have a much different character. A program that collects the personal information of Elbonian-Americans has definite privacy implications. But if the goal of the program is to enable the identification of Elbonian-Americans for possible deportation, it may make sense for the nation to assess whether or not the deportation of Elbonian-Americans is a good or a bad policy goal.

The second component of a transparency strategy is discussed below.


Finding 13. When privacy is at issue, bland assurances offered by policy makers that privacy will not be harmed can do more to raise skepticism than honest assessments of tradeoffs would.


Transparency also requires that tension be recognized when it is present. Recognizing that there is often tension in a “privacy versus X” relationship (e.g., personal privacy versus video surveillance for transportation safety), it is important to make clear how the various stakeholders view the situation and the factors that decision makers consider. Public debate and discourse are undermined when policy makers simply deny the existence of tradeoffs between privacy and other values and assert without evidence that it is possible to “have it all.” Policy makers of good conscience and good will can legitimately come to different conclusions about the right balance in any given situation, but it is unreasonable to assert without evidence that there will be no diminution or narrowing of privacy if and when these other values are given a new and higher priority.5

5. Perhaps a prior issue is whether or not some proposed action should be taken at all, irrespective of privacy considerations. For example, if a proposed action is demonstrably not cost-effective in achieving some goal, privacy considerations may not be relevant at all, since decision makers in both the private and the public sectors should not be taking cost-ineffective actions in the first place.

It is true that in making a tradeoff, it is sometimes possible to develop privacy-respecting solutions that reduce the conflict between privacy and other values. For example, policy makers may decide to make greater use of a system that collects certain kinds of personal information on individuals in order to enhance energy efficiency in a building. The potential infringement on privacy may well come from the long-term retention of such information, so restructuring the system to erase that information once it is no longer needed (e.g., after an hour, when it is no longer required to manage the building’s heating and air conditioning) might mitigate the privacy concerns substantially without damaging the goal of conserving energy. Similarly, drivers on toll roads need to be charged, but the times at which they enter or leave a toll road are irrelevant to whether or not they have paid. A toll system that does not record entry and exit times therefore cannot be used to calculate a driver’s speed between those two points, and so cannot serve as a basis for issuing speeding tickets.6 In general, explicit attention to privacy considerations (e.g., collecting only information that is directly relevant, or permitting only the degree of intrusiveness necessary for the stated goal) can reduce the privacy downside of a proposed action.

6. Of course, one might argue that the use of a toll system to catch speeders (but only certain types of speeders) is an appropriate and efficient use of technology designed for other purposes, and that such “dual use” should be encouraged rather than discouraged. From a public policy perspective, this may well be true—but the committee believes that in such cases, both purposes ought to be openly discussed, and if the outcome of the public policy process is that both uses are determined to be desirable, then so be it.
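
A minimal sketch of the toll example’s data-minimization idea, with field names and fares of our own devising rather than any real tolling system’s: the record retained for billing simply never contains the timestamps from which speed could later be computed.

```python
# Hypothetical sketch of a minimized toll record: sufficient for
# billing, structurally incapable of supporting speed calculation.
from dataclasses import dataclass

@dataclass(frozen=True)
class TollCharge:
    account_id: str     # enough to bill the right driver
    entry_plaza: str    # enough to compute the fare...
    exit_plaza: str
    fare_cents: int
    # ...but no entry/exit times are stored, so average speed between
    # the plazas cannot later be reconstructed and repurposed for
    # issuing speeding tickets.

def monthly_bill(charges: list[TollCharge]) -> int:
    """Total a driver's charges; billing needs nothing more."""
    return sum(c.fare_cents for c in charges)

charges = [TollCharge("A-1001", "plaza-03", "plaza-17", 475),
           TollCharge("A-1001", "plaza-17", "plaza-03", 475)]
print(f"Amount due: ${monthly_bill(charges) / 100:.2f}")  # $9.50
```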

If a solution is available or developed that does mitigate privacy concerns, it should be discussed explicitly, so that there is clear evidence for what would otherwise be an empty assertion made simply for public relations purposes. And even in the cases in which no mitigating solution is available, an explicit discussion of costs to privacy and benefits regarding the other values would be much more credible than stock assertions.

If the public is to give up some measure of privacy, there should be a reasonable expectation of public information about the benefits in other dimensions that will result from that loss. Because a loss is certain (by definition), it is not sufficient to offer speculative benefits as justification. The benefits themselves may be uncertain because they are probabilistic, but at the very least the analytical basis that led to the decision must be publicly articulated.

Put differently, it is necessary to make explicit the evidentiary or other basis for concluding that the action in question will serve a stated policy goal. While there is a reasonable debate to undertake about an action that compromises privacy to some extent and that demonstrably advances a stated policy goal, it makes little sense to take an action that does the former but not the latter. If some action does not demonstrably advance a stated policy goal, it may not be necessary to consider the privacy implications of that action at all, as the action may not make sense for reasons entirely unrelated to privacy. Yet in the rush to “do something” in response to a shocking event, privacy-compromising actions are often taken that have little real relationship to advancing a stated goal.


Finding 14. Privacy-invasive solutions to public policy problems may be warranted, but when they are implemented as measures of first rather than last resort, they generate resistance that might otherwise be avoided if other alternatives were tried first.


Privacy-respecting solutions that reduce the cost of making a tradeoff are often difficult to find. But one type of solution is worth special notice—the approach in which privacy-reducing actions are employed as a last rather than a first resort. Before demanding that citizens provide more personal information, policy makers would do well to make more effective use of the personal information to which they already have access. For example, if the bottleneck in processing intercepted phone calls is a shortage of linguists who understand the language in which those calls are made, it may not make much sense to intercept even more calls until more linguists are available.

10.5
APPROACHES TO PRIVACY IN THE INFORMATION AGE

As noted above, the pressures on privacy are many and the inherent protections for privacy few. It is thus worth considering explicit approaches that can be used to support privacy.

10.5.1
Principles

The committee identified a number of principles that it believes should guide efforts to achieve an appropriate balance between privacy and other issues. These include the following:

  • Avoid demonization. Most threats to privacy do not come from fundamentally bad people with bad intentions. Rather, they are consequences of trying to address real needs (such as national security, law enforcement, open government, business efficiency, fraud prevention, and so on) either without giving adequate thought to the privacy consequences or by assigning those other needs a higher priority than privacy. Although demonization of an opponent is a staple of today’s political rhetoric, it tends to make compromise and thoughtful deliberation difficult.

  • Account for context and nuance. As noted in Section 10.1 and elsewhere in this report, privacy is a complicated subject with many nuances. Whose privacy is at issue? Against what parties? What information is in question? What are the circumstances? What precedents may be created? In the policy-making process (whether for public policy or organizational policy), taking these nuances into account will often be necessary if common ground is to be found. Without context and nuance, the debate quickly polarizes into “pro-privacy” and “anti-privacy”/“pro-X” camps.7 (For X, substitute any issue of public importance—security, law enforcement, the economy, for example.)

  • Respect complexity. Privacy is a moving target, as the numerous social and technical factors with which it is intertwined change over time. Many choices made today to settle some privacy issue will almost certainly lead to surprising results and require further adjustment, either by modifying the original choices or by adding further mechanisms to compensate for newly discovered problems. Thus, solutions to privacy problems are more likely to succeed if they begin with modest and simple steps guided by well-formulated principles; the operational real-world experience and understanding those steps produce can then be used to shape further actions, always with an eye to the dynamic nature of the topic.

  • Be aware of long-term costs and risks. As noted in Section 3.10, privacy protections are—in practice—based on a mix of culture, technology, and policy. But all systems are deployed in a cost-sensitive environment, and it is important to consider how economic cost might have an impact on this mix. On the one hand, retrofitting privacy protections to information systems or business practices is often more expensive than designing these systems or practices from the start to be privacy-protective or privacy-respecting. Indeed, substantial empirical experience indicates that this is the case. On the other hand, it is clearly simpler and less expensive to design systems or business practices without attention to privacy at all. Because the policy process often seems easier to manipulate than technological development (particularly after the technology is in place), the temptation is great to rely heavily on policy to protect privacy. The committee believes that in the long run, policy-based privacy protections introduced without adequate technological enforcement are likely to result in unintended violations of privacy. (It is also axiomatic that absent an adequate policy framework for protecting privacy, the best technology is unlikely to succeed.) This scenario is likely to result in costly retrofits and may also result in unfavorable publicity and perhaps significant economic liability. Thus, organizations that handle personal information are well advised to invest up front in adequate technological privacy protection.

7

An analogy from the world of computer security may be helpful. Operating systems often have facilities for protecting the privacy of files. If the privacy question is formulated simply as, Should other people be able to have access to a given user’s files?, most people would say no. But if the question is decomposed into finer questions such as, Who can know about the existence of this particular file?, Who has permission to read its contents?, Who can change its contents?, and, Who can change these permissions?, it becomes possible to have a more useful discussion about privacy requirements and the necessary system capabilities to support those requirements.
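To make the footnote's decomposition concrete, here is a minimal sketch in Python of a per-file permission record that keeps a separate answer for each of the four questions (existence, read, write, and permission change) rather than a single all-or-nothing access bit. The names and structure are illustrative assumptions, not a description of any actual operating system's mechanism.

from dataclasses import dataclass, field

@dataclass
class FilePermissions:
    """Per-file privacy controls, one set per question in the decomposition
    above (all names are illustrative, not an actual operating system API)."""
    owner: str
    can_know_exists: set = field(default_factory=set)  # who may see that the file exists
    can_read: set = field(default_factory=set)         # who may read its contents
    can_write: set = field(default_factory=set)        # who may change its contents
    can_admin: set = field(default_factory=set)        # who may change these permissions

    def check(self, user: str, action: str) -> bool:
        # The owner implicitly holds every right; others need an explicit grant.
        if user == self.owner:
            return True
        grants = {"list": self.can_know_exists, "read": self.can_read,
                  "write": self.can_write, "admin": self.can_admin}
        return user in grants.get(action, set())

# Example: Bob may learn that the file exists but may not read it.
p = FilePermissions(owner="alice", can_know_exists={"bob"})
assert p.check("bob", "list") and not p.check("bob", "read")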

10.5.2
Individual Actions

Finding 15. Individuals can take steps to enhance the privacy of their personal information. They can also become better informed about the extent to which their privacy has been compromised, although the effectiveness of these measures is bound to be limited.


Individuals have some ability to take steps to enhance the privacy of their personal information, and to be better informed about the extent to which their privacy may be compromised (Section 3.8.1).8 Most of these steps involve tradeoffs among convenience, access, and cost. Individuals can tailor their privacy protection practices to their specific situation. As in the physical world, people whose privacy has been compromised with harmful, costly, or inconvenient results will almost certainly increase the degree of inconvenience and cost they are willing to accept for greater protection in the future.

To reduce the amount of personal information that may be compromised, individuals can:

  • Improve the security of their local computing environments using tools such as firewalls and encryption (a minimal encryption sketch follows this list);

  • Make use of re-mailers, proxies, and other anonymization techniques if anonymity is desired;

8

A fuller discussion of measures that individuals may take to thwart surveillance can be found in Gary Marx, “A Tack in the Shoe: Neutralizing and Resisting the New Surveillance,” Journal of Social Issues 59(2):369, 2003.

  • Use secure and encrypted e-mail;

  • Take anti-phishing measures to reduce the likelihood of identity theft;

  • Install software from reliable sources to block third-party cookies, Web bugs, and other devices that enable the tracking of activity across various Web sites; and

  • Install software that reliably deletes all relevant information from one’s computer when one deletes a file.
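As an illustration of the encryption step mentioned in the first item above, the following minimal Python sketch uses the widely available third-party cryptography package to encrypt a local file so that its contents are unreadable without the key. The file name and contents are stand-ins, and key management, which is the hard part in practice, is deliberately omitted.

# pip install cryptography
from cryptography.fernet import Fernet

# Create stand-in sensitive data so the example is self-contained.
with open("tax_records.txt", "wb") as fh:
    fh.write(b"example sensitive contents")

key = Fernet.generate_key()  # store this somewhere safer than the data itself
f = Fernet(key)

# Encrypt the file's contents and write them out as ciphertext.
with open("tax_records.txt", "rb") as fh:
    ciphertext = f.encrypt(fh.read())
with open("tax_records.txt.enc", "wb") as fh:
    fh.write(ciphertext)

# Later, only the holder of the key can recover the plaintext.
assert f.decrypt(ciphertext) == b"example sensitive contents"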

To reduce the amount of unwanted information that they receive, individuals can:

  • Block spam e-mail;

  • Employ pop-up blockers and ad blockers;

  • Use special e-mail addresses for important correspondents and faked or infrequently checked e-mail addresses for others;

  • Take advantage of all opt-out opportunities, such as do-not-call lists (for both home and mobile numbers) and options for not receiving postal or electronic mail;

  • Put credit freezes or fraud alerts on their credit reports; and

  • Avoid using toll-free numbers and block caller-ID when making calls.

To monitor one’s online privacy, individuals can:

  • Search the Internet periodically, from an anonymized account or a computer that cannot be traced to the individual, for sensitive personal information such as one’s Social Security number or an unlisted phone number. (So-called vanity searches, in which one searches the Internet for references to one’s name, can also be revealing.) One may (or may not) be able to do anything about the online existence of such information, but at least one would know that it was available in such a fashion;

  • Periodically monitor their credit ratings; and

  • Use personal e-mail addresses that are specifically created to monitor the implementation of policies of a Web site operator. For example, a site such as merchant.com might post a privacy policy that said, “Your e-mail address will never be given to anyone else.” Given the volume of spam e-mail that one receives, it would ordinarily be difficult to trace the unauthorized release of one’s e-mail address to a specific merchant. However, if one used an e-mail address tailored for the site in question, receipt of a marketing e-mail from anyone else to that address would be convincing proof that the site did not adhere to its posted policy. (A minimal sketch of generating such tailored addresses follows this list.)
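A minimal sketch of the tailored-address technique follows, assuming the individual's mail provider supports "plus addressing" (many providers do, though not all) and adding a short keyed hash so that tagged addresses cannot be trivially guessed. All names here are hypothetical.

import hashlib
import hmac

SECRET = b"a-private-key-known-only-to-you"  # illustrative placeholder

def alias_for(site: str, mailbox: str = "jdoe", domain: str = "example.com") -> str:
    """Return a per-site address such as jdoe+merchantcom-3f2a@example.com."""
    tag = hmac.new(SECRET, site.encode(), hashlib.sha256).hexdigest()[:4]
    site_slug = "".join(ch for ch in site.lower() if ch.isalnum())
    return f"{mailbox}+{site_slug}-{tag}@{domain}"

# Give this address only to merchant.com; marketing e-mail sent to it by
# anyone else indicates the address was shared despite the stated policy.
print(alias_for("merchant.com"))  # e.g., jdoe+merchantcom-ab12@example.com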


Additional steps for individuals can be found on the Web sites of the Electronic Frontier Foundation,9 the Center for Democracy and Technology,10 and the Electronic Privacy Information Center.11

In general, the actions described above are technically oriented—that is, they protect the individual only against technologically based intrusions on privacy (though they would do a better job if they were less cumbersome in actual use), and whether they are worth the trouble depends on the individual’s cost-benefit calculus about the value of privacy. But they cannot defend against intrusions that occur as a matter of policy or routine bureaucratic practice (e.g., routine sharing of information that is allowable under the law or policy). And they do not, in general, enable individuals to know whether these actions are actually effective in protecting their privacy.

In such instances, the only privacy-enhancing measure that one can take as an individual is to provide false or incomplete information when personal information is requested. Of course, providing false information has other consequences. In some cases, it is illegal to provide false information (as on a tax return). In other cases, providing the false information may result in being denied certain benefits that providing true information would enable. In addition, providing false information may not be an entirely reliable technique for protecting one’s identity, because some data-correcting techniques—intended to catch errors made when the data are recorded—may also be able to correct false information under some circumstances. More to the point, an individual is unlikely to know if his or her attempt to provide false information is in fact succeeding in protecting his or her identity.

It is important to note that in identifying actions that individuals can take to enhance their privacy, the committee is not “blaming the victim” or arguing that individuals who fail to take such actions are solely or even primarily responsible for invasions of their privacy. The fact that individuals can take steps to protect their personal information does not imply that other societal actors, such as government and private organizations, have no responsibility for privacy. Indeed, private and personal actions are not equally available to all members of society, especially in contexts of inequality where the resources for self-protection are not equally distributed, and so personal actions may need to be supported by the kinds of organizational and public policy actions considered below.


Recommendation 1. If policy choices require that individuals shoulder the burden of protecting their own privacy, law and regulation should support the individual in doing so.


If a policy choice is made that places the onus on individuals to fend for themselves, the individual’s ability to do so should be facilitated by law and regulation. That is, all reasonable efforts must be made to inform the individual about the options available and the consequences of selecting any of those options. (Reasonableness necessarily takes into account an assessment of the relative costs and benefits of providing such information.) Such a precedent exists in the legislative mandate for credit-reporting agencies to provide free credit reports to consumers periodically and for consumers to be able to demand corrections of erroneous entries in their credit reports. In the future (and simply for illustrative purposes), law and regulation could mandate that published privacy policies be easily readable (e.g., at a 7th-grade reading level; such a requirement could be checked automatically, as sketched below), that statements related to repurposing illustrate possible secondary purposes, and that information about technical options for individuals to protect their privacy be well publicized.
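For illustration only, a readability mandate of this kind could be checked mechanically. The sketch below computes the widely used Flesch-Kincaid grade level with a crude syllable heuristic; a real compliance test would require a more careful text-processing pipeline, and the sample policy text and the 7.0 threshold are assumptions.

import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of vowels; adequate only for a sketch.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    n_syll = sum(count_syllables(w) for w in words)
    return 0.39 * (n_words / sentences) + 11.8 * (n_syll / n_words) - 15.59

policy = ("We may share your information with affiliates "
          "to the extent permitted by applicable law.")
grade = fk_grade(policy)
print(f"grade level {grade:.1f}:", "PASS" if grade <= 7.0 else "FAIL")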

10.5.3
Organization-based Actions

Finding 16. Self-regulation is limited as a method for ensuring privacy, although it nevertheless offers protections that would not otherwise be available to the public.


Organizations that use technology to manage large volumes of personal information (both in the private sector and in government) can establish privacy policies (e.g., on Web sites) that specify self-imposed restrictions on their use of information that they collect—or could collect—about those with whom they interact. The desire to maintain public trust and good will is a powerful motivator for many organizations to protect privacy on a voluntary basis.

To strengthen the force of privacy assurances, as well as to make it easier for organizations to establish appropriate privacy protections, organizations that are committed to particular standards and to mutual policing have banded together in associations such as TRUSTe,12 BBBOnline,13 and the Direct Marketing Association14 in an attempt to improve their members’ public images by forming larger “regions of trust.” In general, members of these organizations agree to adhere to established privacy principles and agree to comply with a variety of oversight and consumer resolution procedures.

Some argue that self-regulating associations have at least the appearance, and perhaps embody the fact, of “the fox guarding the henhouse,” and even advocates of self-regulation recognize that such associations cannot provide a complete solution. For example, the existence of a privacy policy per se does not indicate whether that policy actually protects privacy. Membership is voluntary, and so organizations whose regular practices differ greatly from the voluntary guidelines may simply not apply for membership or may have been expelled for violations. Moreover, resources available for policing depend in part on dues paid by the members, and to encourage a large membership there is a temptation to keep dues low and policing light. Nevertheless, a declared policy is often an organization’s first step toward a meaningful privacy protection regime.


Recommendation 2. Organizations with self-regulatory privacy policies should take both technical and administrative measures to ensure their enforcement.


An important next step is the enforcement of a declared privacy protection policy. This is a non-trivial task, and even the most stringent privacy policies cannot provide protection if they are subverted by those with access to the personal information, either legitimate access (the insider threat) or illegitimate access (the hacker threat). Thus, data custodians almost always use some form of technical protection to limit access to, and use of, personal information (e.g., passwords and other access control devices to prevent unauthorized people from accessing protected data).

Sometimes this is as simple as using file or database system access controls, or encrypting the data and limiting access to the decryption keys, or even using physical measures such as guards and locks for a facility that houses personal information. But more sophisticated measures may be needed. Section 3.8 describes several technologies relevant to maintaining the privacy of personal information in an organizational setting: auditing queries to databases containing personal information, designing systems whose data requirements and data retention features are narrowly tailored to actual needs, restricting access to information from which individual identities can be inferred, and implementing machine-readable privacy policies as a way of better informing users about the nature of those policies.
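To make one of these measures concrete, the following is a minimal sketch of query auditing: a wrapper around a database connection that records who asked for which sensitive columns and when. The schema, column names, log format, and crude substring matching are illustrative assumptions rather than a prescribed design.

import sqlite3, time

SENSITIVE_COLUMNS = {"ssn", "dob", "home_address"}  # illustrative field names

class AuditedDB:
    """Wraps a SQLite connection and logs any query that touches sensitive columns."""

    def __init__(self, path: str, audit_path: str = "audit.log"):
        self.conn = sqlite3.connect(path)
        self.audit_path = audit_path

    def query(self, user: str, sql: str, params=()):
        # Crude substring match; a real auditor would parse the SQL properly.
        touched = sorted(c for c in SENSITIVE_COLUMNS if c in sql.lower())
        if touched:
            with open(self.audit_path, "a") as log:
                log.write(f"{time.time():.0f}\t{user}\t{','.join(touched)}\t{sql}\n")
        return self.conn.execute(sql, params).fetchall()

db = AuditedDB(":memory:")
db.query("setup", "CREATE TABLE people (name TEXT, ssn TEXT)")
db.query("clerk7", "SELECT name, ssn FROM people WHERE name = ?", ("Smith",))
# audit.log now records that clerk7 retrieved the ssn column.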


Administrative measures are also necessary to support enforcement. For example, administrative actions are needed to promulgate codes of behavior and procedures that govern access to stored personal information. Penalties for violating such codes or procedures are also needed, as technological enforcement measures sometimes fail or do not cover certain eventualities.


Recommendation 3. Organizations should routinely test whether their stated privacy policies are being fully implemented.


Because automated privacy audits are rarely comprehensive (except at great expense), red-teaming of an organization’s privacy policy and its implementation is often in order. In the security domain, red-teaming refers to the practice of testing an organization’s operational security posture through the use of an independent adversary team whose job is to penetrate the organization’s defenses. Red-teaming in a privacy context refers to efforts undertaken to compare an organization’s stated privacy policy to its actual practices. In general, red-teaming for privacy will require considerable “insider” access—the ability to trace data flows containing personal information. As in the case of security red-teaming, the results of a privacy red-teaming exercise need to be reported to senior management, and a high-level executive should be in place with responsibility for ensuring, and advocating for, privacy as both an individual and a collective good.
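Part of that comparison can be mechanized. Below is a minimal sketch that assumes the stated policy has been expressed in machine-readable form (a map from purpose to permitted data fields) and that the red team has traced data flows into simple records; both representations are hypothetical, and real data flows are far messier.

# Stated policy: which personal-data fields each purpose may use (assumed format).
STATED_POLICY = {
    "billing":   {"name", "address", "card_number"},
    "marketing": {"name", "email"},
}

# Flows observed by the red team while tracing data through the organization.
observed_flows = [
    {"purpose": "marketing", "fields": {"name", "email", "purchase_history"}},
    {"purpose": "billing",   "fields": {"name", "card_number"}},
]

def find_violations(policy, flows):
    """Report fields used by a flow that the stated policy does not permit."""
    violations = []
    for flow in flows:
        allowed = policy.get(flow["purpose"], set())
        extra = flow["fields"] - allowed
        if extra:
            violations.append((flow["purpose"], sorted(extra)))
    return violations

for purpose, fields in find_violations(STATED_POLICY, observed_flows):
    print(f"purpose '{purpose}' used unpermitted fields: {fields}")
# -> purpose 'marketing' used unpermitted fields: ['purchase_history']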


Recommendation 4. Organizations should produce privacy impact assessments when they are appropriate.


It is often the case that information practices—adopted entirely for non-privacy-related reasons—have unforeseen or surprising impacts on privacy that may not even have been considered in the adoption of those practices. Inadvertent effects on privacy could be reduced if privacy were systematically considered before adopting new information practices or changing existing practices. Privacy impact assessments—analogous to environmental impact assessments—can be established as a regular part of project planning for electronic information systems. Explicit attention to privacy issues can be valuable even if these assessments remain internal to the organization. However, public review can encourage consideration from other perspectives and perhaps reduce unintended consequences that could generate additional rounds of feedback, costly retrofitting, and/or unnecessary erosion of privacy.

Federal agencies are already required to produce privacy impact assessments (PIAs) under the E-Government Act of 2002. Illustrative PIAs produced by two agencies can be found at the Department of Homeland Security and National Science Foundation Web sites.15 But the advantages of producing PIAs are not limited to government agencies, and the committee believes that they may have considerable utility in the context of private organizations as well.


Recommendation 5. Organizations should strengthen their privacy policies by establishing a mechanism for recourse if an individual or a group believes that they have been treated in a manner inconsistent with an organization’s stated policy.


Finally, the limits on self-regulation must be acknowledged. As noted in Section 9.2.4, organizations are sometimes willing to violate their stated policies without advance notice under some circumstances, especially when those circumstances are both particularly exigent and also unanticipated. For these reasons, it is important to consider mechanisms other than self-regulation to protect privacy. Public policy is one source of such mechanisms. But an organization that establishes a mechanism for recourse should its policy be violated does much to enhance the credibility of its stated policy.


Recommendation 6. Organizations that deal with personal information should establish an institutional advocate for privacy.


Organizations that deal with personal information would benefit from some kind of institutional advocacy for privacy, as many healthcare-providing organizations have done in response to the Health Insurance Portability and Accountability Act of 1996 (Section 7.3.4). By analogy to an organizational ombudsman who provides high-level oversight of everyday activities conducted in the name of the organization that might not be entirely consistent with the organization’s stated policies or goals, an organizational privacy advocate could have several roles. For example, it might serve as an internal check for the organization, ensuring that the organization has and makes public some stated privacy policy. It might also help to ensure that the privacy policy is actually followed by the organization. Internally, it might serve a red-team role, pushing on the privacy mechanisms instituted by the organization and testing them for their adequacy. Finally, it could be responsible for the generation and periodic review of privacy impact statements, which would be reviews of the privacy implications of new programs or policies being instituted by an organization. It could also help anticipate emerging privacy issues.

15

The NSF Web site includes a PIA for its Personnel Security System and Photo Identification Card System (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=pia0503); the DHS Web site includes a PIA for the US-VISIT program (for the automatic identification of non-immigrants exiting the United States at certain land points of entry; see http://www.dhs.gov/interweb/assetlibrary/privacy_pia_usvisitupd1.pdf).

Some precedents for institutional advocates do exist, although their function and purpose vary. Under the Bankruptcy Abuse Prevention and Consumer Protection Act of 2005, a consumer privacy ombudsman must be appointed by the bankruptcy court before certain kinds of consumer information are sold or leased. Verified Identity Pass, Inc., a private firm that offers a voluntary, biometric “fast pass” system to support the Transportation Security Administration’s registered traveler program for expedited security screening at airports, has an independent, outside privacy ombudsman whose responsibility is “to investigate all privacy complaints, gather the facts, and respond to members, as well as to post responses publicly and prominently on [the firm’s] website.”16 Bell Canada has designated a privacy ombudsman to oversee compliance with the Bell Code of Privacy.17

A number of companies have created the position of chief privacy officer. In those companies where this title does not primarily designate a public relations position that puts the best face on company privacy practices or a legal position that merely ensures compliance with existing privacy laws, the chief privacy officer can serve as an effective organizational advocate who ensures high-level management attention to privacy issues, serves as a liaison to other privacy expert stakeholders, and anticipates future needs. This role is symbolic as well as instrumental.

10.5.4
Public Policy Actions

Finding 17. Governmental bodies have important roles to play in protecting the privacy of individuals and groups and in supporting and ensuring informed decision making about privacy issues.


Historically, privacy concerns in the United States have most often been tied to government infringement of privacy at various levels. Many have also noted that government is at least willing, under many circumstances, to trade off privacy and other rights in pursuit of some other goal or objective. This has meant that in some cases government agencies have undertaken actions and activities that violated citizen privacy and only subsequently acknowledged the impropriety of such actions (e.g., the forced relocation of Japanese-Americans during World War II and the use of census data to identify such individuals (Section 9.3); the domestic surveillance in the 1960s of participants in the civil rights movement). Against this historical backdrop, a certain skepticism about the role of government as a guarantor of privacy is not surprising and may be helpful.

Nevertheless, the committee believes that various governmental bodies have important roles to play in protecting the privacy of individuals and groups, and in ensuring that decisions concerning privacy are made in a more transparent and well-informed fashion. As citizens become more concerned with privacy issues, it will become increasingly important for governmental agencies at all levels to address these concerns. Perhaps more importantly, actions and decisions of governmental entities on a variety of issues are likely to have significant privacy impacts. Whether it is something as obvious as decisions or policies concerning national security or as seemingly minor as making public-record information available on the World Wide Web, a great many actions taken by governments have privacy implications. Consequently, it is appropriate that privacy be at least a consideration if not a priority at all levels of government decision making.

10.5.4.1
Managing the Privacy Patchwork

Finding 18. The U.S. legal and regulatory framework surrounding privacy is a patchwork that lacks consistent principles or unifying themes. A less decentralized and more integrated approach to privacy policy in the United States could bring a greater degree of coherence to the subject of privacy.


The U.S. legal and regulatory framework surrounding privacy is a patchwork that lacks commitment to or guidance from a set of consistent principles or unifying themes. Because of the ad hoc way in which privacy has been approached by most policy makers, the current sets of privacy-related laws, rules, and regulations—at all levels of government—are confusing at best and inconsistent at worst.

Given the decentralized manner in which the United States has dealt with privacy issues (Section 4.4), this state of affairs is hardly surprising—and yet it has major costs. This patchwork is more than just a source of frustration and confusion—it is also inefficient and expensive. The current regulatory patchwork, in which laws governing privacy differ across the jurisdictions in which firms engage in business transactions, increases the economic costs of attending to privacy that these firms must bear.

The committee believes that a less decentralized approach to privacy policy in the United States could bring substantial benefits for the understanding and protection of privacy. Only with such an approach can different priorities and tensions be reconciled. At the same time, the committee cautions that less decentralization can also lead to a lowest-common-denominator approach to privacy, which might well weaken privacy protections enjoyed by some states. Further, the committee notes that in an increasingly global marketplace, some degree of harmonization of U.S. privacy law with the privacy laws and regulations of other nations is likely to be necessary when business-related data flows between the United States and these other nations involve personal information.

10.5.4.2
Reviewing Existing Privacy Law and Regulations

Recommendation 7. The U.S. government should undertake a broad, systematic review of national privacy laws and regulations.


As a first step in the direction of a less decentralized approach to privacy policy, the U.S. government should undertake a systematic review of the laws and regulations that affect the level and quality of privacy that Americans enjoy. This review should address:

  • Areas of overlap and conflict in current national privacy law and regulation—special attention should be paid to the relationship of national law to state and local law, examined extensively enough to generate a representative picture of those relationships;

  • Assessment of the nature and extent of gaps between stated policies and implementation, and the causes of such gaps;

  • Areas of privacy concern that the current legal and regulatory framework leaves unaddressed, such as the gathering, aggregation, and use of personal information by companies and other organizations that are currently not covered to any significant degree by any form of privacy regulation;

  • A clear articulation of the value tradeoffs that are embedded in the current framework of laws and regulation, especially where those tradeoffs were not made explicit at the time of adoption;

  • The economic and social impact, both positive and negative, of current privacy law and regulation;

  • The extent to which the personal information of Americans held by various parties is covered by the principles of fair information practices;

  • The interplay between state and federal privacy laws, taking into consideration matters such as the scope and nature of state laws as compared to federal laws; and

  • The interplay between domestic and foreign privacy laws, taking into consideration matters such as the scope and nature of flows of personal information to and from the United States and instances in which differences between foreign laws and domestic law might call for harmonization.

If undertaken in an authoritative manner, such a review would simplify the task of knowing how to comply with privacy regulations and might also make such compliance less expensive. Further, it would help individuals to understand their privacy rights, and it could facilitate such enforcement of those rights as is necessary for their enjoyment. By making the protection of privacy more efficient, more transparent, and more consistent, such a review should benefit all members of the community. And by anticipating future developments, it might help avoid or minimize future problems.

As to which part of the U.S. government should undertake this review, the privacy commissioner’s office (the subject of Recommendation 14) is an obvious locus of such activity. But the recommendation for undertaking a review is independent of Recommendation 14, and what part of the U.S. government undertakes this review is less important than that it be undertaken.

10.5.4.3
Respecting the Spirit of the Law

Recommendation 8. Government policy makers should respect the spirit of privacy-related law.


The United States is a nation governed by law. It is thus axiomatic that the rule of law must be the supreme authority of the land. Common discourse recognizes the distinction between the spirit and the letter of the law, and the committee believes that both the spirit and the letter of the law play important roles in the protection of privacy. Conformance to the latter is what is needed “to not break the law.” The spirit of the law is necessarily more imprecise than the letter of the law, but if fully respected, the spirit of the law has operational implications as well. For example, a number of laws provide for the confidentiality of data collected by certain federal agencies (e.g., the Census Bureau, the Internal Revenue Service, and so on). To the extent that government policy makers wish to merge protected data with commercial and other non-protected data in order to identify individuals, the committee believes that such actions are not consistent with the spirit of data confidentiality guarantees. Respecting the spirit of the law would result in decision-making processes that give legal limitations and constraints a wide berth rather than “pushing the envelope” and “looking for loopholes.” This approach supports policy makers who engage in open and public debate and discussion when circumstances change rather than use such circumstantial changes to advance long-standing agendas that were previously blocked by public opposition.

Note, too, that these comments apply irrespective of any particular policy outcome or preference. They are a call for deliberation and moderation rather than hasty overreaction—whether the issue is revelation of a government abuse (that might lead to excessive curtailment of law enforcement or national security authorities) or a terrorist incident (that might lead to excessive intrusions on privacy). And they also imply a need to build into policy some mechanisms, such as “sunset requirements,” that facilitate the periodic revisiting of these issues.

10.5.4.4
The Relevance of Fair Information Practices Today

Finding 19. The principles of fair information practice enunciated in 1973 for the protection of personal information are as relevant and important today as they were when they were first formulated.


Principles of fair information practice were first enunciated in 1973 (Section 1.5.4). At the time, they were intended to apply to all automated personal data systems by establishing minimum standards of fair information practice, violation of which would constitute “unfair information practice” subject to criminal penalties and civil remedies. Pending legislative enactment of such a code, the 1973 report that proposed the principles also recommended that they be implemented through federal administrative action.

In 1974, the Privacy Act (Section 4.3.1) was passed, applying these principles to personal information in the custody of federal agencies. In addition, the Fair Credit Reporting Act (first passed in 1970 and amended several times thereafter) applies these principles to the accuracy, fairness, and privacy of personal information assembled by private sector credit-reporting agencies. Many other private sector organizations have also adopted privacy policies that trace their lineage to some or all of the principles of fair information practice.

Since 1973, the environment surrounding the gathering and use of personal information has changed radically. Information technology is increasingly networked. Private sector gathering and use of personal information have expanded greatly since the early 1970s, and many private sector organizations that manage personal information, such as data aggregators (Section 6.5), are not covered by fair information practices, either under the law or under a voluntary privacy policy based on these principles. National security considerations loom large as well, and the risks of compromising certain kinds of personal information are arguably greater in an environment in which terrorism and identity theft go hand in hand (see Box 4.1 in Chapter 4).


For these reasons, the committee believes that the principles of fair information practice are as relevant today—perhaps more so—for the protection of personal information as they were when they were first formulated.


Recommendation 9. Principles of fair information practice should be extended as far as reasonably feasible to apply to private sector organizations that collect and use personal information.


Although some of the restrictions on government regarding the collection and use of personal information are not necessarily applicable to the private sector, the values expressed by the principles of fair information practice should also inform private sector policies regarding privacy.

Reasonableness involves a variety of factors, including an assessment of the relative costs and benefits of applying these principles. This recommendation is thus consistent with the original intent behind the 1973 Department of Health, Education, and Welfare report covering all organizations handling personal information (not just government agencies),18 although the committee is explicitly silent on whether the legislative enactment of a code of fair information practices is the most appropriate way to accomplish this goal.

For the sake of illustration, another approach to encouraging the broad adoption of fair information practices might be based on the “safe harbor” approach described in Section 4.7. That is, a private sector organization that collected or used personal information would self-certify that it is in compliance with safe harbor requirements, which would be based on the principles of fair information practice. Periodic assessments of the extent to which mechanisms for ensuring enforcement of these requirements have been developed and applied would be provided to the public. Adherence to these requirements would be enforced through the federal and state statutes relevant to unfair and deceptive business practices. In return, complying organizations could be granted immunity from civil or criminal action stemming from alleged mishandling of personal information.

Within the domain of fair information practices, the committee calls attention to two particularly important topics: the repurposing of data and the notion of choice and consent.

18

U.S. Department of Health, Education, and Welfare, Records, Computers and the Rights of Citizens, Report of the Secretary’s Advisory Committee on Automated Personal Data Systems, MIT Press, Cambridge, Mass., 1973.


Recommendation 10. To support greater transparency in decision making about repurposing, guidelines should be established for informing individuals that repurposing of their personal information might occur, what the nature of such repurposing would be, and what factors would be taken into account in making any such decision.


While repurposing is not necessarily privacy invasive (e.g., medical information gathered for clinical decision making can be used to conduct epidemiological research in ways that are privacy preserving), there is an unavoidable tension between a principle that one should know how personal information collected from him or her will be used before it is collected and the possibility that information collectors might want to use that information in the future for a purpose that cannot be anticipated today. While this tension cannot necessarily be resolved in any given instance, it should be possible to provide greater transparency into the resolution process. Accordingly, guidelines should be established for informing individuals that repurposing might occur, and also about the nature of such repurposing and what factors would be taken into account in making any such decision. Educating the public about the nature of this tension is also important, and might be undertaken as part of the effort described in Recommendation 14.


Recommendation 11. The principle of choice and consent should be implemented so that individual choices and consent are genuinely informed and so that implementation accounts fairly for demonstrated human tendencies to accept, without change, choices made by default.


Even with mandated disclosure, individuals have choices about whether or not they provide information. But only informed choice—choice made when the deciding individual has an adequate amount of the important information that could reasonably affect the outcome of the choice—is morally and ethically meaningful. Individuals are entitled to be informed about answers to the questions articulated in Section 10.1—and parties acquiring personal information are morally and ethically obligated to provide such information to subjects. Vague notices that obfuscate and presume high educational levels of their readers do not satisfy these obligations, even if they do technically comply with legal requirements. Moreover, as the issues of data collection become more complex, the task of providing usable and comprehensible information increases in difficulty.

The importance of default choices has been empirically demonstrated.


As noted in Section 2.2.5, the endless debate over the desirability of opt-in versus opt-out regimes is a debate over which of these should be the information subject’s default choice. In fact, it is easy to sidestep this forced choice between defaults by requiring the individual to choose explicitly between opting in and opting out. Recall that opting in means that the individual must affirmatively allow the primary data recipient to share his or her information, while opting out means that the individual must affirmatively disallow such sharing. Consent requirements could be formulated so that the individual had to select one of these options explicitly—either “I choose to share information” or “I choose to not share information”—and so that the selection of one of these options would be as essential to processing the form as the individual’s Social Security number would be for a financial institution. Absent a choice, the form would be regarded as null and void and returned to the individual for resubmission. (A minimal sketch of such a form check follows.)
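A minimal sketch of that processing rule, with hypothetical field names: a submitted form is rejected outright unless its consent field carries one of the two explicit values, so that silence or a pre-filled default can never pass as consent.

VALID_CHOICES = {"share", "do_not_share"}  # explicit opt-in / opt-out values

class IncompleteFormError(Exception):
    pass

def process_form(form: dict) -> dict:
    """Accept a submitted form only if consent was explicitly chosen."""
    choice = form.get("consent_choice")
    if choice not in VALID_CHOICES:
        # Null and void: return to the individual for resubmission.
        raise IncompleteFormError("An explicit consent choice is required.")
    return {"accepted": True, "share_data": choice == "share"}

print(process_form({"name": "J. Doe", "consent_choice": "do_not_share"}))
try:
    process_form({"name": "J. Doe"})  # no choice made -> rejected
except IncompleteFormError as e:
    print("rejected:", e)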


Recommendation 12. The U.S. Congress should pay special attention to and provide special oversight regarding the government use of private sector organizations to obtain personal information about individuals.


As noted in Chapter 6, government use of private sector organizations to obtain personal information about individuals is increasing. Fair information practices applied to data aggregation companies would go a long way toward providing meaningful oversight of such use. However, even if data aggregation companies are not covered by fair information practices in the future (either directly or indirectly—that is, through the extended application of fair information practices to government agencies that use such companies), the committee recommends that such use receive special attention and oversight from the U.S. Congress and other appropriate bodies so that privacy issues do not fall between the cracks created by contracts and service agreements.

To illustrate what might be included under attention and oversight, the committee notes two possible oversight mechanisms: periodic hearings (in this case, into government use of these organizations) and reporting requirements for U.S. government agencies that would publicly disclose the extent and nature of such use.

10.5.4.5
Public Advocates for Privacy

Finding 20. Because the benefits of privacy often are less tangible and immediate than the perceived benefits of other interests such as public security and economic efficiency, privacy is at an inherent disadvantage when decision makers weigh privacy against these other interests.


As noted in Section 10.4, privacy offers benefits that are often less tangible, visible, or immediate than those benefits offered by public safety, economic efficiency, and so on. The consequence is that privacy is at an inherent disadvantage in the decision-making competition for priority and resources.

For other issues in which short-term pressures tend to crowd out longer-term perspectives, the mechanism of institutionalized advocacy has found some success. For example, the Environmental Protection Agency was established to provide a bureaucratic counterweight to the forces of unrestricted economic development inside and outside government.

Today, a number of privately funded organizations, such as the Electronic Privacy Information Center and the Electronic Frontier Foundation, act as generalized advocates for privacy in the public policy sphere. Such groups, while an important ingredient in the debate concerning privacy, generally operate at the national level, and resource limitations mean that they concentrate on the most egregious threats to privacy if and when those come to notice. Perhaps most importantly, they do not have institutionally established roles in the public policy process, and they achieve success primarily to the extent that they can mobilize public attention to some privacy issue. In contrast, an institutionally established privacy advocate would have legal standing, greater internal legitimacy, and better access to relevant information from government agencies (and possibly, under some circumstances, private organizations), enabling it to play a complementary but no less important role.


Recommendation 13. Governments at various levels should establish formal mechanisms for the institutional advocacy of privacy within government.


Institutionalized advocacy can take place at a variety of different levels—at the level of individual organizations, local government, federal agencies, and so on. An example of institutionalized advocacy is the Privacy Office of the U.S. Department of Homeland Security (DHS), whose mission is to minimize the impact of departmental activities on the individual’s privacy, particularly the individual’s personal information and dignity, while achieving the mission of the DHS.19 The DHS Privacy Office is the focal point of departmental activities that protect the collection, use, and disclosure of personal and departmental information. In addition, the Privacy Office supports the DHS Data Privacy and Integrity Advisory Committee, which provides advice on programmatic, policy, operational, administrative, and technological issues within DHS that affect individual privacy, as well as data integrity and data interoperability and other privacy-related issues. The Privacy Office also holds public workshops to explore the policy, legal, and technology issues surrounding government’s, private sector’s, and individuals’ information and the intersection of privacy and homeland security.

19

This description is based on the DHS description of its Privacy Office, available at http://www.dhs.gov/dhspublic/interapp/editorial/editorial_0338.xml.

A common complaint about standards issued at a national level—regardless of subject—is that they do not take into account local contexts and perspectives, and a “one-size-fits-all” mentality can easily lead to absurdities that undercut both public support and the spirit of the original standard. But local communities can have their own institutional advocates, and it may make sense to consider the idea of local enforcement of national standards as a way to obtain some of the efficiencies afforded by national standards and the benefits of local awareness of how those standards might sensibly be implemented in practice.


Recommendation 14. A national privacy commissioner or standing privacy commission should be established to provide ongoing and periodic assessments of privacy developments.


As discussed in earlier chapters (especially Chapters 1 and 3), rapid changes in technology or in circumstances can and often do lead to changes in societal definitions of privacy and in societal expectations for privacy. Solutions developed and compromises reached today may be solidly grounded a year from now, but 3 years is enough for a new “killer app” technology to emerge into widespread use (thus changing what is easily possible in the sharing of information), and a decade is enough for today’s minority political party to become the majority in both the legislature and the executive branch. Any of these eventualities coming true is bound to require a new and comprehensive examination of privacy issues. Thus, it is unrealistic to expect that privacy bargains will become settled “once and for all” or that expectations will be static. Dynamic environments require continuous attention to privacy issues and readiness to examine taken-for-granted beliefs that may no longer be appropriate under rapidly changing conditions.

Of significance is the likelihood that the effects of changes in the environment will go unnoticed by the public in the absence of some well-publicized incident that generates alarm. Even for those generally knowledgeable about privacy, the total impact of these developments is difficult to assess because rapid changes occur in so many different sectors of the community and there are few vantage points from which to assess their cumulative effects.

For these reasons, it makes sense to establish mechanisms to ensure continuing high-level attention to matters related to privacy as society and technology change and to educate the public about privacy issues. Although a number of standing boards and committees advise individual agencies on privacy-related matters (e.g., the Information Security and Privacy Advisory Board of the Department of Commerce, the Data Privacy and Integrity Advisory Committee of the Department of Homeland Security), their inputs are—by design—limited to the concerns of the agencies with which they are associated. The committee believes that at the federal level, a “privacy commissioner” type of office or a standing privacy commission would serve this role very well. A permanent governmental body with the charter of keeping discussions about privacy in the foreground of public debate and discussion could do much to reduce the number and intensity of unwanted privacy-related surprises that occur in the future.

Areas of focus and inquiry for such an office could include the following:

  • A comprehensive review of the legal and regulatory landscape, as described in Section 10.5.4.2. Such a review might be undertaken periodically so that changes in this landscape could be documented and discussed.

  • Trends in privacy-related incidents and an examination of new types of privacy-related incidents. Prior to the widespread use of the Internet, certain privacy issues, such as those associated with online “phishing,” never occurred. Because the deployment of new technologies is often accompanied by new privacy issues, warning of such issues could help the public to better prepare for them. Documented trends in privacy-related incidents would also provide some empirical basis for understanding public concerns about privacy. Note also that “incidents” should be defined broadly and, in particular, should not be restricted to illegal acts. For example, “incidents” might include testing of specific privacy policies for readability; with an appropriate sampling methodology, information could be provided to the public about whether the average readability level of privacy policies was going up or down.

  • Celebration and acknowledgment of privacy successes. Much as the Department of Commerce celebrates the quality of private companies through its Baldrige awards program, a privacy commissioner could acknowledge companies whose privacy protection programs were worthy of public note and emulation.

  • Normative issues in data collection and analysis. Grounded in the information technology environment of the early 1970s, the principles of fair information practice generally presume that the primary source of personal information about an individual is that person’s active and consensual engagement in providing such information to another party. This source is still quite important, but new sources of personal information have emerged in the past 30 years—video and infrared cameras, Internet usage monitors, biometric identification technology, electronic location devices, radio-frequency identification chips, and a variety of environmental sensors. In addition, new techniques enable the discovery of previously hidden patterns in large data sets—patterns that might well be regarded as new information in and of themselves.

These types of data acquisition devices and techniques have rarely been the subject of focused normative discussion. Currently, there are no principles or standards of judgment that would help public policy makers and corporate decision makers determine the appropriateness of using any given device or technique. (For example, the use of a given device or technique for gathering data may not be illegal, perhaps because it is so new that regulation has yet to appear, but the lack of legal sanctions against it does not mean that using it is the right thing to do.) Systematic attention to such principles by a privacy commissioner’s office might provide valuable assistance to these decision and policy makers.

  • Collective and group privacy. Historically, privacy regulation in the United States has focused on personal information—information about and collected from individuals. Issues related to groups have generally been addressed from the important but nevertheless narrow perspective of outlawing explicit discrimination against certain categories of individuals (e.g., categories defined by attributes such as race, religion, gender, sexual orientation, and so on). But new statistical profiling techniques, coupled with the increasingly ubiquitous availability of personal information about individuals, provide many new opportunities for sorting and classifying people in ways that are much less obvious or straightforward.

    Originally undertaken to improve marketing, risk management, and strategic communications, statistical profiling has served as the basis for decisions in these areas—and thus may have inappropriately excluded people from opportunities that might otherwise improve their ability to grow and develop as productive members of society (even as others may have been inappropriately included). However, the nature and scope of such exclusions are not known today, nor is the impact of these exclusions on the cumulative disadvantage faced by members of population segments likely to be victims of categorical discrimination. At the same time, others argue that equitable, efficient, and effective public policy requires the development of data resources that might require such sorting. A future review of privacy might examine these issues, as well as the ways in which unseen statistical sorting may constrain the options available to individuals and limit their ability to make truly informed and autonomous choices in their roles as citizens and consumers.

  • Privacy, intimacy, and affiliation. Although matters such as personal intimacy and affiliation are typically beyond the direct and formal purview of most public policy analysis, they are central to the good life. Indeed, one might well argue that a life without intimacy or without the freedom to affiliate with other people is a life largely shorn of meaning and fulfillment. It is at least plausible that the sense of privacy enjoyed by individuals affects the range of activity and behavior that might be associated with expressions of intimacy and affiliation. To the best of the committee’s knowledge, no review of privacy has ever considered these issues, and since almost all of the attention to privacy questions focuses on the behavior of governments and organizations, a future review might examine them.20

  • Informing and educating the public about privacy. The issues surrounding privacy are sufficiently complex that it may be unrealistic to expect the average person to fully grasp their meaning. A privacy commissioner’s office could help to educate the public about privacy issues (in the management of health care data and in other areas). Because this educational role would be institutionalized, it is reasonable to expect that the information such an office provided would be more comprehensible than the information offered by sources and parties with an interest in minimizing public concern about threats to privacy (e.g., difficult-to-read privacy notices sent by companies with economic interests in using personal information to the maximum extent possible).

    This educational role could have a number of components. For illustration only, it might include:

    • Review of and recommendations for how schools teach about privacy and how understanding of it could be improved in the face of recent rapid changes. For example, social networking services, such as those offered by Facebook.com and MySpace.com, continue to present challenges to the privacy and safety of many of the young people who use them. As relatively recent developments indicate, education about how young people should approach such services has been lacking.

    • Encouragement of manufacturers of surveillance equipment (whether tools for adults or toys for children) to include warning messages similar to those on other products such as cigarettes (e.g., noting that use of the tools is illegal unless certain conditions are met). Instruction booklets for such equipment might also briefly mention the value issues involved and, in the case of toys with a double-edged potential, encourage parents to discuss the issues raised by covertly invading the privacy of others, even if such actions appear to be benign and are undertaken only in fun.

    • Development of model discussions of privacy that could be used for instructional purposes.

20 Among some recent work relevant to the issue, see J. Smith, Private Matters, Addison Wesley, Reading, Mass., 1997; R. Gurstein, The Repeal of Reticence, Hill and Wang, New York, 1996; C. Calvert, Voyeur Nation, Westview Press, Boulder, Colo., 2000; and Gary T. Marx, “Forget Big Brother and Big Corporation: What About the Personal Uses of Surveillance Technology as Seen in Cases Such as Tom I. Voire?,” Rutgers Journal of Law and Urban Policy 3(4):219-286, 2006, available at http://garymarx.net.

The committee acknowledges that the notion of a privacy commissioner is controversial, with objections emanating from many points along the privacy policy spectrum. Some believe that the establishment of such offices is in reality a mechanism for avoiding coming to grips with the real policy issues of privacy. Others believe that the presence of such an office can be used to lend legitimacy to efforts that would otherwise be seen clearly as compromising privacy. Still others believe that the success of such a commissioner would be contingent on the power given to the commissioner and on the policy decisions concerning what kinds of privacy are important to protect, and that such commissioners are rarely given enough explicit authority to make substantive policy decisions regarding privacy.

Another camp believes that such offices stultify real progress and are likely to be mismanaged. And there is no denying that such an office would mark a significant movement in the direction of giving government an important role in protecting privacy. Nonetheless, the committee believes that the value of having a national and institutionalized focal point for promoting public discourse about privacy outweighs these possible objections.
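
To make the readability-trend example mentioned earlier concrete, the following sketch shows one way such a measurement might be performed. It computes the standard Flesch Reading Ease score (206.835 − 1.015 × words/sentences − 84.6 × syllables/words) for sampled policy texts. This is a minimal illustration, not part of the committee's recommendations: the naive syllable counter and the two sample excerpts are assumptions made purely for this example, and a real study would rest on a defensible sampling methodology.

import re

def count_syllables(word: str) -> int:
    """Roughly estimate syllables by counting groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease; higher scores mean easier reading, and scores
    below about 30 are generally considered very difficult."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

if __name__ == "__main__":
    # Hypothetical policy excerpts from two sampling periods (illustrative only).
    samples = {
        "Period 1": ["We may share your personally identifiable information with "
                     "affiliated entities to the extent permitted by applicable law."],
        "Period 2": ["We share your name and e-mail address only if you say we can."],
    }
    for period, texts in samples.items():
        avg = sum(flesch_reading_ease(t) for t in texts) / len(texts)
        print(f"{period}: average Flesch Reading Ease = {avg:.1f}")

Comparing such averages across sampling periods would yield exactly the kind of trend information, rising or falling readability, that a commissioner's office could report to the public.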

10.5.4.6
Establishing the Means for Recourse

Finding 21. The availability of individual recourse for recognized violations of privacy is an essential element of public policy regarding privacy.


Even the best laws, regulations, and policies governing privacy will be useless unless adequate recourse is available if and when they are violated. In the absence of recourse, those whose privacy has been violated (whether by accident or deliberately) must bear alone the costs and consequences of the violation. That is one possible approach to public policy, but the committee believes that such an approach would run contrary to basic principles of fairness that public policy should embody. The committee also believes that when recourse is available (i.e., when individuals can identify violations and be compensated for them), those in a position to act inappropriately tend to be more careful and more respectful of the privacy policies that they might otherwise inadvertently violate.


Recommendation 15. Governments at all levels should take action to establish the availability of appropriate individual recourse for recognized violations of privacy.


These comments apply whether the source of the violation is in government or in the private sector, although the nature of appropriate recourse varies depending on the source. In the case of government wrongdoing, the doctrine of sovereign immunity generally protects government actors from civil liability or criminal prosecution unless the government waives this protection or is statutorily stripped of immunity in the particular kinds of cases at hand. That is, against government wrongdoers, a statute must explicitly allow civil suits or criminal prosecution for recourse to exist.

Against private sector violators of privacy, a number of recourse mechanisms are possible.21 One approach is for legislatures (federal or state) to create causes of action when private organizations engage in certain privacy-violating practices, as legislatures have done in the case of unfair and deceptive trade practices. Such laws can be structured to allow government enforcement actions to stop the practice and/or private actions for damages brought by individuals harmed by the practices.

There are other possibilities as well. When local privacy commissioners or advocates have been legislatively chartered, their charge could include standing to take action on behalf of individuals who have been harmed, either tangibly or intangibly, by some privacy-violating action. Mediators or privacy arbitration boards might be established that could resolve privacy disputes; while this would still require those who thought their privacy had been violated to bring action against the violator, it might reduce the overhead of such actions in a way that would be acceptable to all.

21 In pursuing remedies against private sector invasions of privacy by the news media, publishers, writers, photographers, and others, caution is in order respecting freedoms of speech and press, as noted in Section 4.2.
