Part IV
Findings and Recommendations



10 Findings and Recommendations

10.1 COMING TO TERMS

Finding 1. The meaning of privacy is highly contextual, and it can vary depending on the specific circumstances at hand, such as the situation and relationships at issue, the intentions of the parties involved, and the historical context, technology, and political environment.

Chapters 1 and 2 of this report take note of the fact that in both everyday discourse and in the scholarly literature, a commonly agreed-upon abstract definition of privacy is elusive (Section 1.2). For example, the “privacy” under discussion may involve protecting the confidentiality of information; enabling a sense of autonomy, independence, and freedom to foster creativity; wanting to be left alone; or establishing enough trust that individuals within a given community are willing to disclose data under the assumption that it will not be misused. Nevertheless, it is often possible to find agreement on the meaning of privacy in specific contexts (Section 2.4). In other words, the meaning of privacy depends on many specifics: the situation and relationships at issue, the intentions of the parties involved, and the historical, technological, and political environment. For example, informational privacy involving political and religious beliefs raises different issues than does health information with respect to a contagious disease. A conversation with one’s attorney is different from a speech in a public park or a posting on an Internet bulletin board.

Agreement on the meaning of “privacy” outside the specified context is not necessary, but for making progress in a specific context, a common understanding is essential. In many cases, simply clarifying the terms constitutes progress in itself, and indeed may on occasion be sufficient to reduce the need for further argument. Because the committee found that common to almost all notions of privacy is a privileged status for personal information (privileged in the sense of information that is not immediately known or accessible to others), this report has focused on the meaning and implications of privacy as it relates to the gathering, aggregation, analysis, distribution, and use of personal information.

A successful discussion about privacy policies requires the clear identification of both the nature of the personal information in question and the relevant contextual factors. Regarding the nature of the personal information, it is important to probe in several areas discussed in Section 2.1.3 (an illustrative sketch of these probe areas and factors follows the lists below):

• Data capture, which includes the type(s) of personal information in question (e.g., Social Security number, medical information, publicly available information) and the circumstances and means of its capture;

• Data storage, which includes the time period for which data will be retained and available for use, the circumstances and means of storage (e.g., media used), and the protections for personal information while it is available for a specific use;

• Data analysis and integration, which includes the nature of the process through which the information is analyzed and the links that might be made to other data; and

• Data dissemination, which includes the parties who will have access to the information, the form(s) in which the information is presented, the type of harm that might result from unwelcome disclosure or dissemination, and the extent to which this information has privacy implications for other individuals.

Regarding the relevant contextual factors, it might be useful to probe the following:

• What is the relevant and applicable social and institutional context? For example, are rewards or benefits offered for sharing personal information? Is coercion used in the form of withholding benefits when personal information is not shared? Does the individual retain control over the initial and potential future uses of her information? Does she have the opportunity to review and correct personal information?

• Who are the actors and institutions involved? These might include the subject of the information, the provider of the information (who may not be the subject), the original recipients of the information, subsequent recipients, and other individuals who might be affected without their active involvement, as well as the relationships among them.

• What are the stated and unstated motivations, goals, or purposes of the actors? Why do the recipients of the information want it? How might the information be repurposed—used for a purpose other than that for which it was originally collected—in the future? How are decisions made when there are competing interests regarding personal information, for example, public health needs versus individual privacy or national security versus civil rights interests?

• What are the informational norms in question? As noted in Chapter 2, informational norms specify how different kinds of information about various actors in the given context can flow. These norms can be illuminated in many instances through the technique of applying anchoring vignettes, as described in Chapter 2. Relevant issues concerning these norms might include:

  - The extent to which information is provided voluntarily (e.g., is the providing of information required by law, is the information acquired covertly or deceptively);

  - The extent to which information can be passed along to third parties and the circumstances of such passing (e.g., is it part of a financial transaction);

  - The extent to which reciprocity exists (is the subject entitled to receive information or other benefits from the recipient);

  - The extent to which the gathering of information is apparent and obvious to those to whom the information pertains;

  - Limitations on the use of the information that are implied or explicitly noted;

  - Whether or not the act of subsequently providing information is known to the subject; and

  - The extent to which collected information can/might be used for or against others (e.g., relatives, other members of a class).
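
As an illustration of how the probe areas and contextual factors above might be made concrete, the following is a minimal sketch that records them as a simple data structure. It is purely illustrative: all class and field names are assumptions of this sketch, not terminology drawn from this report.

# Illustrative only: a structured checklist for the probe areas named above
# (data capture, storage, analysis and integration, dissemination) and the
# contextual factors. All names here are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PersonalInformationProbe:
    # Data capture
    information_types: List[str] = field(default_factory=list)  # e.g., "Social Security number"
    capture_circumstances: str = ""
    # Data storage
    retention_period: str = ""
    storage_protections: List[str] = field(default_factory=list)
    # Data analysis and integration
    analysis_process: str = ""
    linked_data_sources: List[str] = field(default_factory=list)
    # Data dissemination
    parties_with_access: List[str] = field(default_factory=list)
    potential_harms: List[str] = field(default_factory=list)

@dataclass
class ContextualFactors:
    social_institutional_context: str = ""                        # rewards, coercion, subject control
    actors: List[str] = field(default_factory=list)               # subject, provider, recipients
    motivations: List[str] = field(default_factory=list)          # stated and unstated purposes
    informational_norms: List[str] = field(default_factory=list)  # voluntariness, reciprocity, ...

# Example: a partially completed probe for a hypothetical medical-records discussion.
probe = PersonalInformationProbe(
    information_types=["medical information"],
    retention_period="indefinite by default",
    parties_with_access=["treating physician", "insurer"],
)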

One important corollary of Finding 1 is that policy debates are likely to be sterile and disconnected if they are couched simply in abstract terms. It should thus be expected that policy debates involving privacy will be couched in the language of the specific context involved—and such context-dependent formulations are desirable. The reason is that even if the issues themselves seem to carry over from one context to another, the weighting of each issue, and hence the relationships of issues to each other, are likely to depend on the specific context.

A second corollary is that because privacy has meaning only in context, the incidence of privacy problems (e.g., violations of privacy) is poorly defined outside specific contexts, and overall quantitative measures of privacy are not particularly meaningful. What may be more meaningful is careful delimitation of claims that are made based on domain-specific data. An example from the identity theft domain might involve hypothesizing the number of individuals per year whose names and Social Security numbers were potentially compromised by a security breach, rather than asserting these numbers as indicating identity theft.

A third corollary is that privacy is not primarily a technological issue—technology cannot violate or guarantee privacy. Technology can enhance or detract from the secrecy of information or the anonymity of an actor, but these are not the same as privacy. The nature and extent of privacy in any given context are tied to many factors, including the way in which information is accessed, the intentions of those accessing the information, and the trust relationships between the user of the information and the subject of the information.

10.2 THE VALUE OF PRIVACY

Finding 2. Privacy is an important value to be maintained and protected, although it is not an absolute good in itself.

As noted in Chapter 2, privacy is an important value to be maintained and protected. Certain types of privacy (e.g., those involving religious beliefs, political ideas, or certain aspects of the body) approach the status of fundamental human rights. They are related to our most cherished ideals of the dignity of the person, the family, liberty, and democracy. At the same time, the committee does not view privacy as an intrinsic and absolute good independent of particular situations. There are times when crossing the informational borders of the person is appropriate, and failing to do so would be irresponsible. That is, the committee recognizes situations and contexts in which society negotiates appropriate tradeoffs between privacy and other values (as discussed below), such as public health and safety. To note this is not to deny the centrality of privacy to human dignity, candor, and intimacy, as well as to a democratic society. Privacy is thus a means as well as an end, and the committee recognizes considerable instrumental value in privacy—privacy in the service of other important goals. Beyond instrumentality, privacy has important symbolic value in demonstrating societal respect for the individual.

Finding 3. Loss of privacy often results in significant tangible and intangible harm to individuals and to groups.

In one obvious example, protecting the privacy of one’s personal information helps to make one safer from crimes such as fraud, identity theft, and stalking.
(When undertaken on a large scale, identity theft can also have important and negative effects on society, as suggested by the use of identity theft as an element in the financing of terrorist groups and their operations; see Box 4.1.) But such tangible harms, striking though they are, affect far fewer people than the less tangible harms suffered by many others (as suggested in Section 1.3). These intangible harms could be regarded as the consequential damages to individuals and to society that result from the loss or compromise of privacy, and they are no less real or significant for being intangible rather than tangible. Consider:

• A person whose personal information (name, address, Social Security number, and so on) may have fallen into the hands of identity thieves may not in fact suffer from an actual fraudulent purchase made in her name. But if the breach is identified and the subject learns of it, she will likely worry about being victimized and thus must look over her shoulder for a very long time. She may have to scrutinize her credit card statements more carefully; she may have to subscribe to a credit-monitoring service; she may have to put a freeze on her credit report and thereby deny herself the convenience of obtaining instant credit at a store. She may live in fear of assault, public embarrassment, or defamation, not knowing who has the information or how it might be used. Thus, absent the protection of her information, she stands to lose real benefits and the intangible peace of mind that she would otherwise enjoy, even if no actual direct harm occurs, not to mention the many dozens of hours needed to repair her records and relationships. Furthermore, it takes only a few well-publicized incidents (i.e., a small number compared with the number of possible instances) to cause a very large number of people to lose trust in electronic commerce and related matters—and thus to refrain from engaging in such commerce. Such broader impacts have larger consequences for the economy as a whole than the impact on the individuals directly affected by identity theft alone.

• Under public surveillance, many people change their behavior so that they are not seen as acting anomalously in any way, even if their behavior absent surveillance would be perfectly legal and ethical. For example, an interracial couple may walk down the road holding hands and even sneak a kiss. With surveillance cameras visibly trained on the road, they may not kiss, they may not hold hands, and they may even change their route so that they are not under video surveillance. Public surveillance may also reduce the likelihood that someone would attend a public demonstration in which he might otherwise participate. In short, surveillance often has the effect of influencing the behavior of people in the direction of greater conformity and homogeneity. Greater conformity is sometimes defensible, as might be the case when safe driving can be linked to automatic traffic camera surveillance.
But surveillance in some instances has negative consequences, and in a culture and society that celebrate diversity and embrace tolerance, such chilling effects are not at all positive. In short, privacy supports many democratic societal values, such as the right to associate freely, the embrace of social diversity, and even the use of secret ballots in support of free elections, and consequently the loss of privacy can affect the entire society.

• Through the analysis of a variety of personal information, the U.S. government has placed many individuals on “watch lists” as suspected terrorists who should be denied airplane boarding privileges or entry into the United States. Individuals on these watch lists cannot know of their status until they are denied boarding or entry—and if they are in fact not terrorists, they suffer the consequences of mistaken identity. Further, they have no recourse—no way to be made whole—for the consequences they suffer.

• Workplace surveillance changes the workplace environment, almost by definition. But unlike most unfocused public surveillance, the very purpose of workplace surveillance is to change the behavior of everyone within its purview. From the standpoint of employees, poorly explained surveillance (which is often simply presented as a fait accompli) can result in a deadened work environment perceived as hostile and restrictive, one in which workers are not trusted and “are treated like children.” Ironically, work monitoring seen as unreasonable is likely to be responded to in ways that undermine the goals of the organization, and such surveillance may raise the level of stress among workers in ways that limit their productivity.

• A voter without privacy is subject to coercion in casting his or her vote. Indeed, it was for just this reason that the secret ballot was gradually introduced in the United States in the late 19th century. With a secret ballot, there is no way to prove how an individual voted, and thus a voter can cast his or her vote freely without fear of later retribution. Secret ballots also impede vote buying, since a voter can vote one way and tell his or her paymaster that the vote was cast as paid for.

• The availability of personal information about an individual enables various organizations to provide him or her with information or product and service offerings customized to the interests and patterns reflected in such information. While such information and offerings do have benefit for many people who receive them, they can have negative effects as well. For example, personal medical information made available to drug manufacturers may result in drug advertisements being targeted to individuals with certain diseases. Receipt of such advertisements at one’s family home can compromise the privacy of the individual’s medical information if the diseases associated with such drugs are socially stigmatizing.

• People who lose control of their personal information can be subject to discrimination of various kinds (Section 2.3). As a society, we have made a choice that discrimination based on race and religion (among other things) should be illegal as a matter of public policy. But many other distinctions can be made when detailed personal information is available that facilitates the classification and assignment of people to groups—groups defined not by race or religion but by some nameless statistical sorting on multiple dimensions. Members of groups so defined can be denied services, information, opportunities, or employment to which they would otherwise be entitled if that personal information had been kept private. For example, political campaigns can use collections of personal information to tailor different messages to members of different groups, messages designed to appeal to their particular views and attitudes. Such practices work against full disclosure and a community-wide consideration of the issues.

These examples underscore the committee’s categorical rejection of the notion that if you have done nothing wrong, you have nothing to fear from a loss of privacy. It should also be noted that the possibility of being put under surveillance is often as significant in changing behavior as the reality of such surveillance. From dummy surveillance cameras intended to deter crime to fellow diners in a cafeteria who might be listening to a private conversation, there are many ways in which potential surveillance can affect behavior.

Finding 4. Privacy is particularly important to people when they believe that the entity receiving their personal information is not trustworthy and that they may be harmed by sharing that information.

Trust is an important issue in framing concerns regarding privacy. In the context of an individual providing personal information to another, the sensitivities involved will depend on the degree to which the individual trusts that party to refrain from acting in a manner contrary to his or her interests (e.g., passing the information along to someone else, or using it as the basis for a decision with inappropriately adverse consequences). As an extreme case, consider the act of providing a complete dossier of personal information—on a stack of paper—to a person who will destroy it. If the destruction is verifiable to the person providing the dossier (and if there is no way for the destroyer to read the dossier), it would be hard to assert the existence of any privacy concern at all.

But for most situations in which one provides personal information, the basis for trust is less clear.
Children routinely assert privacy rights to their personal information against their parents when they do not trust that their parents will refrain from criticizing them, punishing them, or thinking ill of them as a result of accessing that information. (They also assert privacy rights in many other situations.) Adults who purchase health insurance often assert privacy rights in their medical information because they are concerned that insurers might not insure them or might charge high prices on the basis of some information in their medical record. Many citizens assert privacy rights against government, although few would object to the gathering of personal information within the borders of the United States and about U.S. citizens if they could be assured that such information was being used only for genuine national security purposes and that any information that had been gathered about them was accurate and appropriately interpreted and treated (as discussed in Section 9.2.5). Perversely, many people hold contradictory views about their own privacy and that of other people—that is, they support curtailing the privacy of some demographic groups at the same time that they believe their own privacy should not be similarly curtailed. This dichotomy almost certainly reflects their views about the trustworthiness of certain groups versus their own.

In short, the act of providing personal information is almost always accompanied to varying degrees by a perceived risk of negative consequences flowing from an abuse of trust. The perception may or may not be justified by the objective facts of the situation, but trust has an important subjective element. If the entity receiving the information is not seen as trustworthy, the individuals involved are likely to be much more hesitant to provide that information (or to provide it accurately) than they would be under other circumstances involving a greater degree of trust.

10.3 PRESSURES ON PRIVACY

The discussion in earlier chapters suggests that there are many pressures that increasingly limit privacy. Among them are advancing information technologies; increasing mechanisms for obtaining information; the value of personal information to business and government; and changing social norms and needs.

Finding 5. Although some developments in information technology (IT) and other technologies do have considerable potential to enhance privacy, the overall impact of advancing technology, including IT, has been to compromise privacy.

One obvious pressure on privacy is the evolution of information technology writ large, an evolution that has resulted in a greater capability to invade and compromise privacy more deeply and more easily than ever before.
One might ask whether this result was inevitable—whether under a different set of societal structures and different notions of power and privilege the evolution of IT might have done more to enhance privacy. But even though some developments in IT do indeed have the potential to enhance privacy, there is little doubt that the overall impact of advancing IT has been to compromise privacy in important ways. For example, the rapidly decreasing cost of storing information has meant that personal information on an individual, once collected, may generally be available for potential use forever unless special measures are taken to destroy it (Chapter 3). Even when there is no particular need to keep information for a long time, it is often kept by default, because it is more expensive to decide what to destroy or delete than to maintain it in storage. Such information is easily if not routinely added to existing databases on the individual, which means that the volume of information about an individual only grows over time.1 A second example is the proliferation of smaller, less expensive, and more easily deployed sensors that can readily obtain information from their ambient environment, information that is sometimes personal information about individuals.

Technology has also facilitated greater access to information (Section 3.4). Nominally public records stored on paper are vastly less accessible than records whose contents are posted on a Web site or available online, and in that sense they are more private quite apart from any rules regulating access to them. For example, property tax records have been available to the public in most municipalities for decades. The inconvenience of access has prevented widespread knowledge of neighbors’ property values, but when such information is available via the Internet, it is disseminated much more broadly.

More generally, information technology is a rapidly changing field. New information technologies—and new sensor, biometric, and life science technologies, too—often offer capabilities poorly understood and considered in public debates or in individuals’ expectations of privacy. Traditional expectations about information are in a sense under continuous bombardment from such changes, and prior beliefs, understandings, and practices are not necessarily an adequate guide or control with respect to the torrent of new developments. The net result is that the appearance

1 An example is a person’s medical history, much of which is irrelevant to an individual’s current medical status. Information regarding major medical events (surgeries, major diseases) and associated significant data such as reports on operations, X rays, and pathology reports continue to be useful, but much of the medical record over time becomes filled with data that may be maintained for medical-legal purposes but has little value to the treating physician long after the fact. Such data might, for example, include lab work taken during a critical event or during routine care many years in the past.

circumstances change rather than use such circumstantial changes to advance long-standing agendas that were previously blocked by public opposition. Note, too, that these comments apply irrespective of any particular policy outcome or preference. They are a call for deliberation and moderation rather than hasty overreaction—whether the issue is revelation of a government abuse (which might lead to excessive curtailment of law enforcement or national security authorities) or a terrorist incident (which might lead to excessive intrusions on privacy). And they also imply a need to build into policy some mechanisms, such as “sunset requirements,” that facilitate the periodic revisiting of these issues.

10.5.4.4 The Relevance of Fair Information Practices Today

Finding 19. The principles of fair information practice enunciated in 1973 for the protection of personal information are as relevant and important today as they were when they were first formulated.

Principles of fair information practice were first enunciated in 1973 (Section 1.5.4). At the time, they were intended to apply to all automated personal data systems by establishing minimum standards of fair information practice, violation of which would constitute “unfair information practice” subject to criminal penalties and civil remedies. Pending legislative enactment of such a code, the report also recommended that the principles be implemented through federal administrative action. In 1974, the Privacy Act (Section 4.3.1) was passed, applying these principles to personal information in the custody of federal agencies. In addition, the Fair Credit Reporting Act (first passed in 1970 and amended several times thereafter) applies these principles to the accuracy, fairness, and privacy of personal information assembled by private sector credit-reporting agencies. Many other private sector organizations have also adopted privacy policies that trace their lineage to some or all of the principles of fair information practice.

Since 1973, the environment surrounding the gathering and use of personal information has changed radically. Information technology is increasingly networked. Private sector gathering and use of personal information have expanded greatly since the early 1970s, and many private sector organizations that manage personal information, such as data aggregators (Section 6.5), are not covered by fair information practices, either under the law or under a voluntary privacy policy based on these principles. National security considerations loom large as well, and the risks of compromising certain kinds of personal information are arguably greater in an environment in which terrorism and identity theft go hand in hand (see Box 4.1 in Chapter 4).

For these reasons, the committee believes that the principles of fair information practice are as relevant today—perhaps more so—for the protection of personal information as they were when they were first formulated.

Recommendation 9. Principles of fair information practice should be extended as far as reasonably feasible to apply to private sector organizations that collect and use personal information.

Although some of the restrictions on government regarding the collection and use of personal information are not necessarily applicable to the private sector, the values expressed by the principles of fair information practice should also inform private sector policies regarding privacy. Reasonableness involves a variety of factors, including an assessment of the relative costs and benefits of applying these principles. This recommendation is thus consistent with the original intent behind the 1973 Department of Health, Education, and Welfare report covering all organizations handling personal information (not just government agencies),18 although the committee is explicitly silent on whether the legislative enactment of a code of fair information practices is the most appropriate way to accomplish this goal.

For the sake of illustration, another approach to encouraging the broad adoption of fair information practices might be based on the “safe harbor” approach described in Section 4.7. That is, a private sector organization that collected or used personal information would self-certify that it is in compliance with safe harbor requirements, which would be based on the principles of fair information practice. Periodic assessments of the extent to which mechanisms for ensuring enforcement of these requirements have been developed and applied would be provided to the public. Adherence to these requirements would similarly take the form of government enforcement of the federal and state statutes relevant to unfair and deceptive business practices. In return, complying organizations could be granted immunity from civil or criminal action stemming from alleged mishandling of personal information.

Within the domain of fair information practices, the committee calls attention to two particularly important topics: the repurposing of data and the notion of choice and consent.

18 U.S. Department of Health, Education, and Welfare, Records, Computers and the Rights of Citizens, Report of the Secretary’s Advisory Committee on Automated Personal Data Systems, MIT Press, Cambridge, Mass., 1973.

Recommendation 10. To support greater transparency in decision making regarding repurposing, guidelines should be established for informing individuals that repurposing of their personal information might occur, what the nature of such repurposing would be, and what factors would be taken into account in making any such decision.

While repurposing is not necessarily privacy invasive (e.g., medical information gathered for clinical decision making can be used to conduct epidemiological research in ways that are privacy preserving), there is an unavoidable tension between the principle that one should know how personal information collected from him or her will be used before it is collected and the possibility that information collectors might want to use that information in the future for a purpose that cannot be anticipated today. While this tension cannot necessarily be resolved in any given instance, it should be possible to provide greater transparency into the resolution process. Accordingly, guidelines should be established for informing individuals that repurposing might occur, about the nature of such repurposing, and about the factors that would be taken into account in making any such decision. Educating the public about the nature of this tension is also important, and might be undertaken as part of the effort described in Recommendation 14.

Recommendation 11. The principle of choice and consent should be implemented so that individual choices and consent are genuinely informed and so that implementation accounts fairly for the demonstrated human tendency to accept without change choices made by default.

Even with mandated disclosure, individuals have choices about whether or not to provide information. But only informed choice—choice made when the deciding individual has an adequate amount of the important information that could reasonably affect the outcome of the choice—is morally and ethically meaningful. Individuals are entitled to be informed about answers to the questions articulated in Section 10.1—and parties acquiring personal information are morally and ethically obligated to provide such information to subjects. Vague notices that obfuscate and presume high educational levels of their readers do not satisfy these obligations, even if they do technically comply with legal requirements. Moreover, as the issues of data collection become more complex, the task of providing usable and comprehensible information increases in difficulty. The importance of default choices has been empirically demonstrated.

As noted in Section 2.2.5, the endless debate over the desirability of opt-in versus opt-out regimes is a debate over which of these should be the information subject’s default choice. In fact, it is easy to circumvent this forced choice between defaults by requiring the individual to choose explicitly between opting in and opting out. Recall that opting in means that the individual must affirmatively allow the primary data recipient to share his or her information, while opting out means that the individual must affirmatively disallow such sharing of information. But consent requirements could be formulated so that the individual had to choose one of these options explicitly—either “I choose to share information” or “I choose to not share information”—and so that the selection of one of these options would be as essential to processing the form as the individual’s Social Security number would be for a financial institution. Absent a choice, the form would be regarded as null and void, and returned to the individual for resubmission.
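
A minimal sketch of how such a consent requirement might be enforced in form-processing software follows. It is purely illustrative: the class names, messages, and validation rule are assumptions of this sketch, not anything prescribed in this report.

# Illustrative sketch only: enforcing an explicit consent choice with no
# default, per the formulation above. All names here are hypothetical.
from enum import Enum
from typing import Optional

class ConsentChoice(Enum):
    SHARE = "I choose to share information"
    DO_NOT_SHARE = "I choose to not share information"

class FormRejected(Exception):
    """Raised when a submitted form must be returned for resubmission."""

def process_form(consent: Optional[ConsentChoice]) -> ConsentChoice:
    # No default is applied: a missing choice renders the form null and void,
    # just as a missing Social Security number would for a financial institution.
    if consent is None:
        raise FormRejected("No consent choice made; form returned for resubmission.")
    return consent

# Example: a form submitted without an explicit choice is rejected outright.
try:
    process_form(None)
except FormRejected as err:
    print(err)
print(process_form(ConsentChoice.DO_NOT_SHARE).value)

The design point is simply that neither option is preselected, so the empirically demonstrated tendency to accept defaults cannot silently decide the question for the individual.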

Recommendation 12. The U.S. Congress should pay special attention to and provide special oversight regarding the government use of private sector organizations to obtain personal information about individuals.

As noted in Chapter 6, government use of private sector organizations to obtain personal information about individuals is increasing. Fair information practices applied to data aggregation companies would go a long way toward providing meaningful oversight of such use. However, even if data aggregation companies are not covered by fair information practices in the future (either directly or indirectly—that is, through the extended application of fair information practices to government agencies that use such companies), the committee recommends that such use receive special attention and oversight from the U.S. Congress and other appropriate bodies so that privacy issues do not fall between the cracks established by contracts and service agreements. To illustrate what might be included under attention and oversight, the committee notes two possible oversight mechanisms: periodic hearings (in this case, into government use of these organizations) and reporting requirements for U.S. government agencies that would publicly disclose the extent and nature of such use.

10.5.4.5 Public Advocates for Privacy

Finding 20. Because the benefits of privacy often are less tangible and immediate than the perceived benefits of other interests such as public security and economic efficiency, privacy is at an inherent disadvantage when decision makers weigh privacy against these other interests.

As noted in Section 10.4, privacy offers benefits that are often less tangible, visible, or immediate than those offered by public safety, economic efficiency, and so on. The consequence is that privacy is at an inherent disadvantage in the decision-making competition for priority and resources. For other issues in which short-term pressures tend to crowd out longer-term perspectives, the mechanism of institutionalized advocacy has found some success. For example, the Environmental Protection Agency was established to provide a bureaucratic counterweight to the forces of unrestricted economic development inside and outside government.

Today, a number of privately funded organizations, such as the Electronic Privacy Information Center and the Electronic Frontier Foundation, act as generalized advocates for privacy in the public policy sphere. Such groups, while an important ingredient in the debate concerning privacy, are generally focused at the national level, and resource limitations mean that they concentrate primarily on the most egregious threats to privacy if and when those threats come to notice. Perhaps most importantly, they do not have institutionally established roles in the public policy process, and they achieve success primarily based on the extent to which they can mobilize public attention to some privacy issue. In contrast, an organizational privacy advocate would have better access to relevant information from government agencies (and possibly, under some circumstances, from private organizations), legal standing, and greater internal legitimacy, enabling it to play a complementary but no less important role.

Recommendation 13. Governments at various levels should establish formal mechanisms for the institutional advocacy of privacy within government.

Institutionalized advocacy can take place at a variety of levels—individual organizations, local government, federal agencies, and so on. An example of institutionalized advocacy is the Privacy Office of the U.S. Department of Homeland Security (DHS), whose mission is to minimize the impact of departmental activities on the individual’s privacy, particularly the individual’s personal information and dignity, while achieving the mission of the DHS.19 The DHS Privacy Office is the focal point of departmental activities that protect the collection, use, and disclosure of personal and departmental information.

19 This description is based on the DHS description of its Privacy Office, available at http://www.dhs.gov/dhspublic/interapp/editorial/editorial_0338.xml.

In addition, the Privacy Office supports the DHS Data Privacy and Integrity Advisory Committee, which provides advice on programmatic, policy, operational, administrative, and technological issues within DHS that affect individual privacy, as well as on data integrity, data interoperability, and other privacy-related issues. The Privacy Office also holds public workshops to explore the policy, legal, and technology issues surrounding the information of government, the private sector, and individuals, and the intersection of privacy and homeland security.

A common complaint about standards issued at a national level—regardless of subject—is that they do not take into account local contexts and perspectives, and a “one-size-fits-all” mentality can easily lead to absurdities that undercut both public support and the spirit of the original standard. But local communities can have their own institutional advocates, and it may make sense to consider the idea of local enforcement of national standards as a way to obtain some of the efficiencies afforded by national standards along with the benefits of local awareness of how those standards might sensibly be implemented in practice.

Recommendation 14. A national privacy commissioner or standing privacy commission should be established to provide ongoing and periodic assessments of privacy developments.

As discussed in earlier chapters (especially Chapters 1 and 3), rapid changes in technology or in circumstances can and often do lead to changes in societal definitions of privacy and in societal expectations for privacy. Solutions developed and compromises reached today may still be solidly grounded a year from now, but 3 years is enough for a new “killer app” technology to emerge into widespread use (thus changing what is easily possible in the sharing of information), and a decade is enough for today’s minority political party to become the majority in both the legislature and the executive branch. Any of these eventualities is bound to require a new and comprehensive examination of privacy issues. Thus, it is unrealistic to expect that privacy bargains will become settled “once and for all” or that expectations will be static. Dynamic environments require continuous attention to privacy issues and readiness to examine taken-for-granted beliefs that may no longer be appropriate under rapidly changing conditions.

Of significance is the likelihood that the effects of changes in the environment will go unnoticed by the public in the absence of some well-publicized incident that generates alarm. Even for those generally knowledgeable about privacy, the total impact of these developments is difficult to assess, because rapid changes occur in so many different sectors of the community and there are few vantage points from which to assess their cumulative effects.

For these reasons, it makes sense to establish mechanisms to ensure continuing high-level attention to matters related to privacy as society and technology change, and to educate the public about privacy issues. Although a number of standing boards and committees advise individual agencies on privacy-related matters (e.g., the Information Security and Privacy Advisory Board of the Department of Commerce, the Data Privacy and Integrity Advisory Committee of the Department of Homeland Security), their inputs are—by design—limited to the concerns of the agencies with which they are associated. The committee believes that at the federal level, a “privacy commissioner” type of office or a standing privacy commission would serve this role very well. A permanent governmental body with the charter of keeping discussions about privacy in the foreground of public debate and discussion could do much to reduce the number and intensity of unwanted privacy-related surprises in the future. Areas of focus and inquiry for such an office could include the following:

• A comprehensive review of the legal and regulatory landscape, as described in Section 10.5.4.2. Such a review might be undertaken periodically so that changes in this landscape could be documented and discussed.

• Trends in privacy-related incidents and an examination of new types of privacy-related incidents. Prior to the widespread use of the Internet, certain privacy issues, such as those associated with online “phishing,” never occurred. Because the deployment of new technologies is often accompanied by new privacy issues, warning of such issues could help the public to better prepare for them. Documented trends in privacy-related incidents would also provide some empirical basis for understanding public concerns about privacy. Note also that “incidents” should be defined broadly and, in particular, should not be restricted to illegal acts. For example, “incidents” might include testing of specific privacy policies for readability; with an appropriate sampling methodology, information could then be provided to the public about whether the average readability level of privacy policies was going up or down. (A sketch of such a readability test appears after this list.)

• Celebration and acknowledgment of privacy successes. Much as the Department of Commerce celebrates the quality of private companies through its Baldrige awards program, a privacy commissioner could acknowledge companies whose privacy protection programs were worthy of public note and emulation.

• Normative issues in data collection and analysis.
Grounded in the information technology environment of the early 1970s, the principles of fair information practice generally presume that the primary source of personal information about an individual is that person’s active and consensual engagement in providing such information to another party. This source is still quite important, but new sources of personal information have emerged in the past 30 years—video and infrared cameras, Internet usage monitors, biometric identification technology, electronic location devices, radio-frequency identification chips, and a variety of environmental sensors. In addition, new techniques enable the discovery of previously hidden patterns in large data sets—patterns that might well be regarded as new information in and of themselves. These types of data acquisition devices and techniques have rarely been the subject of focused normative discussion. Currently, there are no principles or standards of judgment that would help public policy makers and corporate decision makers determine the appropriateness of using any given device or technique. (For example, the use of a given device or technique for gathering data may not be illegal, perhaps because it is so new that regulation has yet to appear, but the lack of legal sanctions against it does not mean that using it is the right thing to do.) Systematic attention to such principles by a privacy commissioner’s office might provide valuable assistance to these decision and policy makers.

• Collective and group privacy. Historically, privacy regulation in the United States has focused on personal information—information about and collected from individuals. Issues related to groups have generally been addressed from the important but nevertheless narrow perspective of outlawing explicit discrimination against certain categories of individuals (e.g., categories defined by attributes such as race, religion, gender, sexual orientation, and so on). But new statistical profiling techniques, coupled with the increasingly ubiquitous availability of personal information about individuals, provide many new opportunities for sorting and classifying people in ways that are much less obvious or straightforward. Originally undertaken to improve marketing, risk management, and strategic communications, statistical profiling has served as the basis for decisions in these areas—and thus may have served to inappropriately exclude people from opportunities that might otherwise improve their ability to grow and develop as productive members of society (even as others may be inappropriately included). However, the nature and scope of such exclusions are not known today, nor is the impact of these exclusions on the cumulative disadvantage faced by members of population segments likely to be victims of categorical discrimination. At the same time, others argue that equitable, efficient, and effective public policy requires the development of data resources that might require such sorting. A future review of privacy might examine these issues, as well as the potential constraining effects on options available to individuals and their ability to make truly informed and autonomous choices in their roles as citizens and consumers in the face of unseen statistical sorting.

• Privacy, intimacy, and affiliation. Although matters such as personal intimacy and affiliation are typically beyond the direct and formal purview of most public policy analysis, they are central to the good life. Indeed, one might well argue that a life without intimacy or without the freedom to affiliate with other people is a life largely shorn of meaning and fulfillment. It is at least plausible that the sense of privacy enjoyed by individuals affects the range of activity and behavior that might be associated with expressions of intimacy and affiliation. To the best of the committee’s knowledge, no review of privacy has ever considered these issues, and since almost all of the attention to privacy questions focuses on the behavior of governments and organizations, a future review might examine them.20

20 Among some recent work relevant to the issue, see J. Smith, Private Matters, Addison Wesley, Reading, Mass., 1997; R. Gurstein, The Repeal of Reticence, Hill and Wang, New York, 1996; C. Calvert, Voyeur Nation, Westview Press, Boulder, Colo., 2000; and Gary T. Marx, “Forget Big Brother and Big Corporation: What About the Personal Uses of Surveillance Technology as Seen in Cases Such as Tom I. Voire?,” Rutgers Journal of Law and Urban Policy 3(4):219-286, 2006, available at http://garymarx.net.

• Informing and educating the public about privacy. The issues surrounding privacy are sufficiently complex that it may be unrealistic to expect the average person to fully grasp their meaning. A privacy commissioner’s office could help to educate the public about privacy issues (in the management of health care data and in other areas). Because this educational role would be institutionalized, it is reasonable to expect that the information such an office provided would be more comprehensible than the information offered by sources and parties with an interest in minimizing public concern about threats to privacy (e.g., difficult-to-read privacy notices sent by companies with economic interests in using personal information to the maximum extent possible). This educational role could have a number of components. For illustration only, it might include:

  - Review of and recommendations for how schools teach about privacy and how understanding of it could be improved in the face of recent rapid changes. For example, social networking sites and services, such as Facebook.com and MySpace.com, continue to present challenges to the privacy and safety of many of the young people who use them. As relatively recent developments indicate, education about how these young people should approach such services has been lacking.

  - Promotion, among the manufacturers of surveillance equipment (whether tools for adults or toys for children), of warning messages similar to those on other products such as cigarettes (e.g., noting that use of the tools is illegal unless certain conditions are met). Instruction booklets for such equipment might also briefly mention the value issues involved and, in the case of toys with a double-edged potential, encourage parents to discuss the issues raised by covertly invading the privacy of others, even if such actions appear to be benign and are undertaken only in fun.

  - Development of model discussions of privacy that could be used for instructional purposes.
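
In connection with the readability testing mentioned in the list above, the following is a minimal sketch of how the average readability of sampled privacy policies might be tracked over time. It uses the standard Flesch reading-ease formula; the crude syllable-counting heuristic and all function names are illustrative assumptions, not anything prescribed in this report.

# Illustrative sketch only: scoring sampled privacy policies with the Flesch
# reading-ease formula, 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
# Higher scores indicate easier reading. The syllable count is a rough heuristic.
import re
from typing import List

def count_syllables(word: str) -> int:
    # Crude approximation: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

def average_readability(sampled_policies: List[str]) -> float:
    # Reported year over year, this average would show whether the readability
    # of privacy policies was going up or down.
    return sum(flesch_reading_ease(p) for p in sampled_policies) / len(sampled_policies)

sample = ["We collect your data. We may share it with partners you approve."]
print(round(average_readability(sample), 1))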

The committee acknowledges that the notion of a privacy commissioner is controversial in a number of ways, with objections emanating from many points along the privacy policy spectrum. Some believe that the establishment of such offices is in reality a mechanism to avoid coming to grips with the real policy issues of privacy. Others believe that the presence of such an office can be used to lend legitimacy to efforts that would otherwise be seen clearly as compromising privacy. Still others believe that the success of such a commissioner would be contingent on the power given to the commissioner and on the policy decisions concerning what kinds of privacy are important to protect, and that such commissioners are rarely given enough explicit authority to make substantive policy decisions regarding privacy. Another camp believes that such offices stultify real progress and are likely to be mismanaged. And there is no denying that such an office would mark a significant movement in the direction of giving government an important role in protecting privacy. Nonetheless, the committee believes that the value of having a national and institutionalized focal point for promoting public discourse about privacy outweighs these possible objections.

10.5.4.6 Establishing the Means for Recourse

Finding 21. The availability of individual recourse for recognized violations of privacy is an essential element of public policy regarding privacy.

Even the best laws, regulations, and policies governing privacy will be useless unless adequate recourse is available if and when they are violated. In the absence of recourse, those whose privacy has been improperly violated (whether by accident or deliberately) must bear alone the costs and consequences of the violation. This is one possible approach to public policy, but the committee believes that it would run contrary to basic principles of fairness that public policy should embody. The committee also believes that when recourse is available (i.e., when individuals can identify and be compensated for violations), those in a position to act inappropriately tend to be more careful and more respectful of privacy policies that they might otherwise inadvertently violate.

Recommendation 15. Governments at all levels should take action to establish the availability of appropriate individual recourse for recognized violations of privacy.

These comments apply whether the source of the violation is in government or in the private sector, although the nature of appropriate recourse varies with the source. In the case of government wrongdoing, the doctrine of sovereign immunity generally protects government actors from civil liability or criminal prosecution unless the government waives this protection or is statutorily stripped of immunity in the particular kinds of cases at hand. That is, against government wrongdoers, a statute must explicitly allow civil suits or criminal prosecution for recourse to exist.

Against private sector violators of privacy, a number of recourse mechanisms are possible.21 One approach is for legislatures (federal or state) to create causes of action for cases in which private organizations engage in certain privacy-violating practices, as legislatures have done in the case of unfair and deceptive trade practices. Such laws can be structured to allow government enforcement actions to stop the practice and/or individual actions for damages brought by those harmed by the practices.

There are other possibilities as well. Where local privacy commissioners or advocates have been legislatively chartered, their charge could include standing to take action on behalf of individuals who have been harmed, either tangibly or intangibly, by some privacy-violating action. Mediators or privacy arbitration boards might be established to resolve privacy disputes; while this approach would still require those who thought their privacy had been violated to bring action against the violator, it might reduce the overhead of such actions in a way acceptable to all.

21 In pursuing remedies against private sector invasions of privacy by the news media, publishers, writers, photographers, and others, caution is in order respecting freedoms of speech and press, as noted in Section 4.2.