by public figures and in legal and other academic literature. The myriad definitions have at their core the basic notions of limiting access to, and providing control over, personal information or contact with individuals.2

Access and control can be provided through either technical or legal and regulatory measures. Access can be limited by laws that prohibit or restrict the collection and disclosure of information, or by technology that facilitates anonymous transactions or otherwise minimizes disclosure. One way to provide control over personal information is through laws and regulations that mandate choice: the choice either to opt in or to opt out. Another is the use of technology that facilitates informed consent, such as tools that keep track of and enforce users' privacy preferences.
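As a minimal sketch of how such a preference-enforcement tool might work, loosely in the spirit of P3P-style preference matching, the following checks a site's stated data practices against a user's recorded choices. The DataPractice and Preferences representations here are hypothetical simplifications for illustration, not any deployed policy format.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataPractice:
    """A single data practice a site declares (hypothetical representation)."""
    category: str   # e.g., "location", "browsing-history"
    purpose: str    # e.g., "advertising", "service-delivery"
    shared: bool    # disclosed to third parties?

@dataclass(frozen=True)
class Preferences:
    """The user's recorded consent choices."""
    allowed: frozenset        # set of (category, purpose) pairs the user permits
    allow_third_party: bool   # is third-party sharing acceptable?

def permits(prefs: Preferences, practice: DataPractice) -> bool:
    """Return True if the user's preferences allow this data practice."""
    if practice.shared and not prefs.allow_third_party:
        return False
    return (practice.category, practice.purpose) in prefs.allowed

# Example: a user who accepts location data for service delivery
# but opts out of advertising use and third-party sharing.
prefs = Preferences(
    allowed=frozenset({("location", "service-delivery")}),
    allow_third_party=False,
)

site_practices = [
    DataPractice("location", "service-delivery", shared=False),
    DataPractice("location", "advertising", shared=True),
]

for p in site_practices:
    verdict = "permitted" if permits(prefs, p) else "blocked"
    print(f"{p.category}/{p.purpose}: {verdict}")
```

A tool of this kind could sit between the user and a site, blocking or flagging practices that fall outside the recorded preferences, which is one way technology can make an opt-in or opt-out choice meaningful in practice.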

Although work in the past has often focused on information collected by Web sites, a wide array of current and emerging technologies will have significant impacts on privacy, including behavioral advertising, social networks, deep packet inspection, server log files, and location sharing. All of these technologies raise questions about how to communicate meaningfully about their effects on privacy and about how to help people understand privacy risks that may seem distant or irrelevant to them today. Relatedly, differences in rates and patterns of use and in the acceptance of these technologies suggest that different types of communication may be needed to reach people of different ages, genders, and cultures.

Cranor drew the connection between privacy and usability, observing that the privacy concerns people often express seem inconsistent with their actual behavior; that is, people say that they want privacy but do not always take the steps necessary to protect it. There are many possible explanations. People may not actually care all that much about privacy, or they may favor short-term benefits over the long-term consequences of disclosure for their privacy. But there are other possible explanations for the gap between expressed concerns and behavior: people may not understand the privacy implications of their behavior; the cost of privacy protection may be too high (including the cost of figuring out what steps to take); or users may believe that they have taken steps to protect their privacy when, having misunderstood those steps, they actually have not. All three of these latter possibilities directly implicate usability.

One case where usability issues impede privacy protection is the use of privacy policies, which are intended to inform consumers about privacy practices.

2 Two recent CSTB reports explored these definitional issues: see National Research Council, Who Goes There?: Authentication Through the Lens of Privacy, The National Academies Press, Washington, D.C., 2003; and National Research Council, Engaging Privacy and Information Technology in a Digital Age, The National Academies Press, Washington, D.C., 2007.


