1. [Monali] was arrested on charges of assault and battery last year. She lives in a county that maintains all records of criminal charges for public inspection at the county courthouse.

  2. [David] was arrested on charges of assault and battery last year. He lives in a county that maintains all records of criminal charges at the county courthouse for public inspection and in an electronic database, to which any police officer or county official has access.

  3. [Andrea] was arrested on charges of assault and battery last year. She lives in a county that posts all criminal charges on the Internet. The Web page includes pictures and detailed profiles of everyone arrested.

Our intuitions about privacy in each of these situations reflect our answers to questions such as: How much privacy does the individual have in this situation? Does David have more privacy than Andrea? And so on. We can also ask how much privacy the individual should be granted in each situation.

One way to think about these vignettes is to imagine being asked a survey question about each vignette or even about yourself: How much privacy [does “Name”/do you] have? (a) unlimited, (b) a lot, (c) moderate, (d) some, (e) none? The imagined survey context helps to make the examples concrete and clarifies how they are to be read. Although such vignettes are often used for survey research, defining privacy from the bottom up does not involve administering a survey or necessarily asking these questions of others.
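The survey framing above can be made concrete in a short, hypothetical sketch (the names `SCALE` and `recode_self` and the simple recoding rule are illustrative, not taken from the text): responses are treated as an ordinal scale, and a respondent's self-assessment is re-expressed relative to how that same respondent rated the fixed anchoring vignettes, which is the basic idea behind using vignettes to make answers comparable across people who use the response scale differently.

```python
# Illustrative sketch only: a simplified recoding of a self-assessed
# privacy rating relative to a respondent's own vignette ratings.
# The scale labels come from the survey question in the text;
# everything else is an assumption for illustration.

SCALE = ["none", "some", "moderate", "a lot", "unlimited"]  # ordinal, low to high


def to_ordinal(label):
    """Map a response label to its ordinal position on the scale."""
    return SCALE.index(label)


def recode_self(self_label, vignette_labels):
    """Recode a self-assessment relative to vignette ratings.

    vignette_labels are the respondent's ratings of the anchoring
    vignettes, e.g. ordered from the most private scenario (Monali)
    to the least (Andrea). The result counts how many vignettes the
    respondent rates as having at least as much privacy as they
    themselves report, placing the self-assessment on a common,
    vignette-relative scale.
    """
    self_score = to_ordinal(self_label)
    return sum(1 for v in vignette_labels if to_ordinal(v) >= self_score)


# Two respondents give the same raw self-rating ("moderate") but use
# the scale differently, as revealed by their vignette ratings:
resp_a = recode_self("moderate", ["a lot", "moderate", "some"])
resp_b = recode_self("moderate", ["moderate", "some", "none"])
print(resp_a, resp_b)  # prints: 2 1
```

On this recoding, respondent A places themselves at or below two of the three anchors while respondent B places themselves at or below only one, even though both chose "moderate": the vignettes, not the raw labels, carry the comparison.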

For each set of anchoring vignettes (denoting privacy in one specific context), different people will have different views about what thresholds delineate levels of privacy below which violations should be considered undesirable, unethical, illegal, or immoral. Agreement on normative issues like these will always be difficult to achieve. The anchoring vignette-based approach to privacy thus does not resolve all normative issues, but it helps to clearly define the playing field.

Note also that vignettes can be modified to illustrate different scenarios. For example, the vignettes above can be modified by substituting “convicted” for “arrested on charges” and “convictions” for “charges.” Such changes might well cause at least some people to reevaluate their answers.

Intuitions about privacy also depend on the goals, location, type of technology, and data involved, and on the conditions under which personal information is collected and used. Indeed, what constitutes privacy, what information should be private, and which individuals or institutions pose potential threats to that privacy are all questions subject to considerable debate. A related set of questions involves the circumstances under which privacy can be seen to go too far. Under some conditions the failure to discover or reveal personal information can be socially harmful (e.g., concealing exposure to a deadly contagious disease or a history of violent and abusive behavior).



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.