Technical, Business, and Legal Dimensions of Protecting Children from Pornography on the Internet: Proceedings of a Workshop

10
Automated Policy Preference Negotiation

Deirdre Mulligan

I worked for a long time on the Platform for Privacy Preferences (P3P), which gives parents some control over the data collection practices at Web sites visited by their children. There are instances in which children disclose information about themselves that can be used to contact and communicate with them. P3P has no application in the context of limiting children's access to pornography and other content that might be considered inappropriate.

P3P is a project of the World Wide Web Consortium (which also developed the Platform for Internet Content Selection, or PICS) that enables Web sites to express their privacy practices in a standard format. This means that a Web site can make an Extensible Markup Language (XML) statement about how it uses personal data.

The basic functionality of P3P is as follows. Say that a Web site collects information such as name, address, and credit card number for the purchase of goods, or it uses clickstream data (i.e., the data left behind when surfing a Web page) to target or tailor information on the Web site to your interests. On the client side, either through a browser or some plug-in to a browser, P3P allows individuals to set parameters for the types of Web sites their kids can visit based on the site's data collection practices. For example, a child might try to enter a Web site that collects data from children and sells it, which is generally illegal in this country without parental consent under the Children's Online Privacy Protection Act (COPPA).[1] The browser could be set up either to limit access to Web

[1] COPPA, which regulates the collection of personal information from children under age 13, was signed into law in 1998 and went into effect in 2000.
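The parameter-setting idea described here can be sketched in a few lines of code. This is only an illustration of the general mechanism, not P3P's actual XML vocabulary or any real user agent's API; the policy fields, preference names, and the `site_allowed` function below are all invented for the example.

```python
# Sketch of a P3P-style check: a user agent compares a site's
# declared data practices against parent-configured preferences
# and decides whether to allow access. All names are illustrative.

# What the site declares it does with data (simplified stand-in
# for a P3P policy statement).
SITE_POLICY = {
    "data_collected": {"name", "email", "clickstream"},
    "purposes": {"contact", "marketing"},
    "shared_with_third_parties": True,
}

# What the parent has configured in the browser or plug-in.
PARENT_PREFS = {
    "blocked_data": {"name", "email"},      # never disclose these
    "blocked_purposes": {"marketing"},
    "allow_third_party_sharing": False,
}

def site_allowed(policy, prefs):
    """Return True only if the site's declared practices fit the preferences."""
    if policy["data_collected"] & prefs["blocked_data"]:
        return False
    if policy["purposes"] & prefs["blocked_purposes"]:
        return False
    if policy["shared_with_third_parties"] and not prefs["allow_third_party_sharing"]:
        return False
    return True

print(site_allowed(SITE_POLICY, PARENT_PREFS))  # False: practices conflict with the preferences
```

In a real deployment the policy side would be parsed from the site's published XML statement rather than hard-coded, but the decision logic, matching declared practices against locally held preferences, is the same shape.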
delivery[2] will depend on the implementations. There are a wide variety of implementation styles, and it is unclear how the products will work. Part of it will be driven by consumer demand.

Survey after survey has documented enormous public concern with privacy and a real anxiety about disclosing personal information, because people feel that Web sites are not forthright about what they do with data. A tool that allows people to gain better knowledge of how their data are used may well enable more personalization. Some people will choose personalization because they are comfortable having certain types of data collected; if data collection and the personalization it enables are done with the individual's consent, it will advance privacy protection.

If a Web site offers the news or sports scores, you might be comfortable telling it which state or county you live in, or your zip code, because the site provides a service that you think is worthwhile. But today you might be anxious about what the site does with the data. If there were a technical platform that let you know ahead of time that only things you were comfortable with would be done with your data, it might well facilitate personalization. But it would be personalization based on your privacy concerns and your consent to the data collection.

With regard to the truth of a site's privacy statements, the question of bad actors is one that we have in every context. There is nothing about P3P that provides enforcement, but it does provide some transparency, which could facilitate enforcement. In this country, people who say something in commerce that is designed to inform consumers run the risk of an enforcement action by the FTC or a state attorney general if they fail to do what they have said.
In other countries there are similar laws prohibiting deceptive trade practices, and, in addition, many countries have laws that require businesses to adhere to a set of fair information practices designed to protect privacy.

Collaborative filtering, which automates "word-of-mouth" recommendations by developing responses to search queries based on the likes and dislikes of others who share interests, buying habits, or other traits with the searcher, is independent of P3P. I have not seen a discussion of its applicability in the privacy area.

[2] Bob Schloss gave the hypothetical example of Sports Illustrated warning that some of its content shows people in skimpy bathing suits, and a user agent (or client) saying it does not want to see sites like this. Sports Illustrated could offer to present a subset of its content honoring the request. But why would the magazine go through such complex programming if only 10 people had user agents that could negotiate? To what extent would there be negotiations in which a site would either collect data or provide a subset of its functionality without collecting data?
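The content negotiation hypothesized in footnote 2 can be sketched as a simple exchange: the client declares what it will not accept, and the server responds with the richest offering that honors the request, or nothing. No such protocol was standardized; the offer structure, labels, and `negotiate` function below are invented purely to illustrate the idea.

```python
# Hypothetical sketch of user-agent/site negotiation: the site
# advertises several versions of its content, each tagged with
# labels; the client rejects any offer carrying an excluded label.

def negotiate(server_offers, client_exclusions):
    """Pick the richest offer whose labels avoid the client's exclusions."""
    acceptable = [o for o in server_offers
                  if not (o["labels"] & client_exclusions)]
    if not acceptable:
        return None  # no version of the site satisfies the client
    return max(acceptable, key=lambda o: o["richness"])

# The Sports Illustrated example from the footnote, loosely modeled.
offers = [
    {"name": "full-site",        "labels": {"swimsuit-photos"}, "richness": 10},
    {"name": "text-only-subset", "labels": set(),               "richness": 4},
]

choice = negotiate(offers, client_exclusions={"swimsuit-photos"})
print(choice["name"])  # text-only-subset: the reduced offering honoring the request
```

The footnote's economic objection shows up here too: the site must author and maintain the subset offering in advance, which is only worthwhile if enough user agents actually negotiate.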