12
Technology-Based Tools for Users

This chapter discusses tools for the end user, here the party responsible for making decisions on behalf of the child or youth in question. Thus, the "end user" may be a parent in the case of a home or family, a maker of school policy (e.g., a school principal or a school district), a maker of policy for a public library (or library system), or some other similar individual or body. The focus on tools for end users is important because such tools are intended to empower end users by providing a wide range of options for the children in their care, more or less regardless of what other Internet stakeholders do or do not do. (The only exception concerns instant help, in which the user is the child seeking help.) Table 12.1 provides a preview of this chapter.

12.1 FILTERING AND CONTENT-LIMITED ACCESS

Filters are at the center of the debate over protecting children and youth from inappropriate sexually explicit material on the Internet. A good filter allows a specific filtering policy to be implemented with accuracy, has enough flexibility, and can be implemented with a minimum of undesirable side effects. Box 12.1 describes the dimensions of choice that GetNetWise identifies for filters.

12.1.1 What Is Filtering and Content-Limited Access?

Today, Internet access is largely unrestricted. That is, a user who does not take some explicit action to limit the content to which he or she is
YOUTH, PORNOGRAPHY, AND THE INTERNET

TABLE 12.1 Technology-Based Tools for the End User

Filter
· Function: Block "inappropriate" access to prespecified content; typically blocks specific Web pages; may also block generic access to instant messages, e-mail, and chat rooms.
· One illustrative advantage: Can be configured to deny access to substantial amounts of adult-oriented sexually explicit material from commercial Web sites.
· One illustrative disadvantage: In typical (default) configuration, generally denies access to substantial amounts of Web material that is not adult-oriented and sexually explicit.
· Voluntary versus involuntary exposure: Protects against both deliberate and inadvertent exposure for sites that are explicitly blocked; can be circumvented under some circumstances.

Content-limited access
· Function: Allow access only to content and/or services previously determined to be appropriate.
· One illustrative advantage: Provides high confidence that all accessible material conforms to the acceptability standards of the access provider.
· One illustrative disadvantage: May be excessively limiting for those with broader information needs than those served by the access provider.
· Voluntary versus involuntary exposure: Very low possibility of deliberate or inadvertent exposure, given that all material is explicitly vetted.

Labeling of content
· Function: Enable users to make informed decisions about content prior to actual access.
· One illustrative advantage: Separates content characterization (e.g., sexually explicit or not) from decisions to block; multiple content raters can be used.
· One illustrative disadvantage: Effectiveness depends on broad acceptance of a common labeling framework.
· Voluntary versus involuntary exposure: Likelihood of exposure depends on accuracy of labels given by labeling party.

Monitoring with individual identification
· Function: Examine a child's actions by an adult supervisor in real time or after the fact.
· One illustrative advantage: Rarely prevents child from reaching appropriate material that might have been mistakenly flagged as inappropriate.
· One illustrative disadvantage: Potential loss of privacy zone for child.
· Voluntary versus involuntary exposure: Warnings can help to deter deliberate exposure; ineffective against inadvertent exposure.

Monitoring without individual identification
· Function: Watch the collective actions of a group (e.g., a school) without identifying individuals.
· One illustrative advantage: Can provide useful information about whether or not acceptable use policies are being followed.
· One illustrative disadvantage: Does not enable individual accountability for irresponsible actions.
· Voluntary versus involuntary exposure: Warnings can help to deter deliberate exposure; less effective against inadvertent exposure.

Spam-controlling tools
· Function: Inhibit unsolicited e-mail containing sexually explicit material (or links to such material) from entering child's mailbox.
· One illustrative advantage: Can reduce the volume of inappropriate e-mails significantly.
· One illustrative disadvantage: Reduced tolerance for false positives among users concerned about losing genuinely personal e-mails incorrectly identified as spam.
· Voluntary versus involuntary exposure: Mostly relevant to inadvertent exposure (i.e., unsought commercial e-mail containing sexually explicit material).

Instant help
· Function: Provide immediate help when needed from an adult.
· One illustrative advantage: Provides guidance for child when it is likely to be most effective, i.e., at time of need.
· One illustrative disadvantage: Requires responsive infrastructure of helpers.
· Voluntary versus involuntary exposure: Mostly relevant to inadvertent exposure.

NOTE: The "end user" is generally the adult supervisor who makes decisions on behalf of a child. (This is true in all cases except for instant help, in which the user is the child seeking help.)
exposed has access to any content that the Internet provides through Web pages, e-mail, chat rooms, and the like. This report uses the term "filter" to refer to a system or service that limits in some way the content to which users may be exposed. The vast majority of filters block access to content on specific Web sites (though these sites may be specified as a class). Other filters also block access on the basis of keywords appearing either in a user's query to a search engine or contained in the about-to-be-displayed Web site.1 Some filters provide the capability to block more broadly, so that an individual may be denied access to other common Internet services, such as interactive services (e.g., e-mail, chat rooms), Usenet newsgroups, file downloading, peer-to-peer connections, or even e-commerce with credit card usage.

Users who wish to use a filter have a number of technical options:

· Client-side filters. Filters can be installed on the devices (today, desktop and laptop personal computers) that serve as the Internet access point for the end user. Client-side systems are installed as local software, in the same way that any other software is locally installed, except that standard uninstallation (which would disable the filter) can be done only by someone with the appropriate password. The party with the appropriate password is also generally responsible for configuring the profile of the system for those they are seeking to protect from inappropriate materials. A common personal use is a client-side filter installed at home by a parent wishing to protect children from inappropriate material. Client-side filtering is also feasible in an environment in which only some access points in a local area network must be filtered, for example, in a library attempting to segregate "children's areas" from areas for all patrons.

· Content-limited Internet service providers.
As a feature of their offerings, a number of ISPs provide Internet access only to a certain subset of Internet content. Content-limited ISPs are most likely to be used by organizations and families in which the information needs of the children involved are fairly predictable. Today, such dedicated ISPs are a niche

1The actual content of a list of such keywords is usually held as proprietary information by the vendor of the filter. However, such lists include a variety of "four-letter words" associated with sex, reproduction, and excretory functions, words such as "sex," "naked," and so on. Other words that might be singled out for inclusion include "bomb," "bondage," "fetish," "spunk," "voyeurism," "babe," "erotica," "gay rights," "Nazi," "pot," "white power," "girls," and "hard-core pornography." (These examples are taken from Lisa Guernsey, 1999, "Sticks and Stones Can Hurt, But Bad Words Pay," New York Times, April 9.) Also, a more sophisticated form of filtering is based on the analysis of word combinations and phrases, proximity of certain keywords to certain other keywords, the presence of various URLs, and so on. In some cases, text-based analysis may also be combined with the analysis of images on the Web page in question. As a rule, tools based on this more sophisticated filtering are not broadly marketed today.
market and typically have subscriber bases in the thousands to tens of thousands. All subscribers, which are often families and less often institutions, are subject to the same restrictions. Some content-limited ISPs, intended for use by children, make available only a very narrow range of content that has been explicitly vetted for appropriateness and safety. Thus, all Web pages accessible have been viewed and assessed for content that is developmentally appropriate, educational, and entertaining. (This approach is known as "white listing"; all content from sources not on a white list is disallowed, as discussed in Section 2.3.1.) Chat rooms and bulletin boards are monitored for appropriate content, and those violating rules of chatting or message posting are disinvited, forcibly if necessary. E-mail and instant messages (IMs) can be received only from specified parties and/or other users of the system. Other content-limited ISPs, intended for use by both children and adults, allow access to all content that is not explicitly designated by the ISP as inappropriate. Monitoring and limits on e-mail are less strict or non-existent.

Some services allow multiple "login names" or "screen names." A screen name is an online identity, similar to a CB radio "handle." Each online session uses a single screen name, and families can choose not to give the adult or "administrative" screen name password to youth. An online account may have multiple screen names, and a user with appropriate privileges (usually associated with paying for the master account) can create arbitrary screen names at will for himself or someone else on his account as long as those names are not already in use. With each name can be associated unrestricted access or more limited access to online content (which may include both Internet and proprietary content).
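The per-screen-name arrangement just described can be thought of as a simple lookup from screen name to an access profile. The sketch below is purely illustrative: the profile names, category labels, and the `may_visit` function are invented for this example and do not correspond to any actual service's implementation.

```python
# Hypothetical model of per-screen-name access profiles on a family account.
# All names and category labels here are invented for illustration.

PROFILES = {
    "parent_admin": {"web": "unrestricted", "chat": True},
    "kid_age8": {"web": "kids_only", "chat": False},       # white-list style
    "teen_age15": {"web": "mature_blocked", "chat": True},  # black-list style
}

def may_visit(screen_name: str, site_rating: str) -> bool:
    """Decide whether a screen name may visit a site with the given rating.

    site_rating is assumed to be one of: "kids", "general", "mature".
    """
    policy = PROFILES[screen_name]["web"]
    if policy == "unrestricted":
        return True
    if policy == "mature_blocked":          # allow everything except mature themes
        return site_rating in ("kids", "general")
    if policy == "kids_only":               # allow only explicitly vetted content
        return site_rating == "kids"
    return False

print(may_visit("kid_age8", "general"))     # False: only vetted "kids" content allowed
print(may_visit("teen_age15", "general"))   # True: only mature themes are blocked
```

The design point the sketch captures is the one made in the text: the restriction travels with the screen name, not with the machine, so a single household computer can enforce different policies for different family members.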
In the case of America Online (AOL), a parent can identify the age of the child for whom he or she is setting up a screen name. AOL then puts into place default limitations based on the age of the child, which the parent can then adjust if necessary. Such limitations might include, for example, Web access only to age-appropriate content or to everything except explicitly mature themes, receipt of e-mail only without file attachments or embedded pictures, and access only to chat rooms intended for children (or no chat room access at all).

· Server-side filters. Server-side filtering is useful in institutional settings in which users at all access points within the institution's purview

2Note that sources on a white list can be specified in advance or identified as appropriate because a source contains one or several "good words" that may be found on a "good-word" list. For an example of the latter, see Gio Wiederhold, Michel Bilello, Vatsala Sarathy, and XioaLei Qian, "A Security Mediator for Health Care Information," pp. 120-124 in Proceedings of the 1996 American Medical Informatics Association Conference, Washington, D.C., October.
must conform to the access policy defined by the institution. Server-side filtering might be used by a school district that provides Internet service to all schools in the district or a library system that provides Internet service to all libraries in the system. Server-side filters are located on systems other than the client.3 An institution may contract with an ISP to implement its filtering policy, or it may install a filter in the server that manages a local area network (e.g., that of a school district or a library system).4 (Note that there are no fundamental technological impediments to having filtering policies that differentiate between schools within a district, so that a high school might operate under a policy different from that for an elementary school. Such differentiation is simply a matter of cost.)

· Search engine filters. In a special class of server-side filters are those that are today part of major search engines such as Google, AltaVista, and so on. Each of these search engines has the capability of enabling an Internet safety filter (hereafter "filtered search engine"). When activated by the user, these filters do not return links to inappropriate content found in a search, but they also do not block access to specifically named Web sites (so that a user knowing the URL of a Web site containing inappropriate sexually explicit material could access it).5 Other search engines are explicitly designed for use by children. For example, Lycos and Yahoo both offer a special kids-oriented version of their general-purpose search engine that restricts the universe of a search to child-appropriate areas. (This is the white-list approach.)
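The limitation of a filtered search engine can be sketched in a few lines: the filter operates on the result list, not on the retrieval path, so a known URL remains reachable. In the sketch below, the blocklist entries, hostnames, and the `fetch` stand-in are all invented for illustration; no real search engine's mechanism is being described.

```python
# Sketch: a filtered search engine removes blocked links from its results,
# but does not intercept a direct request to a known URL.
# Blocklist entries and hostnames are made up for illustration.

from urllib.parse import urlparse

BLOCKLIST = {"adult-example.test"}

def filtered_results(results):
    """Drop result links whose host appears on the blocklist."""
    return [url for url in results if urlparse(url).hostname not in BLOCKLIST]

def fetch(url):
    """Direct retrieval path: no filtering is applied here at all."""
    return "contents of " + url        # stand-in for an actual HTTP request

hits = ["http://adult-example.test/page", "http://school-example.test/notes"]
print(filtered_results(hits))          # only the school link survives
print(fetch("http://adult-example.test/page"))  # direct access still succeeds
```

This is why the text notes that a user who already knows the URL of an inappropriate site can reach it despite the safe-search setting.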
Filters can be used to block certain incoming inappropriate information (an application that is the most common use of filters), to block access to certain Internet services (e.g., file downloads), or to block selected outgoing information. All technology-enforced methods for blocking access to inappropriate information require a determination that certain pieces of content are inappropriate.6 Content can be deemed inappropriate on the basis of the methods discussed in Section 2.3.1.

Many filter vendors establish lists of "suspect" Web sites (compiled as a list of specific URLs and/or IP addresses) deemed sources of inappropriate content.7 The number of such sites may range from several hundred thousand to 2 million. In addition, many vendors establish lists of keywords (typically hundreds of words) that represent inappropriate content. Far fewer employ image analysis or statistical techniques to analyze text.

In addition, techniques for textual and image analysis can be used to identify and block e-mail containing inappropriate content and for blocking outgoing content as well. For example, the technology that identifies inappropriate content by searching for keywords can also prevent those words (or some set of them) from being used in e-mail messages or in chat rooms. (In this case, the adult supervisor can augment the keyword list to include certain phrases that should not appear, such as specific addresses or phone numbers.)

Filters are perhaps the most widely deployed of all technological tools intended to protect children from exposure to inappropriate material. The majority of schools have deployed filters,8 while around 25 percent of

3The distinction between server-side filtering and content-limited ISPs is not a technical one, because content-limited ISPs use server-side filters. Rather, the point is that server-side filtering provides a degree of institutional customization that is not possible with content-limited ISPs, which tend to offer one-size-fits-all filtering policies.
4The use of server-side filters may degrade performance. In particular, a server-based filter may rely on proxy servers that are unable to take advantage of the caching techniques that are often used by major Internet providers to speed the retrieval of commonly requested pages. Such a filter would be forced to retrieve information from its host server and take whatever performance hit that might entail. In other cases, performance is improved because without irrelevant material taking up space in the cache, retrieval of relevant material is faster.
5In practice, a responsible adult would set the filtering provision to the "on" setting, and save the configuration. Thereafter, other requests on that client to the search engine would encounter the "on" setting. The setting can also be turned off through entering a password known only to the individual who initially set it (a possible problem if that person is the teenager in the household who manages the family's information technology).
6Note that content that is transmitted through certain channels such as attachments to e-mail, videoconferences, instant messages, or peer-to-peer networking (in a Gnutella-like arrangement) is very difficult (arguably impossible) to block selectively, though a filter can block all interaction through these channels. Moreover, to the extent that the content of traffic is determined interactively, neither labeling nor sites are likely to provide a sufficient basis.
The reason is that interactive sources, almost by definition, can support a variety of different types of interaction, the best example of which is an online friend with whom one may exchange sports trivia, conversation about school homework, and inappropriate sexually explicit material. Only real-time content recognition has a chance of filtering such content.
7Note also that the list of blocked sites often includes sites that could help users circumvent the basic filtering. Thus, sites providing information on how to circumvent filters are often included on the list, and a number of filters block sites that allow language translation (Seth Finkelstein and Lee Tien, 2001, "Blacklisting Bytes," white paper submitted to the committee, available online at <http://www.eff.org/Censorship/Censorware/20010306_eff_nrc_paperl.html>) or access to Web archives (Seth Finkelstein, 2002, The Pre-Slipped Slope: Censorware vs. the Wayback Machine Web Archive, available online at <http://sethf.com/anticensorware/general/slip.php>).
8According to the National Center for Education Statistics, nearly three-fourths of all schools use blocking or filtering software. See A. Cattagni and E. Farris. 2001. Internet Access in U.S. Public Schools and Classrooms: 1994-2000. NCES 2001-071. Office of Educational Research and Improvement, U.S. Department of Education, Washington, D.C. Available online at <http://www.nces.ed.gov/pubs2001/internetaccess/>.
libraries filter at least some workstations.9 Through AOL's parental controls (Box 12.2), a substantial number of Internet-using children enjoy the benefits and endure the costs of filtering. However, as a percentage of all children using the Internet, the fraction whose Internet access is filtered apart from school usage is small.10

It is noteworthy that filters are increasingly common in corporate and business settings and thus affect the Internet use of adults.11 Many companies, driven primarily by concerns about productivity and time wasted on non-business Internet activities and about the possible creation of hostile work environments and the consequent liability, use filters to prevent inappropriate use of company IT facilities.12

12.1.2 How Well Does Filtering Work?

Denying access to inappropriate material through technological means, filters are intended to protect against both inadvertent and deliberate access. However, as discussed in Section 2.3.1, all filters are subject to overblocking (false positives, in which filters block some appropriate material

9By contrast, around 57 percent of public libraries do not filter Internet access on any workstation, while about 21 percent filter access on some workstations. About 21 percent filter all workstations. See Norman Oder. 2002. "The New Wariness," The Library Journal, January 15. Available online at <http://libraryjournal.reviewsnews.com/index.asp?layout=article&articleid=CA188739>.
10A survey conducted by Family PC magazine in August 2001 found that of 600 families surveyed, 26 percent used parental controls of some kind. About 7 percent of those using parental controls (about 1.8 percent of the total) used off-the-shelf store-bought filtering packages. The rest used filtering offered by an Internet service provider.
(This study is not available in print, because it was scheduled for publication in October 2001, and Ziff Davis, the publisher of Family PC, terminated the magazine before that issue was printed.)
11For example, a survey taken by the American Management Association in 2001 found that 38 percent of the firms responding do use blocking software to prevent Internet connections to unauthorized or inappropriate sites. Seventy-eight percent of the responding firms restricted access to "adult" sites with explicit sexual content, though it is not clear how the remaining 40 percent are enforcing such restrictions. (The survey suggests that they are doing it by actively monitoring Internet use.) See American Management Association. 2001. 2001 AMA Survey, Workplace Monitoring and Surveillance: Policies and Practices. Available online at <http://www.amanet.org/research/pdfs/emsfu_short.pdf>.
12Potential overlap between the business market and the school and library filtering market raises the following operational concern: a blocked category may be defined by a vendor so that it is appropriate in a business environment, but that definition may not be appropriate in a school or library context. For example, information about sexually transmitted diseases, safe sex practices, and pregnancy may not be necessary in most business environments (and hence an employer may have a legitimate business reason for blocking such information), but many would argue that older students using school facilities should not be blocked from receiving such information.
from the user) and underblocking (false negatives, in which filters pass some inappropriate material to the user). While the issue of underblocking and overblocking should not, in and of itself, rule out filters as a useful tool, the extent of underblocking and overblocking is a significant factor in understanding and deciding about the use of filters.13

There is no agreed-upon methodology for measuring a filter's effectiveness, as might be indicated by an overblocking rate and an underblocking rate (discussed in Section 2.3.1).14 Filter vendors sometimes provide estimates of overblock and underblock rates, but without knowing the methodology underlying these estimates, the cautious user must be concerned that the methodology is selected to minimize these rates. (The discussion in Box 2.7 illustrates some of the problems in estimating these rates. Note further that the lists of blocked Web pages change constantly, with both additions and subtractions made regularly.)

Underblocking results from several factors:

· New material appears on the Internet constantly, and the contents of given Web pages sometimes change. When content changes, the judging parties must revisit the sources responsible for the content they provide frequently enough to ensure that inappropriate information does not suddenly appear on a previously trusted source or that the inappropriate material remains on the Web pages in question. Technology is available that can indicate if a page has changed (thus flagging it for human assessment), but not to tell if it continues to be developmentally and educationally appropriate. Vendors of filtering systems generally provide updates from time to time, but there is inevitably a lag between the time inappropriate material first appears and the time that item is entered into the list of blocked sites. (Content-based filtering systems are not subject to this particular problem.)
· The algorithms (i.e., the computational techniques) used to identify inappropriate material are imperfect. For example, the emergence of

13Note also that legal challenges brought against the mandated use of filters in institutional settings have relied significantly on the existence of underblocking and overblocking as inherent flaws in the technology that make filters unsuitable for such use.
14For "bake-offs" comparing Internet filters, see Christopher D. Hunter, 2000, "Internet Filter Effectiveness: Testing Over- and Underinclusive Blocking Decisions of Four Popular Filters," Social Science Computer Review 18(2, Summer), available online at <http://www.copacommission.org/papers/filter_effect.pdf>; Karen J. Bannan, 2001, "Clean It Up," PC Magazine, September 25, available online at <http://www.pcmag.com/article/0,2997,a%253D12392,00.asp>; and "Digital Chaperones for Kids," Consumer Reports, March 2001. For a critique of the Consumer Reports analysis, see David Burt, 2001, "Filtering Advocate Responds to Consumer Reports Article," February 14, available online at <http://www.politechbot.com/p-01734.html>.
new slang for sexual acts will thwart filters based on keyword recognition until the new slang is incorporated into the filtering criteria. Another possibility is that the use of clothed people having sex may thwart image-recognition algorithms based on the assumption of searching for naked people.

· Sites with adult-oriented content are often given names that are close in spelling to the names of legitimate sites. For example, <http://www.whitehouse.com> is often reached by people intending to reach <http://www.whitehouse.gov>. While most filtering programs have fixed this specific problem, close-but-not-identical names can pop up for a large number of legitimate general-purpose sites.

Overblocking arises from three factors:

· Content may be less than clear-cut, especially in the context of machine-assisted understanding. As noted in Chapter 2, both text and images can be ambiguous.15 Moreover, the precise definition of what should be blocked is inevitably subject to the vagaries of individual judgments. For reasons discussed below, filter vendors generally have more incentives to block ambiguous information than to allow it to pass, a fact that leads to overblocking.

· Information on the Internet is updated constantly. Thus, a site that may have been blocked for good reason in the past may post new information that does not meet the criteria for blocking. As in the case of underblocking, until records of the site are updated in the filter, that filter will continue to mark such sites as inappropriate even if the information contained therein is perfectly innocuous.

15Two particularly egregious examples include Beaver College and online biographies of individuals who have graduated magna cum laude. Beaver College in Pennsylvania recently changed its name to Arcadia College because its name was being filtered out ("beaver" has crude sexual connotations in American English slang). Beaver College spokesman Bill Avington was quoted in Wired as saying, "We have a lot of evidence that people aren't able to get our information in high schools because of Web filters in the libraries" that block out sites with "Beaver" along with other presumed smut words. He continued, "With so many people using the Net as the initial means to look at colleges, that's a serious disadvantage." In addition, he claimed that filters sometimes block e-mail from Beaver College staffers to prospective students. (See Craig Bicknell, 2000, "Beaver College Not a Filter Fave," Wired, March 22, available online at <http://www.wired.com/news/politics/0,1283,35091,00.html>; and CNN story, 2000, "Beaver College Changes Oft-derided Name to Arcadia University," November 20, available online at <http://www.cnn.com/2000/US/11/20/embarrassingbeaver.ap/>.) The "magna cum laude" problem was demonstrated when filtering software blocked access to all biographies of COPA Commissioners who had graduated magna cum laude (see <http://www.cdt.org/speech/filtering/001002analysis.shtml>).
· A site (or a given Web page) may contain both appropriate and inappropriate material. If the filter cannot separate appropriate from inappropriate, it will usually block (overblock) the entire site or page.

The above three factors are basic to the fundamental imperfection of the filtering process. A fourth factor that can lead to overblocking results from the ways in which some filtering systems are implemented. If a filter blocks sites on the basis of the IP addresses of adult-oriented, sexually explicit sites, and those sites are hosted on a server that makes use of IP-based virtual hosting (described in Chapter 2), other non-adult sites hosted on that server (and sharing those IP addresses) will be blocked.16

Note an important distinction between overblocking and an overly broad scope of blocking (i.e., an overly broad blocking policy). Overblocking is inadvertent and results from the inability of the automated systems for blocking to perfectly track human decision making. The model human decision maker, examining overblocked material, would conclude that the material should in fact have been free to pass. An example would be a search for "beaver dams" that results in pages being blocked because the word "beaver" is present on the page.

An overly broad policy is more subjective, and results from a disagreement between the end user and the human decision maker about what information the end user should be allowed to receive. From the

16The magnitude of overblocking due to IP-based virtual hosting is unclear. One estimate (Art Wolinsky, 2001, "FilterGate, or Knowing What We're Walling In or Walling Out," MultiMedia Schools, May/June, available online from <http://www.infotoday.com/mmschools/may01/wolinsky.htm>) suggests that such overblocking far outstrips overblocking for other causes.
However, a number of factors should be considered in assessing the potential impact of IP-based virtual hosting:
· Most large sites are not hosted on virtual hosting services. Furthermore, large sites tend to be more heavily promoted and are often more likely to appear in a prominent position in a search engine's result list. Thus, large sites, which typically account for the Web requests of most users, are much less likely to be improperly blocked than smaller sites.
· Many virtual hosting services ban adult-oriented, sexually explicit material and other material that they regard as offensive as well, and they enforce their acceptable use policies vigorously. Thus, the amount of sexually explicit material hosted overall by such services is likely to be small. (But, if such a service does host even one site containing inappropriate sexually explicit material and that fact is picked up by a filtering vendor that uses IP-based filtering, it will exclude all of the acceptable sites on that host. All of the acceptable sites that are improperly blocked will stay blocked until the hosting service eliminates the inappropriate site and the fact of elimination is communicated to the vendor.)
· Different implementations of filtering (e.g., use of name-based filtering) can lead to the same intended result without the overblocking caused by IP-based filtering. As a rule, the primary reason for wishing to use IP-based filtering is technical: when a hosting service is used primarily for adult-oriented, sexually explicit material, IP-based filtering reduces the amount of storage and processing needed by the filter.
perspective of the end user, a certain piece of material is blocked inappropriately. However, upon examination of that blocked material, the human decision maker concludes that the blocking decision was proper. For example, a student may wish to search for information on marijuana. But Web sites containing the word marijuana may be blocked because of a policy decision to block information about drugs.17

The effectiveness of a filter also depends on whether its use is enforced at all sites available to a child. In a specific venue, filters will block some material that some parties deem inappropriate, and there is a reasonable argument to be had over whether the blocking that occurs is worth the cost of overblocking. But it is impossible for a filter deployed in a school to block material sought in a cyber-cafe or at home, and filtering limited to schools and libraries will not prevent the access of children to inappropriate sexually explicit material if they are determined to search for it and have other venues of access. The most common unfiltered venues are home Internet access or Internet access provided at a friend's home. (Filtering at home is not the norm,18 even though a significant fraction of U.S. youth do have Internet access at home,19 a point well represented by the adolescents to whom the committee spoke during its site visits.)

Filters that are not based on real-time content-based identification of inappropriate content can be circumvented by users in a number of ways,20 both direct and indirect:

17The distinction between overblocking and an overly broad scope of blocking is further complicated by the fact that from time to time, a given site can be used for multiple purposes. Most filters include adult-oriented Web sites in their "to be blocked" categories.
However, a high school student undertaking, for example, a study of the economics of the adult online industry might have an entirely legitimate purpose for seeking access to such sites. More generally, any student wanting to study a controversial issue and needing to consult sources for different sides of an argument may well find that advocates of one point of view or another are blocked because they are regarded as "inappropriate," where, in practice, "inappropriate" is likely to mean "controversial."

18See footnote 10.

19According to Grunwald Associates, 17.7 million children aged 2 to 17 had Internet access from home in the last quarter of 1999. (The Web site <http://cyberatlas.internet.com/big_picture/demographics/article/0,,5901_390941,00.html> provides a summary of the Grunwald study. The full study is available online at <http://www.grunwald.com/survey/index.htm>.) The U.S. Census indicates about 65.7 million children in the United States in this age bracket, for a percentage of about 27 percent.

20Many filtering products, especially those on the client side, are easily breakable by knowledgeable users. See Michael J. Miller, 2001, "When Does Web Filtering Make Sense?," PC Magazine, September 25, available online at <http://www.pcmag.com/article/0,2997,s%253D1499%2526a%253D12632,00.asp>.
· It may be possible to defeat the filter itself. When this occurs, a specific Web page that the filter should block is made accessible to the user. Defeating the filter may sometimes be accomplished by:

- Uninstalling the filter;
- Obtaining the privileges needed to disable the filter (while such privileges are intended for the parental supervisor and are usually password-enabled, the ability of youth to obtain privileges is not uncommon in households in which the resident teenager serves as the de facto system administrator because of superior technical knowledge);
- Accessing the Web page indirectly through a proxy server,21 a translation service, an anonymizing service, or some other route;
- Finding a click route to the page other than the one that was directly blocked; and/or
- Manipulating the reload/refresh and back/forward keys.

Note that defeating a filter can be more difficult when the filter is server-based, because the circumventer does not have direct access to the system on which the filter resides. Further, note that because a child-oriented content-limited ISP would most likely be chosen by families interested in filtering for fairly young children (say, 10 and younger), the likelihood that the ISP's restrictions could be circumvented is substantially lower than it would be if users included older youth.

In addition, inappropriate material (sexually explicit and otherwise) can flow to a child through routes other than Web sites: peer-to-peer file transfers such as those available through Gnutella, e-mail attachments, and so on. While some filters can be set to block the use of such routes, such blockage is indiscriminate and insensitive to the content carried on these routes.

· The user can obtain information that is generically similar to the information on the blocked Web page.
As a general rule, information not associated with a specific source resides at many locations on the World Wide Web, and the likelihood of all of those locations being blocked is low. These comments apply particularly to sexually explicit material, especially material containing images. An individual seeking explicit images for the purpose of sexual arousal is not particularly sensitive to which of hundreds or thousands of images on as many Web pages can be retrieved.

21A proxy server is a server that happens to be accessible from the client machine. The use of a proxy server, which can channel all requests "around" a server-side filter, can enable circumvention. Many local area networks, however, are configured in such a way as to prevent the use of proxy servers.
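The indirect-access routes described above (proxy servers, translation services, anonymizers) work because a URL-blocklist filter inspects only the host the user appears to request. A minimal sketch, with invented host names and a hypothetical translation/proxy service:

```python
# Sketch: how a URL-blocklist filter can be sidestepped by indirection.
# The host names and the proxy service are invented for illustration.
from urllib.parse import urlparse, quote

BLOCKED_HOSTS = {"blocked-site.example"}

def filter_allows(url: str) -> bool:
    """A simple filter that inspects only the host of the requested URL."""
    return urlparse(url).hostname not in BLOCKED_HOSTS

direct = "http://blocked-site.example/page.html"
# The same page requested via a (hypothetical) translation/proxy service,
# with the real target tucked into a query parameter:
indirect = "http://translate.example/fetch?u=" + quote(direct, safe="")

print(filter_allows(direct))    # False: the direct request is blocked
print(filter_allows(indirect))  # True: the filter sees only the proxy's host
```

This is why, as the text notes, server-side filters on networks that also disallow outbound proxy connections are harder to defeat: the indirection point itself must be reachable for the trick to work.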
It is also true that the content provider could provide ways of circumventing filters. For example, misspelled sexual words (e.g., "pormography," "dicck," "Orgy") may be used in a site's metatags to circumvent filters that search for keywords. As a general rule, though, commercial vendors of sexually explicit material argue that they are not economically motivated to expend a lot of effort to get through these filters, because children are unable to pay for such material. Those providing other types of content without commercial intent may be more likely to attempt to circumvent filters.

Many of these methods of circumvention do not apply to filters that are based on real-time content-based identification of inappropriate content. However, filters that are based on real-time content-based identification are not nearly as commonly available as filters based on lists of inappropriate sites and keywords. Furthermore, the technology of content-based identification is relatively sophisticated compared to that required for developing lists of sites and keywords, and hence is more difficult to implement properly.

The effectiveness of label-based filters depends on the ubiquity of labels for Internet content and the willingness of the user to decide which labels indicate content that should be blocked. Label-based filters, such as those that incorporate PICS-compliant labels, are built into the major Web browsers, Internet Explorer (IE) and Netscape. However, PICS-compliant labels are not in wide use as of this writing (May 2002; but see Section 12.1.5). Both IE and Netscape provide the user with the option of allowing or blocking unlabeled material, with the consequence that users of these browsers can either have access only to a very small segment of the Web (if unlabeled material is blocked) or enjoy minimal protection from inappropriate material (if unlabeled material is allowed).
For this reason, label-based filters today do not work particularly well in reducing exposure to inappropriate material unless one is willing to tolerate a very high rate of overblocking. Whether they will work more effectively in the future depends on the extent to which Internet content will be labeled.

While filters are designed to reduce children's access to inappropriate material on the Internet, there are some interesting psychological and social phenomena related to their use. In most of the schools and libraries that the committee visited, teachers, librarians, and administrators told the committee that filters played a very small role in protecting students and library users from inappropriate material, largely because most of these students and library users had unfiltered Internet access somewhere else (usually at home). (Of course, for the significant fraction of students without non-school access, such comments did not apply.22) Nevertheless, the school or library filter served a useful political purpose in forestalling complaints from the community about "public facilities being used for shameful purposes."23 In virtually every school the committee visited, avoiding controversy and/or liability for exposing children to inappropriate sexually explicit material was the primary reason offered for the installation of filters.24

In a public library setting, filters have also been used to prevent the display of material that would be regarded as offensive to other patrons walking by. For example, one technique used to shock other patrons is to display an adult-oriented Web site on a public Internet terminal and to "hide" it behind the terminal's screen saver (which places some innocuous image on the screen on top of whatever is on the screen). When an unsuspecting user clears the screen saver image, he or she is suddenly surprised by a sexually explicit image.25

Teachers and librarians can derive substantial benefit from filters. For example, most schools and libraries have acceptable use policies (AUPs, as discussed in Chapter 10) that forbid use of school or library computer resources for certain purposes, such as viewing sexually explicit sites. In the absence of a filter, a teacher or librarian must confront the user and inform him or her that such use violates the AUP. For many, such confrontations can be unpleasant and can provoke anxiety. To the

22U.S. public schools are increasingly providing Internet access to students outside regular school hours. For example, 80 percent of public secondary schools provided such a service in 2000. In addition, schools with high minority enrollments provided Internet availability outside of regular school hours more frequently than schools with lower minority enrollments (61 percent versus 46 percent), a figure consistent with the notion that minority students may rely on schools to provide access more than do non-minority students. See A. Cattagni and E. Farris. 2001. Internet Access in U.S. Public Schools and Classrooms: 1994-2000. NCES 2001-071.
Office of Educational Research and Improvement, U.S. Department of Education, Washington, D.C. Available online at <http://www.nces.ed.gov/pubs2001/internetaccess/>.

23Indeed, in one community, the public library system provided filters for 10 to 20 percent of its Internet access points but made no special attempt to guide children toward these filtered workstations. Nevertheless, the presence of these filters on 10 to 20 percent of its workstations was sufficient to allow it to assert to the community that "the library provides filtered access," an assertion that seems to have met the concerns of local government.

24In the site visits of the committee, committee members explicitly avoided leading questions regarding the motivation for use. So, when teachers said, "Our school has filters" (which was true in all schools visited), committee members asked, "Why do you have them?" "What is the benefit of having filters?" It is in this context that teachers said "to reduce exposure to liability." For the most part, the committee believes that given the overall context of all of the comments received in this manner (e.g., the accessibility of the Internet in unfiltered non-school venues for a large number of students), the avoidance of liability was indeed a primary or at least a very important reason for having filters in schools. Nevertheless, the committee recognizes the possibility that responders felt the protection benefits were so obvious as not to need mentioning.

25Of course, a filter is not the only way to solve this particular problem: it would be almost as effective to install software that would clear the browser cache and return to the library's home page after a short period of inactivity.
extent that a filter reduces the possibility of a student or library patron viewing such sites, it also reduces the frequency of such confrontations. In addition, given community pressures for teachers and librarians to supervise or monitor the use of Internet resources by students and library users, filters reduce the burden on teachers and librarians to police usage and free time for other, more productive activities. Finally, many teachers and librarians are themselves uncomfortable in viewing certain types of inappropriate material,26 and in the committee's informal discussions in its site visits, this was especially true for many sexually explicit images.

Even apart from the claimed benefits of preventing exposure to inappropriate material, filters can offer children other benefits. In the school environment, teachers reported that filters helped them to keep students "on task" while doing school-related Internet work by reducing the distractions that might otherwise be available to them (the students); Box 12.3 provides some data on the extent to which filters may keep students on task. A number of younger students with whom the committee spoke during various site visits thought that the parental use of filters (generally AOL's parental controls) was a positive indication of their parents' concern, independently of whether they felt the filters were effective. (According to the Kaiser Family Foundation, about two-thirds of teenagers and young adults support the Children's Internet Protection Act when provided with a description of it. This view does not vary among those who go online a lot or who have been denied access to Web sites because of filtering.27)

Because they confine the user only to material explicitly considered appropriate, child-oriented content-limited ISPs provide the greatest degree of protection for children.
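A content-limited service is, in effect, a deny-by-default ("whitelist") policy, the mirror image of a blocklist filter. A minimal sketch, with invented host names:

```python
# Sketch: content-limited ("whitelist") access denies by default.
# The approved list and host names are invented for illustration.
APPROVED = {
    "kids-encyclopedia.example",
    "homework-help.example",
}

def allowed(host: str) -> bool:
    """Deny-by-default: only sites explicitly reviewed and approved are
    reachable. Underblocking is minimized at the cost of blocking
    everything not yet reviewed (overblocking)."""
    return host in APPROVED

print(allowed("kids-encyclopedia.example"))  # True: on the approved list
print(allowed("new-science-site.example"))   # False until a human reviews it
```

The asymmetry shown here is the central design choice: a blocklist must enumerate the bad, while a whitelist must enumerate the good, and the latter can never underblock but will always lag behind new content.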
By design, their approach seeks to minimize underblocking at the expense of overblocking: all questionable exposure is blocked or at least monitored. For example, they evaluate for educational or developmental benefit every Web page that is accessible to a child. Under these circumstances, the likelihood of exposure to inappropriate content is very low, especially with respect to sexual imagery. Interactive dialog in chat rooms and on bulletin boards is monitored, so that the first posting or sending of inappropriate messages can be censured. (Such censure also provides observers with a lesson in the consequences of such behavior.) The identities of e-mail and IM senders are not monitored, but because they are restricted for the most part to users of the service, the universe of those who might engage a child in e-mail or IM dialog is much smaller than on the entire Internet. Perhaps of equal or greater benefit, at least from the perspective of some adults, is that the content accessible to kids has definite educational or developmental value, rather than being simply not inappropriate.

26In a preliminary finding issued in May 2001, the Equal Employment Opportunity Commission found that pornography downloaded on library computers was intended to create a sexually hostile work environment for a group of Minneapolis librarians. See Michael Bartlett, 2001. "No Internet Filtering Is Sex Harassment for Librarians, EEOC," Newsbytes, May 25. Available online at <http://www.newsbytes.com/news/01/166171.html>.

27Victoria Rideout. 2001. Generation Rx.com: How Young People Use the Internet for Health Information. The Henry J. Kaiser Family Foundation, Menlo Park, Calif. Available online at <http://www.kff.org/content/2001/20011211a/GenerationRx.pdf>.

12.1.3 Who Decides What Is Inappropriate?

Filtering is based on the premise that a party or parties other than the child himself or herself decides what content is inappropriate. In general, the first pass at determining potentially inappropriate content is made by the vendor of the filter or the content-limited ISP. For those who choose to accept this determination without change (as is the case with subscribers to content-limited ISPs or those who do not wish to customize further), this initial determination stands.

For example, a school that installs a filter without additional customization accepts the determination of the vendor about what is or is not appropriate. Even if it does undertake customization, it uses the vendor's determination as its point of departure, and detailed editorial control on a site-by-site basis for all sites in the vendor's database is not possible in practice.

To accommodate those who wish to customize the characterization of inappropriate material for their own individual or institutional needs, filter vendors usually offer two options (which may be combined or implemented separately):

· The characterization of inappropriate material can be divided into content categories with labels such as pornography, sex education, hate speech, violence, weapons, cults, and so on. (For example, a list of inappropriate URLs or keywords flagging inappropriate content might be grouped into such categories.)
These filters then provide the user with the option of blocking or accepting content by category, so that a user can, for example, block only pornography and hate speech while accepting all other content categories. Category-by-category blocking obviously reduces the list of blocked URLs to a subset of the original list.

· Some filters enable the end user (or, more precisely, the end-user supervisor who knows the appropriate password) to create a local "exceptions" list, which specifies additional URLs to be blocked or URLs that should be allowed even if they are on the blocked list. For example, despite the filter, a child may view a URL that the supervisor deems inappropriate. A filter with this option enables the supervisor to add that
URL to the list of blocked sites. If a child is blocked from viewing a particular site, it may be apparent from the description accompanying the original link (e.g., as displayed by a search engine) or from the site name in the link that the site should not be blocked. In this case, a "user override" can be used to unblock the site. For practical purposes, the number of such sites that are either added to or subtracted from the original list is small compared to the size of the original list.

The vendor's characterization of inappropriate content is quite significant, as it is at the very least the primary point of departure for a user's customization (described below) even when such customization is possible.28 Filter vendors have many incentives to err on the side of overblocking and few to err on the side of underblocking. Based on its site visits, the committee believes that the reason is that schools and libraries, which are the largest users of filters for purposes of this report, tend to receive many more complaints from parents and the community about sites that are not filtered (i.e., complaints about underblocking) than about sites that are filtered improperly (i.e., complaints about overblocking).29

28One concern raised by analysts such as Nancy Willard is that filter vendors sometimes have strong connections to religious organizations, and that the social and cultural values espoused by these organizations may drive the vendor's characterization of inappropriate content. For example, Willard finds that "most of the companies have filtering categories in which they are blocking web sites . . . known to be of concern to people with conservative religious values such as [Web sites involving] non-traditional religions and sexual orientation in the same category as material that no responsible adult would consider appropriate for young people."
She also notes that "because filtering software companies protect the actual list of blocked sites, searching and blocking key words, blocking criteria, and blocking processes as confidential, proprietary trade secret information it is not possible to prove or disprove the hypothesis that the companies may be blocking access to material based on religious bias." At the same time, Willard finds that while "information about the religious connections can be found through diligent search, such information is not clearly evident on the corporate web site or in materials that would provide the source of information for local school officials," though she acknowledges openly that "it is entirely appropriate for conservative religious parents or schools to decide to use the services of an ISP that is blocking sites based on conservative religious values. It is equally appropriate for parents to want their children to use the Internet in school in a manner that is in accord with their personal family values." See Nancy Willard, 2002, Filtering Software: The Religious Connection, Center for Advanced Technology in Education, College of Education, University of Oregon, available online at <http://netizen.uoregon.edu/documents/religious2.html>.

In the various site visits conducted by the committee, only a few students or parents reported making a formal complaint to the school about a site needed for research or schoolwork that was blocked by a school's filter, even though they (mostly high school students) often reported that information on the blocked sites might have been useful for legitimate academic research purposes.30 (The same was not true with most teachers, who reported that educationally relevant sites were blocked regularly. Still, in a number of cases, they were able to use their supervisory privileges to obtain access to blocked sites.) And, given that schools and libraries install filters largely to forestall complaints, it is clear that filters that do not generate complaints would be highly preferred.

As for label-based filters, the labeling party can be either the content creator or any third party. However, it is the adult or adults directly responsible for determining what a child should or should not see who make the actual decision about how content labeled in a certain manner should be handled.

Because the vendor's philosophy regarding inappropriate material is the primary determinant of what will and will not be blocked, trust is a fundamental element in the user's selection of a filter. That is, the user places considerable trust in the vendor's judgment about what is and is not appropriate (or, in the case of labeling, places trust in the labels determined by various content raters). Thus, a person who wishes his or her religious values to be reflected in the content that is accessible to his or her children might choose a filter or a content-limited ISP that is sold by a firm with similar religious commitments. A person who wishes his or her children's Internet experience to be limited to positive, developmentally

29As with so many other "null" observations, the absence of complaints about overblocking can be interpreted in many ways. One interpretation is, of course, that overblocking simply does not occur very much (and/or that filters do not block a great deal of useful and appropriate information). But information collected through site visits is not consistent with this interpretation, and testimony to the committee suggests other explanations as well. For example, the relative lack of complaints may be partly explained by the fact that filters for institutional use are increasingly flexible (see Section 12.1.4). If blocked pages that are needed for educational purposes, for example, can be obtained quickly (e.g., in a matter of minutes), the issue of overblocking need not be as salient. (One school system told the committee that a filter used previously had not allowed such flexibility and had resulted in a large number of complaints from teachers and students. Other faculty and librarians in other locations told the committee that unblocking sites was cumbersome and difficult.) A second reason for the lack of complaints is likely to be the fact that once a filter is in place, the expectation of users is that access will be filtered. The committee heard stories of a number of complaints regarding filtering when filters were first installed, but in most such instances, the complaints ceased after a few months. Students without home Internet access seemed to accept a school's filtering policy as a given, and simply adapted to it, even if they were prevented from accessing valuable information. One librarian told the committee that a blocked Web page was analogous to a book that was not present in the library, and that the alternative approaches to obtaining the information were similar to using interlibrary loan. Students with Internet access at home have no particular reason or incentive to complain aside from issues of efficiency or convenience.

30In general, when students encountered blocked sites at school, they simply went to another venue to reach those sites; most of the students to whom the committee spoke had unfiltered home access.
appropriate, and educational material may choose a filter or a content-limited ISP that explicitly screens content for such criteria, rather than another vendor that might screen content for inappropriateness.

Finally, "viewpoint discrimination" (discussed in Chapter 4) generally cannot be practiced by public institutions, but the law in this area is currently unclear. In particular, it is not clear how a court would decide whether a public institution's use of a particular filter vendor's determinations of inappropriate material constitutes such discrimination: for instance, where the line is drawn between viewpoint discrimination and content discrimination, and what weight should be given to the extent to which the institution relied upon the filter vendor's settings. It is also not clear to what extent public schools, as opposed to public libraries, may engage in certain kinds of viewpoint discrimination.

12.1.4 How Flexible and Usable Is the Product?

Server-side filters can be easier to use than client-side filters, if only because they do not require installation on the client. Nevertheless, almost all filters allow some degree of customization to a parent's (or other adult supervisor's) requirements. Filters can (but do not necessarily) allow flexibility in many dimensions:

· Changes to the criteria used for blocking. Sites or keywords to identify inappropriate material can be added or subtracted by the end user: when a filter program erroneously blocks a site, or fails to block something the user deems inappropriate, the parent or other administrator of the program can create an exception list that deletes or adds the site from or to the black list. An important consideration is the extent to which the blocking criteria are known to the user.
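The combination of category-based blocking and a local exception list reduces to a small precedence check. The categories, host names, and precedence rules below are invented for illustration; real products differ in detail.

```python
# Sketch: category-based blocking combined with local "exception" lists.
# Vendor database, categories, and sites are all invented for illustration.
VENDOR_DB = {                      # vendor's site -> category characterization
    "adult-site.example": "pornography",
    "health-class.example": "sex education",
    "target-range.example": "weapons",
}
BLOCKED_CATEGORIES = {"pornography", "weapons"}  # chosen by the supervisor

LOCAL_ALLOW = {"target-range.example"}  # supervisor override: unblock
LOCAL_DENY = {"gossip.example"}         # supervisor addition: block

def is_blocked(host: str) -> bool:
    """Local exceptions take precedence over the vendor's category list."""
    if host in LOCAL_ALLOW:
        return False
    if host in LOCAL_DENY:
        return True
    return VENDOR_DB.get(host) in BLOCKED_CATEGORIES

print(is_blocked("adult-site.example"))    # True:  blocked category
print(is_blocked("health-class.example"))  # False: category not selected
print(is_blocked("target-range.example"))  # False: local override
print(is_blocked("gossip.example"))        # True:  local addition
```

Note that, as the text observes, the local lists are tiny compared with the vendor database, so the vendor's characterization remains the dominant factor in what is blocked.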
While nearly all filter vendors provide a list of categories that are blocked, very few provide a public list of all of the sites on their default "to be blocked" list,31 and to the committee's knowledge, no filter

31In general, they protect the list by encrypting it and had hoped that the Digital Millennium Copyright Act (DMCA) would outlaw reverse engineering to decrypt such lists. However, on October 27, 2000, the U.S. Copyright Office issued its final rule implementing the anti-circumvention provisions of the DMCA. The statutory provisions of the DMCA prohibit the circumvention of technical measures that prevent the unauthorized copying, transmission, or accessing of copyrighted works, subject to this rulemaking of the Copyright Office. The final rule establishes two exceptions to the anti-circumvention provisions, one of which allows users of Internet content filtering programs to view lists of Web sites blocked by such software. The Copyright Office recognized a First Amendment interest in access to this information and stated the need for circumvention in this instance, since persons who wish to criticize and comment on the lists cannot ascertain which sites are contained in them unless they circumvent. This exception to the DMCA rule may have an impact on the ongoing public debate about filters. In March 2000, two programmers who revealed the list of thousands of Web sites blocked by the Internet filtering program CyberPatrol faced charges of copyright violation.
vendor provides a list of the objectionable words sought in keyword searches. Most companies that do not release the list of blocked sites regard such lists as intellectual property and argue that the non-release protects the effort that went into creating them. However, if users of these products do not know the criteria explicitly, they will know that sites are blocked only when access to those sites is blocked and they are told that they have been blocked. Thus, they cannot make an a priori determination of such a filter's fitness for use.

· Ease of making authorized changes. A filter to which anyone can make changes would not be very useful. In general, the ability to make changes is restricted to "authorized parties." The effort and time needed to implement a change can vary. Some filtering systems allow individuals (e.g., teachers, librarians) with the appropriate password to add or subtract a site or a keyword essentially instantaneously. Others require the submission of a request to the filter vendor, which then evaluates the request and implements it or not at its discretion, a process that may take days or even longer. For educational purposes, the former is likely to be much better than the latter. On the other hand, increasing the effort needed to implement a change is likely to reduce the number of changes to the original filtering policy, an outcome that may be regarded as a benefit for those who want to enforce a uniform policy, or who have little faith that adult supervisors will act judiciously.

· Granularity of content categories. Different vendors divide "inappropriate" material into different categories, using more or fewer categories depending on their intended user base. Categories may be divided into subcategories (e.g., "sex" might be divided into "pornography" and "sex education," or nudity might be differentiated as "full frontal nudity" versus "partial nudity").

· Age or grade differentiation.
As discussed in Chapter 5, information that is inappropriate for one age or grade level may not be inappropriate for older ages or higher grade levels. (Specifically, the information needs of high school students are typically different from those of students in middle school.) Some filters allow more or less restrictive filtering depending on age or grade.

· Individually configured filtering policies. In a home context, a parent might wish to have different filtering policies (or none at all) for children of different ages. (Such differences would reflect the parent's belief that older children with more maturity and a broader scope of information needs might also require broader and less restricted Internet access.) To support different filtering policies, a filter would require the child to log into the system so that the child's individual filtering profile could be used. Individual policies can also support age or grade differentiation, requiring a student to log in with his or her grade (or associating a grade
with an individual login). In a library context, an age-appropriate filtering policy might require the system to ask the user if he or she were over 18, and if under 18, ask the user to enter his or her age, thus providing the information necessary to set an age-appropriate policy.

Many filtering products add a variety of features meant to offer parents, teachers, and others even further control. Some of these include:

· Time windows for blocking. Under some circumstances, the inappropriateness of a site may depend on the time of day. For example, in a school environment, sites referring to games and media entertainment may be inappropriate during school hours (because they constitute "off-task" material) but appropriate in an after-school program in which Internet usage may be less restricted.

· Records or logs of attempted access to inappropriate materials. To understand patterns of usage, records of attempted access may be kept. Note that such records, if individual logins are not used, can reflect attempted usage only from a given access point; further, if individual logins are not used, no records can be matched to individuals.

· Bi-directional filtering. Some filters enable the blocking of certain outgoing user-entered information, such as phone numbers, addresses, foul language, and so on. Such blocking can be used to promote the safety of children and to enforce prohibitions against giving out such information.

In addition, some filters can block certain types of Internet access entirely: instant messages, e-mail, chat rooms, file transfers, and so on. As noted in Chapters 2 and 5, e-mail, chat rooms, and instant messages allow the child to send as well as to receive, and thus to engage in text-based interaction with other parties. File transfers allow images, video clips, and sound files to be sent and received.
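The time-window feature described above reduces to a simple clock check against a per-category policy table. The category name and school hours below are invented for illustration:

```python
# Sketch: time-window blocking as a per-category policy check.
# The category, the hours, and the policy table are illustrative only.
from datetime import time

# Entertainment sites are blocked during school hours but allowed afterward.
TIME_RULES = {
    "games": (time(8, 0), time(15, 0)),  # blocked between 8:00 and 15:00
}

def blocked_now(category: str, now: time) -> bool:
    """Return True if the category is inside its blocking window."""
    window = TIME_RULES.get(category)
    if window is None:
        return False          # no time rule for this category
    start, end = window
    return start <= now < end

print(blocked_now("games", time(10, 30)))  # True:  during school hours
print(blocked_now("games", time(16, 0)))   # False: after school
print(blocked_now("news", time(10, 30)))   # False: no rule for this category
```

A real product would combine such a check with the category and exception machinery discussed earlier, applying the time rule only after a site has been assigned to a category.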
Usenet newsgroups, which are similar to bulletin boards, contain a great deal of content that is unmoderated and inappropriate for children by the standards of many adults.

In general, flexibility adds to the complexity of the filter, usually in its initial configuration and sometimes in its use. For example, there is debate about whether consumers prefer less nuanced granularity and simpler choices, or prefer to have detailed control over their filtering. Some products have garnered praise for offering a simple age-by-age default set of parameters on what to block that may then be overridden with more levels of nuance.

In a number of filter-using sites visited by the committee, the flexibility of an installed product (especially the ability to unblock selected Web sites) was not used by administrators and teachers. In some cases, they
were simply unaware of the capability; in others, they lacked either the time or the knowledge to do so. In still other cases, the ability to unblock sites was limited to district-level administrators (rather than the local administrators or teachers). During site visits, a number of teachers told the committee that requests sent to district-level administrators to override the blocking of sites often met with resistance, were acted on slowly, and were often refused. Furthermore, to the extent that flexibility is tailored for different individuals (e.g., for middle school students versus high school students), identification of these individuals is required and the appropriate policy must be mapped to the access points of those individuals.

An additional dimension of functionality is the provision of explanations for blocking. A site that is blocked for no apparent reason has a lower perceived legitimacy than a site that is blocked for a stated reason. For example, many filters tell the user that a site has been blocked because it falls into Category X, where X may be pornography, sex education, weapons, violence, and so on. By contrast, a site that is blocked simply with the message "access denied" does not provide the child with useful feedback, and may increase his or her motivation to go to a nonblocked access device to retrieve the information. Note that filtered search engines provide the lowest level of transparency of all: because a filtered search engine never returns even a link to content that is deemed inappropriate, the user has no way of knowing what has been filtered out or even that anything has been filtered out.

12.1.5 What Are the Costs of and the Infrastructure Required for Filtering?

Financial Costs

As with all technology deployments, the financial costs of using filters can be divided into acquisition costs and maintenance costs.
Acquisition costs can be quite low, especially for server-based filters, for which an installation of the filter at each access point need not be undertaken. Maintenance costs of server-side filters are usually absorbed into a per-year, per-seat charge.

However, payments to the vendor are not the only cost, as some on-site effort must be made to ensure that filters are working properly. On-site technical support is generally necessary. Management of the environment must be taken into account as well; in particular, teachers, librarians, and parents may be faced with managing requests to unblock sites that are blocked. In an institutional environment, there are costs of teaching the responsible adults what the filter can and cannot do, and of providing training that familiarizes them with operating in a filtered environment. When filtering impedes their own legitimate searches for information, they must know how to obtain that information despite the presence of filtering. And, for those institutions that rely on centralized (e.g., district-based) administration of the unblocking function, the staff time needed to manage these decisions can be significant (and hence costly).

Use of content-limited ISPs appears to entail fewer financial costs than the use of server-side or client-side filters. Because the major selling point of using a filtered ISP is that the user gets to delegate to another party all of the responsibilities of deciding on and enforcing a filtering policy, it is likely that users will be comfortable for the most part with the default policy of the filtered ISP. Thus, the costs of filtered ISPs for this class of users will be relatively small. Also, filtered ISPs make the cost of updating the filtering algorithm or database invisible to most users.

One trend pushing toward the lowering of filtering costs is that the basic technology of filtering is increasingly available in certain common hardware products. For example, a variety of hardware routers for home or small office use (generally deployed to support a small local area network at a home or office site) have native site-based filtering capabilities (that is, they have the ability to exclude traffic from specified sites). Some also have the ability to search for objectionable keywords embedded in site URLs. If trends toward hardware-based filtering continue, such filtering may well become ubiquitous. And if vendors of these hardware products provide these routers with lists of sites to be blocked, with updates as a service to purchasers, the cost may well drop significantly.
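The two router capabilities just described, site-based exclusion and scanning for keywords embedded in URLs, amount to a simple check of the kind sketched below. This is a hedged illustration only: the block list and keyword list are placeholders, and real routers implement this logic in firmware rather than in Python.

```python
from urllib.parse import urlparse

# Placeholder lists standing in for a vendor-supplied block list and keyword list.
BLOCKED_HOSTS = {"blocked-example.test"}
URL_KEYWORDS = ("xxx", "porn")

def should_block(url: str) -> bool:
    """Sketch of router-style filtering: site-based exclusion plus URL keyword scan."""
    host = urlparse(url).hostname or ""
    if host in BLOCKED_HOSTS:          # exclude traffic from specified sites
        return True
    lowered = url.lower()
    # Scan for objectionable keywords embedded in the URL itself.
    return any(kw in lowered for kw in URL_KEYWORDS)

print(should_block("http://blocked-example.test/index.html"))   # True: host is on the list
print(should_block("http://example.org/health/basics"))         # False: no listed keyword appears
```

Note that this kind of keyword matching illustrates the over- and underblocking discussed throughout this chapter: a keyword embedded in an innocuous URL triggers a block, while a site that simply avoids the listed keywords passes through.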
Restrictions on Information Flow

As noted in Chapter 5, it is likely that in most communities there would be a rough consensus that some kinds of sexually explicit material on the Internet should be made unavailable to children. But on other kinds of sexually explicit material, such consensus would be unlikely. Moreover, some of such material would undoubtedly constitute protected speech according to established First Amendment interpretations.

The discussion in Section 12.1.2 points out that overblocking is inherent in any process that makes decisions about what to filter (so is underblocking, but underblocking is not a restriction on information). Thus, for inappropriate sexually explicit material that might loosely be classified as "for adults only," some material that should not be placed into this category will be, and will therefore be improperly blocked.

Filter vendors often provide options for blocking entire categories in addition to the category of sexually explicit material: violence, weapons, pro-choice and anti-abortion material, gay and lesbian lifestyles, and so
on. Much of the material in these categories does not fit the legal definition of material that is "obscene with respect to minors," but based on the default settings of many filters, would be blocked anyway. While this restriction is not a legal problem in the context of home use, it may present a problem in publicly funded institutions, which are constrained by the requirements and current interpretations of the First Amendment.32

In an educational context, the restrictions on information flow associated with filters may lead to substantial problems for teachers and librarians who are trying to develop useful and relevant educational activities, assignments, projects, and so on. Indeed, some teachers reported to the committee during site visits that their lesson preparations were sometimes hampered by the fact that their Internet access was filtered at school. In other cases, when they prepared a lesson plan at home (with unfiltered Internet access), they were unable to present it at school because a site they had found at home was inaccessible using school computers.

Restrictions on information flow may also reduce the benefits of the Internet as an information retrieval mechanism. Specifically, one of the advantages of the Internet is that it facilitates the comparison of how different sources treat a given topic. While it is true that there are often many unblocked sources for basic information (and hence blocking any one of these sources may not be critical in this context),33 advanced work in which the specific source providing information affects its presentation or credibility is more adversely affected by overblocking. Such might also be the case when alternative political points of view or analyses are blocked as being inappropriate.

Psychological Costs

Another potentially major cost of filters is that their use reduces opportunities for young people to practice responsible behavior on their own.
That is, to the extent that filters work as they are intended (i.e., they block rather than discourage access to material that may be inappropriate), children have fewer opportunities to choose and thus fewer opportunities to learn how to make responsible decisions.

32A study by the Kaiser Family Foundation found that among teenagers aged 15 to 17 who have sought health information online, 46 percent reported that they experienced blocking from sites that they believed were non-pornographic. For example, 15 percent of those who were blocked reported that they were searching for information on sexual health topics. See Rideout, 2001, Generation Rx.com: How Young People Use the Internet for Health Information.

33Because of keyword filtering, sites containing certain keywords may be blocked. However, synonyms of these keywords may not be filtered, and sites with these synonyms will not be blocked.

Such children may well have greater difficulty in developing internal standards of appropriateness. In addition, while some youth have reported that the use of filtering by their parents makes them feel loved, others have reported that it makes them feel untrusted by their parents.

Filters also create "forbidden fruit" in this context: specific content that is made (more) desirable simply because it is inaccessible. A common response to forbidden fruit is to engage in more active and determined efforts to obtain it. Given the technological sophistication of some teenagers, these efforts often succeed. Even worse, from the standpoint of minimizing exposure of children to such material, the results of technical circumvention efforts are often widely circulated, with the ultimate effect of greater exposure to inappropriate material rather than less, at least within the immediate circle of individuals close to those with the necessary skills.

The introduction of filters may also serve to create resentment and resistance among the children at whom they are targeted. That is, because filters are explicitly intended to limit one's freedom of access, it is entirely possible that introducing filters, especially into an environment in which unrestricted access was the rule, would create tensions and anger in the children against those responsible for the decision. This dynamic is likely to be most significant in a family environment, in which parental rules are generally negotiated to some extent with children.

Finally, unfair treatment of youth can result from the use of filters.
A young Internet user, knowing that the Web sites she or he is viewing are filtered, can make the reasonable assumption that what is not filtered conforms to parental or organizational policy (e.g., an acceptable use policy, as discussed in Chapter 10), and thus that access to those unfiltered sites is not restricted. However, because filters inevitably allow some inappropriate material to pass, this may not be a good assumption, and a child who relies on a filter that allows objectionable material to be viewed can get into trouble with parents or organizational authority.

Infrastructure

Because a critical issue in filtering is the extent of underblocking and overblocking, users are well advised to test in advance what may be improperly blocked or passed. Apart from this predeployment testing, source- or content-based filters require minimal infrastructure. However, label-based filters require content providers or third parties to cooperate in labeling content. To develop such an infrastructure, providers and third parties must have incentives to label content. The minimal success
of labeling schemes for Internet content to date suggests that the present environment does not provide such incentives.34

Note also that labeling by third parties entails essentially the same type of effort that must be undertaken by filter vendors to develop lists or criteria for inappropriate content. For labeling to be useful, a large volume of information must be examined and rated; otherwise, the user is left with the choices described in Section 12.1.3.

Recognizing that the primary impediment to the success of rating schemes is the extent to which Internet content is currently not labeled, the Internet Content Rating Association (ICRA) has undertaken a global effort to promote a voluntary self-labeling system through which content providers identify and label their content using predefined, cross-cultural categories (Box 12.4). ICRA is a global, non-profit organization of Internet industry leaders committed to making the Internet safer for children while respecting the rights of content providers.

According to ICRA's chief executive officer, ICRA hopes that over the next several years the most popular Web sites and portals, those accounting for the most Internet traffic, will have labeled with ICRA. If these efforts are successful, ICRA labels will be associated with sites that account for a large fraction of Web traffic, though not necessarily with a large fraction of existing Web sites. The operators of these sites will encourage their business partners and the sites they host to use ICRA labeling. (However, because these sites do not in general have a business relationship with other Web sites that might turn up through use of their search engines, these other Web sites cannot in general be expected to be labeled.)

Another approach is to mandate by government fiat the labeling of all Web content.
But such an approach involves a number of significant issues:

· Such compelled speech raises important First Amendment issues.
· The enforcement of label accuracy is complex. Even labels created by the content owner may be inaccurate.
· The volume of Web information is so large that a government mandate requiring labeling would impose an enormous expense on content providers.

34It is interesting to note that industry labeling initiatives in other media have been more successful and widely accepted and used; these other media include movies (through the MPAA), TV (through a joint effort of the Motion Picture Association of America, the National Association of Broadcasters, and the National Cable Television Association), and software CD-ROMs and games (through the Interactive Games Developers Association). One reason for this success is that the volume of content produced in these media is much smaller than the volume of content produced for the Web.
· Web content can be posted in many different national jurisdictions, and it would be easy for content creators and providers to evade such a mandate by moving offshore.

Apart from government-required content labeling, the widespread use of labels will turn on private incentives. Incentives can be positive: by labeling, a content provider or creator could receive some financial benefit, either directly or by attracting more parties to its content. However, the committee did not see a compelling business case for how content providers or creators can benefit commercially from labeling, and testimony to the committee indicated how difficult it is to develop child-friendly Internet businesses. Or, incentives can be negative: by labeling, a content provider or creator might receive immunity from prosecution (for example, for obscenity) for the content being labeled (e.g., as adults-only). Such a safe harbor might be particularly applicable to the labeling of sexually explicit material (as discussed in Section 9.3).

To date, child-centered content-limited ISPs are small enterprises, and many efforts to establish a viable business model for providing good, attractive, and educational content for kids have foundered, as noted in Chapter 10.35 Thus, it is an open question whether children will be able to take advantage of consistent, dependable, long-term service of this nature. Note also that because content is explicitly vetted for appropriateness, it is likely that the content offered by such ISPs would be more limited, and hence more suitable for younger children, whose information needs are generally less than those of older children. By contrast, certain Internet portals, such as Yahoo and Lycos, have search engines that search only within an appropriate (and educational) child-oriented universe of content.
Available for free, these search engines return only links to appropriate and educational content, and as long as the child does not surf outside these links, a responsible adult can have confidence in his or her child's activity.

12.1.6 What Does the Future Hold for Filtering?

Image-Only Filtering

Visual imagery often has a far more visceral impact than a textual description of the same image. As discussed in Section 6.3.3, males tend to respond to visual sexual imagery more than females do.

35Of course, entrepreneurs in other areas are also struggling to find viable long-term Internet business models.
And, as a general rule, sexually explicit text does not generate nearly the same controversy that sexually explicit imagery generates.36

A filter that blocks the images on Web pages that have been determined to be inappropriate, rather than all of the content of the Web pages themselves, is thus well suited to meet this concern. Most of today's filters block specific Web pages, based on criteria established by their filtering policies. But there is no technical reason that the filter could not instead block all images contained on those Web pages, while passing through all text on those pages. (However, icons and text rendered in image format, such as those in banner advertisements and sidebars, would be blocked as well. And concerns about overbreadth of blocking would remain, so that images of the Greek gods, Leonardo da Vinci's Vitruvian Man, paintings by Rubens, and Michelangelo's David might still be blocked.) Such a filter addresses many of the concerns raised by students, teachers, and librarians about children who need information that would otherwise be blocked by a page-blocking filter; as a general rule, such information is presented textually and would be passed by an image-blocking filter. Of course, to the extent that the concerns of filter advocates involve text, an image-blocking filter is not helpful.

A more sophisticated approach to filtering of embedded images would involve analyzing them. Very small images (say, 200 x 200 pixels or smaller) are likely to be only icons, and in any event do not convey much. The content of larger images could be analyzed using technology described in Section 2.3.1 and Appendix C, and if found to be sexually explicit, those images would be blocked (subject to all of the difficulties inherent in image recognition).

Selective Degradation of Service

As discussed in Chapter 8, it is possible to reduce the appeal of deliberate contact with inappropriate material.
Such an approach changes the yes/no approach to filtering to one in which the user can still gain access to material that might have been improperly classified as inappropriate, but only at some cost. Two easy technical methods of doing so are slowing down the speed with which images of offensive content are displayed and reducing the visual resolution (or the audio fidelity) of such images. Presuming that such content can be identified, a child who must wait a few minutes for an image to be displayed is likely to lose patience with it. Such a tactic is most relevant if the child knows that the image being sought is inappropriate: it reduces the immediate gratification usually available from Internet downloads, and it increases the risk of being discovered in the act. (It also provides more time for the child to reflect: Do I really want to do this? Am I being responsible?) Similarly, an image of significantly reduced resolution is far less appealing than one with high resolution.

Another possible approach penalizes the user after the viewing of inappropriate content by automatically logging out, freezing the computer so that a reboot is necessary, or simply delaying for several minutes the child's access to other Internet content. Approaches that depend on degradation of service force the child to decide whether the cost and inconvenience of access are worth the appeal of accessing content that adults might deem inappropriate.

Bundling Filters with Other Functionality

Filters are a special-purpose tool. Parents and others who purchase and install filters or filtering services thus can be assumed to feel that the problems raised by unfiltered Internet access are worrisome enough to warrant such efforts. However, other individuals may be concerned yet reluctant to install filters because of the resistance or resentment that their introduction might generate (as discussed under "Psychological Costs" in Section 12.1.5). For such individuals, associating filters with packages that provide other useful features may make it easier to obtain the benefits of filtering. For example, parents wishing to obtain filtering services might subscribe to a content-limited ISP and "sell" it to their children on the basis of additional content that the ISP would make available to them.

36In the future, it may be possible that other kinds of content (e.g., sound files associated with sexually explicit content) will be regarded as being as objectionable as images. (Recall that "dial-a-porn" services had some appeal for adolescent youth, and that the availability of such services to minors created significant controversy in the early 1990s.) If that future comes to pass, the media containing such particularly objectionable content might also be selectively blocked (e.g., by blocking all sound files on sexually explicit Web pages).
Warning Rather Than Blocking

Built into any filter is a specification of content that should be blocked. Instead of blocking access, a filter could warn the child of impending access to inappropriate material, but leave it to his or her discretion whether or not to access the material. Because the child does have choices, such a feature would have pedagogical advantages with respect to helping children make responsible choices, assuming an environment structured in a way that facilitates such assistance. (A feature to warn of impending access to inappropriate material might or might not be combined with logging of such access, a point discussed in Section 12.2 below.)
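The warn-instead-of-block idea can be sketched as a small change to the access path: the filter's classification is kept, but a flagged page triggers a prompt rather than a refusal, and the child's decision may be logged. The function below is a hypothetical illustration; `is_flagged`, `fetch`, and `confirm` are stand-ins for the filter's classifier, the page fetcher, and a user prompt, none of which correspond to a real product's interface.

```python
import datetime

def fetch_with_warning(url, is_flagged, fetch, confirm, log=None):
    """Sketch of 'warning rather than blocking': warn on flagged content,
    let the user decide, and optionally log the decision."""
    if is_flagged(url):
        proceed = confirm(f"This page has been flagged as inappropriate: {url}. Continue?")
        if log is not None:
            # Record when, what, and whether the child chose to proceed.
            log.append((datetime.datetime.now(), url, proceed))
        if not proceed:
            return None
    return fetch(url)

# Example with stub functions standing in for the real components:
log = []
page = fetch_with_warning(
    "http://example.org/flagged",
    is_flagged=lambda u: "flagged" in u,
    fetch=lambda u: "<html>...</html>",
    confirm=lambda msg: True,   # the child chooses to proceed
    log=log,
)
print(page is not None)  # True: access allowed after the warning
print(len(log))          # 1: the decision was logged
```

Whether the `log` parameter is used at all is exactly the design choice noted above: warning can be offered with or without a record of the child's decisions.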
Opportunities for Internet Safety Education

Because child-oriented content-limited ISPs are oriented toward providing information especially for kids, they provide unique opportunities for Internet safety education. For example, these opportunities might include material that provides context for children and explains concepts about judging the value and/or validity of the site being flagged or blocked.

Future Prospects

Over time, filtering is likely to gradually improve, decreasing both underblocking and overblocking. However, these improvements will almost certainly be incremental rather than revolutionary, and users would be well advised to view with some skepticism claims of revolutionary improvement. (For example, the phenomenon of blocking breast cancer sites because a user performed a search for "breast" is now rare. However, the reason this particular error is no longer frequent is that many users complained about it, and breast cancer sites were specifically taken off the black list.37)

One goal is quite unlikely to be met: the generation of a class of objectionable or inappropriate material from a single example. It would be highly desirable for a user who has received an objectionable image (for example) to be able to tell a filtering program, "I don't want to see any more stuff like this." But what counts as "like this" is virtually impossible to generalize from one example, which is why even the best training systems today require hundreds or thousands of samples of objectionable material to offer any hope of even a minimally adequate categorization of material.

12.1.7 What Are the Implications of Filtering Use?
Today's filters cannot be the sole element of any approach to protecting children from inappropriate sexually explicit material on the Internet (or any other inappropriate material), and it is highly unlikely that tomorrow's filters will be able to serve this role either. But they can be a useful element, as long as their limitations are kept in perspective. In particular, with or without filters, Internet-using children will have opportunities for encountering some non-zero amount of inappropriate material, and thus, regardless of the benefits that filters do confer, they will have to internalize codes of conduct and appropriate online behavior if they are to be safe.

Using a child-oriented content-limited ISP is approximately analogous to allowing a child to watch only selected videos on television, rather than broadcast or cable television. And, as in that case, such a practice is most likely appropriate for fairly young children. However, as a child's Internet information needs outgrow what a kid-friendly service can provide, he or she will have to turn to other sources. Other sources, by definition, will provide information that is less thoroughly vetted, and will likely involve exposure of the now-older child to some inappropriate information; however, an older child may well be better able to cope with inadvertent exposures to such material. Furthermore, there is no guarantee that the point at which a child's information needs outgrow a kid-friendly service will coincide with the point at which he or she can cope well with such exposure, and it is likely that the former point occurs earlier than the latter.

As for server- and client-side filtering, it is helpful to regard such filtering as "training wheels" for children on the Internet as they learn to make good decisions about what materials are and are not appropriate for their consumption.

37The first widespread instance of such blocking occurred in 1995 when a major online service provider blocked all sites containing the word "breast," including those dealing with breast cancer. In the wake of widespread complaints, the service provider in question quickly restored access to breast cancer sites. Since then, this particular problem has occurred only rarely, as a number of techniques described in Section 2.3.1 can be used to avoid problems arising from simple-minded keyword matching. Still, the problem has not been eliminated entirely, and a recent instance of a Web site involving breast cancer being blocked was brought to the committee's attention in January 2002 (personal communication, Bennett Haselton, Peacefire.org). In this instance, the reason for such blocking apparently arises from the use of IP-based virtual hosting.
An adult who explains the purpose of the filter to the child (and different explanations are appropriate at different ages), and who can provide some in-person guidance when the child first encounters blocked material, is in a much better position to help the child internalize the rules than an adult or institution that simply installs the filter with no explanation or rationale either before or after the fact. Indeed, the latter situation is what the detractors of filters have in mind when they argue that the use of filters can lead to a false sense of security: a filter user (parent, library, school), knowing that a filter is in place, will be tempted to assume that all is well, and then fail to exercise appropriate oversight or to take other measures when such oversight or other measures would still be necessary.

Underlying much of the concern about the deployment of filters, even on a voluntary basis, is a fear that the creation of a technical infrastructure that supports filtering will inexorably, over time, lead to even stronger pressures for formal content regulation (a so-called "slippery slope" argument).
Furthermore, even without the pressures for formal content regulation, those advocating the free flow of information are concerned that authorities (parents, schools, libraries, businesses, and others) will find the use of filters irresistible as a way to block any kind of content or information that they find objectionable, and not just for children. (Just such a sequence of events was related to the committee in one of its site visits: a county-wide library system installed filters to block sexually explicit material from all patrons, not just children, though the concerns were first raised in the context of children's access to such material.)

12.1.8 Findings on Filters

1. Filters have some significant utility in denying access to content that may be regarded as inappropriate. However, many of today's youth have access to unfiltered Internet venues (e.g., at home, at a friend's house), and school and library filters do not block content accessed from these other venues.

2. All filters, those of today and of the foreseeable future, suffer (and will suffer) from some degree of overblocking (blocking content that should be allowed through) and some degree of underblocking (passing content that should not be allowed through). While the extent of overblocking and underblocking will vary with the product (and may improve over time), underblocking and overblocking result from numerous sources, including the variability in the perspectives that humans bring to the task of judging content.

3. Filters are capable of blocking inappropriate sexually explicit material at a high level of effectiveness if a high rate of overblocking is also acceptable. Thus, filters are a reasonable choice for risk-averse parents or custodians (e.g., teachers) who place a very high priority on preventing exposure to such material and who are willing to accept the consequences of such overblocking.
(For example, these individuals may be more inclined to take such a stance if the children in question are young.) Such consequences may include the blocking of some material that would be mistakenly classified as inappropriate sexually explicit material, and/or the blocking of entire categories of material that are protected by the First Amendment (a consequence of more concern to publicly funded institutions such as public libraries than to individual families).

4. Automated decision making about access is generally inferior to decision making with a responsible adult involved in the decision-making process. Furthermore, to the extent that the content of concern is in multimedia formats and unaccompanied by textual descriptions, automated decision making is subject to a high degree of overblocking (identifying content as objectionable when it is benign) and underblocking (identifying content as benign when it is objectionable).

5. To the extent that Internet content is created or produced in real time (e.g., a live videoconference over Webcams), it will be impractical to insert a human reviewer into the decision-making process about whether access to that content should or should not be granted, thus weakening the role that filters play.

6. Overblocking must be distinguished from overly broad blocking policies. Overblocking is a mistake: content is blocked that should not have been blocked, even in the judgment of the human being responsible for identifying content that should be blocked (the censor). An overly broad blocking policy represents a disagreement with that human being, in which the content seeker asserts that certain content should be accessible and the censor believes that the content should be blocked.

7. Based on information gathered in its site visits, the committee believes that filters are deployed by schools and libraries at least as much for political and management reasons as for the protection of children, because the deployment of filters enables staff to pay more attention to teaching and serving library patrons.

8. Because most filters are deployed to forestall complaints, and complaints are more likely to be received about underblocking than about overblocking, filter vendors have more incentive to block content that may be controversial than to be careful about not blocking content that should not be blocked.

9. Transparency of operation is important, in the sense that filters that inform a user that a site is being blocked and that provide the reason for the blocking are more likely to be seen as legitimate than those that do not provide such information.

10. The use of blocking filters does not promote the development of responsible choice in children.
With removal of the option of making certain choices, children are denied an opportunity to choose and hence do not learn appropriate decision-making skills from the fact of blocking.

11. Filters are a complement to, but not a substitute for, responsible adult supervision. Using filters without adult supervision and/or instruction for users in what constitutes appropriate behavior is not likely to result in children learning what is or is not appropriate behavior. Furthermore, filters cannot guarantee that inappropriate material will not be accessed.

12.2 MONITORING

Tools that provide monitoring of the Internet activities of children have been proposed by some as an alternative to filters in certain contexts.38 Tools for monitoring have not received nearly the same attention as filters, but are potentially controversial as well. Box 12.5 describes the dimensions of choice that GetNetWise identifies for monitoring tools.

12.2.1 What Is Monitoring?

Monitoring, as a way of protecting youth from inappropriate content, relies on deterrence rather than prevention per se. In some cases, it is the threat of punishment for an inappropriate act that has been caught through monitoring that prevents the minor from behaving in an inappropriate manner. In other cases, "catching someone in the act" can provide an important "teachable moment" in which an adult can guide and explain to the child why the act was inappropriate, and why this content is on the Internet.

Monitoring a child's use of the Internet is a generic approach that can be implemented in both technical and non-technical ways. Adult supervision of Internet use, a non-technical strategy, is discussed in Chapter 10. But the technical methods for monitoring someone's use of the Internet are many.

· The simplest means of monitoring are built into the browser or the operating system. Parents and other monitors do not need to rely on any additional technology to do this simple monitoring.

38John Schwartz. 2001. "Schools Get Tool to Track Students' Use of Internet," New York Times, May 21.
The major Internet browsers have a "history" file that indicates all of the sites visited most recently. (The time frame for which such histories are kept can be adjusted by the user; a typical default time frame is 20 days.) Such a history file can be easily viewed by an adult supervisor, a step that requires no additional technology and very little computer savvy. On the other hand, older kids may be knowledgeable enough to erase the history file (it takes just one click) or even to forge a history file with enough innocuous entries to allay suspicion.

Most browsers make use of a temporary "cache" that contains images that have been displayed on a user's screen. Inspection of this cache can show most of the images that have appeared recently on a user's screen.

Cookie files can indicate the sites with which a child has interacted, as well as who has received information from the child. In Windows, for example, the cookie file is found in the Windows program directory and can be viewed using any text editor.

· Commercially available monitoring systems go a step further:

Certain devices and programs can capture all of the keystrokes made by a child. Thus, comments or input made by a child can be recorded for future inspection.

The workstation of a supervising adult can be set up to capture and/or display the contents of a child's monitor in real time. Thus, a supervisor (e.g., a teacher or librarian) could monitor what appears on a child's screen from his or her office.

E-mail is generally not encrypted in storage, and thus may be readable by an adult who is responsible for a child.

Monitoring tools can provide a variety of functions, enabling a responsible adult to review incoming and outgoing e-mail, instant message and chat room dialogs, Web pages accessed, and so on. Further, such tools may or may not provide notification to the child that monitoring is occurring.
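To illustrate how little machinery such after-the-fact inspection requires, the following sketch checks a list of visited URLs against a supervisor-maintained blocklist. This is purely illustrative and ours, not a description of any product mentioned in this chapter; the log format and the blocked domain names are hypothetical.

```python
# Illustrative sketch: scan a browser-history export against a
# supervisor's blocklist. Log format and domains are hypothetical.
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"adult-example.com", "explicit-example.net"}

def flag_visits(history_lines):
    """Return the visited URLs whose domain appears on the blocklist.

    Each line is assumed to look like: '<timestamp> <url>'.
    """
    flagged = []
    for line in history_lines:
        parts = line.split()
        if len(parts) < 2:
            continue  # skip malformed entries
        url = parts[1]
        domain = urlparse(url).netloc.lower()
        if domain in BLOCKED_DOMAINS:
            flagged.append(url)
    return flagged

history = [
    "2002-04-12T15:04 http://www.example.org/homework",
    "2002-04-12T15:07 http://adult-example.com/enter",
]
print(flag_visits(history))  # ['http://adult-example.com/enter']
```

Note that such a check is only as trustworthy as the history file itself; as observed above, a knowledgeable child can erase or forge that file.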
A monitoring tool can also use technology to identify inappropriate material (the same technology that filters incorporate) so that it can provide a warning to the child when he or she is about to be exposed to inappropriate material. The child can then decide whether or not to heed that warning. If the child does not, the monitoring tool may or may not provide a record of that access. Warnings can also be accompanied by a log of such exposures or by notification of a responsible adult about such access.

Depending on the tool selected, monitoring tools can be used at home and in libraries or schools. One distinguishing characteristic is that monitoring tools generally require human responses to detections of access to
inappropriate information, or at least the threat of such a response. Thus, a parent or librarian or teacher must be involved to talk to the minor engaged in inappropriate behavior, and in institutional contexts the cost of reviewing access logs and following up on these reviews would likely be substantial.

For monitoring to be effective, usage must be tied to specific individuals. Thus, in an institutional setting, individual logins, which tend to be more common in higher grades than in lower ones, are necessary if monitoring information is to be acted on after the fact of inappropriate usage.39 (If immediate action is taken, individual login information is not needed, since an adult can simply walk over to the Internet access point and talk to the child in question.) The same is true in a home setting, especially with multiple individuals accessing one computer. Indeed, without the ability to associate a given Web access (for example) with a given child, individualized guidance or punishment cannot be provided.

As with filters, monitoring is also increasingly common in corporate and business settings.40

12.2.2 How Well Does Monitoring Work?

Because monitoring tools do not place physical blocks against inappropriate material, a child who knowingly chooses to engage in inappropriate Internet behavior or to access inappropriate material can do so if he or she is willing to take the consequences of such action. However, the theory of monitoring is that knowledge of monitoring is a deterrent to taking such action.

Note, however, that unless warnings are given repeatedly and in different forms, users are known to habituate rapidly to them and behave as though they had never been given.41 Warnings in and of themselves are not likely to deter inappropriate access in the long run. The same habituation may not be true, however, of warnings or monitoring associated with a human presence. An adult supervisor who forces a non-routinized interaction with a child has a far better chance of capturing his or her attention.

39Note also that individual logins are a necessary though far from sufficient aspect of maintaining computer, network, and system security. In the absence of individual logins, it is essentially impossible to hold any specific individual responsible for actions that might compromise security. For more discussion, see, for example, Computer Science and Telecommunications Board, National Research Council, 1997, For the Record: Protecting Electronic Health Information, and 1991, Computers at Risk: Safe Computing in the Information Age (National Academy Press, Washington, D.C.). Thus, there are advantages for institutions to consider individual logins for reasons entirely apart from protecting their youth from inappropriate material.

40See, for example, Associated Press, 2002, "IM Monitoring Grows in Popularity," April 12. Available online at <http://www.msnbc.com/news/737956.asp?osi=-#soDy>.

41See Computer Science and Telecommunications Board, National Research Council, 1996, Cryptography's Role in Securing the Information Society, Kenneth W. Dam and Herbert S. Lin, eds., National Academy Press, Washington, D.C.

Browser histories log Web sites that have been viewed, though to learn the actual content of these sites, the adult supervisor must either examine these Web sites or make inferences about their content on the basis of the site's URL. Keystroke monitors are equally capable in monitoring Web sites, e-mail, and anything else that requires user input. Monitoring of screens being used by children, if done on a large scale (i.e., many screens being supervised by one person), in practice monitors access to inappropriate imagery. Text also can be monitored remotely, but in this case, the adult supervisor cannot tell at a glance if the text contains inappropriate material, and thus must spend more time in reading that text to make a judgment.

Because monitoring leaves the choice of access up to the child, inadvertent access to inappropriate material is possible. (For this reason, monitoring is arguably less appropriate for children whose decision-making capabilities have not matured.) But the child also retains the choice to gain access to information that may be relevant to his or her information needs, and thus the problem of overblocking described in Section 12.1.2 does not exist.

Judgments about the effectiveness of monitoring are mixed. Monitoring coupled with punishment has deterrence effects, at least in the short term and at least for some fraction of youth. But because the decision-making party is the youth, rather than an adult (e.g., the filter vendor), denial of access to inappropriate material cannot be assured. Moreover, as with filters, a change of venue will often suffice to eliminate overt monitoring.
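The inference-from-URL approach mentioned above can be sketched crudely in a few lines. The keyword list below is our illustrative assumption, not drawn from any actual product, and the sketch shows why such heuristics cut both ways: they can miss inappropriate sites whose addresses are innocuous (underblocking) and, with a careless keyword list, flag innocuous ones (overblocking).

```python
# Hypothetical sketch of guessing at a site's character from its URL
# alone. The keyword list is illustrative only.
SUSPECT_KEYWORDS = ("xxx", "porn", "adult")

def looks_suspect(url):
    """Return True if the URL contains any suspect keyword."""
    u = url.lower()
    return any(k in u for k in SUSPECT_KEYWORDS)

print(looks_suspect("http://www.xxx-example.com/"))       # True
print(looks_suspect("http://www.example-library.org/"))   # False
# An adult site at a bland address would also return False:
# this is the underblocking risk inherent in URL inference.
```

Note the deliberate omission of "sex" from the keyword list: that substring appears in many innocuous words and place names, and including it would trade underblocking for overblocking.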
A critical dimension of monitoring is the kind of adult response that is coupled to a detection of inappropriate access or behavior. If an adult offers guidance and explanation rather than punishment (as is most appropriate if a violation is accidental), the youth may learn how to differentiate for himself or herself appropriate from inappropriate actions. To the extent that this is true, protection from inappropriate material may be extended to non-monitored venues and for much longer periods of time. (On the other hand, once clear explanations for rules have been provided, punishment for intentional infraction of rules is entirely appropriate for demonstrating that infraction carries consequences.)

Another critical dimension is whether monitoring is covert or overt. Covert monitoring, if undiscovered, is more likely to provide information about what the child is doing "when left to his or her own devices." And,
if undiscovered, the individual being monitored will not change venues. But covert monitoring by definition cannot deter, because the youth in question must be aware that monitoring is happening if it is to have an effect on his or her behavior.

Moreover, undertaking monitoring covertly leaves open the question of what the responsible adult should do, if anything, in the event that monitoring reveals that the child is behaving inappropriately. If the adult does nothing except watch, learning that is directly coupled to inappropriate access or behavior cannot occur, and the inappropriate behavior may well continue. Yet, if the adult does respond to such a revelation, he or she may be forced to disclose the fact of monitoring, with all of the attendant consequences (e.g., a child who reacts negatively because the adult is changing the rules from what was expected).

In principle, an adult could act without disclosing the fact of monitoring; for example, the adult may draw the child into a general discussion of appropriate Internet behavior without reference to any specifics that might be associated with the child's behavior. However, many adults are likely to find it difficult to act on such information without revealing the source.

Overt monitoring can deter. But it can also have negative effects, as described below.

If monitoring is coupled to explanations and guidance about appropriate and inappropriate behavior, there is some potential that this application can promote the long-term development and internalization of appropriate behavioral norms. But the explanation and guidance are essential. If, as is much more likely in an institutional setting and in many home situations, the primary or exclusive consequence of detection of inappropriate access is punishment, such learning may well not occur.
Even more destructive would be punishment resulting from inadvertent access to inappropriate material, as one can easily imagine might be imposed by an adult supervisor who did not believe an assertion by his or her charge that the inappropriate Web page was viewed by accident.

Finally, as with filtering, monitoring can be circumvented by a change of venue in which monitoring is not present.

12.2.3 Who Decides What Is Inappropriate?

Decision making is shared between adults and youth. It is the responsibility of responsible adults (e.g., parents and school officials) to provide general guidelines about what constitutes inappropriate material or behavior. However, it is the responsibility of the youth to interpret
these guidelines. And it is the interaction between adult and youth that can provide guidance in any particular instance. For those products that identify inappropriate material, the relevant decision makers are the same as those for filtering.

12.2.4 How Flexible and Usable Are Products for Monitoring?

Given the burden imposed on responsible adults when all access is monitored, some monitoring products make a record only when inappropriate material has been accessed. Of course, such a product requires definitions of inappropriate material, and all of the discussion above in Section 12.1 is relevant to such definitions.

Monitoring can also be intermittent. For example, a product may take a "snapshot" of a given computer screen that records its contents at random intervals that average once an hour. In this case, the auditing burden is directly proportional to the frequency of screen capture and the number of screens being monitored. Monitoring software that never records the screen when Web content from an innocuous site is shown further reduces the number of snapshots that need to be reviewed.

Real-time display of a child's screen can be performed at any level of resolution desired. In the instance when the supervisor's monitor displays simultaneously "screen shots" of multiple user screens (e.g., all of the monitors in use by a class), each image appears in a smaller "thumbnail" version that makes reading most text on those screens difficult or impossible, while at the same time usually enabling the supervisor to determine if an image is being displayed. Moreover, many inappropriate sexually explicit images are easy for humans to recognize even at low resolution and/or smaller size. The product might then offer the supervisor the chance to "zoom in" on this screen shot for closer examination, and perhaps capture the image for documentation purposes.
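The random-interval snapshot scheme described above can be sketched as follows. The one-hour average comes from the text; the choice of exponentially distributed gaps is our illustrative assumption (its memoryless property means a child cannot predict when the next capture will occur).

```python
# Sketch of random-interval snapshot scheduling averaging one
# capture per hour. Distribution choice is an illustrative assumption.
import random

MEAN_INTERVAL_S = 3600.0  # one snapshot per hour, on average

def snapshot_times(horizon_s, rng=random):
    """Yield capture times (seconds from start) within the horizon."""
    t = rng.expovariate(1.0 / MEAN_INTERVAL_S)
    while t < horizon_s:
        yield t
        t += rng.expovariate(1.0 / MEAN_INTERVAL_S)

rng = random.Random(42)
times = list(snapshot_times(8 * 3600, rng))  # one 8-hour school day
print(len(times))  # roughly 8 captures expected on average
```

The auditing burden scales exactly as the text says: the number of images to review is the capture rate multiplied by the number of screens, so halving the rate or suppressing captures of innocuous pages halves the review workload.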
Records of access may be kept or not, and if kept, at different levels of detail. For example, recorded screen images can be kept at high resolution, enabling the reading of words on the screen, or at low resolution, enabling only a general identification of pictures that may be on the screen. Keystrokes can be recorded for differing lengths of time, enabling a supervisor to know in detail what the minor has done. And, some monitoring packages allow the tracking of timing and sequencing of access that would help administrators to distinguish between intentional and unintentional access.

Tools for monitoring Web usage in the aggregate (without individual tracking of users) can provide important information on whether usage conforms to acceptable use policies. Such tools would not compromise individual privacy (indeed, individual logins would not even be required), but would provide data on the Web sites children visited and the Internet activities in which children engage. Such data could then be reviewed to determine the extent to which a given audience is in fact conforming to an AUP.

If the system is configured not to monitor Web browser windows where the same page has not been on the screen for at least some period of time, perhaps 15 seconds, and never to monitor e-mail where the same incoming item has not been on the screen for at least that period of time, then the young person may not have the anxiety that "the adults will discover I inadvertently saw this material, but won't believe it was unintentional." Youth will know that as long as they quickly determine that the content is inappropriate, and exit that Web site or that e-mail, no record will be established and no adult will know. To the committee's knowledge, current monitoring software does not have this "do not capture within threshold" feature, and the feature opens a potential loophole: a student can click through a series of pages of inappropriate Web content, as long as each page is on the screen for less than the threshold time.

12.2.5 What Are the Costs and Infrastructure Required for Monitoring?

Financial Costs

The primary financial cost of monitoring is the human effort and labor needed to monitor usage. Monitoring records can be extensive, and in the absence of automated tools to flag questionable accesses, the examination of extensive audit records is both tedious and time-consuming. (Recording screen images frequently for a large number of users also consumes hard disk space at a rapid rate.)

A second cost in the institutional setting is that the effort needed to manage individual logins is significant.
Login names and passwords must be managed, and procedures must be set up to deal with children who lose or forget passwords, passwords that become known to other individuals, revocation of passwords and login names, and so on.42

Psychological and Emotional Costs

Punishment that is given soon after undesirable acts are initiated is more effective in deterring a repetition of such behavior than is punishment administered long afterward, suggesting that monitoring systems

42For more discussion, see CSTB, NRC, Computers at Risk, 1991, and For the Record, 1997.
are unlikely to build positive habits in students unless feedback is received more quickly than may be practical in most situations. (Feedback is needed in minutes or hours, whereas review of access logs may take days or weeks. On the other hand, real-time monitoring can provide opportunities for feedback to the child when the offense occurs.) To be effective in deterring undesirable behavior, punishment must be consistent, which suggests that intermittent monitoring, which saves time and energy, will not be conducive to helping students learn to resist the temptation of seeking out inappropriate materials. To be a component of effective discipline also requires that the basis for punishment (or consequences) not be seen as arbitrary authority, but rather be accompanied by an explanation for why certain behavior is unacceptable.

A second point is that monitoring and privacy can be antithetical. While the desirability of privacy as an element of an appropriate developmental environment is a cultural issue, most of Western society places privacy in high regard, especially for adolescents, who are at a developmental stage during which they often seek some separation from their parents. A need for privacy is an essential component of separation as adolescents begin to create their own identity, an identity that includes an understanding of himself or herself as a sexual person.43 There are some personal issues that adults want to keep to themselves because they are embarrassed or have other feelings or behaviors that they do not want to share generally. Many children (especially adolescents) have those same feelings. To deny them that personal freedom by constant electronic monitoring may convey a lack of trust by an adult community that tells them that there is no personal space that belongs to them and them alone.
Monitoring can easily be regarded by youth as a violation of privacy and an unwarranted intrusion that demonstrates a lack of trust, and one common unfortunate consequence is that when mistrusted, an individual often proceeds to act in ways that justify that mistrust. Certainly parents do monitor their children's activities, but the balance of how much children and adolescents are watched varies, depending on characteristics such as age, gender, maturity, and parenting practices. In general, a child's need for personal freedom increases as he or she grows older.

43According to a survey by the Kaiser Family Foundation in 2001, teenagers place a high value on privacy with respect to their Internet usage: 76 percent of online youth agreed that "looking up information online is good because I can look things up without anybody knowing about it." Where looking for health information is concerned, 82 percent said that confidentiality is very important. A sizable minority of young people are concerned about the privacy of their online searches for information, with 40 percent saying they are worried that the computer might keep track of what they do online. See Rideout, 2001, Generation Rx.com: How Young People Use the Internet for Health Information.
Furthermore, children who are constantly watched by parents have less opportunity to develop their own internal controls over their own behavior. They have less opportunity to confront the challenges of life that ultimately develop character, for it is that struggle that makes us who we are. Parents cannot always watch their children. It is then that the effectiveness of socialization is put to the test, for it is what children do in the absence of a parent or an adult that tells of their character. A child who has not internalized parental values may well attempt to break the rules whenever an adult is not watching, for the rules are outside, not inside, the child. By contrast, a child who has internalized those rules generally follows them, regardless of whether he or she is being watched. Sometimes children will fail, but they can also learn from those mistakes.

At the same time, the level of privacy that students can expect in school, in using a computer as well as in other aspects of school life, is different from what they can expect at home, and school computer systems are not private systems. The expectation of privacy when students use computers in schools is more limited, as demonstrated by a variety of actions that have been supported in court decisions, including searches of student lockers, backpacks, and so on. Thus, provided that students have been given notice that their use is subject to monitoring, the use of monitoring systems raises fewer privacy concerns.

In libraries, privacy expectations have traditionally been fairly high. That is, libraries have traditionally protected the materials used by their patrons, and have even resisted the efforts of law enforcement authorities to investigate such use. Thus, monitoring in a library context, even with explicit notice, may violate such privacy traditions.
Note also that technological monitoring has a different psychological relationship to the one being monitored than does in-person oversight. Consider, for example, a student using an Internet access terminal in the school library. In one scenario, a school librarian walks the floor periodically; if she sees something suspicious on the student's screen, she walks over and asks, "What are you doing?" In a second scenario, the screen on the Internet access terminal is displayed on the school librarian's terminal (in her office) for several seconds at random intervals ranging from once every 5 minutes to once every 20 minutes. Five or ten seconds before the image of the screen is transmitted to the librarian's terminal, an unobtrusive but noticeable warning flashes on the student's terminal to indicate that monitoring is about to take place. In addition, the display on the librarian's terminal is blurred so that words cannot be read, but blurred images can appear, and no records of the screen on the terminal are kept. If the school librarian sees something going on that warrants attention, she can leave her office, walk over to the student, and ask, "What are you doing?"
For most people, the first scenario feels like responsible supervision. However, reactions to the second scenario are decidedly mixed, despite the fact that the monitoring system described does nothing that the librarian does not do. What accounts for this mixed reaction?

One factor is that the person being monitored from the librarian's office would have no particular assurance, beyond the say-so of the librarian, that any of these assertions would be true. However, where the monitoring takes place by walking the library floor, it can be seen that the librarian is not taking photographs of the library patrons or their screens.

But more importantly, the fact of technological monitoring is also likely to change the nature of the relationship between school librarian and student. In the first scenario, the interaction between librarian and student is a human one, and the events have unfolded for the reasons that one would expect: a chance encounter that leads to an open question. But in the second scenario, the school librarian approaches the student with apparent foreknowledge of what the student is doing, and the question is more disingenuous than open. An additional point is that the foreknowledge provided by the monitoring system invites the librarian to jump to conclusions about what the student is doing, and she will be less likely to give the student the benefit of the doubt when she does engage him. Under such circumstances, a useful educational experience is less likely to occur than if she approached the user with an open mind.

Infrastructure

While filters can be installed on the client side without the cooperation of any other party, real-time monitoring requires a mechanism for displaying the contents of one monitor on another. When tools for monitoring are used on a large scale, a sufficient number of responsible adults is necessary to provide in-person intervention.
(How many adults are required depends on how thorough the monitoring is.) An alternative is after-the-fact review of behavior or actions, a process that requires storage of logs, screen snapshots, and so on, and automated tools to flag suspect behavior. The administrative burdens can be sharply reduced if the records reflect only potentially suspect accesses and exposures rather than all Internet use.

12.2.6 What Does the Future Hold for Monitoring?

As noted above, one major difficulty with monitoring is the effort needed to review audit trails. Thus, there is a role for automated tools that can review audit trails to identify patterns of behavior that are worth further investigation. For example, tools could be available that:
· Identify problematic viewing, resulting from accessing known problematic sites or by analyzing material on Web sites for inappropriate content;

· Determine how long someone stayed on inappropriate sites or in what sequence sites were accessed; and

· Notify responsible adults in real time of access to inappropriate sites (e.g., in a home context, notification might consist of a light flashing on a monitor carried by the parent when a child using the Web is accessing a potentially inappropriate site, or an e-mail sent to a parent). Alternatively, a request for access to a potentially inappropriate site can be transmitted to a responsible adult for approval within a certain period of time (e.g., 1 minute). If approval arrives, access can be granted and the decision remembered by the system.

Some of the functionality described above is available in some monitoring products today, but it is far from common.

12.2.7 What Are the Implications of Using Monitoring?

In general, our society subjects criminals to a high degree of monitoring because they have proven untrustworthy. For the most part, individuals follow societal rules, not through constant monitoring and the invasion of privacy, but because they have learned to internalize the values underlying those rules. Put another way, if laws were followed only because of police monitoring and enforcement, then we would need as many police as other people to maintain law and order. Children make mistakes, and criminals make mistakes, but to be a child is not to be a criminal. Nevertheless, active supervision of children is often appropriate, not because they are criminals but because it is the responsibility of adults to teach them how to internalize the appropriate values and to become better at avoiding inappropriate behavior as they mature.
For example, responsible parenting always entails monitoring children in inverse proportion to their capability and maturity, something only a parent can ultimately determine. But as noted in Section 12.2.5, the wise parent couples monitoring with education and discussion in support of helping the child internalize the parents' values, an outcome that will help him or her to behave appropriately whether parents are watching or not. Parents have rights to what might be called the "imposition of sanctions," which, like any other part of parenting, fails if used alone.44 As always, the density of loving wisdom in a parent's

44Even these rights have limits. Parents, for example, cannot subject their children to abuse in the name of discipline.
actions is everything, and it is the nature of the existing parent-child relationship, the intent of the monitoring process, and how it is carried out that counts in understanding its ultimate implications.

Is the monitoring of children and adolescents a step in the erosion of privacy for all citizens? Many people believe that it is, and point to other measures that they find equally disturbing: employers who track the behavior of their employees, and commercial online sites that track the behavior and clicks of those who pass through them. The monitoring of children raises special concerns because of the fear that a younger generation will grow up never having known a world in which they had rights to privacy, and thus never realizing what rights they might have lost.

Others argue that such fears are overplayed, pointing to social and commercial benefits of increased customization of information delivery and an assumed lack of government interest in the affairs of ordinary people, as well as the fact that schools are expected to act in loco parentis with respect to the students in their care. Indeed, some believe that it would be a positive development in society if adults in all venues felt some responsibility for looking after the welfare of children and for supervising children when they are in a position to do so.45 Resolving this question is beyond the scope of this study, but noting the question raised by monitoring of children is certainly not.46

12.2.8 Findings on Monitoring

1. Monitoring that warns when exposure to inappropriate material may occur is an alternative to filtering and eliminates the problem of overblocking associated with filtering.

2. Overt monitoring in concert with explicit discussion and education may help children develop their own sense of what is or is not appropriate behavior. Monitoring coupled primarily with punishment is much less likely to instill in children such an internal sense.
In general, the simple presence of monitoring equipment and capabilities (or even the assertion of such capabilities) may create a change in behavior, though the change in behavior is likely to be restricted to the situation in which monitoring occurs.

3. Because human intervention is required on a continuing basis, monitoring is more resource-intensive than filtering. For the same reason, monitoring is more likely to be construed as a violation of privacy than are other techniques that simply block access.

45In this instance, there are debates about the role of technology in supervising children vis-a-vis an in-person adult doing so.

46A current CSTB study on privacy in the information age will address these issues.
4. Covert monitoring leads to an entirely different psychological dynamic between responsible adult and child than does overt monitoring. (Furthermore, because people habituate to warnings, children may respond to overt monitoring as though it were covert, i.e., more negatively.)

12.3 TOOLS FOR CONTROLLING OR LIMITING "SPAM"

"Spam," e-mail that is similar to the "junk mail" that an individual receives through the post office in the brick-and-mortar world, is sent unsolicited and indiscriminately to anyone with a known e-mail address. E-mail addresses can be purchased in bulk, just as regular mailing lists can be purchased: a typical rate for buying e-mail addresses is 5 million addresses for $50. Alternatively, e-mail addresses can be found by an e-mail address "harvester." (See Box 12.6 for more details.) Spam
318 YOUTH, PORNOGRAPHY, AND THE INTERNET refers to any form of unsolicited e-mail a person might receive, some of which might be sent by publishers of adult-content Web sites. A typical spam message with sexual content would contain some "come-on" words and a link to an adult-oriented Web site, but would in general arrive without images. Policy issues associated with spam are addressed in Chapter 10. 12.3.1 What Are Technologies for Controlling Spam? Technologies for controlling spam fall into two categories tools that seek to conceal the e-mail address (because if an e-mail address is not known, spam cannot be sent to it) and tools that manage spam once it has been received. Whether an individual can implement such tools varies with the ISP and/or e-mail service used. To conceal e-mail addresses with certain ISPs, one can create different login names. For example, online services such as AOL enable a user to create more than one login name that can serve as an e-mail address. An individual can thus use this special login name for activities that might result in spam (e.g., for participating in a chat room). This special name becomes the attractor for spam, and mail received at that address can be deleted at will or even refused. A number of services (some free, some not) automate this process, enabling users to create special-purpose ad- dresses that can be turned off or discarded at will. In addition, e-mail systems may allow a user to disallow all e-mail except e-mail from a specified list of preferred addresses and/or domain names. To manage spam that does reach the user's mailbox, a number of tools are available. Most of these tools depend on the ISP or the use of an e-mail program with filtering capabilities (e.g., Eudora, Netscape Messen- ger, Microsoft Outlook). Spam e-mail can be identified and blocked on the basis of: · Content. Content-based analysis examines the text, images, and attachments to e-mails to determine its character. 
(The technology for content-based analysis is much like that for content-based filtering, as described in Section 2.3.1.)

· Source. E-mail being received from a particular e-mail address or a particular domain name may be defined as spam after a few examples have been received. AOL mail controls are based in part on identifying certain sources as spam sources.

· Addressees. For the most part, spam mail is not explicitly addressed to a specific individual. Instead, a spam e-mail is addressed to a large number of people in the "blind copy" field. (On the other hand, "blind
copies" (bcc: foo@example.com) sent to an individual and e-mail sent through mailing lists to which an individual has subscribed also make use of the hidden address technique.) Mail filters (e.g., one based on "procmail," a mail processing system for Unix and some other platforms) can check and file or delete an e-mail if it arrived at the user's location via blind copy addressing. (Steps can be taken to set up exceptions for mailing list messages and bcc: messages from legitimate correspondents.)

Users can also take a number of procedural measures. For example, Web sites often ask for information from the user. By inserting false information (e.g., indicating an income lower than is actually true), it is sometimes possible to avoid marketing attacks based on demographic information consistent with the group being targeted by the marketers.

Internet service providers also take measures to limit spam. For example, AOL limits how fast other users can repeatedly enter and exit chat rooms, because a pattern of repeatedly and rapidly entering and exiting chat rooms can also be an indication that someone is harvesting e-mail addresses. Most ISPs also have lists of known spammers from which they refuse to carry traffic.

12.3.2 How Well Do Spam-Controlling Technologies Work?

Spam-control technologies for dealing with e-mail that has arrived in one's mailbox suffer from the same underblocking and overblocking issues that are discussed in Section 12.1.2. One important issue is that spam often contains links to inappropriate sexually explicit material rather than the actual material itself, and no content-screening spam-controlling tool known to the committee scans the content for links that may be embedded in an e-mail. That said, some spam-controlling technologies are highly effective against spammers.
Those that restrict the e-mail that can be received to a given set of senders (i.e., that do not accept mail unless the sender is on a list of permissible senders or comes from specific domains) are very effective. On the other hand, they also sharply restrict the universe of potential contacts, so much so that a user may fail to receive desired e-mail. (For example, a friend who changes his or her sending e-mail address will not be able to reach someone who has identified a white list of preferred senders.)

ISP-based or e-mail-service-based spam filters are partially effective. For example, the researcher mentioned in Box 12.6 found that the spam filter on one popular free e-mail service reduced the volume of spam by about 60 percent, though it still passed more than one message per day. Spam filters that are based on content analysis techniques have all of the problems with false positives and false negatives that Web filters have.
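The mechanisms described above, a sender whitelist, a check for blind-copy addressing, and a content scan that also inspects embedded links (the gap noted in Section 12.3.2), can be sketched in a few lines of Python. All of the names, keyword lists, and domains below are invented for illustration; no vendor's actual rules are being reproduced.

```python
import re
from email.message import EmailMessage

# Hypothetical configuration for this sketch only.
ALLOWED_SENDERS = {"friend@example.com", "teacher@example.org"}
SUSPECT_WORDS = {"xxx", "adults only", "hot pics"}
BLOCKED_LINK_DOMAINS = {"adult-site.example"}

URL_RE = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

def is_spam(msg: EmailMessage, my_address: str) -> bool:
    sender = msg.get("From", "").strip().lower()
    # 1. Whitelist: mail from known correspondents is always accepted.
    if sender in ALLOWED_SENDERS:
        return False
    # 2. Addressee check: spam is usually blind-copied, so the
    #    recipient's own address is absent from the To: and Cc: fields.
    visible = (msg.get("To", "") + "," + msg.get("Cc", "")).lower()
    if my_address.lower() not in visible:
        return True
    # 3. Content check: suspect phrases, or links to blocked domains
    #    (spam often carries only a link, not the material itself).
    body = msg.get_body(preferencelist=("plain",))
    text = body.get_content().lower() if body else ""
    if any(word in text for word in SUSPECT_WORDS):
        return True
    for domain in URL_RE.findall(text):
        if domain.lower() in BLOCKED_LINK_DOMAINS:
            return True
    return False
```

As the chapter notes, the whitelist rule (step 1, combined with rejecting everything else) is the most effective and the most restrictive; steps 2 and 3 trade some of that certainty for a wider universe of acceptable mail, and inherit the false-positive and false-negative problems of content filters.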
12.3.3 Who Decides What Is Spam?

Some spam filters have preconfigured lists of known spammers. But in general, it is the user who must decide what is spam and what is not. Of course, the difficulty, especially for a child, is to recognize spam without opening the e-mail. In some cases, it is easy to recognize from the header or subject line. But many spam messages reveal themselves only when they are opened. (Note also that one person's spam is another person's service or information. Unsolicited notices for ski vacations or material on a local political candidate may be useful to some individuals and useless to others.)

12.3.4 How Flexible and Usable Are Products for Controlling Spam?

Because many ISPs filter out spam for most users, users of those ISPs need not take any action at all to reject spam. However, when spam leaks through the ISP filters (or if e-mail is not filtered for spam at all, as is typical of most e-mail), the user must take action.

Note that unsolicited e-mail, and the resources and attention it consumes, is not limited to sexually explicit e-mail for youth. It would be reasonable to assume that the number of parties sending unsolicited e-mail, the frequency with which they send it, and the volume that they send will all increase. Therefore, approaches to this problem are likely to be developed, regardless of the concerns about youth and sexually explicit material. However, this can easily turn into another race: as better spam-discriminating technologies are invented, alternative ways of wrapping the unsolicited e-mail are invented, and the cycle continues.

12.3.5 What Are the Costs and Infrastructure Required for Using Spam-Control Products?

Spam can be controlled in a variety of locations. When control is located at the receiver, locally installed software spam filters can help to process and eliminate spam. Conceptually, the cost of local spam filters is similar to that for content filters.
However, spam filters must be integrated into software for processing e-mail in general.

Control points based in the network (or at an ISP) are both more complex and more comprehensive. Some ISPs have extensive capabilities for detecting the sending of spam mail (e.g., by monitoring the volume of e-mail sent in a given time interval), preventing the "harvesting" of e-mail addresses, and so on; developing these capabilities entails substantial effort.
Individual organizations often incur cost in configuring software and servers to stop spam from going through their own internal networks. Such efforts are often undertaken to help manage internal bandwidth more effectively.

Finally, there may be costs incurred for infrastructure needed to support legislative efforts to curb spam. For example, one method for dealing with junk mail and phone telemarketers is to establish a clearinghouse where people can register their names, addresses, and phone numbers. But the effectiveness of this approach rests on the fact that it is in the marketer's self-interest to refrain from wasting phone and mail effort and time on people unlikely to buy. Because sending spam is so much cheaper than mail and phone calls, a similar approach is unlikely to work effectively without some kind of legal cause of action that can be taken against those who ignore the clearinghouse. (Policy-based solutions are discussed in Chapter 9.)

12.3.6 What Does the Future Hold for Spam-Controlling Systems?

There has been an acceleration of commercial organizations introducing their messages into schools, although almost always after signing an agreement with the school board (an agreement that usually includes new funds flowing to the school to supplement the budget). However, schools may wish to install some mail filtering before the marketing department of some soft-drink manufacturer decides to send e-mail to students just before lunch, promoting its product while also, to prevent uproar, giving "the spelling word of the day," "the math hint of the day," or whatever. It is easier for the school district to add another item to the spam filter than to have its lawyer sue the sender of the e-mails. As in the case of age verification technologies, expanded use of "mail deflection" beyond issues of sexually inappropriate material may warrant the trouble of installing spam-controlling systems.
12.3.7 What Are the Implications of Using Spam-Controlling Systems?

As described in Chapter 9, legislative efforts to curb spam do have societal implications.

12.3.8 Findings on Spam-Controlling Technologies

1. Spam-controlling technologies generally do not allow differentiation between different kinds of spam (e.g., hate speech versus inappropriate sexually explicit material). Rather, they seek to identify spam of any nature.

2. Spam-controlling technologies that filter objectionable e-mail have more or less the same screening properties that filters have. That is, they do block some amount of objectionable content (though they do not generally screen for links, which are often transmitted in lieu of actual explicit content). However, they are likely to be somewhat less effective than filters at preventing such e-mail from being passed to the user, because users are likely to be more apprehensive about losing e-mail that is directed toward them than about missing useful Web sites, and thus would be more concerned about false positives.

3. Behavioral and procedural approaches to avoiding spam (rather than filtering it) have at least as much potential as spam-controlling technologies to reduce the effect of spam. However, using such approaches adds somewhat to the inconveniences associated with Internet use.

12.4 INSTANT HELP

The technologies discussed in Sections 12.1 and 12.2 are intended to prevent the exposure of children to inappropriate material. Instant help is a tool to deal with exposure after the fact.

12.4.1 What Is Instant Help?

The philosophy underlying instant help is that from time to time children will inevitably encounter upsetting things online: inappropriate material, spam mail containing links to inappropriate sexually explicit material, sexual solicitations, and so on. When something upsetting happens, it would be helpful for a responsible adult to be able to respond promptly. An "instant help" function would enable the minor to alert such an adult so that appropriate action could ensue, and could provide another channel to law enforcement through which threats, solicitations, and obscene materials or child pornography could be reported.
To the best of the committee's knowledge, there are no commercially available tools that provide instant help. But current technology could easily support an instant help function. For example, a secure one-click call for help47 could be:

47Security for this "one-click" button is an important element of help: the functionality of the button must not be disabled, as it is in mousetrapping, when the "back" button sends the user to a new adult-oriented Web site.
· On ISPs, an "always-on-top" button that would connect the user directly to a trained respondent;

· On Internet browsers, a "plug-in" represented by a button on the toolbar;

· On search engines, an icon that would always be displayed on the results page;

· On e-mail readers, an "I object to this e-mail" button on the toolbar;

· On a computer desktop, a button that activates an application that allows a remote helper to take over control of the user's machine and view the screen.

These buttons might be as simple as an icon for the CyberTipline (CTL) that would serve as an easily accessible channel for the public to use in reporting child pornography. The CTL icon has proven to be an effective tool in reporting obscene material or child pornography because it is user-friendly and is the most direct method to report such images to the appropriate law enforcement authority. Because the CTL icon was built for the sole purpose of interfacing with the public to facilitate the reporting of computer-assisted crimes against children to law enforcement, it is more effective than other mechanisms for such reporting.

Depending on the context of the technology through which the user is coming into contact with inappropriate content or interactions, a wide range of functionality is possible once the button is clicked. For example, "instant help" on a browser or an ISP could immediately connect the user to a helper who provides assistance. To provide context, an image of the screen could be transmitted to the helper. Such assistance might be most useful if the user encounters a solicitor or inappropriate conversation in a chat room or an instant message. Or, if a user encounters inappropriate material, the last several Web pages viewed could be shared with the helper, who could assist the user in whatever action he or she wished to take (e.g., sending URLs to the CyberTipline).
For Internet access on a LAN, instant help could be configured to summon assistance from a responsible adult within the LAN, such as a teacher or a librarian.

Instant help would be applicable in both home and institutional contexts. Implementing instant help functionality must be undertaken by service providers and technology vendors. But such functionality is not helpful without a human infrastructure to assist those seeking help; the human infrastructure may be provided by the ISP, a parent, a school, a library, or even an expanded National Center for Missing and Exploited Children (NCMEC).
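A client-side piece of such a function could be quite small: keep a short ring buffer of recently visited pages, and, when the button is pressed, bundle that context with a note into a report for whichever helper the parent, school, or library has configured. The sketch below assumes invented names (`InstantHelpClient`, the helper address) and is not a description of any deployed product.

```python
from collections import deque
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class InstantHelpClient:
    # Who receives the report: an ISP respondent, a teacher,
    # a librarian, etc. (configured by the responsible adult).
    helper_address: str
    # Ring buffer of the last few pages viewed, for context.
    history: deque = field(default_factory=lambda: deque(maxlen=5))

    def record_visit(self, url: str) -> None:
        self.history.append(url)

    def build_report(self, child_note: str = "") -> dict:
        # Bundle recent context automatically, so the child need not
        # describe the upsetting material in detail.
        return {
            "to": self.helper_address,
            "when": datetime.now(timezone.utc).isoformat(),
            "recent_pages": list(self.history),
            "note": child_note,
        }
```

Because the last several pages are attached automatically, the helper gets context without forcing the child to dwell on the material, a concern raised in Section 12.4.7; transmitting the report securely and authenticating the helper are separate problems that this sketch does not address.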
12.4.2 How Well Might Instant Help Work?

By providing assistance to the minor, instant help could potentially reduce the negative impact that may result from exposure to inappropriate material or experiences. Such exposure can come from either deliberate or inadvertent access, but in practice instant help is more likely to be useful in the case of inadvertent access. Instant help obviously does not enable the user to avoid inappropriate material but does provide a means for trying to cope with it. It also provides opportunities to educate children about how to avoid inappropriate material or experiences in the future, and it might lead to the creation of more civil norms online. It provides immediate assistance in the case of aggressive solicitations and harassment. Finally, it might lead to greater law enforcement activity if the materials involved are obscene or constitute child pornography.

Metrics of effectiveness that indicate the extent to which children are not exposed to inappropriate materials do not apply to instant help. Rather, effectiveness is better assessed on the basis of the quality of the assistance that helpers can provide and the responsiveness of the instant help function. Assistance that arrives 20 minutes after the user has pressed the instant help button is obviously much less helpful than assistance that arrives in 5 seconds, and of course, human helpers must be trained to handle a wide variety of situations. A specialist trained to provide this kind of help to youth, or a peer with special training, could potentially be more effective than the child's own parent or teacher or librarian. However, because this approach has never been implemented on a wide scale, staffing needs for instant help centers are difficult to assess.
In many urban areas, crisis intervention hotlines exist (focused on helping people subject to domestic abuse, feeling suicidal, struggling with substance abuse addictions, and so on), but none known to the committee give training to their volunteer staffs concerning children's exposure to sexually explicit material on the Internet.

12.4.3 Who Decides What Is Inappropriate?

Unlike other tools, the locus of decision making in the context of instant help rests with the minor. The minor decides what is upsetting and determines the situations in which he or she needs help.

12.4.4 How Flexible and Usable Is Instant Help?

The purpose of an instant help function is to ensure that something can be done with very little difficulty. Thus, the flexibility and usability of an instant help function are paramount.
For example, individual parents, libraries, or schools could customize who is contacted when the instant help button is pressed. Thus, a family with strong religious ties could set instant help to alert helpers from a group associated with their religious tradition, while a school district could set the instant help button so that in the elementary school, a message went to a staffer in that building, and in the middle school, to a staffer in the middle school building. This is in some sense analogous to the national phone emergency number 911 going to a local 911 dispatch center based on the exchange from which 911 was dialed.

12.4.5 What Are the Costs and Infrastructure Required for Instant Help?

The infrastructure and institutional cooperation needed for instant help to work successfully are considerable. Vendors must be willing to use precious screen space to provide instant help buttons. The infrastructure of helpers must be developed and deployed. For an ISP, such an infrastructure might well be expensive; for the NCMEC or law enforcement agencies, it would be very expensive. But for a school or library (or even for responsible adult guardians), the infrastructure of helpers may already be in place.48

The costs are roughly proportional to the size of the helper infrastructure; helpers (who could be volunteers) must be trained in how to respond to a call for help.

Note also that a skilled adult predator, or even adolescents bent on mischief, could create a flood of diversionary instant help requests so that the responding individuals would become backlogged, during which time the predator could attempt to continue an interaction with a young person. Thus, some mechanism for protection from "flooding attacks" would be needed by any responding center that serves a large number of anonymous end users or devices.

12.4.6 What Does the Future Hold for Instant Help?
To the committee's knowledge, instant help functionality has not been implemented anywhere, and it remains to be seen whether children would actually use it if and when they are confronted with inappropriate material or

48It is true that in schools or libraries a child should be able to request help from these individuals without instant help features. The primary advantage of clicking an instant help icon is that it can be done privately and without drawing attention from other users.
experiences that upset them. Thus, some small-scale piloting of the concept to evaluate how it might work is likely to be very helpful before any major effort to implement instant help is considered.

12.4.7 What Are the Implications of Using Instant Help?

A potential downside of a "low-cost" implementation that would require the child to describe the material and how he or she got there is that the child might be forced to focus more on the inappropriate material, perhaps causing at least discomfort to a child who may be better off if transferred back to appropriate activities as soon as possible. Such a negative outcome could be avoided if the inappropriate material could be automatically transmitted to the helper. (Since the material may well not be present on the child's screen when he or she contacts the helper, the automatic display of material might have to retrieve the last several screens; this may be a difficult technical task under some circumstances.)

In cases where a new type of offensive material or communication begins to occur for the first time on the Internet, the first instant help response center to identify this new material could share that information with schools and parents, other instant help response centers, youth (as warnings), or even filtering vendors. In that sense, instant help might harness all youth who use it to improve the monitoring of the Internet for new offensive material or communication. Dissemination of the insights of the staff of an instant help center should be considered a networked response, as opposed to the response of assisting a child when requested. The Internet technical community has experience with networked response in the CERT system to propagate information about worms, security holes, and the like.
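The networked-response idea sketched above amounts to a publish-subscribe arrangement: the first center to identify new offensive material publishes an advisory, and subscribers (other centers, schools, filtering vendors) act on it. The toy model below illustrates only the propagation pattern; the class and field names are hypothetical, and a real advisory system would need authentication, review of submissions, and protection against abuse.

```python
class AdvisoryHub:
    """Toy CERT-style hub: publishes advisories to all subscribers."""

    def __init__(self):
        self.subscribers = []   # callables invoked once per advisory
        self.advisories = []    # archive of everything published

    def subscribe(self, callback):
        # A subscriber might be a school's filter, another help
        # center, or a vendor's update service.
        self.subscribers.append(callback)

    def publish(self, signature: str, description: str) -> None:
        advisory = {"signature": signature, "description": description}
        self.advisories.append(advisory)
        for notify in self.subscribers:
            notify(advisory)
```

For instance, a school district could subscribe a callback that appends each advisory's signature to its local spam or Web blocklist, so that a pattern first seen at one help center is blocked district-wide without manual intervention.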
12.4.8 Findings on Instant Help As the committee is unaware of any implementation of instant help that fits the description above, there are no findings to report.