11 A Perspective on Technology-Based Tools

There are a number of tools for protecting children from inappropriate Internet material and experiences. Most common are filters that attempt to block certain types of content, but tools for monitoring usage, verifying age, and protecting intellectual property fall into this domain as well. While each of these tools offers some degree of protection, there are many factors that enter into choices about what technology, or technologies, should be used, or whether technology is appropriate at all.

11.1 TECHNOLOGY-BASED TOOLS

As in many other areas of life, the Internet is an arena in which many adults (mostly parents) attempt to stay aware of their children's activities and some young people, particularly adolescents, attempt to evade parental oversight. Technology-based tools for protecting children from exposure to inappropriate Internet material and experiences promise "hard" security against unknown threats and offer to compensate for parental lack of knowledge about how to understand and control the Internet usage of their children. Because they appear to promise such security, it is easy to believe that all that must be done is to install the technology and then one can forget about the problem.

To be fair, technology vendors rarely make such claims explicitly. But the rhetoric of public discourse about technology solutions to "the problem" most definitely has such a tone. Indeed, the advocacy of technology-based solutions has much of the same tone as commercials in
which cereal is seen to be "part of a balanced breakfast," a qualification of approximately 1 second in a 30-second commercial extolling the virtues and pleasures of the cereal.

Moreover, technology that helps to create a problem and technology that helps to solve it are another instance of the familiar measure/countermeasure game. In banks, better safes inspire bank robbers to develop better methods for cracking safes, which in turn inspire still better safes. When safes become too hard to crack, bank robbers can turn to high-tech fraud as a way of draining money from banks, starting the cycle all over again in a different domain. This implies that no technological solution is durable.

The desire for simple, inexpensive, decisive technology-based solutions is understandable. But as noted in Chapters 8 and 10, a strong infrastructure of social and educational strategies that help children develop an internal sense of appropriate behavior and response is foundational for children's safety on the Internet. Technology-based tools can serve useful roles in much the same way that "training wheels" have a useful role in teaching children to ride bicycles. In addition, technology can strengthen the positive effects of good parenting, and serve as a backup for those instances in which parents are temporarily inattentive (as all parents are from time to time).

For purposes of this report, tools are defined as information technology devices or software that can help to reduce the exposure of children to inappropriate material and experiences on the Internet. These devices or software can be installed in any one of a number of locations. Material on the Internet originates at a "source." It is then transmitted through a variety of intermediate points and is finally displayed on the user's screen.
Inappropriate material can be identified at any point before the material appears on the user's screen, allowing some appropriate action to be taken at any of these points. Box 11.1 describes in greater detail some of the points of content identification and control.

It is also worth noting that there are technological and business pressures that are likely to ameliorate the problem. These include the following:

· The development of most industries follows a pattern of innovation, copycat, and then shakeout. The wide proliferation of adult Web sites suggests that the industry is in its copycat phase. If the industry continues on the traditional trajectory, shakeout in the industry is likely to occur in the future. If so, the remaining players are likely to demonstrate more corporate responsibility in differentiating children from adults in giving access to their products and services, although non-commercial sources of sexually explicit material are likely to be unaffected by this trend.
YOUTH, PORNOGRAPHY, AND THE INTERNET

· Decentralization of the Internet (discussed in Section 2.1.2) is an enabler for a variety of technology-based tools that can be deployed by end users (e.g., individual families, schools, and libraries) to increase the range of options available to help parents and other responsible adults fulfill their responsibilities.
· Some technological developments, such as the trend away from open chat rooms to closed instant message rings, will make it more difficult for spammers and molesters to find individual victims.
· As today's children become parents, the generational divide in technical knowledge and sophistication may begin to close.

These comments should not in any sense be taken to mean that technology-based tools themselves are useless or unnecessary, and the remainder of this chapter, as well as Chapters 12 and 13, describe how such tools might be useful. Nevertheless, the statements immediately above are collectively a message that not all technology or business trends bode ill with respect to the exposure of children and youth to inappropriate sexually explicit material and experiences on the Internet.
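The chain of control points described earlier in this section (a "source," intermediate carriers, and finally the user's screen) can be made concrete with a deliberately simplified sketch. The blocklist and host names below are hypothetical examples, and real filtering products (discussed in Chapter 12) are far more elaborate; the point is only that the same decision logic can run at any point before material is displayed.

```python
# A naive sketch of content control at one point in the chain: the
# user's own machine. The same check could instead run at a school
# or ISP proxy; only the location of the decision changes, not its
# logic. The blocklist and host names are hypothetical examples.
from urllib.parse import urlparse

BLOCKLIST = {"adultsite.example", "gambling.example"}

def allow_request(url: str, blocklist: set = BLOCKLIST) -> bool:
    """Decide, before anything is displayed, whether to fetch a URL."""
    host = urlparse(url).hostname or ""
    return host not in blocklist

print(allow_request("http://news.example/story"))      # True: not listed
print(allow_request("http://adultsite.example/page"))  # False: blocked
```

A list-based check like this corresponds to the human-in-advance review described in Section 11.2; a tool that instead examined the content itself as it arrived would replace the set lookup with some form of automated analysis.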
11.2 CONTEXTUAL ISSUES FOR TECHNOLOGY-BASED TOOLS

All tools to protect someone from inappropriate content require judgments about what content should be deemed inappropriate. While all decisions about what is inappropriate are derived from human judgments, the decision regarding any given content can be made by a computer program that seeks to mimic these human judgments (and examines the content itself as it is coming into the computer) or by people who examine that specific content, generally in advance (and sometimes far in advance) of an actual attempt to access this content.

Not all tools, or even a given type of tool, are equally adept at identifying all kinds of inappropriate material. For example, for reasons described in Chapter 2, the identification of certain types of inappropriate sexually explicit material (e.g., that which is found on adult Web sites) is considerably easier from a technical standpoint than the identification of other types of sexually explicit material or other types of content that may be judged inappropriate (e.g., material on bomb making, hate speech, religious cults).

Chapters 12 and 13 address several generic categories of tools. But before turning to those tools, it is helpful to make a number of comments that apply across most technological options for protection.

· The party that decides that a given tool is appropriate is almost always the party that must manage its use. Management of a tool includes decisions about setting it up initially, maintaining it, and configuring it so that it does the appropriate things in the particular environment. It also entails decisions about the appropriate users (i.e., the children or youth in question) and when and under what circumstances they are subject to the restrictions imposed by the tool.
Further, the decision-making process for considering the use of a given tool must consider a wide variety of factors that may include some emanating from external sources (e.g., government).
· Technology solutions are brittle, in the sense that when they fail, they tend to fail catastrophically. In general, the catastrophe is not that they suddenly allow the child to access all possible kinds of inappropriate material (though this can sometimes happen), but rather in the sudden violation of the expectation that the given technology would not fail. A child who has not been educated about what to expect and how to deal with problematic material found on the Internet, but has been "protected" by technology alone, will not have the coping skills needed to deal with such exposure.
· Ease of use is a major factor in the implementation of any technical tool. Tools do not provide effective protection if they are so difficult to use that
they go unused, and the complexity of a tool's setup and ongoing maintenance is a major factor in a tool's suitability. As a general rule, the "default" settings of a tool are the ones that are most often used. And, there is a distinct trade-off between simplicity of use and customizability to a user's specific preferences. Customization may, for example, require a user to specify preferences in many different domains: partial frontal nudity is acceptable while full frontal nudity is not (except in images of classical art); violence is acceptable, while religious cults are not; and so on. But a user, faced with such choices, often tends to opt for the simplest solution, which is likely to be "not OK" for all of the specified domains because of the defaults built into the software by the vendor. Further, the wide variety of computing environments often forces vendors into requiring a "setup" procedure to adapt the tool to the user's particular hardware/software configuration. And, because of the constantly changing nature of the Internet, the tools must constantly be updated in order to remain current and valid. This puts the onus on the user to acquire a fair degree of technical know-how. Unless this step can be made easy for the user, only the most skilled or dedicated users will bother to use such a tool.
· As a general rule, most technology-based tools can be circumvented with sufficient effort. Furthermore, the history of information technology suggests that a method of circumvention, once discovered, is often proliferated widely.1 Not everyone is privy to such information, of course, but those who care about the topic, and about circumvention, can usually find it with a relatively small effort.
What technology can do is to pose barriers that are sufficient to keep those who are not strongly motivated from finding their way to inappropriate material or experiences, and the fact that technology can be circumvented does not mean that it will always be circumvented. For many people, circumvention will not be worth the effort. For others, the circumvention techniques will not always be available. Still others may not receive the word on circumvention. Sometimes, circumvention may be illegal even if feasible. For such reasons, technology-based tools have utility even though circumvention techniques exist. Nevertheless, as most parents and teachers noted in their comments to the committee, those who really want to have access to inappropriate sexually explicit materials will find a way to get them, and technology is relatively ineffective in the long run against those who are strongly motivated. From this point it follows that the real challenge is to reduce the number of children who are strongly motivated to obtain inappropriate sexually explicit materials. This, of course, is one focus of the social and educational strategies described in Chapter 10.

1Specifically, when the circumvention is based on a software technique (as it has usually been to date), the circumvention can be easily broadcast at very low cost to many individuals. When changes to hardware are involved (as happens relatively rarely), proliferation of such changes is more difficult. A number of high school students told the committee that once one of them finds a circumvention of some sort, he or she shares it with other interested students almost immediately. For example, a high school student developed a way to bypass his school district's filtering system, and publicized it by sending an e-mail to every teacher and administrator in the district. See <http://www.salon.com/tech/feature/2001/06/14/net_filtering/index1.html>.

· Tools can and do improve over time as more effort is put into research and development. Some improvements are possible with better design and implementation of known technologies. Other improvements await advances in the underlying technologies and may eventually be incorporated into technology-based tools. However, almost by definition, technological improvements are likely to be evolutionary rather than revolutionary, and so it is unwise to base any approach to protecting children and youth from inappropriate Internet materials and experiences on the hope of revolutionary technological breakthroughs.
· Tools such as filters that are implemented on the local client machine (i.e., at the receiver's point of interaction) tend to be easier to circumvent than those implemented elsewhere (e.g., those embedded in the network or in the enterprise that provides service). The reason is that tools co-located with the receiver are more readily accessible to the potential circumventer, and thus more subject to inspection, manipulation, and unauthorized or improper alteration or disabling. Moreover, tools that run on the client machine add an additional layer of complexity that can make a computer less reliable and more prone to lockups and system crashes.
· Because currently deployed technologies do not yet support access policies associated with an individual rather than a workstation, a simple change of venue (i.e., a movement of the child or youth to another place where the Web can be accessed) is often all that is necessary to defeat the most effective technological tools for protection. A change of venue may be a deliberate attempt to avoid technologically imposed restrictions, as in the case of students using home computers with Internet access to bypass filtered access at schools. Alternatively, it may be entirely accidental, in the sense that a venue that the minor uses for reasons of convenience on some occasions, for example, may not offer the same technological tools in its computing environment as the ones that he usually uses.
· One of the principles underlying the architecture of the Internet is that to the extent possible, functionality resides at the end points of a communication, rather than in the middle. Thus, to the extent that end-to-end encryption is used, on-the-fly content identification by the content carrier (e.g., the Internet service provider (ISP)) is impossible, and therefore interdiction based on such identification is impossible. (In practice, this fact gives the interdictor of content only two choices: to allow all unknown traffic to pass, or to block all unknown traffic. Allowing all
unknown traffic to pass is likely to permit some inappropriate content through, while blocking all unknown traffic is likely to block some appropriate content.)
· As with social and educational strategies, informed decisions about the use of technology-based tools (whether to use them and if so, which one(s)) must take into account the developmental stage of the children for whose benefit they would be deployed. Some tools are most appropriate for younger children, who are presumably more impressionable, less experienced in the ways of the world, and less skilled in the use of information technology, and for whom the consequences of exposure to inappropriate material of any sort might be considerable. Other tools, perhaps allowing more discretion on the part of the user, might be more appropriate for older youth who are more experienced and mature.
· Improvements in technology can be rapidly deployed compared to the time scale on which social and educational strategies change. That is, communities may be involved in the decision to use technology-based tools, but do not generally get involved in the technical design or implementation of those tools, which are usually within the discretionary purview of the tool designer and vendor. By contrast, social and educational strategies for a community (though usually not for individual families) are often extensively debated, and once debated, an extensive training effort is then needed to promulgate a new approach.
· The deployment of technological tools entails some financial cost, both initially and in ongoing costs.
Thus, one potential social inequity is that those lacking in resources will be denied the benefits of various tools, a "digital divide" issue.2 Advocates of using tools might argue that it would be desirable for all people, rich and poor, to enjoy the protection benefits that such tools confer, but if economics make such equity impossible, better for some to enjoy such benefits than for none to do so. On the other hand, detractors of tools, especially of tools whose use is made mandatory, can argue that laws such as CIPA, which make the use of filters mandatory in exchange for e-rate funding, force the problems of filters on the poor.

2For example, in 2000, 54 percent of public schools with access to the Internet reported that computers with access to the Internet were available to students outside of regular school hours, and secondary schools were more likely than elementary schools to make the Internet available to students outside of regular school hours (80 percent compared with 46 percent). Large schools (1,000 or more students) were more likely than medium-sized and small schools to make the Internet accessible to students outside of regular school hours (79 percent compared with 53 and 49 percent, respectively). In addition, schools with the highest minority enrollment reported Internet availability outside of regular school hours more frequently than schools with the lowest minority enrollment (61 percent compared with 46 percent). Such statistics suggest that schools do provide a considerable amount of out-of-class Internet access for many students, and it is likely that many of these students do not have Internet access at home. (Statistics are taken from A. Cattagni and E. Farris, 2001, Internet Access in U.S. Public Schools and Classrooms: 1994-2000, NCES 2001-071, Office of Educational Research and Improvement, U.S. Department of Education, Washington, D.C.) A Kaiser Family Foundation/NPR poll taken in 2001 found that schools are playing an important role in equalizing access to computers for kids. Specifically, African American children and children from lower-income households are considerably less likely to use a computer at home than are white kids or kids from higher-income families, whereas virtually the same percentage of all kids have used a computer at school. See <http://www.npr.org/programs/specials/poll/technology/>.

11.3 THE QUESTIONS TO BE ASKED OF EACH TOOL

Chapters 12 and 13 discuss seven major types of technology that can be used to protect or limit children's exposure to inappropriate sexually explicit material on the Internet. These include filtering, adoption of specialized domain names, surveillance and monitoring, age verification technologies, instant help, tools for controlling spam, and tools for protecting intellectual property. To provide the basis for a systematic understanding of each of these tools, the committee found the following set of questions useful.

· What is it? The answer to this question provides a clear description of the tool and a discussion of its variants. While the need for a clear description may be obvious, the public debate has often been hampered by a lack of common understanding about exactly what option is being discussed.
· How well does it work? What are the benefits that the product is intended to offer? As noted in Chapter 8, "protection" is a term with multiple possible meanings, and not all options provide the same kind of protection. Only after the nature of the protection offered has been established is it meaningful to ask about the tool's effectiveness. Note that effectiveness is a multidimensional concept, and efforts to reduce a tool's effectiveness to a single metric are generally not useful.
· Who decides what is inappropriate?
All options presume some definition of inappropriate material, and such definitions reflect the values held by the decision-making party involved. Indeed, as indicated in Chapter 5, there are few universally held and objective standards for defining or recognizing inappropriate material. An understanding of the locus of definitional control is thus important, because who "should" be responsible for decisions about what is inappropriate for a child is at the center of much controversy. Advocates can be heard for the responsible party being the child, the child's parents, the child's teacher, a representative of the school, the city council, the state legislature, the U.S. Congress, the Supreme Court, local juries, and so on.
· How flexible and usable is it? It is rare that a given implementation of an option is perfectly matched to the needs of its user, and the details of implementation may make a given product unsuitable for a user, even if, in general, the philosophy underlying the product is consistent with a user's needs. (An example is the ease with which user-enabled "overrides" of product settings are possible.) A product may also have other features that enhance or detract from its usability. Note also that people are usually of two minds about the flexibility of a product. On the one hand, they generally believe that a product with greater flexibility can be customized to their needs to a greater extent. On the other hand, they find the actual exploitation of a product's flexibility to be a chore that they tend to avoid. As a result, the most common use of any technology tool is in its default "out of the box" configuration. (Thus, for practical purposes, it is fair that any assessment of a tool place great weight on what the tool does out of the box.)
· What are the costs entailed and infrastructure required to use it? Protection against inappropriate material does not exist in a vacuum, and in general, an infrastructure is necessary to support the long-term use of a tool that provides maximum protection consistent with the user's other functional requirements. Costs should be understood broadly, and they include financial costs, ease of implementation, ease of use, false positives and false negatives, interference with the functions being served in the chosen environment, and lack of transparency about the option in operation.
· What are the implications of using it? It is rare that the adoption of an option will have no side effects, and understanding the possible unintended consequences (which may be desirable or undesirable) may affect judgments about a tool's desirability.
· What is its future?
Some technologies whose effectiveness is limited today may increase in effectiveness tomorrow as research progresses. Or, the technology necessary for a certain type of tool might exist but not be implemented in any product now on the market. And, different environmental circumstances may lead to different levels of effectiveness, which is especially true of tools or strategies whose effectiveness increases as others make use of them.
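Returning to the observation in Section 11.2 that end-to-end encryption defeats on-the-fly content identification: the interdictor's two wholesale choices for unknown traffic can be sketched as follows. This is an illustrative sketch only; the flow classifications and policy names are invented for this example and do not describe any deployed system.

```python
# Illustrative sketch: a carrier that cannot decrypt end-to-end
# encrypted traffic sees only "known" (classifiable) flows and
# "unknown" (opaque) flows, and must choose one blanket policy for
# the unknowns. Classifications and policy names are hypothetical.

def carrier_decision(flow_class: str, unknown_policy: str) -> str:
    """Return 'pass' or 'block' for a single flow.

    flow_class: 'appropriate', 'inappropriate', or 'unknown'
    unknown_policy: 'allow_unknown' or 'block_unknown'
    """
    if flow_class == "appropriate":
        return "pass"
    if flow_class == "inappropriate":
        return "block"
    # Encrypted content cannot be identified on the fly, so only a
    # wholesale choice is available for unknown flows.
    return "pass" if unknown_policy == "allow_unknown" else "block"

# Allowing unknowns lets some inappropriate (but encrypted) content
# through; blocking unknowns blocks some appropriate content.
print(carrier_decision("unknown", "allow_unknown"))  # pass
print(carrier_decision("unknown", "block_unknown"))  # block
```

The sketch makes the trade-off in Section 11.2 explicit: whichever blanket policy is chosen, either some inappropriate content passes or some appropriate content is blocked.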