National Academies Press: OpenBook

Youth, Pornography, and the Internet (2002)

Chapter: 9. Legal and Regulatory Tools

Suggested Citation:"9. Legal and Regulatory Tools." National Research Council. 2002. Youth, Pornography, and the Internet. Washington, DC: The National Academies Press. doi: 10.17226/10261.


9 Legal and Regulatory Tools

This chapter presents a variety of legal and policy options that the committee considered and discussed during its deliberations regarding how best to protect children from inappropriate sexually explicit material on the Internet. However, it should be kept in mind that the legal and policy environment today (as this report is being written) is highly fluid and rapidly changing. Court cases are being heard, and legislation is pending in areas such as privacy, elimination of spam, and protecting kids on the Internet. Furthermore, even if some of these legal and policy options are helpful for regulating sources of sexually explicit material within the United States, a substantial number of sources outside the direct enforceable jurisdiction of U.S. law will remain. This fact might well limit the success of U.S. legal and policy options for regulating sources of sexually explicit material. Table 9.1 provides an overview of the public policy options described in this report.

9.1 VIGOROUS PROSECUTIONS OF OBSCENE MATERIAL

With all of the difficulties described in Chapter 4 in defining obscenity, it is still the case that material deemed obscene is not protected by the First Amendment. Federal and state obscenity laws impose criminal and civil penalties on the production, distribution, and sale of obscene material. In recent years, however, obscenity prosecutions have been relatively rare. Such prosecutions can be very difficult, especially in the context of the

TABLE 9.1 Examples of Advantages and Disadvantages of Various Public Policy Options for Protecting Children from Inappropriate Sexually Explicit Material on the Internet (each option is listed with one illustrative advantage and one illustrative disadvantage)

Vigorous prosecution of existing obscenity laws
  Advantage: Would clarify existing uncertainties about the feasibility of obscenity prosecutions
  Disadvantage: Would require personnel and resources that might be used for other law enforcement activity that may be of higher priority

Imposition of civil liability for dissemination of obscene materials
  Advantage: Would enable private parties to take action through the court system when prosecutorial resources are limited
  Disadvantage: Would generally require some showing of individualized harm, which may be difficult to demonstrate

Required use of age verification systems by commercial suppliers of adult-oriented sexually explicit material
  Disadvantage: Currently under judicial review; see discussion in Section 4.2.4

Required use of "text-only" front page to ensure that the first page of a Web site is a "warning" page about the material inside being for adults only
  Advantage: Would reduce inadvertent access to sexually explicit material from adult-oriented commercial Web sites
  Disadvantage: Would not reduce access for children willing to lie about their ages

Required labeling of material that is obscene for minors
  Advantage: Would place a minimal burden on content providers
  Disadvantage: Would require positive market response for success (e.g., browsers must be capable of recognizing labels, parents must configure browsers accordingly)

Prohibitions on spam containing material that is obscene for minors
  Advantage: Would reduce unsolicited e-mailed advertisements for adult-oriented material
  Disadvantage: Content-based restriction would make regulation more problematic on First Amendment grounds

Prohibitions on mousetrapping to Web sites containing material that is obscene for minors
  Advantage: Would improve navigational experience for Internet users
  Disadvantage: Regulatory target may be uncertain

Stricter enforcement of record-keeping requirements
  Advantage: Would increase costs of compliance with such requirements; might reduce number of commercial entities unwilling to behave responsibly
  Disadvantage: Would require personnel and resources that might be used for other law enforcement activity that may be of higher priority

TABLE 9.1 (continued)

Streamlined process for handling reports of violations
  Advantage: Would facilitate citizen reporting of child pornography; would reduce interagency impediments to cooperation in such prosecutions
  Disadvantage: Might increase false alarms and screening effort required

Self-regulation (e.g., vigorous enforcement of ISP terms of service)
  Advantage: Would likely lead to faster "take down" of material posted in violation of terms of service (which generally include material that is obscene or child pornography)
  Disadvantage: Would require vigorous monitoring effort on the part of the ISP

Internet, where community standards, jurisdictional, and international issues abound. Nevertheless, vigorous enforcement may help to persuade operators in the adult online industry to change their behavior to act more responsibly in denying access to children. It may also reduce to some extent the number of operators of Web sites carrying obscene material by putting some of them out of business and changing the cost-benefit calculus for other "marginal" operators who would choose to exit the business in a different regulatory climate. Note that a reduction in the number of Web site operators providing such materials is not likely to reduce greatly the ease with which one can find such material. The reason is that search engines operate in such a way that a given search is more or less independent of the number of such sites. Thus, if there are 100,000 such sites today, a reduction to 10,000 sites would not significantly increase the difficulty of finding such material (or even reduce the likelihood of coming across it inadvertently). Thus, it is likely that the second effect (persuading operators to behave more responsibly) would be larger in magnitude.
1. Strictly speaking, the results of a search do depend on the number of Web pages being searched: the more pages in the index, the less often the search engine re-indexes and re-searches the Net for updates and changes. Thus, Web pages that have recently been added to the Internet are less likely to be found when the number of pages in the index is large. However, this is a relatively small effect.

It is also important to note that the problem of defining which community's standards should govern obscenity prosecutions for material on the Internet is at issue in current litigation regarding the Child Online Protection Act of 1998 (COPA). As noted in Chapter 4, the Supreme Court held on May 13, 2002, that COPA's reliance on "community standards" to identify what material is "harmful to minors" did not by itself render the statute substantially overbroad for First Amendment purposes, but was silent on the extent and nature of the community in question. The Court of Appeals reasoned that in dealing with the Internet, unlike other forms of communication, the material is immediately available in all jurisdictions without regard to conventional geographical boundaries. If local community standards govern the definition of material that is obscene for minors, then providers will be liable to criminal prosecution if the material they make available violates the standards of any jurisdiction in the nation. In such a situation, providers will censor themselves by providing only that sexually oriented material that is not obscene for minors in the most conservative community in the nation. The Court of Appeals concluded that such a situation would violate the First Amendment rights both of providers and of citizens of all of the other communities in the nation. This same issue would arise with respect to traditional obscenity prosecutions for material presented on the Internet.

It should also be noted that even if the Supreme Court upholds the Third Circuit Court of Appeals in this instance, this does not necessarily mean that all legislation intended to serve the same goals as COPA will be unconstitutional. For example, one possible solution to this problem would be to require the government in a prosecution of obscenity on the Internet to prove that the material is obscene (or obscene for minors) under both national and local community standards. The use of the national standard would avoid the problem that concerned the Third Circuit Court of Appeals, and the use of the local standard would be necessary to ensure that the material was in fact unprotected in the particular jurisdiction in which the prosecution takes place. In any event, this issue is currently pending.

More vigorous prosecution of federal and state obscenity laws, regardless of the outcome, would help to clarify whether the current state of affairs with respect to obscenity prosecutions is due to a liberalization in "community standards," a lack of willingness to undertake prosecutions, or some other factor or factors. Thus, such prosecution could help to establish more up-to-date benchmarks for the field, a development that would provide much-needed guidance for law enforcement professionals dealing with such issues. Finally, vigorous enforcement of laws against false and deceptive advertising and business practices (an example of which is described in Section 3.4.2) could help to reduce exposure to inappropriate

sexually explicit material that results from mousetrapping, takeovers, spam that advertises adult-oriented, sexually explicit sites, and so on.

Finally, note also that despite the Supreme Court ruling overturning the provisions of the Child Pornography Prevention Act relating to computer-generated imagery (discussed in Chapter 4), there is no bar to the prosecution of material that is obscene, whether or not it involves computer-generated images. Thus, if material depicts a child engaged in sexual activity, the full weight of the obscenity laws continues to apply should that material be found through the Miller tests to be obscene.

9.2 CIVIL LIABILITY FOR PRESENTING OBSCENE MATERIAL ON THE INTERNET

Because prosecutors have not been inclined in recent years (i.e., throughout most of the 1990s) to commit substantial resources to obscenity prosecutions, an alternative is to allow private individuals to bring civil actions for damages against individuals or businesses that purvey obscenity on the Internet. Under such a regime, any person who finds obscenity on the Internet could sue the Web site operator for civil damages. This use of the concept of "private attorneys general" is not unknown in the law. But it is exceptionally rare.

Ordinarily, one cannot bring a civil action for damages without showing some legally cognizable harm to the would-be plaintiff that has not been suffered by other persons generally. For example, if X drives his car in excess of the speed limit, he cannot be sued for damages by people who were not individually damaged by his speeding. Similarly, people cannot sue a murderer, a thief, or a drug dealer for damages without a showing of particularized, specific harm to them as individuals. Although the idea of essentially creating a "bounty" by authorizing such suits has some appeal, it is generally not consistent with the standards of the U.S. legal system or the basic goals of the U.S.

New York that the government can constitutionally prohibit "the sale to minors . . . of material defined to be obscene on the basis of its appeal to them whether or not it would be obscene to adults." In other words, the government can prohibit children from having access to certain types of sexually explicit material that it cannot constitutionally ban for adults. As noted in Chapter 4, this doctrine works best in those situations in which it is possible to separate children from adults, for as the Supreme Court has also observed, the government "may not reduce the adult population . . . to reading only what is fit for children." Thus, in decisions like Reno v. ACLU, the Court has made clear that the government may not prohibit material that is "obscene as to minors" on the Internet unless it can do so in a way that does not unduly interfere with the rights of adults to have access to such material.

Further, there is a very wide developmental range from birth to the age of legal majority. The very concept of speech that is obscene for minors has never been well defined, but presumably its content varies with the age of the minor. This creates a problem for any unitary definition of the term. Furthermore, what is obscene, either for adults or for children, turns in part on community standards. But as noted in Chapter 4, it is difficult to define or identify community standards when one deals with the Internet. This may, or may not, present a constitutional problem depending on the restrictions that are imposed on such material (see below).

Although an outright prohibition of material that is obscene for minors would therefore be unconstitutional, more finely tuned proposals, such as those described below, may pass constitutional muster.
9.3.1 Age Verification

In the Child Online Protection Act of 1998 (COPA), Congress sought to remedy the deficiencies in the Communications Decency Act of 1996 (CDA) that led the Supreme Court unanimously to invalidate the CDA in Reno v. ACLU. (See Chapter 4.) COPA declares it unlawful to communicate on a commercial Web site material that is obscene for minors by means of the World Wide Web if the material is available to minors. COPA states that it will be an affirmative defense if the defendant, in good faith, takes reasonable measures to restrict access to the material by minors by requiring a credit card, an adult access code, an adult personal identification number, or other appropriate proof of age. As of April 2002, the constitutionality of COPA is pending before the Supreme Court of the United States in Ashcroft v. ACLU.

As noted above, one issue in this case is whether the use of local community standards in the context of the Internet is consistent with the First Amendment. Another issue is whether the requirement of age verification passes constitutional muster. For example, in a book store one can require proof of age to purchase a book that is obscene for minors, thus providing access to adults while denying access to children. COPA attempts to establish a similar basis for differentiation between adults and minors on the Internet. Because the implementation of age verification on the Internet requires the use of technology, the objection to COPA is that it imposes significant costs on the Web site operator and/or the adult viewers, and that, by potentially creating a permanent record, it violates legitimate privacy interests and chills the freedom of adult viewers.

If the Supreme Court upholds the constitutionality of COPA in Ashcroft v. ACLU, this will appreciably advance the interests of those who seek to prevent minors from gaining access to material that is deemed to be obscene for minors. It will not necessarily meet all of their concerns, however. First, COPA applies only to material that is obscene for minors. The precise definition of this concept remains largely undeveloped, and it is not clear how far it will reach. Second, COPA applies only to material on the World Wide Web. It does not apply to chat rooms or e-mail. Third, COPA applies only to commercial Web sites. It does not apply to non-commercial sites. These three limitations on COPA were necessary to meet the concerns of the Supreme Court in Reno v. ACLU in invalidating the CDA. Fourth, COPA will be effective only to the extent that government actually prosecutes violations with sufficient vigor to have a significant deterrent effect. The lack of Internet obscenity prosecutions in recent years raises questions about whether such prosecutions will occur. Fifth, COPA applies only to Web sites in the United States. For jurisdictional reasons, federal legislation cannot readily govern Web sites outside the United States, even though they are accessible within the United States.
Because a substantial percentage of sexually explicit Web sites exist outside the United States, even the strict enforcement of COPA will likely have only a marginal effect on the availability of such material on the Internet in the United States. Thus, even if the Supreme Court upholds COPA, COPA is not a panacea, illustrating the real limitations of policy and legal approaches to this issue. The committee also notes that, even if COPA is constitutional, this does not necessarily mean it is good public policy. The concerns raised against COPA could at least arguably lead to the conclusion that it is insufficiently effective to justify its costs, whether or not it is consistent with the First Amendment.

If the Supreme Court invalidates COPA because age verification procedures in the context of the Internet are too burdensome on the First Amendment rights of adults, this will make it very difficult to regulate material that is obscene for minors in this environment. In the next few sections, the committee presents several legal and regulatory approaches that might be available even if the Supreme Court invalidates COPA.
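COPA's affirmative-defense provision is, in effect, a gate placed in front of restricted material: content is served only after one of the statute's enumerated checks (a credit card, an adult access code, or an adult personal identification number) succeeds. The following Python sketch illustrates that structure only; all function and field names are hypothetical rather than taken from the statute or any real service, and an actual implementation would validate credentials against a payment processor or a third-party age verification provider.

```python
# Sketch of a COPA-style affirmative-defense gate. Hypothetical names
# throughout; a real system would verify credentials externally rather
# than merely checking that a field is present.

RESTRICTED_NOTICE = "This page contains material for adults only."

def has_proof_of_age(request: dict) -> bool:
    """Return True if the request carries any of the enumerated
    forms of proof of age (credit card, access code, or PIN)."""
    return bool(
        request.get("credit_card_token")
        or request.get("adult_access_code")
        or request.get("adult_pin")
    )

def serve(request: dict, restricted_content: str) -> str:
    """Serve restricted content only behind a successful age check;
    otherwise serve the warning (notice) page."""
    if has_proof_of_age(request):
        return restricted_content
    return RESTRICTED_NOTICE

# A request with no credentials sees only the notice page; a request
# carrying one of the enumerated proofs reaches the restricted content.
print(serve({}, "members-area"))
print(serve({"adult_pin": "1234"}, "members-area"))
```

The point of the sketch is the asymmetry the statute creates: the burden of presenting proof falls on the visitor, while the operator's defense rests on having made the check in good faith.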

9.3.2 Plain Brown Wrappers and Age Verification

Many commercially oriented adult Web sites subject the viewer to an assortment of "teaser" images that are intended to entice a viewer to pay in order to see more such images. In many cases, the teaser images include material that may be obscene. To prevent minors from viewing such materials, it might be possible to grant such Web sites a statutory "safe harbor" immunity from prosecution under obscenity laws if the provider places the Web site behind a "plain brown wrapper."2 Such a "notice" page would contain an appropriate warning indicating that going past this notice page should be done only if the viewer is older than 18, and that going past this notice page constitutes certification that the user is indeed older than 18.3 The notice page would contain no images, or perhaps images that are obscured in the same way that the covers of adult-oriented magazines are obscured at newsstands.

The purpose of the notice page is to ensure that anyone who reaches the sexually explicit Web pages of a site has actually read the notice and agreed to its terms. However, many sites today have notice pages, and it is still often possible to reach sexually explicit pages on those sites through search engines that index the pages behind the notice page. Thus, by clicking on a link that may be returned by a search engine, the user circumvents the notice page entirely. To prevent such circumvention, it is necessary to prevent search engines from indexing the pages behind the notice page, and a standard protocol for accomplishing this task is described in Chapter 2 (Box 2.2). For a site that uses this protocol (the "robots.txt" protocol), a Web indexer for a search engine cannot reach the pages behind the notice page, and so search engines cannot return links to those pages and thus users cannot access them directly. Thus, the only way that a user could reach the contents of the adult Web site would be to go through the notice page.

This approach would reduce inadvertent access to teaser images on adult-oriented sites, and thus provide a greater level of denial of such access than is currently in place. Of course, this approach would not

2. The reason for this approach (of granting immunity from prosecution under obscenity laws rather than under obscene-for-minors laws in exchange for using age verification technologies and "plain brown wrappers") is that if it were constitutional to prosecute a Web site operator under obscene-for-minors laws, the government would simply do it. However, if such prosecutions are found to be unconstitutional, then the Web site is immune from such prosecutions regardless of what it does with respect to age verification technologies and "plain brown wrappers," and thus an incentive of a different sort is needed to persuade operators to adopt such measures. In this case, the incentive is to grant another, different benefit: namely, immunity from obscenity prosecution.

3. The age of 18 is an age that denotes legal emancipation for a minor, but there is no particular reason that the age could not be some other number.

prevent access by individuals under 18 who are willing to lie about their age. To deal with such individuals, it may be possible to add to the plain brown wrapper an age verification requirement. That is, in order to get past the notice page, proof of age would be required.4 (A discussion of age verification technologies is contained in Chapter 13.) Such a provision might be constitutional, even if COPA is declared invalid, because the use of age verification is encouraged by the offer of immunity from prosecution for obscenity, but is not legally required.

9.3.3 Labeling of Material That Is Obscene for Minors

Another possibility would be to require any commercial provider of material that is obscene for minors to label that speech in a machine-readable manner that enables parents and others to technologically block the access of minors to such material (Section 12.1 has further discussion of this approach). Because this approach focuses only on a category of speech that can constitutionally be restricted for minors, and does not prohibit adults (or even minors) from accessing it, it may not be unduly burdensome. And if the market responds appropriately, the proposal provides parents with a reasonable degree of control over what their children see.

If the labeling requirement is found to present constitutional difficulties, a less speech-restrictive approach would be to grant safe-harbor immunity from prosecution under "obscene for minors" and obscenity laws for those who voluntarily label their materials in an appropriate manner.

9.3.4 Prohibiting Spam That Is Obscene for Minors

A third approach would be to prohibit any person from sending on the Internet commercial spam that includes material that is obscene for minors. This is less intrusive on First Amendment interests than COPA because it deals only with commercial advertising and it involves sending information to individuals who have not requested it or sought it out.
4. The use of age verification technologies poses a significant privacy issue. Indeed, in the cases of the CDA and COPA, the courts reviewing these acts found that these requirements were unreasonable given the current state of technology and that age verification measures impose significant burdens on Web sites because the verification measures require site visitors to provide personal information. Because users are reluctant to provide this information and are discouraged from accessing sites that require such disclosures, the imposition of age verification requirements may chill or inhibit adults from accessing non-obscene Web sites, both because they might not wish to give personal information and because they may not be able to prove their age. These measures, the courts found, would diminish access to protected speech and thereby impose significant expense on commercial sites.

However, this approach is potentially more problematic than COPA because it restricts the sending of such material to adults as well as to children. Unlike COPA, it does not attempt to differentiate between them. In general, the Supreme Court has held that the government cannot prohibit the sending of constitutionally protected material (including commercial advertising) to individuals who have not expressly asked not to receive it. In Bolger v. Youngs Drug Products Corp.,5 for example, the Court invalidated a federal statute prohibiting the mailing of unsolicited advertisements for contraceptives because the interest in shielding "recipients of mail from materials that they are likely to find offensive" is not sufficiently substantial to justify the restriction of "protected speech."

The most plausible distinction between the law invalidated in Bolger and a ban on sending through the Internet commercial spam that includes material that is obscene for minors is that material that is obscene for minors is constitutionally protected for adults, but not for children. This may not be a sufficient distinction to make a constitutional difference. More modest versions of this proposal, more likely to withstand constitutional scrutiny, would prohibit any person from sending commercial spam that includes material that is obscene for minors (a) without appropriate labeling (e.g., having the e-mail subject line carry a warning such as "Not appropriate for children under 16 years of age"), or (b) without prior age verification, or (c) after the recipient has objected to receiving such material in the past.

It is important to note that for speech to be regulated under the commercial speech doctrine, it must consist of advertising. Thus, to the extent that the constitutionality of the alternatives noted above turns on the commercial speech doctrine, non-commercial spam or spam that does not consist of advertising could not be restricted.

It is also worth noting that most spam concerning sexually explicit material does not consist of the sexually explicit material itself, but of links to Web sites that have such material embedded within them. Thus, the recipient of the e-mail must affirmatively take some action actually to reach the Web site (e.g., clicking on the link). From a constitutional perspective, there is a significant difference between "inflicting" sexually explicit material on individuals who do not want to be exposed to it and providing those individuals information about how to find such material. Even if the former can be regulated, the latter may warrant greater constitutional protection.

This observation may be especially important in applying the commercial speech doctrine. From a constitutional perspective, there is a

5. 463 U.S. 60 (1983).

difference between giving individuals information about how to obtain an unlawful thing and actually providing them with the thing. Making illegal the mere providing of the link might pass constitutional muster if (a) the material at the link could be determined to be illegal (as it could sometimes be under obscenity laws) and (b) the party providing the link is essentially an accomplice under the criminal law. Requirement (b) would not be met merely because someone provided information about how to find obscene material. However, it would be met if the spammer is also the operator of the Web site containing obscene materials (or if the spammer is hired by the Web site operator), because the spam could be regarded as an advertisement for an illegal product and the provider or sender punished on that basis.

Note that a variety of legislative proposals have appeared with the intent of reducing the problem of spam e-mails. For example, one federal proposal calls for prohibiting senders of unsolicited commercial electronic mail from disguising the source of their messages, and giving consumers the choice to cease receiving a sender's unsolicited commercial electronic mail messages.6 This proposed legislation prohibited senders from including materially false or misleading header information and deceptive subject headings in commercial e-mail, required the inclusion of a valid return address in commercial electronic mail so that the recipient could indicate a desire not to receive further messages from the sender, and penalized further transmissions of e-mail once such a desire had been indicated. States have also sought to regulate spam, as illustrated in Box 9.1.
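The opt-out bookkeeping that such proposals would require of a commercial mailer can be pictured in a few lines of code. The following Python sketch is illustrative only, assuming a simple in-memory suppression list; the names are hypothetical, and a real mailer would need persistent storage and authenticated unsubscribe handling rather than this bare mechanism.

```python
# Sketch of the opt-out bookkeeping a rule like the proposal above
# would require: once a recipient indicates a desire to receive no
# further messages, every later mailing must exclude that address.
# Hypothetical names; not a real mail system.

suppression_list = set()

def record_opt_out(address):
    """Honor a recipient's request to stop receiving messages."""
    suppression_list.add(address.strip().lower())

def filter_recipients(recipients):
    """Drop every suppressed address before a mailing goes out."""
    return [r for r in recipients
            if r.strip().lower() not in suppression_list]

record_opt_out("Alice@example.com")
print(filter_recipients(["alice@example.com", "bob@example.com"]))
# ['bob@example.com']
```

Note that the mechanism is content-neutral: it suppresses by recipient, not by what the message contains, which matters for the First Amendment analysis discussed below.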
Still another possibility for regulating spam is a mechanism similar to that for regulating the telephone calls of telemarketers.7 Thus, it might be feasible for a central clearinghouse to register and maintain specific e-mail addresses or perhaps even entire domain names as a "do not spam" database. The clearinghouse would also be configured to provide database results automatically. Any party sending spam would be required to check the "do not spam" database and eliminate from its mass mailing all addresses contained in the database. Not doing so (and the proof would be the receipt of a spam by someone contained in the database) would subject the sender to some civil liability and/or class-action suit.

6. HR 718, the Unsolicited Commercial Electronic Mail Act of 2001, passed the House on April 4, 2001.
7. At present, a customer request to refrain from calling must be honored only by the specific company to which the customer has given notice. As this report goes to press, the Federal Trade Commission is proposing to create a centralized national "Do Not Call" registry that would enable consumers to eliminate most telemarketing calls simply by registering with a central "do-not-call" list maintained by the FTC. See <http://www.ftc.gov/opa/2002/01/donotcall.htm>.

(Note

that a mechanism of this sort that was specifically aimed at senders of sexually explicit spam would be much more suspect under the First Amendment because it would not be content-neutral.)

9.3.5 Prohibiting the Practice of Mousetrapping to Web Sites Containing Material That Is Obscene for Minors

Another approach may be to prohibit the practice of mousetrapping at sites that contain material that is obscene for minors without prior age verification. Even if the Court finds COPA unconstitutional, it may be that the act of directing children to material that is obscene for minors without their consent or any affirmative act on their part would be upheld. The act of mousetrapping involves not merely exposing individuals to material they would prefer to avoid, but actually taking over their freedom of choice and effectively compelling them to view such material. Whether or not all mousetrapping can or should be restricted, a reasonable case can be made for prohibiting operators of Web sites from sending

children without warning or choice to sites that will expose them to material that is obscene for minors. A more modest variation would be to require at least a warning before mousetrapping a viewer to a site containing material that is obscene for minors.

9.4 ENFORCEMENT OF RECORD-KEEPING REQUIREMENTS

One element of the federal obscenity laws (18 U.S.C. 2257, as discussed in Section 4.2.2) involves a record-keeping requirement intended to ensure that performers and models depicted in sexually explicit scenes are older than 18. More active enforcement of this provision may better protect minors from participation in the creation of child pornography.8

Assuming that strict enforcement of this provision can withstand constitutional scrutiny, such enforcement might also have the effect of increasing the rate at which the adult online Web industry consolidates. Compliance with the regulation would increase the expenses of such providers, and would be likely to drive out of business the small-scale "quick buck" enterprises, while the established adult content providers would simply absorb those expenses as a cost of doing business. (At the same time, as a matter of constitutional doctrine, the intent behind enforcement is highly relevant. If these laws are enforced with the intent of driving out of business otherwise legal business operations, such enforcement might well raise constitutional questions.)

As described in Chapter 3, representatives from the online adult industry testified to the committee that attracting children was not in their business interest. Taking this testimony at face value, and assuming that the "quick buck" providers are the ones that are not discriminating in their attraction of traffic to their sites (i.e., not distinguishing between adults and minors), then enforcement of 18 U.S.C.
2257 might result in a withering of the irresponsible enterprises, leaving behind businesses that take more seriously their responsibilities to deny access of their wares to minors. Note also that the issue of recognizing "sexually explicit conduct" is far simpler than recognizing obscenity, a fact that would simplify considerably the task of prosecution.9 This point is likely to be most relevant in the context of Web sites that depict "barely legal" models engaged in sexually explicit behavior. Thus, the child pornography at issue is most likely to be images of an older minor engaged in sexual activity.

9. 18 U.S.C. 2257 defines "sexually explicit conduct" as actual sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; bestiality; masturbation; sadistic or masochistic abuse; or lascivious exhibition of the genitals or pubic area of any person.

9.5 STREAMLINING THE PROCESS OF HANDLING VIOLATIONS

Prosecutors seeking to enforce child pornography laws rely to a significant extent on lay person reporting of child pornography. That is, law enforcement officials may come across such material in the course of their everyday work, but citizens filing complaints about such material are a major source of leads and tips.

In an Internet environment, the most natural way to file such complaints is likely to be electronic. For example, a concerned citizen lodging a complaint with law enforcement officials should provide the route by which the material came to the citizen's attention and a description of the image. Because images are hard to describe in words, a copy of the image would be desirable to include in the complaint. Indeed, in the case of child pornography, such a copy so forwarded might be the only tangible evidence that law enforcement officials could obtain, as child pornography sites are generally highly transient (and by the time law enforcement officials are able to act on such complaints, the site may be gone). However, if the citizen files an electronic complaint with a copy of the suspect image, he or she may be in technical violation of statutes that prohibit the electronic transmission or distribution of child pornography, even though it is being transmitted to law enforcement authorities or the National Center for Missing and Exploited Children (NCMEC) and even though such evidence might be crucial for the investigation and prosecution of the offender by law enforcement. Instead, complainants must often go through a cumbersome and inconvenient procedure to file such a report.10

A similar problem affects the NCMEC.
Despite the NCMEC's role in providing technical assistance to law enforcement in the investigation of child pornography, it does not enjoy the same immunity enjoyed by law enforcement authorities to receive, possess, and store complaints of child pornography, and it does not have the authorization to transfer evidence of child pornography from the NCMEC to other designated law enforcement agencies outside the CyberTipline (CTL) system and sometimes even within the CTL system.11

10. For example, to report a suspected violation through the NCMEC, complainants must provide the relevant URL where the image can be found and a textual description of the image. In an online environment, it would be much simpler and easier for the citizen to simply forward the image.
11. The CyberTipline, operated by the NCMEC, is a national reporting mechanism for use by citizens to report to law enforcement authorities apparent instances of sexual exploitation of children; possession, manufacture, and distribution of child pornography; online enticement of children for sexual acts; child prostitution; child-sex tourism; and child sexual molestation. See <http://www.cybertipline.com> for more information.

Relief for the first problem (i.e., allowing citizens to report suspected child pornography to the NCMEC without fear of prosecution) could enable and encourage more citizen action in this area, while relief for the second problem would enable the NCMEC to proactively forward such evidence to law enforcement outside the CTL system.

9.6 SELF-REGULATORY APPROACHES

Successful self-regulatory approaches are based on the fact that the firms in an industry are generally willing to abide by a common code of behavior, though as noted in Chapter 4, such a willingness may reflect a desire to stave off legislation or other regulation that these firms would find more onerous. One example of self-regulation with respect to certain media content is the willingness of private producers of TV content to provide ratings of their content that can be processed by the V-chip.

Today, a large number of reported instances of child pornography remain on Internet service provider (ISP) servers because law enforcement lacks the resources to investigate every report. An approach used in Europe with some success is employed by the European INHOPE Hotlines. Under the INHOPE approach, European ISPs support a non-governmental organization, staffed by trained specialists to identify child pornography and funded by the ISPs, whose role is to advise Internet service providers of possible postings of child pornography.12 (Through Internet "Hotlines," this organization takes tips from the public, but screens them for credibility.) Such advisories do not have any binding effect on ISPs, but in fact many ISPs cooperate with such advisories by taking down the offending material because these advisories provide more authoritative advice than that provided by members of the public.13 In a U.S. context, such a function could be provided by the NCMEC, which currently lacks the authority to provide such advisories.
A second facet of possible self-regulatory efforts might include prominent placement of the CTL reporting icon on adult-oriented Web sites. Today, many such Web sites provide links to information on filtering products, and some even have a banner that says "fight child pornography." It would be simple for these sites to add the CTL icon, which has proven quite useful in reporting online child pornography.

12. Note also that the fact of private support by the ISPs is the key component of the "self-regulatory" dimension of this approach as viewed by the Council of Europe.
13. Note that the private terms of service to which users must conform as a condition of an ISP's service agreement with the user grant ISPs considerably more latitude in the exercise of such "take-down" authority than would be possible if they were agents of government and hence constrained by legal and constitutional barriers.

Another dimension of self-regulation is the willingness of ISPs to enforce the terms of service to which users must agree. To the extent that these terms of service prohibit posting or sending of inappropriate material, harassment, or other inappropriate behavior (and most terms of service do contain some restrictions along these lines), ISPs have the authority to take quick action against offending users without waiting for legal action.

A third example of self-regulation could be set by the commercial sources of adult-oriented, sexually explicit imagery that provide much of the content for smaller "affiliates." In particular, they could build into their contracts with affiliates conditions that require those affiliates to engage in responsible behavior. Thus, as one possibility, affiliates could be required contractually to put their content behind the Internet equivalent of "plain brown wrappers" with age verification. The firms that supply them with content would be in a position to check on them and penalize them if they did not (by cutting off a content source).

9.7 GENERAL OBSERVATIONS

In its consideration of various public policy options to help shield children and youth from inappropriate sexually explicit material on the Internet, the committee realizes that the viability of many proposals depends on how makers of public policy make certain trade-offs. Proposals that depend on regulating a certain type of content (namely, sexually explicit speech) are inherently more suspect on First Amendment grounds than proposals that regulate speech independent of content.

For example, the committee believes that spam containing material that is obscene for minors should not be sent to children. But laws banning such e-mail to minors are potentially problematic in an online environment in which it is very difficult to differentiate between adults and minors.
At the same time, a ban of all spam regardless of content may be seen as too broad because it affects many other interests.

The committee also believes that it would be desirable for adult Web site operators who exhibit material that is obscene for minors to use age verification systems so that children would not be able to access such material. However, in an online environment in which it is very difficult to differentiate between adults and minors, it is not clear whether this can be achieved in a way that does not unduly constrain the viewing rights of adults. Thus, as one illustrative example, the government might offer a grant of immunity from prosecution under obscenity laws to Web site operators who use age verification systems to prevent minors from accessing such material. In this instance, the trade-off is helping to protect

children from exposure to certain kinds of inappropriate sexually explicit material in return for limitations on possible obscenity prosecutions.

Enforcement of obscenity laws also presents trade-offs. Increased prosecution of obscenity would likely require increased resources, and those resources must be taken from some other activity. If, as is likely, the other activity represents prosecutions of other crimes, policy makers must make the judgment that it would be wise to pursue more obscenity prosecutions rather than other criminal prosecutions, or that more prosecutions for obscenity would be the best use of additional resources if such resources are available. Such judgments are complex and require a careful weighing of many competing factors well beyond the scope of this report.

Several other general observations follow below:

· While the foundation of protecting children from inappropriate Internet materials and experiences continues to be social and educational strategies to instill ethics of responsible choice and coping strategies for inadvertent exposure, public policy has a role in shaping the environment in which children exercise their choices.

· Actions to remove illegal material from the Internet can occur much more quickly if the authority to do so is based on the terms of private contracts (such as terms of service) rather than the requirements of public law.

· Obscenity prosecutions are often difficult to undertake, because community standards are often not knowable in advance of an actual trial. By contrast, child pornography is based on standards that are often easier to identify (e.g., is this material sexually explicit? versus does this material violate community standards?) and as such is easier to prosecute, all else being equal. Nevertheless, more aggressive prosecution of allegedly obscene materials, even if not always successful, would help to clarify the status of this instrumentality.
