
11
Digital Rights Management Technology

John Blumenthal

I am a security architect specializing in digital rights management (DRM) systems. I am engaged now in the music and publishing space, but I have a history of looking at rights management for digital products and messaging, e-mail in particular, dealing with issues such as the unauthorized forwarding of e-mail, how conversations are treated under copyright law, and how conversations can be abused. I bring both a technological hacker perspective and a policy perspective, with a focus on risk management in controlling content.

11.1 TECHNOLOGY AND POLICY CONSTRAINTS

How do we prevent particular types of content floating around on the Internet from reaching certain classes of users? We would like to implement a technological restriction. How do we implement controls on content to contain propagation? The Internet is all about propagation. This question raises not only the issue of viewing but also the issues of ownership and super-distribution, or forwarding. On the policy and legal side, can this be implemented in a legal structure once you achieve this “nirvana” of a universal technological solution?

Is this really any different from the MP3 debate? There may be social or psychological issues as to why people consume and propagate this type of content, but fundamentally, to look at the MP3 debate is to stare the problem in the face. The current crisis in the music industry is that this format, MP3, which compresses and renders audio,1 is not associated with any type of use controls. Napster posts these files, or references to them, such that users can send and swap the files without any control, effectively undermining the music distribution channel, typically the compact disc (CD). The publishers chose not to encrypt the data on CDs, for cost and other reasons.2 Music on a CD is stored digitally in a totally unencrypted way, which is why you can make copies to play in your car.

There is no way to control this problem technologically; we can only continue to raise the bar, effectively placing us in the domain of risk management. This is the core problem, which I refer to here as the trusted client security fallacy. When a device is on a network, I still have complete ownership of it, literally, physically, and in every aspect. This means that, with the proper tools, I can capture that content no matter what type of controls you place on me. There are people within @stake who are experts in reverse engineering, which allows them to unlock anything that has been encrypted. If we attempt a technological solution, then there will be ways to circumvent it, and those ways will propagate and become much easier for the masses to use.

I believe that policy drives technology in this problem, simply because technology does not offer a complete solution. The only way to attempt a solution that mitigates risk is to adopt a hybrid approach, mixing technology and policy. Whatever system you come up with in the digital rights space must be sensitive to these policy constraints. You have to distinguish the type of content when attempting to invoke rights on it and control it. This is a fundamental premise of the way a DRM system is designed and applied.3

These policy constraints create the archenemy of security and content control: system complexity. There are serious economic consequences for the technology industry in general, because you are imposing on the end-user experience. You are disrupting and removing things, such as free use of and access to information, that I have become accustomed to using on the Internet. Decisions regarding how to implement the policy and technology will affect this industry.

1. To render means to convert a format into a human-consumable element—displaying data as images, playing data as sound, or streaming data as video.

2. Milo Medin pointed out that the music publishers themselves created the unencrypted format in which CDs are published, effectively creating this problem. He said we cannot expect people to use a digital management format that offers them fewer capabilities than the native format in which the material originally was published.

3. References for DRM and client-side controls can be found at <http://www.intertrust.com>, <http://www.vyou.com>, and <http://www.oracle.com>.

The policy constraints causing these problems are privacy, the First Amendment and free speech, censorship, the legal jurisdiction issue, rating systems (which will become difficult to implement and maintain), copyright and fair use, and compliance and enforcement. These are all difficult issues.

11.2 DESIGNING A SOLUTION TO FIT THE CONSTRAINTS

This is how I would approach designing a system that conforms to the policy constraints. Some of this is very technical. First, we have to design a system that operates across all the consuming applications: chat, e-mail, Web browsers, the File Transfer Protocol (FTP), and so on. This is a massive infrastructure. Then, given all of the policy constraints, how can we authenticate age—to determine if a user is 18—and only age, without stomping on privacy? The only thing that I could come up with is biometric authentication. A biometric approach can detect who you are. I have heard that devices exist that can take a biometric measurement and determine age from that measurement, but I do not believe it.4

The collector of the information is responsible for enforcing privacy. If you are willing to go deeper into the privacy issue and perhaps involve so-called trusted third parties, note that porn sites often perform age authentication through the submission of a credit card number. Thus, if you relax some of the constraints, you get more of what you want to achieve. But the problem of hacking is inescapable.5 Gaining access to porn—something forbidden—is probably one of the most deep-rooted psychological motivations for becoming a hacker in the early stages. Talk to any hacker: if there is lurid content, then they want access to it. Music probably brings them into the same psychological realm.

The bigger issue is, now that you provide access, do you permit propagation? In other words, is the authorized user allowed only to view the content? This issue has more to do with content consumption than content access.

4. Herb Lin said that he does not believe this; his 6-year-old daughter just had a bone-age scan, which said she is three-and-a-half. Milo Medin suggested that a blood test probably could determine age. David Forsyth suggested counting the rings in a section of a long bone. Herb Lin noted that, to be useful legally, a biometric would have to change suddenly in a significant way between age 17 years and 364 days and age 18 years and 1 day. Milo Medin countered that a real-world system need not be accurate to within 1 day. Gail Pritchard summed up the problem by saying, “The minute I turn 18, I want access.” She noted that there are other means for checking a person’s birthday.

5. David Forsyth pointed out the conundrum of “anything I own, I can attack.” In other words, if a parent has an age verification system and a technically creative offspring, then the system is essentially meaningless.

You want to prevent the propagation of certain types of Internet material. There is a subtle, more hidden issue here. If content is provided to someone who is authorized and authenticated, and it is rendered, then you are heavily into DRM. Should the user be permitted to propagate that material to another party such that it is rendered, in effect, in an uncontrolled fashion? The system needs to consider both consumption and propagation issues to provide a whole solution.

In the system that I am designing, I will install a virtual V-chip. Some of you may be familiar with the V-chip initiative,6 which led to many debates and various laws. As of January 2000, new television sets have this capability. There is a twin effort in the V-chip analogy, in which the so-called client side (i.e., the television or desktop) and the publisher side (i.e., the broadcasters) are driven by policy makers not only to implement this bar to manage risk on the client rendering side, but also to come up with a rating system so that the V-chip can look at a stream of audio or video and say whether it is inappropriate content. The parents set up this virtual ratings wall to prevent the rendering of, and access to, the content.

As applied to television, the V-chip impedes the user experience so onerously that people do not use it. Instead, they police the use of television by simply physically being in their children’s presence—or they do not police it at all.7 A lot of work would need to be done with both the purveyors of this technology (Microsoft and Intel) and the publishers on the server side offering up the content.

6. See <http://www.cep.org/vchip.html>, <http://www.fcc.gov/vchip.html>, and <http://www.webkeys.com>.

7. Janet Schofield said that parents typically do not police their children’s television use systematically. Linda Hodge said that parents do not trust the filtering system because the broadcasters themselves set V-chip ratings, which are voluntary, and they have no incentive to use them. Janet Schofield said that many parents do not believe that the violence seen on television is really a problem, at least not to the degree that they forgo watching things they want to see because their children will be exposed to them. When she talks to kids about experiments on the connection between television violence and kids’ behavior, she loses their interest. She said parents or adults would take pornography issues more seriously than they do violence, so there may be a difference in motivation to use the filter. Sandra Calvert noted that the V-chip is not designed to censor violence only; it also screens sex and language. It has about five different ratings: fantasy violence, real violence, sex, language, and so on. Robin Raskin said parents are not using filters on their PCs or AOL’s parental controls either, because they do not see the link between entertainment and behavior. Part of the problem is that the research on this link is 20 years old and not very good. Sandra Calvert said that people who watch violence but are not incited to kill by it tend to disbelieve the general findings in the literature about the connection, which depends on the individual. But there is a new review article showing a link between playing aggressive video games and being aggressive personally, for both males and females. People can become desensitized to violence and no longer pay attention to it. At this time, the culture is not so desensitized to pornography, but this could become a problem.

The complexity and impossibility of this problem start to avalanche here.

A precedent to frame thinking in this debate is contained in an interesting act of 1990 that ultimately led to this technology. The first initiative to look at is the Platform for Privacy Preferences (P3P).8 I argue that extensions to this initiative could, in effect, implement a rating system. This would be done using the Extensible Markup Language (XML), a revolution in the industry and in the treatment of content. XML is a natural evolution from HTML.9 It provides more power and will be the native format in which all Microsoft documents are stored. (Today, Word documents are stored in a format proprietary to Microsoft.) The XML processing engines sit inside the operating system, at least in forthcoming versions of Windows; virtually every device in the world will be capable of parsing that type of content. The idea is to modify the processing engine to require a P3P rating. If the description of the P3P rating is not in the content, the processing engine will not render it. This would force everyone in the industry to adopt this standard on a global basis.
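To make the mechanism concrete, here is a minimal sketch in Python of a rendering engine gated on a rating label. The "rating" attribute, the allowed set, and the refusal behavior are illustrative assumptions; P3P itself defines no such attribute, and no shipping XML engine works this way.

```python
# Minimal sketch, not a real engine: refuse to render XML that carries
# no rating label. The "rating" attribute is a hypothetical extension.
import xml.etree.ElementTree as ET

ALLOWED_RATINGS = {"G", "PG"}  # would come from the user's P3P profile
RATING_ATTR = "rating"         # assumed label on the document root

def render(document: str) -> str:
    root = ET.fromstring(document)
    rating = root.get(RATING_ATTR)
    if rating is None:
        # No rating, no rendering: unrated content is refused outright.
        raise PermissionError("unrated content: refusing to render")
    if rating not in ALLOWED_RATINGS:
        raise PermissionError(f"rating {rating!r} blocked by profile")
    return ET.tostring(root, encoding="unicode")

print(render('<page rating="G"><p>hello</p></page>'))
```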

This idea is not that farfetched. HTML achieved global status over a period of time; XML will achieve similar status. XML already is being applied in various ways that have a global effect. The idea of modifying client applications that already use the underlying XML processing engine is not a stretch either. XML even could be extended to handle commerce material (e.g., from Napster). This initiative, which is in front of the World Wide Web Consortium, is achieving unprecedented standards. P3P is not a burdensome implementation technologically, either. It is in line with where the vendors are going with a whole slew of other initiatives.

Next, you would need to start applying pressure on software industry giants and possibly hardware industry giants, too. In doing so, the entire client-side security fallacy—that you can control the rendering of content on an untrusted and unsecured host—must be recognized. The only way to compensate for it is through policy, by going after the people who create compromises by reverse engineering the system itself. The Digital Millennium Copyright Act of 1998 outlaws some of these techniques. It did not stop the DeCSS10 model, but it did end up in court.

8. See <http://www.w3c.org/P3P>.

9. Nick Belkin said that, so far, XML has done only what HTML has done—formal characterization. No one has had any significant experience with content characterization. If this is done, then a database is needed that incorporates an ontology describing the whole thing, and someone has to construct and maintain the database. Bob Schloss said there would be an announcement soon related to this issue by a consortium of companies.


Reverse-engineering techniques would permit me to defeat controls around the content of any type of system. Reverse engineering unleashes content across all of computing. It is one of those difficult problems that have not been solved in the computer science field. Embedded systems raise the bar,11 but you create a cottage industry of reverse engineers who will get down to assembly-level code and remove the actual execution set on the chip and replace it. This is done widely now. There are ways of raising the bar continually;12 the question is how far you want to raise the bar and, in doing so, how you affect the industry in many different ways.

If we implemented such a solution in the turbulent waters of the industry now, we would create an interesting and difficult problem. Some giants, such as Microsoft, want to dominate the content-rendering space, and whoever wins that battle effectively dominates digital entertainment. Microsoft is the best positioned to do this, as America Online and everyone else knows. The interesting economic and political issue is that the operating system vendor would dominate this area. If this solution were implemented in the interests of policy, then the vendors would scramble to provide a solution, not so much to solve this very ugly problem,13 but rather to control the rendering of music, documents, and images.

10. DeCSS is software that breaks the Content Scrambling System (CSS), which is weak encryption used for movies on digital versatile disks (DVDs).

11. Herb Lin said it would be very difficult, although not impossible, to do on-screen decryption. In principle, you could build into the display processor some hardware that decrypts data on the fly before they are put on the screen. Milo Medin noted that such technology is used for high-definition television. David Forsyth said the problem with raising the bar is that you only raise it for one person. The federal courts say that DeCSS is naughty, but he has DVD content, taken from a Macintosh, that required no programming to obtain.

12. Milo Medin said the problem with standards is that computer power increases. A DVD player cannot send out raw, high-depth material; it has to be encoded in some way. (A PC does not have this constraint.) This requirement is in the license signature process for DVDs. All consumer devices have the same fundamental issue. You want to build a standard that consumer electronics companies can blast into hardware, make cheap, and make widely available. You want that standard to last for 10 to 20 years. To make an affordable device when the standard is released, there must be a manageable level of complexity and security. But 10 years later, a computer is much faster, and the standard cannot change. Anything that uses a fixed standard for cryptography is doomed. DirecTV dealt with this problem in the right way. People often steal the modules and clone them. One Super Bowl Sunday, the company turned off about half a million to 1 million pirate boxes. Over time, the company sent down little snippets of code and then, all at once, decrypted the code and ran it, and it changed the way the bits are understood. A flexible crypto scheme is the only way to address this problem. However, it is very difficult to implement in consumer electronics when you do not have a data link; it may be easier in the future when everything is Internet connected.


Of course, the client security fallacy continues to hold.14 Once someone has developed a way to circumvent the system, he or she can package it into an application or executable and put it on the Internet, and anyone else who wants to shut the whole system off just clicks on this application.15 The goal is to raise the bar to a level of hassle so high that only a very motivated individual would engage in cracking it. Such safeguards are all hardware related.16 Any solution not hardware related will end up with a one-click compromise. When you have to crack open a device case and replace a chip or do something else that involves hardware, you raise the bar pretty significantly.

13. Milo Medin said many stupid ideas are circulating in this space. One idea is to put controls in the logic of hard drives so that they will not store or play back files. But as long as the industry wants a cheap, easy-to-display, and easy-to-implement consumer electronics standard, security will remain elusive, because you cannot have all these things and security too. This is a problem that the industry has made for itself.

14. Milo Medin noted that, as long as a general-purpose operating system is used, someone can circumvent the system by changing a device driver. In fact, a network makes such changes automatically. As long as people can make a change between the XML rendering engine and the underlying hardware, they can get around anything. Dan Geer said another future trend is automatic updating by manufacturers on a regular basis. This is done for two reasons: to ease the burden of updating on the average user, and to handle security problems that cannot wait for system updates. The question of whether the software will run on a desktop internally and belong to the user, or whether there has to be an opening for others elsewhere to reach in and change it as part of a contract or lease, is outside the scope of the present discussion. Herb Lin noted that automatic updates already are made to Norton AntiVirus, Word, and Windows. Milo Medin emphasized that both the software programs and users can do automatic updates. A provider can trigger an update on the desktop of a subscriber at home—a capability built into the software. But the provider cannot prevent the user from also doing an update.

15. John Rabun said this would be a problem for law enforcement, because many pedophiles would get the chip needed to circumvent the system. However, the system would prevent normal exposure of children to pornography. Milo Medin disagreed. Unless the industry changes the architecture of PCs completely, there will be a way to intervene in instructions by loading executables into an operating system and running them between the hardware and renderer. By contrast, a cell phone is an intelligent device running software that is relatively secure. People cannot make calls with someone else’s cell phone because they cannot download programs into it. In the case of the cable modem, the network operator, not the user, controls the code. The problem with PCs is that the user controls the code, and the operating system does not have trusted segments that interplay with the hardware to prevent circumvention. The situation is different with a set-top box, because the operating system is embedded and is managed and downloaded remotely. A user cannot get around it because there is no hook to execute.

16. David Forsyth gave the example of region codes in the DVD world. If someone wanted to convert a DVD player into a non-region-coded player, he or she would have to fiddle around in the guts of the device. Clear instructions can be obtained from the Internet on how to do this, but most people are inhibited from changing the firmware on their DVD players.

But this is a general-purpose computer, and the idea of shipping a chip associated with digital rights, which Intel tried to do, has not worked.17

I am creating a futuristic scenario, drawing on themes in the industry and technology that are moving toward what I am describing. The older systems that remain in legacy states would not be able to participate in the system; they would not be able to render content as easily as newer systems. The king holding all the cards is Microsoft, because it is the one entity that can modify the operating system to require tags on content for rendering. If Microsoft took that step, then, in effect, you would drive the pressure back to the publishers, who are saying, “If I don’t rate, then I don’t render.” Microsoft can drive this issue, but this brings you back full circle to the question of whether you give it the power to do that.

Let us fantasize about this world in which content is legislated and rated, effectively much like the V-chip. The whole argument over ratings already has been conducted on Capitol Hill, so you would end up with an interesting and difficult technological problem. How do I know that content is accurately rated and that my browser’s P3P profile renders accordingly? How do I enforce the association between the content being posted and the rating it purports to have?18

There would need to be a law that defines the answers. Technology is part of the solution, but this is difficult technologically. A crawler or piece of software could wander around the Internet, looking at your P3P rating and then descending into your Web site to determine what that content really is and whether it is accurately rated. This is feasible, and it is probably an interesting project for some of the best computer scientists in this country. There are things like this on the Internet today, not necessarily looking at porn, but providing other search engine capabilities. This technology will improve over time. You would have to build a component that is highly complex and globally capable of crawling around the Internet.
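A rough sketch of such a crawler, under heavy assumptions: pages are taken to declare their rating in a hypothetical <meta name="rating"> tag, and the classifier that decides what the content really is reduces to a stub, since building a reliable one is exactly the hard research problem noted above.

```python
# Illustrative compliance crawler: compare a page's declared rating to a
# (stubbed) content classification. The meta tag is an assumed convention.
import urllib.request
from html.parser import HTMLParser

class DeclaredRating(HTMLParser):
    """Extract the declared rating from a hypothetical meta tag."""
    def __init__(self):
        super().__init__()
        self.rating = None
    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "meta" and d.get("name") == "rating":
            self.rating = d.get("content")

def classify_content(html: str) -> str:
    # Stub: a real system would need serious text and image analysis.
    return "X" if "explicit" in html.lower() else "G"

def audit(url: str) -> bool:
    """True if the page's declared rating matches its classified content."""
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    parser = DeclaredRating()
    parser.feed(html)
    return parser.rating == classify_content(html)
```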

17. Milo Medin said Intel would still fail if it tried this approach again today, because people do not want someone else controlling their computers. Robin Raskin argued that it is a trade-off between service and privacy; if Intel can make the users’ lives easier, then users will comply. Milo Medin said the problem is that consumer electronic companies want to build cheap devices without elaborate internal workings. All it takes is for one or two people to crack the code and post it to Usenet, and it will be replicated all over the place. Providing access to the content (as opposed to the algorithm) is illegal because of copyright.

18. Milo Medin said this is a Federal Trade Commission (FTC) issue. There must be a negative consequence for rating aberrations to change behavior. In the privacy arena, everyone posted something in the deal with the FTC, and the FTC said it would pursue anyone who violated the agreement. Bob Schloss suggested a default rating, so that if actual rating information is absent, the content is assumed to be X-rated and for adults only.

Realistically, to achieve this system, you would go after Microsoft based on its market dominance in the rendering device itself. If you control that, then you effectively control how things get published to those devices. This would be the creation of a V-chip-like initiative that goes to the heart of a much more homogeneous environment than the one the V-chip vendors were concerned about. Technologically, it fits with the P3P protocol and borrows from classification models, such as the Label Security feature of Oracle9i, that define data and how the rendering client should treat them. But it is still futuristic and requires huge global change. Another layer you can add is policy-based filtering in the network itself. The only way you can approach this problem holistically is with a model that layers additional components of control from the network to the client application and operating system to the publisher.
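A minimal sketch of that layered model, with assumed layer names and policies: the network, the operating system's renderer, and the application each hold an independent veto, so defeating any single layer is not enough.

```python
# Illustrative sketch of layered control: content renders only if the
# network, OS renderer, and application profile all agree. All policies
# here are assumptions for illustration.
from typing import Callable, List, Optional

Check = Callable[[Optional[str]], bool]  # rating -> allow?

def network_filter(rating: Optional[str]) -> bool:
    return rating != "X"          # network-level policy filter

def os_renderer(rating: Optional[str]) -> bool:
    return rating is not None     # OS rule: no rating, no rendering

def application_profile(rating: Optional[str]) -> bool:
    return rating in {"G", "PG"}  # the logged-in user's profile

LAYERS: List[Check] = [network_filter, os_renderer, application_profile]

def may_render(rating: Optional[str]) -> bool:
    return all(check(rating) for check in LAYERS)

print(may_render("G"))   # True
print(may_render(None))  # False: unrated content is refused
```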

The publishers will oppose this because it will limit their market reach. Yet they have an incentive to protect copyrights and to have a control model in place. They are all trembling in the wake of the Napster crisis. This is why I hold out hope that solving this problem also solves some of those issues for them.

11.3 PROTECTING CHILDREN

I say it is up to the parent to define a child’s user profile during the installation of an application. Many applications do this today: AOL accounts, Netscape, and Internet Explorer offer a profiled login. This way, when a child sits down to use that computer, he or she is constrained by the user profile, which technically becomes intertwined with the P3P profile. Once the child gets past a profile login, his or her Internet world is constrained by the definition of that profile.

This is in line with how you operate today. The difference is that the content you would access in my system would be controlled by the definition of your profile. This link is not strong today; there are no preset rules as to what renders in a browser. I am suggesting that you have to deal with the login issue to gain access to a profile based on your age. This comes back to the question of how you authenticate just age without violating other policy constraints, privacy among them. The P3P negotiation occurs at the machine level. For the level of detail in the profile, imagine a sliding bar representing content acceptable to the parent (see the sketch below).19

19. Robin Raskin said the more granular the P3P negotiation, the less it will be used. Systems do not work when they ask parents to make distinctions among, for example, full frontal nudity, partial nudity, and half-revealed nudity; in such cases, parents decide to let their kids see everything. A good profile requires a lot of granularity, but to convince a parent to use it, it cannot have any nuances.
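A minimal sketch of such a profile, with assumed level numbers and rating labels: the parent's single sliding-bar position maps to the set of ratings the child's login may render.

```python
# Illustrative parent-set profile: one sliding-bar position determines
# which ratings a child's login may render. Levels and labels are assumed.
from dataclasses import dataclass

SLIDER = {  # deliberately coarse; see footnote 19 on granularity
    0: {"G"},
    1: {"G", "PG"},
    2: {"G", "PG", "PG-13"},
}

@dataclass
class Profile:
    name: str
    level: int  # the parent's sliding-bar position

    def may_render(self, rating: str) -> bool:
        return rating in SLIDER[self.level]

child = Profile(name="child", level=1)
print(child.may_render("PG"))  # True
print(child.may_render("R"))   # False
```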

The privacy issue arises not when a person provides access to personal information but rather when someone else records it. If you focus on the client side, then at least you can throw the privacy advocates a bone that says, “All of that information is stored locally.” But there are systems in which you need a connection to a remote server, and your private information—like a credit card number or some other authenticating token—goes somewhere else. Once you do that, the privacy advocates will descend on this like vultures and pick it apart.

The adult entertainment industry’s age verification services move the issue of trust somewhere else. When you give your age, you get a challenge response asking you to prove your age by filling out a form. You might do that with a credit card number or other personal information. You deposit this information with the trusted third party. This information could be loaded to say, “Your P3P profile now permits you to see this type of material.” But because you send your private information somewhere else, this age verification service, over time, becomes a list of names of people who want access to porn.20 You can see the privacy people going crazy about the fact that this database is being used for that purpose.

There is another industry trend that relates to age verification. Dan Geer is probably one of the world’s leading experts on this, because he designed the system that Wall Street uses, Identrus, which issues digital certificates that establish identity. The forms that describe the identities in those certificates have an age field. There are initiatives concerning the issuance of multiple certificates based on multiple types of identities and uses of identity. There is talk in various committees in front of the Internet Engineering Task Force about the issuance of age-specific certificates.

To obtain an age-specific certificate, you would prove to VeriSign that you were born on a given date and that your Social Security number is x. Then you can be issued a certificate to be loaded onto your computer. There is discussion in the public key infrastructure community that VeriSign might fill the trusted third-party role, in which it would gain no further knowledge about you other than your age. VeriSign has a bunker that enforces the limits in physical and legalistic ways. I would feel comfortable proving my age to VeriSign, knowing that it is legally bound. In fact, VeriSign exists on a foundation of trust that is assumed when you use and obtain its certificates. A sketch of how such an age-only credential might be checked appears below.

20. Herb Lin noted that whoever is verifying the age information does not have to keep a list, even though it would be valuable. If people could be sure that no list was being kept, then the privacy issue would disappear. The difference between cyberspace and the real world is that, if a person goes into an adult bookstore and shows a driver’s license as proof of age, then the clerk just looks at it and says, “OK.” The clerk does not make a photocopy of it and file it away.

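A simplified sketch of this age-only credential idea: a trusted third party signs an assertion containing nothing but a birth date, and a site verifies the signature and computes age. HMAC with a shared key stands in for the public-key signature a real certificate authority would use, and every name in the sketch is an assumption for illustration, not an actual VeriSign interface.

```python
# Illustrative sketch only: an age-only credential. HMAC with a shared
# key stands in for a CA's public-key signature; field names are assumed.
import hmac, hashlib, json
from datetime import date

ISSUER_KEY = b"demo-signing-key"  # stands in for the issuer's private key

def issue(birth_date: str) -> dict:
    # The trusted third party signs an assertion holding only a birth date.
    payload = json.dumps({"birth_date": birth_date})
    sig = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_adult(cert: dict, today: date) -> bool:
    # Check the signature, then compute age; the verifier learns
    # adult/minor status and nothing else about the holder.
    expected = hmac.new(ISSUER_KEY, cert["payload"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, cert["sig"]):
        return False  # forged or altered assertion
    y, m, d = map(int, json.loads(cert["payload"])["birth_date"].split("-"))
    age = today.year - y - ((today.month, today.day) < (m, d))
    return age >= 18

cert = issue("1980-03-07")
print(verify_adult(cert, date(2001, 3, 7)))  # True
```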

This system might indeed provide the trusted third party for age authentication, and it fits with the public key infrastructure. The problem—and Simson Garfinkel and others have pointed to this in the privacy debates—lies in the meta-aggregation that will come in the future. I will get that database; VeriSign sells data like that. I also will get the clickstream from all the porn sites, and interesting data mining techniques will be used to aggregate and combine these data, trace them back to me, and say, “You were the person who did this.” There is widespread compromise on the server side—look at Egghead and CDNow. This is an uncontainable problem that you do not encounter until after the compromise has occurred.21

11.4 SUMMARY

There are many threats to the system I just designed.22 Compliance is a major issue, which the search engine industry is addressing to some extent. Bots will be required to crawl the Internet to check server-side ratings implementation; anti-bots can be created to defeat compliance checking. Client-side Trojans, worms, and viruses all can be injected into this machine to modify the XML processor. If it has memory, then I can hack it. If it has a processor, then good reverse engineers can create a one-click compromise. Ratings can be stripped off content, or interesting techniques can be used to create content that appears G-rated to the rendering engine but is actually X-rated. In the Secure Digital Music Initiative, they tried to watermark the content to control it; this was hacked within days. The same thing would happen here. Finally, you would face widespread dissemination of a one-click compromise created by one hacker. “Script kiddie” tools enable people to click on an attack that someone else created, automating everything I described. The scenario is not very hopeful.

21. David Forsyth said you could prohibit people from possessing certain types of data or using them in certain ways. You also could punish violators. But the chances of actually catching them might be very small. Someone could keep a database in such a way that it would be difficult to find.

22. Herb Lin summarized the presentation as follows: To control distribution of content to only age-appropriate people, you would have to make many changes in the existing technology and policy infrastructure, going far beyond the issue of age verification for inappropriate content. This would offer some benefits but would not necessarily solve the problem.
