
Critical Information Infrastructure Protection and the Law: An Overview of Key Issues (2003)

Suggested Citation:"3 Liability for Unsecured Systems and Networks." National Research Council and National Academy of Engineering. 2003. Critical Information Infrastructure Protection and the Law: An Overview of Key Issues. Washington, DC: The National Academies Press. doi: 10.17226/10685.

3
LIABILITY FOR UNSECURED SYSTEMS AND NETWORKS

The existing legal framework for critical infrastructure protection consists of a patchwork of state and federal laws that are generally aimed at deterring certain types of conduct on computer networks. In addition, there are sector-specific laws protecting electronic information pertaining to individual consumers and patients, such as the Gramm-Leach-Bliley (GLB) Act1 for the financial services sector and the Health Insurance Portability and Accountability Act of 1996 (HIPAA)2 for the health services sector. There are no comparable regulations, however, that require entities to conform to any specific practices designed to promote critical infrastructure protection. Symposium participants differed on whether it would be more effective to target hackers and other perpetrators (for the intentional harm they cause) or vendors and service providers (for harm caused through negligence). This chapter explores the legal theories supporting Internet-related liability.

CRIMINAL LAW

The purpose of criminal law is generally to deter future crime and to punish perpetrators. Criminal threats to critical information infrastructures include unauthorized access to computer networks (whether by an insider or an outside hacker), malicious code (such as viruses and worms), and distributed denial-of-service attacks. The conventional wisdom is that prosecution of computer crimes will help reduce the number of future computer attacks. This approach depends on private-sector entities—the owners of the information infrastructures—to report criminal computer activities. However, for prosecution to serve as a deterrent, the attack and subsequent prosecution must be publicized. This may be acceptable when criminals are caught in the process of attempting an attack (which is therefore rendered unsuccessful) but may not be desirable when the attack succeeds. Craig Silliman suggests that a victim’s decision to report a computer attack to law enforcement depends on a careful balancing of interests. For example, an ISP differentiates itself based on the quality and service of its networks; a single publicized attack could lead to a loss of customers and revenue. In addition, information in the public domain about the vulnerability of a network could invite copycat attacks. Hence, it would take a large number of prosecutions, Mr. Silliman argues, to compensate an ISP for the corresponding bad publicity. These concerns—echoed by companies in many industries (e.g., financial institutions)—have contributed to private information-sharing efforts (such as ISACs and CERT) to reduce attacks and to detect and prevent the successful conclusion of an attack.

1  

15 USC, Subchapter I, Sec. 6801-6809. PL 106-102.

2  

PL 104-191.

Domestic Jurisdiction

Congress has passed a number of laws related to computer crime.3 These laws are generally focused on hackers and other individuals who use computer networks for illegal purposes.4 This section provides a brief overview of the key computer crime laws.5

3  

Many states also have computer crime laws that may affect critical information infrastructure protection.

4  

Many of the attacks that occur today are the result of malicious or indifferent acts by individuals (often referred to as “script kiddies”) who generally do not have the sophistication to develop their own attacks but instead rely on programs (“scripts”) written by others and/or other ready-made tools to launch network attacks. An in-depth analysis might usefully consider a variety of issues surrounding these hacker kits, such as whether such kits are protected by the First Amendment (e.g., bomb-making information is arguably protected, so hacker kits arguably should be as well); whether these kits are circumvention tools; and when it is appropriate to use these kits (e.g., by security firms that use them to conduct audits for insurance purposes).

5  

There are many laws that pertain to conduct on computer networks but are not designed solely for online environments (e.g., the Espionage Act, 18 U.S.C. Sec. 793, 794, and 798; Wire Fraud, 18 U.S.C. Sec. 1343; and the Economic Espionage Act, 18 U.S.C. Sec. 1831 et seq.). The laws cited in this section are all part of Title 18, “Crimes and Criminal Procedure.”

Computer Fraud and Abuse Act

The Computer Fraud and Abuse Act (CFAA) of 1986 (18 U.S.C. § 1030) was the first federal law specifically directed at computer crime. It was initially aimed at protecting “federal interest” computers as well as computers used by financial institutions but now protects any computer used in interstate commerce. The CFAA imposes penalties on individuals who knowingly and with intent to defraud gain unauthorized access to computers. For example, in United States v. Morris6 (an early case demonstrating some of the challenges associated with criminal computer prosecutions), the court upheld Morris’s conviction because he intentionally accessed computers without authorization, even though he did not intend to cause harm.7 Although the CFAA does not include provisions for critical information infrastructure protection per se, it has played a major role in prohibiting and sanctioning cyberattacks.8 Congress has continued to amend the CFAA over the years to increase its effectiveness as the threat and technology have evolved.9

Electronic Communications Privacy Act

The Electronic Communications Privacy Act of 1986 (ECPA; 18 U.S.C. § 2701) updated the legal framework for electronic surveillance of oral and wire communications established in Title III (the Omnibus Crime Control and Safe Streets Act of 1968) to include electronic communications.10 ECPA provides criminal and civil penalties for accessing, obtaining, or altering stored electronic communications without permission. It also governs what an applicant must do to be granted access to evidence of computer crime possessed by ISPs. The unlawful access to stored communications provision, like the CFAA, protects the critical information infrastructure by enabling the prosecution of individuals who attempt to halt the flow of information to or from electronic storage systems.

6  

U.S. v. Morris, 928 F.2d 504 (2d Cir. 1991).

7  

Sarah Faulkner. 2000. “Invasion of the Information Snatchers: Creating Liability for Corporations with Vulnerable Computer Networks,” Journal of Computer & Information Law, Vol. 18:1019-1047.

8  

For example, in U.S. v. David L. Smith (D.N.J. May 1, 2001), Mr. Smith was convicted under 18 U.S.C. 1030(a)(5) for launching the Melissa virus (see <http://www.usdoj.gov/criminal/cybercrime/melissaSent.htm>), and in U.S. v. Bret McDanel (C.D. Cal. June 25, 2002), Mr. McDanel was convicted under the CFAA for maliciously bombarding a company computer system with thousands of e-mail messages (see <http://www.usdoj.gov/criminal/cybercrime/mcdanelConvict.htm>).

9  

The U.S. House of Representatives recently approved H.R. 3482, the Cyber Security Enhancement Act, which raises the maximum penalty for computer crime to life imprisonment. The bill was received in the Senate, read twice, and referred to the Committee on the Judiciary on July 16, 2002. For more information, see <http://thomas.loc.gov/cgi-bin/bdquery/z?d107:HR03482:@@@D&summ2=m&>.

10  

For more information on electronic surveillance, see Computer Science and Telecommunications Board, National Research Council. 1996. Cryptography’s Role in Securing the Information Society. National Academy Press, Washington, D.C., Appendix D. Available online at <http://www.cstb.org/pub_crisis>.

Fraud and Related Activity in Connection with Access Devices

Section 1029 of Title 18 of the U.S. Code11 is the “federal statute condemning various crimes involving . . . access devices.”12 The law defines an access device as “any card, plate, code, account number, electronic serial number, mobile identification number, personal identification number, or . . . other means of account access that can be used, alone or in conjunction with another access device, to obtain money, goods, services, or any other thing of value. . . .”13 Section 1029 provides for penalties ranging from fines to—in some cases—imprisonment for up to 20 years.

Wire and Electronic Communications Interception and Interception of Oral Communications

Section 2511 of Title 18 of the U.S. Code “provides specific criminal and civil penalties for individuals (law enforcement officials and private citizens alike) who conduct electronic or wire surveillance of communications . . . in a manner that is not legally authorized. Legal authorization for such surveillance is provided for specific circumstances in law enforcement and foreign intelligence collection. . . .”14 Section 2511 includes what is referred to as the one-party consent provision, which allows federal law enforcement officials to monitor telephone conversations without obtaining a court order, provided they obtain consent from one of the parties to the conversation.

11  

The Access Device Law was primarily developed for low-tech prosecutions, such as credit card fraud, but has since been adapted for use in more complex cases involving computers.

12  

Charles Doyle. 2002. The USA PATRIOT Act: A Legal Analysis (RL31377). Washington, D.C.: Congressional Research Service. Available online at <http://www.fas.org/irp/crs/RL31377.pdf>.

13  

The text of 18 U.S.C. 1029 can be found online at <http://www.usdoj.gov/criminal/cybercrime/usc1029.htm>.

14  

Computer Science and Telecommunications Board, National Research Council. 1996. Cryptography’s Role in Securing the Information Society. National Academy Press, Washington, D.C., p. 396. Available online at <http://books.nap.edu/html/crisis/>.

USA PATRIOT Act

The USA PATRIOT Act of 2001,15 designed as a collection of amendments to existing laws, includes a number of provisions related to critical infrastructure protection: revisions to CFAA (increased penalties for hackers who damage protected computers; a new offense for damaging computers used for national security; and an expansion of the coverage of the statute to include computers in foreign countries so long as there is an effect on U.S. interstate or foreign commerce); increased information sharing; strengthened criminal laws against terrorism; and enhancements to the government’s legal authorities to conduct electronic surveillance.

International Jurisdiction

The nature of modern communications, including the Internet, makes international cooperation in cybersecurity increasingly important. The perpetrators of many recent cybercrimes (such as the “I Love You” virus and the distributed denial-of-service attacks of February 2000) were hackers in foreign countries. The recent case of U.S. v. Gorshkov,16 in which an FBI agent conducted a cross-border search of a Russian computer to obtain evidence to indict a Russian citizen on extortion charges, is an example of how courts view cross-border searches in the current environment and of how such searches might become the norm in the absence of formal international coordination.

Increasing cross-border criminal activity highlights the need for common international standards and objectives for cybersecurity. Different countries have different laws and practices, making prosecution of these criminals very difficult.17 In August 2000, Sofaer and Goodman (Center for International Security and Cooperation, Stanford University) proposed a multilateral convention on cybercrime and terrorism that would encourage international cooperation.18 The Council of Europe developed the Convention on Cybercrime (with the United States participating as an observer), which had been signed (as of December 2001) by 26 Council of Europe member states as well as the United States, Canada, and Japan.19

15  

Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act (USA PATRIOT Act) of 2001, PL 107-56.

16  

U.S. v. Gorshkov, 2001 WL 1024026 (W.D. Wash.). After a series of computer hacker intrusions into U.S. businesses, the FBI identified a Russian as one of the intruders. The FBI lured him to the United States on the pretext of a job interview, during which the defendant was asked to prove his computer hacking and security skills. The defendant logged into his home computer network to access computer hacking tools. The FBI was able to obtain the defendant’s userid and password to his home computer by using a tool that captured all keystrokes on the computer provided during the interview. The userid and password allowed the FBI to download the information contained on the defendant’s home computer, which confirmed that the defendant was involved in the intrusions. At issue was whether the FBI violated the defendant’s Fourth Amendment rights by breaking into his computer. The court found that the defendant should not have had an expectation of privacy during the interview; hence he effectively disclosed the userid and password.

17  

One notable exception is the arrest and prosecution of Ehud Tenenbaum, an Israeli citizen, for his part in the Solar Sunrise cyberattacks. For more information on Solar Sunrise, see <http://www.sans.org/newlook/resources/IDFAQ/solar_sunrise.htm>.

Jack Goldsmith, professor of law at the University of Chicago Law School, argues that remote cross-border searches and seizures are a necessary tool in fighting cybercrime.20 Such measures, he asserts, are not prohibited by existing norms of territorial sovereignty and, furthermore, are not without precedent. There remains a debate about when and how a nation can attempt to enforce its own laws against those outside its own territory, and for that reason enforcement is still an ambiguous concept that should be clarified as sovereign governments adjust to the realities of new technology.21 Box 3.1 provides a general discussion of criminal liability with respect to international cybercrime.

BOX 3.1 Offshore Cybercrime and Jurisdiction

by

Jack Goldsmith

Professor of Law, University of Chicago Law School

The anonymity made possible by cyberspace is one reason why attacks on critical infrastructures from hackers, cybercriminals, and cyberterrorists are hard to stop. Another, independent reason has to do with borders: these activities are especially hard to stop when they take place from abroad.

A nation’s power to enforce its laws is limited by territory. The Restatement (Third) of Foreign Relations Law states these limits as follows: “It is universally recognized, as a corollary of state sovereignty, that officials in one state may not exercise their functions in the territory of another state without the latter’s consent.”1 The Restatement adds that one state’s law enforcement officials “can engage in criminal investigation in [another] state only with the state’s consent.”2

Even with the territorial limits of enforcement jurisdiction, nations can often do a good job of controlling Internet transmissions from abroad by regulating persons and property within the territory. They can, for example, seize the foreign content provider’s local assets, penalize in-state end-users severely, regulate in-state hardware or software through which offending transactions are made, and regulate Internet access providers and local financial intermediaries that facilitate unwanted Internet transactions.3 In these and other ways, a territorial government exercising power solely within its territory can indirectly regulate offshore content providers by raising their costs, often significantly.

Unfortunately, these forms of end-user and intermediary regulation tend not to work well with respect to cybercrimes and cyberterrorism committed from abroad. Because these crimes are (usually) one-time, discrete events, it is hard for local Internet intermediaries to identify and screen out the pertinent cross-border data flows. Moreover, there is a special need in this context to secure evidence of the crime immediately: pseudonymity is relatively easy to achieve in the commission of these crimes, and, perhaps most importantly, evidence of the crime can be destroyed relatively quickly.

For these reasons, enforcement authorities face two distinct jurisdictional challenges with respect to cybercrimes and cyberterrorism committed from abroad. The first challenge concerns evidence. Authorities often must take immediate steps to identify the computer sources of the criminal activity and seize (or at least freeze) information on the computers relevant to the crime before all records of the crime are erased. The second challenge concerns prosecution. Authorities must secure the presence of the offshore perpetrator so they can punish him.

There are basically three ways to achieve these goals, consistent with the principle that enforcement jurisdiction is territorial.

Cooperation by Treaty

The nation subject to the attack can cooperate with the nation (or nations) from which, or through which, the attack occurs. Officials in the originating state(s) can assist officials in the target state in identifying, freezing, and retrieving evidence related to the crime and in apprehending the author of the crime and either bringing him to justice in the originating state or extraditing him to the target state for prosecution.

This, in a nutshell, is the strategy of the Council of Europe draft Cybercrime Convention. In addition to harmonizing domestic definitions of cybercrime, the convention aims to enhance fast and effective international cooperation in the enforcement of cybercrime laws. It requires each nation to enact laws authorizing expedited searches, seizures, and preservation of computer data within its territory. It also provides for a system of rapid enforcement assistance. For example, the convention contemplates that nations where the crime originates will, at the request of the nation where the crime is causing damage, preserve and disclose stored computer data. It also contemplates that each treaty signatory will establish a round-the-clock point of contact to ensure immediate assistance for the purposes of cross-border information requests. Finally, the treaty contemplates extradition of criminals from the nation where the attacks originated to the nation where the attacks occurred.

There are at least four problems with this approach. First, treaties take years, and sometimes decades, to draft and ratify. The Council of Europe convention has not yet been ratified, and it will take years before major nations outside Europe ratify it (assuming they ever do). Second, any nation that does not ratify the treaty (and there will be many) can serve as a haven for cybercriminals and cyberterrorists. Third, for the treaty to work, State parties will need to impose significant collateral sanctions on nations that fail to ratify, implement, or enforce the convention. Fourth, the convention does not authorize remote cross-border searches (i.e., unilateral searches by one nation on computers in another nation for the purpose of seizing and freezing evidence). Assuming that the convention eventually comes into force, it will be necessary for a nation pursuing a cyberterrorist to consult with local officials before seizing, storing, and freezing data on computers located in their countries. Even with the contemplated round-the-clock consultation and mutual assistance machinery, this extra and unwieldy step will give cybercriminals precious time to cover their tracks.

Informal Cooperation

Even in the absence of treaties, enforcement authorities from many nations cooperate in the fight against cybercrime by (1) swapping information and (2) cooperating in the seizure of evidence on local computers.

Informal cooperation is crucial and does not require lengthy treaty processes. But it is uncertain. In the absence of an official treaty framework, many nations do not provide adequate cooperation. Moreover, extradition of cybercriminals is difficult in the absence of a treaty because of the principle of double criminality, which requires, as a precondition to extradition, that the alleged conduct be a crime in both the rendering and the receiving state. Laws against cybercrime are underdeveloped in most nations and are not harmonized across nations. A treaty regime can, over time, rectify these shortcomings (as the European treaty aims to do). But in the meantime, extradition for cybercrimes is difficult. Consider the fate of the author of the “I Love You” virus, which caused over $10 billion in damage around the world. He was not prosecuted in the Philippines because that country lacked adequate criminal laws. And because he did not violate Philippine law, the double criminality principle precluded him from being extradited to other countries pursuant to general extradition treaties.

Cross-Border Searches and Seizures

The nation attacked can also act unilaterally. Sitting at their desks in one country, law enforcement officials can take unilateral steps on computer networks to trace the origins of the cyberattack and explore, freeze, and store relevant data located on a computer in the country where the crime originated. These actions are known as remote cross-border searches and seizures.

There are two problems with these unilateral acts. First, many believe they violate the principle of territorial sovereignty and thus violate international law. (In “The Internet and the Legitimacy of Remote Cross-Border Searches,”4 I argue, contrary to conventional wisdom, that cross-border searches and seizures are consistent with international law.) And second, cross-border searches cannot produce the criminal defendant himself.

1  

Restatement (Third) of the Foreign Relations Law of the United States, Sec. 432, comment b.

2  

Ibid.

3  

For an elaboration of these truncated points, see Jack L. Goldsmith, “Against Cyberanarchy,” 65 University of Chicago Law Review 1199 (1998); Jack L. Goldsmith, “The Internet and the Abiding Significance of Territorial Sovereignty,” 5 Indiana Journal of Global Legal Studies 475 (1998).

4  

Jack L. Goldsmith, “The Internet and the Legitimacy of Remote Cross-Border Searches,” University of Chicago Law School, October 2001.

CIVIL LIABILITY

Elliot Turrini, former assistant U.S. attorney for the District of New Jersey, argues that criminal law alone is not sufficient to deter intruders and prevent cybercrime. Civil liability, he argues, is “essential to insure proper incentives to create an optimal computer crime strategy.” Although tort-based liability with regard to CIP is not well developed at present, many experts believe that a few CIP-related liability suits could change the cost-benefit analysis of securing critical infrastructures. Civil law is intended to deter undesirable or wrongful conduct and to compensate those harmed by such conduct. An important component of civil liability is that it would allow a victim to recover losses from third parties if such parties were negligent or engaged in intentional misconduct and such negligence or misconduct was the proximate cause of the loss. In the Internet environment, such third parties may be the only source of recovery,22 since criminal law offers no compensation to the victim if the computer criminal cannot be identified or is judgment-proof (a likely scenario given the anonymity of the Internet and the lack of financial assets of many computer hackers).23 The ability to impose civil damages on a third party, such as a communications carrier or a service provider that is proven to be negligent, could motivate that party to invest the necessary resources in improving security (e.g., by closing known software bugs to help deter hackers). Civil liability can arise from contract law, tort law, or regulation.

18  

Abraham D. Sofaer and Seymour E. Goodman. 2000. A Proposal for an International Convention on Cyber Crime and Terrorism. Stanford, Calif.: Center for International Security and Cooperation. Their proposal calls for the adoption of laws making certain cyberactivities criminal, enforcement of those laws or extradition to the United States for prosecution, cooperation in investigating criminal activities, and participation in efforts to adopt and implement standards and practices to enhance security.

19  

The European Convention is somewhat controversial in the cybersecurity arena. Some experts argue that the Convention has several major flaws, including these: It does not include all the states that should be included; it is restricted to criminal law cooperation; and it does not encourage cooperation in the development of standards and practices that would make cybercommunication safer. Information on the Convention is available at <http://www.coe.int>.

20  

Jack L. Goldsmith, “The Internet and the Legitimacy of Remote Cross-Border Searches,” University of Chicago Law School, October 2001.

21  

For a discussion of the relationship between global information networks and local values (political, economic, and cultural norms), see Computer Science and Telecommunications Board, National Research Council. 2001. Global Networks and Local Values: A Comparative Look at Germany and the United States. National Academy Press, Washington, D.C. Available online at <http://www.cstb.org/pub_globalnetworks>.

22  

Civil lawsuits may be ineffective at recovering losses against third parties located outside the United States. As noted earlier, the European Convention on Cybercrime is limited to criminal law cooperation.

Contract Law

Contracts are agreements between two parties that create an obligation to do, or not do, a particular thing. If one party breaches its contractual obligations, the law provides a remedy to the aggrieved party.

Contract law is generally viewed as the only basis for bringing computer-related cases because other theories of liability are inapplicable for several reasons: (1) damages from computer crimes are almost always monetary, and courts have traditionally denied negligence claims for purely economic losses (see the “Tort Law” section);24 (2) there is no specific standard of conduct for negligence-based claims; and (3) the intervening criminal act, not the network owner’s negligence, is generally viewed as the proximate cause of the harm. By contrast, liability between two entities is easily established by contract. Say, for example, Company A contracts with ISP B to provide network services. If ISP B fails to uphold its contractual bargain, Company A can seek a remedy in court. In this way, contracts can be a positive force in helping to secure critical infrastructures.

Contract law, however, often fails to provide an adequate remedy for third parties. Suppose a hacker breaks into Company A’s inadequately secured network and then uses Company A’s network to launch an attack against Company B. The attack disables Company B’s networks, thereby causing Company B to fail to deliver promised services to its customers.25 If Company B is not in privity with Company A (i.e., the two companies do not have a contractual relationship), Company B cannot seek remedy for business losses from Company A under contract law. Company B is often referred to as the “downstream” victim in this type of computer attack. This scenario is quite common in distributed denial-of-service attacks. Hence, the limitations of contract law have led commentators to suggest the use of tort law as a model for computer-related cases.

23  

Erin E. Kenneally. 2000. “The Byte Stops Here: Duty and Liability for Negligent Internet Security,” Computer Security Journal, 16(2).

24  

David Gripman. 1997. “The Doors Are Locked But the Thieves and Vandals Are Still Getting In: A Proposal in Tort to Alleviate Corporate America’s Cyber-Crime Problem,” The John Marshall Journal of Computer and Information Law 16(1):167.

25  

Ibid.

Tort Law

A tort is a wrongful act, other than a breach of contract, for which relief may be obtained in the form of damages or an injunction. The purpose of tort law is to deter wrongful conduct and to compensate those harmed. While contract law rests in large part on obligations imposed by negotiation or bargain, tort law rests fundamentally on obligations imposed by law (either case law or federal or state regulation).26 For example, in the case of a distributed denial-of-service attack, there is no question that the hacker who intentionally caused harm should be held responsible in tort. The question, however, is whether tort liability should also apply to entities (companies, vendors, service providers, universities, or individuals) whose systems or products were used or accessed in the course of a computer attack and who failed to take reasonable steps prior to the attack to protect against misuse of their networks. To date, no U.S. court has addressed the issue of liability for failure to secure a computer network adequately.27 If tort law is found to apply to computer security, then the potential for civil liability lawsuits (with the likelihood of monetary damages) could encourage companies to invest in computer security measures. It would also influence decisions about computing system development. As a consequence, the ability of tort law to motivate action on critical information infrastructure protection is one possible avenue to explore.

A key conceptual question is whether tort law should allow recovery of damages from a company whose networks were not properly secured and were then used by a third party to cause harm. Generally, to recover damages in tort, the plaintiff must show that the defendant was negligent. Negligence has four basic elements: (1) a legal duty; (2) a breach of that duty (i.e., a failure to conform one's conduct to the required standard of care, such as "reasonable care");28 (3) causation (i.e., the damage was proximately caused by, or a foreseeable consequence of, the risk created by the defendant's act or omission); and (4) actual damage. Before liability will be imposed, a plaintiff must substantiate all of the necessary elements of its claim.

26. Sarah Faulkner. 2000. "Invasion of the Information Snatchers: Creating Liability for Corporations with Vulnerable Computer Networks," Journal of Computer & Information Law 18:1019-1047.

27. Erin E. Kenneally. 2000. "The Byte Stops Here: Duty and Liability for Negligent Internet Security," Computer Security Journal 16(2).

28. Reasonable care is often defined as "the degree of care that a reasonable person would exercise under the circumstances." David Gripman. 1997. "The Doors Are Locked But the Thieves and Vandals Are Still Getting In: A Proposal in Tort to Alleviate Corporate America's Cyber-Crime Problem," The John Marshall Journal of Computer and Information Law 16(1):167.

Under existing law, a plaintiff would have some difficulty meeting all of the required elements in the computer security context. Although a corporation might be deemed to have an existing legal duty to protect the information of its customers or clients (especially if it is a financial institution or a custodian of medical records),29 currently no legal duty exists between a service provider and other unrelated or "downstream" parties on the Internet. If such a duty were to be recognized, it would have to be based on (1) a public policy determination that the victim needs legal redress, (2) the foreseeability of the risk of harm to the victim, (3) the defendant's ability to control or minimize the risk of harm, and (4) a determination that the defendant is the party best positioned to protect against the harm.30

With regard to the foreseeability of harm to third parties in the network environment, several cases demonstrate that this question is inextricably related to the question of whether the defendant knew or should have known that certain illegal or wrongful conduct was, in fact, occurring on its networks (and not just “likely” to occur). These cases—although decided in the very different contexts of copyright infringement and defamation law—also suggest that holding defendants liable, either directly or indirectly, for harm caused as a result of known and unaddressed computer security vulnerabilities would be a reasonable extension of traditional legal principles. As indicated in the cases set forth in Box 3.2, if a corporation (or service provider) knows or has reason to know that its computer networks are being used to cause harm, and it has the capacity to stop such harm from occurring, the corporation may be required to take action to avoid liability, especially if it derives a financial or other benefit from allowing its networks to be accessed by others. Clearly, however, the determination as to the “capacity to control” the unwanted network behavior will be a matter of significant dispute.

As to the fourth factor, proponents of tort liability argue that the

29. It could be argued that financial institutions have an existing duty under the Gramm-Leach-Bliley implementing regulations to provide immediate and effective incident response to protect the confidentiality of consumer data maintained on their own networks. These regulations, however, would not apply to unregulated parties whose networks are being used to cause harm.

30. Kimberly Keifer and Randy Sabett. 2002. "Openness of Internet Creates Potential for Corporate Information Security Liability," Electronic Commerce and Law Report 7(24), June 12, p. 10.


BOX 3.2 Liability Based on Knowledge of Misconduct on Computer Networks

Cubby v. CompuServe1

In October 1991, a federal judge in New York found that CompuServe was a distributor of general online information services and was therefore not liable for defamatory messages carried on one of the 150 computer bulletin boards on the service. The court found that CompuServe had no contractual relationship with the bulletin boards on its service and that an intermediate entity known as CIS had accepted the contractual responsibility to edit the bulletin board that had carried the defamatory messages. The court also noted that CompuServe received no compensation from the bulletin board and had no opportunity to review the contents of the bulletin board before the defamatory comments were published.

The court concluded that CompuServe was the modern equivalent of the corner news vendor selling numerous newspapers and magazines. As such, CompuServe could not be held responsible for defamatory information carried on individual bulletin boards. The court indicated that the network administrator could only be held civilly liable if he “knew or should have known” about improper or illegal network traffic. The court also observed that if a network administrator is viewed as a publisher, he is held to a higher level of responsibility. Moreover, the court observed that while CompuServe had received some information about problems on the bulletin board system in question, that information was not sufficient to prompt a further inquiry. Thus, the nature of the traffic played a significant role.

Stratton Oakmont v. Prodigy2

In 1995, a New York state court reached the opposite conclusion from the court in Cubby v. CompuServe, finding that Prodigy could be held responsible for defamatory messages posted by one of the users of its service. At the time, this case expanded the potential liability of commercial online computer service providers by determining that Prodigy could be held liable as a publisher of defamatory material posted on its network, even though Prodigy maintained that it was nothing more than a passive conduit of information and therefore should be treated only as a distributor (like a newsstand or a library) of news or information. In determining that Prodigy knew or should have known about the content of the defamatory material posted on Prodigy bulletin boards, the court noted that Prodigy had put into effect a series of content review policies and had utilized editorial software to screen messages uploaded to the network. Accordingly, Prodigy had affirmative responsibility for the contents of the messages posted by its users. The ultimate decision in this case, however, was superseded by the principles of liability and safe harbor contained in the Communications Decency Act of 1996.

RTC v. Netcom3

In November 1995 a federal judge in California ruled that an Internet service provider could be held liable for contributory infringement for copyrighted material it made available online if it had notice of the copyrighted nature of the material and refused to delete it from its archives.


RTC was the exclusive licensee of certain unpublished copyrighted works. RTC and its licensees maintained extensive security over these materials, which are central to the advanced spiritual development of Church of Scientology members. When RTC discovered some of its materials being posted in text file format on the Internet in late 1994 and 1995, it brought three lawsuits against the posters of the materials and the owners of the bulletin boards on which the materials were posted. In two of the cases, Netcom and another ISP were named as defendants after they refused to remove the copyrighted material identified in a notice from RTC.

Netcom moved for summary judgment, arguing that it could not be held liable under principles of direct, contributory, or vicarious liability. The court partially agreed, granting summary judgment on the direct and vicarious copyright infringement theories. However, the court held that Netcom might be found liable for contributory infringement because, despite warnings from RTC, it allowed new postings by the defendants and did not remove prior postings. On August 4, 1996, RTC and Netcom4 announced that their litigation had been settled on undisclosed terms. After the settlement, Netcom posted new guidelines for protecting intellectual property on its Internet service, allowing copyright holders to complain to Netcom about alleged postings of their copyrighted material. Of course, the legal duty to refrain from participating in copyright infringement flows directly from the Copyright Act.

A&M Records, Inc. v. Napster, Inc.5

The decision of the Court of Appeals for the Ninth Circuit in the Napster litigation again demonstrates that operators of computer networks can be held liable for misconduct that occurs on such networks if they know or should know of the illegal uses. In Napster, the defendants created a file-sharing system that allowed individual users to, among other things, share copyrighted music files. In their suit against Napster for copyright infringement, the plaintiffs contended that Napster should be liable under theories of contributory and vicarious copyright infringement. As to contributory infringement, one key question was whether Napster knew or had reason to know of the direct infringement committed by its users. On that issue, the court, relying on the prior Netcom decision, concluded that Napster had both actual and constructive knowledge that its users were engaged in illegal activities. Similarly, with regard to vicarious liability, the court concluded that Napster could be held liable for the activities of users on its network because Napster failed to police its system to rid it of illegal uses and benefited financially from the continuing availability of illegal activities on its network.

Cyber Promotions v. Apex Global Information Services

The suggestion that there may be best practices with regard to online computer security is found in Cyber Promotions v. Apex Global Information Services (AGIS).6 In this case, AGIS had contracted to be the Internet service provider for Cyber Promotions. At the time of the contract, AGIS knew that Cyber Promotions regularly sent unsolicited commercial e-mail ("spam"), and it included a 30-day without-cause termination provision in the contract. In September 1997, only 6 months after the contract was signed, AGIS suffered a massive flood attack directed at Cyber Promotions that completely consumed AGIS's bandwidth. AGIS responded by immediately terminating Cyber Promotions' use of its service.

Cyber Promotions then sought a temporary restraining order (TRO) and a preliminary injunction. In granting the TRO against AGIS, the court noted that security requirements evolve and that AGIS had not taken significant steps to deal with ping attacks. The only security step taken by AGIS was to remove Cyber Promotions from its network; AGIS had not hired a security expert or attempted to install a router to control potentially hostile ping attacks. The court's basic approach to AGIS was this: "Other ISPs are able to mitigate retaliatory actions by pingers; why not you?" The TRO was granted, and AGIS was directed to reinstate service to Cyber Promotions. This result appears to have been substantially influenced by the court's observation that AGIS had not taken the same measures as other reputable Internet service providers to mitigate similar attacks.

1. Cubby v. CompuServe, 776 F. Supp. 135 (S.D.N.Y. 1991).

2. Stratton Oakmont v. Prodigy, 1995 WL 323710 (N.Y. Sup. Ct.).

3. RTC v. Netcom, 907 F. Supp. 1361 (N.D. Cal. 1995).

4. RTC v. Netcom, 907 F. Supp. 1361, at 14.

5. A&M Records, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001).

6. 1997 WL 634384 (E.D. Pa. 1997).

companies that control the computer networks are in the best position to implement appropriate security measures31 and are therefore the "lowest-cost avoiders." This cost-benefit analysis owes its origins to Judge Learned Hand's formula, B < PL, set out in United States v. Carroll Towing Co.32 Under Hand's logic, a party is negligent if the cost (B) of taking adequate measures to prevent harm is less than the monetary loss (L) multiplied by the probability (P) of its occurrence.33
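Hand's negligence test reduces to a single comparison, as the short sketch below illustrates. The function name and dollar figures are hypothetical, chosen purely for illustration:

```python
def negligent(burden, probability, loss):
    """Learned Hand formula: a party is negligent when the burden (B)
    of adequate precautions is less than the probability (P) of harm
    multiplied by the gravity of the resulting loss (L), i.e., B < P * L."""
    return burden < probability * loss

# Hypothetical: a $50,000 security upgrade weighed against a 10 percent
# chance of a $1,000,000 loss from a network intrusion.
B, P, L = 50_000, 0.10, 1_000_000
print(negligent(B, P, L))  # B (50,000) < P * L (100,000) -> True
```

On these assumed figures, forgoing the $50,000 precaution would be negligent under Hand's logic, because the expected loss it prevents ($100,000) exceeds its cost.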

However, courts have traditionally limited third-party liability in two ways: (1) by excluding damages in negligence actions for purely economic losses34 and (2) by holding that intervening criminal acts break the chain of causation, such that any breach of duty by the defendant would not be deemed the proximate cause of the harm to the plaintiff. The economic loss doctrine prohibits parties from recovering financial losses, absent injury to person or property, under tort law.35 Many courts, however, are beginning to reject the economic loss doctrine. For example, in People Express Airlines v. Consolidated Rail Corporation, the New Jersey Supreme Court concluded that "a defendant who has breached his duty of care to avoid the risk of economic injury to particularly foreseeable plaintiffs may be held liable for actual economic losses that are proximately caused by its breach of duty."36 Similarly, if a court found that the likelihood of misconduct on networks was sufficiently great, the fact of an "intervening" criminal act would not necessarily be sufficient to break the chain of causation.

31. Alan Charles Raul, Frank R. Volpe, and Gabriel S. Meyer. 2001. "Liability for Computer Glitches and Online Security Lapses," BNA Electronic Commerce Law Report 6(31):849.

32. United States v. Carroll Towing Co., 159 F.2d 169, 173-74 (2d Cir. 1947).

33. See Alan Charles Raul, Frank R. Volpe, and Gabriel S. Meyer. 2001. "Liability for Computer Glitches and Online Security Lapses," BNA Electronic Commerce Law Report 6(31):849; and Erin E. Kenneally. 2000. "The Byte Stops Here: Duty and Liability for Negligent Internet Security," Computer Security Journal 16(2).

34. Margaret Jane Radin. 2001. "Distributed Denial of Service Attacks: Who Pays?," <http://www.mazunetworks.com/white_papers/radin-print.html>.

Standards and Best Practices

As a motivating factor for industry to adopt best practices, tort law can be a significant complement to standard setting, because compliance with industry-wide standards is usually an acceptable demonstration of due care. If tort liability were recognized in this area, implementing security standards would be a way for a company to minimize its liability. Gripman argues that corporations have "a duty to select and implement security measures, to monitor the security measures' effectiveness, and to maintain and adapt the security measures according to changing security needs."37 Today, however, there is no such duty and no nationally recognized standard of care38 among computer security experts. Adopting such a standard is not a simple process, given the evolving nature of security vulnerabilities and the wide variation in the size and resources of the individuals and entities that have an Internet presence. For example, applying security patches promptly may be one component of demonstrating "reasonable care," but how often should patches be applied? Should a corporation be deemed negligent if its security policy is to search for and apply patches once a month but its servers were hacked in the third week after a patch was released? Determining the duty that a corporation should have is further complicated by the fact that although a patch may close one vulnerability, it could open a new one when installed in a local environment. Should a corporation be deemed negligent if it installs a patch that leaves the system more vulnerable?

35. Sarah Faulkner. 2000. "Invasion of the Information Snatchers: Creating Liability for Corporations with Vulnerable Computer Networks," Journal of Computer & Information Law 18:1019-1047.

36. In People Express Airlines v. Consolidated Rail Corporation, 495 A.2d 107 (N.J. 1985), a railway accident caused a tank of flammable liquid to spill and ignite near the plaintiff's business. The fire caused no physical damage, but the plaintiff's business operations were interrupted, causing severe financial loss. The court rejected the economic loss doctrine and allowed the plaintiff corporation to prosecute its claim for purely economic loss.

37. David Gripman. 1997. "The Doors Are Locked But the Thieves and Vandals Are Still Getting In: A Proposal in Tort to Alleviate Corporate America's Cyber-Crime Problem," The John Marshall Journal of Computer and Information Law 16(1):167.

38. The International Organization for Standardization (ISO) adopted in August 2000 the Code of Practice for Information Security Management (also known as ISO 17799), which is based on the British standard BS 7799. The standard has faced criticism from several countries and security experts, who argue that ISO 17799 is too vague because it focuses on general policies and best practices rather than concrete mechanisms for auditing compliance. However, some insurance companies, such as AIG, are using ISO 17799 as a basis to measure the security of cyber insurance policy holders. An analysis of ISO 17799 conducted by the Information Technology Laboratory of the National Institute of Standards and Technology is available at <http://csrc.nist.gov/publications/secpubs/otherpubs/reviso-faq.pdf>.

Eric Benhamou suggests that the one action that firms should take immediately is to begin sharing best practices (including attack scenarios and practices to protect against these attacks). Establishment of operational best practices for network administrators and users (combined with ongoing training and enforcement of the practices through intrusion detection tests) is one possible way of increasing computer security. The CERT Coordination Center at Carnegie Mellon University (a federally funded research and development center) and the SANS Institute are two examples of organizations working to develop and disseminate suggested best practices for computer security. Even with such organizations, adopting good security practices will not happen overnight, and implementation will vary.

In addition to playing a role in tort liability determinations, best practices can also serve as a benchmark against which firms can be audited. Audits, a normal part of business management, can be beneficial in the computer security arena. A firm is more likely to avoid litigation or reduce its liability if it is routinely audited and if its auditors apply well-accepted principles of testing and analysis, which do not yet exist in the security environment. Moreover, by forcing a corporation to understand what it will be audited for, auditing serves to educate the corporation on what is expected of it. Audits or certification programs39 would also serve as a mark of acceptance that would help a corporation gain customer acceptance for its products and services and could result in reduced insurance premiums.

Meanwhile, industry-specific standards are emerging. For example, Gramm-Leach-Bliley, implemented by the SEC, imposes rules that financial institutions must follow. The Health Insurance Portability and Accountability Act outlines the responsibilities that health care providers and insurers have with respect to security measures to protect electronic information.

39. For example, the government recently announced that it was creating a security seal of approval that consists of a set of software standards that all DoD computers must meet. See "Government's Seal of Security," Wired News, July 16, 2002, <http://www.wired.com/news/politics/0,1283,53901,00.html>.

If a liability regime is imposed, entities may still be held negligent even if they comply with industry standards. In T.J. Hooper v. Northern Barge Corporation,40 two barges towed by two tugboats sank in a storm. The barge owners sued the tugboat owners, claiming negligence and noting that the tugboats did not have weather radios aboard. The tugboat owners countered by noting that weather radios were not the industry norm. Judge Learned Hand found the tugboat owners liable for half the damages even though the use of weather radios had not become standard industry practice. He observed: “Indeed in most cases reasonable prudence is in fact common prudence; but strictly it is never its measure; a whole calling may have unduly lagged in the adoption of new and available devices. . . . Courts must in the end say what is required; there are precautions so imperative that even their universal disregard will not excuse their omission.” This case shows that the meaning of “reasonable care” is never static and must constantly be reevaluated as technology changes. Industry’s failure to develop a standard or to adapt the standard to changes in technology could lead courts to develop their own standard.

Given the relatively novel nature of liability for insecure computer systems, one option is to create a safe harbor (immunity from tort liability) for corporations that comply with standards that are disseminated by a designated body. Care must be taken, however, to ensure that the establishment of best practices and associated safe harbor provisions does not deteriorate into a substitution of ritual for effective practices. In rapidly evolving areas, procedures or rituals may be all that can be standardized. Hence, although a liability regime could result in more compliance with the procedures and policies set forth in applicable standards, it may not actually improve network security. A corporation that is given safe harbor may not have sufficient incentive to surpass the prevailing standard (e.g., by implementing security policies, such as aggressive local testing of patches, that would provide it with a better level of protection) or to develop innovative solutions (e.g., to detect and resist computer attacks). Moreover, any improvement in network security achieved through a liability regime also could result in increased corporate liability for failing to follow sound information security procedures, even when data, systems, and networks are not actually put at risk.

Should all entities in the Internet community be held to the same standard of care with respect to computer network security? Legal liability often depends on which actors are best positioned to prevent the harmful activities (in this case, computer attacks). The committee has found it useful to examine the potential duty owed by a few key players: Internet service providers, vendors, universities and colleges, and individual users (see Box 3.3).

40. 60 F.2d 737 (2d Cir. 1932).

BOX 3.3 Assigning Liability to Key Players

Internet Service Providers

Distributed denial-of-service (DDOS) attacks1 can be inbound to or outbound from the ISP's2 network. Inbound DDOS attacks are launched from outside the ISP and either pass through or terminate on the ISP's customers. Outbound attacks are launched from computers connected to the ISP's network. An ISP may connect several thousand high-bandwidth, always-on computers to the Internet. As a group, these computers are prime targets for hackers, who can use them as zombies to launch attacks through the ISP's network against other targets on the Internet. When such an attack is launched, who is liable for damages?

ISPs are in a unique position to prevent or contain the harm caused by DDOS attacks in that they can cut off certain computer network attacks when those attacks enter the ISP's network. The high-capacity communications links and the powerful routers (switches) used by ISPs within their networks are much better positioned to deal with large volumes of DDOS packets than are the links and routers used by many end customers. This is particularly true if some changes are made to both the technical protocols used to move packets through the Internet and the routers (switches) that sort and route those packets. While ISPs could detect and cut off attacks, they have little incentive to do so today. Implementing better network security measures would cost the ISPs more money and could slow overall network performance, resulting in customer dissatisfaction. To efficiently and effectively identify packets as belonging to a DDOS attack, some changes (e.g., implementation of mechanisms that make it possible to reliably identify the origin of a packet and the route it has taken) might be required that would reduce the anonymity enjoyed by Internet users today. In addition, some attacks are difficult to detect, analyze, and respond to, thus requiring advanced network management systems. ISPs are largely unregulated,3 and there are no formal standards imposed on them by regulatory bodies for such things as the security, trustworthiness, or reliability of the services they provide.4 ISPs' policies on incident response vary at the discretion of each ISP, based on its own best estimate of what may be required of it, by its target customers, in a competitive marketplace.

Given that ISPs know (or should know) about the risk and have the capability to mitigate DDOS attacks, some experts believe that ISPs should face significant liability if their systems are insecure. However, some service providers argue that they should receive immunity for hostile traffic flowing through their networks.5


Vendors6

Should vendors (hardware and software) be liable for developing products with extensive security flaws? Do companies that knowingly use defective products incur liability if their systems are used to launch a downstream attack? For example, hackers know that one particular hardware vendor ships routers in an insecure state. Because of this insecurity, companies know that they need to boot up and configure the router before it is attached to the Internet. In one case, however, a summer intern did not know about this policy and attached the default-configured router to the Internet before reconfiguring it. Within 30 seconds, a hacker broke into the company's network. Who should be held accountable: the vendor who knowingly shipped an insecure product or the company whose employee did not follow the company's established procedure? Certain vendors recognize that this is an instance of marketplace failure; that is, the marketplace does not reward those who invest resources in building more secure products. Customers want more features delivered more quickly. Vendors argue that if there were a market demand for more secure products, they would change their approach.7

Some experts have called for vendors to be held liable for releasing products with security holes. Applying a strict liability standard in the computer security context seems unfair because software development is never a perfect process and most IT environments are too complex and diverse to ensure that a given product will perform exactly the same in every case. However, the CSTB report Computers at Risk notes that one way to reduce the frequency and severity of errors is through the use of tools and testing methods prior to the release of a software product. If the use of such tools and testing methods were part of industry-accepted best practices, vendors could be held liable for negligence if such tests were not performed. Accordingly, allowing vendors to be held liable for negligence would change the cost-benefit calculation to encourage the development and delivery of more secure computer products. Because such liability raises costs, vendors have lobbied against it (and for such measures as UCITA, which limits liability). However, there is some case law that supports the concept of vendor liability in connection with consumer products (Shaw v. Toshiba).

Universities and Colleges

Many universities have rather open, large-scale, high-capacity networks that can serve as a base from which hackers can launch attacks. Although some larger, well-funded universities and colleges have the resources and technical knowledge to implement appropriate security measures, not all do. In addition, providing an open environment that encourages information sharing and intellectual exploration is highly valued as an important role of U.S. educational institutions, and some security strategies would inhibit network-based interactions. Hence, it is not clear whether universities and colleges should face liability for failure to secure their computer networks. One approach is for the large research universities to set an example and play a lead role in designing and implementing secure computer network environments for all universities and colleges.


Individual Users

In the new age of broadband computing, home users represent a major source of potential security hazards. Should home users be liable if they do not take certain steps (e.g., apply software patches, install a firewall, or use antivirus software) to secure their computers? Most users simply buy a preinstalled PC or set up their computer with the default settings chosen by the software vendor. Typically, these default-configured installations provide little or no computer security, so it can be easy for a hacker to break into a home user’s computer and use it as a zombie in a distributed denial-of-service attack. The average user does not necessarily have the knowledge to secure his or her home computer and may not even be aware of the risks.8 Furthermore, even if the vast majority of end users were to proactively learn about and implement effective computer and network security measures, hackers could still launch attacks using the remaining unprotected machines as zombies.

“The average user is essentially clueless about how to prevent his computer from being taken over, so assigning liability to him would be pointless,” argues Hal Varian.9 Although currently available “personal firewalls” (software on the user’s home computer that is supposed to protect it from network attacks) are considered highly ineffective by much of the network security community, effective stand-alone firewalls for small (e.g., home or small-office/home-office) networks need not be expensive or complex and can be configured to be very protective by default. The marginal cost of manufacturing a cable or DSL modem with such a firewall integrated into it (i.e., sharing a chassis, power supply, or network interface) would presumably be very low. An effective firewall, conservatively configured, could largely prevent home computers from being attacked over the Internet, no matter how security-poor the default software configuration of those computers. Yet no supplier of broadband Internet service to the home provides such integrated devices to its users. Most experts agree that liability should be assigned to the entities best positioned to control the risks. It is not clear, however, that home users yet have the initiative, education, or resources to maintain an adequately secured network. In the home user context, the service provider may be the least-cost avoider.
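
The “conservative configuration” described above amounts to a default-deny, stateful policy: traffic belonging to connections the home user initiated is allowed back in, and everything unsolicited from the outside is dropped. The toy model below (the class and method names are invented for illustration; a real firewall enforces this policy on packets in a stand-alone device's network stack) shows why such a default protects even a security-poor PC.

```python
# Toy model of a conservatively configured, default-deny stateful firewall.
# Names are invented for illustration; a real device enforces this policy
# on network packets, not Python calls.

class DefaultDenyFirewall:
    def __init__(self):
        # Connections the home user initiated; replies to these are allowed.
        self._established = set()

    def outbound(self, remote_host, remote_port):
        # Outbound traffic is permitted, and the connection is remembered
        # so that reply traffic can come back in.
        self._established.add((remote_host, remote_port))
        return "ALLOW"

    def inbound(self, remote_host, remote_port):
        # Only replies to user-initiated connections get in; anything
        # unsolicited (e.g., a probe trying to recruit a zombie) is dropped.
        if (remote_host, remote_port) in self._established:
            return "ALLOW"
        return "DROP"

fw = DefaultDenyFirewall()
fw.outbound("198.51.100.7", 443)                   # user browses a Web site
assert fw.inbound("198.51.100.7", 443) == "ALLOW"  # reply traffic passes
assert fw.inbound("203.0.113.9", 1434) == "DROP"   # unsolicited probe blocked
```

Built into a cable or DSL modem, a policy like this would block the unsolicited probes that recruit home machines as zombies, regardless of how the PC behind it is configured.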

1  

The intent of a distributed denial-of-service attack is to reduce the availability of a computer network or resource below the level needed to support critical processing or communication. An attacker exploits security vulnerabilities to compromise one or more systems (often called “zombies”), which are then used to launch an attack against the target.

2  

This discussion focuses on ISPs with retail customers. ISPs whose business focuses on backbone connectivity and transport may not have directly connected end users, and they may face a different kind of legal context.

3  

The North American Network Operators Group is an educational forum that promotes the exchange of technical information and coordination among network operators in the United States. For more information, see <http://www.nanog.org>.

4  

The decision by many enterprises to implement multihoming arrangements (connecting the enterprise network to more than one ISP and routing traffic based on real-time availability and performance of connections) highlights the perception that ISPs do not deliver adequate and reliable service. See, for example, <http://www.routescience.com> or <http://www.sockeye.com>.

5  

Representative Bob Goodlatte introduced the On-line Criminal Liability Standardization Act of 2002. According to the Congressional Research Service summary, this act proposes to amend “the Federal criminal code to provide that no interactive computer service provider shall be liable for an offense against the United States arising from transmitting, storing, distributing, or otherwise making available material provided by another person. Waives such liability limitation where the defendant intended that the service be used in the commission of the offense. States that a provider does not have such intent unless: (1) an employee or agent has such intent; and (2) the conduct constituting the offense was authorized, requested, commanded, performed, or tolerated by one or more members of the board of directors or by a high managerial agent acting for the benefit of the provider within the scope of his or her office or employment.” See <http://thomas.loc.gov/cgi-bin/bdquery/z?d107:HR03716:@@@D&summ2=m&> for more information.

6  

The committee is not addressing the provision in the USA PATRIOT Act that confers narrow immunity on vendors. It is a clarification of the CFAA, explaining that CFAA is aimed at hackers as perpetrators, not at vendors who may have noted the vulnerabilities.

7  

In response to calls for more secure software products, Microsoft launched the Trustworthy Computing Initiative in January 2002, which is purported to ensure the security and reliability of its software. The Sustainable Computing Consortium is a collaborative effort formed in May 2002 by industry and academia to drive improvements in software quality and security. See <http://www.microsoft.com/PressPass/exec/craig/05-01trustworthywp.asp> and <http://www.sustainablecomputing.org/> for more information.

8  

However, the FTC and other consumer-oriented agencies have launched a significant education campaign starring Dewie the Turtle, designed to increase the security awareness of the average home user. See <http://www.ftc.gov/bcp/conline/edcams/infosecurity/index.html>. The National Strategy to Secure Cyberspace also focuses on increasing the information security awareness of home users.

9  

Hal R. Varian. “Managing Online Security Risks.” New York Times, June 1, 2000.

REGULATION

Often, Congress passes broad legislation that calls for implementing regulations to be promulgated by administrative agencies with oversight authority over certain regulated industries.41 Direct regulation typically involves prescriptions and proscriptions (e.g., X is allowed but Y is not). An entity that fails to conform to the prescribed or proscribed conduct could face criminal liability. Broadly speaking, regulation that may relate to CIIP could come from any combination of four imperatives: efficient economic conduct, national security, public health and safety, and consumer protection. The purpose of economic regulation is to control

41  

The Administrative Procedure Act defines the processes for rule-making followed by various agencies.

behavior (e.g., supra-competitive pricing) associated with market power or monopoly. Regulation associated with national security and with public health and safety recognizes the obligations of providers of products and services to stakeholders beyond their direct customers. Consumer protection regulation places the regulator in the role of a surrogate, acting on behalf of direct customers to protect their interests efficiently.

Telecommunications,42 electric power, and other critical infrastructures have historically been regulated as utilities.43 The regulatory status of these industries reflects perceived public interest, and it provides a basis for other government-industry interactions. An example relevant to CIIP is the rise of national security/emergency preparedness activities in telecommunications and the establishment of the Network Reliability and Interoperability Council, composed of industry representatives and staffed by the Federal Communications Commission. As those developments illustrate, regulation can be associated with certain kinds of reporting and the establishment of and conformance to certain performance standards, both of which have been sought for CIIP.

An entirely different category of regulation—consumer protection regulation—also contributes to CIIP, albeit indirectly, because such regulations target major users or suppliers of the critical information infrastructure. For example, the Gramm-Leach-Bliley (GLB) Act,44 which gave rise to regulations implemented by several government agencies (including the banking agencies, the Securities and Exchange Commission (SEC), and the Federal Trade Commission (FTC)), and the Health Insurance Portability and Accountability Act of 1996 (HIPAA),45 which gave rise to regulations under the jurisdiction of the Department of Health and Human Services, outline the responsibilities of financial institutions and health care providers and insurers, respectively, with regard to protecting consumer privacy. These acts and their implementing regulations speak to security measures that the institutions should implement to protect consumer information stored in their computer databases. On May 17, 2002, the FTC issued the Safeguards Rule, which implements the safeguard

42  

The telecommunications sector was regulated by the Communications Act of 1934 as amended by the Telecommunications Act of 1996. PL 104-104, February 8, 1996.

43  

The Defense Production Act of 1950 is an example of a regulation imposed on the private sector to ensure availability of industrial resources for national security purposes. For more information, see Congressional Research Service, “Defense Production Act: Purpose and Scope,” June 22, 2001. Lee Zeichner argues that the DPA could be extended to critical infrastructure protection (Lee M. Zeichner. “Use of the Defense Production Act of 1950 for Critical Infrastructure Protection,” 2001).

44  

15 USC, Subchapter I, Sec. 6801-6809. PL 106-102.

45  

PL 104-191.

provisions required by the Gramm-Leach-Bliley Act. The Safeguards Rule requires covered entities to implement a comprehensive information security program by May 23, 2003, to ensure the security, confidentiality, and integrity of nonpublic customer information against both internal and external threats. Institutions that fail to comply could face potential FTC enforcement actions and potential liability under state consumer protection laws or common law claims (such as negligence).46 Recent FTC settlements47 have established “reasonable security” as a written, comprehensive information security program that (1) designates appropriate personnel accountable for information security, (2) assesses security risks, taking into account, among other things, employee training, (3) implements reasonable security safeguards to control risks, and (4) adjusts the information security program in response to regular testing and monitoring. The GLB implementing regulations and recent FTC actions go a long way to setting the stage for best practices and may give rise to a de facto industry standard for negligence liability.48 However, a number of questions remain about the FTC’s de facto security standard. It is not clear whether ISO 17799 meets these requirements. Nor is it known what types of documentation, training, and supervision are necessary to meet the standard. The Microsoft settlement appears to indicate that damage is not necessary to trigger an FTC inquiry and the imposition of its security standard. Clearly, though, the recent FTC actions, combined with the GLB and HIPAA regulations, confirm that companies can no longer continue to address security issues informally.49

46  

GLB and HIPAA regulations have caused a seismic shift in the financial and health care industries (similar to the effect of Y2K on the computer industry) as institutions scramble to comply with the detailed requirements. If similar legislation could be passed in the CIIP arena, it might have equally extraordinary results. However, one major difference between CIIP and these other regulations is that CIIP does not have an immediately apparent benefit. In the case of Y2K, entire networks would purportedly have crashed if fixes were not put in place. In the case of HIPAA, Medicare payments will be withheld and other penalties may be imposed if HIPAA is not heeded. In the case of CIIP, we are largely dealing with what-if scenarios.

47  

Microsoft Corporation, File No. 012 3240, August 8, 2002, and Eli Lilly, File No. 012 3214, January 18, 2002. See the FTC Web site for more information: <http://www.ftc.gov>.

48  

Although not a precedent, the case of Ziff Davis Media Inc. shows how states (like the FTC) can use promises in privacy policies as a lever to enforce good security practices. The remedy includes a specific set of security provisions. For more information see <http://www.oag.state.ny.us/press/2002/aug/aug28a_02.html>.

49  

One potential consequence of the recent FTC actions might be efforts by a company to explicitly disclaim the suitability of its software for use in certain industries or applications, such as health care. Although the software may not differ substantially from what would normally be used, the disclaimer could be a shield to protect the company from threats or application of regulations or liability. It is not clear that such concerns are applicable in the CIIP arena, given that the information infrastructure is holistic and larger than one single industry.

The SEC, with a mission to protect investors and the securities markets,50 oversees activities relating to traded securities. Its interests in avoiding fraud—a major consumer protection interest generally—have led to regulations requiring companies to disclose certain kinds of information about what they do and their circumstances, some of which relate to measures affecting the security and stability of a company’s information infrastructure.51 Disclosure is a common vehicle for consumer protection; it helps consumers protect themselves.52

One way to encourage companies to protect the critical infrastructures that they own and operate is to adopt disclosure requirements similar to those used during Y2K. In 1997, legislation was introduced in Congress to require publicly traded companies to disclose certain information on their Y2K remediation and risk management status (via Form 10-K disclosures to the SEC). The SEC then published Staff Legal Bulletin No. 5, which reminded public companies, investment advisers, and investment companies to consider their disclosure obligations relating to anticipated costs, problems, and uncertainties associated with Y2K. In 1998, the SEC released Interpretation No. 33-7558, which superseded the staff legal bulletin. William J. Semancik, director of the Laboratory for Telecommunications Sciences at the National Security Agency, suggests that if we believe it is crucial that a company be up and operating for the sake of the country, then perhaps that company should be required to disclose the steps it is taking, so that the public can verify that the company is fulfilling its fiduciary responsibility.

Consumer protection is also the umbrella under which regulation of product quality falls. For CIIP, issues arise for the security dimension of the quality of computing and communications hardware and software. In the 1990s, case law progressed, as did efforts to revise the Uniform Commercial Code to address software more effectively, but the eventual proposals for the Uniform Computer Information Transactions Act (UCITA), which would alter state contract law, were so controversial (in part because of allegations that they tipped the balance of power too far toward vendors) that passage bogged down in every state but Maryland and Virginia.53 The effort itself, though, illustrates

50  

The SEC’s jurisdiction arises from the Securities Act of 1933 and the Securities Exchange Act of 1934. Other laws also shape the SEC’s mission, notably including the Public Utility Holding Company Act of 1935, which provides reporting requirements for electric power and natural gas utilities.

51  

The SEC’s reporting requirements for Y2K are an illustration.

52  

Sometimes a consumer’s options may be limited, but disclosure may illuminate problems that will motivate parties to come up with alternatives.

53  

On August 2, 2002, a group of legal experts proposed several amendments to UCITA to address concerns about consumer rights. It is not clear whether the amendments will

the potential for lawmaking to influence vendor responsibilities under the law and to reframe liability.

As this brief overview illustrates, regulations relevant to CIIP are a patchwork. That situation will complicate any efforts to develop a regulatory framework (rationale, legal basis, agency oversight) for critical infrastructure protection. Hank Perritt, CSTB member and dean and professor of law at Illinois Institute of Technology, Chicago-Kent College of Law, suggests that regulation is really about a fundamental choice: whether the need for a robust, reliable, critical information infrastructure is better met by a highly centralized approach—the model for which is AT&T as it existed in 1965—or whether it is better served by a highly decentralized and very market-oriented and loosely regulated approach (such as is exemplified by the Internet).54 Given how the economy and the information infrastructure have evolved, we have a decentralized system today. Any changes would have many ramifications. Many (including the current administration and the Internet community, which is often described as cyberlibertarian) view regulation by the government as interference in the market economy. On September 11, the Internet was very resilient (due in large part to a fair amount of redundancy),55 which shows that a decentralized model does not necessarily produce a less robust infrastructure. A decentralized scenario does not foreclose the possibility of the law having more bite but, rather, offers a choice of instruments that are not necessarily regulatory. For example, Mr. Perritt suggests that contract and tort law could ratchet up the cost of having an insecure network, and this disincentive could be further strengthened through regulation, without eliminating competition or decentralization.

Regulatory compliance and the desire to avoid new regulations serve both to require and to motivate all parties to pay more serious attention to securing the nation’s critical infrastructure against cybercrime and attack. The mere threat of such regulation could motivate vendors and corporations to self-regulate, providing their own standards and audit policies. The heightened interest in ISACs in 2002 is an indicator that the private sector is moving toward self-regulation. The government could periodically review such self-regulation efforts and provide reports showing deficiencies that would need to be corrected by a given deadline if regulation is to be avoided.

   

appease the many critics of the bill, including the Consumers Union and the Electronic Frontier Foundation. See <http://zdnet.com.com/2100-1104-948194.html>.

54  

This is not to suggest that centralization implies heavy regulation or that decentralization implies loose regulation.

55  

Computer Science and Telecommunications Board, National Research Council. 2003. Internet Under Crisis Conditions: Learning from September 11. National Academies Press, Washington, D.C.
