All critical infrastructures (transportation, finance, electric power, water, etc.) are increasingly dependent on the evolving information infrastructure—the public telephone network, the Internet, and terrestrial and satellite wireless networks—for a variety of information management, communications, and control functions. September 11 significantly increased the nation’s awareness of the interdependencies of critical infrastructures, and it heightened the government’s sense of urgency regarding the need for increased private sector and public sector information sharing with respect to cyber and physical threats.

The Committee on Critical Information Infrastructure Protection and the Law held a symposium October 22-23, 2001, to outline issues related to the protection of the critical information infrastructure. This symposium, which had been scheduled well in advance of September 11, was profoundly influenced by those tragic events. Twenty-four presentations, over 2 days, illustrated the wide range of perspectives and concerns that complicate policy making and the development of an adequate legal regime when it comes to critical information infrastructure protection (CIIP). Subsequent crafting of new law and administrative activity under the rubric of homeland security have kept the legal framework for CIIP a moving target.

This report examines the range of legal issues associated with information infrastructure protection, particularly those that affect the willingness of private sector companies and organizations to cooperate with the government to prevent, detect, and mitigate cyberattacks. It considers separately different aspects of information sharing and liability—recognizing that there is a tension between these approaches that strategies for critical information infrastructure protection must ultimately resolve.
Although the sharing of information has been the centerpiece of both the government’s and the private sector’s efforts over the past several years to protect critical information systems, most information sharing still occurs through informal channels. Fundamental questions persist about who should share what information, when, how, why, and with whom. One reason for the lack of progress, according to private industry representatives, has been the lack of clarity regarding the benefits and associated liabilities in sharing information within and between industry sectors and with the government. For example, information sharing could lead to allegations of price fixing, restraint of trade, or systematic discrimination against certain customers; it could raise privacy concerns, expose proprietary corporate secrets, or reveal weaknesses and vulnerabilities that erode consumer confidence and invite hackers. Overcoming these concerns requires an informed position on the existing legal framework—an imperfect understanding of the law is both excuse and explanation for some observed limits to sharing.
Freedom of Information Act
Many private sector companies believe that proprietary CIIP-related information shared with federal government entities may be disclosed to third parties under the Freedom of Information Act (FOIA). Therefore, private sector companies have proposed amending FOIA to create a new exemption that would protect critical infrastructure information from disclosure. Opponents of such an exemption argue that the case law and agency interpretations demonstrate that the information—that is, information that is a trade secret or information that is commercial or financial, obtained from a person, and privileged or confidential—already is protected under the existing FOIA Exemption 4. Changing the FOIA, opponents argue, could upset the existing FOIA framework and open up the possibility for new litigation. Although the Homeland Security Act of 2002 did feature such an exemption, the fundamental issues remain.
A key problem is whether the federal government has the processes in place to protect information that should be protected under existing FOIA rules from inappropriate or accidental disclosure. The government may need to strengthen its formal controls on disclosure of information under FOIA, disclose to the private sector what those controls entail, and strengthen its programs to better educate federal agency employees (who respond to FOIA requests) about the types of information that cannot be released under existing law.
Antitrust

An additional concern of many in the private sector is that sharing CIIP-related data with competitors could be viewed as a violation of the Sherman Antitrust Act. As a result, many in the private sector have called for a new antitrust exemption. Opponents argue that a new exemption is not needed to protect firms from allegations of anticompetitive behavior. They suggest that a firm can obtain informal legal advice from antitrust experts or formal advice from the Department of Justice—in the form of a business review letter—on whether its proposed future conduct would be viewed as a violation of the antitrust laws. In addition, an exemption would create a new body of law that would upset 30 years of case history and lead to years of new litigation. Hence, the American Bar Association opposes new antitrust exemptions. As with FOIA, existing antitrust law does not prevent the private sector from sharing critical infrastructure information. However, because official reviews of proposed information-sharing activities require time and money to obtain, the use of such reviews may be a barrier to the types of ad hoc information sharing that are most likely to uncover well-planned attacks on the infrastructure. Also, as with FOIA, there are persistent perception problems related to what may be deemed permissible and what may be deemed illegal.
Liability

Experts observe that criminal law alone is not sufficient to deter hackers and prevent cybercrime; civil liability is also needed to put proper disincentives in place for would-be cybercriminals. Ideally, civil liability allows a victim to recover losses from third parties if such parties were negligent or engaged in intentional misconduct and if such negligence or misconduct was the proximate cause of the loss. Because contract law does not provide an adequate remedy for third parties that have no privity of contract, many experts have suggested the use of tort law as a model for computer-related cases. Proponents of tort liability argue that the companies that control computer networks are in the best position to implement appropriate security measures. If a company knows or has reason to know that its computer network is being used to cause harm, and it has the capacity to stop such harm from occurring, the company could be subject to liability if it does not take some corrective action. The applicability of tort law and the potential for civil lawsuits and monetary damages could encourage companies to invest in computer security measures. Debate continues in the private sector on whether a company has a legal duty to secure its critical information infrastructure.
Standards, Best Practices, and Audits
Establishment of operational best practices for network administrators and users, combined with ongoing training and enforcement of the practices through random tests, is one possible way of increasing computer security. An obvious option is for firms to begin immediately to share best practices, including attack scenarios and practices to protect against these attacks. Best practices should focus on policies that improve computer network security rather than on procedures and rituals, which only create a perception of protection. In addition to playing a role in tort liability determinations, best practices can also serve as a benchmark against which firms can be audited. Routine audits based on well-accepted principles of testing and analysis, principles that need to be developed for computer security, can help firms avoid litigation or reduce liability.
As a force motivating industry to adopt best practices, tort law can be a significant complement to standard-setting practices, since compliance with industry-wide standards is often an acceptable demonstration of due care. If tort liability were more directly applicable in computer security cases, implementing security standards would be a way for a company to minimize its liability. Adopting a nationally recognized computer security standard of care is not, however, a simple process, owing to the evolving nature of security vulnerabilities and the diverse players that have an Internet presence. In addition, the meaning of “reasonable care” is never static, and firms must adapt the standard as technology changes.
Because legal liability often depends on which actors are best positioned to prevent the harmful activities (in this case, computer attacks), some experts suggest that the diverse entities in the Internet community should not all be held to the same standard of care with respect to computer network security. Given that certain Internet Service Providers (ISPs) may know (or should know) about risks and have the capability to mitigate attacks, many experts suggest that ISPs should face significant liability if their systems are insecure. It is possible to reduce (although not eliminate) the frequency and severity of errors through the use of tools and testing methods prior to the release of a product. If the use of such tools and testing methods were part of industry-accepted best practices, vendors who do not perform such tests could face greater exposure under theories of negligence. Allowing vendors to be held liable for negligence may change the cost-benefit calculation, encouraging the development and delivery of more secure computer products.
Home users represent an important source of potential security hazards (as well as potential victims), since they often do not have the knowledge or expertise needed to secure their computers and prevent hackers from using them in a denial-of-service attack. Efforts to educate home users about the use of firewalls or antivirus software undoubtedly will help, but thought should be given to assigning liability to the other entities that are best positioned to mitigate the risks related to the systems and services used in the home.
Regulation

The patchwork of regulations relevant to CIIP complicates efforts to develop a regulatory framework for critical infrastructure protection. The Gramm-Leach-Bliley Act (GLB), which resulted in regulations promulgated by several government agencies (including the banking agencies, the Securities and Exchange Commission, and the Federal Trade Commission), outlines the responsibilities of financial institutions with regard to protecting consumer privacy. The GLB-implementing regulations help set the stage for best practices and may become a de facto standard in assessing negligence liability.

Regulatory compliance and the desire to avoid new regulations serve both to require and to motivate all parties to pay more serious attention to securing critical infrastructures against cybercrime. The mere threat of such regulation could motivate vendors and corporations to self-regulate, providing their own standards and audit policies. The heightened interest in private sector Information Sharing and Analysis Centers (ISACs) in the last few years is a sign of movement toward self-regulation. The government could periodically review such self-regulation efforts and provide reports showing deficiencies that would need to be corrected by a given deadline if regulation is to be avoided. The Federal Trade Commission, for example, has done this for Web site privacy policies. Another approach to encouraging companies to protect the critical infrastructures that they own and operate is to adopt requirements for disclosing the steps that have been taken to evaluate and mitigate risks, similar to the Securities and Exchange Commission’s Y2K disclosure requirements.
THE BIG PICTURE
The legal framework for critical information infrastructure protection must be considered in the larger context of the business, social, and technical environment. The increasing dependence on common technology and interconnected systems suggests that many of the technical vulnerabilities can be overcome only through collective, concerted action. Externalities are common in computer network security; the incentive that one network owner has to invest in security measures is reduced if the owner believes that other connected networks are insecure. Insurance can play a role in motivating the private sector by transferring the risk of computer security losses from a company to the insurance carrier. The few cyber insurance policies in effect today require companies to employ appropriate security measures. Most policies also require firms to undergo an initial independent security evaluation of network defenses and ongoing intrusion-detection tests during the life of the policy.
Prior to September 11 the security of information systems and the protection of personal data and privacy were considered to be mutually reinforcing and compatible goals. Many experts suggest that the crisis-management mentality in the aftermath of September 11 has pushed aside issues of privacy and civil liberties. Technical mechanisms proposed to aid government efforts in the war on terrorism appear, to some, to sacrifice privacy and civil liberties for only the illusion of an increased ability to protect the nation’s infrastructures. Mechanisms should be implemented to ensure that surveillance conducted to combat terrorists and hackers does not result in a loss of privacy for innocent citizens. Symposium participants noted that the seriousness and urgency of protecting the nation’s infrastructures make it even more important to protect well-established constitutional and statutory principles of privacy and civil liberties in crafting a solution.
Trust among those sharing information is one of the most important prerequisites for successfully protecting the nation’s critical information infrastructures. Trust is necessary to achieve an atmosphere of openness and cooperation. Although trust has been a central component of the government’s CIIP efforts over the past several years, the government has failed to build sufficient trust between the public sector and the private sector for four reasons. First, the government’s message to the private sector has vacillated—at times it stresses national security, at other times, economic vitality—raising concerns about whether the priority of the day will trump prior promises. Second, the government has so many focal points for CIIP that firms often do not know which agency to contact or what authority and established processes underpin the promises of that agency to protect information from disclosure. Third, the government has been slow to reciprocate in sharing information with the private sector.
Finally, in the aftermath of September 11, the government took actions that produce a perception (right or wrong) that it may unilaterally suspend prior agreements with respect to the nondisclosure of information if it deems that circumstances warrant. The government should clearly and consistently explain to the private sector what its objectives are for CIIP, how it has organized itself to accomplish those objectives, what the information flows are, what kind of information should be shared and in what form, what the government is willing to share with the private sector, and why all of this is important (i.e., what the threat is, and how the proposed actions will address that threat). This message should clearly and consistently articulate what protections already exist for information sharing and what safe harbors exist (or will be established) to encourage information sharing in light of FOIA and antitrust concerns in the private sector. Consolidation of critical infrastructure protection functions in the new Department of Homeland Security will create a focal point; the tasks of clarifying the policies and communicating with the public remain. A clear and consistent message from the government to the private sector will go a long way toward building the trust that is necessary to protect the nation’s critical information infrastructures.