14 Legal Pressures in National Security Restrictions

David Heyman

I will address the new national security restrictions and the pressures that they may impose on the public domain. The key questions to address today are: what information can we create and share with the world, and what information cannot be openly shared, in light of the increased threat environment? I want to start by reading to you from a government directive that attempts to answer this question in part. It was issued in December 2001 and provides regulations on publications. It says that no publication is allowed to contain content that would: "insult or slander other people"; "publicize pornography, gambling and violence, or instigate crimes"; "leak state secrets, endanger national security, or damage national interests"; "endanger social ethics and outstanding national cultural traditions"; "disrupt public order and undermine social stability"; or "publicize cults and superstitions." 1 This document is the People's Republic of China's "Regulations on Management of Publications," issued December 31, 2001. I bring it to our attention because the question of what information can and should reasonably be controlled is clearly open to interpretation: some of the regulations in China seem quite reasonable, some are questionable, and some are beyond the pale. What is important here is that there is a line, which at some point we cross, as you probably did while listening to me. It is a line that contemplates legitimate restrictions on traditional public-domain information. I believe that if our policies go to one extreme of that line, we risk national and economic security, and if they go to the other extreme, we risk freedom. Our job today and in the future is to clarify that line and shine a light on it so that reasonable people can make reasonable policy.
I am going to focus on federal investments in science and technology and the contributions these investments make to creating and disseminating public-domain information and, ultimately, to improving the security and well-being of our nation. Let us first look at the new pressures being put on the public domain as a result of the attacks of September 11. I think what was so shocking about the terrorist attacks was the realization that the terrorists lived among us and used our open society against us. Our experience turned out to be a bit of a perversion of Pogo's famous quote, "we have met the enemy and he is us." We felt victimized after September 11 in part because the terrorists exploited the very aspects of American society that make our country strong: its openness, easy access to information, freedom of association, ease of mobility, and right of privacy. The terrorists lived among us and used our freedoms against us.

1. See "PRC Regulations on Management of Publications," issued December 31, 2001. Translation from the Foreign Broadcast Information Service, available online at http://www.fas.org/irp/news/2002/05/xin123101.html.

The recognition that we were vulnerable to this, and the fear that this vulnerability caused, generated a wave of government responses to limit those very attributes of an open society that were used against us, in particular access to public-domain information. I will discuss some specific examples later in my remarks. As a result of this experience, we are now witnessing two key components of public-domain information being constricted by national security pressures: first, the creation of public-domain information, and second, its availability for others to use.

Let us start by looking at the creation of public-domain information. Science can provide us with the capability to acquire information about the nature of the physical world, as well as technological alternatives that we do not presently possess. This information, in the long run, is vital to the future of the U.S. economy, our national defense, and our general well-being. Science can only do this through sustained investments over the long term and through the continuous development of a talented workforce to perform research and development (R&D). Today, the United States is investing more in R&D than it ever has, even after adjusting for inflation. Nonetheless, the U.S. share of total world R&D is decreasing, and more research capabilities are becoming available outside the United States. Over the past 50 years, the United States has gone from performing more than 70 percent of the world's total R&D, in dollars spent, to a point where today the rest of the world performs approximately twice as much R&D as the United States. These changing investment patterns are having an impact on where research results are produced and where they are found. For example, since World War II, U.S. scientists have led the world in authoring scientific publications, a measure of where scientific discoveries are being made.
Recently, as a result of the increased quality and volume of scientific activity in many countries, more and more articles are being submitted to journals by scientists outside the United States. In the physical sciences, U.S. publications accounted for nearly 70 percent of all articles in the early 1980s; today, they have fallen to approximately 25 percent of the world's total.

The changes in the U.S. science and engineering workforce are even more interesting, particularly in the context of national security. As I noted, there has been an increasing amount of research and technical resources available outside the United States and a growing internationalization of science and technology. As a result, U.S. scientists and engineers comprise a diminishing share of the total global technical workforce. Fewer U.S. scientists are pursuing the physical sciences and other comparable hard sciences, and those who are prefer academia and industry to a government career. Lastly, more foreign-born students are being trained as scientists and engineers in the United States, and approximately 50 percent of them remain in the United States to become part of the workforce. At the same time, domestically, increased private-sector investment and the employment opportunities that emerged from the technology boom of the 1990s created a honey pot that competed with the government to fill valuable R&D positions.

To summarize, the creation of science-based public-domain information, in a simplified world, is a product of investment and effort. As the U.S. share of total world R&D diminishes, so, likely, does its share of the creation of the world's total public-domain information. As investments shift from civilian to military priorities, as can be anticipated with a war on terrorism, we might expect to see some crowding out of investments that lead to public-domain information as well. And lastly, as the U.S.
workforce becomes increasingly reliant on foreign scientists and engineers, the challenges of controlling technology transfer will grow, limitations on interactions with foreign scientists may increase, and we will likely see more restrictions on access to public-domain information. This is, in fact, what we are seeing.

The availability of public-domain information has been squeezed significantly by recent national security developments. There is a multitude of ways in which public-domain information is made available, including publications, Web sites, conferences, and presentations, as well as working collaborations. Security professionals are concerned not only with what information is being published, but with how it is transferred to the public and, in particular, how interactions among scientists and engineers serve as a mechanism for exchanging information. The post-September 11, post-anthrax security environment has raised concerns regarding the possible malicious use of scientific and technical information and puts greater pressure on scientific institutions to strengthen security to prevent the unintended transfer of technology to those who would harm us. We felt this pressure when we realized that the perpetrators of the September 11 attacks lived secretly in our neighborhoods and operated freely within our open society. Some of the September 11 terrorists entered the United States on student visas but never matriculated at the schools to which their visas applied; some received pilot training in the United States. These examples raised concerns that terrorists or other enemies of the United States may seek to gain entry under acceptable and rather innocent guises but may, in reality, seek to enter the country to acquire knowledge, skills, or technologies to mount future attacks against the United States.

At about the same time, we also saw the publication of three research papers that generated significant alarm. First, a study published in the Journal of Virology described an experiment by Australian scientists to re-engineer a relative of smallpox, called mousepox, in a way that made the virus far more deadly. 2 Some have argued that if the same technique were applied successfully to smallpox, the consequences for society could be devastating. Similarly, the Proceedings of the National Academy of Sciences published a study by scientists at the University of Pennsylvania that provided details about how smallpox uses a protein to evade the human immune system. 3 Again, critics suggest, such information could be quite harmful if misapplied. Lastly, in July 2002, Science magazine published a paper in which scientists at the State University of New York at Stony Brook described how to make poliovirus from mail-order DNA. 4 The publication of that study spurred Rep. Dave Weldon, a Florida Republican, to introduce a resolution criticizing Science for publishing "a blueprint that could conceivably enable terrorists to inexpensively create human pathogens for release on the people of the United States." 5

As a result of these and other developments, there are a number of growing efforts today to protect and limit access to scientific information, including efforts to restrict the activities of foreign nationals and the interactions of U.S. nationals with foreign nationals. The Bush Administration, the U.S. Congress, and some scientific communities have adopted or are considering new security measures that could dramatically shrink the availability of public-domain information. These include increased foreign student monitoring, 6 restricted access to certain technical materials or tools, expanded export controls, tightened visa requirements, 7 and limits on publications. 8 Security reforms also include efforts to limit information historically provided to or already in the public domain; 9 to expand the use of a category of classification known as sensitive, unclassified information; to broaden the enforcement of a concept called deemed exports, that is, the oral transfer of technology between people; to broaden classification authority in the executive branch; 10 and to impose new restrictions on fundamental research.

2. See Jackson, R.J., Ramsay, A.J., Christensen, C.D., Beaton, S., Hall, D.F., and Ramshaw, I.A. 2001. "Expression of mouse interleukin-4 by a recombinant ectromelia virus suppresses cytolytic lymphocyte responses and overcomes genetic resistance to mousepox," Journal of Virology, 75(3):1205-10, Feb.

3. See Rosengard, A.M., Liu, Y., Nie, Z., and Jimenez, R. 2002. "Variola virus immune evasion design: expression of a highly efficient inhibitor of human complement," Proceedings of the National Academy of Sciences, (13):8808-13, June 25.

4. See Cello, J., Paul, A.V., and Wimmer, E. 2002. "Chemical synthesis of poliovirus cDNA: generation of infectious virus in the absence of natural template," Science, 297(5583):1016-8, Aug. 9.

5. See House Resolution 514, 107th Congress, introduced by Rep. Dave Weldon, July 26, 2002.

6. The USA Patriot Act of 2001 (Public Law 107-56) requires universities and "other approved educational institutions [including] any air flight school, language training school, or vocational school" to build and maintain a sizable database on their students and transmit that information to the Department of Justice, the Immigration and Naturalization Service (INS), and the Office of Homeland Security. The database system, called the Student and Exchange Visitor Information System, would automatically notify the INS of a student's failure to register or of anything going wrong during the student's stay. Further, failure of a university to provide the information may result in the suspension of its ability to receive foreign students (the ability to issue I-20s or visa-eligibility forms).

7. The Patriot Act also allows the U.S. Attorney General to detain immigrants, including legal permanent residents, for seven days merely on suspicion of being engaged in terrorism. The bill denies detained persons a trial or hearing at which the government would be required to prove that the person is, in fact, engaged in terrorist activity. Further, in September 2002, the INS implemented the initial phase of the National Security Entry-Exit Registration System (NSEERS) at selected ports of entry. Under the NSEERS program, certain individuals are interviewed, fingerprinted, and photographed upon entry into the United States, and their fingerprints are checked against a database of known criminals and terrorists. These individuals must also periodically confirm where they are living and what they are doing in the United States, as well as confirm their departure from the United States.

8. Taking the first significant step toward self-regulation in this area, in February 2002 the publishers of some of the most prominent science journals in the country issued a pledge to consider restricting certain scientific publications in the name of security. The statement outlined the unique responsibility of authors and editors to protect the integrity of the scientific process, while acknowledging the possibility that "the potential harm of publication [of certain research may outweigh] the potential societal benefits."

9. In October 2001, Attorney General Ashcroft revised the federal government's policy on releasing documents under the Freedom of Information Act, urging agencies to pay more heed to "institutional, commercial, and personal privacy interests." The administration wants the new Department of Homeland Security exempted from many requirements of the Freedom of Information Act. In March 2002, the President's Chief of Staff issued a memo to executive agencies requesting that they safeguard information that could reasonably be expected to assist in the development or use of weapons of mass destruction, including information about the current locations of nuclear materials. As a consequence, federal agencies have removed a range of information from their Web sites and other public access points.

A concern raised by all of these developments is the impact that they may have on the scientific community and, consequently, on the scientific enterprise. Additional security requirements may wittingly or unwittingly diminish the amount of scientific and technical data available in the public domain. Furthermore, such requirements may slow the production of new knowledge or reduce the ability of some to publish or present their findings because of new classification concerns, which, in turn, will diminish peer recognition, career advancement, and, ultimately, morale. At that point, students may choose not to matriculate at U.S. universities, and scientists may choose to leave their government positions or reject government funding rather than endure the environment in which they must operate. And who will replace them, and who will do their work?

This scenario is not fabricated, and it is not without precedent. In fact, it is exactly what we witnessed at the Department of Energy (DOE) a few years ago. Between 1998 and 2000, the United States faced three national security crises involving the potential loss of scientific and technical information. First, a high-level congressional investigation determined that China had stolen advanced missile technology from the United States and from U.S. corporations, as well as plans for the W88, one of the nation's most sophisticated nuclear weapons. Second, a scientist at one of DOE's premier national security laboratories was accused of giving sensitive nuclear information to China. This is the Wen Ho Lee case.
Last, less than a year after the first two issues surfaced, two computer hard drives containing classified nuclear weapons information disappeared from a DOE laboratory for over a month. These incidents spurred dramatic reforms from both the legislative and executive branches, including numerous new security measures at DOE to protect scientific and technical information and, in certain circumstances, to prevent foreign nationals from accessing the labs. Concerned about the consequences of these new reforms, then-Secretary of Energy William Richardson established a high-level commission, led by former Deputy Secretary of Defense John Hamre, to assess the new challenges DOE faces in operating premier science institutions in the twenty-first century while protecting and enhancing national security. An analysis by the commission, whose members included former FBI Director William Webster, former FBI Deputy Director Robert Bryant, and numerous Nobel laureates, revealed that, although most reforms were well intentioned, many were misguided or misapplied and only exacerbated existing tensions between scientists and the security community, contributing to a decline in morale and, in some instances, productivity. I recommend the report, which goes into much more detail on this topic. 11 In the end, the commission found "that DOE's policies and practices risk undermining its security and compromising its science and technology programs." Relevant to today's discussion, the commission found management dysfunction that impairs DOE's ability to fulfill its missions. In other words, good policy could be undermined by poor management. In the area of information security, the commission found that the process for classifying information was, in fact, disciplined and explicit. However, the same could not be said for the category of sensitive, unclassified information, for which there is no usable definition at the department, no common understanding of how to control it, no meaningful way to control it that is consistent with its various levels of sensitivity, and no agreement on what significance this category has for U.S. national security. Consequently, security professionals found it difficult to design clear standards for protection, and scientists felt vulnerable to violating rules about categories that were ill defined. As a consequence, scientists and engineers began opting out of conferences and, in some cases, out of publishing.

We have to understand that heightened security is, in fact, appropriate and necessary after September 11. But we should also be deliberate and learn from DOE's experience, or, like DOE, we will risk undermining the very security we seek and diminish the scientific programs vital to our national security and our economy.

10. Through Executive Orders issued in December 2001 and in May and September 2002, the Secretary of Health and Human Services, the Administrator of the Environmental Protection Agency, and the Secretary of Agriculture, respectively, were granted the authority to classify information originally as "secret."

11. See Center for Strategic and International Studies (CSIS). 2002. Science and Security in the 21st Century: A Report to the Secretary of Energy on the Department of Energy Laboratories, CSIS, Washington, D.C.
In conclusion, I would like to offer a few principles from the DOE experience that may be instructive on the question of limitations on public-domain information.

The first principle focuses on security. Any policies we seek to derive or practices we wish to employ regarding the safeguarding of scientific and technical information must be determined through collaboration between the scientific and technical community and the intelligence and security communities. Scientists cannot be expected to be aware of all the risks they face from hostile governments and agents. At the same time, security professionals can only understand what is at stake by working with scientists. In fact, these two communities must depend on each other to do their shared job successfully.

Second, we must know what we want to protect. What is secret? We cannot make judgments on the architecture of security without first understanding the nature and conduct of science and the scientific environment in which we operate. For example, it is crucial to understand that today classified work has come to depend on unclassified science and technology, and unclassified science, in turn, has become more international and connected by digital communications. There are consequences to this in terms of whom U.S. scientists and engineers seek to collaborate with and whom they seek to employ. There are also costs if we choose to impose limitations on this way of working.

Third, we must know what threats we face. We must understand that we exist in a changing world of dynamic threats and national security interests. Technological advances not only serve our social goals; they may also enable our adversaries to exploit our critical systems and our key personnel. There are legitimate security concerns that we must confront.

Fourth, we must understand that absolute protection is impossible.
Security is a balance between resources, which are limited, and risks, which can never be eliminated. We will have to make choices that mitigate the risks. That means we must understand the value of what we seek to protect, the consequences of its being compromised, and the cost of protecting it.

Fifth, security processes should minimize disruptions to scientific activity. Security procedures must strike a balance: they must be unobtrusive enough to permit scientific inquiry, yet effective enough to maintain strong security.

Sixth, we should control information only where there are no other cost-effective alternatives for ensuring national security. I think this is similar to what Justin Hughes proposed earlier.

Finally, if information security is required, we should use understandable, meaningful, and workable classification systems to protect it.

These principles represent hard decisions that we must make to manage our growing information society and vital scientific enterprise. Misguided or misapplied limitations on scientific activities motivated by security concerns pose a clear threat to science and society today. Information, in the end, is the oxygen that feeds science, our economic system, and our democracy. Consequently, we must be deliberate in defining the line between what should be in the public domain and what should be restricted.