VOLUNTARY RESTRAINTS ON RESEARCH WITH NATIONAL SECURITY IMPLICATIONS: THE CASE OF CRYPTOGRAPHY, 1975–1982
In 1977 the Institute of Electrical and Electronics Engineers (IEEE) scheduled a symposium at which several important papers on cryptography were to be presented. Research had established a basis for developing powerful new encryption schemes, using fundamental concepts of computer science, and examples of these schemes were included in the papers. Prior to the symposium, however, a letter arrived at IEEE headquarters warning that the presentations might subject the authors and the IEEE to prosecution under the Arms Export Control Act of 1976. The letter was signed by an IEEE member, Joseph Meyer, who gave only his home address, but who turned out to be an employee of the National Security Agency (NSA). The function of the NSA is to intercept and decipher the communications of foreign governments and to safeguard the secret communications of the United States.
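The source does not name the schemes presented at the symposium, but the best-known work of this period was public-key encryption of the RSA type, later associated with Rivest and Adleman (both discussed below). The following toy sketch, using textbook-sized parameters that are far too small to be secure, illustrates the underlying idea of such schemes: encryption uses a public key, while decryption requires a private exponent that is hard to derive from public information.

```python
# Toy illustration of an RSA-style public-key scheme (illustrative
# parameters only; real keys use primes hundreds of digits long).

p, q = 61, 53                # two (tiny) secret primes
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # Euler's totient: 3120
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e

m = 65                       # plaintext, encoded as an integer < n
c = pow(m, e, n)             # encryption: anyone holding (e, n) can do this
assert pow(c, d, n) == m     # decryption requires the private d
```

The security of such a scheme rests on the difficulty of factoring n, which is why open publication of improved number-theoretic techniques was precisely what worried the NSA.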
After due deliberation, the IEEE decided to proceed with the symposium, although the papers of some graduate students were presented by their faculty advisors to ensure legal backing from their universities. No action was taken by the government. (It should be noted that Admiral B. R. Inman, then director of NSA, has denied that NSA attempted to suppress scholarly work in cryptography, citing a Senate committee finding that Meyer's letter to the IEEE was a personal initiative.)
Two other events also occurred in 1977. In October the University of Wisconsin at Milwaukee filed a patent application (through an affiliated foundation) for an encryption device invented by George Davida, associate professor of electrical engineering and computer science. Six months later Davida received a letter from the U.S. Patent and Trademark Office informing him that the Invention Secrecy Act of 1951 had been invoked and that if the principles of his invention were disclosed to anyone other than federal agents, he would be subject to a $10,000 fine and two years in prison. The Patent Office did not indicate how long the invention had to be kept secret, did not justify the secrecy order, and did not indicate whether there was any way to appeal its decision.
The material in this appendix is drawn from a number of sources, including (a) verbatim portions of an article by Stephen H. Unger, "The Growing Threat of Government Secrecy," Technology Review (February/March 1982), pp. 32–33, 39, 84–85; (b) a memorandum prepared for the Panel by the Office of the Director, National Science Foundation; and (c) testimony presented to the Panel in briefings by Martin Hellman, Howard Rosenblum, and Ronald Rivest.
Meanwhile, three engineers in Seattle filed an application for a patent on an inexpensive voice scrambler they planned to market. They too were the subject of a secrecy order from the Patent Office. A furor arose around both cases as protests were filed and widely reported. In June 1978 the secrecy order involving Davida's invention was rescinded, and the restriction on the scrambler unit was lifted the following October.
A related sequence of events began in 1975, when a grantee of the National Science Foundation (NSF) inquired whether NSA had sole statutory authority to fund research in cryptography and whether other federal agencies were specifically enjoined from supporting that type of work. The NSF legal staff investigated and found no basis for either proposition.
The matter of support for cryptography research was raised more formally in May 1977 when two NSA representatives visited the Division of Computer Research at NSF to explore ways of improving the coordination of policy between the two agencies. At that time an NSF program officer agreed to send proposals for funding research in cryptography to NSA for review, but with the caveat that an NSA recommendation against funding that gave no reasons for the recommendation would be considered unacceptable. NSF therefore reserved the right to fund such research at its own discretion. This agreement between the two agencies, confirmed in a letter to NSA from the Director of the NSF Division of Mathematical and Computer Sciences in November 1980, is now observed informally by all other NSF divisions as well.
In September 1978 NSF Director Richard Atkinson visited the NSA to discuss the likely response of NSA if NSF-supported basic research began to impinge on areas related to national security. To help prevent problems of this nature, Atkinson proposed that NSA sponsor a small unclassified research program to increase the overall level of support for cryptographic research and to differentiate between the areas to be funded by NSF and those to be funded by NSA. Meetings on such a program were never convened, but NSA subsequently established an unclassified research grants program, which made its first award in FY 1982. The NSF is cooperating in this new program and has made one joint award with the NSA.
The next development occurred in July and August 1980 when NSF received two letters from Admiral B. R. Inman, then Director of NSA, concerning research proposals submitted by Leonard Adleman and Ronald Rivest, respectively. NSA had reviewed the proposals and had decided that the probable results, if openly published, would have a serious negative impact on national security. The NSA proposed that both Adleman and Rivest contact it directly regarding support for their proposals. The NSF's response to the Inman letters was largely determined by its responsibilities under Executive Order 12065, which states that when an employee or contractor of an agency not having original classification authority originates information believed to require classification, the information must be transmitted to the relevant agency that has authority to classify that information.
The Foundation’s policy was elaborated in a letter from its then Acting Director, Donald Langenberg, to Science and to Nature on November 6, 1980. The letter contained two main points: (1) the Foundation would continue to support cryptographic research while coordinating its research support with the NSA and encouraging NSA to develop its own program of support for basic research on cryptography; and (2) the Foundation would ensure that its reporting requirements were adequate to allow it to meet its responsibilities with respect to classification. The Adleman proposal was approved by the NSF on December 9, 1980, and the award letter included a statement of NSF policy and an elaboration of the reporting requirements:
The National Science Foundation does not expect that results of basic research it supports will be classified, except in very rare instances. Further, while NSF does not have classification authority, it has the responsibility to refer any information which NSF has reason to believe might require classification to the agency with appropriate subject matter interest and original classification authority. Therefore, the grantee is responsible for immediately notifying the NSF Program Official of any data, information, or materials developed under this grant which may require classification. The grantee shall, prior to dissemination or publication of potentially classifiable research results obtained under this grant, allow NSF the option to review such materials. The grantee shall defer dissemination or publication pending the review and determination that the results are not classified, provided such review and determination are completed within sixty days of receipt by NSF of such material. If the review results in classification, the grantee agrees to cooperate with NSF or other U.S. agencies in securing all related notes and papers.
M.I.T., where Rivest worked, found the language in the Adleman award to be in conflict with its policy on cryptographic research, and unnecessary as well, since the Institute routinely sends all of its cryptography-related research results to the NSA at the same time as it sends them out for technical comment from others in the field. After negotiation between M.I.T. and the NSF, mutually satisfactory wording for the reporting requirement was worked out, and a grant was made to Rivest on September 25, 1981. On April 2, 1982, President Reagan signed Executive Order 12356 on National Security Information. It states, in part:
when an employee, contractor, licensee, or grantee of an agency that does not have original classification authority originates information believed by that person to require classification, the information shall be protected in a manner consistent with this order and its implementing directives.
The new executive order, in other words, places the responsibility for the initial judgment about the sensitivity of research results squarely on the grantee.
MAJOR GOVERNMENT CONCERNS ABOUT DISSEMINATION
The government has a number of reasons for its efforts to restrict the open dissemination of research results in cryptology. It is worried that open publication would jeopardize national security in three ways: by making available to foreign governments encryption techniques that NSA would have difficulty breaking, by calling to the attention of foreign governments the vulnerability of their own encryption methods, and by revealing knowledge that might endanger the inviolability of codes used by the United States. It is important to note that those aspects of cryptology that are applicable to national defense are considered a munition and require a license for export under the Arms Export Control Act of 1976.
The inviolability of U.S. codes is particularly important because of the length of time during which codes and encrypting devices normally remain in use. NSA is now working on codes and equipment whose useful lifetime will extend through the year 2030. At the same time, however, some of the encoding equipment still in use today dates from not long after the end of World War II. Thus, if theoretical information on the design of newer encrypting equipment were to become available, the working lifetime of older machines would be reduced substantially. Finally, it is often the case that cryptographic equipment is modified incrementally in order to extend its lifetime. If state-of-the-art information were openly published, this practice of incremental modification would also have to be abandoned.
A concern of an entirely different sort flows from the growing dependence in the United States on electronic communications. New types of fraud have become possible, based on the manipulation of data in computer storage banks or the interception and transformation of coded information. This raises the possibility that foreign agents could cause national economic chaos by manipulating data. One defense against this kind of “data sabotage” would be the development and deployment of powerful encryption and verification systems in the business community. In this case, however, excessive secrecy in cryptological research could actually impair national security.
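One building block of the "verification systems" mentioned above is a message authentication code, which lets a receiver of stored or transmitted data detect tampering. The sketch below uses Python's standard-library HMAC; the key and messages are invented for illustration and do not come from the source.

```python
import hashlib
import hmac

# A shared secret key (illustrative only; real systems use randomly
# generated keys under proper key management).
key = b"shared-secret-key"
message = b"transfer $100 to account 12345"

# The sender attaches an HMAC tag computed over the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# A saboteur who alters the data cannot produce a matching tag
# without knowing the key.
tampered = b"transfer $900 to account 12345"
bad_tag = hmac.new(key, tampered, hashlib.sha256).hexdigest()

# The receiver recomputes the tag and compares in constant time.
assert hmac.compare_digest(
    tag, hmac.new(key, message, hashlib.sha256).hexdigest())
assert tag != bad_tag   # tampering is detected
```

Widespread deployment of such techniques in the business community is exactly the defense the paragraph above describes, which is why excessive secrecy in cryptologic research could itself impair security.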
FORMATION OF THE PUBLIC CRYPTOGRAPHY STUDY GROUP
One outgrowth of the cryptography controversy was the formation in 1980 of the Public Cryptography Study Group (PCSG), which was created by the American Council on Education and has been funded by the NSF. The nine-member group included mathematicians and computer scientists nominated by various professional societies, university administrators, and the general counsel of NSA. The group's goal was to find a way to satisfy NSA's concerns about the publication of cryptographic research papers without unduly hampering such research or impairing First Amendment rights.
Although the PCSG initially did not wish to be bound by national security restrictions in considering various options, it ultimately agreed to accept the need for such constraints as a working hypothesis. It first considered a mandatory system, backed by the NSA, that would require all papers dealing with cryptography (as defined by the NSA) to be submitted to the Agency for prepublication review. This proposal was rejected, partly because the group felt that it had not been able to assess clearly the need for secrecy. (The PCSG neither sought nor obtained security clearance for its members during its deliberations.) The Group also felt that a voluntary arrangement would be more likely to gain the cooperation of researchers.
The PCSG eventually recommended the establishment on a trial basis of a system in which NSA would invite authors to submit cryptography manuscripts voluntarily for prior review at the same time that the manuscripts were submitted to journals. NSA would determine the research areas to be covered by the system after consultation with the appropriate technical societies. Manuscripts would be returned promptly to the authors with explanations “to the extent feasible of proposed changes, deletions, or delays in publication, if any.” An author who disagreed with NSA’s views on a manuscript could request a review by a committee composed of two members appointed by the director of the NSA and three appointed by the President’s science advisor. The entire process would be voluntary, with neither authors nor publishers required to participate or comply with any proposed restrictions.
This proposal was accepted by all members of the PCSG except Professor Davida, who wrote a minority report arguing against any restraints. Among his many objections was the difficulty of distinguishing between basic research and knowledge directly applicable to actual systems. Davida also was concerned that a voluntary system could be a first step toward a compulsory system, and that the PCSG report could be used to support NSA’s argument about the necessity of government controls over cryptographic research.
To date, approximately 46 papers have been handled using the procedures proposed by the PCSG. Only two of the papers were deemed by NSA to have implications for national security, and in both cases the problems were resolved to the satisfaction of all parties. It should be noted, however, that the NSA is not entirely satisfied with the PCSG solution. The Agency’s two principal concerns are that the present arrangement sets a precedent for the future and that there is potential danger in simultaneous review for security and publication purposes. The most significant problem, in NSA’s view, is the possibility that a “blockbuster” paper—i.e., a paper reporting on research constituting a radical breakthrough in current knowledge—might slip through the system and seriously damage national security. The Agency believes that, as more researchers move into the field of cryptography—due, in part, to increased private sector interest—the potential for a blockbuster paper will increase significantly.
WIDER APPLICABILITY OF THE CRYPTOGRAPHY MODEL
The question of whether PCSG's solution for cryptographic research might be applied to research papers in other fields is beyond the scope of this paper. However, cryptography has a number of distinctive characteristics that would have to be taken into account in determining whether the solution could be applied elsewhere.
First, the field of cryptography involves only a few dozen researchers—most of whom are working colleagues—and the publication of fewer than 100 papers per year. Second, the agency with the principal interest in cryptography, the NSA, is both technically competent and mission-oriented. In other words, it is engaged in the direct use of cryptography. Third, the frequency of problem papers—i.e., papers that would interfere with NSA's mission—is small. These characteristics do not prevail in other areas of science and technology. Hence, it is far more difficult for the government to evaluate the potential impact on national security of any single research paper in other areas of science and technology.