Over the past 40 years, the size and structure of the scientific enterprise have changed dramatically. Prior to World War II, research that created fundamental knowledge was done in universities (mostly in Europe) by a relative handful of the intellectually elite. Support for scientific research was scarce, and only the best and most dedicated professors succeeded.

World War II brought about a fundamental change in the relationship between science and society. Most notably, the Manhattan Project joined basic and applied science in an unprecedented way and, in a very short time, achieved spectacular technological results that were highly visible to the public. The intellectual challenge met by scientists and engineers in achieving a nuclear-fission explosion had a profound impact on public attitudes toward science. That accomplishment played a major role in the formulation of a new social contract in which science (physics, in particular) was elevated to a very high national priority.

Following the war, the United States rapidly expanded its investment in basic science in both the public and private sectors. The new “love affair” with science facilitated the formation of national laboratories, industrial research laboratories (many in attractive, rural locations), and university research parks. These developments occurred, however, without much understanding of the links between effort and resulting utility (investment and return). Rather, a mixture of awe and faith, sustained by breakthrough discoveries that led to further examples of utility (antibiotics, the transistor, lasers, and computers, among others), kept the social contract intact. Even today, most universities in the United States aspire to be “research universities,” and a large fraction of new science PhDs still aspire to a government-supported career in basic research at a university, at a federal laboratory, or in one of the few industrial laboratories still conducting such research.

The image of science as a sure-fire source of social utility (now expanded to include “high-tech” consumer products, medical diagnosis and treatment, high-speed transportation, and so on) persisted in the public mind for three decades with little questioning. By the 1980s, however, several events had brought substantial change. Industry was withdrawing from basic research to reduce costs in the face of growing foreign competition. At the same time, increasing federal budget deficits, social problems, disclosures of environmental abuses, distrust of major institutions (especially the defense establishment), and the rising costs of military systems and mega-science projects were causing a growing, if largely unspoken, unease with the social contract. The demise of the Soviet Union and the end of the Cold War accelerated this process by opening to scrutiny all forms of government-supported research. Among other consequences, this led many to suggest that our expensive defense research laboratories be dismantled and that the dollars thus saved be redeployed to address pressing near-term social and economic problems.

Finally, the realization that the real threat to national security is an economic struggle in which many foreign nations have achieved startling successes without sizable investments in basic science is causing a critical reassessment of national research priorities. The current situation, in which other countries are converting U.S. scientific discoveries and technological innovations into competitive advantage, is reminiscent of U.S. successes during and after World War II that were based in large part on European science of the 1920s and 1930s. This phenomenon has led to the realization that economic success in a global economy depends more on the speed and efficiency with which a nation transforms scientific and technological advances into competitive products and services than on the generation of new knowledge alone.


