PART I
THE FEDERAL ROLE IN COMPUTING RESEARCH
1
Introduction

The latter part of the 20th century has witnessed a revolution in computing and related communications technology. Just as earlier eras witnessed transformations wrought by steam power, internal combustion engines, and electricity, the 1990s have seen the development, elaboration, and diffusion of a general-purpose technology that is transforming society. Computing technology has infiltrated all corners of society, from the workplace and the laboratory to the classroom and the home, changing the way people conduct business, govern, learn, and entertain themselves.

The computer revolution is predicated on 50 years of effort by industry, universities, and government. Together, these entities have created an innovation system that has vastly improved the capabilities of computer-related technologies, from semiconductors to computers, and from software to data communications networks. Real-time, online operating systems, graphical user interfaces, the mouse, the Internet, high-performance computers, and microprocessors are all offspring of the productive interaction among government, universities, and industry in the innovation process.

Understanding the interplay among industry, government, and universities in developing new computing technology is an important step in framing both public and private policies that will shape future research activities. As the nation attempts to maintain its leadership in computing, business leaders, policymakers, and university researchers will need to understand the sources of their past success. This report examines the history of innovation in the field of computing and related communications technologies, with emphasis on the role
of the federal government in supporting computing research.1 It provides an overview of the federal government's investments in the nation's research infrastructure for computing and, through a series of historical case studies, illustrates the ways in which these investments have influenced the field. As such, the report is not a comprehensive history of computing but rather an attempt to provide insight into the role of federal research funding in the innovation process for computing. It is hoped that the lessons drawn from this report will provide guidance to policymakers attempting to plot the course of federal research investments over the next several decades.

Using History as a Guide

Historical analysis is one means of informing debates over the role of the federal government in computing research. History provides empirical evidence of the success and failure of different policies over time, and it offers evidence from which patterns can be seen and conclusions drawn about the funding process in particular and innovation in general. Examining changes in government support for technology over many decades helps filter out the noise of short-lived fads, political and technical fashions, and individual anomalies. It allows recognition of the often long time lags between the initial funding of research and its subsequent incorporation into commercial products. Similarly, it puts into perspective the frequently long lags between the implementation of policies and the realization of their major and lasting effects.

Case studies are a standard tool of historical analysis, allowing one to move more deeply into the mix of events, people, and organizations associated with the funding of computer research. Case studies provide an intimacy with history akin to that experienced by persons who lived it. They present the messy details of real-life experience not available in abstract, quantitative analysis.
At the same time, case studies are limited in their analytical power. To some extent, the conclusions drawn from case studies are conditioned by the particular cases examined (see Box 1.1 for an example). As economist Richard Nelson noted after conducting case studies of seven major U.S. industries to derive lessons for federal policy, broad analogies are difficult to identify, and outcomes often are tied closely to the special circumstances of an industry, a specific technical problem, or the policy approach taken. "It is very hard to tease out from the historical record clear cut lessons that are applicable to future policy decisions," he concluded. Nevertheless, he noted that it is possible to make judgments about the kinds of policies that are feasible and effective in different contexts (Nelson, 1982, p. 454).

BOX 1.1 Drawing Conclusions from Case Studies

The selection of case studies can greatly influence the nature of the conclusions drawn from them. For example, examination of the development of voice telephony networks would suggest lessons different from those to be found in case studies of data networking. A case study of telephony might suggest that computing research would witness a decline in government funding and the rise of other types of government support, as well as industry support for research and development. It would demonstrate the role of the federal patent system in providing companies and individual inventors a means of protecting their innovations long enough to recover research and development funds, stimulating further expenditures. It would suggest the possibility of government support for—and industrial inclination toward—mergers, monopolies, and regulation.

Considering the negative attitude today toward government regulation of private enterprise, a return to regulated monopoly seems unlikely. Yet, contrary to conventional wisdom, private enterprise has in the past favored government regulation under certain circumstances. During the first quarter of the 20th century, for example, state governments supported the spread of telephone service by granting natural monopolies coupled with government regulation.1 Possession of a natural monopoly allowed AT&T to levy a supplemental charge on customers that provided funds for research at Bell Laboratories.

1 A classic example from the electric power industry involves the Commonwealth Edison Company of Chicago asking the State of Illinois to grant it a natural monopoly in a large region surrounding Chicago. In exchange, in 1914 Commonwealth Edison readily accepted state regulation of price and service. Previously, the City of Chicago had regulated Commonwealth Edison and limited the company's area of supply to the city limits. Other urban utilities followed a similar policy, resulting in a cascading effect: the natural monopoly allowed the utilities to avoid the competition that resulted from duplication of service.

This report uses a series of case studies, supplemented by a historical overview of federal involvement in computing, to derive lessons regarding innovation in computing technology. The cases in this report build on earlier work by the Computer Science and Telecommunications Board (CSTB, 1995a) that identified the role of federal research funding in stimulating innovation in several areas of computing and communications (Figure 1.1). The case studies are not intended to be definitive surveys of the various subjects; nor are they necessarily fully representative of the interactions of federal research funding with other elements of the nation's innovation system. Instead, they are narratives that illustrate the kinds of influences federal research funding has exerted on the innovation system and that highlight the interactions among government, universities, and industry. The cases represent a diversity of examples, differing in the time periods they cover, the technologies they address, and the type of government involvement. They range from limited discussions of particular projects and programs (such as relational databases and the Internet) to broader discussions of federal support for various fields of inquiry (such as artificial intelligence and virtual reality).

Figure 1.1 Illustrations of the role of government-sponsored computing research and development in the creation of innovative ideas and industries. RAID, redundant arrays of inexpensive disks; RISC, reduced instruction set computing. SOURCE: CSTB (1995a), Figure 1.2, p. 20.

Applying historical lessons to future policy making is a difficult exercise, one that historians are justifiably reluctant to undertake. A stock answer is that history does not repeat itself, but this response is misleading. From everyday observation, professional historians and others know that comparable, recurring events are embedded in long-term trends and enduring factors. The impressive ability of statisticians to predict the level of automobile accidents on national holidays, the accurate predictions of trends in economics and demography, and the long-term forecasts of particular cyclical effects of climate changes on agricultural production and energy consumption provide examples of the durability of trends and the persistence of circumstances. French historian Fernand Braudel, in his study The Mediterranean and the Mediterranean World in the Age of Philip II, writes persuasively of the influence of the physical environment on society and the resulting slow but perceptible rhythms of social behavior: "a history in which all change is slow, a history of constant repetition, ever-recurring cycles" (Braudel, 1972, pp. 21-22).

Historians use two processes to apply the past to the future: projection and analogy. Projection, in a sense, moves the past into the future in a continuous, linear way. It assumes that conditions prevalent in the past will continue to exist largely unchanged in the future and that yesterday's lessons apply equally well to tomorrow's problems. Analogies, on the other hand, presume a discontinuity between present and future. They assume that the future will not be like the recent past but may in fact resemble the more distant past, when circumstances differed.
Analogies raise interesting and unorthodox questions that can inform policy making and business strategy (see Box 1.2 for an illustration of analogy in scientific thought), but the art of drawing analogies requires a sensitive touch in choosing what is comparable. Analogies can be dramatically misleading if events and trends in the past are wrongly assumed to have arisen from conditions and contexts that will repeat themselves in the future. History is replete with examples of poorly applied analogies that resulted in poor decisions.2 Thus, great care must be taken in extrapolating from the past to the future, and it must be recognized that reasoning by analogy may prove erroneous in detail, even if it allows anticipation of events and outcomes.

BOX 1.2 Analogy in Technological Innovation

Analogies are often used in the process of technological innovation. Inventors use analogy to help them conceptualize new ideas. Edison conceived of the quadruplex telegraph, perhaps the most elegant and complex of his inventions, "almost entirely on the basis of an analogy with a water system including pumps, pipes, valves, and water wheels."1 Later, continuing to reason by analogy, he conceived of the interaction between existing illuminating-gas distribution systems and the illuminating, incandescent-light system he intended to invent. The analogy stimulated him to invent a system, not simply an incandescent lamp (Friedel et al., 1986, pp. 63-64).

Lee de Forest, inventor of the triode vacuum tube, was also inclined to analogy. Observing under a microscope the flow of minute particles between electrodes in his wireless receiver, he imagined, "Tiny ferryboats they were, each laden with its little electric charge, unloading their etheric cargo at the opposite electrode and retracing their journeyings or, caught by a cohesive force, building up little bridges, or trees with branches of quaint and beautiful patterns" (de Forest, 1950, p. 119). Spurred on by analogous thinking, he resolved to invent a flaming hot-gas (ionized), or incandescent-particle, receiver, a search that culminated in his invention of a gas-filled, three-element electronic tube (Hughes, 1990; Aitken, 1985).

The emerging history of computer networks also reveals instances of invention by analogy.2 J.C.R. Licklider, whose vision of the future of computing inspired the problem choices and research and development activities of many of his contemporaries, opened his seminal 1960 paper, "Man-Computer Symbiosis" (Licklider, 1960), with a metaphor:

The fig tree is pollinated only by the insect Blastophaga grossorum. The larva of the insect lives in the ovary of the fig tree, and there it gets its food. The tree and the insect are thus heavily interdependent: the tree cannot reproduce without the insect; the insect cannot eat without the tree; together, they constitute not only a viable but a productive and thriving partnership. This cooperative "living together in intimate association, or even close union, of two dissimilar organisms" is called symbiosis.3

Man-computer symbiosis, he adds, is a subclass of man-machine systems. Other human-machine systems use machines as extensions of humans. Still others deploy humans to extend machines—to perform functions, for instance, that cannot yet be automated. By contrast, man-computer symbiosis depends on an interactive partnership of man and machine.

MIT professor John McCarthy, an early contributor to computer time-sharing, suggested by analogy the potential of commercialized time-sharing and computing utilities. In a 1961 lecture he predicted:

If computers of the kind I have advocated [time-sharing] become the computers of the future then computing may someday be organized as a public utility just as the telephone system is a public utility. . . . The computer utility could become the basis of a new and important industry. (Fano, 1979, p. 43)

After a successful demonstration of the ARPANET in 1972, other computer engineers and scientists saw the analogy. They no longer considered ARPANET a research site for testing computer communications but a communications utility comparable to the telephone system. "It was remarkable how quickly all of the sites really began to want to view the network as a utility rather than as a research project," Alexander McKenzie, an ARPANET pioneer, pointed out.4

An analogy drawn between a conventional office and a future electronic one provided a metaphoric bridge for ingenious computer scientists and engineers at Xerox PARC (Palo Alto Research Center). In the 1970s they invented the "Electronic Office," which they embodied in the Alto computer system. Not long afterward the PARC group began to use visual analogies to introduce icons into the displays of personal workstations.

1 Edison, Theodore M. 1969. "Diversity Unlimited: The Creative Work of Thomas A. Edison," a condensation of a paper given before the MIT Club of Northern New Jersey, January 24.

2 The discussion of the use of metaphors by ARPANET/Internet pioneers is based on a chapter on the ARPANET in Hughes (1998).

3 Licklider, quoting the definition of "symbiosis" in Webster's New International Dictionary (Springfield, Mass.: Merriam Company, 1958), p. 2555.

4 Alexander McKenzie, as quoted in an interview conducted by Judy O'Neill, Charles Babbage Institute, University of Minnesota, Minneapolis, March 17, 1990.

Clearly, the future of computing will differ from the history of computing because both the technology and environmental factors have changed. Attempts by companies to align their research activities more closely with product development processes have influenced the role they may play in the innovation process. As the computing industry has grown and the technology has diffused more widely throughout society, government has come to represent a proportionally smaller share of the industry.

The lessons contained in this report attempt to discern crosscutting, pervasive themes and patterns regarding federal support for computer-related research. As such, the report attempts to identify fundamental, enduring trends and relationships that will survive change.
It is hoped that they will help historians better understand the development of a dynamic industry and provide technologists with a deeper appreciation of the heritage of their trade, as well as assist policymakers in making more informed judgments about federal support for computing.

The Computing Revolution

The United States is clearly a leader in the computing revolution. Computing technology has diffused throughout the U.S. economy with far-reaching effects. Over 36 percent of households in the United States owned a personal computer in 1995, a share far exceeding that of other major regions of the world (Table 1.1). Spurred by advances in computing power and data communications, government, industry, and home users moved onto the Internet in record numbers to exchange electronic mail, buy and sell goods and services, gather and disseminate information, and browse the World Wide Web. Recent surveys indicate that some 58 million adults in the United States and Canada are now online (Nielsen Media Research, 1997). Computers have become ubiquitous, with microprocessors running desktop and laptop computers, quietly controlling the operation of aircraft and automobile engines, and adding functionality to common household devices, such as telephones, thermostats, and coffeemakers.

TABLE 1.1 Worldwide Deployment of Computers in 1995

                                    World    United States   Europe   Japan   Rest of World
Population (millions)               5,700    264             477      125     4,834
Number of computers (millions)      257      96              54       18      89
Percentage of world's computers     100      37              21       7       35
Percentage of region's population
  with a computer                   4.5      36.5            11.3     14.5    1.8

SOURCE: Petska-Juliussen and Juliussen (1996).

Effects on the Economy

The effects of this revolution on the economy are pervasive. Although productivity gains from computing have remained difficult to measure quantitatively,3 the qualitative effects are manifest. Many industries, from banking to insurance to airline reservations, could not operate at current levels of activity without computing and communications systems. Computer-based devices, such as automated teller machines, have dramatically altered the ways banks operate and enabled banks to offer a range of new services to customers. Electronic commerce is changing the way customers and vendors buy and sell goods. As individuals and businesses become more familiar with the technology and industry churns out more innovative information-technology products, it is clear that the influence will be felt in ways that cannot yet be foreseen.

U.S. firms have led the computer revolution.
Companies such as International Business Machines (IBM), Intel, and Microsoft dominate global markets for computing devices and software. Others, such as Cisco Systems and Lucent Technologies, are leaders in the data communications field.

Computer-related manufacturing represents a significant fraction of the nation's economy. Sales of computers, telecommunications equipment (including data networking equipment), software, and semiconductors by U.S. firms topped $280 billion in 1996 (Table 1.2), a figure that has grown at an average rate of almost 10 percent a year since 1960. Taking into account the changing prices of information-processing equipment of equivalent performance, annual expenditures on information-processing equipment grew at an average pace of 9.7 percent per year in real terms from 1970 to 1994. The corresponding figure for investments in computers and peripheral equipment (monitors, disk drives, and so forth) increased at a rate of 27.5 percent per year (Sichel, 1997, Table 4.1). Employment in these manufacturing industries stood at over 1 million in 1996, representing 6 percent of the total U.S. manufacturing workforce.

Related service industries have also blossomed. Computer and data-processing firms generated close to $150 billion in revenues in 1996, with revenues from domestic telecommunications services climbing from $10 billion to $320 billion between 1960 and 1996.4 Employment in U.S. communications services and computer and data-processing-services companies topped 2.4 million in 1996 (U.S. Department of Labor, 1997, Table B-12). The U.S. Department of Labor predicts that demand for information technology workers in all sectors of the economy will grow by 95,000 jobs annually between 1994 and 2005, with systems analysts posting the largest gains and the service sector absorbing most of these workers.5

TABLE 1.2 Sales and Employment in the Information Technology Industry, 1996

Industry Sector                     Sales Revenues (billions of dollars)   Employment
IT Manufacturing
  Computing and office equipment    111                                    254,700
  Communications equipment          65                                     263,000
  Software(a)                       36                                     215,900
  Semiconductors                    68                                     257,000
IT Services
  Computing and data processing     144                                    1,037,300
  Communications services           322                                    1,404,000

(a) Includes prepackaged software only, standard industrial classification (SIC) code.

SOURCE: Sales revenues for multiple industries from Bureau of the Census (1997). Employment data from U.S. Department of Labor (1997), Table B-12.
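The aggregate figures quoted in the text can be cross-checked against the sector-level entries in Table 1.2. A minimal sketch in Python (the dictionary layout is ours; the numbers come from the table):

```python
# Figures from Table 1.2: (sales revenues in billions of dollars, employment), 1996.
manufacturing = {
    "Computing and office equipment": (111, 254_700),
    "Communications equipment": (65, 263_000),
    "Software": (36, 215_900),
    "Semiconductors": (68, 257_000),
}
services = {
    "Computing and data processing": (144, 1_037_300),
    "Communications services": (322, 1_404_000),
}

# Total IT manufacturing sales; the text cites "$280 billion in 1996."
mfg_sales = sum(sales for sales, _ in manufacturing.values())  # → 280

# Total IT services employment; the text says it "topped 2.4 million in 1996."
svc_jobs = sum(jobs for _, jobs in services.values())  # → 2,441,300

print(f"IT manufacturing sales, 1996: ${mfg_sales} billion")
print(f"IT services employment, 1996: {svc_jobs:,}")
```

The sums reproduce the totals cited in the text: the four manufacturing sectors account for exactly $280 billion in sales, and the two service sectors together employed just over 2.4 million workers.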
BOX 1.3 Early Industrial Efforts in Computing

IBM and Remington Rand were two early industrial pioneers in computing. Both were engaged in electromechanical punched-card machines at the close of World War II, with IBM holding 90 percent of the domestic market and Remington Rand having most of the rest. Between them, they also held most of the much smaller foreign market. IBM chose to build its electronic computer business internally, whereas Remington Rand purchased two small computer companies that had gotten their start primarily through government encouragement and funding.

The first of these small companies was Engineering Research Associates (ERA), which was established in January 1946 with the active support of military leaders and a promise of lucrative government contracts. Initially ERA's only business was to design and build top-secret, electronic, code-breaking equipment—a task that could no longer be accomplished adequately in the Navy once the war ended and technically trained people were free to seek better opportunities elsewhere. By 1947 ERA had begun to design general-purpose, electronic, stored-program computers, having concluded that they would be more cost-effective than the special-purpose equipment it had designed previously. The first of these computers, code-named Atlas, was delivered to the government in Washington, D.C., in December 1950.

The second of these small companies was the Eckert-Mauchly Computer Corporation (EMCC), which was founded in June 1946 by the chief designers of ENIAC. The business of EMCC was to design and manufacture computers (of the von Neumann rather than the ENIAC design) and to sell them in the commercial market to displace punched-card equipment in installations having very large data processing requirements. Short of money, despite a contract with the Census Bureau for its first large-scale computer, EMCC accepted an offer to be acquired by Remington Rand in 1950. Just over a year later, in March 1951, the company's first Univac was accepted by the Census Bureau. One year after that, Remington Rand acquired ERA, which needed additional funding to enter the commercial market with the computers it had previously sold only to the government. Thus, by 1952 the number two supplier of punched-card equipment had become the leading supplier of large-scale electronic computers.

The decision of IBM to build its electronic capability internally was based on the belief that it had a leadership position in applying electronic computing capability in commercial equipment. Using electronic circuits developed in its Endicott Laboratory as the country was entering World War II, IBM in 1946 introduced its 603 Electronic Multiplier, the first commercial product to incorporate electronic arithmetic circuits. Two years later, in the fall of 1948, shipments of the IBM 604 began. Containing over 1,400 vacuum tubes, its electronic circuits performed addition, subtraction, multiplication, and division and could execute up to 60 plugboard-controlled program steps between reading data from a card and punching out the result. Beginning in 1949, the IBM CPC (card-programmed electronic calculator) was shipped to customers. It combined the electronics of the 604 with other equipment to permit the user to enter both data and program commands on cards. Architecturally similar to the ENIAC, but much smaller, the IBM CPC was sometimes referred to as "a poor man's ENIAC."

Thus, IBM was first in the marketplace with electronic accounting and computing equipment. Over 5,000 IBM 604s and 700 CPCs were shipped to customers during the first half of the 1950s, when Remington Rand delivered only 14 UNIVACs. IBM had also begun work on large stored-program computers to compete with those of Remington Rand and of other companies drawn into the field by large government research, development, and procurement contracts.

This growing competition forced IBM to make a major policy change in 1950. Previously it had avoided government research and development contracts in electronics because it did not want to lose proprietary rights to its developments. Finally recognizing that its own technical and financial resources were insufficient to compete with the countrywide effort the government was orchestrating, it began to seek government research and development contracts in electronic computing. The first such project was the development of NORC (a supercomputer for the Navy), the design and construction of which was authorized early in 1951 and completed late in 1954. But without doubt, IBM's most important government contract put it in close collaboration with the Massachusetts Institute of Technology's Lincoln Laboratory (beginning in 1952) to design and manufacture computers for the Semi-Automatic Ground Environment (SAGE) air-defense system (see Chapter 4).

Thus began an era of vigorous competition that pitted IBM against Remington Rand, RCA, General Electric, NCR, Honeywell, Raytheon, Philco, and many others. These companies vied with each other to lead in computer-related technologies lest they fall behind in the marketplace. They sought government research contracts, collaborated with government laboratories and agencies, and worked with people in universities. Technical people published articles on their work in professional society journals and spoke at professional meetings, where they could also talk informally with people from other laboratories.
Although proprietary and classified information was carefully guarded by most participants, the information that could be exchanged was invaluable in moving forward the government's overall research and development effort. Government funding of computer research, development, and procurement had dramatically stimulated the rapid growth of the computer industry.

SOURCE: Summarized from Pugh (1995).

Government, universities, and industry all play a role in the innovation process. Research is a vital part of innovation in computing. In dollar terms, research is just a small part of the innovation process, representing less than one-fifth of the cost of developing and introducing new products in the United States; preparation of product specifications, prototype development, tooling and equipment, manufacturing start-up, and marketing start-up comprise the remainder (Mansfield, 1988, p. 1770).9 Indeed, computer manufacturers allocated an average of just 20 percent of their research and development budgets to research between 1976 and 1995, with the balance supporting product development.10 Even in the largest computer manufacturers, such as IBM, research costs are only about 1 to 2 percent of total operating expenses.11 Nevertheless, research plays a critical role in the innovation process, providing a base of scientific and technological knowledge that can be used to develop new products, processes, and services. This knowledge is used at many points in the innovation process—generating ideas for new products, processes, or services; solving particular problems in product development or manufacturing; or improving existing products, for example.

Both industry and government fund research activities, with the research itself generally conducted by workers in industry or university laboratories. The computer industry has supported several large and highly productive research facilities, such as IBM's T.J. Watson Research Center, American Telephone and Telegraph's (AT&T) Bell Laboratories, and the Xerox Palo Alto Research Center (PARC). In 1996, computer manufacturers invested about $1.7 billion in research (out of $8.1 billion in total R&D), most of which supported research in their own facilities.12 Federal research expenditures in computer science totaled roughly $960 million in 1995, approximately $350 million of which supported university research, with the remainder supporting work in industry and government laboratories (see Chapter 3 for a more complete discussion of federal investments in computer-related research).

Traditionally, research expenditures have been characterized as either basic or applied. The term "basic research" describes work that is exploratory in nature, addressing fundamental scientific questions for which ready answers are lacking; the term "applied research" describes activities aimed at exploring phenomena necessary for determining the means by which a recognized need may be met.
These terms, at best, distinguish between the motivations of researchers and the manner in which inquiries are conducted, and they are limited in their ability to describe the nature of scientific and technological research. Recent work has suggested that the definition of basic research be expanded to include explicitly both basic scientific research and basic technological research (Branscomb et al., 1997). This definition recognizes the value of exploratory research into basic technological phenomena that can be used in a variety of products. Examples include research on the blue laser, exploration of biosensors, and much of the fundamental work in computer engineering.13

Federal Policy Toward Research Funding

Federal funding for research in computing technologies has been based on the rationale first enunciated by Vannevar Bush in his 1945 report to President Truman, Science, the Endless Frontier (Bush, 1945a). Drawing on the nation's experience in World War II, Bush argued that government funding of research was necessary to meet the nation's needs in defense, health, and the economy in general. Industry, he argued, had little incentive to support such work but would pursue more applied research geared toward developing new products, processes, and services. This policy set in place new government activities that over the last 50 years have brought new agencies into existence, such as the National Science Foundation, and made the U.S. research system the envy of the world.

Cold War policies of the United States, aimed at military and political containment of the Soviet Union and other communist adversaries, provided additional impetus for computing research. Defense agencies, such as the Office of Naval Research, the Army Research Office, the Air Force Office of Scientific Research, and the Defense Advanced Research Projects Agency, invested in computing research with long-term effects on military capabilities (and, indirectly, civilian capabilities). They and other federal agencies, such as the National Security Agency (NSA), the Department of Energy (DOE), the National Aeronautics and Space Administration (NASA), and the National Institutes of Health (NIH), have funded research in computing related to their own missions: maintaining national security, developing new energy sources and nuclear weapons, exploring space, and improving human health. Although these agencies have funded projects linked to their own needs, they have also, to varying degrees, created technical knowledge or specific products that have been adopted by the commercial marketplace (Alic et al., 1992).

Many mechanisms have been used to support federal contributions to computing research. Until the mid-1980s, most federal support took the form of research grants or contracts. This included federal contracts for product development or procurement that, in turn, demanded significant research.
In each of these arrangements, the government acts as the customer for research services, specifying a period of performance and program objectives. After 1985, a growing number of programs were established that involved partnerships among government, universities, and industry. Such programs tended to pool public and private monies to support research in a variety of organizations in industry, universities, and government. In computing, such programs have included (1) SEMATECH, a consortium of semiconductor manufacturers who, with their own and federal funding, support research and development of semiconductor manufacturing equipment (see Chapter 4);14 (2) Semiconductor Research Corporation, which pools industry and some federal funding to support university research in semiconductor technology; (3) Engineering Research Centers that require collaborative work between universities and industry on engineering problems of interest to industry (with some federal funding); (4) cooperative research and development agreements (CRADAs)
between government laboratories and industry; and (5) extramural cooperative research programs sponsored by the National Institute of Standards and Technology (NIST), such as the Advanced Technology Program.15 All of these mechanisms are considered in this report.

Other Mechanisms for Federal Support of Innovation

Federal research support has been an important element of the nation's innovation process, but other government activities have also had a significant impact on innovation in computing. Federal procurement and standardization efforts, for instance, have been highly influential. In a number of areas, ranging from semiconductors to supercomputers, the government's specialized needs for computing technologies created a market for high-performance devices and systems and underwrote the deployment of prototypes and core elements of new computing technologies. Federal procurement of integrated circuits (ICs) for the Apollo spacecraft and the Department of Defense (DOD) Minuteman intercontinental ballistic missile program, for example, was a major impetus for early investments in IC manufacturing capability. The needs of DOE and its predecessors for high-performance computers for nuclear weapon development and testing drove early markets for supercomputers. In software, the federal government helped drive the marketplace toward the American National Standards Institute's version of COBOL by establishing it as a federal data processing standard. It also supported efforts to set a standard for message-passing interfaces in parallel computing and supported the High Performance FORTRAN forum to extend the FORTRAN programming language to parallel computers (OTA, 1995).

Antitrust actions have also had a significant impact.
For example, the antitrust suit brought against IBM in 1952 and settled in 1956 required the company (among other things) to sell as well as rent its equipment, to help others get into the business of servicing IBM equipment, and to license at reasonable rates all of its current and future patents on information-processing equipment, including electronic computers. The settlement of the IBM suit and a similar settlement reached with AT&T one day earlier (together with a suit then pending against RCA) were described by the chief of the Justice Department's antitrust division as "part of one program to open up the electronics field." The manner in which these suits were settled facilitated the entry of other companies into the computer industry (Pugh, 1995, pp. 254-255). Similarly, the Modified Final Judgment of Judge Greene created competition in the long-lines industry, which, together with Computer Inquiries I, II, and III of the Federal Communications Commission, ensured the lowest prices for lease and resale of
long-lines carriage in the world. Such actions were arguably as important as research in advancing the telecommunications industry and the Internet.

Issues Related to Federal Support of Research

Despite the wide range of influences on innovation in computing, federal research funding deserves particular attention, both because of the leverage it exerts over the entire innovation process and because of the policy issues currently under debate. Throughout the 1990s, changes in the policy environment and in the industry itself have raised new questions about the role of the federal government in funding computing research. The end of the Cold War and increasing calls for fiscal stringency in government spending have renewed debates over federal funding of research in the United States, as well as in other industrially advanced nations. To some extent, these debates are not new: the second half of the 20th century has seen numerous reviews of federal policies, programs, and institutions affecting research and education in science and engineering. The debates of the 1990s differ in that they represent the first time fundamental questions have been raised about the infrastructural commitments and organizational principles that have guided federal support for research. Few challenge the appropriateness of government developing or sponsoring new technologies for its specialized needs, especially in national defense, but the arguments for government support of commercially relevant technology are less clear and their effects more controversial. Although many believe that supporting fundamental, knowledge-expanding research, whose benefits are made openly available through publication, is an appropriate role for government, even this support is not without question. These questions do not arise solely from budgetary considerations.
Even as federal budget deficits have given way to promises of surpluses in the late 1990s, and proposals have been made for increasing federal research spending,16 Congress initiated a study to determine the proper role of government in supporting science and engineering.17 Such studies attempt to determine how federal monies can be most productively spent and, more generally, what role the federal government should play in supporting research and innovative activities.

Computing research poses an especially difficult challenge in this regard. First, advocates of computing research must counter the claim that computing technology has matured and that the industry is less dependent on fundamental research than it was in the past. Why should the government continue to support computing research that will yield only incremental improvements in the technology? Answering this question requires an appreciation of the evolution of computing technology over
the past five decades and an understanding of the role research has played in prompting, and responding to, new advances and developments. It requires an analysis of the ways in which pathbreaking innovations have dramatically altered the landscape of computing over time so that policymakers can appreciate the evolution of the industry as a whole.

Second, advocates of computing research must demonstrate why the federal government should continue to support research when a healthy industry exists that could develop its own technology. Why would companies in such a highly competitive and profitable industry not fund computing research and develop new technologies on their own? Clearly, the computer industry does fund research and does develop new technologies on its own. Answering the question more fully requires a better understanding of the interplay among industry, government, and universities in creating and applying new information technologies. Federal policymakers must determine what role government plays in supporting such work and how federal efforts supplement, rather than duplicate or displace, those of industry. Similarly, policymakers must understand how federal needs differ from those of the commercial marketplace and how federal needs can drive industrial innovation.

Furthermore, policymakers and federal research managers are under increasing pressure to enhance the effectiveness of government research programs. The desire to streamline federal government operations has led to renewed efforts to improve federal programs and their management, as manifested by passage of the Government Performance and Results Act of 1993. This act requires federal agencies to account for program results by integrating strategic planning, budgeting, and performance measurement.18 For agencies that support scientific and technical research, the act implies a need to develop methods for measuring the results of federal research investments.
Doing so requires an understanding of the many different ways research influences the innovation process, the time delays involved, and the uncertainties inherent in innovation. Such a task would benefit from an examination of past federal research programs to identify examples of success and to provide guidance on the kinds of metrics, if any, that could be applied to federally funded research. These are the kinds of issues this report hopes to inform. The lessons contained in Chapter 5 attempt to answer questions about the role of federally funded research in the innovation process, the cycle of innovation, and the results of federal investments. They discuss the effects federally funded research has had on industry and society as a whole and identify characteristics of effective federal research programs. With this kind of historical background, policymakers can be better informed to face the challenges ahead.
Organization of This Report

The remainder of this report examines the history of computing and communications to derive lessons for public policy. Chapter 2 provides the economic rationale for federal support of fundamental research. It identifies the economic properties of research and discusses market failures in the support of research that justify a government role. Chapter 3 presents an overview of the federal role in creating the research infrastructure that supports the U.S. computing and communications industries. It reviews federal investments in research, education, and research equipment over the past several decades. Chapter 4 reviews the changing organizational context of computing research in the United States, with an emphasis on federal funding agencies. It describes the changing political, technical, and organizational context in which innovation has occurred and contains mini-case studies of particularly important innovations, such as time-shared computing and very large scale integrated circuits, identifying the federal role in each. Chapter 5 contains a summary of the lessons learned from this study. It identifies general lessons about the role of federal funding in the innovation process and about the structure of successful research programs. It is hoped that such lessons will be useful to policymakers, researchers, and research managers.

Part II, Chapters 6 through 10, contains the case studies that form the backbone of this report. The cases represent a sampling of important technologies that have had an enduring influence on the computing and communications industry and on society: relational databases, the Internet, theoretical computer science, artificial intelligence, and virtual reality. Although by no means comprehensive, they cover a wide range of technologies, degrees of success, and interactions among government, universities, and industry.

Notes

1.
This report uses the term "computing research" in a broad sense, to include work in semiconductors, software, and data communications, in addition to computer science and engineering. It does not include all research in telecommunications (such as voice communications), which has a very different history characterized by regulated monopolies for telephone services.

2. Many historians offer as a classic case of a dangerously misleading analogy the assumption that conditions in Southeast Asia in the 1950s were comparable to those in Europe in 1939. This analogy led some policymakers in the United States to assume that, by immediately and directly confronting North Vietnam, the country could avoid a compromise like that offered Hitler at Munich and avert another large-scale war. For a discussion of poor presidential decisions resulting from the misapplication of analogies, see May (1972).
3. The so-called productivity paradox was first noted by economist Robert Solow (hence it is often referred to as the Solow Paradox). Explanations have ranged from measurement problems to lag times to the difficulties inherent in integrating computing into the workplace. Nevertheless, recent research suggests a correlation between higher levels of information technology capital and increased productivity in large companies, especially in companies that use information technology to enhance customer service. See Brynjolfsson and Hitt (1996). For a discussion of the difficulties in measuring productivity gains associated with information technology, see CSTB (1995b).

4. Revenues cited for the telecommunications services industry include both voice and data communications over wireline and wireless media. Data are from Bureau of the Census (1997).

5. Employment of systems analysts and computer scientists and engineers is projected to increase 158 and 142 percent, respectively, in the service industries between 1994 and 2005, versus 26 and 37 percent, respectively, in manufacturing industries. The number of computer programmers in service industries is expected to grow 37 percent, versus a 26 percent decline in manufacturing. See U.S. Department of Commerce (1997).

6. Moore's law is named after Gordon Moore, who first noted the relationship and predicted its continuation in 1965. It is the result of two underlying processes: continuous reductions in the size of individual circuits etched onto computer chips through advances in lithography and other manufacturing processes, and increases in the overall size of the integrated circuit (or chip) resulting from improvements in the processing of silicon wafers and reductions in contaminants. See Bashe et al. (1986), pp. 56-58.

7.
Between 1960 and 1995, the average unit price of computers sold in the United States declined from $330,000 to $3,700, helping to propel growth in annual sales from 1,790 units to over 21 million units. See ITI (1997). The price-performance ratio of the typical computer also declined by roughly a factor of 100 during that period. The U.S. Department of Commerce's (Bureau of Economic Analysis) hedonic price index for computer equipment for 1970-1994 implies that the price-performance ratio in 1994 was 1.9 percent of its 1970 level, an average annual rate of decrease of 15.3 percent. Assuming a slower rate of decline of approximately 7.0 percent per year in the 1960s, the price-performance ratio in 1994/1995 would have stood at approximately 1/100 of its 1960 level. See Sichel (1997), Table 4-1.

8. For a more complete overview of the innovation process, see OTA (1995).

9. This figure has remained remarkably constant over the past several decades. A 1967 report from the Department of Commerce that relied on data from the previous 10 years found that product conception and design accounted for 15 to 30 percent of the cost of new product introduction; manufacturing preparation, manufacturing start-up, and marketing start-up made up the balance. See U.S. Department of Commerce (1967), p. 8.

10. This estimate was calculated from data contained in the biennial report, National Science Foundation, Research and Development in Industry. Other data from the National Science Foundation show that 40 percent of all research and development expenditures in the United States supported research in 1995; the
remaining 60 percent supported development. See National Science Board (1996), pp. 4-5.

11. This figure assumes that about 20 percent of IBM's total R&D expenditures support research. R&D was about 7 percent of IBM's operating expenses (the sum of the cost of goods sold, R&D, and general, administrative, and sales costs) in 1997. See IBM (1997).

12. This figure includes research expenditures for firms in the office and computing-equipment industry only. It does not include expenditures by firms in data communications, prepackaged software, or semiconductors. See NSF (1998a), Table A-24.

13. The National Science Foundation, which is the source of most of the research funding data in this chapter, defines research as "systematic study directed toward fuller knowledge or understanding of the subject studied." It defines development as "systematic use of the knowledge gained from research, directed toward the production of useful materials, devices, systems, or methods, including design and development of prototypes and processes. It excludes quality control, routine product testing, and production." See NSF (1997a).

14. In 1997, after 10 years of roughly even funding from industry and government, SEMATECH became fully self-supporting, using only industry funding for its programs.

15. NIST's Advanced Technology Program (ATP), for example, provides cost-shared funding to consortia of industry and university participants conducting precompetitive applied research. Funding for the program peaked at $341 million in 1995 and stood at $192.5 million in 1998. Funding history is available online at <http://www.atp.nist.gov/atp/budget.gif>.

16. Two bills were introduced in the Senate in 1997 and 1998, calling for a doubling of federal funding for basic scientific and precompetitive engineering research. In October 1997, Senators Gramm, Lieberman, and 18 other co-sponsors introduced the National Research Investment Act of 1998.
The plan called for the doubling of funds over a 10-year period. The bill was referred to the Senate Committee on Labor and Human Resources. In June 1998, Senator Frist, along with 26 co-sponsors, submitted similar legislation entitled the Federal Research Investment Act. In addition to doubling federal funding for research to 2.6 percent of the federal budget, the bill called for new evaluation processes to provide better oversight of funding programs. It also called for the President to provide a strategic plan for proposed R&D funds, as well as an analysis of current funds, as part of the annual budget. The bill was referred to the Committee on Commerce. A companion bill was introduced in the House in August 1998.

17. Early in 1997, Vernon Ehlers, vice chairman of the House Science Committee, initiated the National Science Policy Study, which was intended to provide a new rationale for federal funding of science. The study examined issues in mathematical and scientific education, funding for R&D, and cooperation among government, industry, and the international community. The chair of the Science Committee, James Sensenbrenner, hoped that the study would justify the proposed funding increases for research that were introduced in the Senate in 1997 and 1998. The final report was released on September 24, 1998 (Committee on Science, 1998).
18. Each agency was required to submit by September 1997 a strategic plan outlining the agency's mission statement, goals and objectives, and the strategies it would use to achieve them. The first annual performance plans were due when the President submitted the 1998 budget to Congress and were to include the measures each agency would use to gauge performance toward meeting those goals and the resources to be used in doing so.
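The compound-rate arithmetic in note 7 can be sanity-checked with a short script. This is an illustrative sketch, not part of the report; it assumes constant annual rates of decline (15.3 percent per year for 1970-1994, and the note's estimated 7.0 percent per year for the 1960s):

```python
# Check the price-performance arithmetic in note 7.
# Assumption: constant annual rates of decline, as stated in the note.

def compound(rate, years):
    """Fraction of the starting level remaining after a constant annual decline."""
    return (1 - rate) ** years

# 1970-1994 (24 years at 15.3%/yr): should be close to the note's 1.9 percent.
frac_1994 = compound(0.153, 24)
print(f"1994 level relative to 1970: {frac_1994:.1%}")  # roughly 1.9%

# 1960-1994/1995: prepend the estimated 7.0%/yr decline for 1960-1970;
# the result should be near the note's "approximately 1/100" of the 1960 level.
frac_1995 = compound(0.070, 10) * frac_1994
print(f"1994/1995 level relative to 1960: {frac_1995:.4f}")  # roughly 0.01
```

The two printed fractions confirm that a 15.3 percent annual decline over 24 years yields about 1.9 percent of the starting level, and that adding a decade of 7.0 percent annual decline brings the 1994/1995 level to roughly 1/100 of the 1960 level.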