analysis to the present. Disaggregated estimates of R&D funding over time are then constructed and presented. The proposition that much R&D activity has migrated out of computer hardware production and into semiconductor production is measured against the available data. A final section presents a net assessment of recent trends, summarizes recent proposals that address these trends, and offers some concluding observations on the problems these proposals are likely to encounter.

The United States government has been an important patron of information technology research throughout that technology's history, beginning with the development of the first electronic digital computers during the Second World War and continuing through the present.3 Much has changed since the birth of this technology in the 1940s, however, and in the last several years prominent voices have articulated new concerns over the changing relationships among government, industry, and information technology research. This paper surveys available empirical data on computer research activity with an eye to assessing the legitimacy of these concerns. Before examining the present, we briefly survey the past.

THE BIG PICTURE4

In the first decade after the war, most significant computer research and development projects in the United States depended, directly or indirectly, on funding from the United States government, mainly from the U.S. military. The second decade of the computer, from the mid-1950s through the mid-1960s, saw rapid growth in the commercial application and use of computers. By the mid-1960s, business applications accounted for by far the largest share of computer use. The government, however, continued to dominate the market for high-performance computers, and government-funded technology projects drove much of the continuing advance at the frontiers of information technology over this period.

From the mid-1960s through the mid-1970s, commercial applications of vastly cheaper computing power continued to grow explosively. By the mid-1970s, the commercial market had become the dominant force driving the technological development of the U.S. computer industry. The government's role increasingly focused on the highest-performance, most advanced computers, on funding basic research, and on bankrolling long-term, leading-edge technology projects.

From the mid-1970s to the beginning of the 1990s, the commercial market continued to balloon in size relative to government markets for computers, leaving

3. For more details, see Flamm, Targeting the Computer, op. cit.; Flamm, Creating the Computer, op. cit.; and National Research Council, Funding a Revolution: Government Support for Computer Research, Washington, D.C.: National Academy Press, 1999.

4. Unless otherwise noted, our summary history is based on the sources cited in footnote 2.


