The IT industry appears to be growing faster than the research base that has supported it, raising questions about the future in an arena where ideas and talent, or intellectual capital, are the most critical assets. Most obvious is the influx of smaller firms that leverage fundamental research undertaken in universities (and elsewhere). Also obvious is the relative decline of several large U.S. industrial laboratories as sources of IT research. Players present in the early 1990s, such as Digital Equipment Corporation, Control Data Corporation, Cray Research, and even Apple Computer, no longer contribute to the research base. Enduring players, such as IBM Corporation and AT&T, have reorganized and now focus their research more narrowly than they did in the 1970s and 1980s. Newer companies, such as Microsoft Corporation, and foreign corporations, such as NEC and Mitsubishi, launched U.S. research laboratories in the late 1980s and 1990s; the impact of these new efforts remains to be seen, although they have already attracted leading researchers from academia.5 Industrial research relevant to IT has grown, both in absolute terms and as a share of all such research, but that growth is dwarfed by the growth of the IT industry itself. In this environment, any research investment can have great leverage, influencing developments across a broad spectrum of industry and society.
Why does change in the IT industry matter? One might expect, after all, that as an industry matures, R&D spending would decline as a proportion of revenues; such a decline could also come from expanding sales of a stable product, one for which the R&D has been more or less completed. An industry whose history is measured in decades cannot, however, be called mature. Indeed, the evidence of the 1990s points to rejuvenation rather than senescence: the rapid growth of the Internet and its associated business activity, for example, is a new phenomenon, and that activity will shape yet other phenomena through cumulative experimentation with the network-based interactions of people and systems. Information technology is neither mature nor stable, and the structure and competitive conduct of the industry continue to change.
Measuring the IT research effort is difficult; the most common yardstick is funding. Several factors contribute to concerns about the sufficiency of IT research funding. First, attempts to reduce federal budget deficits and trim defense spending (which historically supported a significant portion of computing and electrical engineering research) constrained federal funding for university research in the early 1990s, affecting career decisions and research output in ways whose impact is only now being felt. Efforts to enhance accountability in federal government operations and spending also led to increasing support for near-