A number of important lessons about the nature of research in information technology—including the unpredictability of and synergy among research results; the roles of government, industry, and academia; and the economic and social returns from research—can be gleaned from Figure 1 and can also be distilled from past CSTB reports (for a summary, see Box 2).
THE ESSENTIAL ROLE OF THE FEDERAL GOVERNMENT
Innovation in IT is made possible by a complex ecosystem encompassing university and industrial research enterprises, emerging start-up and more mature technology companies, those that finance innovative firms, and the regulatory environment and legal frameworks in which innovation takes place.12 It was within this ecosystem that the enabling technologies for each of the IT industries illustrated in Figure 1 were created. The government role has coevolved with the development of IT industries: its programs and investments have focused on capabilities not ready for commercialization and on the new needs that emerged as commercial capabilities grew, both of which are moving targets.13 A 2009 CSTB report, which examined the health of this ecosystem and noted the challenges posed to the U.S. position in IT leadership, underscored the critical importance of this federal investment (Box 3).14
Most often, the federal investment that contributed to the development of the industries shown at the top of Figure 1 took the form of grants or contracts awarded to university and industry researchers by the Defense Advanced Research Projects Agency (DARPA) and other defense research agencies15 and/or the National Science Foundation (NSF), with the latter having come to play an increasingly important role in supporting academic IT research. A shifting mix of other funding agencies has also been involved, reflecting changes in the missions of these agencies and their needs for IT.16 For example, the Department of Energy (DOE), the National Aeronautics and Space Administration (NASA), and the military services have supported high-performance computing, networking, human-computer interaction, software engineering, embedded and real-time systems, and other kinds of research; the National Institutes of Health invests in research in
Lessons About the Nature of Research in Information Technology—A Summary
• The results of research
—America’s international leadership in IT—leadership that is vital to the nation—springs from a deep tradition of research. …
—The unanticipated results of research are often as important as the anticipated results—for example, electronic mail and instant messaging were by-products of research in the 1960s that was aimed at making it possible to share expensive computing resources among multiple simultaneous interactive users. …
—The interaction of research ideas multiplies their impact—for example, concurrent research programs targeted at integrated circuit design, computer graphics, networking, and workstation-based computing strongly reinforced and amplified one another. …
• Research as a partnership
—The success of the IT research enterprise reflects a complex partnership among government, industry, and universities. …
—The federal government has had and will continue to have an essential role in sponsoring fundamental research in IT—largely university-based—because it does what industry does not and cannot do. … Industrial and governmental investments in research reflect different motivations, resulting in differences in style, focus, and time horizon. …
—Companies have little incentive to invest significantly in activities whose benefits will spread quickly to their rivals. … Fundamental research often falls into this category. By contrast, the vast majority of corporate research and development (R&D) addresses product and process development. …
—Government funding for research has leveraged the effective decision making of visionary program managers and program office directors from the research community, empowering them to take risks in designing programs and selecting grantees. … Government sponsorship of research especially in universities also helps to develop the IT talent used by industry, universities, and other parts of the economy. …
• The economic payoff of research
—Past returns on federal investments in IT research have been extraordinary for both U.S. society and the U.S. economy. … The transformative effects of IT grow as innovations build on one another and as user know-how compounds. Priming that pump for tomorrow is today’s challenge.
—When companies create products using the ideas and workforce that result from federally sponsored research, they repay the nation in jobs, tax revenues, productivity increases, and world leadership. …
SOURCE: Reprinted from NRC/CSTB, 2009, Assessing the Impacts of Changes in the Information Technology R&D Ecosystem, The National Academies Press, Washington, D.C., p. 33, summarizing NRC/CSTB, 2003, Innovation in Information Technology, The National Academies Press, Washington, D.C., pp. 2-4.
biomedical computing; and the Intelligence Advanced Research Projects Activity invests in such areas as data analysis and speech translation.17 Today, a wide array of agencies participate in the federal Networking and Information Technology Research and Development (NITRD) program,18 reflecting their interest in supporting advances in various aspects of computing and communications to fulfill their missions.
Why has federal support been so effective in stimulating innovation in computing? As is discussed below, many factors have been important.
1. Federally funded programs have supported long-term research into fundamental aspects of computing, whose widespread practical benefits typically take years to realize.19
Assessing the U.S. IT R&D Ecosystem
The U.S. information technology (IT) research and development (R&D) ecosystem was the envy of the world in 1995—from the perspective of IT, the United States enjoyed a strong industrial base, an ability to create and leverage ever newer technological advances, and an extraordinary system for creating world-class technology companies. But the period from 1995 to the present has been a turbulent one for the U.S. IT R&D ecosystem. Today, this ecosystem—encompassing university and industrial research enterprises, emerging start-up and more mature technology companies, those that finance innovative firms, and the regulatory environment and legal frameworks—remains unquestionably the strongest such ecosystem in the world.
However, this position of leadership is not a birthright, and it has come under pressure. The IT industry has become more globalized, especially with the dramatic rise of the economies of India and China, fueled in no small part by their development of vibrant IT industries. Moreover, those nations represent fast-growing markets for IT products, and both are likely to build their IT industries into economic powerhouses for the world, reflecting deliberate government policies and the existence of strong, vibrant private-sector firms, both domestic and foreign. Ireland, Israel, Korea, Taiwan, Japan, and some Scandinavian countries have also developed strong niches within the increasingly globalized IT industry.
As a result, the United States risks ceding IT leadership to other nations within a generation unless it recommits to providing the resources needed to fuel U.S. IT innovation, to removing important roadblocks that reduce the ecosystem’s effectiveness in generating innovation and the fruits of innovation, and to remaining a lead innovator and user of IT.
SOURCE: Adapted from NRC/CSTB, 2009, Assessing the Impacts of Changes in the Information Technology R&D Ecosystem, The National Academies Press, Washington, D.C.
One of the most important messages of Figure 1 is the long, unpredictable incubation period—requiring steady work and funding—between initial exploration and commercial deployment.20 The time from first concept to successful market is often measured in decades—a contrast to the more incremental innovations that tend to be publicized as evidence of the rapid pace of IT innovation. Launching a new research effort requires considerable time and is often risky, but the payoffs can be enormous. It is often not clear which aspect of an early-stage research project will ultimately be the most important. Fundamental research produces a range of ideas, and those bringing technologies to market draw on innovative ideas as needs emerge. Indeed, the utility of ideas may become evident only well after they have been generated. For example, early work in coding theory ultimately made possible the modern cell phone and streaming video over the Internet, and today’s cloud computing owes much to decades of research in distributed computing.
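The role of coding theory mentioned above can be made concrete with a small sketch. The following Python fragment (illustrative only; the function names and bit layout are this sketch's choices, not drawn from the report) implements the classic Hamming(7,4) code, one of the earliest error-correcting codes; far more sophisticated descendants of such codes now protect cell-phone links and streaming video:

```python
# Minimal Hamming(7,4) sketch: 4 data bits are protected by 3 parity bits,
# so any single-bit transmission error can be located and corrected.
# Illustrative only; modern systems use far more powerful codes.

def encode(d):
    """d = [d1, d2, d3, d4], each 0 or 1; returns a 7-bit codeword."""
    p1 = d[0] ^ d[1] ^ d[3]   # parity over codeword positions 1, 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]   # parity over positions 2, 3, 6, 7
    p3 = d[1] ^ d[2] ^ d[3]   # parity over positions 4, 5, 6, 7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def decode(c):
    """c = 7 received bits; corrects up to one flipped bit, returns data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the error; 0 if none
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1          # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
received = encode(word)
received[3] ^= 1                      # simulate a single-bit channel error
assert decode(received) == word       # the original data is recovered
```

The syndrome computation is the key idea: each parity check covers an overlapping subset of positions, and the pattern of failed checks spells out, in binary, exactly which bit was corrupted.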
Because of unanticipated results and synergies, the exact course of fundamental research cannot be planned in advance, and its progress cannot be measured precisely in the short term. Even projects that appear to have failed or whose results do not seem to have immediate utility often make significant contributions to later technology development or achieve other objectives not originally envisioned. The field of number theory provides a striking example. For hundreds of years a branch of pure mathematics without applications, it became a foundation for the public-key cryptography that underlies the security of electronic commerce.21
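The leap from pure number theory to e-commerce security can likewise be sketched with a toy RSA key pair in Python. The tiny primes below are purely illustrative (real keys use primes hundreds of digits long), and the names are this sketch's own, not the report's; the point is that nothing beyond centuries-old modular arithmetic is required:

```python
# Toy RSA sketch built from elementary number theory (modular exponentiation
# and modular inverses). Illustrative only: these primes are far too small
# to offer any security.

p, q = 61, 53                 # two toy-sized primes
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, chosen coprime to phi
d = pow(e, -1, phi)           # private exponent: modular inverse of e (Python 3.8+)

def encrypt(m):
    """Encrypt an integer message m < n with the public key (n, e)."""
    return pow(m, e, n)

def decrypt(c):
    """Recover the message with the private exponent d."""
    return pow(c, d, n)

message = 1234
assert decrypt(encrypt(message)) == message
```

Anyone may encrypt with the public pair (n, e), but recovering d from them requires factoring n, which is computationally infeasible at realistic key sizes; that asymmetry is what the number theorists' work made available to electronic commerce.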
2. The interplay of government-funded academic research and industry R&D has been an important factor in IT commercialization.22
The examples in Figure 1 show the interplay between government-funded academic research and industry research and development. In some cases, such as the reduced-instruction-set computing (RISC) processors that are widely used today in mobile phones and other low-power applications, the initial ideas came from industry, but the research that was essential to advancing those ideas came from government funding to universities. RISC was conceived at IBM (International Business Machines), but it was not commercialized until DARPA funded additional research at the University of California, Berkeley, and at Stanford University as part of its Very Large Scale Integration (VLSI) program in the late 1970s and early 1980s.23 RISC has since become commercially significant in a wide range of successful products, from supercomputers to mobile phones. Of the 8.3 billion microprocessors produced in 2010, 6.1 billion implemented the Advanced RISC Machine (ARM) architecture.24 The VLSI program also supported university research that gave rise to such companies as Cadence Design Systems, Synopsys, and Mentor Graphics, which acquired dozens of smaller companies that started as spin-offs of DARPA-funded25 university research and today are part of a multibillion-dollar electronic design automation industry that is an essential enabler of other IT industries.
Similarly, although IBM pioneered the concept of relational databases (the System R project), it was NSF-sponsored research at the University of California, Berkeley, that brought this technology to a point at which it was commercialized by several start-up companies and then by more established database companies (including IBM).26 Indeed, in none of the examples of products and industries shown in Figure 1 did industry alone provide the research necessary for success.
Moreover, these research-enabled commercial developments have expanded the possibilities for research itself, because commercialization has led to substantial decreases in cost. Lower costs have allowed much wider penetration of technology and have in turn greatly lowered the barriers to innovation, opening the door to a much wider range of both research and researchers, and to operation at a much larger scale, than was possible even 15 years ago.
3. There is a complex interweaving of fundamental research and focused development.27
The purpose of publicly funded research is to advance knowledge and to solve hard problems. The exploitation of that knowledge and those solutions in products is fundamentally important, but the form it takes is often unpredictable, as is the impact on future research (see the discussion of the technological underpinnings of e-commerce in Box 4). In the case of integrated circuit (VLSI) design tools, research innovation (at places like Stanford University, the University of California, Berkeley, and the University of North Carolina) led to products and then to major industrial markets. In the case of relational databases, research at the University of California, Berkeley, built on earlier work at IBM and led to the first commercialization of the technology. Later, the introduction of products stimulated new fundamental research questions, leading to a new generation of products with capabilities vastly greater than those of their predecessors. Another example is the theoretical research at the Massachusetts Institute of Technology that yielded the algorithms behind the technology for Web-content distribution networks, which provide the foundation for successful companies such as Akamai Technologies.
The Research Underpinnings of Electronic Commerce—An Example
The most visible technology supporting e-commerce is the World Wide Web, built on the Internet, which during the 1990s grew rapidly from a research network to critical societal infrastructure. Behind the Web interface lie a number of information technologies that have been developed incrementally over years or even decades and that have their roots in computing research. Important examples of such technologies include:
• Distributed-computing technologies that support scaling up to very large numbers of users;
• Approaches to facilitating data interchange, including mediator and wrapper techniques (which allow legacy systems to be integrated into newer systems) and the extensible markup language (XML) standard for describing data;
• Safe mobile code capabilities, which enable code to be downloaded and run on end-user computing platforms;
• Database/transaction capabilities, most notably the development of reliable, large-scale relational databases (and more recent object extensions); capabilities for ensuring integrity and consistency of databases; and the emergence of a standard language, structured query language (SQL), for querying databases;
• Multimedia technologies, including techniques for compressing audio and video, which support streaming or downloaded content;
• Graphical Web browsers, which made Internet services accessible to general users and across a wide range of hardware and software platforms;
• Search engines, including indexing, query interfaces, and spiders that build indexes of Web content;
• Data mining, which allows patterns to be inferred and relevant data to be identified from very large data sets;
• Improved understanding of human-computer interface issues, ranging from page layout and navigation design to e-commerce transaction support and online collaboration;
• Public-key and other cryptographic security capabilities that provide confidentiality and the integrity of in-transit and stored data, nonrepudiation of transactions, and the like; and
• Other security capabilities, including authentication of users, network monitoring, and intrusion detection.
SOURCE: Adapted from NRC/CSTB, 2002, Information Technology Research, Innovation, and E-Government, National Academy Press, Washington, D.C., p. 38.
4. Federal support for research has tended to complement, rather than preempt, industry investments in research.
The IT sector invests an enormous amount each year in R&D. It is critical to understand, however, that the vast majority of corporate R&D has always been focused on product and process development.28 This is what shareholders (or other investors) demand. It is harder for corporations to justify funding long-term, fundamental research. Economists have articulated the concept of appropriability to express the extent to which the results of an investment can be captured by the investor, as opposed to being available to all players in the market. The results of long-term, fundamental research are hard to appropriate for several reasons: they tend to be published openly and thus to become generally known; they tend to have broad value; it is difficult to predict in advance which will be important; and they become known well ahead of the moment of realization
as a product, and many parties thus have the opportunity to incorporate the results into their thinking. Such innovations effectively “raise everyone’s boat” in the same way as do government investments in bioscience, health care, and other strategically important scientific disciplines.29 In contrast, incremental research and product development can be performed in a way that is more appropriable. It can be done under wraps, and it can be moved into the marketplace more quickly and predictably.
Although individual industry players may find it hard to justify research that is weakly appropriable, it is the proper role of the federal government to support this sort of endeavor.30 When companies create successful new products using the ideas and workforce that result from federally sponsored research, they repay the nation handsomely in jobs, tax revenues, productivity increases, and world leadership.31 Long-term research often has great benefits for the IT sector as a whole, although no particular company can be sure of reaping most of these benefits. Appropriability also helps to explain why the companies that have tended to provide the greatest support for fundamental research are large companies that enjoy dominant positions in their market.32
Start-ups represent the other end of the spectrum. A hallmark of U.S. entrepreneurship, start-ups and start-up financing have facilitated the development of high-risk products as well as an iconoclastic, risk-taking attitude among more traditional companies and managers in the IT business. But they do not engage in research.33 Thus, start-ups are notable for two reasons: first, although start-ups at least temporarily attract some researchers away from university-based research, they place them in a position to spearhead innovation, often based on their university work, and second, notwithstanding the popular labeling of start-ups as “high-tech,” they apply the fruits of past research rather than generating more. In both respects, government funding plays a critical role in building the foundations for these innovative commercial investments.
UNIVERSITY RESEARCH AND BROADER ECONOMIC IMPACTS
Much of the government-funded research in IT has been carried out at universities.34 Between 1976 and 2009 federal support constituted roughly two-thirds of total university research funding in computer science and electrical engineering.35 Among the important characteristics of universities that contribute to their success as engines of innovation are the following:
• Universities can focus on long-term research, a special role of universities that IT companies cannot be expected to fill to the same extent.36 (Universities’ ability to carry out such research depends, of course, on federal and other sources of funding for research with a long time horizon.)
• Universities provide a neutral ground for collaboration, encouraging movement and interaction among faculty through leave and sabbatical policies that allow professors to visit industry, government, and other university departments or laboratories. Universities also provide sites at which researchers from competing companies can come together to explore technical issues.37
• Universities integrate research and education, a conjunction that creates very powerful synergies, ensuring that students are involved in projects where knowledge is being discovered, not only studied, and providing an educational foundation for the continuous learning that is so important in a fast-moving field like IT.38
• Universities are inherently multidisciplinary, and university researchers are well situated to draw on experts from a variety of fields.39 Despite cultural barriers to cross-disciplinary
collaboration, physical proximity and collegial values go a long way in enabling collaboration. The multidisciplinary nature of universities is of historic and growing importance to computer science, which interfaces with so many other fields.
• Universities are “open” both literally and figuratively, a characteristic that can pay enormous unanticipated dividends. Chance interactions in an open environment can change the world; for example, when Microsoft founders Paul Allen and Bill Gates were students at Seattle’s Lakeside School in the early 1970s, they were exposed to computing and computer science at the University of Washington and to a university spin-off company, Computer Center Corporation.
These characteristics of university research share a common element—people. U.S. research universities are unique in the degree to which they integrate research with education at both the undergraduate and graduate levels. Universities educate the skilled IT workers of the future.40 Their graduates are also by far the most effective vehicle for technology transfer, not only from universities to industry but also between university laboratories and departments, through the hiring of postdoctoral researchers and assistant professors.41 Faculty and student researchers often move into product-development roles as consultants, employees, and entrepreneurs.42 Federal support for university research drives this process. In Ph.D.-granting computer science programs, more than half of all graduate students receive financial support from the federal government, mostly in the form of research assistantships.43
Another benefit of federally funded academic research, one that does not show up in Figure 1, is its contribution to the development of open standards and open-source software that support further innovation. The standards that define the Internet had their origins in academic work, and federal support allowed many university researchers to participate in their development and evolution. The Hypertext Transfer Protocol (HTTP) daemon Web server developed with NSF support by the National Center for Supercomputing Applications at the University of Illinois powered much of the early Web, and its code base was used to develop the open-source Apache Web server that is widely used today. Similarly, many of the team members who developed the original Mosaic Web browser went on to commercialize the product in the form of Netscape Navigator, and the open-source Mozilla browser code became a foundation for the Firefox browser.
In addition to educating students and creating ideas and companies, universities often bring forefront technologies to their regions (e.g., the nationwide expansion of ARPANET in the 1970s and of NSFnet in the 1980s, and the continuation of those efforts through the private Internet activities in the 1990s and early 2000s), and universities serve as powerful magnets for companies seeking to relocate. Indeed, strong research institutions are recognized as being among the most critical success factors in high-tech economic development.44,45