BOX 3
Assessing the U.S. IT R&D Ecosystem

The U.S. information technology (IT) research and development (R&D) ecosystem was the envy of the world in 1995: the United States enjoyed a strong industrial base, an ability to create and leverage ever newer technological advances, and an extraordinary system for creating world-class technology companies. But the period from 1995 to the present has been a turbulent one for the U.S. IT R&D ecosystem. Today, this ecosystem, which encompasses university and industrial research enterprises, emerging start-ups and more mature technology companies, the investors who finance innovative firms, and the surrounding regulatory and legal frameworks, remains unquestionably the strongest such ecosystem in the world.

However, this position of leadership is not a birthright, and it has come under pressure. The IT industry has become more globalized, especially with the dramatic rise of the economies of India and China, fueled in no small part by their development of vibrant IT industries. Moreover, those nations represent fast-growing markets for IT products, and both are likely to build their IT industries into economic powerhouses for the world, reflecting deliberate government policies and the presence of strong, competitive private-sector firms, both domestic and foreign. Ireland, Israel, Korea, Taiwan, Japan, and some Scandinavian countries have also developed strong niches within the increasingly globalized IT industry.

As a result, the United States risks ceding IT leadership to other nations within a generation unless it recommits to providing the resources needed to fuel U.S. IT innovation, to removing important roadblocks that reduce the ecosystem's effectiveness in generating innovation and the fruits of innovation, and to remaining a lead innovator and user of IT.

____________

SOURCE: Adapted from NRC/CSTB, 2009, Assessing the Impacts of Changes in the Information Technology R&D Ecosystem, The National Academies Press, Washington, D.C.

One of the most important messages of Figure 1 is the long, unpredictable incubation period, one requiring steady work and steady funding, between initial exploration and commercial deployment.20 The time from first concept to successful market is often measured in decades, in contrast to the more incremental innovations that tend to be publicized as evidence of the rapid pace of IT innovation. Starting a new project takes considerable time and is often risky, but the payoffs can be enormous. It is often unclear which aspect of an early-stage research project will ultimately prove most important. Fundamental research produces a range of ideas, and those bringing technologies to market draw on them as needs emerge. Indeed, the utility of ideas may become evident only well after they have been generated. For example, early work in coding theory ultimately made possible the modern cell phone and streaming video over the Internet, and today's cloud computing owes much to decades of research in distributed computing.
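
To make the coding-theory example concrete, the short Python sketch below (illustrative only, not drawn from the report) shows the simplest form of the idea: a repetition code that corrects a single transmission error by majority vote. The codes actually deployed in cell phones and video streaming are vastly more sophisticated, but they rest on the same principle of adding structured redundancy to data.

    # A minimal sketch of the idea behind error-correcting codes:
    # a 3x repetition code that survives any single bit flip per block.
    # Real systems use far stronger codes (convolutional, turbo, LDPC),
    # but the principle of structured redundancy is the same.

    def encode(bits):
        # Repeat each data bit three times.
        return [b for bit in bits for b in (bit, bit, bit)]

    def decode(received):
        # Majority vote within each block of three recovers the original
        # bit even if one of the three copies was corrupted in transit.
        return [int(sum(received[i:i + 3]) >= 2)
                for i in range(0, len(received), 3)]

    message = [1, 0, 1, 1]
    sent = encode(message)
    sent[4] ^= 1  # simulate channel noise: flip one transmitted bit
    assert decode(sent) == message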

Because of unanticipated results and synergies, the exact course of fundamental research cannot be planned in advance, and its progress cannot be measured precisely in the short term. Even projects that appear to have failed or whose results do not seem to have immediate utility often make significant contributions to later technology development or achieve other objectives not originally envisioned. The field of number theory provides a striking example. For hundreds of years a branch of pure mathematics without applications, it became a foundation for the public-key cryptography that underlies the security of electronic commerce.21
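
The link between number theory and electronic commerce can be made concrete with a toy version of RSA, the best-known public-key scheme. The Python sketch below (illustrative, with deliberately tiny primes) uses nothing beyond the modular arithmetic that number theorists studied for centuries; production systems use primes hundreds of digits long, which is what makes the private key infeasible to recover from the public one.

    # Toy RSA key generation, encryption, and decryption. The security
    # rests on a number-theoretic fact: recovering d from (e, n) requires
    # factoring n, which is infeasible for realistically large primes.

    p, q = 61, 53                  # two secret primes (absurdly small here)
    n = p * q                      # public modulus: 3233
    phi = (p - 1) * (q - 1)        # Euler's totient of n: 3120
    e = 17                         # public exponent, coprime to phi
    d = pow(e, -1, phi)           # private exponent, e*d = 1 (mod phi);
                                   # modular-inverse pow needs Python 3.8+

    message = 42
    ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
    recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
    assert recovered == message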


