KEY POINTS IN THIS CHAPTER
- The benefits of scientific research—particularly basic research—include not only innovation but also contributions to a trained workforce and to the infrastructure that enables further research and the use of scientific discoveries.
- It is impossible to predict all of the outcomes or benefits to which basic research might lead. It is equally impossible to predict all the types of research knowledge that will contribute to a future transformative innovation.
- Maintaining a level of preparedness that allows America to benefit from discoveries made elsewhere is essential, and doing so requires government support for world-class basic research in all major areas of science.
- Metrics for evaluating and policies for supporting the translation of university research into industrial innovations need to be varied and flexible to reflect the diversity of academic institutions and firms and their interactions.
- The translation of research discoveries into economically and socially viable innovations is frequently subject to a time lag, one that in many cases reflects the often prohibitive cost and risk associated with proof-of-concept research. As discussed in Chapter 6, government support for such research may be essential to overcome this barrier to the development of innovations.
- The international flow of people and ideas plays an increasingly important role in the U.S. research enterprise. This flow is supported in part by worldwide networks of researchers that advance research and enable access to a vast stock of knowledge and technological approaches offering opportunities for commercialization.
The committee’s implicit charge for this study was to identify ways of increasing the output of the U.S. research system. Although the desired outputs are numerous, Congress and others have placed particular emphasis on economic gains, so we give special attention to those contributions here, noting that these gains depend on numerous factors that cannot easily be predicted or controlled, such as the widespread adoption of an innovation. In exploring how the United States might enhance the economic benefits gained from science, we focus on one goal in particular: for the United States to be at the forefront of global competition for new technologies and other innovations. A framework for supporting this goal through a greater understanding of the system of research is described in Chapter 6. We focus on this goal not only because innovative technologies are profitable in and of themselves, but also because doing so illuminates how the research enterprise advances national goals in general. This chapter describes the complex, lengthy, and often unpredictable pathways that lead from research to innovations yielding economic and other benefits for U.S. society, illustrated by a set of detailed examples.
THE LINKS BETWEEN RESEARCH AND INNOVATION
As discussed briefly in Chapter 1 and in greater detail in Chapter 4, measuring the economic and other returns of research is not an easy task. Attempts to trace major innovations back to their original supporting research have rarely if ever revealed a direct flow of “money in, value out.” In the majority of cases, such exercises illuminate a tangled and complex yet rich and fertile path from the original investment to the final impacts on society. They reveal layer upon layer of small impacts scattered across many places, as well as coincidental exchanges of information that gradually steered the path of everyday research toward a transformative breakthrough, one that could not have been predicted (Martin and Tang, 2006).
Yet if one looks more closely at the months, years, and decades preceding transformative breakthroughs, subtle clues emerge. It becomes clear that chance favors the prepared mind. Along the path to discovery, certain grounds become more fertile, and certain environments more conducive to major innovations. Recognizing particularly fertile avenues for research often requires a close familiarity with “dry wells”—dead ends or failures in the development of scientific knowledge. Scientific knowledge thus advances through failures as well as successes, a point emphasized throughout this report. In fact, every failure in science can be considered a discovery in the sense that the project may not have achieved its original goal, but the failure plays an important role by pointing research in a more productive direction and often by providing a foundation for new discoveries.
We offer a series of illustrations. Box 3-1 provides an example of how innovation flourishes in fertile ground; Boxes 3-2 through 3-4 present examples of transformative innovations that could not have been predicted at the time of the original research; and Box 3-5 describes a failed project that unpredictably gave rise to a revolutionary idea. This concept is also illustrated in a study by the U.S. Center for Technology and National Security Policy titled The S&T Innovation Conundrum (Coffee et al., 2005, p. 1):
For example, the rapid advances in electronics and computer products over the past 50 years have created a general impression of continuous scientific breakthroughs. In reality, the breakthrough S&T innovations for electronics and computers took place in the 1940s and 1950s. The subsequent rapid advances in functional capability were the result of a brilliant and enormously successful program to exploit those early breakthroughs.
The key players in transformative breakthroughs often are well-trained researchers from diverse backgrounds who know the right people—and many of them. The right people are other talented researchers who can draw on their knowledge of diverse fields to bring fresh perspectives to stale problems. Mathematics, statistics, and computer science, for example, help advance discoveries in other scientific fields, while the social sciences provide information, incentives, and institutions that advance the use of research discoveries in all the sciences.
Once the above clues are assembled, they reveal the commonalities of most transformative innovations. Economic and other societal impacts begin with the generation of basic knowledge. Such research may be undertaken for no other reason than to satisfy curiosity. However, a broad and deep knowledge base is necessary for the development of new technologies. People and publications distribute basic scientific knowledge via networks and research institutions. Through its eventual incorporation into products, processes, and business practices—most readily in geographic hubs of innovation, where research institutions are located in close proximity to an external community of funders, human intellectual capital, skilled labor, supplier networks, manufacturers, vendors, and technology-oriented lawyers and consultants (Warren et al., 2006)—this knowledge generates economic and societal benefits.

BOX 3-1
Factors That Influenced the Spread of the Hybrid Corn Innovation

About 95 percent of corn now grown in the United States is hybrid corn, but this was not always the case. In the 1930s, almost no hybrid corn was grown. The father of hybrid corn, G.H. Shull, a geneticist at Cold Spring Harbor, New York, began experiments in 1906 to understand the genetics of corn. At the same time, E.M. East conducted similar experiments at Connecticut State College. Their studies provided an important basis for industry research and for research conducted at state and federally supported experiment stations and in corn research programs. Shull’s research ultimately led to one of the most significant agricultural innovations, as hybrid corn went from being unknown in the 1930s to being used by more than two-thirds of all farmers by the 1940s. Hybrid corn allowed farmers to produce more corn on less land, and the investment in this research at agricultural experiment stations yielded a return of about 40 percent (Scott et al., 2001).

Analyzing information about the spread and adoption of hybrid corn among farmers in the United States, economist Zvi Griliches teased out the key factors affecting the dispersal of an innovation. The challenge with large-scale commercialization of hybrid corn was the need to customize the hybrid to a particular region based on growing conditions. While simply examining the initial and ultimate spread of commercial hybrid corn provided little information, an examination of multiple factors together yielded a clearer picture. Locations with the best growing conditions were the first to market hybrid corn. When hybrid corn reached 10 percent of the total corn grown in the United States, superior hybrids and additional farm machinery for harvesting corn allowed other farmers to grow it profitably. Further movement into each state was directly linked to the capabilities of the state’s experiment station. While hybrid corn was adopted more rapidly in the north than in the south, for example, southern states with larger experiment stations, such as Florida, Louisiana, and Texas, adopted it more rapidly than other states in the region. Thus, the second major factor affecting adoption of hybrid corn was the proximity of and access to resources at state agricultural experiment stations, funded by the U.S. government. Griliches’ study demonstrates the importance of regional factors to the adoption and diffusion of novel products and concepts, as well as the importance of federal funding in overcoming regional constraints.

SOURCE: Adapted from Griliches (1957).
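Griliches summarized these adoption dynamics by fitting logistic (S-shaped) diffusion curves to hybrid corn acreage in each region, each curve characterized by an origin, a rate of acceptance, and a ceiling. The sketch below illustrates that functional form; the parameter values are invented for illustration and are not Griliches’s estimates.

```python
import math

# Logistic (S-shaped) diffusion curve of the kind Griliches fit to
# hybrid corn adoption region by region:
#   P(t) = K / (1 + exp(-b * (t - t0)))
# where K is the adoption ceiling, b the rate of acceptance, and t0
# the year at which adoption reaches half the ceiling.
# All parameter values here are hypothetical, for illustration only.

def adoption_share(t, K=0.95, b=0.8, t0=1938):
    """Fraction of corn acreage planted with hybrid seed in year t."""
    return K / (1.0 + math.exp(-b * (t - t0)))

for year in (1934, 1938, 1942):
    print(year, round(adoption_share(year), 3))
```

Griliches related differences in these parameters across regions to the supply and profitability factors discussed in the box above, which is how a single simple curve could organize the many regional influences on adoption.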
BOX 3-2
The MASER, Forerunner of the Laser
For a decade-long stretch of his career, Charles H. Townes, the inventor of laser technology, had to fight to convince others of the possibility, and the value, of the seemingly obscure technique of amplifying waves of radiation into an intense, continuous stream. During his career, he received funding from the National Science Foundation and the U.S. Navy.
Townes, born in 1915 in Greenville, South Carolina, had earned his Ph.D. at the California Institute of Technology and then went to work at Bell Labs. Later, as a professor at Columbia University, he began work on generating a controlled, extended stream of microwaves through contact with an electron in an excited state. The project sounded frivolous even to his colleagues, who told him directly that they thought he was wasting the university’s money.
In 1953, Townes, James Gordon and H.J. Zeiger built the first MASER (microwave amplification by stimulated emission of radiation). About 5 years later, Townes and Arthur Schawlow published a paper saying the MASER’s principles could be extended to amplify radiation at the frequencies of visible light, thus introducing the principle of laser technology.
Even then, Townes encountered doubters who saw no value in the technology. Luckily, however, the scientific community began to grasp the technology’s implications. In 1960, Theodore Maiman built the first laser.
The laser became the basis of countless technologies we use in our daily lives. Without lasers, the Internet and digital media would be unimaginable. Computer hard drives, CDs, digital video and satellite broadcasting would not exist. Nor would laser eye surgery or laser treatment for cancer.
SOURCE: Reprinted with permission from Golden Goose Award (2014).
THE RELATIONSHIP BETWEEN UNIVERSITY RESEARCH AND INDUSTRIAL INNOVATION
Universities in the United States have a long tradition of engagement with industry in research and other collaborative activities. This pattern of engagement has benefited from a two-way flow of ideas and people between academic and industrial research settings, and has included extensive patenting and licensing of university inventions to industry. Contributing to this pattern of collaboration have been both the historical structure of the U.S. system of higher education and factors external to U.S. universities, such as relatively high levels of domestic interinstitutional mobility of researchers and new-venture financing from various private sources. But the connection between U.S. universities and innovation in industry throughout the 20th and 21st centuries has relied on a number of different channels, including, among others, the training of students, faculty consulting, publication of research advances, and industry-sponsored research. These channels operate in parallel and are interdependent. Moreover, the relative importance of different channels of interaction and information flow between academic and industrial researchers appears to vary considerably among research fields.

BOX 3-3
Green Fluorescent Protein

In 1962, an organic chemist named Osamu Shimomura, working in the Department of Biology at Princeton University, was interested in jellyfish and in learning how and why they glowed bright green under ultraviolet light. The recent Ph.D. graduate collected millions of jellyfish to isolate the source of their bioluminescence, and after many years of careful science, he finally succeeded in identifying the mechanism. He called the responsible protein “green fluorescent protein” or GFP. Beginning in the 1970s, Shimomura received funding from the National Science Foundation to explore the biochemistry of this luminescence further.

In the 1980s, Shimomura’s studies attracted the attention of a young investigator at Woods Hole Oceanographic Institution named Douglas Prasher, who wanted to attach GFP to the bacterial proteins he was studying so they would glow brightly when expressed in a cell. Prasher sought $200,000 from the American Cancer Society to clone and sequence the gene for GFP. He succeeded in publishing the relatively short protein sequence, but he ran out of funding before he could actually use GFP as a tag on the bacterial proteins, and had to set the project aside. Although he failed to achieve his initial goal, Prasher did something even more valuable: he shared the cloned gene with hundreds of other scientists, including Columbia University biochemist Martin Chalfie, and University of California, San Diego biochemist Roger Tsien, who would later share the 2008 Nobel Prize in Chemistry with Shimomura for their work in honing the GFP technology.

Chalfie heard about GFP at a seminar and decided to ask Prasher for the sequence so he could use GFP to tag proteins in some of the worms (C. elegans) he was studying, using funds from the National Institutes of Health. On the opposite side of the nation, unbeknownst to Chalfie, Tsien applied his previous research on the chemistry of fluorescent dyes to alter the color GFP would produce when exposed to ultraviolet light, thus allowing protein tags of many different colors to be used at once. In 1996, a scientist at the University of Oregon, Jim Remington, collaborated with Tsien to determine the crystal structure of GFP, using funds from NIH.

With this new set of tools, biomedical scientists have opened up vast new capabilities in research. The applications of GFP are ubiquitous in both basic and applied research. Shimomura did not set out to revolutionize biology or medicine; he simply wanted to understand a complex creature. Chalfie wanted to find a way to understand the neurobiochemistry of a simple model organism in more detail, and was inspired by a seminar he attended in his department. Tsien saw the potential to improve the tools available to biologists.

According to a description of the award on NobelPrize.org, the work of these researchers has made it possible today to “follow the fate of various cells with the help of GFP: nerve cell damage during Alzheimer’s disease or how insulin-producing beta cells are created in the pancreas of a growing embryo. In one spectacular experiment, researchers succeeded in tagging different nerve cells in the brain of a mouse with a kaleidoscope of colours.”

SOURCE: Adapted from NobelPrize.org (2008).

BOX 3-4
Corning® Gorilla Glass®

Gorilla Glass® is in most people’s pockets, but it started with a faulty furnace and spent nearly 40 years as a shelved idea. The idea for Corning’s ultralight, ultra-thin, and virtually indestructible glass—used on the surfaces of most modern mobile phones and laptop computers—emerged one morning in 1953, when chemist S. Donald Stookey accidentally overheated a sheet of lithium silicate photosensitive glass. Because of a faulty temperature controller, the furnace Stookey was using heated the glass to 900ºC rather than 600ºC. Instead of melting, however, the glass transformed into a milky white ceramic plate and bounced, rather than breaking, when it fell to the floor. Completely by accident, Stookey had discovered a new realm of high-temperature chemistry.

This was the start of Corning’s Project Muscle—a research initiative focused on developing strengthened glass products. A key outcome was the realization that the glass could be strengthened through ion exchange by means of hot salt baths, with smaller sodium ions being traded for larger potassium ions. In 1961, Corning unveiled Chemcor glass—a highly durable ceramic that was quickly incorporated into the company’s existing product lines.

But Corning could not find a consistent market for Chemcor; it was a solution in search of a problem. Both Chemcor and Project Muscle were shelved in 1971. Chemcor did not reemerge until 2007, when the widespread use of smartphones suddenly generated the need for strong, thin, lightweight, mass-produced glass. Apple’s Steve Jobs is rumored to have rediscovered Chemcor’s properties and to have requested further improvements. Previously, Chemcor had been produced around 4mm thick, was slightly cloudy, and was manufactured only in small batches. Jobs wanted it to be 1.3mm, clear, and stretchy at relatively low temperatures. And he needed it in 6 weeks for a new idea called the iPhone.

Adam Ellison and Matt Dejneka, two of Corning’s compositional scientists, were given the task of adapting part of the Corning fusion production facility in Harrodsburg, Kentucky, to meet Apple’s first request, as well as reformulating the composition of Chemcor itself. Corning’s commitment to research—for which it is known and to which it has held true throughout its history—as well as its recognition of the sometimes delayed benefits of research, led to a product that can now be found in more than 750 commercial products and 33 brands worldwide.

SOURCE: Adapted from Gardiner (2012).
The so-called “Bayh-Dole era” that began in 1980 (discussed in Chapter 2) extended and expanded this engagement. Important as well was extensive federal support for research, notably in the life sciences, which produced important advances that sparked growth in university patenting and licensing, increasingly managed directly by U.S. universities, during the 1970s. There is little evidence that increased faculty engagement in entrepreneurial activities during the post-1980 period, including the formation of new firms and the patenting and licensing of inventions, negatively affected the scholarly productivity of leading researchers (Ding and Choi, 2011). Nonetheless, the efforts of U.S. universities to manage their intellectual property more directly for revenue purposes have sparked criticism from U.S. firms, especially those engaged in information technology. In response to this criticism, some U.S. universities have experimented with new approaches to the management of patenting and licensing that take into account the differences among research fields in the importance of patents relative to other vehicles for information exchange and technology transfer. Research universities can contribute to or inhibit faculty start-ups through their reward systems. Some academic departments look askance at patents in tenure consideration, while others regard patents more highly. In recent years, institutions such as the University of Maryland have begun formally counting patents and commercialization in tenure reviews (Blumenstyk, 2012). Similarly, the Massachusetts Institute of Technology (Ittelson and Nelsen, 2003) and Carnegie Mellon University (Simmons, 2013) have been recognized for encouraging entrepreneurship and faculty start-ups through supportive policies.

BOX 3-5
Failed Research That Inspired the Discovery of Novel Therapeutics: Antidepressants

In the early 1950s, researchers tested a new drug, iproniazid, for treatment of tuberculosis. It was not an effective treatment, but the researchers reported that the drug made a number of patients “inappropriately happy.” This discovery ushered in a new era of biological research on depression, leading to the development of antidepressant drugs. Iproniazid became the first marketed antidepressant.

SOURCE: Adapted from Burns (1999).
Reflecting the complex roles of university technology transfer programs in regional and national economies, an array of institutional goals can be pursued through such programs. But these goals are not always mutually consistent, so policy priorities for these programs must be established and clearly linked to current policies. Revenue-maximizing licensing strategies may be shortsighted (Ewing Marion Kauffman Foundation, 2012).
Metrics for evaluating the performance of universities in transferring technology and supporting industrial innovation are informative when they are aligned with the specific goals of a given university or research institute and account for the full breadth of channels through which university research influences industrial innovation, including the training and placement of students, faculty research publications, faculty- or student-founded firms, patents, and licenses. Given the lack of data covering these various channels for most U.S. universities, as well as the need for metrics to be tailored to the goals and environments of individual universities, it appears unrealistic and unwise for federal agencies or other government evaluators to impose a single set of metrics for measuring the technology transfer performance of all U.S. universities. Trying to apply an evaluation framework that does not take adequate account of the diverse channels of university influence or the differences among universities would only serve to diminish the institutional heterogeneity that historically has been a strength of the U.S. system of higher education.
This institutional heterogeneity also implies a need for flexibility and variety in the policies used by U.S. universities to support interactions with industry and the commercialization of academic research advances. The Bayh-Dole Act and other relevant federal policies do not specify any single institutional structure for managing patenting, licensing, and related activities in university-industry collaborations. But U.S. universities have been slow to implement and evaluate different approaches to managing these activities during the three decades since the act’s passage. Such experimentation, combined with efforts to assess the effectiveness of alternative approaches, is not likely to advance to the extent that would be desirable without the encouragement of federal agencies, industry, and other stakeholders. Nonetheless, no single approach is likely to prove feasible or effective across the numerous and diverse academic institutions and private firms engaged in federally funded research and industry collaboration. Appendix B elaborates on the relationship between university research and industrial innovation.
THE UNPREDICTABLE TIMELINE FROM RESEARCH TO SOCIETAL IMPACT
In many cases, a significant time lag separates the original research from the commercialization of an innovation incorporating the knowledge generated by that research. Sometimes this time lag represents the long wait between an original research finding and its sudden and unexpected relevance to a breakthrough innovation. The basic science research that enhanced understanding of the mathematics of nonlinear control
theory, for example, eventually made it possible to create electrical power grids that rarely fail.
This time lag occurs, however, even when a research finding has readily apparent applications. This is because many research discoveries intended for future development and commercialization, such as the technology used to develop efficient fuels, must first cross the so-called “valley of death”—the often prohibitive cost and risk associated with proof-of-concept research. In some cases, the industry and venture capital support needed to develop a concept or invention vastly exceeds the funding that produced the original concept or invention.
Only after crossing this valley can the technology be incorporated into a concept model or laboratory prototype that provides a platform for the subsequent applied research and development needed for a product to compete in the marketplace. But technology concept models and laboratory prototypes must be achieved quickly, before others can exploit the discoveries on which they are based for commercial advantage. In that sense, the time lag associated with proof-of-concept research is particularly important in the race to commercialize research discoveries with immediately obvious applications.
The complexity of modern technologies has increased the difficulty of translating basic science advances into economically and socially valuable technologies. Universities, industry, and government are all investing in crossing the “valley of death” within the limitations imposed by the time lag, expense, and risk that characterize the path from basic science to the industrial laboratories where innovations are created. As discussed in Chapter 6, government support for proof-of-concept research may be essential to overcome this barrier to the development of economically and socially viable innovations.
CONNECTING THE DOTS FROM RESEARCH TO INNOVATION
Research universities have the primary goal of generating knowledge and dispersing it through the nation’s most talented people. One of the greatest benefits of research universities is the workforce they train—their talent, abilities, knowledge, skills, and experience and the networks of professional connections they have made. Students trained in research develop critical thinking skills and an ability to help solve some of the most complex problems facing society, ranging from the technical (energy efficiency, climate change, cybersecurity) to the social (the economy, crime, an aging population, immigration).
The funding provided to research universities is therefore crucial to the societal benefits derived from the research enterprise. An example of research funding used to develop new approaches to training is the
National Science Foundation’s (NSF) 2002-2005 Department-Level Reform Grant Program, which funded 20 engineering departments to transform their undergraduate teaching from a stovepiped approach, focused solely on teaching engineering concepts, to an approach providing an education in the context of achieving societal goals. The specifics of the approaches taken by each of the departments differed, but they all included partnerships with nonengineering departments, service learning projects, and hands-on application of the concepts learned. Other interdisciplinary programs followed. The focus of these programs on theory, application, and interdisciplinary experiential education has been deemed effective, although the programs’ long-term effectiveness, including the impact on students’ careers, has not been fully evaluated (Shipp et al., 2009).
The flow of knowledge occurs when talented people forge new connections with other talented people and migrate both geographically and intellectually between positions in academia, private industry, and the government. This flow is channeled through networks and partnerships, aided by publications, citations, and other correspondence, so that bits of knowledge emerge when and where they are needed most. With the increasingly important role of the Internet in scientific research, these networks are expanding and enabling virtual collaborations. As knowledge emerges at different times and in different places, it evolves and expands. People with diverse backgrounds transform it and present it in new ways, with fresh perspectives. Networks of researchers and institutions enable discoveries, ideas, instrumentation, and analytical methods to be shared among the world’s best talent, inspiring the ultimate use of knowledge from research. These networks can also encompass scores of volunteers working with scientists on real-world research projects—a movement known as Citizen Science (Bonney et al., 2014).1 In addition, the ready availability of knowledge enables serendipity and increases the potential for transformative innovations.
Increasingly, these flows of people and ideas occur internationally and play an important role in research and innovation in the United States. Private industry now invests in research laboratories abroad, and the findings from these laboratories feed back into U.S. research and innovation. Encouraging the mobility of researchers across national boundaries as well as among domestic research institutions remains a challenge for most nations; however, a UK Royal Society report indicates that Australia, Canada, the United Kingdom, and the United States attracted the largest numbers of highly skilled migrants2 from OECD countries in 2001, followed by France and Germany (OECD, 2002b). China perhaps experiences the most extreme challenges with mobility (Ministry of Science and Technology of the People’s Republic of China, 2007). While it produced 1.5 million science and engineering graduates in 2006, 70 percent of the 1.06 million Chinese who studied abroad between 1978 and 2006 did not return to China (GOV.cn, 2010). In 2008, the Chinese government established the Thousand Talents Program, which brought more than 600 overseas Chinese and foreign academics back to their native country (The Royal Society, 2011).

1Citizen Science has been defined by the Cornell Laboratory of Ornithology at Cornell University in Ithaca, New York, which helped pioneer the concept, as “projects in which volunteers partner with scientists to answer real-world questions.” More information is available at: http://www.birds.cornell.edu/citscitoolkit/about/definition [June 2014].
Today, knowledge from basic science moves more rapidly than ever across international borders, and research findings can be shared in a public forum (e.g., the GenBank database of genetic and proteomic findings) to become immediately accessible to all researchers worldwide. A study by Griffith and colleagues (2004) suggests that foreign research and development can spill over domestically and have an impact on productivity. As discussed in greater detail in Chapter 6, the United States needs to educate and attract the scientists and engineers who understand and can advance these findings by conducting world-class basic research in all major areas of science, with “major areas” being defined as broad disciplines of science, their major subdisciplines, and emerging areas such as nanotechnology (National Academy of Sciences, 1993).
This requirement is emphasized in a 1993 report of the National Academy of Sciences (NAS), Science, Technology, and the Federal Government: National Goals for a New Era, which stresses the importance of the United States being among the leaders in all major areas of science (National Academy of Sciences, 1993). In particular, the report is noted for its argument that maintaining a world-class standard of excellence in all fields will help ensure that the United States can “apply and extend scientific advances quickly no matter when or where in the world they occur” (Experiments in International Benchmarking of U.S. Research Fields [National Academy of Sciences, 2000, p. 5], in reference to the 1993 NAS report). To this end, the federal investment must be vigorous enough to support research across the entire spectrum of scientific and technological investigation. Because fields are interconnected, neglect of one field, allowing its capabilities and infrastructure to fall behind those elsewhere, could impede domestic progress in other fields or stifle innovation. The importance of nurturing all fields of scientific research to foster transformative innovations is illustrated by the case study in Box 3-6.
2Highly skilled migrants are defined by OECD as workers who have completed education at the third level in a science and technology (S&T) field of study, or who are employed in an S&T occupation in which that level of education is typically required.
BOX 3-6
Genomics and the Big Bang Theory
In 2001, three astrophysicists published in Science a confirmation of the Big Bang theory of the creation of the universe (Miller et al., 2001a, 2001b). They studied the imprint of so-called acoustic oscillations on the distribution of matter in the universe and showed it to be in concordance with the distribution of cosmic microwave background radiation from the early universe. This discovery not only provided support for the Big Bang theory but also yielded an understanding of the physics of the early universe that enabled predictions of the distribution of matter from the microwave background radiation forward and backward in time.
The discovery was made using a statistical method—the false-discovery rate—to detect the oscillations. The impetus for this method was the development of technologies that allowed for the rapid collection and analysis of data on a large number of distinct factors.
Collaborating with the astrophysicists, a statistician developed the method further for their research. Using this method, the authors were able to make their discovery and publish it in Science while others were still wrestling with the plethora of data. Based on the method’s applications to cosmology, statisticians conducted research to improve it, and it is now used in many other applications. This method has been applied in genomics, for example, so that for a small sample of individuals, thousands of genes can be tested simultaneously to determine how they differ in affecting a biological condition.
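The best-known method for controlling the false-discovery rate is the Benjamini-Hochberg step-up procedure; the cosmology and genomics applications described above built on variants and extensions of this idea. Below is a minimal sketch of the classic procedure, using hypothetical p-values for illustration.

```python
# Benjamini-Hochberg step-up procedure: control the false-discovery
# rate (FDR) at level q across m simultaneous hypothesis tests.
# The p-values in the example are hypothetical, for illustration only.

def benjamini_hochberg(p_values, q=0.05):
    """Return the (sorted) indices of hypotheses rejected at FDR level q."""
    m = len(p_values)
    # Rank the p-values while remembering their original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Step up: find the largest rank k with p_(k) <= (k / m) * q.
    threshold_rank = 0
    for rank, idx in enumerate(order, start=1):
        if p_values[idx] <= (rank / m) * q:
            threshold_rank = rank
    # Reject every hypothesis ranked at or below that threshold.
    return sorted(order[:threshold_rank])

p = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.21, 0.9]
print(benjamini_hochberg(p, q=0.05))  # → [0, 1]
```

Because the rejection threshold adapts to the rank of each p-value, the procedure scales to thousands of simultaneous tests, the situation faced in both the cosmology and genomics applications, while controlling the expected fraction of false discoveries rather than the chance of any single one.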
The possibility of innovation is enhanced when tools and resources considered “missing” in some fields of science can be handed over by colleagues in other fields at the precise moment that they matter most (see Box 2-5 in Chapter 2). When investments in research create a fertile environment for innovation, the United States has a greater capacity to build scientific infrastructure, generate knowledge and human capital, and reap economic and other societal benefits. When the environment is fertile for innovation, the nation is better prepared for an uncertain future.
A wise investment in America’s future, therefore, is an investment in knowledge: in the researchers who generate it, in the research colleges and universities that disseminate it, and in the networks of scientists and engineers who transform and ultimately use it. The value of knowledge becomes most evident through its eventual applications, which often cannot be predicted. Nonetheless, investing in the generation, dissemination, and use of knowledge will better ensure that research leads to applications that benefit society, some in transformative ways. Many research findings will eventually have unexpected applications that differ from a project’s original goal. The task for government management of research is not to predict, much less control, the future but to allow discoveries to emerge from these investments.