Ask someone to draw a picture of an innovator and you’re likely to get some version of Thomas Edison or Benjamin Franklin—a prolific lone genius fueled by astounding creativity and an almost magical ability to intuit what the world needs. In truth, however, innovation is not so much a person as a process—and a rather meandering, messy, and long process at that.
Take a close look at any single information technology (IT) advance and you’re bound to find behind it a sprawling network of inventors, researchers, engineers, investors, and precursor technologies. Though it certainly has its heroes, innovation in information technology is the story of collaboration, borrowing, and exchange among many, many contributors over the course of years and decades.
Neither the private sector, nor university researchers, nor the federal government has a monopoly on IT innovation. It is the interplay among these contributors, with their disparate motivations, strengths, and limitations, that creates the innovation ecosystem in which theories and ideas can lead to the experimentation that spawns technologies and, ultimately, applications.
In this chapter, three leading innovators dissect the research-to-application pipeline from different perspectives: Deborah Estrin on the value of application-engaged research; Robert Colwell on the motivations of different stakeholders within the IT innovation ecosystem; and Farnam Jahanian on the sometimes unpredictable journey from insight to innovation, and the imperative for the United States to remain at the forefront of IT.
In the medical community, the process of translating findings from fundamental research into practical applications is known as translational research. It’s a process that can take 10-20 years as basic chemistry or biology research is applied to develop new drugs, medical devices, or other innovations that improve medical care. The analogous process in computer science is sometimes called “application-engaged research,” and in this field the invention-to-innovation cycle can happen far more quickly. This rapid cycle is a significant driver behind the enormous growth in the technology sector.
A presentation by Deborah Estrin, professor of computer science at Cornell Tech and professor of public health at Weill Cornell Medical College, focused on this critical relationship between the invention of a new tool or technique and the innovation that happens through its use. It was a theme echoed throughout much of the workshop, from Rodney Brooks’s exploration of the back-and-forth process of building robots to Jaime Carbonell’s description of data-driven machine learning techniques.
Estrin has spent her career on application-engaged research, often at the intersection of technology and health. A pioneer in the field of networked sensing—the use of mobile and wireless technology to collect real-time data about the physical world—she currently directs the Small Data Lab at Cornell Tech. There, her team develops technologies that harness what she terms “small data,” the small bits of information generated from the personal technology we use every day, for applications that support healthy living and other goals.
Estrin has long focused on application-engaged research, an approach she calls “grounding your work.” She pointed to advice she received from Jim Waldo, now chief technology officer at Harvard University, that helped crystallize the focus on solution-oriented research that has characterized her career: Waldo called for researchers to avoid wasting time inventing creative problems, and instead to spend time devising creative solutions to problems that someone in the world has articulated.
Estrin also recalled a remark by Judea Pearl, professor of computer science at the University of California, Los Angeles, a 2011 Turing Award winner and a renowned researcher in the field. Pearl once commented to a Ph.D. student, “That your approach is generalizable does not release you from the responsibility of showing us one thing it actually does.” To Estrin, this sums up the idea that innovation is most successful when it is grounded in actual use.
She cited government programs as having fueled much of her work. Even serving on government-led committees has had a huge impact on the application-engaged work
she does today: Her tenure at the Information Science and Technology study group of the Defense Advanced Research Projects Agency (DARPA) inspired her to pursue networked sensing. She said DARPA’s other programs, such as SensIT (Sensor Information Technology) and NEST (Network Embedded Systems Technology), allowed her to keep that research moving forward.
The relationship between invention and innovation, or research and application, is a two-way street, not a unidirectional flow, said Estrin. She continued, “When you’re doing this kind of multidisciplinary application-driven work, as Margaret [Martonosi] said a decade ago, you have to take turns.” To Estrin, this means that researchers and technologists should not insist on innovating on the “how” and the “what” at the same time, but rather oscillate between the two in order to solve both theoretical and practical problems effectively. Furthermore, any line of inquiry may be rapidly and surprisingly enriched by development from another line, propelling one’s own work forward.
As an example, Estrin and her team leveraged funding from the National Science Foundation (NSF) to bring different domain experts into their work, taking turns in order to propel co-innovation in networked sensing. While she was conducting her research into networked sensing, smartphone use rose substantially and, at the same time, there was a big leap forward in methods of statistical analysis. Pairing those developments with her own inventions inspired her research focus today: improving health management using mobile devices, sensors, and the digital transactions of individuals. Estrin observed that the NSF’s Science and Technology Centers program has given her research group the “funding and time to really bring the domain experts into the same room and process for a long enough period of time that we could take on authentically application-driven problems that transformed both the applications and the technology.”
According to her, once a product can be used, “reality gets to push back,” and it is this push and pull between research and the real world that drives innovation. In her own research experience, she said, “Scientists and engineers who were trying to measure something specifically gave us concreteness; it pushed back on us to give up on some of the things that we thought were most elegant and focus on a different set of problems that turned into effective technology for them and also led us to new technical challenges.”
Estrin credits government funding with allowing her to take on a wider range of problems than would be possible in an industry setting, with its necessary focus on near-term business models. Building, testing, and using working systems requires patience and committed funding. The interplay between invention and innovation also tends to amplify good ideas and to expose those that are not useful. Shared and open-source tools, she said, represent one good idea with roots in government-funded research that was amplified through this push and pull. The Internet protocol suite, Web browsers, and TinyOS are examples of powerful open-source tools that industry would not or could not have invented without government-supported work.
As several participants pointed out, a key difference between university research and industry research is that the former is not constrained by specific business models or the financial market’s pressure to meet annual revenue targets. “It’s so important that this work also happen in the university, because the university can take on problems that extend beyond the incentive structure of any individual product, company, or industry,” said Estrin. Health care, she said, is a particularly clear example of a field in which university research, especially in health care IT, has propelled work that the private sector was not willing to do. With government funding, academic researchers have taken on, and can continue to take on, a broader range of problems in application-engaged work that moves technology forward for health care and many other fields.
As someone with a long history working at the forefront of the technology industry, Robert Colwell offered a unique perspective on the processes and motivators behind technology research and development in government, academia, and industry. He spent most of his career engineering microprocessors at Intel, where he was chief architect on the Pentium Pro, Pentium II, Pentium III, and Pentium 4 microprocessors. After retiring from Intel, he served as director of DARPA’s Microsystems Technology Office.
Although Colwell has not himself been the recipient of federal grants for academic research since his graduate studies, he is a strong believer in the inherent value of such investments. In addition, Colwell emphasized in his presentation the importance of military technology as a driver of technological advances in the commercial sector. As technology developed for military applications is adopted for public and commercial use, government investments in computer science and engineering pay double dividends.
Colwell presented a brief history of technological innovation, beginning with an example of one of the earliest known computers: In 1943, John Mauchly and J. Presper Eckert built a machine for the U.S. military that quickly computed the math and physics relevant to the
angles at which ballistic shells drop and explode. By the 1960s, computers were enormous but fast, becoming a part of mainstream research. At the same time, the military continued investing in improving computing capabilities to tackle more complex military problems. In the 1970s, the first microprocessors were invented, allowing computers to get substantially smaller. The 1980s brought faster microprocessors and the invention of the personal computer. In the 1990s, computers continued to improve, and the Internet and cell phones emerged. The 2000s brought smartphones and tablets, the rise of search engines, an explosion in social media, and hundreds of other innovations now prevalent in the daily life of billions of people.
Colwell stressed that one main lesson to be drawn from the history of technology in the modern era is that it is impossible to predict how government-funded technology will be adapted and used. A story from Intel illustrates the unexpected turns innovation can take: When Colwell began working on a computer chip in 1990, the Internet wasn’t yet a pervasive aspect of personal computing, so the engineers did not factor it into the chip’s capabilities. By 1995, when the chip was finally ready, the Internet had evolved, and it was partly a matter of luck that the chip did not need severe redesign to accommodate that new market. Even at the front lines of the technology industry, the Internet came as a surprise. “Fundamentally, even for people in the industry designing the actual hardware, we didn’t know what was coming next,” he said.
Researchers have never been able to accurately predict what faster, stronger, better computers will enable. But, while one cannot predict the future, one can extrapolate from the past. And what the past tells us, according to Colwell, is that the computing technology that has transformed our world would not have been possible without government-funded academic research. Using a smartphone as an example, he listed numerous component technologies that stemmed from government-funded research, including the Internet browser, the camera, GPS, the embedded antenna, and the battery, among others (Figure 1.1). “We’ve never been any good at predicting what better computers will enable . . . . We just have a faith that better technology is better technology, and smart people will figure out something really cool to do with it,” said Colwell.
While industry, academia, and government all conduct research, they have very different motivations. Based on his experience in a long industry career, Colwell attested to the reality that for-profit companies take a narrow, short-term view of technology. Whereas academic researchers might be able to step back and examine the larger picture, a company focused on earnings doesn’t always have that luxury. He said that industry is also
looking to sell things on a large scale. The implication of this is that if a new technology is not immediately going to sell 10 million units, it may not be worth risking money to develop it.
Of course, no one knows what product or innovation will or will not sell millions of units at some point in the future. Colwell shared that when he was at Intel, the company was blind to the potential of mobile computing. Other examples of inventions we might not have without government research funding include the computer mouse, computer graphics programs, and the Internet. Government funding also supports the Ph.D. students who carry out much of this research and design work. Colwell’s own doctoral work in the early 1980s was sponsored by the U.S. Army.
As a demonstration of one of industry’s inherent limitations, Colwell recalled that in the first years of the Internet, AOL, CompuServe, and other early e-mail and Internet providers had carved out separate online spaces that worked fine on their own, but intercommunication was complex and cumbersome. Today, in part because of government investment, we all benefit from the convenience and speed of one giant, interconnected Internet.
Turning his focus to the future, Colwell identified some current problems that he believes only academic researchers will have the motivation and perspective to solve. These include working at both the exascale and the nanoscale, improved semiconductors, and energy-efficient computing. Semiconductors present an especially worrisome
challenge: Because transistors can only get so small, there will be an ultimate limit to how many circuits or transistors engineers can place on a computer chip. He warned that the time is coming when this inevitable limit will lead technological innovations to stagnate, with potentially major consequences for the U.S. economy and military. Although technology companies will certainly benefit from the research geared toward solving these problems, he urged that we cannot leave it to them, with their short-term, profit-driven view, to solve them.
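The scaling limit Colwell describes can be made concrete with a rough back-of-the-envelope calculation. The sketch below, which is illustrative and not from the workshop, assumes a 14-nanometer process node and a silicon lattice spacing of roughly 0.2 nanometers to estimate how few feature-size halvings remain before transistor features approach atomic dimensions:

```python
import math

# Assumed, illustrative values (not from the workshop):
feature_nm = 14.0  # a recent manufacturing process node, in nanometers
atomic_nm = 0.2    # approximate silicon lattice spacing, in nanometers

# Each process generation roughly halves the feature size, so the number
# of halvings left before features reach atomic scale is a base-2 log.
halvings_remaining = math.log2(feature_nm / atomic_nm)
print(round(halvings_remaining, 1))  # prints 6.1
```

Under these assumed numbers, only about six halvings separate current features from single atoms, which is the arithmetic behind the warning that this scaling cannot continue indefinitely.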
Although Colwell stressed that academic research is a main driver of unexpected innovation, he also said the government stands to gain from directly funding industry research. Although IT companies invest large sums in research and development, they need assurances that they will see a return on that investment. Granting government research funds directly to companies makes it more feasible for them to invest in high-risk, high-payoff innovations. Industry is competitive, not cooperative, and government funding can encourage otherwise risky development that can lead to economic growth.
Another reason for government to fund industry research comes down to simple self-interest, explained Colwell. The success of the nation as a whole requires access to the best electronics. Many branches of government, but the military especially, rely heavily on commercially produced electronics. Counterfeit chips and cybersecurity concerns are very real threats. The car industry also relies heavily on industrially produced electronics. He noted that it is of great benefit to the nation as a whole if those electronics are the best and the most secure that they can be. Other important roles for the government in industry research and development, he added, include developing fair standards, creating cooperative task forces, and brokering disputes.
It is virtually impossible to find any sector of our economy today that does not rely heavily on computing innovations that have come as a result of government-funded research in both academia and industry. Today’s health care, science, manufacturing, communications, and entertainment, to name just a few examples, are heavily computer-dependent. Government funding of research and development across the board increases the chances that our scientists can develop and exploit every technological opportunity and remain the world’s IT leader. Colwell concluded: “We don’t know what’s next, but we need to win.”
With a distinguished career spanning academia, industry, and government, Farnam Jahanian, Carnegie Mellon University, has experienced the computer science discovery and innovation ecosystem from several distinct vantage points. He served as a computer science professor at the University of Michigan, as a researcher at IBM’s T.J. Watson Research Center, and as assistant director of NSF for Computer and Information Science and Engineering (CISE). As leader of the CISE directorate, he was responsible for a research budget of roughly $900 million.
To frame his presentation about the role of government-funded research in technology innovation, Jahanian shared two favorite quotes about innovation:
Innovation distinguishes between a leader and a follower.
—Steve Jobs, Apple CEO and renowned innovator
The guy who invented the first wheel was an idiot.
The guy who invented the other three, he was a genius.
—Sid Caesar, comedian
Echoing a theme that pervaded the workshop, Jahanian emphasized the tremendous importance of research-driven IT to America’s economy, security, and scientific leadership over the past 30 years. In his view, it is primarily IT advances that have made the U.S. economy competitive and sustainable in a global market. In addition, IT has undoubtedly accelerated the pace of scientific discovery in disciplines such as biology, chemistry, physics, and the social sciences, all of which have undergone remarkable transformations driven by computational and data-intensive approaches.
Today, Jahanian said, IT advances and interdisciplinary approaches are crucial to addressing society’s most pressing challenges, including health care, cybersecurity, transportation, and environmental sustainability. Because IT is now embedded in these fields, new advances or solutions must incorporate multidisciplinary approaches that involve computer scientists and technologists, as well as domain experts. “Our community is in the middle of all of these conversations, and many of these advances will depend on involvement of members of our community and computational and data-intensive approaches,” said Jahanian.
Clearly, the United States has been a global leader in spurring the IT advances we enjoy today. A 2013 report by the international business consulting firm McKinsey & Company lists 12 top “disruptive technologies,” innovations such as advanced robotics, the Internet of Things, and the mobile Internet that will transform the global economy and daily lives (Box 1.1).1 According to Jahanian, all of these technologies are rooted in basic research advances that scientists working in America have been responsible for inventing and carrying forward. Furthermore, nearly all of those basic research advances have stemmed from government support. “U.S. taxpayers have long been the most important investors in knowledge creation in this country,” Jahanian said.
But despite these past successes, he stressed that America’s work is far from over. America today faces relentless international competition to create or capitalize on the next disruptive technologies and to recruit the best talent from around the world. Although U.S. scientists have been the recipients of the largest R&D budget for many years, other countries are beginning to understand how government-funded research leads to economic prosperity and are rapidly increasing their research spending. At China’s current rate of funding growth, for example, the Chinese R&D budget is expected to surpass that of the United States by 2022.2 Now that other countries are realizing just how critical this pipeline is, Jahanian stressed the increasing need to align U.S. R&D funding to match its scientific and economic aspirations and national security requirements.
1 J. Manyika, M. Chui, and J. Bughin, 2013, Disruptive Technologies: Advances That Will Transform Life, Business, and the Global Economy, McKinsey Global Institute, http://www.mckinsey.com/business-functions/businesstechnology/our-insights/disruptive-technologies.
2 M. Grueber and T. Studt, 2013, 2014 Global R&D Funding Forecast, Battelle and R&D Magazine, December, https://www.battelle.org/docs/tpp/2014_global_rd_funding_forecast.pdf.
Two more statistics drive home this point. Right now, industry spends more money on research and development than does the U.S. government (Figure 1.2).3 This is concerning, in Jahanian’s view, because industry tends to focus on short-term, applied research rather than on the long-term fundamental work that drives true innovation. Even more concerning, the federal research and development budget has been flat as a share of the U.S. GDP (Figure 1.3).4 As nearly every workshop presenter emphasized, the innovations that lead to new technologies, and thus to economic growth, come from unexpected places but have the common denominator of federal funding of basic research that is tied to meaningful problems. A growing funding gap thus threatens to undermine U.S. momentum in technological innovation.
The thriving U.S. research community drives the long-term discovery and innovation that is the foundation of our economic prosperity and domestic security. Yet the paradox when funding research projects is that their outcomes are unpredictable. There is no single path or action that leads directly from invention to innovation, from product to prosperity. “The paradox of discovery and innovation is that no one actually knows how an idea or an innovation will impact the world,” said Jahanian. Sometimes an idea requires a long incubation period, during which it interacts with and reacts to other ideas and technologies before it blossoms into a groundbreaking new application. In fact, as many workshop presenters emphasized, it is most often the case that unanticipated research results are just as important or impactful as the anticipated ones. As a result, Jahanian said, “Quantifying return on investment in the context of basic research often is a very very ambiguous proposition.”
The United States can take great pride in its long history of research and development. In 1945, Vannevar Bush’s report on federal funding for scientific research laid out a challenge the government quickly recognized as worthwhile.5 In retrospect, there is a clear and direct path from Cold War–era federal defense contracts to today’s Silicon Valley success stories. But when early federal defense contracts were awarded to develop ARPANET, a crucial precursor to today’s Internet, no one could have predicted that Silicon Valley and its businesses and innovations would be a downstream result.
3 National Science Board, 2014, Science and Engineering Indicators: 2014 Digest, Washington, D.C.
4 M. Hourihan, 2015, Federal R&D in the FY2015 Budget: An Introduction, American Association for the Advancement of Science, Washington, D.C.
5 V. Bush, 1945, Science, The Endless Frontier: A Report to the President, U.S. Government Printing Office, Washington, D.C.
The federal government is a major partner in America’s discovery and innovation ecosystem, both through direct funding of research and projects and through overarching investments in nurturing the ecosystem itself. In The Entrepreneurial State, Mariana Mazzucato debunks the myth of a slow-moving government lagging behind frenzied innovators and reveals the opposite to be true.6 Jahanian explained that recent national initiatives, such as the Brain Initiative, the National Robotics Initiative, and the Materials Genome Initiative, demonstrate how targeted federal investments can help solve large-scale, pressing national challenges that would be impossible for one company, university, or research organization to solve alone.
Of course, companies, universities, and research organizations are also crucial partners in the innovation ecosystem, he continued. It is where they intersect that research leads to the consumer products that drive our economy and encourage the government to reinvest in the research cycle. Historically, university research labs have been where knowledge creation and information dissemination begin. Jahanian noted that in these labs, students, seed technologies, and scientific curiosity become the paths to start-ups, patents, and hardware and software prototypes that ultimately become the everyday technologies that are an integral part of our world.
Contrary to what some would assume, there is in fact a very healthy relationship between university research and industry products, and today start-ups and university labs are more connected than ever, he said. According to an annual study by the Association of University Technology Managers, there were 4,200 actively operating university start-ups in 2013, double the number in 2000.7 This ecosystem can in part be traced to the Bayh-Dole Act, enacted in 1980, which permitted licensing agreements between university laboratories and companies, thus giving universities the ability to patent their inventions and retain the rights, creating additional incentives for them to partner with the private sector to further their innovations.
Jahanian emphasized that universities commercialize their technologies for many reasons, although the financial incentive is often exaggerated. There is far more gain to be had in developing a product that becomes a public benefit, contributes to the larger goals of a university’s mission, and enhances its reputation than in making a product that is merely profitable. Government funding frees universities from the trap of a narrow-minded focus on return on investment, allowing them to pursue the risky and highly uncertain projects that are not feasible within the confines and financial influences of industry.
6 M. Mazzucato, 2013, The Entrepreneurial State: Debunking Public vs. Private Sector Myths, Anthem Press, United Kingdom.
7 Association of University Technology Managers, 2014, “AUTM Licensing Activity Survey: FY2013,” http://www.autm.net/resources-surveys/research-reports-databases/licensing-surveys/.
This interplay among academics, government agencies, and industry has led to a gradual shift of mindset. Transferring knowledge, whether in a general form or in the form of an actual new product, is no longer strictly about protecting intellectual property. Today’s researchers see the long history of this back-and-forth and recognize that this relationship is about the pipeline from knowledge dissemination to economic development, to societal benefits.
In summary, Jahanian reiterated his firm belief that federal investments in basic research have to date returned exceptional dividends to the nation, while also providing a foundation for economic prosperity and national security. There is no reason to slow or halt this flow of remarkable yet unpredictable results. The jobs of the future are in this discovery and innovation ecosystem, in the fields of engineering, computing, and information technology; he urged that the federal research funds of the future must be there, too.