According to the National Science Foundation’s (NSF’s) engineering research center (ERC) website, “The goal of the ERC Program is to integrate engineering research and education with technological innovation to transform national prosperity, health, and security.”1 The committee endorses this goal and suggests that the top-level goal of future convergent engineering research centers (CERCs) should be to solve critical societal problems with engineering research and to advance fundamental knowledge on how to deliver those solutions to society. Maximizing societal benefit will in most cases also create great economic value. An example is the Internet, which began as a research tool but has transformed both the social and economic landscape.
Various metrics have been used to judge the performance of center-based research programs, such as the number of students graduated, scientific articles published, industry participants, patents awarded, startups spawned, or products commercialized. According to NSF, since the ERC Program’s inception in 1985, it has created 193 spin-off companies, disclosed more than 2,200 inventions, and been awarded 739 patents, resulting in 1,339 licenses.2
The problem with many of these metrics is that they measure outputs that say little about the degree to which the research has achieved center goals. Outputs (papers, patents, and the like) are indicators of potential impact and are comparatively easy to measure. Outcomes, by contrast, reflect actual impacts (e.g., transformational changes), and it is outcomes or impacts that NSF should be most concerned about. Moreover, some output measures—such as the number of participating companies, patents, or students involved—can be “gamed.” Above all, the metrics of center performance should not foster a “box-checking” mentality. Appropriate metrics are discussed below and in Chapter 6.
Under the top-level goal of delivering societal benefit, the committee believes the following four outcomes will be critical to the success of the CERCs:
1 National Science Foundation (NSF), “Engineering Research Centers (ERC),” https://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5502, accessed October 10, 2016.
2 NSF, 2015, Creating New Knowledge, Innovators, and Technologies for Over 30 Years, https://www.nsf.gov/eng/multimedia/NSF_ERC_30th_Anniversary.pdf.
- Students with the skills to be innovators and leaders,
- New ideas and paradigm-shifting research,
- Development and use of work products of durable intellectual value, and
- Creation of economic value.
Students with the Skills to be Innovators and Leaders
Top-notch undergraduate and graduate engineering students are not only key to the successful operation of CERCs, as pointed out in Chapter 3, but also an important indicator of center success. Center students should have the opportunity to (1) engage in significant, multidisciplinary technical training; (2) experience deep collaboration; (3) take on leadership roles; and (4) gain meaningful exposure to industry practices and culture. See the Chapter 3 section “Engineering Education.”
FINDING 4-1: Superbly educated and innovation-savvy engineering student leaders capable of addressing current and future societal challenges are likely to be the most important long-term outcome of CERCs.
RECOMMENDATION 4-1: A key objective of the convergent engineering research centers should be capacity-building through student training and development, at both the graduate and undergraduate levels, and direct engagement with industry.
Metrics for student outcomes could include tracking students’ placement in industry, university, government, or nonprofit positions, and the contributions made in those positions, compared with a non-CERC control population. As discussed further in Chapter 6, emerging collaboration platforms allow real-time tracking and longitudinal follow-up of center research activities and of students who have been engaged at the centers—all with less burden on the centers. Because only a limited number of student cohorts will graduate during the typical 10-year NSF ERC funding lifetime, this metric may need a retrospective component; see further discussion in Chapter 6.
New Ideas and Paradigm-Shifting Research
Successful centers produce new research ideas and innovative technical approaches to the problems being addressed. In the case of a CERC devoted to a grand-challenge-like problem, a successful outcome may simply be understanding how best to address the grand challenge. The center may also produce paradigm-shifting research. A historical example is the pioneering work of W. Edwards Deming on statistical process control—a process innovation.3 It created an entirely new, more productive way to create value for society through lower cost and profoundly higher product quality. A modern example might be the marriage of artificial intelligence, machine learning, and big data technologies to usher in an era of lower-cost, easy-access cognitive health care.4
Metrics for this category of outcomes could be evidence of widespread adoption of new research techniques or processes developed.
Work Products of Durable Intellectual Value
The Internet makes it possible for scientists and engineers not only to record their own work, but also to create durable work products and platforms that can be instantly shared with others, thereby directly accelerating the progress of science and helping to develop innovative engineering and manufacturing systems. These durable work products can include prototypes, innovative manufacturing processes, software applications, tools, and
3 W.E. Deming, 1993, The New Economics for Industry, Government, and Education, MIT Press, Cambridge, Mass., p. 132.
4 See, for example, S. Smith, 2015, 5 ways the IBM Watson is changing health care, from diagnosing disease to treating it, Medical Daily, December 17, http://www.medicaldaily.com/5-ways-ibm-watson-changing-health-care-diagnosing-disease-treating-it-364394.
libraries; data and data repositories; cloud computing platforms and services; physical engineering systems with remote access; international standards; and other enabling technologies with broad application. Their development is enabled by the availability of distributed software development platforms such as GitHub; cloud computing platforms such as Microsoft Azure and Amazon AWS; the maturation of Internet of Things (IoT) hardware, software, and data aggregation tools; and the broad use of standardized REST Internet interface tools for connecting physical systems to applications and services. Examples of such systems include software such as the BLAST tool for comparing primary biological sequence information; the WorldWide Telescope, which provides astronomers with new ways to visualize observational data from the world’s great telescopes; and data repositories such as the Network for Earthquake Engineering Simulation Hub (NEEShub), which provides a web-based gateway for earthquake engineering results and information.
Metrics for measuring impact in this category might include widespread use of center-developed software, tools, or standards, or the frequency with which data repositories are accessed.
FINDING 4-2: The development and dissemination of engineering and intellectual work products with durable value accelerates innovation and scientific progress.
RECOMMENDATION 4-2: Future convergent engineering research centers should be encouraged to produce broadly accessible engineering prototypes, tools, data repositories, platforms, and enabling technologies that foster broad scientific, engineering, and manufacturing innovation. Such work products might form useful interim deliverables from large-scale projects.
Research and education products can be shared widely through collaboration platforms. NSF should explore ways of disseminating CERC innovation, student training, and entrepreneurship activities that scale beyond the centers and participating institutions. Universities frequently develop tools of this type, but the tools may be usable only by their developers, who rarely invest the substantial effort required to make them user-friendly. To make these tools usable by a wider audience, NSF may need to make funds available for professionals to refine them. As appropriate, NSF should consider leveraging existing federal initiatives, such as the Small Business Technology Transfer program, to help disseminate these tools.
Creation of Economic Value
From its inception, the ERC program has had the goal of enhancing U.S. industrial competitiveness by transferring intellectual value and technology developed in the centers into the commercial sphere. NSF has also sought to create economic value indirectly through the training of a diverse population of students with the skills to innovate.
The challenges associated with the commercialization goal for ERCs are often underappreciated. For example, it is generally not realistic to expect intellectual property developed in a center working in biology and materials sciences to become a commercial product in the center’s typical 10-year life span. Most medical or materials-based start-up companies take about 10 years before they produce products at sufficient scale to generate profits. However, information technologies can be developed much faster and, as recommended here, centers using value-creation best practices can engage industry much earlier in valuable commercial initiatives.
Today, companies often engage with centers to gain access to new knowledge and future employees and do not expect to develop specific processes or products as a result of these relationships.5 This is reasonable for ERCs, given that they initially sit at the early stages of the continuum of technology readiness levels (see Figure 1.1).
5 I. Feller, C.P. Ailes, and J.D. Roessner, 2002, Impacts of research universities on technological innovation in industry: Evidence from engineering research centers, Research Policy 31(3):457-474.
Researchers have documented considerable direct and indirect economic impact of selected ERCs.6,7,8 While commercial successes do occur (Box 4.1), they are relatively rare. Orin Herskowitz, director of Columbia University’s Columbia Technology Ventures, told the committee that about 85 percent of the technology transfer activities of U.S. universities lose money, with the bulk of licensing revenues accruing to a small subset of institutions from a small number of blockbuster products.9
Part of the challenge is that some center-related economic impacts are very difficult to quantify, while others may not be evident until many years after a center has ceased operation. External center funding, increased employment, and improvements in the technical workforce constitute the largest categories of quantifiable economic value.
Over time, one way to measure success in delivering societal benefit is whether the results (e.g., intellectual work products) of a center are picked up by industry, leading to economic advances that can be traced back to the center. Thus, economic value delivered can serve as one metric—but not the only one—for determining whether CERCs (or NSF centers generally) have delivered societal benefit.
The committee is agnostic about whether the larger goal of delivering maximum societal benefit is better served by centers translating proprietary technologies to the private sector through licensing or startup formation, or by giving away their intellectual content through open sourcing. As one example, the Linux operating system is open source but has created enormous economic value.
Metrics for measuring economic value are relatively well developed and quantitative and are not discussed further here.
FINDING 4-3: Metrics currently used to evaluate centers tend to focus on the numbers of students graduated, papers published, patents awarded, and so on. These output numbers do not necessarily measure the true impact of a center; moreover, the metrics can be gamed and may encourage a box-checking mentality.
6 SRI International, 2004, The Economic Impact on Georgia of Georgia Tech’s Packaging Research Center, Arlington, Va.
7 SRI International, 2008, National and Regional Economic Impacts of Engineering Research Centers: A Pilot Study. Final Report, Arlington, Va., November.
8 D. Roessner, L. Manrique, and J. Park, 2010, The economic impact of engineering research centers: Preliminary results of a pilot study, Journal of Technology Transfer 35(5):475-493.
9 National Academies of Sciences, Engineering, and Medicine, 2016, A Vision for the Future of Center-Based Multidisciplinary Engineering Research: Proceedings of a Symposium, The National Academies Press, Washington, D.C.
RECOMMENDATION 4-3: The National Science Foundation should develop metrics that track the impacts of center activities, not just the outputs. Examples might include the placement of graduated students in positions of influence, or evidence that intellectual value developed in the center is widely used.
While the considerable achievements of the ERC program have been recently documented,10 the committee believes that a qualitative improvement in the delivery of societal value is possible in future centers through dissemination and systematic use of team-research and value-creation best practices among all center personnel (Boxes 2.4 and 2.5, respectively). In the committee’s experience, this is not occurring now. Programs such as NSF’s Innovation Corps (I-Corps)11 are a good start, but more will be needed to fully immerse students and faculty in an entrepreneurial environment (see Chapter 6).
10 NSF, 2015, Creating New Knowledge, Innovators, and Technologies for Over 30 Years, https://www.nsf.gov/eng/multimedia/NSF_ERC_30th_Anniversary.pdf.