
1


The Role of Advanced Computing in Science and Engineering

Many past reports have underscored the integral role of advanced computing in science and engineering, including its role in addressing scientific and engineering “grand challenges” vital to our nation’s welfare, security, and competitiveness. Over time, and especially in recent years, advanced computing has become relevant to an expanding set of scientific problems and disciplines.

The term advanced computing is used in this report to refer to the full complement of capabilities that support compute- and data-intensive research across the entire science and engineering spectrum, which are too expensive to be purchased by an individual research group or department and perhaps too expensive even for an individual research institution. The term also encompasses higher-end computing for which there are economies of scale in establishing shared facilities rather than having each institution acquire, maintain, and support its own systems. For compute-intensive research, it includes not only today’s “supercomputers,” which are able to perform more than 10^15 floating point operations per second (known as “petascale”), but also “high-performance computing” platforms that share the same technologies as supercomputers but may have lower levels of performance.
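
To make the petascale threshold concrete, the short sketch below shows how a system's theoretical peak rate is typically estimated from node count, cores per node, clock rate, and operations per cycle. The specific numbers are hypothetical and chosen only to illustrate the arithmetic; they do not describe any particular NSF system.

```python
# Illustrative estimate of theoretical peak performance in FLOP/s.
# All figures are hypothetical, chosen only to show the arithmetic behind the
# "petascale" (more than 1e15 floating point operations per second) threshold.

nodes = 10_000            # hypothetical number of compute nodes
cores_per_node = 16       # hypothetical cores per node
clock_hz = 2.5e9          # hypothetical clock rate (2.5 GHz)
flops_per_cycle = 8       # hypothetical floating point operations per core per cycle

peak_flops = nodes * cores_per_node * clock_hz * flops_per_cycle
print(f"Theoretical peak: {peak_flops:.2e} FLOP/s")  # 3.20e+15
print("Petascale?", peak_flops > 1e15)               # True
```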

The terms capability and capacity are sometimes used to refer to the high and low end of the spectrum of computing performance of a single application. Capability computing is computing that stretches the limits of available resources. For example, for compute-intensive applications, it is the capability to run the largest possible tightly coupled computations (i.e., those that can only be run practically on a single computer system). The concept of capability computing also applies to data-intensive applications, although how this might best be defined is an open question. Faster machines have been deployed by the Department of Energy and elsewhere; these very fastest machines might be thought of as the “extreme end.”

Capacity computing, by contrast, provides large amounts of computing but lower peak performance. What was capability computing 2 decades ago is capacity computing today, both in terms of individual computing needs and the number of jobs being run. The distinction is arguably somewhat artificial: high-performance computers cover a wide range of performance characteristics, and large machines can be used to provide either capacity or capability. The need for researchers to reach solutions in a reasonable amount of time means that any large problem or large ensemble of problems, even those that do not require a capability machine to reach a solution, can be in some sense a capability problem. High-throughput computing refers to systems that provide large amounts of processing capacity over long periods of time—that is, the number of operations available per month or year rather than per second—but with lower peak performance.
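
The contrast between peak rate and sustained throughput can be illustrated with simple arithmetic. The sketch below compares the total operations delivered over a month by a single hypothetical capability-class job with those delivered by a large pool of independent jobs on modest nodes; all rates and job counts are illustrative assumptions, not measurements of real systems.

```python
# Illustrative arithmetic contrasting capability computing with high-throughput
# computing. All rates and job counts are hypothetical.

seconds_per_month = 30 * 24 * 3600

capability_job_rate = 1.0e15   # hypothetical sustained rate of one tightly coupled job (FLOP/s)
throughput_node_rate = 1.0e11  # hypothetical sustained rate of one commodity node (FLOP/s)
independent_jobs = 20_000      # hypothetical number of independent jobs run over the month

capability_ops = capability_job_rate * seconds_per_month
throughput_ops = throughput_node_rate * seconds_per_month * independent_jobs

print(f"One capability job, one month:   {capability_ops:.2e} operations")
print(f"Independent job pool, one month: {throughput_ops:.2e} operations")
```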

Since the beginning of the National Science Foundation’s (NSF’s) supercomputing centers program in the 1980s, NSF’s Division of Advanced Cyberinfrastructure (ACI) and its predecessor organizations have supported computational research across NSF with both supercomputers and other high-performance computers and provided services to a user base that spans work sponsored by all federal research agencies.

Modeling and simulation has for some time been seen as a true peer, standing beside theory and experiment, in the scientific process. It is used at a wide range of scales, as measured by the number of parallel cores needed. Some problems in astrophysics, cosmology, or biomolecular modeling use massively parallel simulations and run on machines with tens to hundreds of thousands (or more) cores. Other problems, in fields such as materials, climate simulation, and earthquake modeling, use large volumes of computation on “midscale” machines with a thousand or more cores, as do a wide array of applications of uncertainty quantification and other techniques for robust design and decision making. In addition, massive volumes of high-throughput simulations are used in combinatorial chemistry, drug design, design of functional materials, and systems design.

Data-intensive computing is beginning to emerge as a separate discipline and is being viewed by some as a “fourth paradigm” for scientific discovery, complementing discoveries made by theory, experiment, and simulation. In some disciplines, such as astronomy and biology, the percentage of research papers based primarily on data from repositories of previously collected data, rather than on new experimental data, is increasing and is reaching the point where this mode of discovery is a significant driver of research. Networked sensors are increasingly embedded in urban and civil infrastructure, and sensors are widely used to capture research data in a growing number of fields. New algorithms for analyzing data sets that are large, complex, noisy, or unstructured allow automatic discovery of previously unknown patterns within data. Web search engines, online shopping recommendations, and face recognition software are some well-known applications of such algorithms, but these techniques are also increasingly valuable in science and engineering. Internet companies, such as Google, Yahoo!, Facebook, and Amazon, have introduced new software and hardware platforms and new programming models for data-intensive computing, and these platforms and models are increasingly being used for scientific research. Data-intensive research may require high-performance input/output (I/O) systems, access to very large storage systems with architectures different from those of traditional high-performance computing systems, and new approaches to data visualization.
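
As one illustration of the programming models referred to above, the toy sketch below mimics the map-reduce style of computation popularized by such platforms, using plain Python in place of any distributed framework; the records and keys are hypothetical.

```python
from collections import Counter
from itertools import chain

# Toy sketch of a map-reduce style computation, the programming model
# popularized by platforms for data-intensive computing. Plain Python stands
# in for a distributed framework; the "records" are hypothetical readings.

records = [
    "temp=21 ok", "temp=35 alarm", "temp=22 ok", "temp=36 alarm", "temp=20 ok",
]

def map_phase(record):
    # Emit (key, count) pairs; here the key is the status token of each reading.
    status = record.split()[-1]
    return [(status, 1)]

def reduce_phase(pairs):
    # Sum the counts for each key; in a real system this step runs in parallel per key.
    counts = Counter()
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

mapped = chain.from_iterable(map_phase(r) for r in records)
print(reduce_phase(mapped))  # {'ok': 3, 'alarm': 2}
```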

COMPLEMENTARY ROLES OF SIMULATION AND DATA-INTENSIVE COMPUTING

NSF’s historical emphasis on advanced computing for modeling and simulation is sometimes viewed as being in competition with the more recent interest in data-intensive computing. The relative need for one or the other is important when future advanced computing investments are considered, because the types of computer systems, storage systems, networks, software, usage models, staffing and support, industry partners, and organizational structures may be different (and possibly quite different) across these two broad categories of use. The needs of users in the two categories and the appropriate technical and organizational responses to those needs both require future study.

It is misleading, however, to think of these two categories as competing in science and engineering, because modeling and data analysis are often used in concert. In cosmology, computational models are used to fill in missing or incomplete data; in image-based scientific instruments such as synchrotrons, simulation may be used to “invert” the observation into a particular crystal structure; in climate analysis, reconstruction of historical data is critical to the validation of models; and in the Materials Genome Initiative, the results of millions of simulations are being stored and shared for community analysis. Large experiments in fields like high-energy physics use simulation to design devices and to set up individual experiments to maximize the likelihood of success. Simulations produce large data sets that are part of the data challenge, and sophisticated simulations incorporate observational data to quantify uncertainty or fill in for incomplete theory. Advanced computational models and algorithms are being fused with observational data and with more sophisticated and expensive techniques to accommodate and quantify uncertainty. Fundamentally, scientific discovery goes beyond identifying patterns in data to discovering models that explain and predict those patterns. As these examples suggest, the fusion of computational modeling and data analytics pervades all of science and engineering. This is true of both large scientific collaborations and the work of individual investigators.
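
A minimal sketch of one way simulation and observational data can be combined to quantify uncertainty is shown below: an ensemble of cheap forward simulations is run with a parameter constrained by hypothetical observations, and the spread of the ensemble summarizes the resulting uncertainty. The model, parameter values, and observations are all illustrative assumptions.

```python
import random
import statistics

# Toy sketch of combining simulation with observational data to quantify
# uncertainty: run an ensemble of forward simulations with a parameter sampled
# from an observation-constrained range. Model and numbers are hypothetical.

random.seed(0)

def simulate(growth_rate, steps=10, start=1.0):
    # Trivial stand-in for an expensive forward simulation.
    value = start
    for _ in range(steps):
        value *= (1.0 + growth_rate)
    return value

# Hypothetical observations constrain the growth rate to roughly 0.05 +/- 0.01.
observed_mean, observed_spread = 0.05, 0.01

ensemble = [simulate(random.gauss(observed_mean, observed_spread)) for _ in range(1000)]

print(f"Ensemble mean prediction:      {statistics.mean(ensemble):.3f}")
print(f"Ensemble spread (uncertainty): {statistics.stdev(ensemble):.3f}")
```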

Together, modeling and large-scale data analysis are also driving the development of a new class of stochastic models in many areas of research, such as Earth system modeling. The present class of dynamically based models is being stretched to its limits because there is often little knowledge of the model parameters, let alone the dynamical form of critical processes such as cloud formation and rainfall. Rather than continuing to improve model resolution and add more features, some climate researchers are advocating a new approach based on stochastic models that link models and large-scale data analytics.
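
The stochastic-modeling idea can be sketched in a few lines: an unresolved process is represented by a random term in an otherwise deterministic recurrence, with the statistics of that term something one could in principle estimate from data. The dynamics and parameter values below are purely illustrative.

```python
import random

# Toy sketch of a stochastic model: deterministic dynamics plus a random term
# standing in for an unresolved process (e.g., a crude proxy for convection).
# All parameter values are hypothetical.

random.seed(1)

def step(state, forcing=0.1, damping=0.9, noise_std=0.05):
    # Deterministic relaxation toward a forced value, plus stochastic forcing.
    return damping * state + forcing + random.gauss(0.0, noise_std)

state = 0.0
trajectory = []
for _ in range(200):
    state = step(state)
    trajectory.append(state)

print(f"Final state: {state:.3f}")
print(f"Mean over last 100 steps: {sum(trajectory[100:]) / 100:.3f}")
```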

Thus, for many scientific disciplines, the issue is not whether to use data or simulation, but how the two will be used together. The need for advanced computing extends across disciplines and grows as models, data, and types of scientific inquiry grow in sophistication. The increasing number of uses that combine computational models and observational data suggests that facilities supporting both are needed, and there may be value in co-locating data and compute capabilities. Although the technical challenges are many, the social, organizational, and funding challenges may be equally crucial.

1. For its final report, the committee will explore, and seeks comments on, how to create advanced computing infrastructure that enables integrated discovery involving experiments, observations, analysis, theory, and simulation.
