component. As we incorporate more models into one code (multiscale and/or multiphysics models) and perhaps integrate data management and other functions, it is more likely that overall performance will be held back by one weak link. Therefore, there will be an increased need to work toward balanced systems with components that are relatively similar in their parallelizability and scalability.
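The weak-link effect can be made concrete with a generalized Amdahl's-law calculation. The sketch below is purely illustrative; the component fractions and per-component speedups are hypothetical, chosen only to show how one poorly scaling component caps the performance of the whole code.

```python
# Illustrative only: overall speedup of a multiphysics code whose
# components scale differently. The fractions and speedups below are
# hypothetical, not measurements of any real code.

def combined_speedup(fractions, speedups):
    """Overall speedup when component i, taking fraction f_i of the
    serial runtime, is accelerated by factor s_i (generalized Amdahl's
    law): 1 / sum_i(f_i / s_i)."""
    return 1.0 / sum(f / s for f, s in zip(fractions, speedups))

# Two components: a solver that scales to 10,000x on a large machine
# and a data-management step that scales only to 20x.
fractions = [0.95, 0.05]
speedups = [10_000.0, 20.0]
print(combined_speedup(fractions, speedups))  # ~385x: the 5% weak link dominates
```

Even though the weak link is only 5 percent of the serial runtime, it holds the overall speedup to a few hundred, which is why balance across components matters.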

Another problem is that the computations must deal with an enormous number of particles (10¹⁰-10¹¹ at present) and large grids (as big as 2048³, or about 10¹⁰ cells, currently, with larger calculations planned for the future). Complex methods can require 10³-10⁴ floating point operations per cell per time step and generate hundreds of gigabytes to multiple terabytes of data in a single snapshot. Systems that deal with data on this scale will usually need to access thousands, perhaps even hundreds of thousands, of disks in parallel to keep the system’s performance from being limited by input/output (I/O) rates. Scalable parallel I/O software will therefore be critical in the long term.
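A back-of-the-envelope calculation shows why snapshots on this scale force massively parallel I/O. Every parameter below (variables per cell, precision, per-disk bandwidth, write-time target) is an illustrative assumption, not a figure from the text.

```python
# Rough I/O sizing for a single snapshot of a 2048^3 grid simulation.
# All parameters are illustrative assumptions, not measurements.

cells = 2048**3              # ~1.1e10 cells, as in the largest current grids
variables_per_cell = 10      # e.g. density, momenta, energy, species fractions
bytes_per_value = 8          # double precision

snapshot_bytes = cells * variables_per_cell * bytes_per_value
print(f"snapshot size: {snapshot_bytes / 1e12:.2f} TB")   # ~0.69 TB

disk_bandwidth = 100e6       # assumed 100 MB/s sustained per disk
target_seconds = 60          # goal: write one snapshot within a minute
disks_needed = snapshot_bytes / (disk_bandwidth * target_seconds)
print(f"disks writing in parallel: {disks_needed:.0f}")   # ~115
```

With more variables per cell, more frequent snapshots, or a larger grid, the required disk parallelism climbs quickly into the thousands, consistent with the estimate in the text.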

In all of the HECC-dependent areas of astrophysics reviewed in Chapter 2, there is a strong science case for using the growth in computational capability expected over the next 10 years to increase model fidelity. This would be done mainly by including the effects of a large number of physical processes and their interactions in a single simulation. For such a program to be successful, it will be necessary to make fundamental improvements in the models, algorithms, and software.


Traditionally, large-scale simulation in astrophysics has taken the form of first-principles modeling, in which the equations to be solved were relatively well characterized. Examples include three-dimensional calculations of inviscid compressible flow, flows of collisionless matter, effects of self-gravity, and passive radiative losses in optically thin media. The next generation of models will go in the direction of much more complicated coupling between different physical processes (multiphysics modeling) and the interaction of a much greater range of temporal and spatial scales (multiscale modeling). Some of these complex models include the following:

  • In cosmology models, those that incorporate the feedback effects of star formation and of supermassive black holes on galaxy formation.

  • In models of the early stages of star formation, those that incorporate a self-gravitating multicomponent medium that includes both collisional and collisionless components undergoing ionization, chemical reactions, heat and mass transfer within and among the components, and radiative transfer.

  • In models of supernovae, those that include the ignition and propagation of a nuclear reaction front in a turbulent, gravitationally stratified medium, on the scale of an entire star.

In these and other cases, it is not feasible to use a first-principles model that resolves all of the length scales and timescales and all of the coupled physical processes, even under the most optimistic view of the growth of computational capabilities over the next decade. Instead, it will be necessary to resolve a narrower range of length scales, timescales, and physical processes, with the remaining effects captured approximately through models expressed on the resolved scales. The development of such models is a complex process involving a combination of mathematical analysis and physical reasoning; the models must satisfy the mathematical requirement of well-posedness in order to be computable, and they must be validated. Large-scale, detailed simulations will play a substantial role throughout this process: by simulating the unresolved scales directly, they provide input to the development of models and a basis for their validation.

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.