

10 Computational Infrastructure - Challenges and Opportunities
Pages 175-196

The Chapter Skim interface presents the single passage algorithmically identified as the most significant on each page of the chapter.


From page 175...
... Yellowstone system, provide an important capability that enables progress toward these goals, but they are only a first step in deploying rapidly evolving computational capabilities. Scientific advances and applications will motivate the national climate modeling enterprise to exploit these new computational capabilities; the complexity and diversity of climate model codes coupled with the expected nature of hardware advances will make this an increasingly challenging task over the next two decades.
From page 176...
... Overall, the climate modeling enterprise relies on sustained improvements in supercomputing capabilities and must strategically position itself to fully exploit them. Finding 10.1: As climate models advance (higher resolutions, increased complexity, etc.)
From page 177...
... While the actual sustained performance of climate codes relative to theoretical peak hardware performance fell by an order of magnitude, time to solution continued to be reduced through exploitation of the aggregate performance of massively parallel machines. (For illustration: if sustained efficiency falls from 30 percent of peak to 3 percent while aggregate peak performance grows a thousandfold, sustained throughput still improves a hundredfold.) Adapting to parallel computing architectures did require pervasive refactoring of highly mature codes.
From page 178...
... When the underlying hardware changed, the infrastructure changed with it, and the scientific codes remained largely intact. Finding 10.2: The climate modeling community adapted well to the previous hardware transition by moving toward shared software infrastructure.
From page 179...
... Graphics processing units (GPUs) have been used to achieve very high concurrency for specialized graphics operations (such as rendering three-dimensional objects on a two-dimensional screen)
From page 180...
... Finding 10.3: Climate models cannot take full advantage of the current parallel hardware, and the gap between performance and maximum possible performance is likely to increase as the hardware advances.

Programming Models for the Next 10-20 Years

At this time the many emerging architectures do not adhere to a common programming model.
From page 181...
... The most promising avenue at the moment appears to be OpenACC, but it is still early in its development. The climate/weather modeling community has never retreated from experimenting with leading-edge systems and programming approaches to achieve required levels of performance.
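To make the directive-based approach concrete, the following is a minimal sketch of an OpenACC-annotated loop in C. The routine, array names, and grid dimensions are hypothetical and are not taken from any climate model; the point is only that the directive is a hint layered over an otherwise ordinary loop nest, which a supporting compiler can offload to an accelerator and which is ignored when OpenACC support is absent.

    /* Minimal OpenACC sketch; loop, names, and sizes are illustrative only. */
    #define NCOL 10000   /* number of model columns (assumed) */
    #define NLEV 60      /* number of vertical levels (assumed) */

    /* Relax a temperature field toward a reference profile in every column.
     * The pragma asks the compiler to offload the loop nest to an accelerator;
     * without OpenACC support the pragma is ignored and the loops run serially. */
    void relax_temperature(double *t, const double *t_ref, double tau, double dt)
    {
        #pragma acc parallel loop collapse(2) copy(t[0:NCOL*NLEV]) copyin(t_ref[0:NLEV])
        for (int c = 0; c < NCOL; c++) {
            for (int k = 0; k < NLEV; k++) {
                t[c * NLEV + k] += dt * (t_ref[k] - t[c * NLEV + k]) / tau;
            }
        }
    }

Because the annotation sits on top of otherwise portable source, the same code can still be built and validated on conventional CPUs while the accelerated path matures.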
From page 182...
... The key question in the climate context is whether trajectories subject to small changes at the hardware bit level stay within the same basin of attraction, or whether these small errors push the system into a "different climate state." Currently, the only way to show that an architectural or software infrastructure change has not pushed the system into a different climate is to compute the climatology of long control runs (usually 100 years, to take slow climate processes into account) before and after the change and compare them.
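As a simplified illustration of that verification step, the sketch below compares the climatological mean of a single global annual-mean diagnostic from a baseline control run and from a run made after an infrastructure change. The file names, the 100-year length, and the use of a Welch t-statistic on annual means are assumptions for illustration only; an actual assessment would examine many fields and account for autocorrelation, which reduces the effective sample size.

    /* Sketch: compare climatological means of two long control runs.
     * File names and the single-diagnostic comparison are illustrative. */
    #include <math.h>
    #include <stdio.h>

    #define N_YEARS 100  /* annual means per control run (assumed) */

    static int read_annual_means(const char *path, double *vals)
    {
        FILE *f = fopen(path, "r");
        if (!f) { perror(path); return -1; }
        for (int i = 0; i < N_YEARS; i++)
            if (fscanf(f, "%lf", &vals[i]) != 1) { fclose(f); return -1; }
        fclose(f);
        return 0;
    }

    static void mean_var(const double *x, int n, double *mean, double *var)
    {
        double m = 0.0, v = 0.0;
        for (int i = 0; i < n; i++) m += x[i];
        m /= n;
        for (int i = 0; i < n; i++) v += (x[i] - m) * (x[i] - m);
        *mean = m;
        *var = v / (n - 1);  /* sample variance */
    }

    int main(void)
    {
        double base[N_YEARS], test[N_YEARS];
        if (read_annual_means("control_baseline.txt", base) != 0) return 1;
        if (read_annual_means("control_changed.txt", test) != 0) return 1;

        double mb, vb, mt, vt;
        mean_var(base, N_YEARS, &mb, &vb);
        mean_var(test, N_YEARS, &mt, &vt);

        /* Welch's t-statistic for the difference of climatological means.
         * Autocorrelation of annual means shrinks the effective sample size,
         * so the threshold below is only a rough screen, not a formal test. */
        double t = (mt - mb) / sqrt(vb / N_YEARS + vt / N_YEARS);
        printf("baseline mean %.4f, changed mean %.4f, t = %.3f\n", mb, mt, t);
        puts(fabs(t) > 2.0 ? "possible climate shift; inspect further"
                           : "no detectable shift in this diagnostic");
        return 0;
    }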
From page 183...
... It was recognized over a decade ago that such software infrastructure could usefully be developed and shared across the climate modeling community. The most ambitious such project was the Earth System Modeling Framework (ESMF; Hill et al., 2004)
From page 184...
... With ESMF and other infrastructure activities, the climate modeling community is seeing the natural evolution of infrastructure adoption. Individuals, communities, and institutions are seeing advantage.
From page 185...
... Finding 10.5: Shared software infrastructures present an appealing option for how to face the large uncertainty about the evolution of hardware and programming models over the next two decades.

A NATIONAL SOFTWARE INFRASTRUCTURE FOR CLIMATE MODELING

Very complex models have emergent behavior whose understanding requires being able to reproduce phenomena in simpler models.
From page 186...
... A related methodological advance is the multimodel ensemble and the model intercomparison project, which has become ubiquitous as a method for advancing climate science, including short-term climate forecasting. The community as a whole, under the aegis of the World Meteorological Organization's World Climate Research Programme and its two working groups, the Working Group on Climate Modeling and the Working Group on Numerical Experimentation, comes to consensus on a suite of experiments that it agrees will help advance scientific understanding (more information in Chapter 8)
From page 187...
... This weakness in methodology requires the climate modeling community to address the issue of scientific reproducibility. That one should independently be able to replicate a scientific result is a cornerstone of the scientific method, yet climate modelers do not now have a reliable method for reproducing a result from one model using another.
From page 188...
... Data-Sharing Issues

The rapidly expanding archives of standardized model outputs from the leading international climate models have heavily contributed to the IPCC assessments. They have also made climate model simulations accessible to a wider community of users as well as researchers.
From page 189...
... This need is similar to that described in Chapter 5 for the handling of observational data. The combination of rapidly increasing climate simulation data volumes with a more distributed set of supercomputers and data archives requires the climate modeling community to begin making use of a separate, data-intensive backbone cyberinfrastructure, based on dedicated optical lightpaths over fiber separate from the Internet, that connects these facilities with each other and with data-intensive end users.
From page 190...
... Therefore, to advance climate modeling, U.S. climate science will need to make effective use of the best possible computing platform and models.
From page 191...
... climate community to make the additional investment to redesign climate models to effectively use new high-end supercomputers, because of the possibility of configuring multiple scientifically credible versions of individual model components to run on such systems without enormous additional effort.
From page 192...
... Individual modeling centers may not easily be convinced to migrate from their current infrastructure. However, given a decade of experience, combined with a bottom-up, community design process, the committee believes that the climate modeling community is ready to develop a capable and ambitious common software infrastructure whose overall benefits far outweigh the costs.
From page 193...
... global and regional climate models, as well as its use in model comparisons and national climate model data archival.
From page 194...
... Recommendation 10.1: To promote collaboration and adapt to a rapidly evolving computational environment, the U.S. climate modeling community should work together to establish a common software infrastructure designed to facilitate
From page 195...
... Recommendation 10.3: The United States should support transformational research to bring analysis to data rather than the other way around in order to make the projected data volumes useful.

Recommendation 10.4: The data-sharing infrastructure for supporting international and national model intercomparisons and other simulations of broad interest -- including archiving and distributing model outputs to the research and user communities -- is essential for the U.S.

