
Improving the Effectiveness of U.S. Climate Modeling (2001)


5

Responding to Climate Modeling Requirements

The following are needed in order to respond to the climate modeling requirements described in the previous sections:

  1. additional computational and human resources;

  2. new mechanisms of standardization and interchange of models, model components, and model output;

  3. increased availability of standardized diagnostic tools for standardized model evaluation;

  4. new modes of organization and management.

5.1 COMPUTATIONAL RESOURCES REQUIRED

Model Resolution

A weather or climate model begins with fundamental equations governing the motions of the atmosphere, oceans, and sea ice, which are derived from physical laws, particularly the conservation of mass, momentum, and energy (e.g., Washington and Parkinson, 1986). These equations are augmented by algorithms describing the movement of tracers and reactive chemicals through the modeled system. A number of recent books provide extensive treatments of various aspects of model building and the numerical methods needed to implement the models (Trenberth, 1992; Randall, 2000; Kantha and Clayson, 2000).

For the purposes of this analysis an atmospheric resolution of 30 km is used. This scale is viewed as a minimum resolution for acceptably modeling orographic influences (Plate 2), and in particular for ensuring that a drop of water falling as precipitation lands in the correct catchment basin. It is also the minimum scale that can resolve weather motions characteristic of the climate being simulated. While it can be questioned whether such high resolution is needed for the climate simulation, it should be noted that the downscaling needed for applications is a one-way process: it uses the large-scale simulation as boundary conditions for smaller-scale simulations but does not feed back on the larger scales. Thus, the highest resolution possible for the climate simulation will assure the best possible downscaling. We do not argue that 30 km is the ultimate scale needed in the atmosphere but simply use it as a characteristic desirable scale for modeling the atmosphere in climate simulations; such definiteness is also needed to estimate the computational resources required in this chapter. This resolution may still be insufficient for many hydrological problems, where resolution is important to adequately represent groundwater paths and the many interacting spatial scales that affect water on its path from clouds to land to its ultimate return to the ocean. A land soil-moisture and vegetation model must be embedded in the atmospheric model.

The relationship between horizontal and vertical scales is defined by the Rossby radius, such that a 30-km scale in the horizontal corresponds to a 300-m scale in the vertical. This implies a need for 50 layers up to the tropical tropopause at a height of 15 km. These layers may be augmented by increased resolution in the surface boundary layer, and including the stratosphere in the model requires additional vertical levels. Any sub-grid-scale processes not resolved at 30-km resolution, including clouds, turbulent mixing, and boundary layer processes, must be parameterized in terms of quantities on the resolved scale.
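
The arithmetic behind these numbers is simple; the short sketch below (a back-of-the-envelope check, not code from the report) derives the implied vertical spacing and layer count from the 100:1 aspect ratio quoted above.

```python
# Back-of-the-envelope check of the resolution numbers quoted in the text.
# The 100:1 horizontal-to-vertical ratio comes from "a 30-km scale in the
# horizontal corresponds to a 300-m scale in the vertical."

horizontal_resolution_m = 30_000   # 30-km atmospheric grid spacing
aspect_ratio = 100                 # horizontal : vertical, as stated in the text
tropopause_height_m = 15_000       # tropical tropopause, ~15 km

vertical_resolution_m = horizontal_resolution_m / aspect_ratio
n_layers = tropopause_height_m / vertical_resolution_m

print(f"implied vertical spacing: {vertical_resolution_m:.0f} m")  # 300 m
print(f"layers up to the tropopause: {n_layers:.0f}")              # 50
```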

The ocean model is similarly coded from the equations of motion and the conservation of water and salt. To resolve western boundary currents a resolution of at least 10 km in the horizontal and 100 m in the vertical is required. The resolution in the vertical may be enhanced by extra layers to resolve the surface mixed layer. A sea ice model must be embedded in the ocean model in order to get the surface albedo of Earth correct and to ensure the correct salt balance of the ocean.

Because the ocean heat capacity is large and changes slowly, it generally takes about 10,000 model years to spin up a coupled climate model to equilibrium, although acceleration techniques are available to speed the process. The model must be run at least 1,000 years, starting from near this equilibrium state, to diagnose its climatology and variability. Only after these diagnoses are performed is the climate model ready for use.

Ensembles

Because climate models may be sensitive to small changes in initial conditions, ensembles of many runs are made, each with slightly different initial conditions. The spread of the model results across the ensemble gives some idea of the sensitivity of the run to initial conditions and therefore of the uncertainty of the final results.

To get an idea of the total spread of the output, ensembles usually consist of 10-30 members. Practical exigencies of each problem determine the time allowed to complete the ensemble of runs. As a rough guide, an ensemble of runs for weather prediction must be completed each day, for seasonal-to-interannual prediction each month, and for long-term climate change each year or two.
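
To make the notion of ensemble spread concrete, the following minimal sketch perturbs an initial condition for each member and summarizes the resulting spread as a standard deviation across members. The toy forecast function and all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_members = 30                # a typical ensemble size quoted in the text (10-30)
base_initial_state = 288.0    # e.g., a global-mean surface temperature in K (illustrative)

# Each member starts from a slightly perturbed initial condition.
perturbations = rng.normal(scale=0.01, size=n_members)

def toy_forecast(initial_state: float) -> float:
    """Stand-in for an expensive model integration (illustrative only)."""
    return initial_state + 0.5 * np.sin(initial_state) + rng.normal(scale=0.1)

forecasts = np.array([toy_forecast(base_initial_state + p) for p in perturbations])

# The ensemble mean is the central estimate; the spread (standard deviation
# across members) is a rough measure of sensitivity to initial conditions,
# and hence of the uncertainty of the final result.
print(f"ensemble mean:   {forecasts.mean():.2f}")
print(f"ensemble spread: {forecasts.std(ddof=1):.2f}")
```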

Computer Resources for the Various Types of Modeling

Simulation

Approximately 10^17 floating-point operations are required to run a 30-km, 50-level atmospheric model for one model year. Roughly the same number of floating-point operations is needed to run a 10-km, 50-level ocean model for one year. Including all sub-models and couplings, we can assume it takes of order 5 × 10^17 floating-point operations to run a coupled climate model for one model year.

A thousand-year model run would therefore take 5 × 10^20 floating-point operations. As a rule of thumb, to develop a climate model at least 10,000 simulated model years are required. Therefore 5 × 10^21 floating-point operations are needed to develop a model.
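
The wall-clock times in Table 5-1 follow directly from these operation counts. The sketch below simply redoes that arithmetic using the figures quoted above; the exact results come out slightly above the rounded table entries.

```python
# Operation counts quoted in the text.
flops_per_coupled_model_year = 5e17    # coupled model: 30-km atmosphere, 10-km ocean
model_years_to_develop = 10_000        # rule of thumb for model development
total_flops = flops_per_coupled_model_year * model_years_to_develop   # 5e21

seconds_per_year = 365.25 * 24 * 3600

# Wall-clock time at various sustained speeds (cf. Table 5-1; the table
# rounds these values down, e.g., 31,700 years appears as 30,000).
for sustained_gflops in (5, 100, 1_000, 10_000, 100_000):
    seconds = total_flops / (sustained_gflops * 1e9)
    print(f"{sustained_gflops:>7,} Gflops sustained -> {seconds / seconds_per_year:>9,.1f} years")
```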

We see that on the order of 10-100 Tflops of sustained computer speed is needed to develop climate models at the specified resolution in reasonable amounts of time (Table 5-1). Adding components to form more comprehensive climate models takes still longer; coupling additional chemical or carbon-cycle components can increase these estimates by an order of magnitude. To analyze the variability and mean state of these runs, a comparable amount of time is needed, so that a sustained 100-Tflops computer can complete a number of thousand-year model runs in about a year. Model development and climate simulation require the greatest capability.

Box 5-1

Capacity vs. Capability

Computing capacity — The ability to run many computing jobs, none of which requires all or even a large part of the available computing resources. High computing capacity enables the throughput of many jobs, which for climate modeling are often ensembles of runs with slight variations in initial or boundary conditions. Typically, these jobs can be run simultaneously, thus providing a form of parallelism.

Computing capability — Simulations for which a single coupled or component model uses all or a large fraction of the entire system. Such calculations are much more demanding because they require the model code to efficiently use the number of processors available.


TABLE 5-1 Time to Perform a 10,000-Model-Year Run at Various Sustained Operating Speeds

  Sustained Speed (Gflops)    Wall-clock Time (years)
  5                           30,000
  100                          1,500
  1,000                          150
  10,000                          15
  100,000                         1.5


Weather Prediction

Ensembles of 10-day forecasts with an atmosphere-only model are required. For the purpose of this exercise, other crucial parts of a forecast system, such as quality control, data assimilation, and initialization, are ignored. At the resolution specified above, a ten-day forecast requires 2 × 10^15 flops, so an ensemble of 30 requires 6 × 10^16 flops each day. Each member of the ensemble may be executed on a separate computer, with the members potentially processed simultaneously. This requires 20 Gflops sustained over a day for each forecast, so a collection of 30 machines, each operating at a sustained speed of 20 Gflops, is required. This is an example of capacity computing, since 20-Gflops machines are currently available. If the ensembles must, for some reason, be run sequentially each day, a sustained 600 Gflops is required.
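
The sketch below retraces this arithmetic using the figures quoted above; it is an illustration of the reasoning, not code from the report. The exact divisions give roughly 23 Gflops per member and roughly 700 Gflops sequentially, which the text rounds to 20 and 600.

```python
# Weather-prediction capacity arithmetic, using the counts quoted in the text.
flops_per_forecast = 2e15          # one 10-day forecast at 30-km resolution
ensemble_size = 30
seconds_per_day = 86_400

ensemble_flops_per_day = ensemble_size * flops_per_forecast        # 6e16, as quoted

# Each member on its own machine (capacity computing):
per_member_gflops = flops_per_forecast / seconds_per_day / 1e9     # ~23 -> "20 Gflops"
# All members run sequentially on a single machine:
sequential_gflops = ensemble_flops_per_day / seconds_per_day / 1e9 # ~690 -> "600 Gflops"

print(f"per-member sustained speed: {per_member_gflops:.0f} Gflops")
print(f"sequential sustained speed: {sequential_gflops:.0f} Gflops")
```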

With the implementation of more sophisticated analysis schemes, such as four-dimensional variational data assimilation (4D-Var), the computational costs associated with preparation of the model initial states have become very significant. As an indication of this, only about 10% of the ECMWF operational workload is attributed to the high-resolution 10-day forecast; the 4D-Var analysis consumes about 45% and the ensemble prediction system another 45%.

Short-Term Climate Prediction

For the purposes of this exercise, assume a six-month coupled forecast performed once a day, giving an ensemble of 30 once a month. Again, the other parts of the forecast process are not included. Using the numbers for the resolution given above, 2.5 × 10^17 flops must be performed each day, which requires a sustained 2.5 Tflops.
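
A similar back-of-the-envelope check, using the coupled-model operation count from the simulation section, reproduces this figure; the exact division gives roughly 3 Tflops, consistent at this level of precision with the sustained 2.5 Tflops cited above.

```python
# Seasonal-prediction arithmetic (illustrative check of the quoted numbers).
flops_per_coupled_model_year = 5e17   # coupled-model figure from the simulation section
forecast_length_years = 0.5           # a six-month coupled forecast
seconds_per_day = 86_400

flops_per_day = forecast_length_years * flops_per_coupled_model_year  # 2.5e17, as quoted
sustained_tflops = flops_per_day / seconds_per_day / 1e12
print(f"sustained speed needed: {sustained_tflops:.1f} Tflops")        # ~2.9
```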

Greenhouse Simulation

The resources needed to perform global warming projections are similar to those for the simulations used in developing and testing the models, except that a number of additional projections must be run. For each specified future concentration of radiatively active gases and constituents, an ensemble of 100-year runs, each differing slightly in initial conditions, is needed.

Because the IPCC Special Report on Emissions Scenarios lists a number of future scenarios, an ensemble should be run for each emission scenario to span the full range of possible outcomes. Assuming a 10-member ensemble, this implies 1,000 model years for each scenario, or something on the order of 10,000 additional model years in total. Thus, global warming projections put demands on high-end computing similar to those of simulation, and we can similarly conclude that a 10-100-Tflops computer would satisfy the needs for global warming projections. Additional computational demands arise when downscaling global warming projections to specific regions.
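
The scenario bookkeeping is equally simple. The sketch below assumes roughly 10 scenarios purely for illustration (the text says only that a number of scenarios are listed); with a 10-member ensemble of 100-year runs for each, the total comes to the order of 10,000 model years quoted above.

```python
# Greenhouse-projection bookkeeping (illustrative; scenario count is assumed).
ensemble_members_per_scenario = 10
years_per_projection = 100
model_years_per_scenario = ensemble_members_per_scenario * years_per_projection  # 1,000

n_scenarios = 10   # assumption for illustration only
total_additional_model_years = n_scenarios * model_years_per_scenario
print(f"additional model years: {total_additional_model_years:,}")  # 10,000
```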

Compromises

In general it has not been the resolution needs of climate models that have determined computer purchases but rather computer availability that has determined the highest resolution at which climate models could be run. It was pointed out in Section 3.6 that at current resolution (about 300 km in the atmosphere) a thousand-year model run would take a current dedicated supercomputer running at 10-20 Gflops on the order of 3-6 wall-clock months to accomplish. Japanese computers are now available that run at a sustained 1 Tflops, roughly a factor of 50 faster, which would allow an increase in resolution of about a factor of 3 (to roughly 100 km). The tradeoff between computer power and desired resolution is a compromise that will exist for a very long time.
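
The factor-of-3 figure can be rationalized with a standard rule of thumb that is not stated explicitly in the text: refining the horizontal grid increases the number of columns quadratically and, through the CFL condition, shortens the time step proportionally, so cost grows roughly as the cube of the refinement factor. Under that assumption:

```python
# Rule-of-thumb scaling (an assumption, not stated in the text): cost grows
# roughly as the cube of the horizontal refinement factor.
speedup = 50                          # sustained 1 Tflops vs. 10-20 Gflops
refinement = speedup ** (1.0 / 3.0)   # ~3.7

current_resolution_km = 300
affordable_resolution_km = current_resolution_km / refinement
print(f"refinement factor: {refinement:.1f}")
print(f"affordable resolution: ~{affordable_resolution_km:.0f} km")
# ~80 km, consistent with "about a factor of 3 (100 km)" in the text.
```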

5.2 WILL MASSIVELY PARALLEL ARCHITECTURES SATISFY OUR NEEDS?

Speed and Usability

The earlier modeling report (NRC, 1998a) detailed the NCAR procurement procedure that led to a Commerce Department anti-dumping tariff on Japanese vector computers. As noted in the discussion in Section 2.4, multi-processor machines are subject to limitations in speedup (Amdahl's law) such that the speedup over single-processor performance is less than the number of processors. The use of fast custom-designed processors helps to overcome the limitations of Amdahl's law.

Suggested Citation:"Responding to Climate Modeling Requirements." National Research Council. 2001. Improving the Effectiveness of U.S. Climate Modeling. Washington, DC: The National Academies Press. doi: 10.17226/10087.
×

For climate modeling, which has inefficiencies in code and a certain irreducible amount of sequential operations, parallel vector machines, currently built only by Japanese manufacturers, enjoy a throughput advantage of about a factor of 40 (per processor) over currently available massively parallel computers on real climate codes.

If Japanese supercomputers continue to be excluded from the U.S. market, we are faced with relatively limited options: all that remains are clusters of commodity-based SMP or PC nodes. The survey results (Appendix E) on the performance of current parallel climate models show that these machines do not compete effectively with Japanese VPPs in performance. Air-cooled Japanese vector machines are about 40 times faster (per processor) on climate and weather codes than current U.S. microprocessor-based machines. The latest Japanese vector machines are achieving impressive performance: 3-10 Gflops sustained per processor on real climate applications. The best microprocessors currently achieve about 0.1 Gflops per processor.

These Japanese machines have an additional advantage: usability. They have been developed through a sequence of incremental changes to the vector processor line that ended in the United States with the Cray T90. The software has developed slowly and carefully, and this generation of Japanese supercomputers, in particular the Fujitsu VPP line and the NEC SX-5 (http://www.hpc.comp.nec.co.jp/sx-e/sx-world/), has a full range of scheduling and balancing software and a robust FORTRAN compiler. The machines are therefore usable, and the generation of coupled model codes written for vector machines runs on them with minor modifications. In contrast, MPP and SMP machines are less mature and generally lack the full range of software that allows immediate and facile use.

Vector or MPP or Both?

Massive parallelism and vector technology are not mutually exclusive. In fact, both Cray scalable vector machines and Japanese vector machines can be built with hundreds or thousands of processors. The Japanese have done a good job in building scalable interconnects for massively parallel vector machines in their NEC SX-5 and Fujitsu VPP5000: Both machines can be scaled up to 512 processors. The Earth Simulator project in Japan is building a network of parallel vector supercomputers with as many as 4096 processors designed to deliver sustained speeds of 5 Tflops.

Because this study also shows that much faster computers with larger memories are required to meet the needs of the U.S. climate modeling community, the choice is stark. If U.S. climate scientists are not able to purchase Japanese vector machines, they will continue to be unable to compete with their European and Asian counterparts.


Well-designed parallel applications can be written so that they scale and perform well on both vector and microprocessor-based machines. This is quite straightforward for problems that use domain decomposition for parallelism, and nearly all climate applications do. For cache-based microprocessors the domain should be subdivided into small two-dimensional patches so that better cache performance is achieved, while vector machines perform better with long vectors, so that long, thin patches are more appropriate. Either partitioning can be achieved with the same code; only a few parameters need be changed to switch between the two decompositions.
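
The point that one code can produce either decomposition is illustrated by the toy routine below; the routine, grid dimensions, and parameter names are ours, invented for illustration, and stand in for the few parameters a real model would expose.

```python
def decompose(nx: int, ny: int, patch_nx: int, patch_ny: int):
    """Split an nx-by-ny horizontal grid into rectangular patches.

    The same routine yields either decomposition style; only the patch-shape
    parameters change (illustrative sketch, not code from any model).
    """
    patches = []
    for j0 in range(0, ny, patch_ny):
        for i0 in range(0, nx, patch_nx):
            patches.append((i0, min(i0 + patch_nx, nx),
                            j0, min(j0 + patch_ny, ny)))
    return patches

nx, ny = 1200, 600   # illustrative global grid dimensions

# Cache-based microprocessors: small, nearly square patches that fit in cache.
cache_patches = decompose(nx, ny, patch_nx=40, patch_ny=40)

# Vector processors: long, thin patches so inner loops become long vectors.
vector_patches = decompose(nx, ny, patch_nx=nx, patch_ny=4)

print(len(cache_patches), "square patches;", len(vector_patches), "long thin patches")
```

In a real model the patch shape would be chosen at build or run time; the physics code inside each patch is unchanged.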

Nevertheless, massively parallel machines cannot provide infinitely scalable performance. Amdahl's law is a stern taskmaster: massively parallel processors require that the code be almost perfectly parallel to obtain good performance, which is difficult to achieve. Synchronization, load imbalance, and the serial execution inherent in some codes make achieving 99% parallelism very difficult, yet a code that is 99% parallel can achieve at most a 100-fold speedup, even on a thousand-processor machine (Plate 1).
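
Amdahl's law itself is a one-line formula; the sketch below spells it out and confirms the ceiling quoted above (a textbook formula, not code from the report): 99% parallelism yields about a 91-fold speedup on 1,000 processors and can never exceed 100-fold.

```python
def amdahl_speedup(parallel_fraction: float, n_processors: int) -> float:
    """Amdahl's law: speedup of a code whose parallelizable fraction is p."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_processors)

# A 99%-parallel code on 1,000 processors:
print(f"{amdahl_speedup(0.99, 1_000):.0f}x")   # ~91x
# Even with effectively unlimited processors the serial 1% caps the speedup:
print(f"{amdahl_speedup(0.99, 10**9):.0f}x")   # ~100x
```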

Most climate computing environments could use both types of machines: vector machines for mainframe performance, scalability, and reliability, and particularly for capability computing as defined in Box 5-1; SMP and MPP machines for capacity computing applications requiring large throughput of multiple jobs, for pre- and post-processing of data sets, and for data assimilation, where the ratio of vector to microprocessor speeds is low (between 4 and 6).

The Bottom Line

Clearly the easiest path to good performance is for U.S. scientists to have access to machines with the most powerful processors, the smallest and fastest network, and the fastest memory access, making cache unnecessary. That the Japanese computers are currently superior in both speed and usability for weather and climate model codes is without question. It is no coincidence that in countries other than the United States, the great majority of weather services and climate research institutions have purchased, or are about to purchase, Japanese parallel vector computers. We note finally that a previous NRC report (NRC, 1998a) has stated: “The United States must apply greater resources, particularly (but not exclusively) in the area of advanced computer machines. National boundaries should not influence where machines are purchased.”


5.3 THE NEED FOR CENTRALIZED FACILITIES AND OPERATIONS

Viewed purely as research, the 20+ intermediate-size and large U.S. climate-modeling activities are healthy and diverse, allowing many people to engage in productive and innovative climate research. Viewed from the standpoint of being able to bring concentrated resources to bear on specific problems, the 20+ differing climate activities could be considered duplicative, inefficient, and sub-critical. This shortcoming is illustrated by one of the responses to the survey, which stated that "the first problem in the United States is fragmentation, and very inefficient use of resources . . . single agencies, such as NOAA and NASA are unable to organize an integrated global modeling effort within their agency." Owing to the increasing range of climate products required by diverse users, there is a growing need for a single agency charged with assembling and disseminating the various climate products to these diverse user groups.

One solution to this problem would be to have a few centralized climate modeling activities under the auspices of a single agency, which are of critical size and have adequate resources, each devoted to a specific task. These centralized modeling activities would maintain close linkages to the various research and user groups and would undertake model building, quality control and validation of models and products, product design, regular and systematic product development, and integration of observational data. Although these operational activities would be centralized, they would take advantage of research activities external to operations, including model development, analysis, diagnostics, and interpretation.

Perhaps the most telling argument for centralization of operational climate modeling is by analogy. Every major country in the world that invests in weather services has chosen to have the forecasts centralized in a modeling and prediction center, usually co-located with other weather activities.

5.4 FOSTERING COOPERATION WITH A COMMON MODELING INFRASTRUCTURE

The Efficiency of Cooperation

Small-scale modelers in the United States have modern workstations available to them, sometimes with more than one processor. These workstations can be used for coarse-resolution ocean models run for tens of model years; high-resolution atmospheric models run for a few days or weeks; development of radiation codes, boundary layer models of the atmosphere and ocean, and new numerical schemes; and the diagnosis and analysis of observed data and of the output of large models.


By definition, high-end modelers have supercomputers available to them. In the United States this group is unique because it can currently run high-resolution ocean models for hundreds of model years, atmospheric general circulation models for hundreds of model years, very-high-resolution global atmospheric models for a few model months, coupled atmosphere-ocean models for hundreds of model years, ensembles of seasonal-to-interannual forecasts (in two-tiered configurations) for six months to a year in advance, and ensembles of weather forecasts. Until recently these two groups of modelers have tended not to interact.

The case for interaction between these two classes of modelers is overwhelming, and it brings benefits to both. Small-scale modelers who develop parameterizations must prove their worth in a climate model, and to demonstrate this worth they need access to high-end computers and models. High-end modelers gain the expertise of a large number of smaller modeling groups. The output of large models is usually underanalyzed, and making it available to a wider community not only gives smaller modelers and diagnosticians access to a system approximating the climate, so that they may understand its mechanisms, but also gives invaluable feedback to the high-end modelers about the fidelity and faults of their climate models.

An effective U.S. modeling effort therefore requires better cooperation. This in turn requires an extensive and effective shared infrastructure that facilitates the exchange of technology and provides means and metrics to rigorously benchmark, validate, and evaluate models and forecast systems. It should, for example, allow university researchers access to an integrated forecast system, through which they can investigate the impacts of a new process parameterization. It should facilitate and support the interaction between focused centers and the broader research community for development and process experimentation, as well as evaluation and diagnostics of simulations and predictions.

The technical difficulties of exchange can be satisfied by the common modeling infrastructure (CMI), but the full potential of interaction can be realized only if access to computers and the free availability of models and model output is assured on a mutually beneficial basis.

Common Modeling Infrastructure

The process of developing, evaluating, and exercising complex models and model components is resource intensive. It places serious demands on personnel, computing, data storage, and data access that cannot be met by any single group or institution. Incorporating new understanding or new technology into comprehensive models and forecast systems additionally requires effective collaboration and communication between the research community and model developers. Currently, the U.S. modeling effort is limited in both of these respects. The distribution of effort is less of a problem than its fragmentation, with most groups operating under their own standards for codes, data, and model and forecast evaluation. The technical effort required to port codes and data from one center to another, or to incorporate new components into an existing system, often prevents effective interaction, and even communication, between groups. As a result, the enterprise as a whole is inefficient, and progress is slowed.

The concept of CMI was initiated at the NSF/NCEP Workshop on Global Weather and Climate Models (NSF/NCEP, 1998), where the participants agreed that global atmospheric model development and application for climate and weather in the United States should be based on a common model infrastructure. The CMI was proposed to address the growing perception that the diversity of models currently in use at U.S. modeling centers is acting as a barrier to collaboration among groups. To this end a CMI steering committee was established to develop a flexible modeling infrastructure with standards and guiding principles to facilitate the exchange of technology between operational and research, weather and climate modeling groups in the United States. The proposed goals were to accelerate progress in global numerical weather prediction (NWP) and climate prediction, to provide a focal point and shared infrastructure for model development, and to provide a means for assessing physical parameterization schemes. A CMI does not imply a common model; it is simply the set of standards and protocols that ensures common features of different models and compatibility of files not only among different models but also with observational data.

Once established, the CMI group recommended the creation of core models, each devoted to a particular modeling focus (e.g., numerical weather prediction, seasonal-to-interannual prediction, decadal variability). In the development of these core models it was recommended that each should concentrate on a problem that would benefit from broad involvement of the modeling community and that each should be associated with a center whose mission is directly related to that problem (such as NCEP for NWP). Furthermore, the core models should be based on a flexible common model infrastructure and permit a range of options for different physical problems, with standard configurations defined for operational applications. These configurations would represent the primary development path for the core models and would provide controls against which improvements could be tested. Candidates for core models would be based on well-defined code standards designed to advance the goals of the common model infrastructure. The codes should make it straightforward to integrate new diagnostics into model output and to implement new parameterizations. Core model codes should also be portable to several machines.

Software Framework

The key to making models and model components interoperable is the existence of a common framework that is flexible, modular, and efficient. The framework needs to separate the scientific and computational components of the code in order to facilitate separate development efforts in each area and to ensure the greatest portability across multiple computing architectures. Framework prototypes have been developed at a few centers, and there is already consensus on the outlines of a broader-based framework. Development of a community framework, however, represents a significant software development effort over a period of several years, with continuing support for maintenance and evolution thereafter.
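
One common way to achieve this separation is to define a narrow component interface that every scientific component implements, leaving the clock, coupling, and parallel layout to the framework. The sketch below is purely illustrative; the class and method names are invented here and are not taken from any existing framework prototype.

```python
from abc import ABC, abstractmethod
from typing import Dict, List


class ModelComponent(ABC):
    """Scientific part of the code: physics only, with no knowledge of
    processors, domain decomposition, or I/O (names invented for illustration)."""

    @abstractmethod
    def initialize(self, config: Dict) -> None:
        ...

    @abstractmethod
    def step(self, state: Dict[str, float], dt_seconds: float) -> Dict[str, float]:
        """Advance one time step and return the fields this component updates."""


class ToyAtmosphere(ModelComponent):
    """Stand-in component: relaxes surface air temperature toward the SST."""

    def initialize(self, config: Dict) -> None:
        self.relaxation = config.get("relaxation", 0.01)

    def step(self, state, dt_seconds):
        tas, sst = state["tas"], state["sst"]
        return {"tas": tas + self.relaxation * (sst - tas)}


class Coupler:
    """Computational layer: owns the clock and the exchange of fields between
    components; in a real framework it would also hide the parallel layout."""

    def __init__(self, components: List[ModelComponent]):
        self.components = components

    def run(self, state: Dict[str, float], dt_seconds: float, n_steps: int):
        for _ in range(n_steps):
            for component in self.components:
                state.update(component.step(state, dt_seconds))
        return state


atmosphere = ToyAtmosphere()
atmosphere.initialize({"relaxation": 0.05})
final = Coupler([atmosphere]).run({"tas": 285.0, "sst": 290.0}, dt_seconds=1800.0, n_steps=10)
print(final)
```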

Data Standards

Models and analysis tools interface with data constantly, and researchers collaborate extensively through data exchange. The research enterprise will be aided greatly by standards that minimize the effort in accessing and exchanging information. Perhaps most important is the development of community metadata standards and conventions. With standardized metadata, application programs and models can be tailored to exchange, access, and manipulate data efficiently and effectively, even at a remote site, with minimal effort. This will ultimately lead to greater interoperability between software applications and easier interaction between individuals and between groups.
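
As a minimal illustration of what such conventions buy, if every dataset carries a small set of agreed attributes, a generic tool can locate and interpret variables without knowing which model produced them. The attribute names and checking logic below are invented for illustration and do not represent any particular convention.

```python
# Minimal illustration of metadata-driven data exchange (attribute names are
# invented; they do not represent a real community convention).
REQUIRED_ATTRIBUTES = {"standard_name", "units", "grid", "institution"}

dataset = {
    "tas": {"standard_name": "air_temperature", "units": "K",
            "grid": "global_30km", "institution": "Modeling Center A"},
    "pr":  {"standard_name": "precipitation_flux", "units": "kg m-2 s-1",
            "grid": "global_30km", "institution": "Modeling Center A"},
}

def check_metadata(variables: dict) -> list:
    """Return a list of (variable, missing attribute) pairs."""
    problems = []
    for name, attrs in variables.items():
        for required in REQUIRED_ATTRIBUTES - attrs.keys():
            problems.append((name, required))
    return problems

def find_variable(variables: dict, standard_name: str) -> str:
    """A generic analysis tool can look variables up by standard name,
    whatever the model-specific variable name happens to be."""
    for name, attrs in variables.items():
        if attrs.get("standard_name") == standard_name:
            return name
    raise KeyError(standard_name)

assert check_metadata(dataset) == []
print(find_variable(dataset, "precipitation_flux"))   # -> "pr"
```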

Community Modeling Repository

A comprehensive and well-supported community modeling resource will allow the full potential of established coding frameworks and data standards to be realized. The repository will house a full array of model components and physical parameterization codes, including full dynamical cores and integrated physics packages of one or more "standard" models (supported by large development and support efforts in the national program), all fully documented, with validation output, and fully consistent with the community modeling framework and standards. It will allow a graduate student at any university access to complete coupled or uncoupled models and to components that could be used to construct a new model for research. The advantage of the repository for modeling centers will be the ease with which model components and parameterizations can be exchanged with other groups and the ease with which their model can be ported, tested, and run at numerous other sites. To be effective, the repository must be well staffed in software engineering, information management, and quality control. The staff will actively solicit continuing expansions and updates of the repository, ensure compliance with standards, and support the use of repository resources throughout the community.

Several other common needs can also be met effectively through a repository, including statistical analysis, diagnostic and visualization tools, standard algorithms (such as model clocks and calendars), physical constants, and certain standard data sets (e.g., vegetation, surface elevation). The repository will simplify locating tools and resources for model-based development and research. It will include adequate documentation to inform users how to access and use the resources and will be fully compatible with the community modeling framework, making access, porting, and use of resources straightforward.
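
As a small example of the kind of standard algorithm worth sharing, climate models often run on idealized calendars, and subtle mismatches between groups' calendar code are a classic source of incompatibility. The toy no-leap calendar below is ours, for illustration only.

```python
# Toy "no-leap" model calendar: every year has 365 days (illustrative only).
DAYS_IN_MONTH = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]

def add_days(year: int, month: int, day: int, ndays: int):
    """Advance a (year, month, day) date by ndays on a 365-day calendar."""
    day += ndays
    while day > DAYS_IN_MONTH[month - 1]:
        day -= DAYS_IN_MONTH[month - 1]
        month += 1
        if month > 12:
            month, year = 1, year + 1
    return year, month, day

print(add_days(2001, 2, 27, 3))   # -> (2001, 3, 2): no February 29 on this calendar
```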

Common Tests and Evaluations

Development inevitably involves testing and evaluation. Though much of this work is specific to particular research projects, there are important benchmark simulations and results that are of widespread scientific (and societal) interest. Examples include climate change simulations with prescribed trace gases (e.g., IPCC-related simulations) and seasonal forecast suites. A valuable component of an enhanced common modeling infrastructure is the establishment and maintenance of community benchmark calculations and evaluation metrics. These would fit easily into a general repository structure, continually providing clear and succinct results against which the state of the art can be assessed and new tools evaluated.
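
Such metrics are typically simple and easily shared. The sketch below shows two common examples, root-mean-square error and anomaly correlation against a reference; the functions and synthetic fields are ours for illustration, since the report does not prescribe particular metrics.

```python
import numpy as np

def rmse(model: np.ndarray, reference: np.ndarray) -> float:
    """Root-mean-square error of a simulated field against a reference."""
    return float(np.sqrt(np.mean((model - reference) ** 2)))

def anomaly_correlation(model: np.ndarray, reference: np.ndarray,
                        climatology: np.ndarray) -> float:
    """Pattern correlation of anomalies relative to a common climatology."""
    m = (model - climatology).ravel()
    r = (reference - climatology).ravel()
    return float(np.corrcoef(m, r)[0, 1])

# Toy fields standing in for, e.g., a seasonal-mean temperature map.
rng = np.random.default_rng(1)
climatology = 288 + rng.normal(size=(45, 90))
reference = climatology + rng.normal(scale=0.5, size=climatology.shape)
model = climatology + 0.8 * (reference - climatology) + rng.normal(scale=0.3, size=climatology.shape)

print(f"RMSE: {rmse(model, reference):.2f} K")
print(f"Anomaly correlation: {anomaly_correlation(model, reference, climatology):.2f}")
```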

5.5 HUMAN RESOURCES

In the surveys referred to in Chapter 3, the respondents commented that, after the availability of computer time, their most pressing need was for computer technologists to convert code from existing machines to the new parallel machines and in general to optimize code for use on this new class of machines. At a time when Internet companies offer large salaries or stock options or both, it becomes very hard for research grants to compete for computer technologists. Even though some technologists want to stay in a research environment despite the relatively small monetary rewards, the numbers are small, and research groups compete for the few that are available. This problem is one of the unintended consequences of the enforced shift from vector to massively parallel machines available to the U.S. high-end modeling community.

Indeed, most U.S. Earth science centers are experiencing increased turnover in computational positions, with a net migration away from the field. Significant numbers of earth scientists are leaving the field after school rather than moving into scientific positions. To compete with the non-scientific information technology job market, scientific organizations need to offer not simply competitive salaries but also the development of job skills that are attractive to mainstream professionals and career paths comparable to others in the scientific field. There is also a disturbing tendency for decreases at the front end of the employment pipeline: 1998 showed a 20% drop from 1997 in the number of Ph.D.s awarded in meteorology and oceanography (UCAR Quarterly, Spring 2000), a drop not mirrored in other scientific fields. According to UCAR Quarterly, this decline in graduates is accompanied by a drop in enrollments: "During the past year, members of the UCAR Board of Trustees have expressed concern about a perceived sudden drop in the number of qualified students applying to their graduate departments."

The staffing issues discussed above are likely to continue their negative impact on climate modeling science into the near future. To prevent the continued drain of competent scientists and technicians to overseas institutions and to the information technology (IT) sector requires such actions as increasing salaries at modeling institutions and improving career development for technical staff.

One technical solution to this problem can be modeled after the Application Service Provider (ASP) phenomenon. An ASP is a commercial company that creates a shared service center accessible over the Internet; companies pay monthly fees to the ASP for the shared services of a limited number of IT professionals. Using this concept in a centralized computer facility would give climate scientists access to the small pool of IT professionals available to the entire community. The overall human resources issue is so fluid and so deeply rooted in the economic and social conditions of the United States that, aside from noting the problem and its likely effect on any attack on the climate modeling problem, it is difficult to present global solutions.

5.6 NEED FOR CLIMATE SERVICES AND MANAGEMENT ISSUES

The current approach of expecting existing organizations within the USGCRP to deliver climate information products as an activity ancillary to their primary missions has not been successful (NRC, 1998a). Simply providing these organizations with small amounts of additional funding to give them incrementally greater capability is therefore not an effective remedy. To provide the required capabilities for climate modeling activities and to ensure the production of climate modeling products, there needs to be some organizational entity whose primary mission is the delivery of these products. For the sake of this discussion only, this entity will be designated a Climate Service; no agency or other organizational connotation is implied. The discussion that follows delineates the properties such a Climate Service needs in order to deliver climate information products.

Institutional and Incentive Issues

The Climate Service must have a clearly defined mission, focused on the delivery of the product and the assurance of product quality. These products should be the result of the scientific process, though the delivery of the product will require bringing closure to incomplete scientific arguments to allow production of software suites. Different versions of the model generating climate products need to be tested and validated prior to their use for product generation.

The defined mission of the Climate Service will be to provide an overarching structure to facilitate prioritization of institutional needs and decision making. Though the mission is essential, it is also important that there be an executive decision-making function vested in a small group of science and software managers, whose performance is measured by the successful delivery of the products and the subsequent customer response. At the head of this group will be an individual with ultimate authority and responsibility for product delivery.

The current fragmented situation does not support an effective incentive structure at any level. At the lowest level scientists are generally rewarded for individual accomplishments of discovery-driven research. At the next level, even in the most project-focused organizations, funds flow into organizations from a variety of program managers. The program managers naturally command the allegiance of these subsets of the organization and are generally not rewarded for the delivery of successful products by the organizations they fund. This programmatic fracturing extends to computational resources, and in most U.S. laboratories there is a disconnect between computational resources and the delivery of simulation and assimilation products. The disconnect arises because the computing organization is often funded to pursue computational research in information technology programs or the computing facility is run as an institutional facility and the product generation exists in an uncomfortable balance with large numbers of small discovery-driven research projects. Finally, the organizations that are expected to deliver the needed Earth science products are often embedded in large Agency laboratories whose basic metrics of success do not include delivery of successful Earth science simulation and assimilation products. All told, the current structure of Earth science activities in the United States is fracturing rather than unifying.

For an executive function to be effective, an organization has to have an incentive structure that connects all facets of the organization to the responsibility for successful product generation.

Business Practices

A functioning Climate Service that contains the attributes described above would stand in stark contrast to the pervasive scientific culture of the United States. Such an organization would vest the decision-making function in an executive process that acts in the best interest of the delivery of the institutional products. Such a Climate Service will require supporting business practices that are significantly different from those currently used in the scientific community. These business practices must be unifying. They must provide a mechanism for stable and effective external review and integration with the discovery-driven research community.

As with the scientific and computational aspects of this enterprise, the business practices need to be considered in a systematic and integrated way. They need to support the goals and function of the institution so charged. While the complete specification of these business practices is beyond the scope of this document, the following can be derived from experience in current organizations.

Funding should be:

  1. focused on delivery of products;

  2. stable over 10-year time periods;

  3. balanced across all elements of the organization;

  4. under the direction of the executive decision-making function responsible for scientific quality and operational success;

  5. isolated from the program volatility of funding agencies.

Review:

  1. conventional peer review will not work;

  2. review techniques must be developed to support the organization, including review of both science and operations;

  3. different levels of review are needed for scientific and operational purposes.

Business practices:

  1. success of the Climate Service must be a critical metric for success of the hosting agencies;

  2. contractual vehicles must support the organizational goals;

  3. salary structures must allow effective recruiting and retention of personnel.

5.7 REWARDING THE TRANSITION WITHIN THE RESEARCH COMMUNITY

There must be an incentive for the research community to develop societally useful products for transition to the Climate Service. The situation can be explained most easily by considering the transition from research to operations. Suppose the research community develops something valuable, such as seasonal-to-interannual prediction and the ENSO Observing System (see NRC, 1996). Research from the TOGA program demonstrated the predictability of aspects of ENSO. From this an observing system was established, and insight was gained into the kinds of questions that must be answered in order to use these types of climate forecasts (NRC, 1999c).

A natural transition would then be to recognize the value of ENSO predictions and, on the basis of demonstrated value, move the prediction activity and the routine observations needed to initialize the predictions into the operational domain using new resources, in anticipation of demonstrated benefits. Instead, most of the prediction and all of the observing system have remained in the research domain. Resources that should be used to explore and develop new knowledge are therefore going into activities that are not research, even though they undoubtedly contribute to research. As a result, the “reward” to the research community for developing seasonal-to-interannual prediction has been decreased financial resources. As noted in the Pathways report (NRC, 1999a), “A research program can maintain a permanent observing system only when the system is relatively cheap and does not inhibit other research objectives. When there is an operational need for a system, funding must not come from research sources, else the building of a permanent observing system could gradually impoverish the research enterprise.”

Developing societally valuable research that leads to climate information products should lead to a clear transition path whereby the products find a home in the Climate Service. The reward to the research community should be the freeing up of resources so that research can address new problems, perhaps leading to new societal benefits.

5.8 PROVIDING THE BEST POSSIBLE SERVICE TO AN INFORMED PUBLIC

A Climate Service focused on the production and delivery of climate information must make these products as useful as possible to its customers. Weather forecasting has dealt with similar problems for a long time and therefore provides a framework for modeling for societal benefit; some of the discussion is based on lessons from that arena. To provide the best weather and climate services, effective interaction with informed customers is essential. To do this requires meeting several challenges.

The first challenge is to make sure that the most current and reliable information reaches the public. The products must be authoritative, and one way of assuring this is to have an unbiased organ of the governmental bureaucracy either produce or bless the product. Professional organizations, such as the American Meteorological Society (AMS) and the American Geophysical Union (AGU), strive to put out the best information possible, and they are answerable to professionals in the field. The public makes the final choice of what to believe, however, so the educational system must emphasize the basics of science and critical thinking to the generations of future voters.

The second challenge is to engage the scientific community in such efforts. Scientific institutions often state education as a goal, but lack of commitment at the supervisor level or by peers and narrowly defined reward structures can discourage scientists from engaging in public or educational outreach activities. Such commitment involves allowing time and resources for training and the outreach activities themselves.

The third challenge is to decrease public confusion about climate issues. Public exposure to climate change is often in the form of sound bites explaining the latest weather disaster in terms of El Niño or global warming. Yet climate issues are difficult to understand without going back to the basics. Trying to explain the question of natural versus anthropogenic climate change to the public, for example, involves many issues, including:

  1. how climate change is measured over different time scales (issues of pollen proxies, sea life, crop records, and more recently, instrument corrections);

  2. what determines climate and climate change (changes in greenhouse gases, aerosols, land-surface properties, solar output, ocean);

  3. the physical processes (especially radiation);

  4. what a numerical model is;

  5. how a climate model is tested (against past and present climate, testing of parameterization schemes against special data sets, studying the way the model responds to data input if run as a weather model); and

  6. what the differences in climate models really mean.

Clearly, teaching such material involves not a single lecture but a carefully crafted set of activities and discussions that the audience (typically teachers) can use to pass on the information. Because climate and weather sciences evolve, a means of getting new information (e.g., Web sites) is included, along with contact information for future questions.

The fourth challenge is maintaining strong links between the forecasting and user communities and their customers. All sides must agree on what is needed, what is reliable, what is most usable, and what is realistic. This is best achieved when the first three challenges are met. It is essential, however, that the providers learn from the customers, or that they learn together. The very nature of the modeling products produced by the Climate Service must be negotiated between the service and its customers. This involves not only formal interaction but also research on the societal aspects of the use of weather and climate information (Pielke and Kimpel, 1997). We expect that the creation and distribution of useful climate products for public and private use will be the best way of maintaining these links.

5.9 SUMMARY

Increased computational and human resources are required to respond effectively to the various demands outlined in Section 4. A new way of focusing resources to meet the specific challenges posed by these demands implies a less fragmented, and therefore more centralized, mode of addressing these problems. The institutional and management requirements were discussed in terms of a Climate Service, which here designates the organizational entity that would create climate information products and manage the climate modeling activities that deliver them. The full range of functional components of such a Climate Service extends beyond climate modeling and is not discussed here; an overall vision of its functions and its interaction with the research community is presented in Section 7.
