Atmospheric Dynamics and Weather Forecasting Research Entering the Twenty-First Century
Progress in understanding and predicting weather is one of the great success stories of twentieth-century science. Advances in the basic understanding of weather dynamics and physics, the establishment of a global observing system, and the advent of numerical weather prediction put weather forecasting on a solid scientific foundation, and the deployment of weather radar and satellites, together with emergency preparedness programs, led to dramatic declines in deaths from severe weather phenomena such as hurricanes and tornadoes.
Basic research in atmospheric science has been one of the most cost-effective investments that society has made in science. Progress in the basic understanding of phenomena such as severe thunderstorms has led directly to improved warnings and the reduction of loss of life, while technical advances in numerical weather prediction, application of statistics to model output, and advanced satellite and radar technology have contributed to much improved forecasts of all kinds.
1. Report of the Ad Hoc Group on Weather Dynamics and Storm Systems: K. Emanuel (Chair), Massachusetts Institute of Technology; K.C. Crawford, Oklahoma Climatological Survey; R. Rotunno, National Center for Atmospheric Research; L. Shapiro, NOAA/AOML/Hurricane Research Division; J. Smith, Princeton University; R. Smith, Yale University; L. Uccellini, NOAA/National Meteorological Center; M. Wolfson, MIT/Lincoln Laboratories. The group gratefully acknowledges contributions from A. Betts, L. Bosart, C. Bretherton, J. Derber, K. Droegemeier, B. Farrell, R. Fleming, J.M. Fritsch, R. Houze, M. LeMone, D. Lilly, M. Shapiro, A. Thorpe, S. Tracton, and E. Zipser.
Society chooses to invest in basic research not only because of perceived tangible benefits but also because of the intrinsic value of pushing back the frontiers of knowledge. Few would deny the largely intangible but very real value of intellectual achievements such as the formulation of quantum mechanics, the discovery of DNA, or the characterization of the physics of deterministic but nonperiodic systems. In the United States, the intellectual appeal of progress in the atmospheric sciences rivals that of such fields as cosmology and molecular biology.
Atmospheric science is poised to make another series of major advances, many of which will lead directly to improved weather warnings and predictions. Great strides in the basic understanding of the dynamics of weather systems and the development of new techniques such as ensemble forecasting combine with the deployment of new measurement systems and advanced means of communicating information to offer the promise of much improved forecasts to the American public.
To realize these potential improvements, new means of measuring the atmosphere, oceans, and land surface must be developed and implemented, and existing measurement systems such as rawinsondes, mobile radars, and research aircraft must be maintained and upgraded. We cannot stress enough the continued need for in situ and ground-based remote sensing capabilities and are alarmed at the deterioration of fundamental observing systems such as the global rawinsonde network. In surveying the state of basic research in weather dynamics, time after time we came to the conclusion that further progress was limited by the lack of appropriate measurement capabilities. For this reason, many of our recommendations focus on the need for better measurement systems. However, it must be recognized that we have the ability to predict, with some accuracy, how improvements in observing systems or techniques might actually improve forecasts. This capability is largely unexploited. One of our most important conclusions is that far more must be done to exploit known techniques, such as observing system simulation experiments, to make a priori estimates of optimal combinations of observing systems and forecasting techniques for application to specific forecast-related problems. Further, we feel that atmospheric scientists must work much more closely with other disciplines, particularly economists, to determine the potential costs and benefits of new observing systems and forecasting methods.
The major body of this Disciplinary Assessment was completed just as the U.S. Weather Research Program (USWRP) was being defined. Much of what is contained here is strongly consonant with the objectives of the USWRP as outlined in Emanuel et al. (1995).
Emerging Research Opportunities
We have identified a number of emerging developments in basic research, techniques, and technology that, on the basis of their intrinsic intellectual value and/or potential economic or societal payoff, should be given high priority in the coming decades. Here, these key developments are summarized, and specific recommendations based on them are offered. The research foundations behind the identification of these opportunities are delineated later.
1. The fundamental physics of land-air interaction: Basic understanding of the nature of the interaction between atmospheric and land surface processes is at the threshold of major advances and has the potential, when coupled with greatly improved routine measurements of land surface properties, to lead to substantial improvements in understanding and forecasting convection, boundary layer cloud cover, and regional climate anomalies. The link between soil moisture and precipitation may be the key to improved quantitative precipitation forecasts.
2. Seasonal climate variations and their dependence on the stochastic, internal variability of the atmosphere as well as variations linked to longer time-scale phenomena in the oceans, atmosphere, and land surface: Research on blocking and on land-atmosphere interactions has the potential to yield significant improvements in seasonal forecasts. The seasonal prediction problem is highly dependent on proper representation of sources and sinks of heat, moisture, and momentum, whereas short-range prediction depends more on advection of these quantities.
3. The continued development of ensemble forecasting and data assimilation techniques: These offer great promise for improved numerical weather forecasts and the quantification of forecast uncertainty.
4. Adaptive observation strategies: Budding research suggests that ensemble forecasting techniques, including the use of model adjoints and breeding methods, may provide real-time estimates of optimal observation location and timing, given the availability of programmable observation platforms. Observing system simulation experiments could be used to help determine optimal combinations of observing systems. This may lead to large gains in the skill of numerical weather forecasts for a relatively small investment in additional observations.
5. Improved understanding of the hydrological cycle and much better measurements of atmospheric water: Ongoing advances in understanding the control of atmospheric water (in all phases) will lead to much improved understanding of and ability to predict a variety of dynamical systems. Critical physical processes include the control of water vapor by convection and cloud microphysics, and the coupling of the atmospheric boundary layer with the underlying surface. Improved understanding of these processes, together with the advent of much improved techniques for measuring soil properties, atmospheric water vapor, and condensed water, is essential for solving the difficult problem of quantitative precipitation forecasting and will be necessary for adequate modeling of climate as well.
6. Coupling the atmospheric boundary layer with deep convection and merging the understanding of cell-scale dynamics and prediction with the understanding of convective ensemble dynamics: There have been enormous advances in understanding the cell-scale dynamics of moist convection, and in understanding and representing the interaction between ensembles of convective cells and larger-scale circulations; the time appears ripe for a productive synthesis of these developments.
7. The dynamics of deep convective downdrafts: These play a major role in the dynamics of at least some mesoscale convective systems and in the overall heat balance of the tropical boundary layer but have received comparatively little attention in formulating representations of cumulus convection.
8. The fundamental role of the tropopause in atmospheric dynamics and the possible benefits of better observations at and near the tropopause: Recent advances in the dynamics of synoptic-scale systems and better analyses of potential vorticity (a measure of atmospheric rotational motion) have pointed to the tropopause as a locus of important dynamical processes and the exchange of chemical constituents. This suggests that future models and observing systems may profit from much improved resolution near the tropopause. One especially promising candidate observing system for improving resolution of the tropopause is the global positioning system (GPS), which can be used to deduce profiles of temperature in the upper atmosphere.
9. Tropical cyclone genesis and intensity change, including the role of the upper-ocean response and interactions with dynamical systems in the upper troposphere and lower stratosphere: Tropical cyclones have been implicated in costly weather-related catastrophes in the United States, but there is little skill in forecasting open-ocean intensity change or genesis. Moreover, the modernization of the National Weather Service has done little to improve our ability to observe and forecast these storms.
10. The dynamics of landfallen tropical cyclones, particularly as they relate to flash floods: Some of the worst disasters in U.S. history were caused by tropical cyclone-associated flooding, but relatively little research effort has been expended in understanding the dynamics of landfallen tropical cyclones.
11. The dynamics and cloud physics of mesoscale convective systems and of other convective systems that produce heavy rainfall: Mesoscale convective systems are responsible for much of the summertime rainfall over the central United States, and research aimed at understanding the underlying dynamics and cloud physics appears to be at the threshold of major advances.
12. Orographic and other influences on sources and sinks of atmospheric potential vorticity: The understanding and numerical modeling of synoptic-scale dynamical processes that center on advection is comparatively well developed, but we are only beginning to understand the nature of diabatic and frictional processes. Forecasts beyond a few days rely heavily on an accurate account of such processes.
13. The interaction of quasi-balanced and unbalanced circulation systems: This interaction is responsible for, among other things, the generation of internal
waves in synoptic-scale cyclones and the creation of unbalanced flows such as gap winds or Kelvin waves by orography. These mesoscale events are major impediments to improvements in forecasts and warnings.
14. The development and evolution of mesoscale frontal cyclones: These are often missed by models, and their dynamics are not well understood.
15. The development of mesoscale models for forecasting "fire weather" conditions and interactive models for prediction of actual fire development and movement: Very recent research and modeling results suggest that such developments may aid considerably in the prediction and control of forest fires and wildfires.
16. Research on advanced statistical techniques and on optimal blends of numerical and statistical approaches: The best forecasts available today are based on combinations of deterministic model output and statistical guidance that depends mostly on model output. Further improvements should result from the production of higher-order moments related to the probability of events and from application of model output statistics to nonlocal quantities such as drainage basin-integrated precipitation.
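The blend of deterministic and statistical guidance described in item 16 can be illustrated with a minimal model output statistics (MOS) regression. All data and coefficients below are synthetic, invented solely to show the least-squares step; operational MOS is trained on large archives of matched model output and verifying observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "model output" predictors for 200 past forecast cases:
# column 0 stands in for model 2-m temperature, column 1 for 850-hPa temperature.
X = rng.normal(size=(200, 2))

# Synthetic "observed" temperature: a biased, noisy function of the model
# output; the systematic error is what MOS is designed to remove.
y = 1.5 + 0.8 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * rng.normal(size=200)

# Fit the MOS regression by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Apply the regression to a new model forecast to obtain a corrected forecast.
x_new = np.array([1.0, 0.5])
corrected = coef[0] + coef[1] * x_new[0] + coef[2] * x_new[1]
print(coef, corrected)
```

The fitted coefficients recover the synthetic bias and slopes, which is the essence of how statistical post-processing corrects systematic model error.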
We make the following recommendations, based in part on recognition of the value of the research opportunities summarized above and in part on further deliberations:
1. Fundamental improvements in forecasting in the two- to seven-day range have enormous potential economic benefits but require far better collection and utilization of data over the oceans and other data-sparse areas. We strongly encourage the support of research seeking to determine optimal combinations of satellite and ground-based remote sensing, and aircraft, balloon, and surface observations, as well as the support of key technological developments such as satellite-borne active sensing techniques, near-field remote sensing of atmospheric water vapor, and observations from commercial and pilotless aircraft. Such research should include comprehensive, well-posed observing system simulation experiments (OSSEs) and data denial experiments. Cost-benefit analysis should play a key role in the definition of "optimal" as it is used above, and the cost to the nation as a whole, rather than the cost to individual agencies, should be the criterion.
2. Recent research strongly suggests that adjoint techniques or breeding methods can be used to target specific regions of the atmosphere for observational scrutiny during the subsequent data assimilation cycle, resulting in greatly reduced forecast error. We advocate enhanced research on adaptive observations and their potential for substantial reduction in forecast error.
3. The deterioration of the global rawinsonde network must be reversed or a better substitute developed if progress is to be made on a variety of operational
and basic research problems. The reduction of in situ measurements in general, in favor of remote sensing measurements, is at best premature, and we reemphasize the desirability of performing research that seeks to determine an optimal mix of observation techniques and placement.
4. Much-improved understanding of land-atmosphere interaction and far better measurements of land surface properties, especially soil moisture, would constitute a major intellectual advance and may hold the key to dramatic improvements in a number of forecasting problems, including the location and timing of the onset of deep convection over land, quantitative precipitation forecasting in general, and seasonal climate prediction. We see a major opportunity that may be exploited by encouraging interactions between hydrologists and atmospheric scientists and by developing new means of routine and comprehensive measurement of soil properties.
5. Improvement in understanding the dynamics of atmospheric circulations affected by phase change of water, as well as in numerical weather prediction, especially quantitative precipitation forecasting, is severely impeded by poorly resolved and inaccurate measurements of atmospheric water vapor. High priority must be given to new water vapor measurement systems and to research that seeks to delineate the water vapor observations necessary to address specific research and forecast problems.
6. At present, the extent to which seasonal climate variations represent stochastic, internal variability of the atmosphere versus variations linked to longer time-scale phenomena in the oceans, atmosphere, and land surface is poorly understood. Research on blocking and on land-atmosphere interactions presents an exciting opportunity for fundamental advances in understanding and has the potential to lead to significant improvements in seasonal forecasts. The seasonal prediction problem is highly dependent on the correct specification of sources and sinks of heat, moisture, and momentum, whereas short-range prediction depends more on advection. We encourage enhanced research efforts on blocking, land-atmosphere interactions, and frictional and diabatic effects on atmospheric dynamics.
7. The worst natural catastrophes in U.S. history were caused by tropical cyclones. Although research on the dynamics of tropical cyclone genesis, intensity and structure change, and motion is ongoing, it has received little emphasis in recent national programs or in the modernization of the National Weather Service. Little is known about the dynamics of landfallen tropical cyclones, and this has limited our ability to forecast related flooding. Detection of hurricanes has been greatly facilitated by satellite-based observations, but much of the current state of understanding as well as the quantitative prediction of storm motion and structure and intensity change has relied on in situ measurements. We strongly recommend the support of research on the physics of tropical cyclone motion and intensity change, and of research seeking to delineate optimal combinations of measurement systems in aid of hurricane forecasting.
8. Tropical cyclones and some classes of extratropical marine cyclones are sensitive to local sea surface temperature and are known to influence ocean temperature through wind-induced stirring and upwelling. Modeling studies show that this feedback has an important effect on hurricane intensity, but observations of this interaction are lacking. We strongly encourage enhanced observations of the upper ocean during the passage of tropical and some extratropical cyclones.
9. The resources to maintain a balanced, national, basic research observing infrastructure must be restored, enhanced, and maintained. Satellites do not provide the spatial resolution or three-dimensional coverage required to diagnose many basic physical processes such as those involving clouds and precipitation. Next Generation Weather Radars (NEXRADs) have operational constraints that compromise their use in basic research, even if combined with other technologies. Mobile and transportable radars, research aircraft, and surface observations used to make high-precision, high-resolution observations with sufficient time continuity are required as research tools in the study of many atmospheric processes.
10. Many of the exciting and potentially beneficial developments identified earlier are of a nature that cuts across traditional disciplinary boundaries, involving much more intimate ties among atmospheric science, oceanography, atmospheric chemistry, hydrology, computational science, economics, communications, and operational forecasting. Yet the breakdown of these barriers is not well reflected in the organizational structures of the principal government funding and oversight agencies, and this is impeding progress on a number of fronts. We recommend that consideration be given to streamlining federal funding and oversight channels with a view to facilitating interdisciplinary research.
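The breeding method mentioned in recommendation 2 can be sketched on a low-order chaotic model. The model choice (Lorenz, 1963), amplitudes, and cycle lengths below are illustrative assumptions, not an operational configuration.

```python
import numpy as np

def lorenz63(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Lorenz (1963) system, a standard low-order testbed for predictability studies.
    return np.array([
        sigma * (x[1] - x[0]),
        x[0] * (rho - x[2]) - x[1],
        x[0] * x[1] - beta * x[2],
    ])

def rk4_step(x, dt=0.01):
    # Fourth-order Runge-Kutta integration step.
    k1 = lorenz63(x)
    k2 = lorenz63(x + 0.5 * dt * k1)
    k3 = lorenz63(x + 0.5 * dt * k2)
    k4 = lorenz63(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

rng = np.random.default_rng(1)
control = np.array([1.0, 1.0, 1.0])
size = 1e-3                                  # fixed rescaling amplitude (assumed)
pert = control + size * rng.normal(size=3)

# Breeding cycle: integrate the control and perturbed states together, then
# rescale the difference back to the fixed amplitude.  After a few cycles the
# difference (the "bred vector") aligns with the fastest-growing error
# directions, which are candidate regions for targeted, adaptive observations.
growth = []
for _ in range(50):
    for _ in range(8):                       # 8 steps of dt = 0.01 per cycle
        control = rk4_step(control)
        pert = rk4_step(pert)
    diff = pert - control
    amp = np.linalg.norm(diff)
    growth.append(amp / size)
    pert = control + size * diff / amp

print(np.mean(np.log(growth)))               # mean log growth per cycle
```

The per-cycle growth factors identify when (and, in a spatial model, where) forecast errors amplify fastest, which is the information adaptive observation strategies exploit.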
This section summarizes what we regard as the important elements of current basic research efforts, as well as key developments in measurement and forecast techniques and technology.
Basic Research Foci
Extratropical Cyclones and Associated Mesoscale Processes
Most significant weather events in the midlatitudes occur in association with extratropical cyclones. Mesoscale features characterized by strong upward motion and moderate to heavy precipitation are often embedded in the synoptic-scale region of ascending air in cyclones. These smaller-scale features include fronts, rain bands, and squall lines. The inability to forecast the formation and ground-relative motion of these mesoscale features is a significant impediment to accurate precipitation forecasting. Broadly speaking, a front forms as a natural consequence of the deformation field of a large-scale cyclone acting on existing temperature gradients. We need to know more about the diabatic processes (convection, radiation, boundary layer processes) that modulate this frontogenesis. Mature fronts are known to have characteristic precipitation features such as rain bands; the origin and nature of these must be better understood. Finally, circulations associated with fronts can instigate significant weather many tens of kilometers from the front itself (e.g., prefrontal squall lines); the precise nature of these influences is still poorly understood.
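The frontogenetical action of deformation on a preexisting temperature gradient is conventionally measured by the Petterssen frontogenesis function, quoted here as standard background rather than from the assessment itself:

```latex
F \equiv \frac{d}{dt}\left|\nabla_h \theta\right|
  = \tfrac{1}{2}\left|\nabla_h \theta\right|\left(D\cos 2\beta - \delta\right),
```

where \(D\) is the total deformation, \(\delta\) the horizontal divergence, and \(\beta\) the angle between the axis of dilatation and the local isentropes; frontogenesis is strongest when the dilatation axis is nearly parallel to the isentropes and the flow is convergent.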
Mesoscale processes often exert a significant influence on the larger-scale cyclone behavior. It is still a matter of uncertainty whether the radiosonde network (with station spacing of approximately 400 km) contains enough of the mesoscale information needed to forecast accurately many cases of the large-scale cyclogenesis itself. Sensitivity studies using adjoint models indicate that in many cases, the forecast location and intensity of the cyclones depend on upwind flow features of mesoscale size; in particular, finer-scale information concerning perturbations of the tropopause is needed. Cyclone-scale waves growing on preexisting fronts are still poorly observed and understood. Forecast models still have difficulty with lee cyclogenesis; we need to know more about how, precisely, mesoscale terrain features contribute to the cyclogenesis and how to incorporate this knowledge into a forecast model. Similar comments apply to cyclogenesis in the presence of mesoscale physiographic features such as ice-edge boundaries or coastlines. A major uncertainty is the cumulative effect of moist convection on large-scale cyclone behavior.
In general, diabatic processes and their influence on synoptic-scale dynamics must be better understood. We note that much of the total latent heating and frictional dissipation that occurs in the atmosphere is associated with mesoscale and convective-scale processes. Thus, better understanding of mesoscale and larger-scale processes is inextricably bound.
Observations of the middle-latitude atmosphere show that the gradients of potential vorticity that are so fundamental to large-scale dynamics are usually concentrated at the surface and the tropopause. The mixing and other irreversible processes that lead to nearly uniform potential vorticity distributions in the interior of the troposphere and to the concentration of potential vorticity gradients near the tropopause have to be better understood. We must further explore the consequences of the observed potential vorticity distributions for synoptic and planetary wave propagation and instability.
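For reference, the potential vorticity in question is Ertel's, a standard definition rather than one specific to this assessment:

```latex
P = \frac{1}{\rho}\,\boldsymbol{\zeta}_a \cdot \nabla\theta
  = \frac{1}{\rho}\left(2\boldsymbol{\Omega} + \nabla \times \mathbf{u}\right)\cdot\nabla\theta,
```

where \(\rho\) is density, \(\boldsymbol{\zeta}_a\) the absolute vorticity vector, and \(\theta\) the potential temperature. Because \(P\) is materially conserved in adiabatic, frictionless flow, its observed concentration at the surface and the tropopause strongly constrains the interior dynamics.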
Better forecasts out to three days are critical for a number of important forecasting problems, such as snowfall, precipitation type, and high winds; these will depend largely on better upstream observations and improvements in understanding and capturing mesoscale phenomena. However, current numerical weather prediction techniques may not be uniformly applicable at the mesoscale. A large issue that must be faced is the initialization problem. Most current
techniques are designed to filter out phenomena such as gravity waves and upright and slantwise convection, the very phenomena we wish to forecast on the mesoscale. We believe that better weather forecasts on time scales of less than one day will require much improved understanding of mesoscale phenomena such as gravity waves, slantwise convection, and frontal cyclones, together with advanced numerical weather prediction techniques, such as dynamic and diabatic initialization, that preserve real internal waves and condensational heating.
Forecasting at all time scales on the U.S. West Coast and beyond a day or two on the East Coast is seriously impaired by lack of usable data over the Pacific, but there is a paucity of research on the effects of these data voids. We strongly recommend that OSSEs and data denial experiments be undertaken to estimate the effect of oceanic data voids on medium-range numerical weather prediction and, similarly, to estimate the influence of potential new data sources on numerical forecasts. Another intriguing technique that should be explored is the use of ensemble forecasting methods and adjoint techniques to make a priori estimates of the distribution, magnitude, and sensitivity of forecast skill measures to upcoming analysis error, so that programmable observation platforms, such as unmanned aerial vehicles or programmed deployment of dropsondes from commercial aircraft, can be directed to focus on sensitive regions. Adaptive observational strategies may serve to help optimize observations in aid of numerical weather prediction.
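The value of a data denial experiment can be conveyed with a deliberately simple toy: a signal observed at an upstream "oceanic" site is either used or withheld when forecasting a downstream point. Everything here is synthetic and serves only to illustrate the bookkeeping of such experiments.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setup: an anomaly observed at an "upstream" oceanic site arrives at a
# "downstream" coastal forecast point one time step later.  All statistics
# are invented to keep the accounting transparent.
n = 2000
truth_upstream = rng.normal(size=n)
truth_downstream = np.roll(truth_upstream, 1)   # same signal, one step later
obs_noise = 0.3
obs = truth_upstream + obs_noise * rng.normal(size=n)

# One-step-ahead downstream forecasts:
#   with the oceanic observation: use it directly;
#   data denial: fall back on climatology (zero anomaly).
fcst_with_obs = obs
fcst_denied = np.zeros(n)

err_with = np.sqrt(np.mean((fcst_with_obs[:-1] - truth_downstream[1:]) ** 2))
err_denied = np.sqrt(np.mean((fcst_denied[:-1] - truth_downstream[1:]) ** 2))
print(err_with, err_denied)
```

The denied forecast's error equals the climatological variability, while the observation-fed forecast's error collapses to the observation noise; real OSSEs and data denial experiments quantify the same difference with full prediction systems.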
Tropical Cyclones

Landfalling hurricanes can have catastrophic societal impacts in terms of loss of life and property near the U.S. coastline. Hurricanes have accounted for more than $40 billion in damage and costs to the U.S. economy since 1980, and more than 200 deaths. In 1992, Hurricane Andrew alone caused about $25 billion in damage and costs, with 58 lives lost, and was the single costliest natural disaster in the history of the United States.
Detection of an incipient tropical cyclone is generally made by geostationary weather satellite. Although satellites are also used to monitor the evolution of a cyclone, errors from these remote sensors can involve as much as tens of miles in position and tens of knots in wind speed. Measurements from reconnaissance aircraft, coastal radars, ships, buoys, and land stations provide additional sources of data.
The track of a tropical cyclone is determined primarily by the environmental flow in which it is embedded. The internal structure of the cyclone and its interaction with the environment are also important for track and intensity prediction. For accurate forecasts, detailed measurements are required on scales ranging from those of the large-scale environment to the cyclone's small inner-core structure. As noted in a recent American Meteorological Society policy statement (AMS, 1993), however, "The present reconnaissance aircraft fleet and
weather satellite information cannot provide the full three-dimensional data required for hurricane track forecasting. Omega dropwinsondes deployed from the aircraft can provide wind, temperature, and moisture information from flight level to the surface, and have been shown to have a positive impact on track forecast models. The aircraft are relatively slow, however, and the information derived from the sondes does not cover the important region above flight level. The remote-sensing satellite data are limited in accuracy and coverage, particularly at the critical middle-tropospheric levels."
More accurate tropical cyclone forecasts and warnings require that improved understanding of basic physical processes and improved depictions of the hurricane and its environment be incorporated into forecast models. Skillful forecasts of hurricane track and intensity require simultaneous, accurate prediction of multiple scales of motion ranging from several thousand kilometers (which determine motion) to several kilometers (which represent intensity).
Research with barotropic models, representing the depth-averaged flow that steers storms, has improved understanding of the mechanisms that influence motion, including effects due to interactions with the environment. Skillful operational track forecasts have been achieved using a barotropic model. The effects of vertical shear on motion have been investigated with baroclinic models, representing the full three-dimensional structure of the hurricane and its environment. The application of initialization schemes that include a synthetic representation of a hurricane has demonstrated the potential for substantial improvements in track prediction. It has also been demonstrated that track forecast improvements of 20 percent or more result from the addition of supplemental environmental observations, including Omega dropwinsondes. Field experiments in the western Pacific have studied the environmental factors, including interactions with mesoscale convective complexes, that influence tropical cyclone motion. We are now in a position to use advanced numerical models to make a priori estimates of the potential benefits of new observing systems for hurricane track forecasts, and the application of objective adaptive observation strategies may be particularly beneficial in the case of hurricane forecasting.
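The role of a barotropic (depth-averaged) steering flow in track prediction can be sketched as follows. The flow field, beta-drift correction, and initial position are invented for illustration; real barotropic track models solve the vorticity equation for the evolving environmental flow rather than prescribing it.

```python
import numpy as np

def steering_flow(lat):
    # Idealized depth-averaged environmental flow (m/s): easterlies at low
    # latitudes weakening toward the subtropics.  Values are assumptions.
    u = -6.0 + 0.45 * (lat - 15.0)   # zonal component
    v = 1.0                          # weak poleward component
    return u, v

# Hypothetical Atlantic initial position (degrees east, degrees north).
lon, lat = -50.0, 15.0
beta_drift = (-1.0, 1.5)             # m/s; constant northwestward correction

dt = 6 * 3600.0                      # 6-hour step
m_per_deg = 111.0e3                  # meters per degree of latitude
track = [(lon, lat)]
for _ in range(20):                  # 5-day forecast
    u, v = steering_flow(lat)
    u += beta_drift[0]
    v += beta_drift[1]
    lon += u * dt / (m_per_deg * np.cos(np.radians(lat)))
    lat += v * dt / m_per_deg
    track.append((lon, lat))

print(track[-1])                     # forecast position after 5 days
```

Even this caricature reproduces the familiar westward-then-poleward drift of a storm carried by its environment, which is why accurate observation of the synoptic-scale environmental flow pays off so directly in track forecasts.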
Here, perhaps more than anywhere, the use of hurricane forecast models in suitably designed observing system simulation experiments could delineate a superior mix of observations for accurate forecasts of tropical cyclone movement. There can be little doubt, however, that improved measurements of the synoptic environment of hurricanes offer perhaps the best opportunity for improved forecasts that would reduce loss of life and property damage. Platforms that should be considered in estimating an optimal mix of data sources include satellite-borne sea surface scatterometers, the Special Sensor Microwave/Imager (SSM/I), passive water vapor measurements, GPS-based temperature and water vapor profiles, and active radar and Doppler lidar systems, as well as in situ and dropwinsonde measurements from manned and unmanned aircraft.
At present, forecasters show little if any skill in hurricane intensity prediction. Current research on intensity prediction indicates that physical processes in the hurricane boundary layer and in the upper-tropospheric outflow layer have a strong controlling influence on intensity changes. High-altitude regions are, unfortunately, those in which observations and understanding have been lacking. High-altitude (15-20 km) research aircraft are essential for making the measurements that will allow understanding and forecasting of hurricane intensity and structure change caused by environmental interactions. Modeling studies demonstrate that hurricane intensity is very sensitive to the ratio of the coefficients governing the exchanges of heat and momentum at the sea surface, but almost nothing is known about the nature of these exchanges at high wind speeds. Understanding the exchange of heat and momentum between the air and sea at high wind speeds is therefore important for understanding and predicting hurricane intensity.
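The sensitivity to the exchange-coefficient ratio can be illustrated with a simplified potential intensity scaling of the kind used in such modeling studies; the functional form shown and all numerical values are assumptions for illustration only.

```python
import numpy as np

def potential_intensity(Ts, To, dk, ck_over_cd):
    # Simplified Emanuel-style potential intensity scaling (assumed form):
    #   V_max^2 = (C_k / C_d) * (Ts - To) / To * dk,
    # with Ts the sea surface temperature (K), To the outflow temperature (K),
    # and dk the air-sea enthalpy disequilibrium (J/kg).
    return np.sqrt(ck_over_cd * (Ts - To) / To * dk)

# Plausible (assumed) tropical values: warm SST, cold outflow near the tropopause.
Ts, To, dk = 300.0, 200.0, 10.0e3
for ratio in (0.5, 1.0, 1.5):
    print(ratio, potential_intensity(Ts, To, dk, ratio))  # m/s
```

Because the maximum wind scales with the square root of the exchange-coefficient ratio, even modest uncertainty in the high-wind-speed values of these coefficients translates directly into intensity forecast uncertainty.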
The mesoscale and convective characteristics of a hurricane, including eyewall and spiral rainbands, are being studied. The importance of concentric eyewalls and their associated secondary wind maxima in influencing the short-term evolution of some intense hurricanes has been established, but the basic physical mechanism of this phenomenon has not been conclusively identified. The role of air-sea interactions, including the controlling influence of sea surface temperature, on cyclone intensification is being elucidated. Intensity prediction using a statistical regression model has highlighted the importance of sea surface temperature as a cap on hurricane intensity. Cooling of the ocean surface owing to the passing of a hurricane has been shown to moderate the intensification of the hurricane, but great uncertainty remains about the physics of entrainment of cold water into the ocean mixed layer. Research on this aspect of tropical cyclone physics would profit from better measurements of the ocean response during and after passage of tropical cyclones.
Although the axisymmetric dynamics of hurricane evolution are reasonably well understood, the asymmetric interactions with the environment that influence a storm's intensity are just beginning to be established. Current research emphasizes the importance of upper-tropospheric interactions in modulating storm development. Intensity forecasts with dynamical prediction models show considerable promise but are still at an early stage of development. Data deficiencies, in both the hurricane's inner core and its environment, particularly at upper levels, limit the skill of these models. Innovative numerical techniques are being applied to the development of more accurate prediction in the context of a multinested model. Doppler radar measurements from aircraft are being used to deduce inner-core structure, and satellite imagery is being used to infer storm-associated rain rates.
Earlier stages of tropical cyclone development are also being investigated, from both a dynamical and a statistical perspective. The importance of upper-level influences on tropical storm genesis is being studied from the potential vorticity perspective. The factors that determine the evolution of an incipient
easterly wave disturbance to a tropical storm have been the subject of a field experiment in the eastern Pacific. Forecasts of the number of Atlantic hurricanes that will form in a given hurricane season are being made on an operational basis. Especially strong relationships have been found between seasonal hurricane frequency and the El Niño/Southern Oscillation as well as western African Sahelian rainfall. Studies of environmental factors that determine the character of the hurricane season have established the prominent role of vertical shear in modulating the numbers of storms that develop.
Finally, landfalling tropical cyclones often cause major inland flooding and associated damage and loss of life. A recent example was the extensive flooding in central Georgia resulting from the stalled remnants of tropical storm Alberto in the summer of 1994. Too little is known about the dynamics of landfalling tropical storms, so prediction remains problematic. We strongly encourage enhanced research on the dynamics of landfalling tropical cyclones. Perhaps research aircraft could be directed toward an investigation of landfalling tropical cyclones.
There has been considerable progress in the basic understanding of moist convection in the atmosphere over the past two decades. This research is beginning to result in improved forecasts ranging from "nowcasts" of severe weather to short-term climate variability.
The late 1970s saw the first numerical simulations of severe, "supercell" convection that bore a strong resemblance to observed severe storms. Advances in supercomputing permitted spatial resolutions of 1 km in the horizontal and 500 m in the vertical, not fine enough to resolve the outer inertial range in ordinary cumulus cells but evidently fine enough to resolve the exceptionally strong, cloud-scale drafts in supercells. Simulations of violent convection have advanced recently to the point that circulations resembling tornadoes are resolved.
By the early 1980s, it became apparent that supercell convection might be predictable on a time scale of many hours. Simulations of a well-observed supercell cluster in Oklahoma captured the trajectory and new formation of many elements of this cluster. Although explicit prediction of actual storm location may not always be possible using initial conditions that precede storm development, real pre-storm thermodynamic and wind soundings may be used to predict the form of convection and its general movement. Such information is already being used by the Center for the Analysis and Prediction of Storms (CAPS) and Cooperative Program for Operational Meteorology, Education, and Training (COMET) to help operational weather forecasters predict severe weather.
Numerical simulations and theoretical developments have also helped us understand the dynamics of squall lines and identify the environmental conditions conducive to this form of convection. Here it has been recognized that the
dynamical interaction between the surface cold pool formed by cold, evaporation-driven downdrafts and the large-scale shear flow, as well as the convection itself, are key parts of the dynamics of precipitating convection.
There have been several comparatively successful simulations of mesoscale convective systems (MCSs), using parameterized convection. Some of the synoptic-scale circumstances conducive to MCSs have been identified, and theories for their formation and maintenance have been proposed, but only recently have these begun to be rigorously tested against observations or numerical simulations using explicit convection. We feel that a basic understanding of MCSs has not yet been achieved.
Although the dynamics of tornadoes are only now being revealed through numerical experiments and through observations by storm chasers (Doppler radar and high-quality videos), there has been real progress in warnings thanks to a combination of trained storm spotters and Doppler radar.
Although real progress has been achieved in detection and improved warnings of severe thunderstorms, microbursts, and tornadoes, additional problems remain. Data provided by the demonstration wind profiler network, local mesonetworks, and the Geostationary Operational Environmental Satellite (GOES) 8 and 9; a plethora of numerical guidance produced by more frequent update cycles of mesoscale numerical weather prediction (NWP) models; and a "firehose" of observed and derived information from a network of Weather Service Radar 1988 Doppler Weather Radar Systems (WSR-88Ds) are combining to swamp forecasters' capacity for ingesting storm-scale data, extracting relevant information, and producing more skillful warning decisions. Moreover, there is insufficient knowledge about what is actually happening (or is likely to happen) at the Earth's surface where people live. For example, mesocyclones indicated by radar do not necessarily indicate tornadoes on the ground. Even when the mesocyclones are real, less than half are associated with tornadic thunderstorms.
Of continued practical concern and great research interest is the phenomenon of intense, short-lived downdrafts in convective storms; these are usually referred to as downbursts or microbursts and pose a great threat to aircraft as they approach or leave an airport. Most recently, in the summer of 1994, a USAir jet crashed on approach to Charlotte, North Carolina, probably because of such an event. Owing to their very short duration and to the fact that strong divergence may be limited to a few hundred meters above the surface, microbursts are more difficult to observe with Doppler radar than are tornadoes. Although the advent of NEXRAD and in situ wind shear detection systems at airports will no doubt improve warnings of microbursts, the phenomenon remains an outstanding research challenge and important focus of the warning system.
Better understanding of cloud microphysical processes offers improved understanding and perhaps prediction of convection. In all cases this will require better in situ measurements of cloud microphysical properties, particularly in ice clouds, and the application of remote sensing techniques such as polarimetric
radar. Cloud microphysical processes cannot be fully understood without high-resolution wind data having adequate time continuity. Only by mapping the three components of air velocity over the full extent of the storm (including the weakest echo regions and clear air) can the microphysical processes be placed in dynamical context, entrainment processes be better documented, and the interactions between downdrafts and microphysics be determined.
Another scientific issue, which arises in many contexts besides this one, is the dependence of storm initiation and evolution on distributions of atmospheric water vapor. We stress that water vapor, unlike temperature, pressure, and wind, is not constrained by dynamics to vary slowly on the scale of the deformation radius; what evidence exists from aircraft and satellite observations shows significant small-scale structure, even in clear air. Many areas of meteorology will benefit from improved strategies for measuring atmospheric water vapor.
In the present context, the initiation and evolution of convection depend strongly on distributions of water vapor in the subcloud layer, and the dynamics of convective storms is sensitive to the evaporation associated with turbulent entrainment of dry air and with falling precipitation, both of which depend on environmental humidity. Storms often undergo marked transitions when they encounter changing environmental moisture. Yet existing means of characterizing the distribution of atmospheric water vapor are grossly inadequate, where they exist at all. First-order improvements in both the quality and the quantity of atmospheric water vapor measurements will be necessary. Some water vapor information can be obtained from satellites, but integral measures, such as precipitable water, are of limited utility.
Another area of opportunity involves boundary layer and land surface properties, particularly soil moisture. There is increasing evidence that the evolution of the planetary boundary layer over land is strongly influenced by the distribution of soil moisture, through its effect on the temperature and moisture of overlying air, but routine measurements of soil properties are seriously inadequate. We believe that an enhanced research effort on land-atmosphere interaction involving increased collaboration between atmospheric scientists and hydrologists, together with first-order improvements in our ability to routinely characterize soil properties, may lead to dramatic improvements in the prediction of convective storm initiation.
On small time and space scales, especially over continents, convection can be regarded as a local release of accumulated conditional instability. On larger time and space scales, particularly over oceans, convection can be usefully viewed as a rather special form of turbulence, which can be considered to be in a form of statistical equilibrium with its environment. This equilibrium seems to be characterized by an approximate balance between the rate of creation of potential energy by large-scale processes such as upward motion, radiative cooling, and surface fluxes, and the dissipation of kinetic energy within convective cells. Modeling of convection has historically focused on local release of existing
instability, but recent work has taken advantage of increased computational power to explicitly simulate ensembles of convective clouds in statistical equilibrium with applied forcing. Although it is too soon to be certain where the ensemble approach will lead, it is vital for obtaining a quantitative understanding of the interaction of convection with large-scale flows and for improving our ability to represent convection in large-scale models. It has become clear that climate models are sensitive to the way convection is parameterized and, in particular, the way its effect on atmospheric water vapor is represented. More effort has to be directed at the problem of obtaining a high-quality data set for evaluating representations of cumulus convection. Much may be gained from a synthesis of the understanding of cloud-scale dynamics with ensemble behavior of atmospheric convection.
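The equilibrium described above can be written schematically (in our own notation, for orientation only) as a column-integrated balance between kinetic energy generation by convective buoyancy fluxes, fed by the potential energy that large-scale processes create, and dissipation within the cells:

```latex
% Schematic statistical-equilibrium energy balance for an ensemble of
% convective clouds: \overline{w'b'} is the convective buoyancy flux
% (conversion of potential energy supplied by large-scale forcing into
% kinetic energy), \varepsilon the dissipation rate per unit mass, and
% z_t the depth of the convecting layer.
\int_0^{z_t} \rho\,\overline{w'b'}\;dz \;\approx\; \int_0^{z_t} \rho\,\varepsilon\;dz .
```

Explicit cloud-ensemble simulations test how closely, and on what averaging scales, such a balance actually holds.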
Seasonal forecasting occurs at the boundary between mostly deterministic forecasts at a time scale of 1 to 10 days and climate prediction at time scales of more than several years. Although notable success has been achieved in seasonal predictions of weather associated with specific phenomena such as El Niño, it is not yet clear to what extent it might be possible to make seasonal predictions that show some skill or what mix of deterministic and statistical methods should be brought to bear on the problem.
Several outstanding issues must be addressed in connection with seasonal prediction:
1. How far forward can ensemble techniques be pushed? Traditional application of predictability theory has been based on exponentially growing instabilities in the atmosphere, but recent ideas on algebraic growth of atmospheric disturbances give some hope that quasi-deterministic techniques might be pushed further. Scientists at the European Centre for Medium Range Weather Forecasts recently showed that the interval of validity of deterministic forecasts can be extended in practice to equal that of the best forecast in a series. There is also some hope that deterministic prediction of longwaves might be successful even after the decay of shortwave predictability.
2. Better understanding of low-frequency modes of the coupled atmosphere-ocean and atmosphere-land systems, along with better measurements of the land surface component of the system, might lead to much improved seasonal predictions. Examples showing definite seasonal forecast skill include weather anomalies associated with El Niño and seasonal forecasts of Atlantic hurricane activity based on long-period fluctuations such as El Niño and the quasi-biennial oscillation as well as land surface conditions in sub-Saharan Africa. Sea ice and snow cover on land may also prove to be significant components of the coupled system on seasonal time scales.
3. The influence of high-frequency but extreme events on low-frequency coupled atmosphere-ocean and atmosphere-land surface phenomena must be addressed. For example, a hurricane moving into a region experiencing drought may end the drought by changing the soil moisture distribution.
4. The degree of seasonal predictability is likely to depend on the initial conditions. Some seasonal forecasts may be susceptible to small perturbations in initial conditions, whereas others are not. The degree of fragility must be quantified so that confidence bounds can be placed on seasonal forecasts.
5. The sensitivity of the nonlinear global ocean-land-atmosphere system to small perturbations in boundary conditions is likely to be linear in perturbations, and this sensitivity can be probed by observing the system's response to naturally occurring fluctuations. One way of proceeding is to use the fluctuation-dissipation relation to find the equivalent transfer function. A seasonal prediction model should be consistent with the observed fluctuation-dissipation relation.
6. External influences on short-term climate change have to be better understood. These include small fluctuations in solar output and volcanic eruptions.
7. Observing systems that support seasonal forecasting must be global. The extent to which soil properties can be observed adequately by satellite is uncertain at this time.
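The fluctuation-dissipation idea in item 5 is often written (for a quasi-Gaussian system, following standard arguments; the notation here is ours, for orientation) as an estimate of the mean response to a small steady forcing built entirely from the lag covariances of naturally occurring fluctuations:

```latex
% Fluctuation-dissipation estimate of the linear response operator:
% C(\tau) is the lag-covariance matrix of the unforced system's state
% anomalies, and L plays the role of the equivalent transfer function.
\delta\langle \mathbf{x} \rangle \;\approx\; \mathbf{L}\,\delta\mathbf{f},
\qquad
\mathbf{L} \;=\; \int_0^{\infty} \mathbf{C}(\tau)\,\mathbf{C}(0)^{-1}\,d\tau,
\qquad
\mathbf{C}(\tau) \;=\; \big\langle \mathbf{x}(t+\tau)\,\mathbf{x}(t)^{\mathsf T} \big\rangle .
```

A seasonal prediction model consistent with the observed fluctuation-dissipation relation would, in this framework, reproduce the observed $\mathbf{C}(\tau)$ and hence the observed sensitivity to boundary perturbations.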
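The contrast drawn in item 1 between exponentially growing instabilities and algebraic (transient, nonmodal) growth can be illustrated with a toy two-variable linear system; the matrix below is purely illustrative and is not a model of the atmosphere.

```python
import math

# Toy linear system dx/dt = A x with A = [[-1, 5], [0, -2]].
# Both eigenvalues (-1 and -2) are damped, so there is no modal
# (exponential) instability; yet the nonnormal coupling (the 5)
# produces transient growth of the perturbation norm.

def propagate(x0, t):
    """Apply exp(A t) analytically for the upper-triangular A above."""
    e1, e2 = math.exp(-t), math.exp(-2.0 * t)
    # exp(A t) = [[e^-t, 5(e^-t - e^-2t)], [0, e^-2t]]
    return (e1 * x0[0] + 5.0 * (e1 - e2) * x0[1], e2 * x0[1])

def norm(x):
    return math.hypot(x[0], x[1])

x0 = (0.0, 1.0)  # unit-norm initial perturbation
growth = {t: norm(propagate(x0, t)) for t in (0.0, 0.5, 5.0)}
# the norm rises above 1 at intermediate times, then decays
```

Disturbances of this kind grow for a while even in a flow with no exponential instabilities, which is why nonmodal ideas offer hope of extending quasi-deterministic prediction.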
Contrasts in surface features often lead to geographic contrasts in weather patterns. A prominent and extreme example is associated with the land-sea boundary of Florida. Unlike in many regions exhibiting land surface heterogeneities, in Florida the physical mechanisms responsible for contrasting weather patterns are relatively well understood. New observing capabilities open the possibility of significantly improving the understanding of physical processes that control regional climate. Advances in understanding these processes will be of utility both for weather forecasting and for assessing the potential effects of changing climate.
Soil moisture has played a prominent role in research concerning land surface effects on weather because it is the most dynamic component of the land surface and because of its central role in flash-flood hydrology. The link between soil moisture and flash-flood hydrology arises principally in determining the partitioning of rainfall between infiltration of the soil and surface runoff. This partitioning is the major land surface control of flash floods. Links between soil moisture and precipitation processes are an important area of future research and one for which significant advances in weather forecasting are possible. Preliminary studies using the forecast model of the European Centre for Medium Range Weather Forecasts indicate the importance of soil moisture representations for heavy-rainfall forecasting.
Properties of the land surface, especially soil moisture content, exercise an
important control on the thermodynamics and water vapor content of the overlying atmosphere. This may have great importance for a variety of issues, ranging from quantitative precipitation forecasting to seasonal climate anomalies. Progress in understanding these issues will require far better measurements of soil properties.
Topographic features warrant special consideration in the context of heavy-rainfall and flash-flood forecasting. Virtually all of the record and near-record rainfall and flash-flood events in the United States have been linked to distinctive topographic features. Common themes that emerge from diagnostic studies of heavy-rain events concern the role of topography in maintaining quasi-stationary storm systems and in sustaining anomalously large moisture flux to storm systems.
Orographic Effects on Weather
The Earth's terrain is known to cause or modify many types of atmospheric phenomena, including the following:
• topographically enhanced rain and rain shadows,
• torrential rain and flash floods,
• forest fire storms,
• shear lines controlling tornado formation,
• sheltering of lee-side locations from strong winds,
• severe downslope and channel winds,
• gravity waves that remotely interact with larger-scale flows,
• cold air damming,
• modification of fronts and cyclones,
• diurnal control of thunderstorms,
• valley pollution and long-range pollution transport, and
• clear-air turbulence.
There are three fundamental difficulties facing researchers and practitioners dealing with meteorology in mountainous areas:
1. The Continuous Scales of the Earth's Topography: Atmospheric scientists have traditionally divided the Earth's terrain into two categories: large-scale mountains and small-scale roughness. The airflow disturbance generated by large-scale mountains has been analyzed explicitly, whereas the small-scale roughness has been parameterized. This division is physically inappropriate. The Earth's orography actually has a continuum of scales with no natural dividing scale. Even as the resolution of numerical models has improved from 400 km to 100 km to 25 km or less, an artificial division between resolved and unresolved orographically generated phenomena has remained. Furthermore, there is a partially resolved range of scales near the grid size of the numerical model. These partially resolved scales cannot be treated accurately by parameterization or by direct computation. Among other things, the internal waves excited by flow over topography often break in the troposphere and lower stratosphere, providing a net drag on the large-scale flow. Weather prediction models prove sensitive to the way in which this is formulated, and it is clear that progress in basic research on flow over topography with a continuum of scales is necessary before internal wave breaking can be adequately represented in models.
Over the next two decades, the issue of terrain scale will provide challenges for the theoretician and the numerical modeler. The improved models will begin to capture the horizontal topographic scales of 100 km down to 1 km that contain the gravity wave spectrum of the Earth's atmosphere. The interaction of gravity waves with the larger scales of flow (scales that are already resolved) will bring new physical and numerical problems into our research and applications.
2. Predictability and Triggering: The question of whether atmospheric phenomena are more or less predictable in mountainous terrain is now thought to have a double answer. On the one hand, terrain can anchor flow systems in both space and time. On the other hand, mountain airflow patterns exhibit their own instability and triggering characteristics. Slight changes in ambient wind speed, wind direction, or wind shear can lead to sudden reorganization of the airflow and precipitation patterns. For this reason, ensemble forecasting and probabilistic methods will be useful in problems related to orographic influence.
3. Model Development and Verification: The numerical simulation of mountain-induced mesoscale phenomena has advanced enormously over the past two decades. The current interest in this subject and the predicted advances in computer technology suggest that the field will continue to move ahead. There remain, however, fundamental questions about numerical techniques and surface boundary conditions. The choice of vertical coordinate in numerical models will continue to be discussed, especially in relation to the diffusion of moisture and the applicability of small-scale parameterization schemes. The degree to which surface roughness and evapotranspiration should be included in mountain-flow models will require further examination.
Although there is growing confidence that high-resolution numerical models can accurately describe mesoscale orographic phenomena, there is less confidence in our ability to verify model output against real data. The problem has to do with the wide spectrum of topographic scales and the full four dimensionality (space and time) of orographic airflow fields. The application of existing measurement technology and new observational tools will be required for evaluation of models in mountainous regions.
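As a point of reference for the drag problem raised under item 1, linear theory for steady hydrostatic flow over a single two-dimensional bell-shaped ridge, $h(x) = h_m a^2/(x^2 + a^2)$, gives a wave drag per unit length of ridge of

```latex
% Linear hydrostatic mountain-wave drag for a 2-D bell-shaped ridge:
% \rho_0 = surface air density, N = buoyancy frequency,
% U = cross-ridge wind speed, h_m = maximum ridge height.
D \;=\; \frac{\pi}{4}\,\rho_0\, N\, U\, h_m^{2} .
```

This is a textbook linear result; the parameterization difficulty discussed above is precisely that real orography is a continuum of such scales rather than a single ridge, and that wave breaking takes the problem outside linear theory.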
Fire Weather Prediction
Forest fires create large losses of timber, property, and sometimes life
throughout the United States, particularly in the West. Weather information and forecasts on a variety of time scales are a vital component of fire forecasting and management, both before and after fires have begun. Current forecast models provide reasonable guidance for predicting areas at risk for fires. If the forest is dry, the synoptic meteorological conditions favoring fire formation are low relative humidity, high temperature and winds, and thunderstorms creating cloud-to-ground lightning with little precipitation. Overall improvements in short-range forecast accuracy will also better identify areas of fire risk.
However, the greatest opportunity for scientific progress in the next ten years may lie in developing and refining computer models of fire spread in mountainous terrain, which can be used to develop an optimal control strategy for a given fire. The meteorological component of such a model is a non-hydrostatic airflow model with a relatively fine horizontal resolution of less than 1 km, which is used to simulate the immediate vicinity of the fire. This model could be nested in a mesoscale model that provides appropriate larger-scale boundary conditions for the fire-affected area itself. Such models are already well developed. The scientific challenge is to integrate this with a fire-spread model that uses meteorological conditions to predict local fire spread and fuel burn rate, the heat from which modifies the airflow around the fire. There have been promising pilot demonstrations of this idea in which many features of an observed fire were qualitatively simulated, but current fire-spread models are very primitive and must be improved to take full advantage of this modeling approach. With current computer technology, it is becoming possible to run such a model in near real time on a portable computer. This could aid in deciding where to deploy fire fighters or drop fire retardant and could minimize the risk to personnel. Even in control burns that are set deliberately to ameliorate later fire hazards, lives are sometimes lost when a burn goes awry. Prior modeling can be a valuable adjunct to the heuristic rules and human judgment generally used in these cases, owing to the complexity of the airflow that often develops.
Larger-scale mesoscale models can also be used to predict the distribution of smoke plumes from fires. Occasionally, such smoke may be thick enough to significantly affect the temperatures over a large area, mainly by reducing incoming solar radiation. Conceivably, this effect could be included in a forecast model.
Ensemble prediction involves generating multiple forecasts with a forecast model from a set of perturbed initial conditions. Perturbations can be produced by a variety of methods, including time lagging and "breeding," a method in which the most rapidly growing modes are naturally selected in the forecast-data
assimilation cycle, and the use of model adjoints to generate particularly rapidly growing perturbations. Two principal benefits result from ensemble forecasting: the spread between members of the ensemble gives a quantitative estimate of uncertainty in the numerical forecast, and as it turns out, the average of all members of the ensemble is statistically a better forecast than any single member. Although many additional conceptual and practical questions must be considered, ensemble prediction is applicable also to shorter-range forecasting with regional models.
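A breeding cycle can be sketched with any nonlinear model; here a one-variable logistic map stands in for the forecast model, and the function names and rescaling amplitude are illustrative choices of ours.

```python
def model(x):
    """Stand-in nonlinear 'forecast model': one step of the logistic map."""
    return 3.9 * x * (1.0 - x)

def breed(x0, eps=1e-3, cycles=50):
    """One-variable breeding cycle.

    Run a control and a perturbed state forward together; after each
    cycle, record the amplification of their difference and rescale it
    back to amplitude eps.  Repeated cycling selects the locally
    fastest-growing (bred) perturbation.
    """
    ctrl, pert = x0, x0 + eps
    growth = []
    for _ in range(cycles):
        ctrl, pert = model(ctrl), model(pert)
        diff = pert - ctrl
        growth.append(abs(diff) / eps)                 # per-cycle amplification
        pert = ctrl + eps * (1 if diff >= 0 else -1)   # rescale to eps
    return ctrl, growth

ctrl, growth = breed(0.2)
```

In a full NWP system the state is a model field rather than a scalar, but the cycle of perturb, integrate, and rescale is the same.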
Data Assimilation and Adaptive Observations
Data assimilation combines the information in observations with an atmospheric prediction model to provide the best possible estimate of atmospheric state. In the past few years, there have been substantial advances in the theory and practice of data assimilation. These advances can be attributed to improvements in four basic components: the forecast model, the data base, quality control techniques, and analysis or assimilation techniques. The improvement in the forecast models and data base is outside the area of this Disciplinary Assessment, but it should be noted that any improvement in these components immediately results in improvement of the data assimilation system.
Since instruments do not work perfectly and data are collected through a number of different paths (some still using manual means of transmission), data can contain errors. Bad data can cause problems with data assimilation systems. For this reason, it is necessary to perform some type of quality control to eliminate or correct large errors in the data. In most assimilation systems, the observational differences from the model prediction are compared to nearby observational differences interpolated to the observation location. Quality control decisions are based on these differences. At the National Centers for Environmental Prediction (NCEP), for example, a complex quality control system has been developed that, in addition to accepting or rejecting data, corrects some of the observations for common types of errors.
Most operational data assimilation schemes use an intermittent data assimilation technique in which the model is integrated forward for some period of time and then, based on available data, is adjusted using a three-dimensional objective analysis technique. The technique of three-dimensional objective analysis is most commonly some form of optimal (or statistical) interpolation. With the development of variational techniques to solve the analysis problem, many of the approximations contained in optimal interpolation can be eliminated. The advantages of these schemes include the elimination of data selection, the inclusion of more physically realistic constraints, and the easy inclusion of additional data types. As a result of these changes, the independent initialization step can be eliminated and observations such as radiances, refractivities, and scatterometer measurements can be directly incorporated in the analysis system. An even more
promising approach is four-dimensional variational assimilation, in which optimization is performed in the temporal as well as spatial domains. With the increased understanding of the theoretical aspects of data assimilation over the past few years, many aspects of the future of data assimilation have become clear.
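The variational analysis referred to above minimizes, in its three-dimensional form, a cost function of the standard type

```latex
% Standard 3D variational analysis cost function: x_b is the background
% (model) state, B its error covariance, y the vector of observations,
% H the observation operator (e.g., a radiative transfer forward model
% for radiances), and R the observation-error covariance.  4D-Var sums
% the observation term over times within an assimilation window.
J(\mathbf{x}) \;=\; \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
\;+\; \tfrac{1}{2}\,\big(\mathbf{y}-H(\mathbf{x})\big)^{\mathsf T}\mathbf{R}^{-1}\big(\mathbf{y}-H(\mathbf{x})\big) .
```

Because $H$ can be any forward model, radiances, refractivities, and scatterometer measurements enter the analysis directly, without a separate retrieval or initialization step.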
Quality control will become even more important with the introduction of many new observation platforms and the observation and assimilation of new quantities. Complex quality control, which uses several independent quality checks in order to make a more robust decision, will continue to be improved, and an attempt will be made to salvage information from miscommunicated or improperly coded data.
The observing network will provide many new platforms to assimilate data from such devices as Doppler radar and new satellite sensors. To obtain the maximum amount of information from these data, it is desirable to use them in their most original raw form. Thus, the retrieval step that is currently performed with many data sets (e.g., temperatures and moisture retrieved from satellite measured radiances) can be eliminated, and the observed radiances used directly. To do this, it is necessary to have a high-quality forward model that transforms the model fields into the same form as the observations. This step of incorporating observed quantities directly in the analysis is vital for fully utilizing the data. However, significant effort is required to use properly each new type of data.
Assimilation systems of the future will also be required to include many new quantities (e.g., clouds, soil moisture, skin temperature, precipitation, ozone, other trace gases). To properly assimilate such quantities, it is necessary to incorporate them in the prediction model, develop the proper statistics, and include observations influenced by these quantities. All three of these steps will require substantial effort. In future systems, it is likely that diabatic processes will play a larger role. As the coupling between dynamics and physics becomes more important, the inclusion of more exact constraints will become necessary.
The final configuration of data assimilation systems of the future is not completely decided. It may be based on the extension of a three-dimensional variational system to a four-dimensional system or some approximation of a Kalman filter.
One exciting potential by-product of ensemble forecasting and data assimilation schemes is the concept of adaptive observations. Here ensemble techniques are used to identify, in a 12- or 24-hour forecast, regions of the atmosphere that are particularly sensitive to observational error, and/or adjoint techniques are used to estimate the sensitivity of a given forecast error measure to perturbations in these regions. Then programmable platforms (such as high-flying, dropsonde-equipped aircraft) are deployed to the regions. Experiments with low-order models show large potential increases in forecast skill from application of this technique. It may represent an optimal way of deploying limited observational resources and will provide a means of optimizing forecasts with respect to a chosen error measure, which may be local in some cases. Thus, we may be able
to choose, in a given meteorological circumstance, to make those observations that minimize errors in, for example, the 72-hour forecast of a violent storm over a populous region.
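The first step described above, locating sensitive regions from the ensemble, can be sketched as picking the grid point of largest across-member forecast variance; the function and fields here are illustrative (operational targeting would also use adjoint sensitivity).

```python
def target_region(ensemble):
    """Given forecasts from several ensemble members on a common grid
    (a list of equal-length lists), return the grid index with the
    largest across-member variance -- a stand-in for the spread-based
    step of identifying where extra observations would help most."""
    n = len(ensemble)
    npts = len(ensemble[0])
    best_i, best_var = 0, -1.0
    for i in range(npts):
        vals = [member[i] for member in ensemble]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        if var > best_var:
            best_i, best_var = i, var
    return best_i, best_var

# Members agree everywhere except near index 2, so dropsonde-equipped
# aircraft would be sent there:
members = [[1.0, 1.1, 5.0, 1.0],
           [1.0, 1.0, 1.0, 1.1],
           [1.1, 1.0, 3.0, 1.0]]
idx, spread = target_region(members)
```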
Applications of Advanced Computer Architectures
Research and operational numerical models are just beginning to be run on massively parallel processors (MPPs; see below). Several problems have to be solved before the application of MPPs becomes routine, however. A major problem is that of software for translating standard code into code that makes efficient use of MPP capability. Experience to date shows that codes written expressly for MPPs are very difficult to understand and to update, making them impractical for many applications.
It will be absolutely necessary to have a stable vendor environment for MPPs before full-scale development can proceed. The recent demise of several MPP vendors underscores the risks involved for operational NWP centers in these early stages of MPP development.
Parameterization of Physical Processes
Accurate prediction of the moisture field, including horizontal and vertical cloud distributions, is one of the most important items for both numerical weather prediction and climate forecasting. Recent studies have shown the importance of predicting the horizontal distribution of shallow clouds for the coupled ocean-atmosphere system. The distribution of upper-tropospheric moisture is an important component in the radiative heat budget, but one that is neither well observed nor well predicted by current models. It is clear that the prediction of moisture requires an accurate formulation of its sources and sinks, as well as extreme care in its advection by the model wind fields. A full treatment of model-parameterized processes is beyond the scope of this Disciplinary Assessment, but the current status of some of the most important components can be outlined briefly: cumulus parameterization, explicit prediction of atmospheric suspended liquid or ice concentration, and surface physics.
Current cumulus parameterization schemes can be classified into three basic types:
1. Adjustment schemes (e.g., Manabe-type moist convective adjustment and the Betts and Miller scheme)
2. Mass flux schemes (e.g., that of Arakawa and Schubert)
3. Moisture budget (convergence) schemes (e.g., Kuo-type schemes)
Recently, many operational centers have decided to use one of the mass flux schemes. Overall experience in testing cumulus parameterization schemes in
forecast models has indicated the importance of including (1) saturated downdraft effects, (2) the interaction of penetrative convection with boundary- and subcloud-layer mixing processes, and (3) cloud-radiative interactions. There is some indication from coupled atmosphere-ocean experiments that, perhaps owing to inadequacies in cumulus/boundary layer interactions, model-generated convective precipitation is less sensitive to changes in sea surface temperature than in nature. Evaluation of cumulus parameterization schemes is, however, difficult since cumulus convection interacts with many other physical processes that themselves may not necessarily be represented adequately in the model. Although tuning may ameliorate the most obvious problems, questions still remain about the theoretical foundation of cumulus parameterization.
The prediction of cloud liquid and ice water content is now being attempted in mesoscale and global forecast models. The interaction between cloud water and radiation is also being explicitly calculated. Preliminary results indicate improvements in many aspects of the forecasts, including the amount and location of precipitation, and cloud amounts. These improvements are being documented quantitatively using comparisons with satellite data.
More attention should be paid to accounting for cloud microphysical processes both in explicit clouds and in parameterized convective clouds. The water vapor content of the atmosphere is very sensitive to assumed cloud physical processes, and better prediction of water vapor content will be necessary for improvement in quantitative precipitation forecasts, longer-range numerical weather prediction, and climate simulation.
Considerable effort has been made to improve the parameterization of surface physics, particularly ground hydrology. These improvements include two-layer soil thermodynamics and hydrology with explicit evaporation, transpiration, and canopy intercept for latent flux estimates; improved surface exchange coefficients; and parameterizations for drainage, runoff, and snow cover. Results indicate improved forecasts of screen temperature and precipitation over land and, in general, improvements in the diurnal cycle over land.
A comprehensive program covering both numerical upgrades and a review and development of physical parameterizations will be necessary to take advantage of recent research developments and greater computer resources. Upgrades to physical parameterizations should be based on more refined theory, much better evaluation techniques, better model resolution, and affordability. In the near term, one may expect advances to occur in the above-mentioned areas of surface physics, cloud and cloud-radiation parameterizations, and the increasingly realistic simulation of phenomena associated with severe weather such as squall lines, outflow boundaries, and mesocyclones.
The nature of the typical weather forecast problem motivates the continued
search for accurate and cost-effective ways of making discrete approximations to the continuous equations. Much effort is currently being spent on developing two-time-level, semi-Lagrangian techniques, although the outcome of these efforts may be the development of a two-time-level Eulerian method that avoids the unattractive features originally encountered (e.g., instability, damping) while keeping its natural advantages (e.g., simplicity, easily enforced conservation). Another technique is to increase resolution over only those parts of the computational domain where it is required. The technique of automatically adjusting selective grid enhancement presents a promising avenue of exploration, since many significant weather phenomena are related to sharp horizontal variations of the meteorological fields (e.g., fronts and airmass contrasts across coasts) that form, move, and dissipate. A related recent development, which will continue, is the implementation of a single general model for almost all meteorologically relevant scales (i.e., from the planetary to the cloud scale). These unified models, run with selective grid enhancement, may allow simultaneous computation of nonhydrostatic clouds and/or breaking gravity waves on nested domains with enhanced resolution and of the large-scale flow on a planetary-scale domain. Other important numerical issues include the following:
One of the very recent advances in numerical techniques is the utilization of isentropic coordinates as the vertical coordinate in both global and regional models. Because isentropic coordinates are quasi-Lagrangian, they do not suffer from the problems usually associated with vertical differencing. On the other hand, they cannot be used readily in nonhydrostatic models, in which phenomena such as breaking internal waves may cause isentropic surfaces to overturn. This weighs against the use of isentropic coordinates in a unified model. A hybrid-coordinate approach has been proposed to overcome the lack of resolution near the ground and the problem of isentropic surfaces intersecting the ground. More investigation is needed into the feasibility of using such hybrid coordinates and determining whether there is an advantage to their use in numerical weather prediction and climate modeling. Some operational mesoscale forecast models use the eta coordinate, which is a variation of the sigma coordinate that remains relatively horizontal and uses step mountain topography. This permits a more accurate description of orography with rapidly changing slopes than some spectrally based topographic representations. With recent increases in computer power, mesoscale models can have a horizontal resolution of the order of 20 km with 50 layers in the vertical, which represents a major improvement over present models.
Although many forecast models still employ classical numerical schemes, the semi-Lagrangian approach of treating the dynamics has received much attention in recent years. This approach, when coupled with semi-implicit time integration, allows much longer time steps compared to those allowed by the traditional Courant-Friedrichs-Lewy stability criterion, minimizes phase errors and computational dispersion, and easily allows shape preservation. These considerations have led to the use of semi-Lagrangian formulations in many research and operational models. One major drawback of semi-Lagrangian methods is their lack of an a priori conservation guarantee, which is viewed by some as essential for climate modeling. Although further work is needed on this subject, a novel semi-Lagrangian approach has recently been proposed in which exact conservation of mass and any other scalar variables can be achieved. The semi-Lagrangian technique is also being applied to nonhydrostatic systems in both regional and global models. One unresolved problem with semi-Lagrangian schemes that will require future attention is the treatment of flow around steep orography when large time steps are being used.
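The stability advantage described above can be illustrated with a minimal one-dimensional sketch, assuming constant wind and simple linear interpolation (the grid, wind speed, and time step below are invented for illustration; operational models use more sophisticated interpolants):

```python
import numpy as np

def semi_lagrangian_step(q, u, dt, dx):
    """One step of 1-D constant-wind advection on a periodic grid:
    each grid point takes the (linearly interpolated) value of q at
    its departure point x_i - u*dt."""
    n = q.size
    dep = (np.arange(n) - u * dt / dx) % n   # departure points, index units
    j = np.floor(dep).astype(int)            # grid index left of departure point
    w = dep - j                              # linear interpolation weight
    return (1.0 - w) * q[j] + w * q[(j + 1) % n]

# Courant number u*dt/dx = 2.5: far beyond the Eulerian CFL limit (~1),
# yet the scheme remains stable.
n, dx, u = 100, 1.0, 1.0
dt = 2.5 * dx / u
x = np.arange(n) * dx
q = np.exp(-0.5 * ((x - 20.0) / 4.0) ** 2)   # Gaussian tracer blob
q0 = q.copy()
for _ in range(20):
    q = semi_lagrangian_step(q, u, dt, dx)
print(np.max(q))   # peak is slightly damped by interpolation, never amplified
```

With the constant wind used here the tracer sum happens to be conserved exactly, but for spatially varying winds no such guarantee exists, which is precisely the a priori conservation issue noted above.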
Ground- and Aircraft-Based Measurement Systems
In situ measurements from surface stations, ships, aircraft, and balloons remain the backbone of the global observing system in aid of weather forecasting. Improvements of this in situ measurement capability and of ground- and aircraft-based remote sensors offer the promise of much improved knowledge of the overall state of the atmosphere, which should lead to improved understanding of weather dynamics and physics and to improved forecasts.
In situ measurements have also proved valuable to the very short-range prediction problem. The State of Oklahoma recently funded a high-density mesonetwork of surface observing stations. Data from these stations have led demonstrably to improvements in short-term local and regional forecasts and to associated economic benefits, as well as improved public awareness and science education. Efforts are now under way to enhance the mesonet by adding profiler measurements and measurements of soil moisture, and to network the data through primary and secondary schools. The panel believes that such mesonets are a cost-effective route to much improved short-range regional and local forecasts, as well as better science education. Such mesonets may be funded ideally through public-private partnerships.
Some effort has been made to improve the technology of the rawinsonde. The next generation of balloon sounding will be more automated, perhaps requiring human attendance only weekly or monthly, and state variable sensors should be much improved. Tracking by the Global Positioning System (GPS) is just being developed for rawinsonde and dropsonde wind finding. Despite its obvious sampling limitations, balloon sounding remains a cost-effective means of sampling the atmosphere, and the deterioration of the global rawinsonde network is alarming.
Commercial air carriers offer another means of sampling the atmosphere, both at cruising level and during ascent and descent. These measurements offer
many advantages over balloon soundings, including much better spatial and temporal coverage and potentially low operating costs. One disadvantage compared to balloons is that sounding data are restricted to cruising altitudes and below. We strongly encourage observing system simulation experiments and data denial experiments aimed at determining the importance, or lack thereof, of routine measurements above standard aviation cruising altitudes.
Another means of obtaining atmospheric measurements is by remotely piloted aircraft. Unmanned aircraft have been used by the military for about 50 years; they played a vital role in reconnaissance and as decoys in the Yom Kippur War and in Desert Storm. Technological advances in low-Reynolds-number aerodynamics, propeller design, carbon fiber epoxy construction, and power plants now make it possible to build unmanned aircraft that can cruise at 18 km altitude for two or three days, carrying many hundreds of kilograms of payload, including GPS-based dropwindsondes. These aircraft may soon enable plentiful direct measurements up to the lower stratosphere, at relatively low cost. They are particularly well suited to obtaining soundings over the ocean and over sparsely populated land. They would be instrumental in observing or numerical forecast systems that make use of adaptive observations. A major hurdle to be cleared is the problem of coordinating unmanned aircraft operations with the air traffic control system.
A number of technological developments promise much-improved measurements of atmospheric water vapor. Differential absorption lidar (DIAL) operates by transmitting laser pulses in and slightly off a water vapor absorption band, and comparing the intensities of the received return. A major limitation of the technique is eye safety; this requires transmitting low-power and/or broad-beam pulses, necessitating integration of the return over relatively long periods to achieve a reasonably high signal-to-noise ratio. Current estimates place the minimum error of water vapor estimates by this technique at about 1 g/kg, with vertical resolutions of the order of 100 m and a maximum altitude of about 3 km. The technique cannot be used to retrieve water vapor profiles above cloud base. Even so, DIAL offers much improved sampling of lower-tropospheric water vapor.
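The retrieval principle behind DIAL can be sketched numerically: because range and receiver factors are common to the on-line and off-line returns, the ratio of returns from adjacent range gates isolates the differential absorption. In this minimal sketch the cross-section, density, and gate spacing are invented numbers chosen only to exercise the standard two-wavelength DIAL equation:

```python
import numpy as np

dsigma = 1.0e-26   # differential absorption cross-section, m^2 (invented)
dr = 100.0         # range-gate spacing, m (matches ~100-m vertical resolution)
n_true = 2.0e23    # water vapor number density, m^-3 (invented, constant)

r = np.arange(0.0, 3000.0, dr)                    # range gates up to ~3 km
r2 = np.maximum(r, dr) ** 2                       # avoid divide-by-zero at r=0
P_on = np.exp(-2.0 * dsigma * n_true * r) / r2    # on-line return: absorbed
P_off = 1.0 / r2                                  # off-line return: no absorption

# Standard DIAL equation: geometric and system factors cancel in the ratio,
# leaving only the differential absorption between adjacent gates.
n_ret = np.log(P_off[1:] * P_on[:-1] / (P_on[1:] * P_off[:-1])) / (2.0 * dsigma * dr)
print(n_ret[0])   # recovers the assumed density
```

In practice the logarithm of a noisy ratio is what drives the long integration times mentioned above: the weaker the eye-safe pulse, the longer the averaging needed before this ratio is trustworthy.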
Some information about atmospheric water vapor content can be obtained from the GPS. A single GPS receiver is capable of measuring the time delay between transmission and reception of the signal from one or more satellite-borne transmitters. This delay is due to electromagnetic effects in the ionosphere, the total atmospheric mass, and water vapor. The ionospheric delay can be corrected by using two different frequencies and comparing the two time delays, whereas the atmospheric mass component can be accounted for if surface pressure is known to an accuracy of better than about 0.3 millibar (mbar). The remaining delay is proportional to the vertically integrated water content. At any one time, about six GPS satellites are visible from a single location, so that the different elevation angles of the satellites can be used to make some inferences about the
vertical distribution of water vapor. The maximum vertical resolution from this technique is limited to about 1 km. Finally, a satellite-borne GPS receiver can be used to estimate vertical profiles of water vapor by observing the occultation of satellite-borne GPS transmitters, provided an independent evaluation of the vertical temperature profile is available. The water vapor content determined by this technique is effectively averaged over about 200 km horizontal distance, and vertical resolution is limited to about 1 km.
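The chain from measured delay to water vapor described above can be sketched numerically. The hydrostatic-delay coefficient (~2.28 mm per hPa of surface pressure) and the wet-delay-to-precipitable-water factor (~0.15) are typical textbook values, not taken from this report:

```python
def precipitable_water_mm(zenith_total_delay_mm, surface_pressure_hpa):
    """Convert a GPS zenith delay to precipitable water (all lengths in mm)."""
    zhd = 2.28 * surface_pressure_hpa   # zenith hydrostatic delay, ~2.28 mm/hPa
    zwd = zenith_total_delay_mm - zhd   # remaining "wet" delay
    return 0.15 * zwd                   # PW ~ 0.15 * ZWD (weakly T-dependent)

# e.g. a total zenith delay of 2,400 mm observed at 1,013 hPa:
pw = precipitable_water_mm(2400.0, 1013.0)
print(round(pw, 1))   # ~13.6 mm of precipitable water
```

This also shows why the ~0.3-mbar surface pressure requirement matters: a 0.3-hPa pressure error maps to roughly 0.7 mm of misattributed hydrostatic delay, or only about 0.1 mm of precipitable water.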
Vertical profiles of virtual temperature can be estimated using the radio-acoustic sounding system (RASS). In this technique, a vertically propagating sound pulse is tracked by radar; since the speed of sound is a function of virtual temperature, the latter can be deduced from the measured velocity of the sound pulse. Vertical resolution and maximum altitude are limited principally by the characteristics of the transmitter, and the data tend to be noisy.
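A minimal sketch of the RASS inversion, using the standard dry-air approximation c ≈ 20.05 √Tv m/s (Tv in kelvin) and ignoring the vertical air motion that a real system must correct for:

```python
def virtual_temperature_K(sound_speed_ms):
    """Invert the dry-air sound-speed relation c ~ 20.05 * sqrt(Tv)."""
    return (sound_speed_ms / 20.05) ** 2

# A pulse tracked by the radar at 347 m/s implies Tv near 300 K.
tv = virtual_temperature_K(347.0)
print(round(tv, 1))
```

Because the quadratic relation amplifies tracking errors, the noisiness of the radar-derived pulse velocity noted above translates directly into noisy temperature profiles.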
Satellite-Based Measurement Systems
Satellite data fill the space-time gaps within in situ systems more uniformly than data from any other observing system, although information is mostly limited to radiance integrals and cloud-top properties.
Satellite remote sensing will improve in several technical areas during the next decade. Passive microwave observations are promising for detecting precipitation, but the remoteness of geostationary positions presents a major obstacle that must be overcome. Active cooling and better infrared detectors can improve precision from one part in 100 to a few parts in 10,000. Pointing accuracy to the nearest pixel will be possible. Arrays of detectors could deliver "snapshots" of regions within a few seconds, and low-light sensors could deliver high-resolution cloud cover observations on a moonlit night. On-demand "skycam" operations could be directed by local forecast officers with immediate digital data delivery through commercial paths.
During the next decade, satellites will carry infrared spectrometers that can fully resolve the infrared spectrum, doubling the vertical resolution of temperature and moisture soundings to the theoretical limit for passive sensing.
Knowledge of the global wind field is widely recognized as fundamental to advancing the understanding and prediction of weather and climate. Several active sensing techniques can be used to detect atmospheric winds. One such technique is Doppler lidar, which operates much like Doppler radar in that signals returned to the receiver from distant targets are analyzed spectrally to recover the Doppler shifts imposed by the motion of the target. The short wavelengths (e.g., 9 µm) involved in lidar, however, mean that the targets can be much smaller than for radar and that, for a given target velocity, the Doppler shifts and signal bandwidths are much greater. For wind sensors, the targets are cloud particles or naturally occurring aerosols suspended in the atmosphere, which move at approximately the speed of the wind.
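The wavelength contrast can be quantified with the familiar two-way relation Δf = 2v/λ; the 10-cm radar wavelength below is an illustrative choice for comparison with the 9-µm lidar mentioned above:

```python
def doppler_shift_hz(radial_speed_ms, wavelength_m):
    """Two-way Doppler shift: delta_f = 2 * v / lambda."""
    return 2.0 * radial_speed_ms / wavelength_m

v = 10.0                                  # m/s, a representative wind speed
print(doppler_shift_hz(v, 0.10))          # 10-cm radar: 200 Hz
print(doppler_shift_hz(v, 9.0e-6))        # 9-um lidar: ~2.2 MHz
```

The same wind thus produces a shift four orders of magnitude larger at lidar wavelengths, which is what makes small aerosol targets spectrally detectable.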
Studies have concluded that tropospheric winds can be measured from space with current lidar technology. Successful experimental demonstrations of a 5-joule class, carbon dioxide (CO2) laser were conducted in the laboratory as part of design studies for the Laser Atmospheric Wind Sounder (LAWS) instrument.
Sea surface scatterometers can be used to reconstruct surface wind fields over the oceans. Scatterometers are absolutely calibrated radars that measure reflected signal strength from distributed targets. For given operational parameters (wavelength, incidence angle, and polarization), backscatter from the sea surface is primarily a function of the capillary wave spectrum, which is a direct measure of surface wind stress. This can be related to the surface wind vector. Thus, backscatter measured from several perspectives (as provided by an instrument in polar orbit and employing multiple fixed antennae) can be used to infer several possible averaged wind vectors over a portion of the ocean surface. Complementary data and continuity constraints may be used to select the most geophysically likely solution.
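The directional ambiguity described above can be illustrated with a toy harmonic model function. The functional form and coefficients here are invented purely for illustration; real geophysical model functions are empirical and include a weak cos(χ) term that partially breaks the upwind/downwind symmetry, which is exact in this sketch:

```python
import numpy as np

# Toy GMF: sigma0 = a0 + a2*cos(2*chi), chi = wind direction minus
# antenna azimuth. Coefficients are invented for illustration.
a0, a2 = 1.0, 0.45

def sigma0(wind_dir_deg, antenna_az_deg):
    chi = np.radians(wind_dir_deg - antenna_az_deg)
    return a0 + a2 * np.cos(2 * chi)

antennas = np.array([0.0, 45.0, 90.0])   # three fixed look angles
meas = sigma0(30.0, antennas)            # "observed" backscatter, true dir 30

# Brute-force retrieval: misfit between each candidate direction and the
# observations; local minima of the misfit are the ambiguous solutions.
cands = np.arange(0.0, 360.0, 1.0)
misfit = np.array([np.sum((sigma0(c, antennas) - meas) ** 2) for c in cands])
is_min = (misfit < np.roll(misfit, 1)) & (misfit < np.roll(misfit, -1))
print(cands[is_min])   # two ambiguous solutions, 180 degrees apart
```

The retrieval cannot distinguish the two minima from backscatter alone; this is why complementary data and continuity constraints, as noted above, are needed to select the most geophysically likely solution.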
Space qualification of the technique dates from the 1978 SEASAT (U.S. sea satellite) mission, although no successor was deployed until the launch of ERS-1 (the European Remote Sensing Satellite) in 1991. The ERS-1 scatterometer samples a swath from 200 to 700 km on one side of the satellite ground track with 50-km resolution using three antennae. A large number of studies comparing SEASAT and ERS-1 scatterometer wind data to those from National Oceanic and Atmospheric Administration (NOAA) buoys and objectively analyzed winds now exist. In the range of 2-3 m/s, scatterometer winds agree to within 2 m/s and 20° of other estimates, barring contamination from sea ice or precipitation. Moreover, assimilation of ERS-1 scatterometer wind data can improve operational weather forecasts, particularly of tropical cyclone formation and location. This improvement is strongest in the data-sparse Southern Hemisphere.
The algorithms for deriving winds and for flagging rain- or ice-contaminated data are empirical. Many groups continue to work to optimize these algorithms in order to provide the best possible wind data. In addition, new algorithms should be developed to extract secondary data products, for example, tracking sea ice to complement images expected from synthetic aperture radar.
In addition to the important scientific advances that would be achieved, there is substantial evidence that significant economic benefit to the nation would occur with the use of better wind data in operational weather forecasting. Two notable examples would be a reduction in fuel consumption by airlines achieved through more accurate wind forecasts in the upper troposphere, and improved hurricane track forecasts that would reduce the area of uncertainty. Satellite-based GPS systems can also be used to measure atmospheric water vapor, as described in the preceding section.
It is vitally important to undertake an analysis of the potential costs and benefits of various systems that have been proposed for enhancing atmospheric wind information. Here, more than anywhere else, we must be able to compare
the costs and benefits of existing and proposed systems that span many federal agencies without regard for the needs and goals of individual agencies. Once again, we stress the desirability of using models to make a priori estimates of the impact of new observing systems on forecasts, as one step toward devising an optimal combination of observing systems, where ''optimal'' includes the associated costs.
To circumvent the inherent limitations (the speed of light and the minimum size of a unit) of single central processor computers, the concept of multiple processors performing tasks in parallel has been introduced, and several such machines are now on the market. Thus far, their performance has not lived up to expectations, at least as far as meteorological applications are concerned. The problems are twofold: (1) learning how to program these machines is an investment of time and effort that most scientists tend to avoid, and (2) current experience with atmospheric general circulation models shows performance that is basically comparable to that of a Cray YMP system (1.2 gigaflops), although better performance (10 gigaflops) can be achieved on more specialized problems. If numerical weather prediction is to be done with kilometer-scale resolution over synoptic-scale (1,000 km) domains, massively parallel machines are presently the only ones that could in theory deliver the 1,000-gigaflop speed required. Solutions to fundamental problems with this technology remain to be worked out by the computer industry, and better software has to be developed by users to reach the needed speed.
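The 1,000-gigaflop figure is consistent with simple resolution scaling: refining the horizontal grid spacing by a factor r multiplies the number of grid columns by r² and, through the CFL limit on the time step, the number of time steps by r, so cost grows roughly as r³. The 10-km starting resolution below is an assumption for illustration only:

```python
def cost_factor(coarse_dx_km, fine_dx_km):
    """Relative computational cost of a horizontal-resolution refinement:
    r**2 more grid columns and (via CFL) r more time steps -> r**3."""
    r = coarse_dx_km / fine_dx_km
    return r ** 3

# Going from ~10-km to ~1-km grid spacing over the same domain:
print(cost_factor(10.0, 1.0))   # 1000.0 -> a thousandfold cost increase
```

A model that fully occupies a gigaflop-class machine at ~10-km resolution therefore needs roughly teraflop-class (1,000-gigaflop) speed at kilometer scale, before any increase in vertical resolution is counted.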
In recent years, the appearance of high-performance workstations (approaching the speed of a single-processor Cray YMP) has made possible truly local numerical weather prediction. High-resolution [grid spacing O(5 km)] forecast models, run over domains large enough (500 km) to avoid contamination by artificial signals from the domain edge, can provide significant enhancement of short-term forecasts (3-12 hours). Possible applications being investigated by researchers include enhancement of emergency response systems, more detailed local forecasts for military operations, thunderstorm forecasting, and daily weather forecasts where terrain, coastlines, and/or other physiographic features exert a significant influence on the weather. Given the trend in costs and the ease of access to forecast models, local weather forecast offices could each have such a system by the turn of the century.
Basic and applied research in atmospheric science has yielded dramatic improvements in weather forecasts and warnings over the past several decades and is now poised to make even more spectacular advances. The main stumbling block to realizing significant progress in basic research and operational meteorology is the need for better measurements of the atmosphere, oceans, and land surface, and the need to better understand and delineate optimal combinations of measurement systems for specific forecast problems. Our nation has invested heavily in environmental satellites, and this investment has been paid back many times in improved understanding of the atmosphere and better warnings of hazards ranging from hurricanes to severe thunderstorms and tornadoes. However, observing systems of great importance, some of low cost, have been allowed to deteriorate. Examples include research Doppler radar facilities, global rawinsonde coverage, small research aircraft for boundary layer studies, and research surface mesonets. Meanwhile, the measurement of atmospheric water vapor continues to be vastly inadequate for a number of purposes, ranging from quantitative precipitation forecasting to climate prediction. In some cases, we have just begun to realize the potential benefits of certain types of measurements, such as soil moisture and the detailed structure of the tropopause. We must stand back and take a hard look at the costs and benefits of all existing and proposed measurement systems, from the perspective of basic scientific progress and societal need, with a blind eye toward the objectives and budgets of individual federal agencies.
If we elect to take a rational and well-thought-out approach toward observations in support of basic research and operational objectives, there is every reason to believe that the potential exists for great advances in understanding and prediction. A proper accounting of land surface physics and irreversible processes in the atmosphere may lead to large increases in the skill of seasonal forecasts. Better measurements of atmospheric water vapor and of cloud microphysical processes, particularly those involving ice, may allow us to solve a number of outstanding problems such as predicting the development and movement of mesoscale convective systems and the response of atmospheric water vapor and cloud cover to climate change. Advanced applications of ensemble and adjoint techniques to numerical weather prediction may reveal, in near real time, those parts of the atmosphere that are particularly susceptible to initial error, allowing us to target such regions for observational scrutiny and thereby greatly reduce numerical forecast errors. Better in situ observations in the atmospheric and oceanic environment of hurricanes may lead to dramatically improved forecasts of the motion and intensity of these great hazards. These are but examples of what we can expect to achieve in the coming decades if we take the right approach now.