Computational Simulation and Submesoscale Variability

James C. McWilliams*

*University of California, Los Angeles


In the way of an oracle, I offer the following remarks about the future of physical oceanography:


Because of broadband intrinsic variability in currents and material distributions and because of the electromagnetic opacity of seawater, the ocean is severely undersampled by measurements and likely to remain so. (Surface remote sensing makes a wonderful exception.) The most important instrumental advances will be ones that improve on this situation.

Computational simulation of realistic oceanic situations is steadily growing in capacity and usage. Given the first remark, this is a very good thing. It supports a dual strategy of using measurements to inspire and test models, and using models to design experiments and extend the scope of measurements. So far, planning for field experiments that embody this duality is still rarely done well. In 1997 NSF convened a similar futurism workshop (APROPOS), and I contributed a white paper describing practices and trends in numerical modeling that still seems apt (McWilliams 1998). I would now add two further remarks. First, accumulating experience supports the hypothesis that such simulations—even if sometimes remarkably like nature in their emergent patterns, phenomena, and multivariate relationships—have an inherent, irreducible imprecision compared to measured quantities in turbulent (chaotic) regimes (McWilliams 2007). This extends to irreproducibility among different model codes putatively solving the same problem (n.b., the persistent spread among global-warming simulations). The imprecision is due to the model composition with its non-unique choices for numerical algorithms, parameterizations, and couplings among different processes. Testing and digesting this hypothesis and acting on its implications are strongly recommended. Second, there is a serious, unsolved infrastructure problem in oceanic modeling, viz., how to increase and depersonalize model documentation, calibration, and availability in support of widespread usage without impeding the necessary, continuing evolution of what is still a young technology. How many published model results are reproducible by a reader, hence verifiable? How can we facilitate the interfaces between model makers and users? How can anyone other than the IPCC go about deploying an ensemble of different models to understand the spread of their answers for a range of problems? (Even the IPCC’s is an inadvertent ensemble.)
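
As a concrete, highly simplified illustration of the first of these two remarks, consider the following toy sketch in Python. It is an editorial illustration, not part of the chapter; the Lorenz-63 system, the Runge-Kutta scheme, and the particular time steps are assumptions chosen only for demonstration. Two integrations that differ in nothing but a defensible numerical choice (the time step) lose pointwise agreement within a few dozen time units, while an aggregate statistic stays close, which is the flavor of irreproducibility among model codes nominally solving the same problem.

# Toy illustration only (editorial addition, not from the chapter): two
# defensible integrations of the chaotic Lorenz-63 system, identical except
# for the time step, diverge pointwise while a mean statistic stays close.
import numpy as np

def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Right-hand side of the Lorenz-63 equations.
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(dt, n_steps, s0):
    # Fixed-step fourth-order Runge-Kutta integration from initial state s0.
    traj = np.empty((n_steps + 1, 3))
    traj[0] = s = np.asarray(s0, dtype=float)
    for i in range(n_steps):
        k1 = lorenz_rhs(s)
        k2 = lorenz_rhs(s + 0.5 * dt * k1)
        k3 = lorenz_rhs(s + 0.5 * dt * k2)
        k4 = lorenz_rhs(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        traj[i + 1] = s
    return traj

s0 = (1.0, 1.0, 1.0)
a = integrate(0.010, 10000, s0)        # "model A": dt = 0.010, run to t = 100
b = integrate(0.005, 20000, s0)[::2]   # "model B": dt = 0.005, sampled at A's times

# Pointwise disagreement grows to the size of the attractor ...
print("max |x_A - x_B|:", np.abs(a[:, 0] - b[:, 0]).max())
# ... while a climatological statistic (the time-mean of z) remains close.
print("time-mean z, A vs. B:", a[:, 2].mean(), b[:, 2].mean())

In a realistic ocean model, the analogous non-unique ingredients are the numerical algorithms, parameterizations, and process couplings noted above.
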
For reasons that have a lot to do with measurement undersampling and model immaturity, oceanography is only now moving into a bloom of discovery about distinctive types of variability within the submesoscale regime (10s–1000s m; hours–days). This is an awkward scale regime for the usual measurements: small compared to most remote-sensing footprints; large compared to a ship’s range; and subtle to distinguish from inertia-gravity waves in point time series. There is an emerging, provisional paradigm for non-wave submesoscale variability. Its primary energy source is mesoscale eddies and currents, which confounds the theoretical (and computationally confirmed) prediction of up-scale energy transfer in geostrophic, hydrostatic flows. It is manifest in frontogenesis, frontal instability, coherent vortices (including the notorious “spirals on the sea” often seen in reflectance images but never measured in situ), “mixed-layer” instability, unstable topographic wakes, “arrested” topographic waves, ageostrophic instability of geostrophic currents, spontaneous wave emission by currents, temperature and material filaments, horizontal wavenumber spectra with shallow slopes, probability density functions with wide and skewed tails (e.g., near-surface cyclonic vorticity and downwelling velocity), and acoustic scattering patterns of lenses and layers (e.g., in geoseismic surveys). It effects a forward cascade of both kinetic and available potential energy and thus provides a route to dissipation for the general circulation (via mesoscale instability) that is probably globally significant. This cascade supports a microscale interior diapycnal material mixing that sometimes may be competitive with breaking internal waves. It also induces density restratification (an apparent vertical unmixing!) that is especially effective around the surface mixed layer. It provides important material transport between the surface mixed and euphotic layers and the underlying pycnocline and nutricline, and it sometimes provides important horizontal transport. As yet, only a few flows have been simulated, only a few theories devised, and only a few regions measured for their submesoscale variability. This family of phenomena deserves a lot of attention in the coming decades.
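
For rough orientation on why this scale range is dynamically distinctive, a standard back-of-envelope estimate (an editorial addition, not the chapter’s; the current speed and Coriolis parameter are illustrative values) is the Rossby number Ro = U/(fL), the measure of departure from geostrophic balance:

# Illustrative scale estimate (editorial addition): Rossby number Ro = U / (f L).
f_cor = 1.0e-4                         # midlatitude Coriolis parameter, 1/s
U = 0.1                                # typical near-surface current speed, m/s
for L in (100e3, 10e3, 1e3, 100.0):    # horizontal length scale, m
    print(f"L = {L / 1e3:7.1f} km  ->  Ro = {U / (f_cor * L):5.2f}")
# Ro << 1 at mesoscale lengths (nearly geostrophic flow); Ro reaches order one
# near 1 km, where the ageostrophic submesoscale behaviors listed above emerge.
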
The disciplinary borders of physical oceanography are increasingly indefensible with respect to both scientific content and the education and recruitment of new researchers. This view should be embraced in our institutional homes.

REFERENCES

McWilliams, J.C. 1998. Trends in Numerical Modeling for Physical Oceanography. Excerpted from the Future of Physical Oceanography: Report of the APROPOS Workshop. Available: http://surfouest.free.fr/WOO2003/mcwilliams.html.

McWilliams, J.C. 2007. Irreducible Imprecision in Atmospheric and Oceanic Simulations. PNAS 104: 8709-8713.