Computational Simulation and Submesoscale Variability

James C. McWilliams*

*University of California, Los Angeles

In the way of an oracle, I offer the following remarks about the future of physical oceanography:

Because of broadband intrinsic variability in currents and material distributions, and because of the electromagnetic opacity of seawater, the ocean is severely undersampled by measurements and likely to remain so. (Surface remote sensing makes a wonderful exception.) The most important instrumental advances will be ones that improve on this situation.

Computational simulation of realistic oceanic situations is steadily growing in capacity and usage. Given the first remark, this is a very good thing. It supports a dual strategy of using measurements to inspire and test models, and using models to design experiments and extend the scope of measurements. So far, planning for field experiments that embody this duality is still rarely done well.

In 1997 NSF convened a similar futurism workshop (APROPOS), and I contributed a white paper describing practices and trends in numerical modeling that still seems apt (McWilliams 1998). I would now add two further remarks. First, accumulating experience supports the hypothesis that such simulations (even if sometimes remarkably like nature in their emergent patterns, phenomena, and multivariate relationships) have an inherent, irreducible imprecision compared to measured quantities in turbulent (chaotic) regimes (McWilliams 2007). This extends to irreproducibility among different
model codes putatively solving the same problem (n.b., the persistent spread among global-warming simulations). The imprecision is due to the model composition, with its non-unique choices for numerical algorithms, parameterizations, and couplings among different processes. Testing and digesting this hypothesis, and acting on its implications, are strongly recommended. Second, there is a serious, unsolved infrastructure problem in oceanic modeling, viz., how to increase and depersonalize model documentation, calibration, and availability in support of widespread usage without impeding the necessary, continuing evolution of what is still a young technology. How many published model results are reproducible by a reader, hence verifiable? How can we facilitate the interfaces between model makers and users? How can anyone other than the IPCC go about deploying an ensemble of different models to understand the spread of their answers for a range of problems? (Even the IPCC's is an inadvertent ensemble.)

For reasons that have a lot to do with measurement undersampling and model immaturity, oceanography is only now moving into a bloom of discovery about distinctive types of variability within the submesoscale regime (10s–1000s m; hours–days). This is an awkward scale regime for the usual measurements: small compared to most remote-sensing footprints; large compared to a ship's range; and subtle to distinguish from inertia-gravity waves in point time series.

There is an emerging, provisional paradigm for non-wave submesoscale variability. Its primary energy source is mesoscale eddies and currents, which confounds the theoretical (and computationally confirmed) prediction of up-scale energy transfer in geostrophic, hydrostatic flows.
It is manifest in frontogenesis, frontal instability, coherent vortices (including the notorious "spirals on the sea," often seen in reflectance images but never measured in situ), "mixed-layer" instability, unstable topographic wakes, "arrested" topographic waves, ageostrophic instability of geostrophic currents, spontaneous wave emission by currents, temperature and material filaments, horizontal wavenumber spectra with shallow slopes, probability density functions with wide and skewed tails (e.g., near-surface cyclonic vorticity and downwelling velocity), and acoustic scattering patterns of lenses and layers (e.g., in geoseismic surveys). It effects a forward cascade of both kinetic and available potential energy, and thus provides a route to dissipation for the general circulation (via mesoscale instability) that is probably globally significant. This cascade supports a microscale interior diapycnal material mixing that may sometimes be competitive with breaking internal waves. It also induces density restratification (an apparent vertical unmixing!) that is especially effective around the surface mixed layer. It provides important material transport between the surface mixed and euphotic layers and the underlying pycnocline and nutricline,
and it sometimes provides important horizontal transport. As yet, only a few flows have been simulated, only a few theories devised, and only a few regions measured for their submesoscale variability. This family of phenomena deserves a lot of attention in the coming decades.

The disciplinary borders of physical oceanography are increasingly indefensible with respect to both scientific content and the education and recruitment of new researchers. This view should be embraced in our institutional homes.

References

McWilliams, J.C. 1998. Trends in Numerical Modeling for Physical Oceanography. Excerpted from The Future of Physical Oceanography: Report of the APROPOS Workshop. Available: http://surfouest.free.fr/WOO2003/mcwilliams.html.

McWilliams, J.C. 2007. Irreducible Imprecision in Atmospheric and Oceanic Simulations. PNAS 104: 8709–8713.