The body-wave and surface-wave magnitudes mb and Ms are theoretically expected, and experimentally observed, to saturate (at about 6.3 and 8.2, respectively) with increasing seismic moment, and they are essentially useless for estimating the size of the mega-earthquakes capable of producing tsunamis that cause damage in the far field.

  1. If all earthquakes obeyed scaling laws, measuring one magnitude or another would in principle be equivalent, and an analyst could predict the low-frequency value of the seismic moment by measuring the source in a different frequency band. In practice, however, earthquakes with similar moments produce widely scattered estimates of magnitude, and “tsunami earthquakes” feature anomalous source characteristics. The observational challenge is to identify such anomalous events in real time.

  2. While the goal of tsunami warning is to quantify the earthquake (e.g., hypocenter, magnitude, focal mechanism, and fault extent) as quickly as possible upon detection, it is also imperative to record the source in its entirety in order to assess its full tsunamigenic potential. Bearing in mind, for example, that the source of the 2004 Sumatra earthquake lasted eight minutes, we realize that assessing its size within five minutes is at best a challenge and at worst an impossible task. Unfortunately, there is no consensus among seismologists as to the deterministic nature of earthquake rupture, namely whether the early stages of nucleation of a large earthquake carry a fingerprint of the eventual true size of the event. Indeed, several examples of delayed sources (e.g., 2001 Peru and 2006 Kuril Islands, both of which generated destructive tsunamis) reveal a sudden increase in seismic moment release as late as one or two minutes into the source process; they constitute another class of events violating scaling laws. In lay terms, at the initiation of a seismic rupture, does Mother Nature really know how large the final product will be? Yet it is that final product that will control the tsunami, and it is that final product that the watchstanders at the Tsunami Warning Centers (TWCs) are charged with estimating, as swiftly and as reliably as possible.

  3. Seismic data and sophisticated processing are insufficient to determine the destructiveness of tsunamis. Gusiakov uses the Soloviev-Imamura tsunami intensity scale, based on run-up data, to show that there is only a general tendency for tsunami intensity to increase with earthquake magnitude (a sketch of the scale follows this list). The lack of a direct correlation can be attributed in part to secondary mechanisms (submarine slumps and slides) in the generation of tsunamis, as shown by Plafker's findings that submarine landslides account for many large and destructive tsunamis.
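
For concreteness, the following is a minimal sketch of the Soloviev-Imamura intensity calculation, assuming its commonly quoted form I = log2(sqrt(2) * H), where H is the average run-up height in meters on the nearest coast; the function name and example value are illustrative only.

    import math

    def soloviev_imamura_intensity(avg_runup_m):
        # Soloviev-Imamura tsunami intensity: I = log2(sqrt(2) * H),
        # with H the average run-up height (m) on the nearest coast.
        # This is the commonly quoted form of the scale, taken here as
        # an assumption rather than Gusiakov's exact formulation.
        return math.log2(math.sqrt(2) * avg_runup_m)

    # An average run-up of 4 m yields I = 2.5
    print(soloviev_imamura_intensity(4.0))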

THE MWP ALGORITHM

The application of geometrical optics to seismology reveals that the earth's ground motion resulting from the passage of P-waves in the far field is related to the time derivative of the history of the deformation, or physical slip, at the source. In other words, if a permanent deformation (in the form of a step in displacement) is incurred at the epicenter, the far-field signal will register an impulse (or spike) of short duration, followed by a return to quiescence. Conversely, the deformation at the source should be obtainable by mathematically integrating the ground displacement over time in the far field and by performing a number of theoretically justifiable corrections, which account, for example, for the path from epicenter to receiver. As most seis-
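
To make the integration step concrete, here is a minimal sketch of an Mwp-style moment estimate in the spirit of the method of Tsuboi et al. (1995): the vertical P-wave displacement record is integrated over time, the peak of the integral is scaled to a seismic moment, and the moment is converted to a magnitude. The density, P-wave speed, and radiation-pattern coefficient are nominal assumed values, and the operational algorithm applies further path and instrument corrections not shown here.

    import numpy as np

    def mwp_estimate(uz, dt, distance_m, rho=3400.0, alpha=7900.0, fp=0.52):
        # uz: vertical P-wave ground displacement (m) as a 1-D array;
        # dt: sampling interval (s); distance_m: epicentral distance (m).
        # rho (density, kg/m^3), alpha (P-wave speed, m/s), and fp (mean
        # radiation-pattern coefficient) are nominal assumed values, not
        # the operational constants.
        # Integrate the displacement over time; in the far field the peak
        # of |integral| is proportional to the seismic moment M0.
        integral = np.cumsum(uz) * dt
        peak = np.max(np.abs(integral))
        # Far-field point-source scaling for P waves:
        # M0 = 4 * pi * rho * alpha^3 * r * peak / fp
        m0 = 4.0 * np.pi * rho * alpha ** 3 * distance_m * peak / fp
        # Standard conversion from moment (N m) to moment magnitude
        mw = (np.log10(m0) - 9.1) / 1.5
        return m0, mw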


