

5 Interpretation
Pages 97-106

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 97...
... Perhaps geophysical data were collected for a particular project at a nearby location. One or more boreholes may also be available, often including natural gamma radiation and electrical resistivity logs.
From page 98...
... One of the most common methods of data display in two dimensions is through the use of contouring. Although human interpretive contouring is often difficult to beat in the geologic sense, machine contouring algorithms are now routinely used to prepare displays of geological and geophysical data, especially structural contour maps and potential field data maps.
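As a concrete illustration of machine contouring (not part of the report), the sketch below grids irregularly spaced station readings and contours the result; the synthetic station data, the SciPy/matplotlib choices, and the interpolation settings are all assumptions made for demonstration.

```python
# Illustrative sketch: machine contouring of scattered geophysical measurements.
# Hypothetical example, not from the report: grid irregular station data with
# SciPy, then draw contours with matplotlib.
import numpy as np
from scipy.interpolate import griddata
import matplotlib.pyplot as plt

# Hypothetical station coordinates (m) and measured values (e.g., the elevation
# of a structural horizon or a potential-field reading).
rng = np.random.default_rng(0)
x = rng.uniform(0, 1000, 200)
y = rng.uniform(0, 1000, 200)
z = 50 + 0.02 * x - 0.01 * y + 5 * np.sin(x / 150.0)

# Interpolate onto a regular grid (cubic interpolation inside the data hull).
xi = np.linspace(0, 1000, 200)
yi = np.linspace(0, 1000, 200)
X, Y = np.meshgrid(xi, yi)
Z = griddata((x, y), z, (X, Y), method="cubic")

# Machine-contoured map; a geologist would still review and edit the result.
fig, ax = plt.subplots()
cs = ax.contour(X, Y, Z, levels=15, colors="k", linewidths=0.7)
ax.clabel(cs, fmt="%.0f")
ax.set_xlabel("Easting (m)")
ax.set_ylabel("Northing (m)")
ax.set_title("Machine-contoured structural surface (synthetic data)")
plt.show()
```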
From page 99...
... Data integration should consider all of the data, not just geophysical data, acquired during a site characterization. Multiple sources of data provide the ability to check the quality of individual data sets against each other.
From page 100...
... Knowledge of these four parameters enhances the possibility of predicting fluid flow paths, particularly in fractured media. For example, seismic velocity usually decreases in fracture zones, and radar wave velocity may increase in these same fracture zones, particularly when a large increase in air-filled ...
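A minimal sketch of the reasoning behind the radar behavior, assuming the common relation v ≈ c/√εr and the CRIM mixing law (neither is stated in the report): replacing water (εr ≈ 80) with air (εr ≈ 1) in fracture porosity lowers the bulk permittivity and therefore raises radar velocity, whereas seismic velocity drops because fracturing reduces stiffness.

```python
# Illustrative sketch (not from the report): why radar velocity can rise in
# air-filled fracture zones. Radar velocity is roughly v = c / sqrt(eps_r),
# so replacing water (eps_r ~ 80) with air (eps_r ~ 1) in the fracture
# porosity lowers the bulk permittivity and raises the velocity. The CRIM
# mixing law used here is one common approximation, assumed for illustration.
import numpy as np

C = 0.2998  # speed of light, m/ns

def bulk_permittivity_crim(porosity, water_fraction, eps_matrix=5.0,
                           eps_water=80.0, eps_air=1.0):
    """Complex refractive index method (CRIM): volume-weighted sqrt(eps)."""
    sqrt_eps = ((1 - porosity) * np.sqrt(eps_matrix)
                + porosity * water_fraction * np.sqrt(eps_water)
                + porosity * (1 - water_fraction) * np.sqrt(eps_air))
    return sqrt_eps ** 2

for water_fraction in (1.0, 0.5, 0.0):  # saturated -> air-filled fractures
    eps = bulk_permittivity_crim(porosity=0.10, water_fraction=water_fraction)
    print(f"water fraction {water_fraction:.1f}: eps_r = {eps:5.1f}, "
          f"v = {C / np.sqrt(eps):.3f} m/ns")
```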
From page 101...
... Clay tends to attenuate radar energy, whereas seismic energy often is not attenuated rapidly by propagation in clays. At other sites, seismic waves might be attenuated rapidly in dry, quartzitic sand, whereas radar waves propagate well in the same medium.
From page 102...
... help practitioners and clients understand the capabilities and limitations of the measurements. In addition, rigorous numerical models can be used to improve the quality and reliability of nonintrusive site characterization surveys.
From page 103...
... For example, with electrical and electromagnetic data, a common form of data processing is still to normalize to apparent resistivity or apparent conductivity, which simply matches the data to a homogeneous earth model. Plotting the data in "pseudosection" form using simple guidelines provides depth interpretation.
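For readers unfamiliar with this normalization, here is a hedged sketch of the apparent-resistivity calculation for one common electrode geometry (a Wenner array, chosen purely for illustration; the report does not specify an array). Apparent resistivity is the resistivity a homogeneous half-space would need in order to reproduce the measurement.

```python
# Illustrative sketch (not from the report): normalizing a DC-resistivity
# reading to apparent resistivity. For a Wenner array the geometric factor
# is K = 2 * pi * a, where a is the electrode spacing.
import math

def apparent_resistivity_wenner(delta_v, current, a_spacing):
    """Apparent resistivity (ohm-m) for a Wenner array.

    delta_v    measured potential difference (V)
    current    injected current (A)
    a_spacing  electrode spacing a (m)
    """
    geometric_factor = 2.0 * math.pi * a_spacing
    return geometric_factor * delta_v / current

# Hypothetical sounding: larger spacings sample deeper, which is the idea
# behind plotting apparent resistivity against spacing as a pseudosection.
for a in (1, 2, 5, 10, 20):
    rho_a = apparent_resistivity_wenner(delta_v=0.050, current=0.1, a_spacing=a)
    print(f"a = {a:3d} m -> apparent resistivity = {rho_a:7.1f} ohm-m")
```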
From page 104...
... Advances in modeling, inversion, and visualization now make it possible to present data in a geologically and visually meaningful way. Shaded relief maps, for instance, have revolutionized the presentation of potential field data (e.g., Plate 3).
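As an illustration of the shaded-relief idea (not taken from the report), the sketch below hillshades a synthetic gridded anomaly using matplotlib's LightSource; the illumination angles and the synthetic grid are assumptions made for demonstration.

```python
# Illustrative sketch (not from the report): shaded-relief display of a
# gridded potential-field map using matplotlib hillshading. The synthetic
# anomaly grid below stands in for real magnetic or gravity data.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LightSource

# Synthetic gridded anomaly (nT) with short- and long-wavelength content.
x = np.linspace(0, 10, 400)
y = np.linspace(0, 10, 400)
X, Y = np.meshgrid(x, y)
anomaly = 30 * np.exp(-((X - 4) ** 2 + (Y - 6) ** 2)) + 2 * np.sin(3 * X)

# Illuminate from the northeast; shading emphasizes subtle short-wavelength
# features that a plain color map can hide.
ls = LightSource(azdeg=45, altdeg=30)
rgb = ls.shade(anomaly, cmap=plt.cm.viridis, blend_mode="overlay")

fig, ax = plt.subplots()
ax.imshow(rgb, extent=(0, 10, 0, 10), origin="lower")
ax.set_xlabel("Easting (km)")
ax.set_ylabel("Northing (km)")
ax.set_title("Shaded-relief view of a synthetic potential-field anomaly")
plt.show()
```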
From page 105...
... The creative component consists of conceiving all of the possible geologic models likely to explain the data; the quantitative component involves generating synthetic data for every possible model to demonstrate whether a particular model is consistent with the field data. Generating multiple synthetic data sets from a single geologic model, although not done routinely, is technically feasible.
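A minimal sketch of the quantitative component described here, assuming a deliberately simple forward model (a two-layer seismic refraction travel-time curve) and hypothetical candidate models with simulated "field" data; a real study would use rigorous numerical modeling and actual measurements. The point is only the workflow: generate synthetic data for each candidate geologic model and measure its misfit against the observations.

```python
# Illustrative sketch (not from the report): test candidate geologic models by
# comparing their synthetic data against field data via an RMS misfit.
import numpy as np

def refraction_travel_times(offsets, v1, v2, depth):
    """First-arrival times (s) for a two-layer refraction model."""
    direct = offsets / v1
    if v2 > v1:
        t_intercept = 2 * depth * np.sqrt(v2 ** 2 - v1 ** 2) / (v1 * v2)
        refracted = offsets / v2 + t_intercept
    else:
        refracted = np.full_like(direct, np.inf)  # no head wave
    return np.minimum(direct, refracted)

offsets = np.arange(5, 105, 5.0)

# Hypothetical field data (simulated here with noise for demonstration).
rng = np.random.default_rng(1)
observed = (refraction_travel_times(offsets, v1=800, v2=2400, depth=12)
            + rng.normal(0, 0.001, offsets.size))

# Candidate geologic models: (v1 m/s, v2 m/s, depth to refractor m).
candidates = {
    "shallow bedrock": (800, 2400, 8),
    "deeper bedrock": (800, 2400, 12),
    "slower overburden": (600, 2400, 12),
}

# RMS misfit between synthetic and field data for each candidate model.
for name, (v1, v2, depth) in candidates.items():
    synthetic = refraction_travel_times(offsets, v1, v2, depth)
    rms = np.sqrt(np.mean((synthetic - observed) ** 2))
    print(f"{name:18s} RMS misfit = {rms * 1000:.2f} ms")
```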
From page 106...
... Universities and government laboratories should be encouraged and supported to identify deficiencies and develop rigorous computer models that provide realistic descriptions of subsurface properties and processes. Given the need for data fusion and integrated interpretation, universities and government laboratories also should be encouraged to develop and validate integrated modeling software explicitly designed for site characterization.

