The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
5
Data

Any measurement program produces data. However, several important aspects of collection are necessary to make such data useful. These include calibration and standards, quality control, data processing, analysis, and archiving. In some cases there are quite formal programs for these procedures, for example, routine weather observations and ocean hydrographic data. These data, in most cases, are handled in a prescribed manner from the time they are collected until they are archived. The sensors must meet some established calibration and standard; quality control procedures are specified; the data are transmitted to the operational centers in a prescribed format; and finally, they are archived in a specific way. Those who need the information can usually obtain it in a standard format. This is not the case for coastal engineering data.

The data requirements for coastal engineering research and design efforts are diverse. A partial list of quantities of interest is shown in Table 5-1. A variety of sensors and instrumentation is required to make these measurements. Some are relatively standardized, such as a tide gauge, for which there are formal calibration procedures and standards, and the data are handed to the archiving center in a prescribed way. However, for other measurements, there are a variety of competing sensors and virtually no calibration or standards; quality control is highly variable. In many cases, the data are collected
and analyzed by an investigator or agency, and the only place the data may be seen, before they are archived in a file cabinet, is in a publication. This lack of standards for archived data has an impact on the usefulness and validity of coastal engineering data, which are typically expensive to obtain.

TABLE 5-1 Quantities of Interest in Coastal Engineering

  Measurement                               Units
  Deep-water wave height and direction      m, degrees
  Shallow-water wave height and direction   m, degrees
  Current speed and direction               cm/sec, degrees
  Bottom friction                           dynes/cm2
  Wave forces                               dynes
  Beach profiles                            m
  Suspended sediments                       gm/cm3
  Sediment transport                        gm/sec
  Bed-load transport                        gm/sec
  Sea level                                 m
  Wind speed and direction                  m/sec, degrees
  Water temperature and salinity            °C, ppt
  Breaker zone and type                     no units

QUALITY CONTROL

Quality control is fundamental to good science and engineering. Poor data can lead to wrong conclusions and bad designs. If the data are to be used in a model simulation, poor quality data may lead to bad model output. However, quality control is particularly difficult with some of the measurements needed by coastal engineers. First, there may be no accepted calibration procedures or standards. In many cases, the instrument is a one-of-a-kind creation; in other cases, there are no calibration facilities where the conditions of the coastal zone can be simulated. For example, it is well established that impeller current meters provide different readings in oscillating flows than in steady flows. In a wave field such as the surf zone,
impeller current meters would not be a good choice for measuring currents. However, even if a suitable impeller meter were found, it would be difficult to establish calibration standards and facilities. Even the electromagnetic current meter, which is commonly used for nearshore measurements in wave fields for many applications, is routinely calibrated only for zero velocity, by placing it in a container of still water.

All observations contain errors. These can usually be attributed to one or more of several factors, e.g., sensor response or calibration, telemetry, recording, processing, or archiving. For a discussion of quality control, the data can be divided into two categories: (1) operational data and (2) research data.

Operational data are collected on a routine and continuing basis by engineers to meet their requirements. Examples are winds, waves, and water level at a particular location to support the design requirements for a structure. These data are normally collected with standard off-the-shelf instrumentation that has been designed specifically for that application, such as cup anemometers, pressure wave gauges, and float tide gauges. The operating principles, calibration, standards, and data from these instruments are quite well understood. Government agencies use these instruments routinely to meet their responsibility for environmental monitoring.

Research data, on the other hand, are collected on a very different basis. In this case, the instruments are frequently one-of-a-kind or, at most, a small production run designed for a special purpose. A hypothetical example would be an instrument designed to measure the boundary friction at a particular site for a short period of time. Normally, only the investigator who owns, and may have built, the instrument is experienced in its use.
No widely accepted standards or calibration facilities exist for such instruments, and the data are frequently unique and require special processing and archiving. These data are generally considered to belong to the investigator who collected them until the results of the investigation are published.

Quite different quality control is exercised on these two classes of data. For operational data, there are standardized procedures from the time the data are collected until they are archived. For research data, there may be no standard quality control procedure, and the data may never be formally archived.

The committee has concluded that there is an urgent need to
introduce quality control in coastal engineering measurements, but how this can be accomplished is not clear.

Capabilities for Quality Control

Quality control requires good instrumentation, careful calibration of all components of the system from sensor to recorder, careful data handling, and finally, archiving. Quality control is a procedure that is exercised at each step along the way, including experimental design, sensor selection, sensor calibration, telemetry, recording, processing, and archiving.

The evolution of digital electronics, sensor technology, telemetry, and recording systems has enhanced our capability to exercise quality control, in some cases in a highly automated fashion. At the same time, these and other recent technical advances generate data at a much greater volume, making the quality control problem comparably larger. Unfortunately, in coastal engineering the measurement requirements are so diverse and difficult that formalized quality-control procedures are not prevalent.

The ultimate use of much data is as input to a model. As computer technology has progressed, the models have become more and more demanding of data, not only in quality but also in quantity. In many cases, model requirements for quality-controlled data cannot be satisfied.

In general, the techniques are available to provide the necessary quality control, but this capability is not being used with respect to coastal engineering data. Part of the reason for this is that the data are often collected by individuals who have only a limited objective in mind for the data. Additionally, the measurement environment is detrimental to instruments, and maintenance is difficult and costly, particularly for long-term measurements. Coastal engineering measurements are often site-specific, and their application to other problems is limited.
Finally, the lack of a comprehensive archiving procedure for coastal engineering data puts no pressure on investigators to exercise strong quality control measures.

Quality Control Needs

There is a need to establish formal quality control procedures for many coastal engineering measurements. This will require agreement on the measurements, calibration and standards, recording, processing, and data archiving. This has not been done in the past
for a variety of reasons. The Corps of Engineers has recently begun to develop procedures for its coastal wave measurement program. However, there are many other types of coastal engineering data for which no quality control procedures have been established. The following needs are identified:

• Comprehensive data bases are required to solve many coastal engineering problems. They may be comprehensive in that a variety of measurements is collected simultaneously, or in that the measurements are collected over a wide geographical region continuously. There is a need to develop a comprehensive historical quality-controlled data base to support engineering design and modeling.
• There is a need to improve real-time acquisition and quality control of coastal engineering data.
• Quality, quantity, and timeliness of the data entering the data base should be improved.
• Support is needed for development of quality-control procedures and the necessary tools and procedures to deal with coastal engineering data as part of coordinated long-term research planning.
• Calibration standards and methods should be established for as many coastal engineering measurements as possible; wave data should be given first priority.
• Provisions should be made for real-time access to coastal engineering data by a variety of users.
• Procedures should be developed for identifying coastal engineering data appropriate for archiving and for entering new data into the system conveniently. Appropriate quality control must be applied to both of these functions.
• Efforts should be made to acquire good data bases from other countries.

STANDARDS AND CALIBRATION

Standards and calibration are fundamental to scientific data collection. Unfortunately, in the high-energy, shallow-water coastal zone, there are often no suitable standards or calibration facilities for a particular sensor.
Tide gauges, pressure sensors, current meters, and temperature and salinity instruments are examples for which standards and calibration facilities exist. However, for many other quantities, such as sediment concentration, turbulence, radiation stress, and bed load, none exist. In these cases, the only "standard" may
be that the measurements are taken by other systems, or that they satisfy theory, or that they are repeatable. Unfortunately, in many cases, little or no thought is given to standards and calibration of coastal engineering measurements. To be sure, it is a difficult problem, but unless it is addressed, the risk of costly mistakes will remain high.

DATA ASSIMILATION AND SYNTHESIS

As emphasized at the beginning of this chapter, the product of a measurement system should be (1) a set of data, along with an assessment of its accuracy; (2) controls for the quality of the data; and (3) the data-base management for their archiving and use in engineering applications. The initial motivation for collecting some of the data may be for special engineering or research goals; other data may be part of an ongoing program for acquisition of certain standard measurements to build knowledge of the coastal oceanic and atmospheric climate.

The types of data and their accuracy can vary greatly. Much data will be from direct in situ measurements in the water column; some may be indirect measurements made by remote techniques. This report has identified a number of such indirect measurements made by remote techniques, including those using acoustics (e.g., Doppler and tomographic methods) and electromagnetic sensing via satellites or aircraft (e.g., laser, infrared imagery, drogue tracking, or radio altimetry). The kinds of information inferred from these remote-sensing systems include elevation of the sea surface or seabed relative to the sensor, sea surface temperature, and currents. The in situ direct measurements from fixed sensors can give added information on currents, pressure, and other properties within the water column, at sampling rates that are limited only by the inherent time constant of the sensor and of the recording and/or telemetry system.
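How the sampling rate constrains what a fixed-sensor record can resolve is illustrated by the following sketch (all numbers hypothetical, not from the report): a 0.5-Hz wave component is recovered correctly when sampled at 4 Hz, but aliases to a spurious lower frequency when sampled at 0.8 Hz, below the 1.0-Hz Nyquist requirement.

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Frequency (Hz) of the largest peak in the amplitude spectrum."""
    spec = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spec)]

f_wave = 0.5  # hypothetical 2-s-period wave component, Hz
for fs in (4.0, 0.8):  # sampling rates above and below the Nyquist requirement
    t = np.arange(0, 600, 1.0 / fs)        # a 10-minute record
    eta = 0.5 * np.sin(2 * np.pi * f_wave * t)  # surface elevation, m
    print(f"fs = {fs} Hz -> apparent frequency {dominant_frequency(eta, fs):.2f} Hz")
# The 0.8-Hz record aliases the 0.5-Hz wave to an apparent 0.30 Hz.
```

The same reasoning applies to the sensor's own time constant: a sensor that cannot respond faster than its time constant effectively low-passes the signal before it is ever sampled.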
When recorded at suitable sampling rates at a single location, pressure or sea surface elevation data can quantify surface waves (but not their direction), tsunamis, tides, and storm surges. Similarly, the sensing of fluid velocity at given rates, and with given spatial resolution, determines its usefulness to studies of circulation, wave kinematics, or turbulence. It should be noted that there are a number of data types for quantifying waves and currents over a wide spectrum of frequencies and for inferring information about sediment transport, erosion, and
deposition in the coastal zone. Each type of measurement has inherent error bounds that depend on sensor type, the energy level of a given event, and the frequency spectrum of the event. A great range exists in the spatial and temporal resolution of measurements and, in many cases, there is a trade-off between accuracy and resolution (either spatial or temporal).

Faced with this variety of measurement data, which portrays different things, at different times, in different places, and with different accuracy and resolution, how can one synthesize at least portions of this data base so as to give a description of coastal zone dynamics more meaningful than that derived from any individual set of measurements? One answer is to use models that relate the different variables in a physically acceptable manner. For data that are not synoptic, and especially for data taken to quantify small-scale and/or high-frequency processes like turbulence, the synthesis is largely a matter of the contribution of these data to the proper representation of processes that cannot be resolved adequately in predictive models, such as bed stress and sediment dynamics. The committee sees this approach as an iterative tuning process between model adequacy and data adequacy.

The foregoing interactive synthesis for model development was discussed throughout Chapters 3 and 4. The kind of data synthesis that was not highlighted in those chapters relates to the problem of combining different types of data in a more direct diagnostic manner, so as to give a description of a synoptic field. Examples include synthesis of the wind field or water circulation in the nearshore zone, thereby providing better spatial resolution and suppressing individual measurement errors. An example of this synthetic approach was pointed out to the committee by Richard Seymour (personal communication).
His example pertains to the problem of estimating wind fields in the nearshore zone, where data are not generally available from the National Weather Service (NWS) of NOAA. Such information can be important in nearshore generation of surface waves and surges. Basically, the methodology is the following: Suppose some auxiliary measurements of wind velocity are made in the nearshore zone to supplement those made at land stations and those from offshore meteorological buoys and ships at sea. It is well known that wind velocity and sea-level atmospheric pressure gradients are related in a deterministic manner, and hence the assimilation of the additional nearshore wind velocity measurements with the barometric pressure and wind information available from NWS can
yield a blended description of the wind field in the nearshore zone, which is an improvement on that deduced from the NWS data alone.

Another example is related to nearshore circulation (low-frequency horizontal flow of water, but outside of the zone where rip currents are active). If one adopts the hypothesis that the flow is quasi-steady, then the circulation field can be represented in terms of a stream function (the analog of pressure in the case of the wind field). It is then possible to combine all the data on low-frequency horizontal current measurements (for a reasonably synoptic period) to estimate the stream function and therefore the circulation. The data may include direct in situ measurements from fixed current meters at a few locations, Lagrangian drifter data, and sea-level and satellite-derived imagery of surface temperatures, all of which are related to the stream function. The methodology, when carried out objectively, requires representation of the unknown stream function. Specific examples of the technique as applied to oceanographic mesoscale eddies are given in Vastano and Reid (1985) and McWilliams et al. (1986).

Other examples could be given that involve more complex techniques of information theory and objective analysis, which are beyond the scope of this report. It suffices to conclude this section by citing the principle that the whole can often be better than the sum of the parts, provided that the parts are compatible with an appropriate diagnostic or prognostic model. Thus we return to a theme inherent in the text of this report: Both models and data are essential elements of a coastal engineering measurement system.
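The stream-function estimate discussed above can be sketched as a least-squares fit of scattered current observations to a truncated basis. The quadratic polynomial basis, the synthetic "observations," and all names below are illustrative assumptions, not taken from the report or from the cited papers, which use more sophisticated objective-analysis representations:

```python
import numpy as np

# Basis for psi(x, y) = sum_k c_k * phi_k, with phi = [x, y, x^2, x*y, y^2].
# Velocities follow from u = -d(psi)/dy, v = d(psi)/dx (non-divergent flow).
def dphi_dx(x, y):
    return np.column_stack([np.ones_like(x), np.zeros_like(x), 2 * x, y, np.zeros_like(x)])

def dphi_dy(x, y):
    return np.column_stack([np.zeros_like(x), np.ones_like(x), np.zeros_like(x), x, 2 * y])

def fit_streamfunction(x, y, u, v):
    """Least-squares coefficients c such that u = -dpsi/dy and v = dpsi/dx."""
    A = np.vstack([-dphi_dy(x, y), dphi_dx(x, y)])
    b = np.concatenate([u, v])
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    return c

# Synthetic "current-meter" data from the true field psi = x*y (u = -x, v = y),
# with a little measurement noise added.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 40); y = rng.uniform(-1, 1, 40)
u = -x + 0.01 * rng.standard_normal(40)
v = y + 0.01 * rng.standard_normal(40)
c = fit_streamfunction(x, y, u, v)
print(np.round(c, 2))  # the x*y coefficient should dominate, near 1
```

Because every observation type that is "related to the stream function" contributes rows to the same least-squares system, drifter tracks or sea-level data could in principle be appended to A and b with their own error weights, which is the sense in which the whole can be better than the sum of the parts.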