Earth science thus is driven strongly by developments in observing technology. For example, the availability of satellite remote sensing has transformed our view of upper ocean biology. The spring bloom in the North Atlantic, the sudden "flowering" of phytoplankton (the microscopic plants of the ocean) that occurs over a period of a few weeks, was thought to be primarily a local phenomenon. However, satellite imagery of ocean color (which is used to infer phytoplankton abundance) has shown that this event unfolds over those same few weeks across the entire North Atlantic. Here, a new observing technique provided an improved understanding of what was thought to be a well-known process.

There are instances where new observing systems have transformed our understanding of the Earth. For over 40 years, the State of California has supported regular sampling of its coastal waters to understand the relationship between ocean circulation and fisheries production. A sampling grid was designed based on our understanding of ocean processes at the time. When satellite images of sea surface temperature (SST) and phytoplankton abundance became available in the early 1980s, they revealed a complex system of "filaments" that were oriented perpendicular to the coast and sometimes extended several hundred miles offshore. Further studies showed that these filaments are the dominant feature of circulation and productivity of the California Current, yet they were never detected in the 40-year record. The original sampling grid had been too widely spaced. This example shows how our ideas can sometimes lead us to design observing systems that miss critical processes.
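The sampling problem above can be made concrete with a toy sketch. The transect length, station spacing, and filament width below are illustrative assumptions, not the actual California Cooperative survey geometry; the point is only that a feature narrower than the gap between stations can fall entirely between them and never appear in the record.

```python
# Toy illustration: a sampling grid coarser than the feature scale misses it.
# A hypothetical "filament" 20 km wide on a 500 km offshore transect,
# sampled at stations spaced 70 km apart (all numbers are illustrative).

filament_start, filament_end = 215, 235   # km offshore

def is_filament(x_km):
    """True if a station at x_km would sample the filament."""
    return filament_start <= x_km <= filament_end

coarse_stations = range(0, 501, 70)       # stations every 70 km
hits = [x for x in coarse_stations if is_filament(x)]
print(hits)   # [] -- the filament falls entirely between stations
```

A grid spaced finer than the filament width (say, every 10 km) would detect it; the coarse grid, designed before the feature was known to exist, cannot.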

The interaction between ideas and observations occasionally results in more subtle failures, which may be further obscured by computing systems. A notable example occurred during the 1982–1983 El Niño/Southern Oscillation (ENSO) event. ENSO events are characterized by a weakening of the trade winds in the tropical Pacific, which results in a warming of the eastern Pacific Ocean. This shift in ocean circulation has dramatic impacts on atmospheric circulation, such as severe droughts in Australia and the Pacific Northwest and floods in western South America and southern California. The 1982–1983 ENSO was the most dramatic event of this century, with ocean temperatures 5°–6°F warmer than normal off southern California. This physical event strongly influenced ocean biology as well. Lower than normal salmon runs in the Pacific Northwest are associated with this major shift in ocean circulation.

The National Oceanic and Atmospheric Administration (NOAA) produces regular maps of SST based on satellite, buoy, and ship observations. These SST maps can be used to detect ENSO warming events. Because of the enormous volume of satellite data, procedures to produce SST maps were automated. When SST values produced by the satellites were higher than a fixed amount above the long-term average SST for a region, the computer processing system would ignore them and would use the long-term average value instead (i.e., the processing system assumed that the satellite measurements were in error). As there was no human intervention in this automated system, the SST fields continued to show "normal" SST values in the eastern tropical Pacific in 1982. However, when a NOAA ship went to the area in late 1982 on a routine cruise, the ocean was found to be significantly warmer than had ever been observed. An alarm was raised, and the satellite data were reprocessed with a revised error detection algorithm. The enormous rise in SST over much of the eastern Pacific was revealed. The largest ENSO event of the century had been hidden for several months while it was confidently predicted that there would be no ENSO in 1982.
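The failure mode of the automated processing can be sketched in a few lines. The climatology value and rejection threshold below are invented for illustration and do not reflect NOAA's actual algorithm; the sketch shows only the general mechanism, in which any retrieval deviating from the long-term mean by more than a fixed amount is assumed erroneous and silently replaced by that mean.

```python
# Hypothetical sketch of a climatology-based quality-control filter,
# illustrating how a fixed rejection threshold can mask a real anomaly.
# All values are illustrative, not NOAA's actual parameters.

CLIMATOLOGY_SST_F = 72.0   # assumed long-term mean SST for one grid cell
MAX_ANOMALY_F = 3.0        # deviations larger than this are treated as errors

def qc_filter(readings, climatology=CLIMATOLOGY_SST_F,
              max_anomaly=MAX_ANOMALY_F):
    """Replace any reading outside climatology +/- max_anomaly
    with the climatological value (the 'assume it is an error' rule)."""
    return [r if abs(r - climatology) <= max_anomaly else climatology
            for r in readings]

# Satellite retrievals during a strong warm event (5-6 F above normal):
enso_readings = [77.5, 78.0, 77.8, 78.2]
print(qc_filter(enso_readings))   # [72.0, 72.0, 72.0, 72.0]
```

Because every genuinely warm retrieval exceeds the threshold, the filter substitutes climatology for all of them, and the output map looks "normal" even while the ocean is at record warmth.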

This episode reveals that the relationship between data and ideas has become more complex with the arrival of computers. The increasing volume and complexity of the data available for Earth science research have forced us to rely more heavily on automated procedures. This capability allows us to cope with the volume, but it depends on precise specification of the filters and models used to sort data in the computer, and those filters may reflect our preconceived notions about what the data should look like. Although computers and networks apparently place more data into our hands more rapidly, the paradox is that the distance between the scientist and the actual physical process is increasing. This "hands-off" approach can lead to significant failures in the overall observing system.

As noted by Theodor Roszak [1], raw data are of little value without an underlying framework. That is, ideas come before data. There must be a context for observations before they can make sense. A simple stream of temperature readings will not advance science unless their context is defined. Part of this framework includes the ability to repeat the measurements or experiment. Such repeatability strengthens the claim of the scientist that the process under study is a general phenomenon with broad applicability. This framework also includes a

Copyright © National Academy of Sciences. All rights reserved.