Before returning to these ideas in more detail, the reader might find a brief history of ocean observations to be useful.
Quantitatively useful ocean observations date back only to about the turn of the twentieth century, which brought several important advances to the field of oceanography and might be said to mark the start of a "modern" era. Around this time, empirical formulas were developed relating salinity, chlorinity, and density; these allowed precise salinity measurements to be made, since samples could be titrated to determine chlorinity. Although the mathematics governing fluid dynamics had been studied for centuries, general physical theories of ocean circulation also developed with great rapidity beginning around the turn of the century. Many of these advances can be attributed to Scandinavian researchers (for a more detailed history, see the Introduction in Sverdrup et al., 1942).
The oceanographic expedition of the German research vessel Meteor, in 1925-1927, was led by Georg Wüst. It is notable for (at least) two contributions. First, Wüst's careful attention to accuracy and detail rendered the Meteor data useful as a baseline for comparison with later measurements of temperature, salinity, and dissolved oxygen. Second, Wüst conceived of and popularized the "core" method for determining the circulation of water masses. This method is based on the assumption that water parcels acquire their physical characteristics while in contact with the atmosphere at the sea surface, and that they retain these characteristics as they sink and flow into the ocean interior. Thus, Wüst concluded, the large-scale circulation of the ocean is reflected in the patterns of the temperature, salinity, and oxygen distributions. This concept is of fundamental importance to observations of deep-water production and circulation, particularly in recent decades, when we have been able to measure many chemical constituents of anthropogenic origin (Schlosser and Smethie, 1995). By the middle of this century, temperature was being determined accurately to within about ±0.02°C and salinity to within about 0.02 permil. The next baseline for observational oceanography was the International Geophysical Year, carried out in the mid- to late 1950s. This coordinated series of expeditions sought to map the physical properties of the entire Atlantic Ocean on a somewhat regular grid, and the resulting data provide the second "snapshot," three decades after Wüst's work, of the North Atlantic temperature and salinity structure. (Fuglister's 1960 atlas presents these data.)
Clearly, because of the limited accuracy of most measurements before 1900, relatively few long time series of measurements useful to the study of climate change exist. Roughly century-long global or regional records derived from operational measurements of such quantities as sea-surface temperature, sea-ice cover and extent, and sea level have been accessible to observers for much longer than quantitatively useful deep-ocean measurements. Decades-long time series of deep-ocean properties do exist, but they are generally either mid-ocean and very isolated, or spatially extensive but limited to coastal waters. Fisheries provide strong economic motivation for such programs as the 40-year time series from the CalCOFI hydrographic cruises off the coast of California, and some of the repeated hydrographic data sets maintained for years off the coast of Japan. For several decades, a number of mid-ocean stations were occupied regularly by the ocean weather ships for the purpose of providing marine weather forecasts.
Long time series collected explicitly for climate studies or other research purposes are virtually nonexistent. An outstanding exception is the time series of temperature and salinity from the Panulirus station, located in deep water just off the coast of Bermuda, which has already contributed to studies of long-term variability in the properties of the North Atlantic. Finally, there exists a vast archive of expendable bathythermograph (XBT) data collected from merchant ships; it is global in extent, with remarkably dense coverage in the North Pacific, where the NORPAX program is in its third decade. Although the XBT measures temperature as a function of depth only to 400 or 750 m (sometimes 1500 m), prediction of decade-to-century-scale ocean variability will require emphasis on such upper-ocean monitoring.
Parker et al. (1995) describe how useful baseline data sets can be constructed from historical records to yield a more comprehensive record of, in this case, monthly sea-ice and sea surface temperature fields dating back to January 1871. Since the resulting time series may exceed a century in length, it can be used for forcing and testing numerical models designed to examine variability at decade-to-century time scales. Mysak et al. (1990), analyzing sea-ice concentration and ice-limit data collected over almost 90 years, found decadal-scale fluctuations in sea-ice extents and related them to other processes in the Arctic in a "negative feedback loop." Mysak (1995) reviews the evidence for such self-sustained climatic oscillations, presents more recent evidence strengthening these conclusions, and suggests links between the Arctic cycle and interdecadal variability at lower latitudes. That some of these observed interdecadal fluctuations are regular implies that they may in fact be predictable. Douglas (1995) reaches somewhat more negative conclusions in analyzing two sets of tide-gauge records (80 and 141 years long): he finds no statistical evidence for an acceleration of global sea-level rise, which is predicted to accompany global warming. A major difficulty is that the interdecadal signal overwhelms any longer-term trend. However, an understanding of the physics involved in sea-level rise would allow this interdecadal signal to be removed from tide-gauge records, and would