Ocean Infrastructure for 2030: Categories and Trends
Changes in infrastructure over the past two or more decades provide an important perspective when planning for the next two decades. The committee identified trends in the development and use of supporting infrastructure for ocean research, focusing mainly on the past 20 years (1990-2010), as a means to extrapolate toward 2030. When taken in association with the major research questions found in Chapter 2, these trends guided the committee’s discussion of infrastructure categories that should be included for planning the next 20 years and are achievable with attention and support. Many of the questions deal with changes in spatial and temporal range and resolution, needs for more precise, accurate sensors, or development of advanced sensors for important physical and biogeochemical properties. Where possible, these are discussed in terms of changes over the past 20 years and likely trajectories for the next two decades. Infrastructure assets and trends are divided into the following categories: mobile and fixed platforms, in situ sensors and sampling apparatus, remote sensing, modeling, computational and network services, and supporting infrastructure.
The chapter focuses on common or shared infrastructure rather than supporting infrastructure generally found in the inventory of an individual scientist, as this is often prototype or highly specialized. Many current ocean infrastructure assets began in this manner and were nurtured to maturity over a period of years by astute sponsors. This leads to another emerging challenge related to agency support for the development of new instruments. Many of the sensors and platforms currently in widespread oceanographic use arose from investments by the Office of Naval Research (ONR) under the aegis of national security. The ONR technology investment is no longer strongly aligned with many of the ocean research questions expected to be of interest in 2030, leading to its diminished role in sustained funding for “high-risk, high-reward” ocean infrastructure. To foster innovation and technological advancements in the ocean sciences, federal agencies will need to encourage a risk-taking environment. However, this is difficult under the current peer-review system.
A brief review of usage and trends associated with each specific type of infrastructure is provided, with supporting information drawn from examinations of referenced reports, presentations by invited speakers, community input, and committee members’ expert judgment.
Technology and infrastructure trends for the future are then discussed, including ways in which ocean infrastructure will need to evolve to meet future research goals, and the types of capability that will need to be developed.
The UNOLS and Federal Fleets
Oceanography has historically required access to the sea, and it is anticipated that ships will continue to be an essential component of ocean research infrastructure (USCOP, 2004; NRC, 2009b). The past few decades have seen a trend toward lower total ship days per year for the University-National Oceanographic Laboratory System (UNOLS) academic research fleet (a 13 percent decline from 2000-2008; NRC, 2009b). At the same time, operational days for the largest research vessels (Global Class) have generally increased over the past 20 years; they are the most highly subscribed vessels in the fleet. This trend may be related to increasing interdisciplinary and multidisciplinary science, as well as the Global Class’s ability to support multiple science operations with a larger science party, greater laboratory areas, and more deck space (NRC, 2009b).
The UNOLS Fleet Improvement Plan (2009) projects reductions of nearly 40 percent in available ships by 2025, due to ship retirements and fewer new vessels entering the fleet, yet a lower demand for access to the ocean is not anticipated. The cost of ship operations increased 75 percent from 2000 to 2008, largely influenced by rising crew and fuel costs (Fleet, 2009b; Figure 3.1). Over the past 10 years there have been several instances of academic research vessels being laid up to offset rising costs, resulting in fewer ship days being funded. There has been continued use of ships of opportunity (e.g., foreign icebreakers, small ships with global capability to deploy autonomous platforms) and specialized ships (e.g., submersible support ships, fisheries vessels), some of which are part of the UNOLS or federal ship fleets. This move toward specialized ships reflects an effort to optimize the limited resources available for seagoing operations. It also supports the idea that the recent decline in funded ship days for the academic research fleet does not reflect a corresponding lack of science demand, but is instead driven by agency budgets and investigators’ proposal success rates (NRC, 2009b).
Mission-oriented marine research and survey ships are currently operated by the National Oceanic and Atmospheric Administration (NOAA) and the Environmental Protection Agency, among others, to support their congressional mandates for efforts such as fisheries surveys, ecosystem assessments, water quality assessments, hydrographic surveys, and seafloor mapping (Interagency Working Group on Facilities, 2007). NOAA has recently acquired four advanced, acoustically quiet fishery survey vessels and has several more being built or planned. In support of priority objectives laid out in the National Ocean Policy (CEQ, 2010; E.O. 13547), these ships will remain essential components of ocean research infrastructure.
The nature of shipboard work may change as a consequence of increasing numbers and capabilities of over-the-side systems (NRC, 2009b), which will increase operational efficiency. Increasingly multidisciplinary and interdisciplinary research requires vessels with support for a wide diversity of platforms and instruments, and increasing ship costs motivate greater use of autonomous assets. To meet these needs, the past two decades have seen significant increases in dynamic positioning and station holding capabilities, multibeam and sidescan sonar systems, and more complex sensors and instrumentation. This has also led to an increasing dependence on shipboard science technical support. One metric for planning future fleet capacity and capability
could be the number of scientists using the academic fleet in larger interdisciplinary groups vs. those in smaller, focused campaigns, taking into account potential locations for future research. Another metric could be the number and capabilities of extended duration instruments, including autonomous vehicles, which could lessen the number of scientists at sea. Future trends include a fleet composed of both adaptable, general purpose platforms and specialized ships to meet a broad range of research activities; sustaining the number of larger, general purpose platforms; and growing the capabilities and numbers of smaller ships. The committee endorses the following recommendation from the 2009 NRC report Science at Sea: Meeting Future Oceanographic Goals with a Robust Academic Research Fleet: “The future academic research fleet requires investment in larger, more capable, general purpose Global and Regional class ships to support multidisciplinary, multi-investigator research and advances in ocean technology.”
Icebreakers and Other Polar Assets
With the loss of polar assets over the past two decades, there is diminished capability for the United States to address polar science questions. The United States currently conducts high-latitude oceanographic research using a combination of U.S. Coast Guard icebreakers, charters, and international partners (NRC, 2009b), as well as limited use of U.S. Navy submarines. Icebreakers are uniquely capable of carrying out ship-based science in ice-covered oceans; as such, they require specialized construction, operations, and maintenance. While the reduction of ice cover in the Arctic during summer and fall has been dramatic (e.g., Stroeve et al., 2008), ensuring access to both the Arctic and Antarctic in the foreseeable future will still require the ability to operate in fully or partially ice-covered areas. Nuclear submarines provide a unique under-ice capability; from 1993 to 2005, the U.S. Navy made these available to civilian ocean science researchers through the Scientific Ice Expeditions program (SCICEX Science Advisory Committee, 2010). Nuclear submarines complement icebreakers and have potential for increased ocean research use, but are not a replacement for future needs. They provide an efficient mapping platform (e.g., for multibeam operations) but do not support the types of over-the-side operations that are and will be carried out from a ship. They are also very expensive for routine science missions, and unlikely to become less so.
While scientific research at high latitudes is characterized by a high level of international collaboration, the loss of U.S. icebreaker capability may become an issue of national security and competitiveness in future years. The committee endorses the following recommendation from the 2007 NRC report Polar Icebreakers in a Changing World: An Assessment of U.S. Needs: “The United States should continue to project an active and influential presence in the Arctic to support its interests.”
Scientific Ocean Drilling Platforms
From 1985 to 2003, the oceanographic community had access to the JOIDES Resolution riserless drillship as part of the Ocean Drilling Program (ODP) and, later, the Integrated Ocean Drilling Program (IODP). After a refit from 2006 to 2009, the JOIDES Resolution returned to service and is expected to remain available for science operations through the end of IODP in 2013. In 2000, the Japanese riser drillship Chikyu was built and has since been used for science operations in support of IODP. The number of operational days for the JOIDES Resolution has decreased 30 percent between 2003 and 2009 (Brad Clement, personal communication, 2010). International agreements, such as those used by IODP to ensure access to very expensive infrastructure assets like drillships, are perhaps one method to increase the use and efficiency of ocean research infrastructure worldwide. Leasing arrangements with the industrial sector may also be an option to pursue (Fleet Review Committee, 1999).
A national long-range plan for the overall capacity and mix of capabilities of the U.S. academic research vessels is clearly warranted (e.g., Fleet Review Committee, 1999; Federal Oceanographic Facilities Committee, 2001; USCOP, 2004; NRC, 2009b; UNOLS, 2009). Such a plan could lay out the resources needed for technology upgrades and new construction, and phase out of older platforms; explore usage trends and alternative options for use, such as leasing; direct interagency agreements and international opportunities; and provide a roadmap for tracking progress. The committee endorses the following recommendation from the 2009 NRC report Science at Sea: Meeting Future Oceanographic Goals with a Robust Academic Research Fleet: “Federal agencies supporting oceanographic research should implement one comprehensive, long-term research fleet renewal plan to retain access to the sea and maintain the nation’s leadership in addressing scientific and societal needs.”
Human Occupied and Remotely Operated Vehicles
Since the early 1990s, the dominant working platforms for the deep ocean science community have been human occupied vehicles (HOVs) and remotely operated vehicles (ROVs). In a much more limited capacity, U.S. Navy nuclear submarines have also been used (see previous section). Prominent among the current platforms are the HOV Alvin and the ROVs Jason and Jason II, in part because of their participation in the National Science Foundation-funded National Deep Submergence Facility (NDSF). Although Alvin use has decreased by approximately 20 percent over the past two decades (1,339 dives from 1990 to 1999; 1,070 dives from 2000 to 2009), there has been a dramatic increase in both the number of ROVs available and their use for science. For example, Jason and/or Jason II dives increased from 162 during 1990-1999 to 527 during 2000-2009 (Annette DeSilva, personal communication, 2010). Other non-NDSF funded ROVs, operated by several U.S. institutions, have also seen increases in usage over this timeframe. For example, Monterey Bay Aquarium Research Institute (MBARI) ROVs logged approximately 3,500 dives during the same time period (Steve Etchemendy, personal communication, 2010).
The increase in ROV use reflects a variety of factors, including advancements in robotic technologies, such as better manipulator dexterity; payload capacity comparable to that of HOVs; and longer sustained dive times. There is also more use of telepresence, which allows shore-based audiences to participate virtually in ROV operations. Current industry use of ROVs offers some possibilities for next generation science, including higher power systems and multiple vehicles operating in the same area. Based on the committee’s assessment of science questions in 2030, the demand for highly capable ROVs is very likely to increase, while the demand for HOVs is likely to remain stable. Although HOV use has declined modestly in the past two decades, ongoing and planned Alvin upgrades will increase its depth rating from 4,500 to 6,500 m, enabling it to operate in 98 percent of the ocean.
One future direction may be in the use of hybrid vehicles that combine components of traditional ROVs and autonomous underwater vehicles (AUVs) for greater capability and operations at full ocean depth, such as the hybrid ROV Nereus. Another may be in increased use of nonnuclear submarines, such as smaller air-independent propulsion platforms, which are common in navies other than the United States.
Submersible vehicles have also seen increasing sophistication in sensors and sensor payloads, as well as in the quality and ease of navigation. To eliminate the time required to deploy and calibrate long-baseline transponder arrays, there has been a trend toward combining GPS navigation and ultra-short baseline acoustic tracking on the ship, to determine the position of underwater vehicles, with DVL (Doppler Velocity Log)-aided inertial navigation systems on the underwater platforms (e.g., Whitcomb et al., 1999; Kinsey et al., 2006, and references therein), achieving positioning accurate to within meters. This is a critical need for addressing many of the science questions anticipated in 2030.
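DVL-aided navigation is, at its core, dead reckoning: body-frame velocities from the Doppler Velocity Log are rotated into Earth coordinates using the vehicle’s heading and integrated over time. A minimal sketch of that update step follows; the function, variable names, and frame conventions are illustrative, not drawn from any cited system.

```python
import math

def dead_reckon(x, y, heading_rad, u, v, dt):
    """One DVL-aided dead-reckoning step (illustrative).

    (u, v) are body-frame velocities from the DVL (forward, starboard)
    in m/s; heading_rad comes from the vehicle's compass or inertial
    sensor. Rotating into East-North coordinates and integrating over
    dt seconds gives the updated position in meters."""
    east = u * math.sin(heading_rad) + v * math.cos(heading_rad)
    north = u * math.cos(heading_rad) - v * math.sin(heading_rad)
    return x + east * dt, y + north * dt

# Example: vehicle heading due east (090 degrees), 1 m/s forward for 10 s
x, y = dead_reckon(0.0, 0.0, math.radians(90), 1.0, 0.0, 10.0)
```

In practice, inertial sensors supply heading and attitude at high rate, the DVL constrains velocity drift, and ultra-short baseline acoustic fixes from the ship periodically bound the accumulated position error.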
Towed platforms became critical components of ocean exploration during the past two decades, capturing acoustic and optical imagery as well as oceanographic data and samples in many environments, ranging from just below the sea surface to the deep seafloor (e.g., Wiebe et al., 2002; Davis et al., 2005). Unlike analogous sensors mounted on ship hulls, sensors mounted on towed platforms can be deployed more flexibly from a range of vessels, including ships of opportunity; moreover, their depth can be adjusted from the surface vessel, allowing targeted sampling throughout the water column. Often the cable connecting a towed system to the surface vessel serves as its own platform for small sensors like thermistors and plume recorders (Baker and Milburn, 1997), which provide nearly synoptic views of the water column during the towed system’s primary mission. In the past decade, seafloor survey operations have begun to shift from towed vehicles to AUVs, particularly in deep water. While towed vehicles can be supplied power from the ship, and can therefore operate higher-power sensors, AUVs can operate at higher speeds than is typical of deep-towed systems, offer a very stable platform for sonar sensors, and are capable of closely following seafloor terrain. Towed systems are nonetheless likely to remain a method for collecting samples, including seawater from depth for shipboard analysis, in the near future. As AUV capabilities increase, there is likely to be some impact on the use of towed systems, especially in areas where towed systems are difficult to deploy, such as ice-covered seas. AUVs are currently the preferred sonar mapping platform in commercial industries such as oil and gas; as AUVs mature and their cost of operation drops, towed platform applications will likely continue to migrate to AUVs.
Autonomous and Lagrangian Systems
Autonomous and Lagrangian platforms operate without tethers to ships or to the seafloor (Rudnick and Perry, 2003). Included in this class of devices are drifters that move with the surface current, floats with adjustable buoyancy that profile the water column from surface to depth, underwater gliders that fly horizontally with up-down profiling, and self-propelled AUVs. This category of platforms has seen a remarkable increase in capabilities, numbers, and use over the past two decades (Dickey et al., 2008).
The increasing effectiveness of autonomous and Lagrangian platforms has been influenced by “consumer” technologies driven by commercial markets outside ocean science. Circa 1990, there were only a few 8-bit microprocessor systems with sufficiently low power consumption for autonomous deployments, and they had volatile solid-state memory and limited computational power and data storage. In 2010, processors with orders-of-magnitude-higher computational power can navigate systems, command sensors
and actuators, adapt missions, and retain gigabytes of data in robust solid-state memory. There have been parallel improvements in power availability, including the transition from alkaline to lithium batteries. Consumer-driven advances in microelectronics are likely to continue to benefit the ocean research community through increased platform capabilities. This will be enabled by modular platforms that can easily accommodate rapidly evolving sensors.
In coming years, autonomous and Lagrangian platforms are likely to be deployed in larger numbers to provide improved spatial coverage and resolution during process studies, routine monitoring, and event response. This will create a need for scalable arrays of devices, optimized for the specific task and available at locations of interest. In sufficient numbers and with a sustained presence, such arrays can provide the data needed for routine model assimilation and skillful forecast models.
Drifters
The first observations of ocean flow were probably made with surface drifters, including work by Benjamin Franklin (1785) and Irving Langmuir (1938). With the advent of satellite communication in the 1970s and 1980s, the use of drifters increased rapidly. Global deployment takes place through the Global Drifter Program, an array that grew from fewer than 100 satellite-tracked drifters in 1988 to at least 1,250 in 2010. Drifters can carry a wide variety of sensors, measuring such variables as temperature, salinity, wind, light, passive radiation, and atmospheric pressure; these types of observations have led to global maps of surface circulation (Niiler et al., 2003). The use of drifters is seeing growing application in the coastal ocean, especially in dispersion studies (e.g., pollutant tracking, larval transport). Due to their wide commercial availability, relatively low cost, and ease of use, drifters will continue to be used. A broader suite of sensors, especially for ocean-atmosphere flux studies and monitoring, is needed for future science research. Newer developments in drifter-like assets include surface floats that can generate propulsion from wave action near the surface, which allows them to travel independently of the local surface drift.
Floats
The first neutrally buoyant floats were designed to observe subsurface currents (Swallow, 1955). During the 1970s and 1980s, float tracking began to make use of the ocean sound channel, and eventually autonomous profiling floats were developed to periodically surface for navigation updates and data telemetry by satellite (Davis et al., 1992). In addition to velocity measurement, floats have measured a wide and growing variety of oceanic variables (e.g., temperature, salinity, chlorophyll fluorescence, dissolved oxygen, nitrate); this variety is almost certain to increase by 2030. Because floats are stable, they are also able to observe challenging quantities like turbulent microstructure and vertical velocity (D’Asaro, 2008). Today, the international Argo program sustains at least 3,000 floats in the global ocean, each providing a 1,000- or 2,000-m profile of temperature and salinity once every 10 days (Roemmich et al., 2004); the present 3,000-float array was populated in less than 10 years. Future trends include increases in the number of floats and the variety of observations; enhanced two-way satellite communication for active piloting and adaptable missions; full profiling of the entire water depth; and under-ice capabilities to extend float coverage to high latitudes. The need for longer endurance across a wide range of sensor types and environments will undoubtedly bring challenges in power requirements; these might be met by innovative methods of energy storage or harvesting. The Argo-type float array has been very successful and shows great promise as a robust, low-cost global capability that can provide subsurface observations to inform both at-sea campaigns and skillful ocean models.
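The sampling rate implied by these figures is straightforward to quantify: a sustained array of 3,000 floats, each profiling once per 10-day cycle, delivers on the order of 300 profiles per day. This is a back-of-the-envelope check, not a figure from the report:

```python
floats = 3_000          # sustained Argo array size
cycle_days = 10         # each float profiles once per 10-day cycle

profiles_per_day = floats / cycle_days        # 300 profiles/day
profiles_per_year = profiles_per_day * 365    # 109,500 profiles/year
```

By comparison, a single research vessel occupying a few hydrographic stations per day cannot approach this coverage, which is why float arrays have become the backbone of sustained subsurface observation.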
Gliders
Underwater gliders are the fulfillment of Stommel’s (1989) vision of buoyancy-driven devices that profile vertically while flying horizontally on wings. In the past decade, gliders have transitioned from prototypes (Eriksen et al., 2001; Sherman et al., 2001; Webb et al., 2001) to widely used tools for a variety of research purposes (e.g., Davis et al., 2003; Rudnick et al., 2004; Glenn et al., 2008; Hodges and Fratantoni, 2009), with several hundred now in operation. For example, the Navy has commissioned 150 gliders for use in both oceanographic research and national security (Rusling, 2009). Gliders can carry many types of sensors (e.g., temperature, salinity, velocity, nutrients, optics, fluorometry, acoustics), a suite that is likely to grow in the next two decades. Because gliders are typically recovered and reused (unlike many floats and drifters), pools of gliders can be made available for event response; for example, the scientific community mobilized several gliders in response to the Deepwater Horizon oil spill. With more robust capabilities, including the ability to work under ice and in other extreme environments, and with longer endurance, gliders are very likely to become ubiquitous elements of regional ocean observing systems by 2030. A likely trend is toward easier deployment, perhaps from ships of opportunity, offshore platforms, or aircraft. In the next 20 years, gliders may become inexpensive enough to lessen the need for recovery.
Autonomous Underwater Vehicles
AUVs are self-propelled, uncrewed underwater vehicles. Basic characteristics include a power source, payload capabilities, and onboard controls capable of executing missions without regular human supervision. AUVs have been configured to carry a wide variety of in situ sensors, including water samplers. In comparison to gliders or floats, AUVs are more flexible platforms because they can travel at a chosen depth
as well as steer, climb, and dive in response to commands, preprogrammed instructions, or adaptable observation strategies. Although most current AUVs are optimized around higher power payloads (e.g., multibeam or side-scan sonar) and therefore have generally shorter endurance than gliders (days versus months), in principle they will be capable of greatly increased range and endurance by 2030. A prototype long-range AUV was recently demonstrated (Bellingham et al., 2010). As with gliders, most AUVs can operate in a range of environments (e.g., the continental shelf [Brown et al., 2004; Johnson and Needoba, 2008]; coral reefs [Shcherbina et al., 2008]; under ice [Nicholls et al., 2008]) and can be deployed from multiple platforms. The oil and gas industry routinely uses AUVs for deepwater mapping, the U.S. Navy has spent at least two decades making large investments in AUV technology for a range of military applications, and NOAA uses multi-instrumented AUVs that can be deployed from its fisheries survey vessels to augment a variety of marine ecosystem investigations.
In 1990, there were no AUVs in routine operation for science; today, a range of commercially available vehicles exists. While AUVs are still in their infancy as platforms, substantial improvements in their capabilities, reliability, and usability can be expected over the coming decades.
Energy storage is a fundamental limitation for all autonomous systems at sea. Although battery technology has advanced in past decades, progress has been incremental rather than revolutionary. Development of new battery systems has been driven primarily by the portable electronics industry, to power devices such as cell phones and laptops. However, the advent of electric cars promises to generate further technical advances relevant to marine instrumentation. Not only may this industry create new high-energy-density systems, but it is also likely to encourage an increased focus on safety, a particular concern in marine applications. There are also some classes of electrochemical energy storage systems peculiar to the marine environment, including seawater batteries that depend on the surrounding environment for an oxidizer. Advanced lithium-based seawater batteries with very high specific energy have been developed in prototype and may be in common use by 2030.
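The practical effect of specific energy on platform endurance follows from a simple energy budget: endurance equals pack energy divided by average power draw. The pack mass, power draw, and specific-energy values below are illustrative assumptions chosen for the sketch, not figures from the report:

```python
pack_mass_kg = 10.0   # assumed battery pack mass
avg_power_w = 5.0     # assumed average platform power draw

def endurance_days(wh_per_kg):
    """Endurance in days for a pack of the given specific energy (Wh/kg)."""
    pack_energy_wh = pack_mass_kg * wh_per_kg
    return pack_energy_wh / avg_power_w / 24.0

alkaline_days = endurance_days(80)   # roughly a week at ~80 Wh/kg
lithium_days = endurance_days(300)   # several weeks at ~300 Wh/kg
```

The same budget explains the divide between platform classes: low-drag gliders drawing a watt or less achieve months of endurance, while survey AUVs running multibeam sonar at tens to hundreds of watts are limited to days.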
Environmental energy (sun, wind, wave, thermal, chemical) offers a promising route to power the growing inventory of autonomous platforms used for oceanographic research. Solar power on ocean moorings was rare in the 1990s and is routine today, as are wind power generators. Solar-powered AUVs that recharge their batteries at the ocean surface have been tested (Crimmins et al., 2006). One type of profiling drifter uses ocean temperature differences to generate electrical power. There has also been development of autonomous surface vessels that scavenge energy for propulsion; one device uses wave energy for propulsion and has demonstrated ranges of thousands of kilometers even in low sea states (Willcox et al., 2009). Autonomous sailing vessels have also been developed (Neal, 2006) and have potential to serve as research platforms.
In addition to the broad categories of systems described in earlier sections, a number of platforms have been developed either as prototype systems or as specialized solutions to specific sensing problems. For example, seafloor experiments and observations can be carried out by benthic landers or crawlers (e.g., Sayles, 1993; Smith et al., 1997). These range from comparatively simple sensor platforms to systems capable of carrying out perturbation experiments on the seafloor (Sherman and Smith, 2009). With the installation of scientific cabled observatories, some of these systems are being designed to operate attached to a cabled system, while others are intended to operate autonomously. The power and bandwidth available through cabled systems can be used to extend AUV operations, potentially making them independent of a ship for extended periods. AUV docking has been demonstrated by many groups (Cowen et al., 1997; Singh et al., 2001; Stokey et al., 2001; Evans et al., 2003; Fukasawa, 2003; Allen et al., 2006), with more recent work exploiting the capabilities of cabled observatories (McEwen et al., 2008). Another developmental concept with ocean research applications is the unmanned aerial vehicle (UAV) equipped with GPS, energy-harvesting solar cells, and diverse sensor packages; such UAVs could monitor the ocean surface in the same manner as a drifting buoy and reposition themselves via flight (Meadows et al., 2009).
FIXED PLATFORMS AND SYSTEMS
Since the development of moored surface buoys in the 1960s, mooring advances have enabled a wide range of studies addressing fundamental climate, weather, physical, and biogeochemical questions. Arrays of moorings provide the backbone of many ocean networks today, from ocean-atmosphere interaction studies to global tsunami warning, with increased utility through real-time two-way communications and profiling capability. Although their uses may evolve, moorings will remain a key element of ocean observing infrastructure by providing high-frequency, fixed-location data to supplement spatial data collected by mobile sampling networks and satellite remote sensing. Importantly, they also mark the surface location of subsurface infrastructure and sensor networks; even without sensors, moorings therefore provide an invaluable service. Within the United States, only a limited number of federal and academic institutions maintain the expertise to build reliable deep-ocean moorings and to overcome the difficult operating conditions encountered in the ocean. Coastal moorings, which often
have lesser observational requirements but more challenging surface environments and hazards, have attracted a larger number of commercial, federal, and academic institutions capable of development and deployment. Mooring systems will continue to be critical for both fundamental research and routine monitoring needs through 2030.
The need for sustained, long-term scientific observations and data collection in the coastal and deep ocean (NRC, 2003a) has resulted in the deployment of seafloor cables, which provide high power and bandwidth and continuous real-time two-way communications. In the 1990s, early systems either deployed dedicated seafloor cables or took advantage of existing telecommunication cables no longer used by industry. For example, the Japanese DONET cable was driven by a national need to better understand undersea earthquakes. During the past 10 years, the distribution and capabilities of science cables have expanded globally, with many countries now deploying seafloor networks. The United States currently has several existing or planned cables (e.g., Long-term Ecosystem Observatory, Martha’s Vineyard Coastal Observatory, Kilo Nalu Nearshore Reef Observatory, Monterey Accelerated Research System, OOI Regional Scale Node). Use of seafloor cables will increase in the coming decades because of their ability to host a wide variety of platforms and sensors and their high power and bandwidth capability. The large-scale construction and installation of cabled observatories has begun only recently, along with early-stage instrument development. Because widespread scientific use is still in the future, the impact of cabled observatories cannot yet be predicted. A future trend could include some means to remotely recover physical samples in lieu of research cruises, perhaps via released data capsules collected by unmanned vehicles.
Since 1991, over a dozen borehole observatories (Circulation Obviation Retrofit Kits [CORKs]) have been installed in ODP and IODP borehole sites to characterize subseafloor hydrological regimes. These platforms were first envisioned in the late 1980s as a method to investigate hydrologic perturbations in the subseafloor associated with faulting and diking, tidal forcing, and other physical events (Davis et al., 1992; Becker and Davis, 2005). Since that time, CORKs have been augmented with fluid and microbial sampling capabilities, thermistor arrays, pressure sensors, and in situ seismometers and strain gauges. In the past few years, active tracer experiments between boreholes have measured formation permeability and flow rates in subseafloor aquifer systems. Because study of the subseafloor currently suffers from very sparse in situ observations, the numbers of borehole observatories and the types of sensors available for deployment are likely to grow in the coming decades. In association with cabled observatories, some CORKs can and will be able to utilize high power and bandwidth for real-time monitoring of basement conditions. With increased power capabilities, borehole sensors could expand to include mass spectrometers and in situ microbial analyzers for coregistered measurements of chemical properties and subseafloor microbial communities.
In the past two decades, use of floats, gliders, ROVs, AUVs, and scientific seafloor cables has increased; use of ships, drifters, moorings, and towed arrays has remained stable; and use of HOVs has declined. Based on these trends, utilization and capabilities of floats, gliders, ROVs, AUVs, ships, and moorings will continue to increase over the next 20 years, while HOV use is likely to remain stable. Ships will continue to be an essential component of ocean research infrastructure; however, the increasing use of autonomous and unmanned assets may change how ships are used. Cabled observatories are only now being installed on a large scale, and while their use will undoubtedly increase due to increased availability, the nature of their scientific impact cannot be predicted.
DATA TELEMETRY AND COMMUNICATIONS
Communications to and from platforms at sea have changed dramatically in the past two decades. In 1990, scientific communications from ship to shore occurred primarily through voice calls patched through a satellite to a shore operator and then connected as a collect phone call. By 2000, scientists at sea had access to email that was exchanged between ship and shore a few times per day, allowing for limited communications and data exchange. Today, real-time connection to the Internet is routine, including the ability for real-time video transmission. These fleet improvements have greatly increased the capacity to conduct complex, interdisciplinary projects, to draw on the broader community of scientific expertise, and to engage the public.
An array of low-power, low-Earth-orbit satellite communication systems has enabled rapidly evolving capabilities for communications to autonomous platforms. In 1990, the Argos satellite system was the primary link for scientific data from remote platforms. Communications were only one way, from platform to shore, and data transmission was limited to about 16,000 bits/day. Today, the Iridium satellite system provides global coverage with two-way communications at a rate of 2,400 bits/second, roughly a 10,000-fold improvement in daily throughput. Seafloor cabled networks offer much higher, bidirectional bandwidths but are likely to be limited to a few fixed locations for the foreseeable future. Both types of systems allow scientists on shore to operate sensor systems in an adaptive mode, based on the data taken or on other sources of information, such as remote sensing imagery.
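The scale of that improvement follows directly from the figures above; the comparison below assumes continuous use of the link, which is an upper bound, since protocol overhead and limited satellite passes reduce real throughput.

```python
# Rough daily-throughput comparison of the two satellite links
# described above. Continuous use is assumed, so this is an
# upper bound on the real-world improvement.

ARGOS_BITS_PER_DAY = 16_000          # one-way, ~16,000 bits/day
IRIDIUM_BITS_PER_SECOND = 2_400      # two-way link rate

SECONDS_PER_DAY = 86_400
iridium_bits_per_day = IRIDIUM_BITS_PER_SECOND * SECONDS_PER_DAY

improvement = iridium_bits_per_day / ARGOS_BITS_PER_DAY
print(f"Iridium: {iridium_bits_per_day:,} bits/day "
      f"(~{improvement:,.0f}x Argos)")   # ~13,000x if used continuously
```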
Some of these communication technologies have been essential to the development of ocean science capabilities and have no equivalent replacement. For example, virtually all low-Earth-orbit satellite communication systems have gone bankrupt at some point. Without support from sources such as the Department of Defense, key communication systems such as Iridium might not currently be available. Unfortunately, the means of communication for autonomous systems generally remains fixed for the duration of a long deployment or the platform's lifetime, often years. The risk of a single-point failure due to a sole means of communication is clear and argues for some redundancy in data pathways, as well as a set of standards common to any provider. An innovative redundancy solution is "store and forward" capability, which could be located on commercial ships and aircraft, offshore platforms, or even miniature satellites. These systems could provide backup capabilities, as well as services in areas that currently have poor coverage, such as polar regions. Another solution by 2030 could be networked devices that pass information along to other members until the data arrive at a node with connectivity to shore. Advances in the application of key enabling infrastructure like GPS will continue to be driven by commercial activity but could lead to breakthroughs in geolocation. Two-way communications, especially for platforms, have been truly transformative in the past two decades and will remain essential to ocean research infrastructure assets in the future. However, key infrastructure components are reliant on technologies outside of the ocean science community, particularly satellite communications and GPS.
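The store-and-forward and networked-relay ideas above can be sketched as a simple hand-off protocol. The class and method names below are purely illustrative, not from any deployed system; real delay-tolerant networks add acknowledgments, deduplication, and routing.

```python
# Minimal sketch of "store and forward" relaying: a platform buffers
# observations until it encounters a peer, handing data along until
# it reaches a node with shore connectivity. All names here are
# illustrative assumptions, not an actual protocol.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    has_shore_link: bool = False
    buffer: list = field(default_factory=list)

    def store(self, message: str) -> None:
        self.buffer.append(message)

    def forward(self, peer: "Node") -> None:
        # Hand off everything we hold; the peer stores it in turn.
        peer.buffer.extend(self.buffer)
        self.buffer.clear()

# A float passes data to a passing glider, which later reaches a
# cabled node with shore connectivity.
float_node = Node("profiling float")
glider = Node("glider")
gateway = Node("cabled observatory", has_shore_link=True)

float_node.store("CTD profile 0042")
float_node.forward(glider)      # opportunistic encounter at sea
glider.forward(gateway)         # delivery to a connected node

assert gateway.has_shore_link and gateway.buffer == ["CTD profile 0042"]
```

The appeal of this pattern is that no single platform needs its own satellite link; connectivity becomes a shared property of the network.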
IN SITU SENSORS
Mobile and fixed platforms provide access to the ocean, but the sensors that operate aboard them are the essential elements that enable observations over broad spatial and temporal scales. Many new platforms have enabled the transition from infrequent ship-based measurements to a sustained ocean presence, but there is a continuing need for innovative, robust, low-cost sensors to explore the ocean. The types of data collected 20 years ago to simply constrain initial conditions for ocean models are now routinely used in real-time, data-assimilating forecast models. Modeling needs for a variety of societal objectives will continue to grow in the coming decades, and the in situ data collected from the ocean will need to reflect a broad range of processes and constrain parameters for best model fidelity. Trends for the future include more multidisciplinary sensor packages with long endurance, stability, and range in multiple operating environments. Along with improved performance and reliability, it will be essential to get precise sample and data locations in the undersea environment, especially with the almost ubiquitous use of geographic information systems and the increasing move toward coastal and marine spatial planning, as outlined in the National Ocean Policy (CEQ, 2010). The problem of biofouling in the upper ocean, however, remains a challenge for the sustained performance of oceanographic sensors.
The primary in situ sensors for physical oceanography are integrated conductivity, temperature, and depth (CTD) units and sensors for current velocities. The CTD was introduced in the 1970s and by the 1990s was commonly used in shipboard operations. In 2010, CTDs were common on almost all in situ platforms (e.g., moorings, floats, AUVs). The U.S. National Oceanographic Data Center receives about 5,000 ship-generated vertical CTD data profiles each year, while profiling floats currently deliver about 10,000 profiles per month, albeit only to depths of 2,000 m (Freeland et al., 2009). In 1990, the state of the art for observing ocean currents was moored, mechanical current meters, and acoustic Doppler current profilers (ADCPs) had just been introduced as a commercial product. Today, nearly all current measurements are from ADCPs, which can sample over broad depth ranges at variable resolutions, can provide vertical velocity, are immune to most fouling, and have high reliability. Acoustic Doppler velocimeters, which sample three-dimensional velocity in one location at high frequencies, are now enabling measurements of turbulent energy and can provide an estimate of turbulent fluxes when coupled with other rapid sampling sensors (e.g., for O2; Lorrai et al., 2010).
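Annualizing the profile counts cited above makes the shift toward autonomous sampling concrete (a rough comparison only; ship and float profiles differ in depth range and sensor suite):

```python
# Annualized comparison of the CTD profile volumes cited above.
ship_profiles_per_year = 5_000      # ship-generated profiles to NODC
float_profiles_per_month = 10_000   # profiling floats (to 2,000 m)

float_profiles_per_year = float_profiles_per_month * 12
ratio = float_profiles_per_year / ship_profiles_per_year
print(f"Floats: {float_profiles_per_year:,} profiles/year, "
      f"~{ratio:.0f}x the ship-based count")
```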
Although basic sensor technologies for physical oceanography are well established, the challenge will be to extend observations across all spatial and temporal scales, including to the microscales at which turbulent dissipation takes place. This is likely to lead to high volumes of data at smaller scales and higher frequencies. Another area of importance will be sensors that measure fluxes (heat, mass, and momentum) at the ocean surface, coupled with gas exchange rates for chemically active and inert components. Together, these data will be critical to understanding ocean-atmosphere interactions, particularly during high wind and storm events. At larger scales, acoustic methods that enable remote sensing of the ocean interior and tomography are expected to continue. Their application may be more likely through adaptive arrays from a mix of mobile platforms. Optical and radar remote sensing techniques for ocean surface processes are currently
largely satellite based, but developments in focal plane arrays and miniature radars offer opportunities for small, relatively inexpensive sensors that could be deployed on mobile platforms (e.g., small aircraft [Dugan and Piotrowski, 2003], tethered balloons, commercial aircraft [following the current practice of automated atmospheric sensors for weather forecasting; Moninger et al., 2003]) or at fixed locations (e.g., coastal video monitoring). In addition, tide gauge networks and sensors capturing river outflow and precipitation will continue to be needed for understanding physical processes in coastal and near-shore regions.
The past two decades have seen a dramatic increase in chemical sensors for oceanographic research, including sensors capable of operating in some of the most extreme environments on Earth. In 1990, there were almost no chemical sensors in routine use for autonomous, in situ applications. Instead, virtually all chemical measurements required scientists aboard a research vessel collecting samples for later laboratory analysis. Today, new sensors are rapidly developing as a result of technical advances in a number of fields outside oceanography. As size, power requirements, and costs drop, advanced chemical sensors are likely to expand greatly. Oxygen sensors have been deployed on hundreds of profiling floats (Gruber et al., 2009); sensors that measure carbon dioxide partial pressure operate on moorings around the world (Borges et al., 2009); and nitrate sensors have been deployed for multiple years (Johnson, 2010; Johnson et al., 2010). These sensors sample on the same scale as CTDs, providing unprecedented spatial and temporal resolution for chemical parameters. Figure 3.2 shows 8 years of dissolved oxygen measurements made from a profiling float near the Hawaii Ocean Time-series study site. These data were used to resolve a long-standing debate on whether the open ocean consumes or produces oxygen, demonstrating that the large oxygen maxima appearing within the euphotic zone each summer were a result of an oxygen-producing ecosystem (Riser and Johnson, 2008). Such long-term chemical measurements have only been accomplished in the past decade.
Prototype research sensors for trace elements, inorganic carbon species, and a variety of nutrient elements are currently being developed, while other chemical sensors are currently being used in extreme environments (e.g., hydrothermal vents, anoxic sediments). Most recently, in situ mass spectrometers mapped the subsurface oil plume resulting from the Deepwater Horizon oil spill (Camilli et al., 2010). Often, these prototypes can suffer from problems due to excessive mechanical complexity, biofouling, or insufficient temporal stability. However, the success of oxygen, carbon dioxide, and nitrate sensors demonstrates that chemical sensors are at a level similar to physical oceanographic sensors in the early 1990s; undoubtedly, there will be a significant increase in their use aboard autonomous platforms by 2030. Sensors that enable observations of the CO2 system (including pH) and speciation of key micronutrients, such as iron, will be central to a number of studies, especially as micronutrient analytical systems are miniaturized or made more portable.
Since the early 1990s, a rapid increase in in situ optical and acoustic sensors has allowed for estimation of bulk properties of phytoplankton and detritus, while in situ multifrequency acoustic and optical imaging systems now allow for the determination of phytoplankton and zooplankton stocks. The development of kinetic fluorometers in the mid-1990s provided a means to estimate rate processes. The oceanographic community is currently leveraging technology development from other fields, particularly the medical sciences, to take advantage of the growth in nanotechnology, high-throughput sequencing devices, high-resolution imaging, increased computing power, and networked arrays to substantially increase in situ sampling capabilities. Examples of such systems include in situ sensors that analyze genetic information in order to characterize water column organisms and use co-registered fluorescence measurements to quantify population abundance and physiology. In situ flow cytometers with imaging capabilities are being utilized for sorting, characterizing, and quantifying millions of organisms per day (Olson and Sosik, 2007; Sosik and Olson, 2007). Increasingly, acoustic monitoring systems that were traditionally used for geophysical and national security issues are now being used for biological sensing (e.g., tracking whales [Spaulding et al., 2009], estimating fish populations [Makris et al., 2006, 2009]). The latter example, which employs ocean acoustic waveguide remote sensing, enables areal surveys of pelagic fish populations several orders of magnitude greater than current survey methods.
Future trends in biological sensing will involve improved rate and flux measurements, which are crucial inputs for carbon mass balance, as well as onboard gene sequencing. Key to meeting needs in 2030 and beyond, particularly in coastal and near-shore environments, will be relatively small and inexpensive versions of biological sensors that can replicate today's complicated laboratory techniques for collecting genomic, proteomic, and metabolomic data.
Geophysical measurements are essential to understanding the mechanics of the oceanic crust. The past decade witnessed the first long-term, in situ deployments of seismic sensors in the crust, including broadband seismometers, short-period seismometers, and networked seismic arrays on cabled observatories. Currently, these types of sensors can detect diking and eruptive events along mid-ocean ridges, visualize hydrothermal upflow zones, and are even used for earthquake early warning systems. Future trends include further developments in underwater geodesy, where bottom pressure recorders and acoustic extensometers measure small-scale vertical and horizontal movements of the seafloor (e.g., inflation or deflation of submarine volcanoes, faulting or magma intrusion [Fox et al., 2001; Chadwick and Stapp, 2002], changes due to tsunami wave trains). As the ability to assimilate real-time data from cabled seafloor seismic and pressure sensors increases, use of these arrays is very likely to grow, and they will become routine components of earthquake and tsunami warning systems.
Downhole logging tools remain important technologies to measure crustal permeability, geochemistry, and fracture geometry and will follow trends set within scientific ocean drilling programs. The use of chirp sub-bottom profilers for deducing acoustic and physical properties of ocean sediment and subseafloor is likely to increase, as is the development and use of omnidirectional sonar systems able to sense in all directions with one acoustic ping. Multibeam sonars will continue to grow in capability, as will the performance of synthetic aperture sonars, providing increased ability to resolve seafloor features.
Many sensor capabilities have increased in longevity, stability, communications, and access to harsh environments. These improvements are mostly dependent on innovation from outside the ocean sciences, and the community will continue to benefit from other fields' advances in sensors and technology.
Despite encouraging improvements in sensor technology, a majority of studies in chemical and biological oceanography and marine geology will continue to require the collection of water, rock, and sediment samples, filtered particulates from seawater, and organisms for study. Aboard ship, sampling systems presently available (rosettes with continuous CTD, O2, fluorescence, and transmittance) are a substantial improvement over wire-clamped Nansen bottles with reversing thermometers, but there are significant needs for more capable oceanographic sampling systems. In addition, ship-based sampling will continue to be important for ground-truthing satellites, validating sensors before and after deployment, process studies, and long-term archiving.
Currently available shipboard hardware is grossly contaminating for many chemical elements, including radio-isotope systems that are not normally contamination prone and trace metals that can create artifacts in biological experiments. One of the highest priorities for chemical sampling is truly uncontaminated stationary and underway surface sampling systems for a broad range of research studies. Systems designed for uncontaminated sampling of trace gases and metals (such as CTD systems designed for CLIVAR and GEOTRACES) need to be transitioned to wider availability. Currently, there are only a few automated water samplers for use on moorings. Although they are not yet
routine or compact enough for use on autonomous vehicles, the next 20 years could see great advances in automated water sampling. Development of improved fluidic systems for chemical analyzers (e.g., pumps, valves, connectors) or alternative particulate sampling systems would be particularly valuable.
Many tools for biological sampling of the water column and seafloor systems (e.g., nets, Niskin bottles, sediment traps) have not evolved significantly in the past two decades, and despite technical advances it is very likely these samplers will continue to be used in the near future. For microbial communities, several sampling strategies have been emerging over the past decade, including profiling or towed systems equipped with pumps that pipe organisms through bio-optical instruments (Herman et al., 2004), video imaging (Davis et al., 2005), and flow cytometers (Sieracki et al., 1998; Olson and Sosik, 2007); in situ, extended-duration, time-series samplers that filter fluids for DNA and subsequent onshore analysis (Scholin et al., 2009); and efforts to develop sample collection and preservation approaches for autonomous vehicles. For zooplankton and higher trophic levels, sampling is still dependent on net tows (see review by Wiebe and Benfield, 2003) and often on acoustically quiet research vessels. Although multifrequency, multibeam, broadband, and ocean acoustic waveguide remote sensing acoustic sensors are rapidly evolving, these approaches still require physical samples for calibration (Lavery et al., 2007; Trenkel et al., 2008; Makris et al., 2009; Stanton et al., 2010). In addition, collecting delicate and soft-bodied organisms is not possible with nets, although this is routinely done with ROVs, an approach that may evolve to capture an even broader range of organisms. In the case of larger organisms (e.g., seals, sea lions) marine ecologists have successfully used smaller, less costly instrument packages to turn the animals themselves into sampling platforms for oceanographic properties (e.g., Biuw et al., 2007; Costa et al., 2008), a trend that is likely to continue to increase by 2030.
Over the past 20 years, scientific ocean drilling through ODP and IODP has played a vital role in sampling oceanic sediments and crust, and measuring physical properties within the crust and overlying sediments. As oceanic sediments are one of the best sources for high-resolution, long-duration, spatially distributed paleoclimate records, these data will continue to be needed to understand past and future climate change (IODP, 2011). In addition to ODP and IODP, shallower sampling of the ocean crust and sediment is currently done through coring systems available on a variety of research ships (e.g., the Woods Hole Oceanographic Institution long corer mounted on the R/V Knorr [Curry et al., 2008]). ROV drilling systems have also been used for extracting small hard rock cores (e.g., Stakes et al., 1997). The need for both shallow and deep coring and drilling will continue in the next 20 years in order to investigate paleoclimate, structure of the oceanic crust, and the subseafloor biosphere. In general, there has been a decrease in dredging operations to collect rock samples, but a concomitant increasing use of wax corers (which collect glassy rock fragments) on towed or autonomous systems and high-precision sampling through ROVs and HOVs. Geological sampling on the seafloor has also been facilitated by significant increases in bathymetric resolution that allow for more accurate sampling methodologies. Sediment traps, which collect samples for studies of concentration, particle size distribution, vertical flux, and horizontal transport will also continue to be needed.
Remote sensing includes sensors and platforms that provide ocean data from above the ocean surface, including satellites, piloted and autonomous aircraft, and land-based, ice-based, and offshore installations. Use and availability of remotely sensed data has increased significantly in the past 20 years, and these types of data are now utilized for a range of fundamental and applied problems (NRC, 2008a). Current remote sensing capabilities provide critical environmental parameters (e.g., sea surface temperature [SST], ocean color, altimetry, wind speed and direction, ocean surface currents, ocean waves, sea ice and ice shelves, glaciers, atmospheric properties) that can also be used for applied data products of societal relevance (e.g., vessel traffic, ice floes, spill trajectories). For 2030, these capabilities will need to be sustained and greatly expanded, and they will continue to require groundtruthing from manned and autonomous platforms.
Physical parameters available from space-based sensors provide information on ocean temperature, wind speed and direction, sea surface height and topography, and sea ice distribution and thickness. Biogeochemical parameters are derived from ocean color radiometers (e.g., pigment concentration, phytoplankton functional groups, size distribution, particle concentration, colored dissolved organic material). These observations require active scatterometry, microwave array spectrometers, microwave imagers, multibeam altimetric lidars, and altimeters, among others (NRC, 2007b). Future trends involve LIDAR to provide depth-resolved particle concentration, mixed-layer depth estimates, and ice sheet measurements; polarimeters to provide particle composition; and hyperspectral resolution from the ultraviolet to the near infrared, which allows for better separation of phytoplankton functional types and separation of dissolved
absorption from that of particles. In addition to sustaining critical global measurements, there is considerable potential for innovation with a planned salinity sensor and proposed measurement of ocean carbon and surface fluxes using multiple sensors synergistically. There are several specific needs for improved scientific understanding: improved coastal remote sensing algorithms for ocean color, interferometer scatterometers that provide higher resolution wind fields closer to the coast, sensors that combine infrared and microwave channels to provide all-weather SST fields with higher spatial and temperature resolution, and more precise surface salinity sensing.
Most present environmental satellites are polar orbiting, covering the whole globe over a period of days. Adding geostationary satellites, of which few are currently available, will make it possible to sense the same area of the ocean several times a day, providing the temporal resolution needed to resolve tidal effects as well as real-time data during episodic events like hurricanes or oil spills. High-latitude fluxes need continuous monitoring by polar orbiting and geostationary satellites for adequate sampling. Special satellite systems with multifrequency visible and infrared channels at several look angles are also needed. A future trend in short timescale temporal sampling, although rarely achieved today, may be satellite tasking for a "spotlight" sequence of images (e.g., Schofield et al., 2010b). Spatial resolution has increased steadily for many satellites (e.g., from 4 km down to 250 m for ocean color) and is expected to continue to improve in the future. Atmospheric correction, a present-day challenge, is likely to be better addressed in the next two decades. Similarly, signal-to-noise characteristics have been improving steadily, and residual noise could be further mitigated by temporal image processing.
An analysis of the trends in space-based Earth science over the past decade (NRC, 2007b) indicates that global observations from space are at considerable risk, with both operating missions and the number of operating sensors in decline. In other cases, the replacement sensors on operational platforms are less capable than the original research platforms. Remote sensing capabilities and data continuity are declining; vector wind, all-weather SST, altimetry, and ocean color measurements are at risk. Plans for new satellite capabilities and for continuity of certain sensor capabilities have not been realized in recent years, with the likelihood of gaps in coverage for key data in the future. This is particularly serious for ocean color data, as all existing U.S. ocean color satellites have exceeded their projected life span and could fail at any time, leaving a high probability of research-quality data gaps (Siegel and Yoder, 2007; Turpie, 2010).
The availability of UAVs, which range widely in size and capability, has grown in the past decade. Airborne piloted and autonomous platforms (e.g., planes, balloons, UAVs) have been used for several years to map shallow topography, identify fish abundance, image the coastal ecosystem, and track pollutants. Sensors are similar to those on satellites but, given their lower operating altitude, have significantly higher spatial resolution and may be capable of flying below and around cloud cover. Today, these assets are available at government labs and private companies with little use by academia, but it is expected that UAVs will follow the growth trajectory of AUVs and become far more utilized for oceanographic research by 2030. Smaller UAVs are already being launched and recovered by oceanographic ships. Their sensor payloads can be refreshed and adapted more readily than spaceborne sensors, can fill in satellite coverage gaps, and can also be used as communications relays. Aircraft of all types, but particularly UAVs, allow unprecedented response to episodic events, whether natural or manmade, and are already an important part of the portfolio of platforms needed to understand oceanographic processes. Additionally, certain radar remote sensing payloads (e.g., synthetic aperture radar) are currently being miniaturized for use aboard UAVs. Success in adapting these types of sensors to UAVs will almost certainly also lead to other airborne platform uses by 2030. However, there are significant regulatory restrictions surrounding their use.
The number of high-frequency (HF) radar sites used to measure surface currents has grown rapidly in recent years. In the past 10 years, they have been deployed over most of the U.S. coast. HF radar arrays are also extending offshore via buoys and fixed offshore platforms. There is strong momentum to build a national backbone, as surface current data are highly valuable for both fundamental research (e.g., coastal circulation models) and applied needs (e.g., search and rescue, safe offshore platform operations). More routine use of HF radars on ships and multifrequency HF radars to estimate near-surface vertical current shear is likely to enable new types of shallow water observations by 2030. Furthermore, increased industrial ocean activities could provide new platforms for placing sensors and for greater, more persistent coverage of the ocean surface.
Likewise, the network of cameras for observations of near-shore wave dynamics and beach topography has also grown in coverage and utility. Ground-based radars have also been used to detect ice extent. Owing to relatively low costs of implementation and operation, as well as their sustained coverage, the next two decades are likely to see significant growth in both numbers and capabilities of visible and infrared imaging systems from fixed sites as well as ships, satellites, and both piloted and autonomous aircraft. This will enable air-sea interaction
studies at smaller scales and more locations than previously accomplished.
MODELING AND COMPUTATIONAL INFRASTRUCTURE
The past two decades have seen great growth in numerical models of ocean circulation as part of the larger set of Earth system models. Examples include ocean general circulation models, nested regional models, coupled physical-biological models, and coupled ocean-atmosphere climate models. These models have been used in sea level rise prediction, carbon and heat storage calculations, and defense and homeland security applications. There has been rapid growth in the development and use of models that assimilate ocean observations to construct dynamically consistent predictions and hindcasts of ocean state. Modern ocean models take into consideration many types of processes, including ocean-sea ice dynamics, mixed-layer dynamics and open ocean turbulence, marine biogeochemistry, and ecosystem processes. The field of ocean modeling has advanced rapidly in the past two decades, but more work is needed to increase fidelity for improved forecasting. Development of these models has been aided by the exponential growth of computer processing speed and memory capacity, reduced electrical power requirements, and steadily decreasing costs.
In most instances, the ability to model physical processes far exceeds the ability of the models to resolve important chemical and biological processes. Multidisciplinary models will be needed to address many of the major science research questions for 2030 and are almost certain to enable answers to societally relevant questions of Earth system dynamics. Models have become increasingly more interdisciplinary (e.g., combining ecosystem, cryosphere, and surface wave processes), although much remains to be done to quantify different processes. Models are also being run at higher resolution to simulate dynamical features of importance (e.g., mesoscale eddies, flow constrictions, coastal upwelling) and temporal and spatial scales important to biological processes. Skillful parameterizations will continue to be needed for unresolved dynamics of a range of processes. One such example is upper-ocean mixing, which is driven by surface fluxes and so is coupled to the atmosphere and such phenomena as aerosols. Parameterizations will also be increasingly needed to incorporate rate laws for biogeochemical and other processes. Given the demand across many disciplines, computational capacity will continue to be stressed in 2030. For the oceanographic community, this suggests a future need for broadly accessible centers with exascale or petascale capability, where teams of experts can be colocated with cutting-edge computational and modeling resources, healthy competition of ideas and methods can be fostered, and data products with basic and applied uses can be produced.
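The role of a parameterization for unresolved dynamics can be illustrated with a Richardson-number-dependent mixing scheme of the kind long used for upper-ocean turbulence; the functional form is a common one, but the coefficients below are illustrative placeholders, and real models apply such a closure level by level within the water column.

```python
# Illustrative sketch of a Richardson-number-dependent parameterization
# for unresolved vertical mixing: eddy viscosity is large where shear
# dominates stratification (low Ri) and collapses toward a small
# background value in stable conditions. Coefficients are placeholders.

def eddy_viscosity(ri: float,
                   nu0: float = 1e-2,     # m^2/s, strong-mixing limit
                   nu_b: float = 1e-4,    # m^2/s, background value
                   alpha: float = 5.0,
                   n: int = 2) -> float:
    """Eddy viscosity [m^2/s] as a function of the gradient
    Richardson number Ri = N^2 / (dU/dz)^2."""
    ri = max(ri, 0.0)                     # clip convectively unstable cases
    return nu0 / (1.0 + alpha * ri) ** n + nu_b

# Mixing is vigorous in a sheared, weakly stratified layer (Ri ~ 0)
# and nearly shut off in a strongly stratified one (Ri >> 1).
assert eddy_viscosity(0.0) > 50 * eddy_viscosity(10.0)
```

The grid-scale model never sees the turbulence itself, only the effective viscosity this closure supplies, which is why the skill of such parameterizations matters so much for overall model fidelity.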
These modeling centers will need to assimilate disparate, growing data streams to sustain skillful simulations and forecasts. In the next 20 years, a subset of these modeling capabilities will include integrating the deep ocean with shelf seas for ecosystem-based management; using coupled ice, ocean, and atmospheric models to predict ice movement and thickness; using coupled ocean, surface wave, and atmospheric models for simulations of severe storms and coastal inundation; modeling tsunami arrival times and inundation zones; estimating marine resources for projected growth of industrial activities in the ocean; and modeling potential outcomes of geoengineering experiments.
The total volume of data produced by numerical models cannot be completely stored. Practical considerations influence what final model products can be saved, and what intermediate steps are discarded. While approaches are currently being developed to manage model complexity and data produced, the need to make decisions on what to archive will persist. This is driving the push for dedicated petaflop and higher computing power and data storage systems for ocean modeling, which is only likely to be met in a limited number of real or virtual locations and might leverage evolving computing capacity being developed by commercial entities. The issue of creating broadly accessible modeling centers that dedicate significant resources to oceanographic needs requires further study in the near future, so that they can be in place by 2030.
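The archiving problem can be made concrete with back-of-envelope numbers; the grid size, variable count, and output frequency below are illustrative assumptions, not figures from any particular model.

```python
# Back-of-envelope storage estimate for a global eddy-resolving ocean
# model. All numbers are illustrative assumptions.

nx, ny, nz = 3600, 1800, 50      # ~0.1 degree grid, 50 vertical levels
n_vars = 10                      # T, S, u, v, w, biogeochemistry, ...
bytes_per_value = 4              # single precision
snapshots_per_year = 365 * 24    # hourly output

bytes_per_snapshot = nx * ny * nz * n_vars * bytes_per_value
bytes_per_year = bytes_per_snapshot * snapshots_per_year

print(f"{bytes_per_snapshot / 1e9:.0f} GB per snapshot, "
      f"{bytes_per_year / 1e12:.0f} TB per simulated year")
```

Even this modest configuration generates on the order of a hundred terabytes per simulated year, which is why decisions about saved fields, averaging intervals, and discarded intermediate steps are unavoidable.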
This important crosscutting infrastructure category is subject to rapid change, driven almost entirely from outside the field of ocean sciences. Trends in this area include growing collaborations between computer and ocean scientists, leading to the emergence of a new class of scientific activity structured around networked access to observational information (Hey et al., 2009). Driven in large part by commercial activity, the network and computational infrastructure that currently supports ocean scientists is undergoing significant evolution. Further change seems likely as the computational and network paradigm dominating industry shifts to cloud computing, in which pervasive connectivity allows access to location-independent computational and storage resources and to long-distance collaboration via the Internet and cellular networks. The current investment in cloud computing resources, led by commercial entities such as Google, Microsoft, and Amazon, is creating a large infrastructure that may in turn transform the sciences, including the data-rich ocean sciences of 2030.
The evolution of data management in the ocean sciences needs to include a framework for a common lexicon across disciplines and applications, creation of distributed virtual centers for data deposit, broad accessibility for users from scientists to policy makers, and user-friendly archiving and synthesizing tools. Virtual data centers could be formed for a variety of disciplinary data: river outflow and tide gauges, terrestrial dust transport, seafloor mapping and seismicity,
ocean hydrography, biogeochemistry, and ecosystem structure and status, genomics, and many others. A major need for success in the realm of data management is to establish seamless integration of federal, state, and locally held databases, so that relevant data can be easily retrieved by a range of users. Programs such as the Global Earth Observation System of Systems15 will assist in bringing relevant observations together from different networks in order to provide maximized societal use. A continuing concern will be who owns, funds, and maintains these databases; however, excellent precedents are being set by programs such as the Marine Geoscience Data System,16 the Biological and Chemical Oceanography Data Management Office,17 the Palmer Station Long-Term Ecological Research Data Management Office,18 and Rolling Deck to Repository.19
A long-standing strength of the U.S. ocean sciences is the diversity of funding sources and the variety of sectors represented in the ocean research community, which ensures flexibility in how scientific research is performed and evaluated. Most basic research in ocean science is done in academia and government. The academic research community relies on funding from the National Science Foundation (NSF), NOAA, and ONR. With the exception of NSF, these agencies have applied ocean research missions, with significant intramural research and operational activities. Other mission agencies, such as NASA, the Department of Energy, and the Defense Advanced Research Projects Agency, also provide focused support related to ocean science and ocean engineering.
Increasingly, there are opportunities to focus and leverage resources among the federal agencies, which could maximize returns on ocean research investments both internally and externally, minimize costs for individual agencies, and draw in new federal and private-sector partners. Programs like the National Oceanographic Partnership Program have been critical to these efforts by providing a mechanism for multiple agencies to collaborate on a specific focus, with leveraged partnerships among academic, federal, nonprofit, and commercial entities. Organizational frameworks that promote collaboration between agencies can help to ensure effective leveraging of resources in the coming decades.
State and local government support is also central to the ocean science and engineering communities. Major contributors include state universities and community colleges that employ a large segment of academic oceanographers, often with strong connections to societal and economic issues of regional or local importance. However, the trend for state investment is mixed at best, and recent budget deficits have forced some consolidation within the university system. Local county and township governments tend to have smaller amounts of research funds, despite an increasing appreciation of the major economic impact provided by the ocean to local communities. Nonetheless, there are many opportunities to share and to leverage local and state infrastructure (e.g., research vessels, shore-based laboratories, regional ocean observing systems) for common goals at the national and even international level. The National Ocean Council, via its Governance Coordinating Committee, appears to have a mechanism to foster this collaboration on a local to regional scale (CEQ, 2010); this could be exploited more fully.
Finally, the past decade has seen an increase in basic and applied research investments funded by nonprofit foundations, as well as increased partnerships between different ocean science sectors (e.g., academic, industry). In recent years, foundations have had high impact by providing resources and momentum to key research areas within their scope of interest. There are also growing partnerships with diverse industry interests (e.g., oil and gas, aquaculture, ocean energy). In view of the increasing demand by society for services and products related to the ocean, encouraging cooperation and joint infrastructure investment between the industrial sector, academia, and government is likely to foster greater success for all.
Currently, there are a limited number of community-wide facilities and organizations in the ocean sciences; their development is usually driven by cost and expertise issues. However, the logistical challenges inherent in conducting ocean research have led to increasing use of such facilities. These efforts are usually a means to address the technical needs and costs required for (1) platforms, sensors, and analytical equipment; (2) compiling, managing, and maintaining large complex data sets; and (3) computing and modeling. Facilities that are supported and accessed by a broad base of ocean science users can focus on specialized areas of ocean infrastructure, while providing cost effectiveness and standardized, reliable services.
One of the most successful examples is the growth of data and modeling centers (e.g., NOAA's National Oceanographic Data Center and National Geophysical Data Center, the National Center for Atmospheric Research). Numerous data centers have been created over the past 20 years and, given the diversity of new observation systems, the range of data available to the broader community (including education and the interested public) through distributed data centers is very likely to grow. Barriers to be overcome include data
accessibility and impediments to collaboration, which are critical to continued success. For community-wide facilities that provide laboratory analyses, independent verification and calibration is needed to provide sustained confidence in the data being produced.
Successful community-wide organizations need broad support at several levels of government. UNOLS has been an exemplar of this type, with strong engagement among academic, state, and federal partners. UNOLS provides academic and government oceanographers with access to the research fleet by coordinating ship schedules and operations, managing standards and safety, and ensuring standard instrumentation aboard each vessel. It also schedules deep submergence assets (HOVs, ROVs, AUVs) and the use of research aircraft. By 2030, it is expected that consortia similar to UNOLS could facilitate broad community access to other infrastructure assets, including other mobile or fixed platforms (e.g., AUVs, gliders, drifters, moorings, seafloor cables and nodes, UAVs) or expensive analytical equipment. The creation of new community-wide facilities for ocean research infrastructure will be dictated in large part by technology innovations that either simplify operations and maintenance requirements or lower purchase and operating costs, as well as by broad involvement and acceptance. However, such facilities could also be driven by federal agencies as a means to maximize infrastructure effectiveness while minimizing costs.
Technology Development, Validation, and Transfer Groups
To address the various societal needs of 2030, innovations need to be created, matured, and transitioned into operations. A number of federal agencies and private foundations support the design and construction of new in situ and remote sensors and platforms. Some novel work in sensor development has been supported through the federal government's Small Business Innovation Research Program.20 In addition, several laboratories, research groups, and private companies are actively developing the next generation of ocean infrastructure (e.g., MBARI, SRI International Marine Technology Program). However, to ensure that basic science understanding, forecasting, and management decisions are based on accurate, precise, comparable data, there is a fundamental need to verify and validate the performance of new and existing instrumentation. Enabling organizations that facilitate the development and adoption of effective and reliable sensors and platforms for ocean science will continue to be needed in the future. Through independent laboratory and field testing of prototype and off-the-shelf instrumentation, these types of organizations (e.g., the Alliance for Coastal Technologies21) can give technology users an understanding of sensor performance and data quality, while providing technology developers and manufacturers with opportunities for beta testing, system validation, and insight into various user needs, applications, and requirements. Such efforts help to accelerate critical instrument development and operationalization, while minimizing the risks of error and failure often associated with young technologies.
Shipboard Technical Support Groups
The responsibilities of shipboard technical support groups span a number of key areas, including safety, over-the-side handling of equipment, communications and shipboard computer networks, operation of hull-mounted and underway sensors, quality control of collected data, and troubleshooting and repair of failed equipment. These responsibilities evolved significantly from 1990 to 2010 in response to the increasing availability and complexity of sampling gear, as well as the increasing breadth of federal regulations. Today, a marine technician's duties may combine aspects of the roles of a bosun, chemical safety officer, satellite communications specialist, network administrator, and electronics technician. These groups are an integral component of the U.S. oceanographic fleet. As shipboard assets grow more complex, there will be an increased need for highly skilled technical workers aboard academic research vessels.
EDUCATION AND WORKFORCE TRAINING
As mentioned earlier in the chapter, trends in ocean research infrastructure point toward greater complexity and enormous volumes of data flow. Interdisciplinary work both influences and is driven by infrastructure and data; by 2030, it is likely that interdisciplinary education will be even more developed than it is today. These trends also suggest a greater need for a technically skilled workforce, both for academic research and support and for implementing monitoring and observations. Undergraduate programs in environmental and Earth systems science need to evolve to fill this need, especially if their graduates are encouraged to move into technical fields. Oceanography also needs to attract more computer science and engineering graduates to sustain innovation. While one role of academic institutions is to train future oceanographers, other organizations could be established to focus on the specific technical skills needed for future ocean research workforces, including early career experiences such as internships. Presently, there is some effort toward education and training at the community college level for field support staff (e.g., the Marine Advanced Technology Education Center22). Similar enabling organizations could be created to address other critical education and training
needs, including analytical methods, data management and archiving, and equipment maintenance and repair. None of these are currently covered in traditional university degree programs; certificate programs could bridge this gap and provide useful standards for both the technical and research workforce in academic and private sectors.