BOX 2-1 Important Points Made by the Speakers
• Develop a set of questions that, if answered, could drive the design of a Gulf monitoring effort.
• A virtual infrastructure for data management, information, and published documents could help guide response and restoration activities throughout the Gulf of Mexico.
• Job training and the long-term involvement of the residents of affected areas can influence the success of monitoring and restoration activities.
Several presentations during the workshop explored broader issues relevant to the overarching challenge of environmental monitoring for a large, complex ecosystem like the Gulf of Mexico. One presentation highlighted a plan that addresses the broad range of needs and expectations of stakeholders for environmental monitoring programs in the Gulf Region. Two presentations identified and discussed critical infrastructure components of monitoring networks, including workforce development and data integration and management. Another presentation recapped the lessons and insights from a program that has directed 30 years of monitoring in the Northern Gulf. Despite its multiple decades of contributions to scientific understanding, the program’s future is in doubt due to a lack of funding, which highlights the most persistent challenge for all long-term observational programs—sustained support. One of the keynote presentations detailed the approach and strategy used to develop and sustain a 10-year continental-scale observing and monitoring network known as NEON, providing details of what might be needed in designing a regional monitoring network for the coastal and offshore regions of the Gulf.
The millions of people living in Gulf Coast communities will continue to face substantial threats in the future, said Russ Lea, chief executive officer for the National Ecological Observatory Network (NEON). The rise in sea surface temperatures will create the potential for more intense hurricanes; sea level rise will create an elevated risk of erosion, storm surge, and saltwater intrusion; land areas are subsiding; and coastal inland and water temperatures are expected to rise.
The science studying these changes involves identifying multiple environmental stressors and complex interactions, said Lea. NEON was designed to enable understanding of these stressors and interactions. NEON was based on the concept that ecologists needed a distributed but integrated complex of facilities that would enable them to answer “grand challenge” questions in their field (NRC, 2001). NEON provides the infrastructure to support research, education, and environmental management in the areas of biodiversity, biogeochemical cycles, climate change, ecohydrology, infectious diseases, invasive species, and land use on a continental scale (NRC, 2003). It is divided into 20 ecoregions across the United States, including Alaska, Hawaii, and Puerto Rico. Across these regions, facility sites fall into three categories: sites located in unmanaged wildland conditions, relocatable sites that are representative of human land management effects on ecosystems, and freshwater aquatic sites to measure changes in aquatic systems over time. However, NEON does not have facilities on the coast because a proposed parallel program targeting coastal sites exclusively was never funded.
NEON is using an integrated sampling strategy across regions and biological domains. For example, its biological sampling includes plant biodiversity, plant biomass and chemical composition, leaf area, plant phenology, birds, ground beetles, mosquitoes, small mammals, infectious diseases, biogeochemistry, and soil microbes. Aquatic sampling, atmospheric measurements, soil measurements, and airborne observations are equally wide ranging. NEON is “trying to take societal benefit areas and work backwards through policy, through environmental indicators, through tools for collaboration, to how do you produce the data in the first place,” said Lea.
As a specific example, Lea discussed policy for coastal adaptation (NRC, 2014b). Dunes and berms, tidal flats, wetlands, and forests near the coasts all provide essential ecosystem services, and all of these ecosystems are affected by public policies at the sectoral, regional, or national levels that address societal needs. These policies, which may be in the areas of risk management, vulnerability assessment, resource management, climate adaptation, education, or other areas, form a direct link between ecosystem services and policies designed to handle risk and produce societal benefits.
Several key tools can inform these policies, Lea observed. For example, a computing infrastructure can ingest data, run models, and visualize and share information. Models need credible, high-quality, long-term data, and these data can come from many sources, including the biosphere, the atmosphere, the lithosphere, the hydrosphere, and social spheres. Measurement traceability, observatory interoperability, and socio-environmental integration to enable linkage to societal benefits are all important features of these data, Lea said. The data also need to be freely accessible in a common database (PCAST, 2011; NSTC, 2014). The advantage of long-term programs like NEON, said Lea, is that models and data can be developed over time, with both hindcasting and forecasting, to build greater confidence in those models.
Lea also discussed the roles of marine laboratories and biological field stations, citing a recent National Research Council report on the subject (NRC, 2014c). This infrastructure is well distributed across the United States and has generated valuable long-term historical data records, but these facilities are not necessarily integrated with other research resources, and they often have generated heterogeneous forms of data over time. Nevertheless, they are “a rich source of information,” especially if that information can be made readily available.
Given the needs that exist, what are the pathways to develop the appropriate indicators, models, cyberinfrastructure, tools, and data, Lea asked. He urged that a set of questions be developed to drive requirements. These requirements then would lead to a design, with subsystem designs leading to implementation. This decomposition process would be followed by integration and testing from the subsystem level to system verification and commissioning—“you learn this in engineering 101,” said Lea. NEON was developed the same way, starting with grand challenge questions and developing the systems and subsystems needed to answer those questions.
In response to a question about how NEON can adapt its research program to new problems, Lea noted that technologies will be refreshed every five to eight years to take advantage of new capabilities. Also, new questions will drive the science programs within federal agencies and scientific publications, which in turn will drive changes in NEON.
In addition, Lea noted that NEON will be working with large data companies to store and make its data and metadata accessible, and these companies will be interested in using those data to leverage an expansion of their user communities. For example, a climate corporation might use proprietary algorithms with NEON data to drive a better understanding of climate as it affects farming and the use of pesticides and fertilizers.
The Gulf of Mexico Coastal Ocean Observing System (GCOOS) is part of a larger Global Ocean Observing System that is more than a decade old, noted Landry Bernard, GCOOS’s associate executive director. Collectively, these systems are collaborations of public, private, and academic sector partners that use observations, modeling, and analysis of marine and ocean variables to support operational ocean services. As the Gulf Coast’s ocean observing system, GCOOS has five themes:
• Public health and safety
• Healthy ecosystems and water quality
• Mitigation of effects of coastal hazards
• Safe and efficient marine operations
• Long-term ocean variability and changes
The system acquires data from a wide variety of sources and makes those data available through a portal and in the form of various data products to meet both public and private needs.
Five years ago, NOAA decided that it wanted GCOOS and other regions under the United States’ Integrated Ocean Observing System to develop a monitoring plan. Based partly on the results of numerous stakeholder workshops over a 10-year period, GCOOS identified common stakeholder priorities and developed a build-out plan with the following tools and applications:
• Surface currents and waves network
• Fixed mooring network
• Autonomous meteorological measurement network
• Glider and autonomous underwater vehicle network
• Satellite observations and products
• Aircraft observations
• Bathymetry and topography mapping network
• Water level network
• Enhanced Physical Oceanographic Real-Time System
• Outreach and education
• Harmful Algal Bloom Integrated Observing System
• Ecosystem monitoring
• Water quality and beach quality monitoring
• Hypoxia monitoring
• Monitoring of river discharge
• Physical modeling
• Ecosystem modeling
• Data management and communications system
• Research input into new technology development
These new tools and applications build on the existing systems and capabilities in the Gulf, including those funded by the federal government and the states, Bernard said. Also, as with the existing GCOOS program, data management remains a prominent issue with these 19 elements. For example, parameters, quality control, and the generation and use of metadata all have been factored into the development of the build-out plan.
The build-out plan contributes to several common themes in the 2012 Resources and Ecosystems Sustainability, Tourism Opportunities, and Revived Economy of the Gulf Coast Act (known as the RESTORE Act), Bernard noted, including:
• Restoration and protection of fish, wildlife, and natural resources
• Restoration and protection of marine and coastal resources, including barrier islands, beaches, and wetlands
• Restoration and protection of ecosystems
• Observing and monitoring
• Restoration and protection of the economy, sustainable development, and sustainable technology
Bernard focused specifically on the monitoring components of the build-out plan. Restoration projects have a wide variety of environmental monitoring needs. As examples, he listed satellite imagery, radar for coastal currents, sidescan sonar imagery, multibeam bathymetry, aircraft and drone camera imagery, autonomous underwater vehicle camera and video imagery, and sediment profiling. Funding these technologies requires that the total expense be divided among the Gulf States and other partners, he said. It also will require deciding among technologies—for example, buoys versus gliders.
As particular opportunities for long-term monitoring, Bernard listed several technologies:
• Use of the GCOOS-Regional Association’s Gulf Glider Task Team and implementation of a Gulf Glider Network
• Implementation of a high-frequency radar network for Gulf coastal currents
• Augmentation of existing moorings and addition of new moorings with real-time sensors
• Implementation of a passive acoustics monitoring network
• Implementation of a marine biodiversity observing network
• Expanded data management and communications
He also listed a variety of recommendations for modeling, which he noted goes hand-in-hand with monitoring:
• Adapt the Gulf of Mexico Research Initiative’s Deep-C Consortium’s graphical map interface to allow interactive comparisons of multiple Gulf ocean circulation models simultaneously
• Support activities to further ecosystem modeling
• Prepare data sets needed by modelers
• Support the dissemination of model outputs
• Support the production of integrated satellite and other data products
Achieving these advances will require workforce development. Information analysts and modelers are needed to synthesize disparate data sets and develop useful information products for decision makers, Bernard said. In addition, “we need data providers who can use ocean data to create and provide data products. We also need marine technology innovators who can develop devices that operate in real time, resist biofouling, and are more sensitive, among other desired attributes.”
“Environmental monitoring points to jobs, and for us that’s key,” said Patrick Barnes, president of the environmental engineering and scientific consulting firm BFA Environmental. The environmental monitoring needed along the coast can generate jobs for the communities at risk from coastal change, and BFA tries to incorporate environmental job training into everything that it does. “We don’t go into an area and do monitoring just for the sake of monitoring; we want to do it in a way that includes the communities. We don’t think that an area is really restored unless the community is fully engaged. And true engagement is not just about education; it’s also about training and long-term involvement of the residents of an area that’s impacted.”
Barnes described a recent trip to Costa Rica where everyone he met, no matter their occupation, was engaged with the environment. The environmental challenges in the Gulf provide an opportunity to build a similar kind of engagement, he said. “We can integrate into our system, even at the elementary level but certainly at the middle and high school levels, a curriculum that builds that sort of ownership for the students so that they are aware of the value of the coast on every level.” Then, when a storm occurs, “they have an interest in being there to help fill the needs on those projects.”
In 2006, Barnes’ firm started a job training program called Limitless Vistas that has trained more than 400 at-risk youth in Louisiana as environmental technicians. Program graduates, many of whom were unemployed or underemployed at the beginning of their training, developed the skills they needed to do monitoring work. Given the success of the program, Barnes called for a larger-scale effort and a more direct linkage between industry and job training.
Job training and environmental monitoring both provide ways of engaging with communities across political jurisdictions and scaling up small projects to larger ones, Barnes concluded. They also provide ways of learning what communities expect and can contribute when a disaster strikes and restoration activities are needed.
The day after the Deepwater Horizon drilling platform exploded, Russ Beard, director of the National Coastal Data Development Center, received a phone call asking him to assemble a joint analysis group to work on the assimilation, integration, and visualization of all the data being gathered by NOAA observing systems and other sources on the resulting oil spill. The group’s tasks included overseeing the calibration, validation, quality control, and generation of metadata for the incoming data streams. The data then would be used by agency managers to protect living marine resources and human health. In addition, the data would be archived in data centers for use by the research community and other interested groups.
This experience revealed several flaws in the existing system, Beard said. First, there was a failure to establish a consistent and coherent policy throughout the data management process. Second, there was no clear chain of command to drive the collection requirements. Third, the science advisory panels were not always able to provide guidance on vessel sampling locations or protocols. Fourth, data-sharing protocols and public access agreements either were not in place or were poorly understood by industry, academia, and government. Finally, the absence of an existing virtual infrastructure added latency to meeting data requirements for response and restoration.
The challenge facing data managers today is how to deal with petabytes of information derived from the Gulf of Mexico ecosystem, Beard said. Data sources range across oil and gas production, habitats, human health, community resilience, living marine resources, stock assessments, socioeconomic impacts, recreational fishing, and environmental indicators. To deal with this huge amount of information, Beard recommended that a virtual infrastructure for data management, information, and published documents be established for all Gulf of Mexico response and restoration activities. Key components of this infrastructure would include funding entities; data providers; data discovery, access, and visualization services; supportive metadata; online metadata catalogs; end users; and institutional repositories for data and documents.
Beard particularly emphasized the generation and use of metadata to do filtered harvests and federated searches. Data integrators could access data from providers, who in turn could draw data from existing national institutional repositories. The results of this data integration could be provided to users through a portal or dashboard.
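The filtered-harvest and federated-search pattern Beard described can be illustrated with a minimal sketch. The catalog names, record fields, and coordinates below are hypothetical placeholders invented for illustration, not actual GCOOS or NOAA interfaces; a real system would query standards-based metadata catalogs maintained by institutional repositories rather than in-memory lists.

```python
# Minimal sketch of a federated metadata search across several catalogs.
# All catalogs, field names, and records here are hypothetical.

def record_matches(record, keyword, bbox):
    """Return True if a metadata record mentions the keyword and its
    location falls inside the bounding box (lon_min, lat_min, lon_max, lat_max)."""
    lon, lat = record["location"]
    lon_min, lat_min, lon_max, lat_max = bbox
    in_box = lon_min <= lon <= lon_max and lat_min <= lat <= lat_max
    in_text = keyword.lower() in record["title"].lower()
    return in_box and in_text

def federated_search(catalogs, keyword, bbox):
    """Query every catalog and merge the hits, tagging each result with its
    source repository so data originators continue to receive credit."""
    hits = []
    for source, records in catalogs.items():
        for rec in records:
            if record_matches(rec, keyword, bbox):
                hits.append({"source": source, **rec})
    return hits

# Two toy catalogs standing in for institutional repositories.
catalogs = {
    "repo_a": [
        {"title": "Hypoxia survey CTD casts", "location": (-90.5, 28.9)},
        {"title": "Beach water quality", "location": (-82.0, 27.5)},
    ],
    "repo_b": [
        {"title": "Shelf hypoxia glider transects", "location": (-91.2, 28.6)},
    ],
}

# A filtered harvest: hypoxia records within a Louisiana shelf bounding box.
results = federated_search(catalogs, "hypoxia", (-94.0, 27.0, -89.0, 30.0))
```

In this sketch the metadata do the work Beard emphasizes: the keyword and spatial filters run against catalog records alone, and only matching data sets would then be fetched from their authoritative sources for display in a portal or dashboard.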
Such a system would have both short-term and long-term benefits. Data providers either already possess or can easily establish Web Accessible Folders or Catalog Services for the Web that make their data accessible. Most providers already develop and provide acceptable levels of metadata. Establishing a public Gulf of Mexico Geoportal Server, which Beard labeled as one option, would be inexpensive; data could continue to be accessed from authoritative sources via existing infrastructure and databases; and data would be free to users. Finally, the originators of the data would continue to receive citations even as users are able to discover data they do not know about or would have difficulty finding with current tools.
Nancy Rabalais, executive director and professor of the Louisiana Universities Marine Consortium, described her three decades of experience studying the dynamics of the large hypoxic region in the Gulf of Mexico. An area of low oxygen has formed every summer since Rabalais began monitoring the area in 1985. Gradually the monitoring effort has expanded, except for a few years in which data were not collected because of a lack of funding. Her monitoring and research effort also has taken advantage of ancillary data collected by the U.S. Army Corps of Engineers and the U.S. Geological Survey.
Rabalais commented on the importance of “knowing what your question is.” With the Gulf hypoxic zone, the initial questions were how big it was and where it was. But over time the questions being asked have multiplied, and the collection of data has allowed Rabalais and her colleagues to test hypotheses with their observations. Models have been developed and are now used to explore the properties of the hypoxic zone and better understand the dynamics of the system, with directed experimental work done to refine causative models. Models even incorporate coupled systems that examine the economic costs and benefits of different agricultural practices. “We have come a long ways in 30 years,” Rabalais said.
She also pointed out that technologies have changed substantially over 30 years and will continue to change. Some data are collected from ships, some from underwater cables, and some from instruments mounted on offshore platforms. Changing instrumentation poses challenges to data continuity, “but you have to have some kind of focus to start with or you are never going to get anywhere,” she said.
In 2009 a group of stakeholders put together a hypoxia monitoring plan that spells out the monitoring to be done and the research to be pursued. The plan reflects both scientific needs and funding realities. It also was coordinated with other plans to foster cooperation among groups and a combined effort. However, continued funding difficulties could threaten a valuable source of information. “What’s at risk is a 30-year database,” she said.
At the end of each day during the workshop, participants broke into four breakout groups to discuss environmental monitoring opportunities with a focus on the two major areas of interest—ecosystem restoration and the deep Gulf. Some of the main points raised by the breakout groups were broader in scope and relevant to the larger challenge of environmental monitoring in the Gulf; they are summarized below. These observations should not be seen as the conclusions of the workshop participants as a whole, nor are they activities that the workshop participants thought should be pursued by the Gulf Research Program. Rather, they are opportunities related to environmental monitoring and related activities with which the Program could become involved as it develops in future years.
Socioeconomic Considerations: A Synopsis of the Discussions
One of the more significant challenges faced by those interested in and responsible for managing and protecting coastal and marine resources is to better refine our understanding of the linkages between ecosystem dynamics and the social benefits associated with them. While both the social science and marine science communities have developed indicator systems to better assess change, the degree to which such efforts have been linked has been surprisingly limited (Bowen and Riley, 2003), hence the interest in advancing the integration of socioeconomic considerations into environmental monitoring efforts and evaluation.
• Monitor socioeconomic conditions, including community metrics.
• Include socioeconomic benefits in models, with links to physical and biological parameters.
• Create a Gulf-wide “network of networks” through coordination of existing programs and data.
• Build a regional long-term reference network to allow tracking of environmental changes.
• Foster a distributed data system, including use of crowd-sourcing, big data, and citizen science.
• Set up an open standard for a common information-sharing framework.
• Make use of Natural Resource Damage Assessment data and other legacy studies.