tions, often on the basis of integrated data. A major development over the last several decades has been the emergence of image classification methods for characterizing and interpreting imagery data, for example, from satellites and aerial platforms. More recently, a new generation of statistical analysis methods tailored to spatial data has emerged, providing new insights into spatio-temporal phenomena and processes as well as new methods for transformation, visualization, and prediction.

Modeling Services — Modeling services usually apply algorithms based on a combination of theory and statistics to generate estimates of past or current environmental conditions and changes and to project future events, trends, or risks. Models that have significant spatial dimensions or elements are increasingly prevalent in virtually all environmental and social science disciplines and many engineering and public health fields. The USGS has pioneered the development of diverse spatially enabled models, such as those related to earthquake prediction, hydrological resource management, and land-cover change.
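As a purely illustrative sketch of the kind of spatially enabled estimate such models produce, the snippet below uses inverse distance weighting to estimate an environmental variable at an unsampled location from nearby observations. The station coordinates and values are hypothetical and are not drawn from any USGS model; real modeling services are far more elaborate.

```python
# Hypothetical, minimal illustration of a spatially enabled estimate: inverse
# distance weighting (IDW) of nearby observations. Coordinates and values are
# made up for illustration only.
import math

def idw_estimate(target, stations, power=2.0):
    """Weight each observation by 1 / distance**power from the target point."""
    num = den = 0.0
    for (x, y, value) in stations:
        d = math.hypot(target[0] - x, target[1] - y)
        if d == 0.0:
            return value  # target coincides with a station
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# (x, y, observed value) at three hypothetical gauges
obs = [(0.0, 0.0, 12.0), (3.0, 1.0, 18.0), (1.0, 4.0, 9.0)]
print(round(idw_estimate((1.0, 1.0), obs), 2))  # estimate at an unsampled point
```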

Visualization — Visualization services not only support hypothesis generation, analysis, and modeling but also provide a mechanism for scientists to communicate their findings to other scientists, applied users, and the public. An example of an online visualization service available at the USGS is the Disease Maps Web site (http://diseasemaps.usgs.gov), a simple tool that allows users to see the spatial distribution of wildlife and zoonotic diseases (such as West Nile virus) in different years at national or state levels. Another USGS Web site, WaterWatch (http://waterwatch.usgs.gov/), facilitates spatial data visualization and provides daily, local-level stream flow data, which can also be accessed through Google Maps. It displays real-time stream flow and allows comparison with historical stream flow by station.
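To illustrate how such services expose their underlying data programmatically, the sketch below retrieves recent real-time streamflow for a single gauge from the publicly documented USGS Instantaneous Values web service (https://waterservices.usgs.gov/nwis/iv/). The station number, parameter code, and JSON layout shown are assumptions based on that public service, not details specified in this report.

```python
# Illustrative sketch: fetch recent real-time streamflow for one gauge from the
# USGS Instantaneous Values service. Parameter 00060 is discharge in cubic feet
# per second; the example site and response layout are assumptions.
import requests

NWIS_IV_URL = "https://waterservices.usgs.gov/nwis/iv/"

def recent_discharge(site="01646500", period="P1D"):
    """Return (timestamp, value) pairs of discharge for the requested period."""
    params = {
        "format": "json",
        "sites": site,           # example gauge on the Potomac River
        "parameterCd": "00060",  # discharge, cubic feet per second
        "period": period,        # ISO 8601 duration, e.g. P1D = past day
    }
    resp = requests.get(NWIS_IV_URL, params=params, timeout=30)
    resp.raise_for_status()
    series = resp.json()["value"]["timeSeries"][0]
    readings = series["values"][0]["value"]
    return [(r["dateTime"], float(r["value"])) for r in readings]

if __name__ == "__main__":
    for when, cfs in recent_discharge()[-5:]:
        print(f"{when}: {cfs:.0f} cfs")
```

A site such as WaterWatch layers mapping and historical-percentile comparisons on top of this kind of raw feed.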

The Spatial Data Infrastructure as a Workflow Platform

Over the last decade, various scientific programs have begun to incorporate workflow methods into their best practices. Most of the programs highlighted in Chapter 3 use workflows to collect raw observation data and make them available to thousands of researchers worldwide through specialized analytical and visualization tools (such as the National Center for Atmospheric Research and National Science Foundation collaboratories). Workflow approaches are also being developed by the OGC through its Open Web Services testbeds and through the Architecture Implementation Pilot of the Global Earth Observing System of Systems.
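As a minimal sketch of the kind of automated data-preparation and analysis pipeline described here, and elaborated in the paragraph that follows, the code below chains acquisition, cleaning, and summary steps and records simple provenance for each run. The step names and the dictionary-based provenance log are hypothetical, not drawn from any OGC or USGS specification.

```python
# Minimal sketch of a workflow of the kind an SDI workflow layer would automate.
# Step names and the provenance record are hypothetical; real systems add
# scheduling, distributed execution, and standardized metadata.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Callable

@dataclass
class Workflow:
    steps: list = field(default_factory=list)        # (name, callable) pairs
    provenance: list = field(default_factory=list)   # log of completed steps

    def step(self, name: str):
        """Register a function as a named workflow step."""
        def register(fn: Callable[[Any], Any]):
            self.steps.append((name, fn))
            return fn
        return register

    def run(self, data: Any) -> Any:
        """Pass data through each step in order, logging what ran and when."""
        for name, fn in self.steps:
            data = fn(data)
            self.provenance.append(
                {"step": name, "finished": datetime.now(timezone.utc).isoformat()}
            )
        return data

wf = Workflow()

@wf.step("acquire")
def acquire(_):
    # Stand-in for fetching raw observations from a data service.
    return [3.1, 2.7, float("nan"), 4.4]

@wf.step("clean")
def clean(values):
    # Drop missing observations before analysis (NaN != NaN).
    return [v for v in values if v == v]

@wf.step("summarize")
def summarize(values):
    return {"n": len(values), "mean": sum(values) / len(values)}

if __name__ == "__main__":
    print(wf.run(None))    # summary of the cleaned observations
    print(wf.provenance)   # ordered record of the steps that produced it
```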

The USGS will need to consider implementing high-throughput workflow processes as a means of advancing the overall scalability of the SDI in support of the USGS Science Strategy and the SDI’s long-term role in geospatial knowledge capture, preservation, and reuse. Workflow techniques will need to become an essential technology layer in an SDI—one that enables research on a large scale by automating complex data preparation and analysis pipelines and by facilitating cross-disciplinary analysis, visualization, and predictive modeling. Providing computing, analytics, visualization, and other application processes as an SDI layer moves them closer to the data and makes it possible to leverage


