IT Roadmap to a Geospatial Future (2003)

1
Introduction and Context

VOYAGES OF THE 21ST CENTURY

In the 15th century, advances in geospatial knowledge and the technology to take advantage of them changed the world. Prince Henry the Navigator (see Box 1.1) foresaw that the discovery of a maritime trade route from Europe to India would make Portugal a world power and enable it to acquire wealth far out of proportion to its modest size and population. At the time, even the existence of a route around Africa was in doubt, and serious technical challenges stood in the way of achieving Henry’s vision. He established the School of Navigation, where the best scholars from diverse disciplines—including astronomy, cartography, and maritime technology—could collaborate. Their inventions in navigation and sailing turned geography inside out. The seas, instead of the land, became the medium for the establishment of world trade and the connection between Eastern and Western civilizations.

Today we are on the cusp of a similar transformation through the convergence of four independent technological advances (see Figure 1.1):

  • A sharp increase in the quality and quantity of geospatial data (see Box 1.2). Geospatial data have become of critical importance in areas ranging from crisis management and public health to national security and international commerce.

  • Location-aware computing. The availability of small mobile devices using wireless communication has made it possible to acquire location-specific information anywhere, anytime. The Global Positioning System (GPS) allows users to calculate physical locations quickly and reliably.

BOX 1.1 Henry the Navigator

Early in the 15th century, interest in exploration had awakened in many European nations, owing to the discovery and translation into Latin of Ptolemy’s second-century Geography, wide publicity of Marco Polo’s earlier journeys, and increasing trade with Asia via Arab middlemen. Portugal’s Henry the Navigator (Prince Henry, 1394-1460) foresaw that the discovery of a maritime route to India could dramatically lower the cost of trade and thus gain Portugal a decisive trading advantage vis-à-vis its European rivals, including Spain and Venice.

At the time, European knowledge about Africa was essentially limited to the Mediterranean coast and the lower Nile, and European sailors rarely entered the Atlantic Ocean. When they did, the ships that navigated along the shores of the African coast risked running aground, while those who attempted to steer into open waters could stray too far and be lost, since open-water navigation in that era was mostly guesswork. To measure latitude they used the star Polaris, which is not visible in the Southern Hemisphere. The furthest south anyone had sailed was Cape Bojador (at the southern end of the Atlantic coast of modern-day Morocco). No one knew whether Africa continued all the way to the mythical “Southern Continent” of Ptolemy or if one could sail around it. Ship technology was primitive and ill suited to the demands of long voyages, which often involved long passages against prevailing winds.

To overcome these technical challenges, Henry founded a multidisciplinary community of scholars—the School of Navigation at Sagres, at the southern tip of Portugal. Here, Abraham Zacuto published the first accurate solar ephemeris and improved the astrolabe for measuring the positions of heavenly bodies. These two advances enabled accurate determination of latitude far out to sea. Cartography improved, and a new type of ship was designed, the caravel, that could sail close to the wind. A series of ocean voyages that probed ever southward culminated in Vasco da Gama’s sailing around Africa to India in 1498. The sea route to India had been discovered, 38 years after Henry’s death and almost 70 years after Portugal’s maritime quest had begun. These maritime advances enabled Portugal to establish dominance over the sea lanes to the east that would go unchallenged for nearly a century.

In the mid-18th century, John Harrison’s invention of the chronometer completed the technological picture of that era by enabling accurate determination of time, and thus longitude, at sea.

SOURCE: “The European Voyages of Exploration,” University of Calgary Applied History Research Group, <http://www.ucalgary.ca/applied_history/tutor/eurvoya/>.

FIGURE 1.1 The convergence of four independent technological advances has the potential to transform the world.

  • Databases and data mining. The escalating availability of digital information has prompted the development of hardware and software systems capable of managing extremely large data sets. Improvements in geospatial database techniques and data mining can improve our ability to analyze the vast amount of data being collected and stored.

  • Human-computer interaction with geospatial information. Visualization and related virtual/augmented-reality1 technologies, multimodal interfaces, and collaborative decision-support environments are examples of interaction technologies that are maturing rapidly. Together, they can enhance human abilities to understand and interact with geospatial data.

1  

Augmented reality systems enhance a user’s view of a real scene with computer-generated information.

BOX 1.2 What Are Geospatial Data?

Executive Order 12906 (which called for the development of a National Spatial Data Infrastructure) defines geospatial data as “information that identifies the geographic location and characteristics of natural or constructed features and boundaries on the earth.”1 Examples of geospatial data include a forest, a wildfire, a satellite image, addresses of homes and businesses, and the GPS coordinates of a car. Although time is considered to be a dimension of geospatial data (or “geodata”), the term “spatiotemporal data” often is used to emphasize data that vary over time or have a time-critical attribute. The extent of a wildfire as it burns is an example of spatiotemporal data.

Geodata are different from traditional types of data in key ways. Among the most important differences is that geodata are high dimensional (highly multivariate) and autocorrelated (i.e., nearby places are similar). Autocorrelation is a feature to be exploited (e.g., it allows predictions to be made about places for which there are no data) but it also prevents application of standard statistical methods.2 Some geospatial data contain distance and topological information associated with Euclidean space, whereas others represent non-Euclidean properties, such as travel times along particular routes or the spread of epidemics.
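
To make the point concrete, the following illustrative sketch (in Python; the observation values and coordinates are invented for this example) exploits autocorrelation in the simplest possible way, estimating a value at an unsampled location by inverse-distance weighting of nearby observations:

    import math

    def idw_estimate(samples, x, y, power=2.0):
        """Inverse-distance-weighted estimate at (x, y).

        samples: list of (xi, yi, value) observations. Closer observations
        get larger weights, which is exactly the autocorrelation assumption:
        nearby places tend to have similar values.
        """
        num, den = 0.0, 0.0
        for xi, yi, v in samples:
            d = math.hypot(x - xi, y - yi)
            if d == 0.0:
                return v              # exact hit: return the observed value
            w = 1.0 / d ** power      # inverse-distance weight
            num += w * v
            den += w
        return num / den

    # Hypothetical soil-moisture readings at four survey points
    obs = [(0.0, 0.0, 0.31), (1.0, 0.0, 0.35), (0.0, 1.0, 0.28), (1.0, 1.0, 0.40)]
    print(idw_estimate(obs, 0.25, 0.25))   # estimate at an unsampled location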

Digital representations of geospatial data are moving beyond the traditional, well-structured vector data (geometric shapes that describe cartographic objects) and raster data (digitized photographs and scanned images) formats. A more common conceptualization of geographic reality is based on the field and object representation models. The field model views geospatial data as a set of distributions over space (such as vegetation cover), whereas the object model represents the earth as a surface of discrete, identifiable entities (e.g., roads and buildings; Fonseca, Egenhofer, and Agouris, 2002). Some geospatial entities are discrete objects, whereas many others are continuous, irregularly shaped, and inexact (or fuzzy). For example, a storm is a continuous four-dimensional (4D) phenomenon but must be represented in digital form as a series of approximate discrete objects (e.g., extent, wind velocity, direction), resulting in uncertainty, errors, and reduced accuracy. An integrated conceptualization combining the field and object perspectives is increasingly important and necessary to represent, for example, a storm as an object at one scale and to model its structure as a field at a different scale.
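
The distinction can be made concrete with a deliberately simplified sketch (the class names, coordinates, and attribute values below are invented for illustration): a field assigns a value to every location on a grid, whereas an object is a discrete entity with an identity, a geometry, and attributes.

    from dataclasses import dataclass, field

    @dataclass
    class GridField:
        """Field model: a value (e.g., vegetation cover fraction) for every grid cell."""
        origin: tuple          # map coordinates (x, y) of cell (0, 0)
        cell_size: float       # cell edge length in map units
        values: list           # row-major grid of cell values

        def value_at(self, x, y):
            col = int((x - self.origin[0]) / self.cell_size)
            row = int((y - self.origin[1]) / self.cell_size)
            return self.values[row][col]

    @dataclass
    class FeatureObject:
        """Object model: a discrete, identifiable entity with geometry and attributes."""
        feature_id: str
        geometry: list                          # e.g., a building footprint polygon
        attributes: dict = field(default_factory=dict)

    cover = GridField(origin=(0.0, 0.0), cell_size=30.0, values=[[0.8, 0.6], [0.4, 0.1]])
    school = FeatureObject("bldg-17", [(5, 5), (25, 5), (25, 20), (5, 20)],
                           {"use": "school", "floors": 2})
    print(cover.value_at(40.0, 10.0), school.attributes["use"])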

The characteristics of geospatial data pose unique challenges to geospatial applications. The requirements of a geospatial data set—such as the coordinate system, precision, and accuracy—are often specific to one application and may be difficult to use or integrate with other geospatial applications. Geospatial applications may produce erroneous results if the metadata (measurement methods, transformation operations performed, etc.) associated with a geospatial data set are inaccurate or missing. The large data volumes—measured either in the number of entities or in the total computational storage (bytes)—typically associated with geospatial applications stress the ability of geospatial algorithms and computing systems to store and analyze geodata in a timely and efficient manner. This is compounded further by the wide range of spatial and temporal scales of geospatial data. For example, an application that is well suited for storing and analyzing data at a very small scale (e.g., a neighborhood or town) may be very inefficient and ill suited for analysis or queries at the country or continent scale. Although other application domains grapple with many of these same issues, few, if any, must deal with all of these issues simultaneously.

1  

Available online at <http://www.fgdc.gov/publications/documents/geninfo/execord.html>.

2  

From a white paper, “Data Mining Techniques for Geospatial Applications,” prepared for the committee’s workshop by Dimitrios Gunopulos.

The convergence of advances in these areas can potentially transform the world. As in Henry the Navigator’s era, however, seizing this opportunity requires that we have the vision to recognize both the potential and the interdisciplinary research synergy that will be needed to realize it.

SCENARIOS

The committee envisions a world in which all geospatially relevant information is available in a timely fashion to those authorized to have access, in ways that are natural to use and, when desirable, coordinated with real-world context. The following scenarios illustrate the exciting opportunities enabled by research to enhance the accessibility and usability of geospatial information.

Just-in-Time Mapping

The first example shows how future technology could be harnessed to manage situations, such as the aftermath of a tragedy, in which human lives are at risk. Consider a hypothetical scenario some years hence:

A devastating earthquake, “the Big One,” has hit downtown San Francisco. A huge complex of skyscrapers built on reclaimed land has caved in. It is feared that thousands of people are trapped in the rubble. Emergency personnel have little time in which to rescue them. Although cranes and heavy earthmoving equipment have been put in place with amazing speed, it is not clear how the excavation should proceed. With unstable interior spaces and broken gas and electric lines, it is not clear how to excavate in a way that is fast yet will not further injure survivors. Time is ticking away and with it, hopes for survival.

With few options left, the disaster-relief director decides to use an experimental, robot-based, just-in-time three-dimensional (3D) mapping capability that was developed after the September 11, 2001, World Trade Center calamity. Thousands of small mobile robots (“mapants”) burrow into the rubble. Each robot is equipped with location-sensing ability as well as with visual, toxic gas, and other sensors.2 The key to the speed of the just-in-time mapping application is the enormous parallelism made possible by the huge number of mapants. To conserve energy and to enable communication through the rubble (which has large concentrations of steel), the robots use ad hoc wireless communication to share data with one another and with high-powered computers located outside the rubble. The computers perform planning tasks and assist the mapants with compute-intensive tasks such as image recognition and visualization of the map as it is constructed.

Although early trials of this approach have been promising, it still is considered highly experimental. This is its first use in a real-world event. After an initial planning phase, the mapants are let loose. Each has its own mission but also is cognizant that this is a team effort. The thousands of mapants organize themselves according to the planned strategy, burrowing and climbing as needed. They possess sufficient autonomy to handle unexpected situations. Once a mapant has reached a designated region, it explores that region and reports on what it senses. The input from these mapants is combined to produce a 3D map showing (with centimeter accuracy) the location of potential survivors, fires, dangerous gases, and other critical information. Human experts monitor the progress of the mapants, review early maps, direct the robots to areas of interest, evaluate dangers, and select strategies for mapping refinements. As the mapping of top layers of the rubble is completed, the mapants move deeper and excavation of the just-mapped region begins. Within 48 hours, many survivors are rescued who might have perished were it not for the assistance of the mapants.

2  

For further discussion of research challenges for networked systems of embedded computers, see Embedded, Everywhere (CSTB, 2001).

Clearly this is science fiction today. Yet we might question why it appears to be so futuristic, since many of the component technologies—such as miniature robots, ad hoc wireless communication, and strategy planning—are active areas of research today. The answer is that the situation represented in this scenario would considerably stretch each of these technologies and, more importantly, would require that they be integrated in ways that have never before been attempted. Following are some examples of the scientific and engineering challenges that will have to be overcome:

  • Engineering of small, autonomous mobile robots capable of burrowing, climbing, and so forth;

  • Planning and coordination of scalable, ad hoc robot diffusion;

  • Centimeter-accuracy location sensing without supporting infrastructure;

  • Robot-to-robot wireless communication through steel-filled rubble;

  • Communication infrastructure that is robust and pervasive, even during emergencies;

  • Real-time analysis and integration of heterogeneous data (detection of fire, dangerous chemical leaks, or life signs) from distributed sources;

  • Real-time map construction and refinement (see the sketch following this list); and

  • Map-based user interfaces that allow coordinated teams of human experts to quickly and easily direct the mapants and analyze the results.
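
None of the mapant technology exists as described, but one small ingredient of the map-construction challenge can be illustrated with a standard technique: fusing independent, noisy occupancy reports from many sensors into a single grid map by accumulating log-odds evidence per cell. The sketch below is a toy illustration only; the sensor model and all numbers are assumed for the example.

    import math
    from collections import defaultdict

    # Log-odds occupancy-grid fusion: each report from a robot says whether a
    # grid cell looked occupied or free, under an assumed detection model.
    # Evidence from many robots is additive in log-odds space, so reports can
    # arrive in any order and be fused incrementally.
    P_HIT, P_FALSE = 0.9, 0.2                        # assumed sensor model (illustrative)
    L_OCC = math.log(P_HIT / P_FALSE)                # evidence from an "occupied" report
    L_FREE = math.log((1 - P_HIT) / (1 - P_FALSE))   # evidence from a "free" report

    log_odds = defaultdict(float)                    # cell (row, col) -> accumulated log-odds

    def integrate_report(cell, occupied):
        log_odds[cell] += L_OCC if occupied else L_FREE

    def occupancy_probability(cell):
        return 1.0 / (1.0 + math.exp(-log_odds[cell]))

    # Three hypothetical robots report on the same cell
    for robot_says_occupied in (True, True, False):
        integrate_report((12, 7), robot_says_occupied)

    print(round(occupancy_probability((12, 7)), 3))   # combined belief that the cell is occupied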

Controlling Wildfires

A second scenario illustrates how geospatial data from a wide array of sources could be integrated with powerful computational models, enabling us to predict more accurately the onset and behavior of wildfires.3 The size and severity of wildfires depend on how quickly and effectively firefighting resources are deployed and how successfully the areas of high risk can be evacuated. In our hypothetical future, a wildfire hazard system is in constant operation:

3  

A 2000 National Science Foundation Workshop on Dynamic Data Driven Application Systems explored research opportunities created by dynamic/adaptive applications that can dynamically accept and respond to field-collected data. See <http://www.cise.nsf.gov/eia/dddas/> for more information.

The wildfire hazard system automatically monitors the national landscape to ensure early detection of fire outbreaks. Although dry fuel load (biomass with low water content) is the most direct indicator of potential fire severity, it is too difficult to measure over large areas, because remote optical instruments respond to the radiation reflected from the leaves rather than the dry fuel. Because ground-based sensors are impractical over vast areas, the new system monitors data (e.g., lightning strikes, Doppler weather radar, soil surface properties, and wind data) harvested from satellites. A wide array of satellites—some of them engaged in classified or proprietary reconnaissance—has been deployed in recent years, making it possible to acquire data updates at coarse spatial resolution almost continuously, with higher-resolution (~1 km) data available at intervals of several hours. The wildfire hazard system warns of the possibility of fires by combining these measurements with spatially distributed models of plant growth and drying (as functions of energy and water inputs, which vary at the synoptic scale as well as locally with elevation and slope orientation) and with spatiotemporal data about historical wildfire occurrences (Callaway and Davis, 1993). Once a fire starts, satellites sensing radiation in the infrared portion of the spectrum can detect small, hot areas as long as their view is not obscured by clouds (Giglio and Kendall, 2001). Not all of these hot targets are fires, however, so to avoid false alarms, the hazard system must integrate, mine, analyze, and cross-compare data to reliably identify wildfire outbreaks.

When an apparent wildfire is detected, a standby alert is issued to emergency response authorities. The measurements from the remote sensing instruments are passed to a system component that calculates the geographic boundaries of the fire itself and of the area affected by smoke. The system automatically identifies potentially relevant data sets, and it harvests data on vegetation/biology, wildfire-spread factors (vegetation flammability, location of natural and man-made fire barriers, etc.), and meteorological conditions. Weather prediction and chemical plume diffusion models are activated to forecast how the fire and smoke/debris will spread. A wildfire is especially complicated because its behavior depends on the three-dimensional flow of air over terrain, which in turn depends on both synoptic weather conditions and the convection that the fire itself causes. The hazard system combines models of the airflow with the Doppler wind profilers to estimate the state of the overlying atmosphere. As the wildfire spreads, the hazard system rapidly updates the models to predict the future behavior of the fire.

An emergency response component is activated to cross-analyze the simulation results with data on the locations of population centers, remote dwellings or businesses, and evacuation routes. Results are presented to a distributed control team that reviews the data, evaluates the risks, and collaboratively selects a plan of action. Public agencies are alerted to begin the evacuation process, with detailed routing information provided automatically to all cell phones, pagers, PDAs, and other location-aware devices4 in the affected area. Meanwhile, a fire control component is activated. This cross-analyzes the original simulation results—the wildfire-spread prediction model continues to run, using constantly updated sensing data—with data on access paths for firefighting equipment and personnel. The component proposes strategies for combating the fire and predicts the relative effectiveness of each strategy in containing damage to natural resources and property. As firefighting crews are dispatched, they are provided with strategic scenarios and routing information. Real-time updates flowing through the system make it possible to adjust strategies and routing as conditions change.

In this scenario, a number of new challenges arise because predictive models have been coupled with the time-critical analysis of extremely large amounts of data:

  • Development of systems that can harvest classified and proprietary data, with appropriate barriers to unlawful access;

  • Methods for integrating computational, observed, and historical data in real time;

  • Methods for dynamically coupling independent numerical models and infusing external data into them to develop, evaluate, and continuously refine strategies for emergency response;

  • Algorithms capable of tracking moving and evolving objects and predicting their future state;

  • Methods for automatically identifying and communicating with persons in the affected area via wired and wireless communication mechanisms (household and cellular telephone numbers, pagers, PDAs, satellite TV and radio, cable TV, and the Internet) based on geographic location (see the sketch following this list); and

  • User interfaces empowering a range of users (from emergency responders to local government officials) with little or no training to collaboratively evaluate proposed plans and coordinate actions.
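
As one concrete ingredient of the geographic-notification challenge above, the following sketch shows the classic ray-casting point-in-polygon test that a notification service might use to decide whether a device's last reported coordinates fall inside an affected area. It is a minimal illustration; the polygon, device list, and names are invented, and a real system would also need spatial indexing, datum handling, and privacy safeguards.

    def point_in_polygon(x, y, polygon):
        """Ray-casting test: count crossings of a horizontal ray from (x, y)."""
        inside = False
        n = len(polygon)
        for i in range(n):
            x1, y1 = polygon[i]
            x2, y2 = polygon[(i + 1) % n]
            crosses = (y1 > y) != (y2 > y)
            if crosses and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
        return inside

    # Hypothetical evacuation zone (lon, lat) and last-known device positions
    zone = [(-120.50, 38.10), (-120.30, 38.10), (-120.30, 38.30), (-120.50, 38.30)]
    devices = {"phone-041": (-120.42, 38.18), "pager-007": (-120.10, 38.20)}

    to_notify = [d for d, (lon, lat) in devices.items()
                 if point_in_polygon(lon, lat, zone)]
    print(to_notify)   # -> ['phone-041']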

Digital Earth

The final scenario, taken from Gore (1998), illustrates how new technologies and methods could enrich our understanding of the world and the historical events that have shaped it. Imagine that a grade-school student is visiting an exhibit in a local museum. The Digital Earth exhibit is a multiresolution, three-dimensional representation of the world that allows her to interactively explore the vast amounts of physical, cultural, and historical information that have been gathered about the planet.5 The exhibit also provides tutorials that explain difficult concepts and guide her exploration (e.g., What is ocean productivity? How is it measured?).

4  

This notification approach is being contemplated in other arenas in spite of potential drawbacks, including false positives and the inability to reach everyone.

“After donning a head-mounted display, she sees Earth as it appears from space. Using a data glove, she zooms in, using higher and higher levels of resolution, to see continents, then regions, countries, cities, and finally individual houses, trees, and other natural and man-made objects. Having found an area of the planet she is interested in exploring, she takes the equivalent of a ‘magic carpet ride’ through a 3D visualization of the terrain. Of course, terrain is only one of the many kinds of data with which she can interact. Using the system’s voice recognition capabilities, she is able to request information on land cover, distribution of plant and animal species, real-time weather, roads, political boundaries, and population.

“This information can be seamlessly fused with the digital map or terrain data. She can get more information on many of the objects she sees by using her data glove to click on a hyperlink. To prepare for her family’s vacation to Yellowstone National Park, for example, she plans the perfect hike to the geysers, bison, and bighorn sheep that she has just read about. In fact, she can follow the trail visually from start to finish before she ever leaves the museum in her hometown.

“She is not limited to moving through space; she also can travel through time. After taking a virtual fieldtrip to Paris to visit the Louvre, she moves backward in time to learn about French history, perusing digitized maps overlaid on the surface of the Digital Earth, newsreel footage, oral history, newspapers, and other primary sources. She sends some of this information to her personal e-mail address to study later. The time line, which stretches off in the distance, can be set for days, years, centuries, or even geological epochs, for those occasions when she wants to learn more about dinosaurs.”

5  

“Digital Earth” is a broad international initiative using Earth as a metaphor for an information system and network that supports an easy-to-use human user interface for accessing multiple digital, dynamic 3D representations of the Earth and its connected information resources contained in the world’s libraries, and scientific and government institutions. The initiative was begun in the United States by NASA; since 1998, many countries, international agencies, and organizations have been working to develop standards, specifications, and information content to implement Digital Earth interoperability. For more information, see <http://www.digitalearth.net.cn> and <http://www.digitalearth.ca>.

Suggested Citation:"1. Introduction and Context." National Research Council. 2003. IT Roadmap to a Geospatial Future. Washington, DC: The National Academies Press. doi: 10.17226/10661.
×

As envisioned in 1998, Digital Earth was intended to support individuals or, at most, co-located groups. Although many of the goals for Digital Earth have not yet been realized (and remain research challenges), one can imagine a next-generation Digital Earth that can connect distributed individuals through teleimmersive environments. In the scenario sketched above, the young girl on her virtual field trip could interact directly with a child in another country or with distributed groups of students engaging in collaborative learning activities that take advantage of their collective abilities, resources, and access to real-world locations. Realizing this vision will require not just advances in technology but also solutions to significant challenges related to human capabilities:

  • Data integration techniques capable of merging data of vastly different types, spatial resolutions, and temporal scales in response to human queries;

  • Supporting technologies for extremely large and diverse geospatial databases, including complex data models, scalable searching and navigation techniques (see the sketch following this list), and scalable analysis on the fly;

  • Distributed virtual-reality environments that are uncomplicated and responsive enough to suit the general public; and

  • Intuitive, multimodal interfaces capable of supporting unconstrained navigation through virtual space and time as well as guided exploration of the concepts.
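
To give a hypothetical flavor of the scalable-searching challenge noted in the list above: rather than scanning every feature, a spatial index buckets features by location so that a map-window query touches only nearby buckets. The uniform-grid index below is a deliberately minimal sketch (production systems typically use R-trees, quadtrees, or similar structures); all identifiers and coordinates are invented.

    from collections import defaultdict

    class GridIndex:
        """Bucket point features by grid cell so range queries skip distant data."""

        def __init__(self, cell_size):
            self.cell = cell_size
            self.buckets = defaultdict(list)   # (cx, cy) -> [(x, y, payload), ...]

        def _key(self, x, y):
            return (int(x // self.cell), int(y // self.cell))

        def insert(self, x, y, payload):
            self.buckets[self._key(x, y)].append((x, y, payload))

        def query(self, xmin, ymin, xmax, ymax):
            """Return payloads inside the axis-aligned query window."""
            kx0, ky0 = self._key(xmin, ymin)
            kx1, ky1 = self._key(xmax, ymax)
            hits = []
            for cx in range(kx0, kx1 + 1):     # visit only cells overlapping the window
                for cy in range(ky0, ky1 + 1):
                    for x, y, payload in self.buckets.get((cx, cy), ()):
                        if xmin <= x <= xmax and ymin <= y <= ymax:
                            hits.append(payload)
            return hits

    index = GridIndex(cell_size=10.0)
    index.insert(3.0, 4.0, "geyser-basin-sign")
    index.insert(57.0, 81.0, "trailhead")
    print(index.query(0.0, 0.0, 10.0, 10.0))   # -> ['geyser-basin-sign']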

WHY NOW?

The volume, quality, and resolution of geospatial data are increasing exponentially. Driving this sharp increase are American and international remote sensing programs, both public and private. For example, Terra, the flagship spacecraft of the National Aeronautics and Space Administration’s (NASA’s) Earth Observing System, produces 194 gigabytes (GB) per day and Landsat 7 produces an additional 150 GB per day.6 These data are accessible to a wide range of users, because science specialists interpret the raw data in the form of easily understandable variables (e.g., surface temperature, radiation balance, vegetation index, ocean productivity). When these higher-level products are included, the data volume from these two satellites alone amounts to a terabyte (10^12 bytes) of geospatial data per day.7

6  

Data from Earth Observing System Data & Information Systems homepage at <http://spsosun.gsfc.nasa.gov/eosinfo/EOSDIS_Site/index.html>.

In just one year, the size of NASA’s earth science data holdings has doubled. The growth rate is certain to increase; for example, a single copy of a color satellite image of the entire planet, at 1-meter resolution, exceeds 1 petabyte, or 10^15 bytes (Crockett, 1998). There are, of course, many other sources of geospatial data, including global positioning satellites, aerial photographs, distributed sensor networks,8 embedded devices, and other location-aware technologies. This increase in the generation of geospatial information creates the potential for an order-of-magnitude advance in knowledge about our natural and human world and in our ability to manage resources and react to the world’s dynamic nature. At this time, however, our ability to generate geospatial information is outpacing our ability to analyze it. The research contribution lies in finding ways to facilitate that analysis through better spatiotemporal database organization strategies, improved geospatial data reduction and data simplification methods, and new geovisualization techniques.
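
A rough back-of-the-envelope calculation, using an assumed pixel depth of 3 bytes (one byte per color band), shows why a single 1-meter global color image lands in the petabyte range and how the daily satellite volumes cited above add up:

    # Rough, assumption-laden arithmetic for the data volumes cited above.
    EARTH_SURFACE_M2 = 5.1e14      # Earth's total surface area, ~5.1e8 km^2
    BYTES_PER_PIXEL = 3            # assume 3 color bands at 8 bits each

    pixels_at_1m = EARTH_SURFACE_M2 / (1.0 * 1.0)   # one pixel per square meter
    image_bytes = pixels_at_1m * BYTES_PER_PIXEL
    print(f"1-m global color image: ~{image_bytes / 1e15:.1f} petabytes")

    # Daily raw volume from the two satellites mentioned in the text
    terra_gb, landsat7_gb = 194, 150
    print(f"Terra + Landsat 7 raw data: {terra_gb + landsat7_gb} GB/day")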

Advances in location-aware computing are having a great effect not just on how geospatial data are acquired but also on how and with what quality they can be delivered. Sensors that can record the location and some information about the surrounding environment (e.g., temperature and humidity) are being deployed to monitor the state of roads, buildings, agricultural crops, and many other objects. “Smart Dust” sensors (devices that combine microelectromechanical (MEMS) sensors with wireless communication, processing, and batteries into a package approximately a cubic millimeter in size) can be deployed along remote mountain roads to determine the velocity and direction of passing vehicles (Pister, 2002). The data transmitted wirelessly in real time from these sensors increase not only the volume but also the complexity of available data. Data that describe continuously moving and evolving objects, such as buoys floating on the ocean currents that record ocean temperatures, pose significant obstacles to current database and data mining techniques.
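
Data describing continuously moving objects do not fit the static-record model well: the "position" of a drifting buoy is a function of time, so even a basic lookup requires interpolation between stored fixes. The following minimal sketch (with invented readings) illustrates such a trajectory query:

    from bisect import bisect_left

    # A drifting buoy stored as timestamped fixes: (t_seconds, lon, lat, sea_temp_C).
    track = [
        (0,     -150.00, 20.00, 26.1),
        (3600,  -150.05, 20.02, 26.0),
        (7200,  -150.12, 20.05, 25.8),
    ]

    def position_at(track, t):
        """Linearly interpolate the buoy's (lon, lat) at an arbitrary time t."""
        times = [fix[0] for fix in track]
        i = bisect_left(times, t)
        if i == 0:
            return track[0][1:3]
        if i == len(track):
            return track[-1][1:3]
        t0, lon0, lat0, _ = track[i - 1]
        t1, lon1, lat1, _ = track[i]
        frac = (t - t0) / (t1 - t0)
        return (lon0 + frac * (lon1 - lon0), lat0 + frac * (lat1 - lat0))

    print(position_at(track, 5400))   # halfway between the second and third fixes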

7  

The BaBar experiment, a collaboration of 600 physicists from nine nations that observes subatomic particle collisions, is another example of the increasing generation of scientific data. The amount of data generated per day by BaBar increased from about 500 GB in April 2002 to over 663 GB in October 2002. For more information, see <http://www.slac.stanford.edu/BFROOT/www/Public/Computing/Databases/>.

8  

CSTB’s Embedded, Everywhere report examines the implications of heterogeneous, sensor-rich computational and communications devices embedded throughout the environment. It describes the research necessary to achieve robust, scalable, networked, embedded computing systems, which operate under a unique set of constraints and present fundamental new research challenges (CSTB, 2001).

Other unique properties of geospatial data present challenges that go well beyond current capabilities for organizing and analyzing information. The diverse sources of geospatial data, which typically use dissimilar standards or formats (e.g., relative vs. absolute position), make data integration problematic. Integration and analysis are particularly challenging when data are represented at different scales, because objects at one scale (such as residential buildings) may appear only implicitly at another (e.g., implied by particular types of land-use zone), or they may not be represented at all (MacEachren and Kraak, 2001). The semantics of geospatial data often are difficult to define (e.g., Where does a forest end? When is one object north of another? At what bearing is it no longer north?) and may be different from one application domain to the next.9 Much progress has been made in the past two decades on geospatial databases and data mining activities but shortfalls remain, particularly in the combination of geospatial and temporal data.

Finally, because geospatial data already are ubiquitous in our everyday lives, users vary widely in background and expertise. The challenge is to provide information and services in a manner that satisfies the requirements of diverse audiences, from scientific experts to schoolchildren. For centuries, visual displays in the form of maps (and images) provided a primary interface to information about the world. Recent advances in visualization, image enhancement, and virtual-reality technologies can be exploited for working with geospatial information. Evolving technologies soon will create the potential to go beyond visual displays as the primary interface between humans and geospatial information. Multimodal interfaces (combining sound, touch, and force feedback with vision) and collaborative decision-making environments could allow users to interact with geospatial information in totally new ways, constructing new knowledge about the world and applying that knowledge to critical problems facing science and society.

The convergence of advances in location-aware computing, databases and data mining technologies, and human interaction technologies, combined with a sharp increase in the quality and quantity of geospatial information, promises to transform our world. This report identifies the critical missing pieces needed to achieve the 21st-century equivalent of Prince Henry the Navigator’s voyages.

9  

There are no standard definitions that allow quantifying features such as a forest, soil, or land cover. Geologists and other experts often create and use their own definitions when creating databases and maps. See Chapter 3 of this report for a discussion of the research challenges in geospatial ontologies.

ORGANIZATION OF THIS REPORT

This report is organized around different categories of research. One challenge in identifying promising research directions in computer science enabled by geospatial information is the breadth of application domains and technologies involved. Some of the issues and topics are specific to geospatial data, but others have broader applicability—which implies greater leverage for the recommended research investments. The committee chose to focus on three key themes: location-aware computing, geospatial databases and data mining,10 and human interaction with geospatial data. It drew on presentations and discussions at the workshop, augmenting that material with its own insights. (See Appendix B for the workshop agenda and Appendix C for a list of white papers, which are available online at <http://www.cstb.org/web/project_geospatial_papers>.) Chapter 2 explores the state of the art, research directions, and possible future application scenarios in location-aware computing. Chapter 3 outlines research challenges in database technologies and data mining techniques that are needed to manage and analyze immense quantities of geospatial information. Chapter 4 highlights new approaches for enabling users to experience “realistic” representations of high-dimensional environments. The committee’s recommendations are presented in the Executive Summary.

WHAT THIS REPORT DOES NOT DO

The development of a comprehensive, prioritized research agenda is, of course, beyond the scope of a single workshop. Rather, working within its constrained resources, the committee tried to highlight the important themes that are emerging in computer science as a result of complex geospatial information. Not all challenges raised at the workshop are presented in this report. The committee focused on issues where the most fruitful approaches for future research efforts could be identified. Its intention was that the report motivate a more comprehensive assessment of the wide array of challenges occurring at the intersection of computer science and geospatial information.

This report also does not attempt to outline the implications of policy issues associated with geospatial information. As geospatial data are more widely used, important ethical, legal, and sociological issues are likely to arise.

10  

Although the workshop held four separate breakout groups (location-aware computing and sensing; spatial databases; content and knowledge distillation; and visualization, human-computer interaction, and collaborative work), the committee believed that the close dependency between the accessing and processing of data and data analysis argued for combining the database and knowledge distillation themes.

They include such things as the liability of data and software tool providers, intellectual property rights, and the rules that should govern information access and use—including privacy and confidentiality protection. Issues associated with the availability of government-collected geospatial information include constraints owing to national security concerns, policies that limit the release of data obtained for government use that could compete with data from commercial providers, and the cost of preparing data sets for public release. Moreover, access practices vary at the local, regional, national, and international levels of government. Whereas the federal government’s general policy is to make data available free of charge or at the actual cost of distribution, many state and local government organizations seek partial or full cost recovery, raising questions about what incentives might encourage state and local governments to make their data more widely available.11 It is not clear whether policy and technical mechanisms can be coordinated so as to encourage the realization of potential benefits from geospatial information while avoiding undesirable social costs. The committee believes that an in-depth analysis is needed of the policy and social implications raised by the collection and increased availability of geospatial information.

REFERENCES

Callaway, R.M., and F.W. Davis. 1993. “Vegetation Dynamics, Fire and the Physical Environment in Central California.” Ecology, 74:1567-1578.

Computer Science and Telecommunications Board (CSTB), National Research Council. 2001. Embedded, Everywhere. Washington, D.C.: National Academy Press.

Crockett, Thomas W. 1998. “Digital Earth: A New Framework for Geo-referenced Data.” Institute for Computer Applications in Science and Engineering Research Quarterly, 7(4), December. Available online at <http://www.icase.edu/RQ/archive/v7n4/DigitalEarth.html>.


Fonseca, Frederico T., Max J. Egenhofer, and Peggy Agouris. 2002. “Using Ontologies for Integrated Geographic Information Systems.” Transactions in GIS, 6(3).


Giglio, L., and J.D. Kendall. 2001. “Application of the Dozier Retrieval to Wildfire Characterization—A Sensitivity Analysis.” Remote Sensing of the Environment, 77(1):34-49.

Gore, Albert, Jr. 1998. “The Digital Earth: Understanding Our Planet in the 21st Century,” given at the California Science Center, Los Angeles, January 31. Available online at <http://www.digitalearth.gov/VP19980131.html>.


MacEachren, Alan M., and Menno-Jan Kraak. 2001. “Research Challenges in Geovisualization.” Journal of the American Congress on Surveying and Mapping, 28(1):3-12.


Pister, Kris. 2002. “Smart Dust: Autonomous Sensing and Communication in a Cubic Millimeter.” Viewed on April 2, 2002. Available online at <http://robotics.eecs.berkeley.edu/~pister/SmartDust/>.

11  

Of course, as controversy over treatment of driver’s licenses and other state records shows, care is needed in setting the terms and conditions for making available data that bear, for example, on privacy.
