
Currently Skimming:

1. Introduction and Context
Pages 10-24

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 10...
... foresaw that the discovery of a maritime trade route from Europe to India would make Portugal a world power and enable it to acquire wealth far out of proportion to its modest size and population. At the time, even the existence of a route around Africa was in doubt, and serious technical challenges stood in the way of achieving Henry's vision.
From page 12...
... Improvements in geospatial database techniques and data mining can enhance our ability to analyze the vast amounts of data being collected and stored.
· Human-computer interaction with geospatial information.
From page 14...
... SCENARIOS

The committee envisions a world in which all geospatially relevant information is available in a timely fashion to those authorized to have access, in ways that are natural to use and, when desirable, coordinated with real-world context. The following scenarios illustrate the exciting opportunities enabled by research to enhance the accessibility and usability of geospatial information.
From page 15...
... , the robots use ad hoc wireless communication to share data with one another and with high-powered computers located outside the rubble. The computers perform planning tasks and assist the mapants with compute-intensive tasks such as image recognition and visualization of the map as it is constructed.
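The report does not specify how the shared map would be assembled from the robots' data. As a hedged illustration only, the sketch below fuses partial occupancy grids reported by several robots into one map on an external computer, keeping for each grid cell the observation with the highest confidence. The grid representation, the (probability, confidence) tuples, and the fusion rule are assumptions made for this sketch, not details from the report.

```python
# Hypothetical sketch: fusing partial occupancy grids reported by several
# exploration robots into one shared map. The representation (a dict of
# cell -> (occupancy probability, confidence)) and the "keep the most
# confident observation" rule are illustrative assumptions.

def fuse_partial_maps(partial_maps):
    """Merge per-robot maps, keeping the most confident estimate per cell.

    partial_maps: list of dicts mapping (x, y) grid cells to
                  (occupancy_probability, confidence) tuples.
    """
    fused = {}
    for robot_map in partial_maps:
        for cell, (p_occupied, confidence) in robot_map.items():
            prev = fused.get(cell)
            if prev is None or confidence > prev[1]:
                fused[cell] = (p_occupied, confidence)
    return fused


# Example: two robots report overlapping views of the rubble.
robot_a = {(0, 0): (0.9, 0.8), (0, 1): (0.1, 0.6)}
robot_b = {(0, 1): (0.2, 0.9), (1, 1): (0.7, 0.5)}
print(fuse_partial_maps([robot_a, robot_b]))
# {(0, 0): (0.9, 0.8), (0, 1): (0.2, 0.9), (1, 1): (0.7, 0.5)}
```

A real system would also need probabilistic fusion (for example, log-odds updates) and registration between the robots' coordinate frames, both of which this sketch ignores.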
From page 16...
... from distributed sources;
· Real-time map construction and refinement; and
· Map-based user interfaces that allow coordinated teams of human experts to quickly and easily direct the mapants and analyze the results.

Controlling Wildfires

A second scenario illustrates how geospatial data from a wide array of sources could be integrated with powerful computational models, enabling us to predict more accurately the onset and behavior of wildfires.3 The size and severity of wildfires depend on how quickly and effectively ...

3 A 2000 National Science Foundation Workshop on Dynamic Data Driven Application Systems explored research opportunities created by dynamic/adaptive applications that can dynamically accept and respond to field-collected data.
From page 17...
... Although dry fuel load (biomass with low water content) is the most direct indicator of potential fire severity, it is difficult to measure over large areas because remote optical instruments respond to the radiation reflected from the leaves rather than from the dry fuel itself.
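To make that limitation concrete: because optical sensors see leaf reflectance rather than dry biomass, fire-risk estimates typically rely on indirect proxies. The sketch below computes one such proxy, a normalized difference of near-infrared and shortwave-infrared reflectance that responds to leaf water content; the band pairing, data layout, and dryness threshold are illustrative assumptions, not values from the report.

```python
# Hedged sketch: estimating relative canopy water content from reflectance.
# Optical sensors measure reflected radiation, so dry fuel load can only be
# inferred indirectly; a low water-content index suggests drier (riskier) fuel.
# Band choices and the dryness threshold below are illustrative assumptions.

def water_content_index(nir_reflectance, swir_reflectance):
    """Normalized difference of near-IR and shortwave-IR reflectance.

    Higher values indicate wetter vegetation; lower values, drier fuel.
    """
    denom = nir_reflectance + swir_reflectance
    if denom == 0:
        return 0.0
    return (nir_reflectance - swir_reflectance) / denom


def flag_dry_cells(grid, threshold=0.1):
    """Return grid cells whose index falls below an assumed dryness threshold."""
    return [cell for cell, (nir, swir) in grid.items()
            if water_content_index(nir, swir) < threshold]


# Example: (near-IR, shortwave-IR) reflectance pairs for three map cells.
cells = {"A1": (0.45, 0.30), "A2": (0.40, 0.38), "A3": (0.50, 0.20)}
print(flag_dry_cells(cells))   # ['A2'] -- the cell with the least leaf water
```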
From page 18...
... Real-time updates flowing through the system make it possible to adjust strategies and routing as conditions change. In this scenario, a number of new challenges arise because predictive models have been coupled with the time-critical analysis of extremely large amounts of data:
· Development of systems that can harvest classified and proprietary data, with appropriate barriers to unlawful access;
· Methods for integrating computational, observed, and historical data in real time;
· Methods for dynamically coupling independent numerical models and infusing external data into them to develop, evaluate, and continuously refine strategies for emergency response;
· Algorithms capable of tracking moving and evolving objects and predicting their future state (see the sketch after this list);
· Methods for automatically identifying and communicating with persons in the affected area via wired and wireless communication mechanisms (household and cellular telephone numbers, pagers, PDAs, satellite TV and radio, cable TV, and the Internet)
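The tracking-and-prediction challenge in the list above can be made concrete with a deliberately simple sketch: given timestamped positions of a moving object (say, a fire-front marker or an evacuation vehicle), estimate its velocity from the two most recent observations and extrapolate its future position. Real systems would run statistical filters over noisy, multi-sensor data; everything here, including the data format, is an illustrative assumption.

```python
# Minimal sketch of "tracking a moving object and predicting its future state":
# constant-velocity extrapolation from the two most recent timestamped fixes.
# A production system would use a statistical filter (e.g., a Kalman filter)
# over noisy observations; this toy version only illustrates the idea.

def predict_position(track, future_time):
    """track: list of (t, x, y) observations sorted by time t."""
    (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt      # estimated velocity
    lead = future_time - t1
    return (x1 + vx * lead, y1 + vy * lead)      # extrapolated position


# Example: a fire-front marker observed at minutes 0 and 10.
observations = [(0, 100.0, 200.0), (10, 104.0, 203.0)]
print(predict_position(observations, future_time=30))   # (112.0, 209.0)
```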
From page 19...
... She sends some of this information to her personal e-mail address to study later. The time line, which stretches off in the distance, can be set for days, years, centuries, or even geological epochs, for those occasions when she wants to learn more about dinosaurs."

5 "Digital Earth" is a broad international initiative using Earth as a metaphor for an information system and network that supports an easy-to-use human user interface for accessing multiple digital, dynamic 3D representations of the Earth and its connected information resources contained in the world's libraries, and scientific and government institutions.
From page 20...
... Realizing this vision will require not just advances in technology but also overcoming significant challenges related to human capabilities:
· Data integration techniques capable of merging data of vastly different types, spatial resolutions, and temporal scales in response to human queries (a minimal sketch follows this list);
· Supporting technologies for extremely large and diverse geospatial databases, including complex data models, scalable searching and navigation techniques, and scalable analysis on the fly;
· Distributed virtual-reality environments that are uncomplicated and responsive enough to suit the general public; and
· Intuitive, multimodal interfaces capable of supporting unconstrained navigation through virtual space and time as well as guided exploration of the concepts.
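As a hedged illustration of one small piece of the data-integration challenge named in the first bullet above, the sketch below resamples two observation series recorded at very different temporal resolutions onto a shared time axis by linear interpolation, so that they can be merged or compared. The variables, timestamps, and use of numpy are illustrative choices, not anything prescribed by the report.

```python
import numpy as np

# Hedged sketch: aligning two data streams with different temporal scales.
# Hourly sensor readings and daily summaries are resampled onto one common
# time axis (hours) via linear interpolation so they can be merged or compared.
# The data values below are made up for illustration.

hourly_times = np.arange(0, 49, 1.0)                 # hours 0..48
hourly_temp = 20 + 5 * np.sin(hourly_times / 24 * 2 * np.pi)

daily_times = np.array([0.0, 24.0, 48.0])            # one value per day
daily_soil_moisture = np.array([0.30, 0.26, 0.21])

# Interpolate the coarse daily series onto the fine hourly axis.
soil_moisture_hourly = np.interp(hourly_times, daily_times, daily_soil_moisture)

# Both variables now share a time axis and can be merged row by row.
merged = np.column_stack([hourly_times, hourly_temp, soil_moisture_hourly])
print(merged[:3])
```

Real geospatial integration would also have to reconcile coordinate systems, spatial resolutions, and data semantics, which this one-dimensional example deliberately leaves out.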
From page 21...
... Data that describe continuously moving and evolving objects, such as buoys that float on ocean currents and record ocean temperatures, pose significant obstacles to current database and data mining techniques.

7 The BaBar experiment, a collaboration of 600 physicists from nine nations that observes subatomic particle collisions, is another example of the increasing generation of scientific data.
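The drifting buoys mentioned above illustrate why such data strain conventional databases: a buoy's position is not a fixed attribute but a function of time, so even a simple question ("where was this buoy at 06:30, and what was it measuring?") requires interpolating within a stored trajectory rather than reading one static row. The sketch below shows that kind of time-parameterized lookup; the record layout is an assumption made for illustration, not a design from the report.

```python
from bisect import bisect_left

# Hedged sketch: a time-parameterized query over a moving object's trajectory.
# Each buoy is stored as a time-ordered list of (t, lat, lon, temperature)
# samples; answering "where was it at time t?" means interpolating between
# the bracketing samples instead of looking up one static record.

def state_at(trajectory, t):
    """Linearly interpolate (lat, lon, temperature) at time t."""
    times = [sample[0] for sample in trajectory]
    i = bisect_left(times, t)
    if i == 0:
        return trajectory[0][1:]
    if i == len(trajectory):
        return trajectory[-1][1:]
    (t0, *a), (t1, *b) = trajectory[i - 1], trajectory[i]
    w = (t - t0) / (t1 - t0)
    return tuple(x0 + w * (x1 - x0) for x0, x1 in zip(a, b))


# Example: a buoy sampled at hours 0, 6, and 12.
buoy_7 = [(0, 10.0, -40.0, 18.2), (6, 10.2, -40.3, 18.0), (12, 10.5, -40.9, 17.6)]
print(state_at(buoy_7, 6.5))   # position and temperature interpolated at t = 6.5
```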
From page 22...
... The convergence of advances in location-aware computing, databases and data mining technologies, and human interaction technologies, combined with a sharp increase in the quality and quantity of geospatial information, promises to transform our world. This report identifies the critical missing pieces needed to achieve a 21st-century version of Prince Henry the Navigator's vision.
From page 23...
... Chapter 2 explores the state of the art, research directions, and possible future application scenarios in location-aware computing. Chapter 3 outlines research challenges in database technologies and data mining techniques that are needed to manage and analyze immense quantities of geospatial information.
From page 24...
... Whereas the federal government's general policy is to make data available free of charge or at the actual cost of distribution, many state and local government organizations seek partial or full cost recovery, raising questions about what incentives might encourage state and local governments to make their data more widely available. It is not clear whether policy and technical mechanisms can be coordinated so as to encourage the realization of potential benefits from geospatial information while avoiding undesirable social costs.

