Priorities for GEOINT Research at the National Geospatial-Intelligence Agency (2006)

Chapter: 4 Hard Problems and Promising Approaches

Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.

4
Hard Problems and Promising Approaches

The National Geospatial-Intelligence Agency (NGA) has historically carried significant responsibilities in mapping, charting, geodesy, and imagery analysis to gather geospatial intelligence, such as the locations of obstacles, navigable areas, friends, foes, and noncombatants. However, the creation of geospatial intelligence not only requires optimal performance in these four areas, but also demands effective integration of the four functions that constitute persistent TPED (i.e., tasking, processing, exploitation, and dissemination of data) over vast geographic areas and at the time intervals of interest. Moreover, as NGA transitions from a “data-driven” model to a “data- and process-driven” model in order to provide timely, accurate, and precise geospatial intelligence, the need to integrate other sources of intelligence with geospatial intelligence becomes even more critical.

This chapter lists a set of as-yet unsolved or “hard” problems faced by NGA in the post-9/11 world. They are organized into six classes that align with the NGA top 10 challenges: achieving TPED; compressing the time line of geospatial intelligence generation; exploitation of all forms of intelligence (which includes challenges 2-6); sharing geospatial information with coalition and allied forces; supporting homeland security; and promoting horizontal integration. Note that the third category has been broadened from “all forms of imagery” to “all forms of intelligence” to reflect the evolution of geospatial intelligence (GEOINT) beyond an imagery focus. Also identified are promising methods and tools for addressing the hard problems. Virtually none of these tools are part of NGA’s current
systems architecture or set of operating procedures, and so should be termed “disruptive.” Disruptive methods necessitate retraining and redesign at the least. However, it is likely that many of the tools will be introduced incrementally; therefore the transformation itself may feel evolutionary to those involved. Many of the problems involve extensions to spatial database management systems (S-DBMS), which have long been seen as different from the standard DBMS used in information technology and commerce. Such systems are essential to manage vast data holdings, yet only recently have they been adapted for geospatial data and the special needs of GEOINT.

Based on the committee’s knowledge of the hard problems in geographic information science (GIScience) and information from NGA (as described in earlier chapters) on the current and future challenges in developing GEOINT, the subset of hard geospatial research problems most relevant to NGA was selected as the list of “hard problems” identified here. Aspects that can be addressed in the short term versus the long term are discussed after each hard problem. Then, based on knowledge of current research and literature, and after considerable debate and discussion, the committee selected methods and techniques that seem most promising for addressing the hard problems. These are not ranked in any way, but were seen by the committee as potential starting points for future research. As a final step, a prioritization of the hard problems is proposed in Chapter 6.

ACHIEVE PERSISTENT TPED

Hard Problems

In the post-9/11 world, persistent tasking, processing, exploitation, and dissemination of geospatial intelligence over geographic space and time is crucial. However, current sensor networks (i.e., remote sensing using satellites and aircraft) and database management systems are inadequate to achieve persistent TPED for several reasons. First, current sensor networks were designed for tracking fixed targets (e.g., buildings, military equipment). Second, they are sparse in space and time, and it takes a long time (e.g., hours) to move sensors to focus on the desired geographic area of interest for the relevant time interval. Third, even if an appropriate network were employed, current databases do not scale up to the significantly higher data rates and volumes of data generated by deployed sensor arrays. Basic and applied research on next-generation sensors, sensor networks, and spatiotemporal databases is crucial to achieving persistent TPED. Of particular importance has been the rapid development and deployment of unpiloted aircraft with multiple sensor systems that
can remain aloft for long periods of time, such as Predator and Global Hawk. In the future, swarms of such aircraft linked to more static sensor webs will provide enormous amounts of space-time data. Ground sensors include cameras, microelectrical mechanical systems or motes, data retrieved via the Internet such as weather information, and other devices (Warneke and Pister, 2002). Such systems of linked sensors will create a sensor web or network with enhanced capabilities, just as connecting computers together into networks has transformed computation. Yet sensor network theory is in its infancy, and even some of the first-generation technology lacks operational robustness. Much existing research and development to date has been on applications outside the geospatial context (Bulusu and Jha, 2005). Lastly, the ever-growing suite of positioning technologies continues to improve in accuracy and to overcome some of the initial problems of the global positioning system (GPS). Similarly, as location-based services move into broad consumer use, new products and services have become available for GEOINT.

Research is needed to improve the design and effectiveness of sensor networks. Many issues are highly spatial, for example the optimization of sensor suites, quantities and locations, the mix of stream versus temporally sampled data, the mix of static versus mobile sensors, and the movement of sensors to adaptively sample activity. In addition, new sensor types can be used to supplement and build on existing systems for imaging, mapping, and data collection. For example, a software program monitoring Internet traffic is a sensor, as also are civil systems such as air traffic control and camera-based traffic monitoring systems. Indeed, any mobile operative with a positioning device could be considered an input device to a sensor net. Furthermore, sensors can now be adaptable in terms of timing, fault tolerance, and power consumption in addition to geographical placement and movement. Given the importance of nontraditional sensor networks, their linkage to geographical space, and the need to integrate the information that they supply both within and across systems, the committee recommends the following.

RECOMMENDATION 1: Sensor network research should focus on the impact of sensor networks on the entire knowledge assimilation process (acquisition, identification, integration, analysis, dissemination, and preservation) in order to improve the use and effectiveness of new and nontraditional sensor networks. Particular emphasis should be placed on the relation between sensor networks and space, sensor networks and time, accuracy and uncertainty, and sensor networks and data integration.

From NGA’s perspective, this is more important than pursuing new variants in existing sensors, since industry now seems capable of delivering innovations for a growing nonmilitary sensor market in the coming years.

One of the shorter-term issues related to sensor networks relates to scheduling of sensors. Traditionally, the NGA has relied on space-based and airborne sensors. Even though the resolution of measurements is improving over time, space-based sensors tend to have coarse resolution and require time for repositioning. Space-based sensor systems are costly and must be designed and deployed years in advance of use. In the short term, the NGA will deploy novel sensors to detect subsurface and hidden human activity and military equipment. Sensor networks will include ground-based fixed as well as mobile sensors to provide even finer resolution and better persistence. However, it is expensive to provide persistent coverage of large geographic areas over long periods of time. Thus, benefits can be gained in the short term by addressing the geospatial scheduling problem to minimize the time to reach arbitrary locations and to maximize coverage. Scheduling will involve sequencing a suite of sensors, both ground and air, and not simply dealing with the details of aircraft access and orbital position. Scheduling problems of this type, however, can become computationally complex and involve multiple, conflicting criteria. Consequently, research on efficient multicriteria optimization methods that can be used by decision makers to configure sensor arrays is needed.
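
To make the trade-off concrete, the coverage-versus-cost aspect of sensor scheduling can be sketched with a greedy heuristic. This is illustrative only: the candidate platforms, coverage cells, and costs are hypothetical, and an operational scheduler would weigh many more criteria (revisit time, risk, bandwidth).

```python
def greedy_sensor_placement(candidates, budget):
    """Greedily choose sensor sites by marginal coverage per unit cost.

    candidates: {site_name: (set_of_covered_cells, cost)} -- illustrative only.
    Returns the chosen sites and the union of cells they cover.
    """
    covered, chosen, spent = set(), [], 0.0
    remaining = dict(candidates)
    while remaining:
        best, best_score = None, 0.0
        for site, (cells, cost) in remaining.items():
            if spent + cost > budget or cost <= 0:
                continue
            score = len(cells - covered) / cost  # marginal coverage per cost
            if score > best_score:
                best, best_score = site, score
        if best is None:  # nothing affordable adds coverage
            break
        cells, cost = remaining.pop(best)
        covered |= cells
        spent += cost
        chosen.append(best)
    return chosen, covered
```

A fuller formulation would treat this as a multiobjective optimization (coverage, latency, cost, risk), solved with Pareto-front methods rather than a single greedy pass.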

The new streams of multisensor data will strain existing database systems and require new approaches to dealing with vast quantities of time- and space-stamped information. There are challenges across the board for the development of spatiotemporal database management systems (ST-DBMS) and analysis routines based on the time-space patterns they reveal. Research will have to build a theoretical understanding of the tracking and recognition of movement, both of objects and of more complex entities such as smoke, clouds, weather systems, and biothreats. While GIScience has begun work in the area of methods for the analytical treatment of time-space trajectories or lifelines (e.g., Laub et al., 2005; Peuquet, 2002), and the importance of the concept is represented well in the University Consortium of Geographic Information Science (UCGIS) research agenda (McMaster and Usery, 2004), much work on data structures, analytical methods, and theory remains. Research to date has been centered on transportation systems and human activity space, including visualization and description of process (McCray et al., 2005; McIntosh and Yuan, 2005; Miller, 2005a). Much is based on Torsten Hagerstrand’s concept of a time line or prism (Kraak, 2003; Miller, 2005b). As yet, however, little research pertinent to GEOINT has been done. Consequently, the committee recommends the following.

RECOMMENDATION 2: Research should be encouraged on spatiotemporally enabled geospatial data mining and knowledge discovery, on the visualization of time-space, and on the detection and description of time-space patterns, such as trajectories, in order to provide the new data models, theory, and analytical methods required for persistent TPED. Specific problems are real-time inputs, sparse and incomplete data, uncertain conditions, content-based filtering, moving targets, multiple targets, and changing profiles in time and space.
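
As one concrete instance of the time-space patterns named above, a “stay point” detector flags places where a tracked object lingers. The sketch below is minimal, assuming planar coordinates and illustrative distance and time thresholds:

```python
import math

def stay_points(track, dist_thresh, time_thresh):
    """Detect locations where a moving target lingers.

    track: list of (t, x, y) samples in time order; thresholds are
    illustrative assumptions. Returns the centroid of each stay.
    """
    points = []
    i, n = 0, len(track)
    while i < n:
        j = i + 1
        # Extend the window while successive fixes stay near point i.
        while j < n and math.dist(track[i][1:], track[j][1:]) <= dist_thresh:
            j += 1
        if track[j - 1][0] - track[i][0] >= time_thresh:
            xs = [p[1] for p in track[i:j]]
            ys = [p[2] for p in track[i:j]]
            points.append((sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j
        else:
            i += 1
    return points
```

Real systems must additionally cope with the sparse, noisy, multi-target streams the recommendation describes; this sketch shows only the core pattern-extraction idea.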

In addition, there is a strong likelihood that future sensor networks will outstrip the capacity and capabilities of systems for data management, reduction, and visualization. Many statistical packages and GISs, for example, place a limit on the maximum number of features or records they are capable of processing (e.g., samples, nodes, polygons). While ArcSDE and Oracle 10g have support for raster databases, in general current S-DBMS provide only weak support for many subtypes of geospatial intelligence data models, including raster (e.g., imagery), indoor spaces, subsurface objects (e.g., caves, bunkers), visibility relationships, and direction predicates (e.g., left, north). Research is needed to develop next-generation S-DBMS if current commercial or research prototype S-DBMS fail to meet the performance needs of persistent TPED. The committee recommends that such research be conducted.

RECOMMENDATION 3: Research should be targeted at the ability of current database architectures and data models to scale up to meet the demands of agile geospatial sensor networks. The next generation of spatial database management systems must be able to flexibly and efficiently manage billions of georeferenced data records. These systems should support time-space tracking, automatically create and save metadata, and make active use of data on source accuracy and uncertainty.

Research on the problems of representing, storing, and managing unprecedented amounts of spatiotemporal data streams from sensor networks is generally a longer-term issue. Specific challenges (Koubarakis et al., 2003) are semantic data models, query languages, query processing techniques, and indexing methods for representing spatiotemporal datasets from sensor networks. In particular, research should explore high-performance computing techniques (e.g., data structures, algorithms, parallel computing, grids) to deal with the volume of data coming from
sensor networks for achieving persistent TPED. Also, the steady migration of imagery from the multispectral to hyperspectral and ultraspectral realms will demand the generation of newer and more efficient algorithms and models to enhance imagery exploitation and feature extraction processes. Moreover, the increasing generation of three-dimensional datasets (including light detection and ranging [LIDAR]) from active and passive remote sensors will have to be used in ways that are quite different from the traditional data models that were generated to deal with geospatial data in two-dimensional space. In addition to the three-dimensional potential of LIDAR, interferometric synthetic aperture radar (IFSAR) will generate substantial amounts of detailed surface data, including details of surface cover such as buildings, structures, and vegetation canopy. However, analysis, representation, and visualization of geospatial intelligence will have to be accomplished in both two-dimensional and three-dimensional environments, on both mobile and virtual clients, and in near real time or real time.

Also in the long term, research will have to be aimed at integrating vastly different data from traditional and nontraditional sensors in both time and space. Given the simultaneous sensing of events by sensors in the air and space, on the ground, and through non-NGA systems (e.g., census data, employment records, criminal justice system reports), the likelihood of false duplicates is probably greater than the likelihood of missing data. Future systems will have to resolve the ambiguity that results from multiple sensors sensing multiple moving targets, probably in real time. Similarly, each sensor will have its own relative and absolute accuracy and level of statistical certainty associated with features and their locations and descriptions. It is essential that the integration solutions devise means to store and use the known measures of these properties so that they can be applied to derivative products and decisions. For all of these reasons, sensor integration is considered a priority in the research agenda.

Other long-term issues include the development of techniques for combining data of different spatial and temporal resolution, with different levels of accuracy and uncertainty, including both conflation and generalization. Integration applies not only to data items, but also to data catalogs. Since the type of features apparent in an image or dataset may vary with resolution, the development of thesauri that match feature semantics (including behavior) rather than feature types is a current research need. This could lead to deployment of fully operational multiscale or scaleless databases. A third area of relevance is the fusion of two-dimensional and three-dimensional datasets, resolving uncertainties such as those caused by building shadows and varying information quality (Edwards and Jeansoulin, 2004).
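
A minimal example of combining data across spatial resolutions is block-averaging a fine grid down to a coarser one. Real conflation must also reconcile accuracy, uncertainty, and feature semantics, which this sketch deliberately ignores:

```python
def downsample(grid, factor):
    """Aggregate a 2-D grid of values by block averaging.

    A simple generalization step: each output cell is the mean of a
    factor-by-factor block of input cells (partial blocks at the edges
    average whatever cells exist).
    """
    rows, cols = len(grid), len(grid[0])
    out = []
    for r in range(0, rows, factor):
        row = []
        for c in range(0, cols, factor):
            block = [grid[i][j]
                     for i in range(r, min(r + factor, rows))
                     for j in range(c, min(c + factor, cols))]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

Matching resolutions is only the mechanical half of the problem; the thesauri and semantic matching discussed above address the harder half.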

Promising Methods and Techniques

An Agile Geospatial Sensor Network

The emerging research area of location-based services (LBS) is providing algorithms for determining optimal positioning of mobile servers to minimize the time to reach an arbitrary geographic area of interest. Such algorithms may be used to evaluate the quality of current geospatial scheduling methods for mobile sensor platforms (Bespamyatnikh et al., 2000). If current scheduling methods are not optimal, LBS research can be used to reduce the time to reach unanticipated geographic areas specified by customers of NGA. In addition, newer sensor platforms, such as motes and remote-controlled mobile platforms deployable via air drops, have the promise to reduce the time of positioning sensors over geographic areas of interest (Warneke and Pister, 2002). Moreover, autonomous and distributed sensor networks capable of locally optimizing sensor tasking and collection rather than centralized accumulation and processing of geospatial-temporal data will provide greater efficiency in information generation (Lesser et al., 2003).
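
The flavor of such positioning algorithms can be conveyed by a crude 1-center heuristic: grid-search the point that minimizes the maximum distance to a set of areas of interest. The search bounds and step size are arbitrary assumptions, and exact algorithms (e.g., smallest enclosing circle) exist for the planar case:

```python
import math

def minimax_location(demand_points, grid_step=0.5, bound=10.0):
    """Grid-search the point minimizing the worst-case distance to any
    demand point -- a crude stand-in for optimal sensor prepositioning.

    demand_points: list of (x, y); grid_step and bound are assumptions.
    """
    best, best_val = None, float("inf")
    steps = int(2 * bound / grid_step) + 1
    for i in range(steps):
        for j in range(steps):
            x = -bound + i * grid_step
            y = -bound + j * grid_step
            worst = max(math.hypot(x - px, y - py)
                        for px, py in demand_points)
            if worst < best_val:
                best, best_val = (x, y), worst
    return best, best_val
```

On a road network rather than the plane, the distance function would be shortest-path travel time, which is where LBS research contributes most directly.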

Spatiotemporal Database Management and Analysis Systems

Many current GIS software vendors have moved their systems architecture to a georelational database model incorporating object-oriented properties. The consequent tools have led to experiments with data model application templates (e.g., for intelligence uses) that encourage reuse and interoperability, support more complex “model building” operations, and yield database designs better suited to use over the Internet in a variety of client-server architectures. Exploration of the consequences of this transition is not yet complete, especially for processing time-related transactions (Worboys and Duckham, 2005). New theory may be necessary. Research is needed on semantic data models, query languages, query processing techniques, and indexing methods for representing spatiotemporal datasets from heterogeneous sensor networks. Extensions beyond the Quad, R, and S trees will be necessary, and new search and query tools based on spatiotemporal zones and patterns will be required. Some convergence of time-space geography and time-space data management will also be necessary.
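
For intuition about why index extensions matter, even the simplest spatial access method (a uniform grid hashing coordinates into buckets) shows how an index prunes a box query. The cell size here is an arbitrary assumption; production systems would instead extend R-tree-style structures, and the research need is to add the temporal dimension:

```python
from collections import defaultdict

class GridIndex:
    """Minimal uniform-grid spatial index for point objects.

    Only cells overlapping the query box are examined, instead of
    scanning every stored object.
    """
    def __init__(self, cell=1.0):
        self.cell = cell
        self.buckets = defaultdict(list)

    def _key(self, x, y):
        return (int(x // self.cell), int(y // self.cell))

    def insert(self, oid, x, y):
        self.buckets[self._key(x, y)].append((oid, x, y))

    def query(self, xmin, ymin, xmax, ymax):
        hits = []
        kx0, ky0 = self._key(xmin, ymin)
        kx1, ky1 = self._key(xmax, ymax)
        for kx in range(kx0, kx1 + 1):
            for ky in range(ky0, ky1 + 1):
                for oid, x, y in self.buckets.get((kx, ky), []):
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        hits.append(oid)
        return hits
```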

Extensions to analytical methods to incorporate temporal as well as spatial description and inference should be a priority. Multicriteria analyses and object tracking are in the early stages of development (Bennett et al., 2004). Multicriteria analyses become important, for example, if a decision maker is forced to trade off risk-exposure against time-expedience. What
tools can be used to make objective decisions and to explore the consequences? Analytical methods that incorporate input from multiple participants—for example, in the specification of parameters—offer promise since complex problems require a range of expertise that is rarely held by a single individual (Armstrong, 1994). With further advances in change detection, monitoring individual time-space trajectories is becoming robust and is leading to some potential analytical approaches (Laub et al., 2005; McCray et al., 2005). Monitoring, minimizing, and communicating uncertainty in analytic outcomes is another area of high-priority investigation that is developing methods of value (Foody and Atkinson, 2003). Technologies are beginning to be developed for resolving locations to geographic place names that advance beyond current address matching and geocoding capabilities into telephone communications, news reports, IP addresses, and e-mail (Hill and Zheng, 1999). These place name, or toponymic, services need to offer multilinguistic transliteration and temporal place name shifts. Cultural transliteration remains a difficult problem since the names given to localities can vary among local communities according to cultural activity or context. Research is needed to crosswalk the toponymic view with map and image views.
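
A toy gazetteer illustrates the core of toponym resolution: normalizing case and diacritics so that variant spellings of a place name resolve to one canonical entry. The entries and variants are invented for illustration; real services must also handle transliteration schemes and temporal name changes, as noted above:

```python
import unicodedata

def normalize(name):
    """Fold case and strip diacritics so variant spellings match."""
    folded = unicodedata.normalize("NFKD", name.casefold())
    return "".join(ch for ch in folded if not unicodedata.combining(ch))

class Gazetteer:
    """Toy place-name resolver mapping normalized names to entries."""
    def __init__(self):
        self.index = {}

    def add(self, canonical, coords, variants=()):
        for name in (canonical, *variants):
            self.index[normalize(name)] = (canonical, coords)

    def resolve(self, name):
        """Return (canonical_name, coords) or None if unknown."""
        return self.index.get(normalize(name))
```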

Performance Benchmarking

Performance benchmarks (Transaction Processing Performance Council, 2005) are an objective way to evaluate the ability of alternative systems (e.g., sensor networks, S-DBMS) to support the goal of achieving persistent TPED. Consider a benchmark to evaluate an S-DBMS (Shekhar and Chawla, 2003) to manage the data rates and data volumes from persistent TPED sensor networks. The benchmark may contain specific geospatial intelligence data streams and datasets, geospatial analysis tasks, performance measures, and target values for the performance measures. Spatial database vendors and research groups could be invited to evaluate commercial (e.g., object-relational database management systems that support open geographic information systems spatial data types and operations) and research prototype spatial database management systems. If current S-DBMS meet the performance needs of NGA, adoption of current S-DBMS for various kinds of geospatial intelligence data may be appropriate. Specific tasks could include development of a semantic schema and object-relational table structures, plus conversion of existing geospatial datasets and applications to new data representations. However, additional research is needed to develop next generation S-DBMS if current commercial or research prototype S-DBMS fail to meet the performance needs of persistent TPED.
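
The shape of such a benchmark can be sketched as a small harness that times bulk loading and box queries against any index exposing insert/query operations. The naive linear-scan baseline, data sizes, and workload below are all assumptions, not a proposed standard:

```python
import random
import time

class ListIndex:
    """Naive linear-scan baseline -- the behavior a real S-DBMS must beat."""
    def __init__(self):
        self.items = []

    def insert(self, oid, x, y):
        self.items.append((oid, x, y))

    def query(self, xmin, ymin, xmax, ymax):
        return [o for o, x, y in self.items
                if xmin <= x <= xmax and ymin <= y <= ymax]

def benchmark(index_factory, n_points=10000, n_queries=100, seed=42):
    """Time bulk insert and 10x10 box queries over a 100x100 extent."""
    rng = random.Random(seed)
    idx = index_factory()
    pts = [(i, rng.uniform(0, 100), rng.uniform(0, 100))
           for i in range(n_points)]
    t0 = time.perf_counter()
    for oid, x, y in pts:
        idx.insert(oid, x, y)
    t_insert = time.perf_counter() - t0
    t0 = time.perf_counter()
    for _ in range(n_queries):
        x, y = rng.uniform(0, 90), rng.uniform(0, 90)
        idx.query(x, y, x + 10, y + 10)
    t_query = time.perf_counter() - t0
    return {"insert_s": t_insert, "query_s": t_query}
```

A TPED benchmark in the sense described above would replace the synthetic points with representative GEOINT data streams and add target values for each performance measure.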

In summary, the hard problems in achieving TPED are the effective use of sensor networks, spatiotemporal data mining and discovery, and spatiotemporal database management systems. Promising solutions are suggested in the areas of developing agile sensor networks; spatiotemporal database management models and theory; detecting patterns within spatiotemporal data; and exploiting performance benchmarking.

COMPRESS THE TIME LINE OF GEOSPATIAL INTELLIGENCE GENERATION

Hard Problems

The timeliness of geospatial intelligence is becoming more crucial due to, among other things, the increasing numbers of mobile targets. Thus, the field of geospatial intelligence is making a transition from deliberate targeting to time-sensitive targeting. It is becoming increasingly important to move toward real-time data generation, processing, and dissemination to reduce latency in intelligence generation and delivery processes. However, the traditional geospatial intelligence generation process relies heavily on manual interpretation of data gathered from geospatial sensors and sources. This poses an immediate challenge given the increasing volume of data from geospatial sensors.

Characterization and reengineering of the geospatial intelligence cycle are crucial to achieve the goal of compressing the time line of geospatial intelligence generation. Characterization of the processes of generating geospatial intelligence would be a good starting point for NGA, including the necessity for and levels of human intervention in these processes. For illustration, consider the following process: Raw Data → Annotated Subset → Summary Patterns → Knowledge and Understanding. In other words, a large amount of raw sensor data is gathered continuously over geographic areas of interest. Human analysts review the stream of raw sensor data to identify and annotate interesting subsets of sensor data. The collection of interesting data items is analyzed to produce summaries and to identify interesting, nontrivial, and useful patterns. Human analysts correlate these patterns with other information to create knowledge and understanding about their meaning and underlying causes. Once the process of geospatial intelligence generation is characterized, the bottleneck steps can be identified, and ways found to reduce the time to complete those steps, possibly via automation or via the provision of tools to speed up the manual steps. Creating a system that provides the most efficient flow for the knowledge required would be the target of this research.
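
The Raw Data → Annotated Subset → Summary Patterns → Knowledge and Understanding chain can be instrumented directly: time each stage and surface the bottleneck. The stage functions below are trivial placeholders for what would in practice be image processing, annotation, and analysis steps:

```python
import time

def run_pipeline(raw, stages):
    """Run data through named stages, recording per-stage latency.

    stages: list of (name, function); the functions are placeholders.
    Returns (final_output, per_stage_timings, name_of_slowest_stage).
    """
    timings, data = {}, raw
    for name, fn in stages:
        t0 = time.perf_counter()
        data = fn(data)
        timings[name] = time.perf_counter() - t0
    bottleneck = max(timings, key=timings.get)
    return data, timings, bottleneck
```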

An important area for research is determining the scope of what is
achievable in time line reduction using automation versus human cognition (Egenhofer and Golledge, 1998; Nyerges et al., 1995). Are there limits to autonomy? At what parts of the knowledge cycle can complete automation produce best results (e.g., image processing, georectification, and mosaicing versus source discovery, temporal conflation, feature detection, and extraction)? What roles do humans play best in the knowledge discovery loop? NGA can benefit directly from research that delineates the limits of what tasks can be semi- or fully autonomous, and which will remain best served by trained interpreters. Automatic feature extraction algorithms continue to advance but remain somewhat unreliable. Although they can combine spectral and textural information, and recognize primitive shapes and their combinations, the automated segmentation and labeling of entire images remains an elusive goal. This leads to the following.

RECOMMENDATION 4: Research should be directed toward the determination of what processes in the intelligence cycle (acquisition, identification, integration, analysis, dissemination, preservation) are most suitable for automated processing, which favor human cognition, and which need a combination of human-machine assistance in order to compress the GEOINT time line. This is equally important for current and future systems.

Benefits could be gained in the near future by characterizing the processes of generating geospatial intelligence, possibly by observing codified as well as tacit organizational information flows and by surveying operational analysts. This would include examining the information flow dependencies between tasks and categorizing them into necessary and accidental dependencies. Once the steps of common processes are characterized with dependencies, NGA can gather data on the typical time needed to complete the overall cycle as well as individual subprocesses. Other potential short-term areas of focus include studying ways to automate the bottleneck steps in the processes of geospatial intelligence generation and use, and identifying ways to eliminate unnecessary waiting and dependencies by exploring alternatives that accomplish the same results. Ways to speed up the remaining manual tasks that cannot be automated, because of the need for higher accuracies or for other reasons, could be studied in the longer term, becoming the target of research designed to yield information about human behavior and cognition and of human-computer interaction studies. While this research evolves, however, it would be useful to explore tools to aid analysts in completing the remaining manual steps by understanding the
cognitive processes that human analysts use. Such information would also be of value in training and evaluating analysts.
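
Once tasks and their dependencies are characterized, the minimum achievable cycle time falls out of a longest-path (critical-path) computation over the dependency graph. The task names and durations below are invented for illustration:

```python
def critical_path(tasks):
    """Earliest possible completion time of a task-dependency DAG.

    tasks: {name: (duration, [prerequisite names])} -- illustrative values.
    The result is the length of the longest dependency chain, i.e., the
    floor on cycle time no matter how much work runs in parallel.
    """
    finish = {}

    def earliest_finish(name):
        if name not in finish:
            duration, deps = tasks[name]
            finish[name] = duration + max(
                (earliest_finish(d) for d in deps), default=0.0)
        return finish[name]

    return max(earliest_finish(t) for t in tasks)
```

Tasks off the critical path are the "accidental" dependencies worth eliminating; only shortening the critical path itself compresses the time line.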

Effective interpretation, representation, and visualization of spatial information across all types of displays (virtual, web-based, mobile, three-dimensional, and two-dimensional) call for innovative research. Recent trends indicate a strong shift toward digital maps that can be delivered quickly to a variety of end-user thick (desktop and larger) and thin (handheld and mobile) clients. These trends foreshadow a new paradigm of spatial data visualization. Simply moving from static hard-copy maps to interactive digital media will not by itself solve the problem of static information: the goal is “dynamic” maps rather than merely “digital” maps. Visualization of dynamic spatiotemporal information within a traditional cartographic framework will be an exciting area for future research, addressing new ways of depicting space-time changes in geospatial features or objects through animated symbology and cartographic design. Moreover, end users of geospatial intelligence are expected to use a variety of thin or thick clients that in turn will have variable connectivity, dictating the amount of information that can be efficiently delivered and visualized. Thus, middleware that performs optimized filtering of geospatial intelligence for content delivery, based on the end user’s connectivity and visualization environment, will be an important research area.
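
Such middleware might, at its simplest, rank content by priority and admit layers until the client's estimated transfer budget is exhausted. The layer names, sizes, and bandwidth model below are all assumptions:

```python
def filter_for_client(layers, bandwidth_kbps, time_budget_s):
    """Choose map layers for a thin client under a transfer-time budget.

    layers: list of (name, priority, size_kb) -- illustrative values.
    Higher-priority layers are admitted first; a layer is skipped when
    its estimated transfer time would overrun the budget.
    """
    chosen, used = [], 0.0
    for name, _priority, size_kb in sorted(layers, key=lambda l: -l[1]):
        cost = size_kb * 8 / bandwidth_kbps  # seconds at the given bandwidth
        if used + cost <= time_budget_s:
            chosen.append(name)
            used += cost
    return chosen
```

A production filter would also adapt symbology and generalization level to the display, not just drop whole layers.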

High-priority research includes development of intelligent algorithms that become more proficient over time at browsing and sifting through image and data archives. Autonomous workflow management procedures would streamline retrieval of specific types of information by anticipating what an analyst might search for next, given what has just been requested. By learning from the results of previous analysis tasks, it would also be possible to suggest “best-practice” approaches to new tasks. Agent-based information retrieval can seek out additional sources that may be distributed in friendly or foreign archives. Accomplishing these tasks requires advances in intelligent image compression, multiple levels of security masking, and routines for efficient, on-the-fly semantic indexing.

Each of these stages must be advanced in the context of significant computational resources that will be locally unavailable but accessible through a network. Middleware that supports distributed data sharing and computation is an important area of future work (e.g., Armstrong et al., 2005). Other common themes among these research challenges include the development of procedures for handling, storing, and disseminating intelligence that is context sensitive. Another theme is development of self-describing resources that can be linked readily to other, possibly disparate forms of data with similar content and that include information on uncertainty and provenance. The equivalent of landscape intervisibility analysis is established between databases, whereby semantic "lines of sight" are created and made explicit to analysts in order to exploit the very large number of relationships that exist among archived data objects. Indexing and catalog routines can reorganize information flexibly, reflecting various aspects of the underlying semantics, to generate catalogs whose surface organization varies with analyst task but whose deep structure remains stable. Analysts can track provenance and uncertainty through the life cycle of intelligence preparation. Analysis of space and time can become more seamlessly integrated. In short, the transition from a data-centric to a knowledge-centric working environment can be built up to facilitate preparation and dissemination of GEOINT2.

The term salience is used in cartography to refer to the parts of a map display that are distinct, prominent, or obvious compared to the remaining parts. Salience is an important area in GEOINT map display research, specifically in how critical information can be communicated to an interpreter by visual prioritization. This is a known problem on small displays and mobile devices (Clarke, 2004; Peng and Tsou, 2003) and may also be true for web-based service delivery (Kraak and Brown, 2001). Maps designed for one type of display rarely suit another, as experience with the World Wide Web has shown. What devices best suit particular tasks? Where should devices fit on the thin-thick client scale? How can context awareness, multimodality, nontraditional interfaces, and augmented and virtual displays be used for GEOINT tasks? What are the special demands of uncharacterized spaces, indoors, in urban areas, beneath canopy, or underground? How do displays have to be changed from outdoors to indoors, from day to night, and from dark to light surroundings? What spatial differences exist in critical communications such as wireless Internet availability or cell-phone coverage? How does salience vary with users who may be stressed, distracted, or incapacitated? Geographical visualization is also an important part of the UCGIS research agenda (Buckley et al., 2004). Given the importance of these topics, the committee makes the following recommendation.

RECOMMENDATION 5: Given the importance to NGA of visualization of GEOINT data, research should be supported that investigates new methods of data representation that facilitate flexible and user-centered salient visualization. This applies both to new methods (e.g., cartographic data exploration) and to new technologies (e.g., mobile devices).

Visualization problems solvable now include map design and context sensitivity for maps delivered via the Internet on standard display devices. Longer-term problems include those focused on innovative visualization tools such as immersive virtual reality, augmented reality, and mobile and wearable devices. Lastly, longer-term solutions must integrate human cognition research and context sensitivity to create salient, adaptive displays that sort information and display it according to analysts' needs and the context in which they are working, whether in the laboratory, in the field, in the air, or on board ship.

Given the amount of data and the complexity of the analytical tasks faced by NGA, including visualization, it would be advantageous for NGA to position itself more centrally in the field of high-performance computing and so-called grid computing. As yet, the academic GIScience community has been slow to steer the emerging body of research focused on what is now often called the “cyberinfrastructure” or grid computing toward the needs of geospatial data processing in general and GEOINT in particular. In 2005, the National Science Foundation (NSF) took a bold step in promoting these approaches by establishing an Office of Cyberinfrastructure. Nevertheless, thus far NSF has not made GEOINT a component of the effort. A key step for NGA would be the creation of a new class of middleware that is designed specifically to address the needs of GEOINT2. Consequently, the committee recommends the following.

RECOMMENDATION 6: NGA should ensure that the special needs of geospatial data are met by high-performance grid computing so that it can be utilized to address the large amount of data and the complexity of analytical tasks (such as visualization) involved in producing GEOINT.

With the pursuit of solutions to these problems comes a demand for higher computing performance. While other agencies are pursuing and supporting the era of grid computing under initiatives of "cyberinfrastructure" and with parallel and high-performance computing, the overarching promise of the "virtual organization" has high value to NGA in its GEOINT goals. Ultimately, few advances will be possible via computational methods unless the grid is exploited. This still-emerging field of mostly computer science research has much to offer, yet geospatial grid applications remain mere tangents of this research field. Integral components of grid computing from NGA's perspective are high-speed networks, distributed processing, and a computational services model. These are essential ingredients of next-generation networking and computational mobility. The growing field of geocomputation (Atkinson and Martin, 2000; Longley et al., 1999) is broader than the grid but has led to some research on which future work can be based. Thus, immediate benefits can be gained by helping further geocomputational research now emerging internationally (Gahegan et al., 2001). In the long term, partnering with other agencies, particularly NSF, will help advance the broader national agenda around cyberinfrastructure and the grid.

Promising Methods and Techniques

Pattern Discovery in Spatiotemporal Data

The step of identifying interesting subsets of raw sensor data is a bottleneck in the GEOINT production cycle. This step can be automated by characterizing the notion of the interestingness of raw sensor data items. If a computable mathematical formula characterizes "interestingness," identification of interesting subsets can be automated by applying that formula to the raw sensor data in software. If interestingness cannot be characterized by a computable mathematical formula, a computational model learned from a set of positive and negative examples provided by analysts, using data mining, machine learning, and/or statistical techniques, could be substituted. This learned computational model should be evaluated using appropriate testing examples; if it proves accurate, it could be used to automate this step (Fayyad et al., 1996). Similarly, the step of producing summary patterns could be automated using a variety of data mining, machine learning, and statistical techniques.
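As a rough illustration of the learned-model idea, the sketch below trains a minimal nearest-centroid classifier from analyst-labeled examples. The feature vectors and labels are hypothetical; an operational system would use far richer features and models, but the workflow (label, learn, apply) is the same.

```python
import math

def train_centroid_model(positive, negative):
    """Learn a minimal 'interestingness' model from analyst-labeled
    feature vectors: store the mean vector of each class."""
    def centroid(rows):
        n = len(rows)
        return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]
    return centroid(positive), centroid(negative)

def is_interesting(model, x):
    """Flag a raw data item as interesting if it lies closer to the
    positive centroid than to the negative one."""
    pos, neg = model
    return math.dist(x, pos) < math.dist(x, neg)

# Hypothetical features, e.g. (edge density, spectral anomaly score).
positives = [(0.9, 0.8), (0.8, 0.9)]   # analyst said "interesting"
negatives = [(0.1, 0.2), (0.2, 0.1)]   # analyst said "routine"
model = train_centroid_model(positives, negatives)
print(is_interesting(model, (0.85, 0.75)))  # near positives -> True
print(is_interesting(model, (0.15, 0.10)))  # near negatives -> False
```

Held-out test examples, as described above, would then measure whether such a model is accurate enough to automate the triage step.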

Many classical techniques in data mining, machine learning, and statistics assume the independence of learning samples. This assumption may be false for geospatial and spatiotemporal data due to the presence of spatial autocorrelation (i.e., the tendency of nearby locations to have similar properties). A specific research strategy is to develop novel spatial and spatiotemporal data mining techniques by exploring new methods that not only model autocorrelation but also address other unique spatiotemporal issues (Shekhar et al., 2004), including the presence of complex data types (e.g., time series, tracks, regions, curves, shapes) and implicit relationships (e.g., distance, direction, visibility). A recent review by a group at the University of Zurich sets forward and demonstrates some methods for point objects (Laub et al., 2005). Note that this objective is supported by Recommendation 2.
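Spatial autocorrelation can be quantified with a statistic such as Moran's I, which compares deviations from the mean at neighboring locations; values above zero indicate that nearby locations are similar. The self-contained sketch below computes it for a hypothetical one-dimensional transect with a binary adjacency matrix.

```python
def morans_i(values, weights):
    """Moran's I: values[i] is the observation at location i;
    weights[i][j] = 1 if locations i and j are neighbors, else 0.
    I > 0 indicates positive spatial autocorrelation."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(sum(row) for row in weights)
    return (n / w_sum) * (num / den)

# Hypothetical 1-D transect of 4 locations; neighbors are adjacent cells.
W = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
clustered = [10, 9, 1, 2]   # similar values cluster together
print(round(morans_i(clustered, W), 3))  # positive: 0.323
```

Classical learners that assume independent samples ignore exactly the structure this statistic measures, which is why spatial variants of those learners are needed.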

Cognitive Modeling of Human Analysis Tasks

Cognitive modeling is a promising method that can be explored not only to identify the cognitive subtasks performed by analysts in manual tasks, but also to categorize those subtasks into information-bounded and cognitive resource-bounded subtasks. Human analysts' performance (e.g., time to complete a subtask) on information-bounded subtasks can be improved by providing additional information via novel tools to search for and present relevant information. Performance on cognitive resource-bounded tasks may be improved by reducing the cognitive load—for example, by eliminating extraneous tasks or unnecessary details—and by providing assistive tools to automate routine cognitive subtasks. Development of these tools will require usability testing and analysis, qualitative methods, and human subjects testing (e.g., Hix et al., 1999).

Cognitive Load-Aware Spatiotemporal Data Presentation and Interaction

Task-appropriate data exploration and presentation tools may help reduce unnecessary detail while presenting task-relevant information to improve analyst performance. Traditional two-dimensional cartographic display modes are probably inadequate for this task and will have to be revised or redesigned. Wearable computing devices exist now, yet requirements for context-sensitive, graphical, lexical, and verbal data presentation modes ("multivalent documents") are not yet met (Phelps and Wilensky, 1998). Advances in human-computer interaction research should facilitate information dissemination and delivery in hostile environments. Results of high-priority research can be applied and validated in battlefield planning scenarios. Current presentation modes include text, tables, animated cartographic symbols, three-dimensional perspective, virtual imaging, spatial audio and speech, and haptics. Interaction models to specify user requests include implicit predictive interaction (e.g., based on observation of the user's activities) or explicit requests using textual commands (e.g., Unix), graphical user interfaces, or haptic interfaces. User evaluation can be employed to compare the cognitive load imposed by alternative information representations and interaction models. This information can be used to select, design, and implement cognitive load-aware tools so that spatiotemporal data presentation can be automated and human performance improved.

High-Performance Geospatial Computing

It is well known that parallel and distributed computing can be an effective solution to computationally expensive spatial data integration, analysis, and presentation problems (Clarke, 2003). Plausible solutions include development of high-performance geospatial intelligence systems (HPGIS) by segmenting geospatial computational and input-output (I/O) tasks across processors in a multiprocessor machine, or across a cluster, to alleviate the bottleneck associated with computational processes without incurring significant overheads of communication and synchronization (Armstrong et al., 2005; Wang and Armstrong, 2003). HPGIS can take advantage of emerging cyberinfrastructure technologies such as grids of parallel computers to support integration and coordination of data and models in multisimulation environments (Atkins et al., 2003). Effective use of these computing environments will require the design and development of novel middleware that will coordinate the distribution of spatial data and tasks.

A unique challenge (Shekhar et al., 1996, 1998) in parallelizing geospatial intelligence tasks arises as a consequence of heterogeneity in spatial data. First, the computational loads imposed by data elements (e.g., polygons) vary a great deal based on the shape and location of the data item and the query of interest. Thus, naïve static data partitioning techniques (e.g., round-robin, random, geographic partitioning) may be ineffective. In addition, it is often more expensive to transmit geospatial data (e.g., a polygon with thousands of edges) than to process it locally using filter-and-refine techniques. This violates a basic assumption behind dynamic load balancing and data partitioning techniques. Other balancing problems arise because analysis methods require different amounts of computation in areas that have a variable density of observations (Cramer and Armstrong, 1999). Geospatial intelligence computations may pose similarly unique challenges for scheduling algorithms and other components of grids (Wang and Armstrong, 2005). Thus, further basic and applied research is needed to evaluate traditional data partitioning techniques and to develop novel ones for computational tasks that are essential components of the geospatial intelligence cycle.
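The effect of heterogeneous per-element costs on naïve partitioning can be seen in a small sketch. The per-polygon cost estimates below are hypothetical (e.g., proportional to edge counts), and the greedy least-loaded assignment shown is only one simple alternative to round-robin, not a proposed NGA scheduler.

```python
import heapq

def round_robin(costs, p):
    """Naive static partitioning: assign tasks to processors in turn."""
    loads = [0.0] * p
    for i, c in enumerate(costs):
        loads[i % p] += c
    return loads

def greedy_lpt(costs, p):
    """Longest-processing-time-first: give each task, largest cost
    first, to the currently least-loaded processor."""
    heap = [(0.0, i) for i in range(p)]   # (load, processor id)
    heapq.heapify(heap)
    loads = [0.0] * p
    for c in sorted(costs, reverse=True):
        load, i = heapq.heappop(heap)
        loads[i] = load + c
        heapq.heappush(heap, (loads[i], i))
    return loads

# Hypothetical, highly skewed per-polygon cost estimates.
costs = [100, 5, 90, 4, 80, 3, 70, 2]
print(max(round_robin(costs, 2)), max(greedy_lpt(costs, 2)))
```

With this skewed workload, round-robin leaves one processor with nearly the whole job (makespan 340 versus 177), illustrating why cost-aware partitioning matters even before communication overheads are considered.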

In summary, hard problems in compressing the time line of GEOINT generation include reengineering the GEOINT generation cycle and automating it where possible, the visualization of data, and speeding up the computer processing of GEOINT data. Promising solutions include techniques for pattern discovery; cognitive modeling of human analysis tasks; developing cognitive load-aware spatiotemporal data presentation and interaction methods; and exploiting high-performance geospatial computing.

EXPLOIT ALL FORMS OF INTELLIGENCE

Hard Problems

In the post-9/11 world, NGA needs to exploit all forms of intelligence to thwart denial and deception, track moving targets, and target precisely.


In GEOINT, this meant fusion across imagery, maps, and sensor data. In GEOINT2, it is evolving to mean fusion across all forms of intelligence.

First, fusion across image sources alone is still a hard problem. For example, individual remote sensing devices are limited in thwarting denial and deception by use of subsurface facilities, such as bunkers and caves. They are also limited in their ability to track moving objects under cover (e.g., on the forest floor under a tree canopy). To address the limitations of traditional sensors, NGA needs not only to evaluate emerging sensor technologies but also to explore techniques to fuse information from multiple sensors. Geospatial intelligence derived from information fusion across sensors is often richer than the sum of the information available from individual sensors. For example, tracking moving targets may be facilitated by fusion of information from a collection of sensors that observe the moving target across time. Fusion of information from multiple sensors may also improve the reliability of geospatial intelligence about a target in the face of denial and deception. Fusion must integrate across space, time, the electromagnetic spectrum, and scale. Data fusion can also be thought of in terms of three different areas: data-level fusion (i.e., the pixel level), feature-level fusion, and decision-level fusion (IEEE, 1999). There is much to be gained, for example, by merging the target-specific high-resolution imagery associated with a single feature (e.g., a mountain pass) with coarser temporal and spatial resolution data on land-use change, weather, vegetation, and so forth.

Current understanding of multisensor intelligence integration is far from complete, and future research should develop innovative algorithms for contextual models that appropriately integrate and analyze information from different sensors. Consequently, as discussed above with respect to Recommendation 1, traditional single-sensor techniques for object recognition, feature extraction, feature tracking, and change detection will require reevaluation and redesign for multisensor environments.

The importance of fusion to GEOINT research cannot be overstated, but again, research on fusion is in its relative infancy. While many other recommendations in this report will enhance fusion (e.g., by increasing horizontal integration and interoperability of architectures), NGA has a vested interest in drawing scholarship toward this demanding field of research. Given the importance of fusion and the increasing suite of inputs that require it, a shortage of expertise is likely in both the short and the long terms. The committee therefore recommends the following.

RECOMMENDATION 7: NGA should attract scholarship and research toward the problems that it faces in terms of information fusion, across sources, scales, spatial sampling schemes, systems, formats, and architectures, in order to avoid a shortage of expertise in this area in the future.

Traditionally, NGA has relied on fusing measurements from homogeneous and synchronized collections of sensors. For example, determination of position is often by triangulation algorithms based on the readings of synchronized signals from three or more geopositioning satellites and differential GPS sources. Change detection over a geographic area of interest is based on comparison of time-stamped and georeferenced measurements, such as remotely sensed images from a common sensor. Motion of an object can also be detected by observing the position of the moving object at different time points from different synchronized sensors. However, current information fusion techniques are limited in their ability to integrate information from collections of sensors, which may be heterogeneous, possibly asynchronous, and not identically georeferenced due to motion, limited fields of view, or constraints on power and/or the GPS signal.
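The homogeneous, synchronized case described above can be illustrated with a two-dimensional analogue of satellite positioning: given three beacon positions and simultaneous range measurements, the circle equations linearize to a 2x2 system solvable by Cramer's rule. This is a simplified sketch that ignores receiver clock bias and measurement noise, and the beacon layout is hypothetical.

```python
import math

def trilaterate(beacons, ranges):
    """Recover (x, y) from three known beacon positions and measured
    ranges: subtracting the first circle equation from the other two
    yields a linear 2x2 system, solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1 * r1 - r2 * r2 + x2 * x2 - x1 * x1 + y2 * y2 - y1 * y1
    b2 = r1 * r1 - r3 * r3 + x3 * x3 - x1 * x1 + y3 * y3 - y1 * y1
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
target = (3.0, 4.0)
ranges = [math.dist(target, b) for b in beacons]
print(trilaterate(beacons, ranges))  # approximately (3.0, 4.0)
```

The fragility this paragraph goes on to describe is visible here: the solution assumes all three ranges were measured at the same instant in the same georeferenced frame, which heterogeneous, asynchronous sensors cannot guarantee.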

In the near term, information fusion techniques will merge measurements from heterogeneous collections of sensors that are synchronized in time and space. Challenges include development of cross-sensor signatures of targets to improve the reliability of target detection. Longer-term information fusion techniques could integrate information from heterogeneous collections of sensors that are not perfectly synchronized in space and time. Challenges include modeling of motion by dead reckoning as well as clock drifts over time, methods to synchronize spatiotemporal frameworks of different sensors, and techniques to represent and compute with spatiotemporal objects having imprecise, uncertain, or unknown positions.

Fusion of data and information from multiple intelligence sources is an even more formidable task. Specific intelligences usually have standalone systems of technology for data acquisition, interpretation, and analysis that often work against fusion. System architectures are often incompatible. Data that should be interoperable are not. There are few base structures, ontologies, descriptive schema, or common languages to support source integration. Issues of architecture as a solution to integration are seen as technological and not strictly "hard problems" in the research sense, and so are dealt with in Chapter 5. Nevertheless, semantic interoperability is critical. There is relatively poor integration of geographical text as an information source in addition to absolute location. Interoperability often depends on cross-walking between languages (e.g., English-Russian), on toponymic services, and on semantic correspondence between feature descriptions (e.g., creek, river, stream, brook, arroyo, wash, wadi). Similarly, different representations or repetitions of text (vector, raster, map tile, feature) are often coexistent. Yet these are often the sole link between mapped data and other text-based intelligence, such as reports, news feeds, and the Internet.

Important research problems in the area of archiving include advancing algorithms for geocoding and georeferencing directly from locative text and anecdotal information. Data pedigree and source should be checked on acquisition to back-trace intelligence preparation and the chronology of procedures (known as the “flow of provenance”) that have been applied. NGA has apparently established and is implementing a goal for intelligence provenance and discovery as an element of the National System for Geospatial Intelligence (NSG). This is an important step for NGA’s future systems. To achieve NSG, both semantic and syntactical aspects of the framework must be interoperable with the data infrastructure used by other agencies.

The value of past GEOINT as a potential solution to a current or future problem is often unseen or ignored. With data being acquired so quickly and at high volumes, the tendency is to ignore or postpone the creation of metadata or the indexing of source material. Yet the nature of spatial change virtually requires a “before” and “after” view of the scene. Automated preservation and indexing are the only option for NGA. This requires schema and metadata that bridge all of the intelligence sources used and preservation suitable for recovery of ad hoc and unseen intelligence. An analyst should be able to make a query such as, “Over the last year, what trucks passed within 50 m of this building on Tuesdays between 2 p.m. and 3 p.m.?” This might mean bringing back from archives imagery, bus timetables, parking data, security video, and so forth, and then automatically processing it for vehicle information. There is a tendency to think of preservation as pertinent to imagery alone; however, the need is universal and the role of toponymy is critical. Accordingly, the committee makes the following two recommendations.

RECOMMENDATION 8: Research needs to be conducted to improve the role that text and place name search plays in integrating GEOINT since this is often the only link between mapped data and other text-based intelligence. Place names apply across scales and dimensions (point, line, area), language, and time. Systems that easily translate terms among applications areas (e.g., weather, navigation, operations, intelligence) are in need of research solutions, both theoretical and applied.

RECOMMENDATION 9: Research to promote the reuse and preservation of data should be made a priority so that critical before and after scenarios can be developed. Specific research areas should include metadata, data pedigree, provenance and lineage, and information about accuracy and uncertainty, as well as the storage of these elements along with data, the facilitation of their use in analysis software tools, and their coordination and exchange across other intelligence agencies.
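The kind of ad hoc archival query described earlier ("what trucks passed within 50 m of this building on Tuesdays between 2 p.m. and 3 p.m.?") can at least be expressed over a toy relational archive today. The schema and sightings below are entirely hypothetical, and a real system would use a spatial index and fused multi-source records rather than a brute-force distance test.

```python
import sqlite3

# Hypothetical sightings archive: object class, position in a local
# grid (meters), and an ISO timestamp; provenance would ride along too.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sighting
                (obj_class TEXT, x REAL, y REAL, seen_at TEXT)""")
conn.executemany("INSERT INTO sighting VALUES (?,?,?,?)", [
    ("truck", 105.0, 210.0, "2005-06-07 14:30:00"),  # Tuesday, in window
    ("truck", 900.0, 900.0, "2005-06-07 14:45:00"),  # Tuesday, too far
    ("truck", 110.0, 205.0, "2005-06-08 14:10:00"),  # Wednesday
    ("car",   102.0, 208.0, "2005-06-07 14:20:00"),  # wrong class
])

# "What trucks passed within 50 m of this building on Tuesdays 2-3 p.m.?"
bx, by = 100.0, 200.0   # building position in the same local grid
rows = conn.execute("""
    SELECT obj_class, seen_at FROM sighting
    WHERE obj_class = 'truck'
      AND (x - ?) * (x - ?) + (y - ?) * (y - ?) <= 50 * 50
      AND strftime('%w', seen_at) = '2'              -- Tuesday
      AND time(seen_at) BETWEEN '14:00' AND '15:00'
""", (bx, bx, by, by)).fetchall()
print(rows)  # only the first sighting qualifies
```

The hard part, as the preceding discussion notes, is not the query itself but populating such an archive automatically from imagery, video, timetables, and other heterogeneous sources, with provenance preserved.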

Cartographic text and place name support have long been an NGA GEOINT activity. Data resources such as the NGA GEOnet names server, a database of 4 million features and 5.5 million names, with 20,000 updates daily, are available via ftp and http queries. One short-term area of focus is pursuing new methods for the creation, management, update, verification, and accuracy assessment of existing toponymy services, including the existing databases and as many others as is practicable. A longer-term concern is the development of data mining and exploration tools that allow innovative approaches to knowledge discovery in geographic names. For example, intelligence reports can be “mined” for place names, and the associated places displayed in the context of the searches. Sequences of places involved in military operations could be linked and shown as flow maps or animations, and paths matched against previous similar pathways. Similarly, with the Internet now forming the primary delivery mechanism for geographic names, problems of text placement, selection, labeling, and positioning will require attention. Partial solutions are often embedded in today’s web mapping and GIS software. These will have to be reassessed in light of the special needs of NGA, such as secure environments and horizontal integration.
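A minimal form of the report-mining idea above can be sketched as a gazetteer scan over free text. The three-entry gazetteer and its approximate coordinates are illustrative stand-ins; an operational tool would query a names service such as GEOnet and handle variant spellings, transliteration, and ambiguity.

```python
import re

# Hypothetical mini-gazetteer with approximate (lat, lon) coordinates;
# a real system would query a toponymic service rather than a dict.
GAZETTEER = {
    "Kandahar": (31.61, 65.70),
    "Mosul":    (36.34, 43.13),
    "Basra":    (30.51, 47.78),
}

def mine_place_names(report):
    """Scan free text for known toponyms and return each match with
    its gazetteer coordinates, ready for display on a map."""
    hits = []
    for name, coords in GAZETTEER.items():
        if re.search(r"\b" + re.escape(name) + r"\b", report):
            hits.append((name, coords))
    return hits

report = "Convoy departed Mosul at dawn and was last observed near Basra."
print(mine_place_names(report))
```

Linking the sequence of extracted places would then support the flow maps, animations, and path matching suggested in the text.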

Data reuse and preservation are pressing problems for NGA. An immediate issue is detecting data that have been collected and archived but not yet cataloged. The failure of cataloging strategies to scale up from historical library preservation methods to the volumes of information collected in modern electronic data archives is the basic challenge that drove digital library research initiatives in the 1990s and persists today in organizing and cataloging the bodies of knowledge accessible on the World Wide Web. Also in the short term, there is an immediate need for autonomous metadata creation and comparison, to determine quickly whether two different data streams contain essentially the same or different content. This could involve research to determine new metadata types that summarize salient geospatial data characteristics, analogous to keywords that summarize salient characteristics in full-text databases. Research is needed to develop efficient embedding of provenance and lineage directly within datasets, for example, using data watermarks or steganography, to ensure that metadata are transported along with the data. Data and image watermarks are currently being developed, along with data encryption standards. The hard problem here is to establish criteria for determining salience, which can vary with task, target user community, and data type.

In the longer term, methods to establish permanent object identifiers (OIDs) and embed them in objects that are tracked across multiple sensor streams are necessary so that objects retain a permanent identity and can announce their arrival in and departure from a spatiotemporal tracking process. A rudimentary analogy occurs in desktop computing environments, where a process generates a pop-up window to tell the user, for example, that an advertisement in a sponsored window is being blocked from view. Similarly, dynamic objects should incorporate message-passing behavior to alert analysts to their disappearance from a given sensor stream, and permanent OIDs can be compared to recognize an object's reemergence when another sensor picks it up in its field of view. Numerous hard problems underlie establishing such a capability—for example, establishing an object whose delineation changes over time, that is indeterminate, or that is susceptible to denial and deception. Another hard problem, discussed earlier in this report, is fusing signals across multiple sensors.

Due to the renewed interest in sensor technology by federal agencies and industry, novel sensors and sensor networks are being developed for a variety of applications. Some of the emerging sensors provide the ability to measure the properties of covered surface and underground facilities. An applied research project could evaluate a selected subset of emerging sensors for their ability to track moving objects in the presence of denial and deception possibly for precision targeting. This subset should be thought of as a suite, not a set of isolated sensors to be used individually. For example, one sensor may favor outdoor conditions in daylight; another, night; a third, objects occluded by buildings; and a fourth, objects under camouflage. The goal of this adaptive sensing strategy is a system that tracks an object seamlessly, integrating its signal across multiple sensor streams as the background conditions change in real time with object motion (Stefanidis and Nittel, 2005).

RECOMMENDATION 10: Emerging and existing sensor technologies should be reassessed in light of their abilities to detect moving objects even in the presence of denial and deception or for precision targeting. Research should focus on how data can be integrated for this task (1) across sensors, (2) across scales and resolutions, (3) across the spectrum, and (4) in time. Solutions by adaptive arrays of sensors rather than single sensors should be stressed.

The need to track moving objects should be seen as generic, not particular. From a machine vision perspective, object tracking is now relatively mature, yet the data structures, pattern analysis, and data mining tools are somewhat lacking. In the short term, it would be beneficial to cross disciplinary bounds to seek out approaches to object tracking and to examine which approaches offer the most long-term benefits. For example, there is a considerable literature in transportation studies under the various intelligent vehicle programs. In the intermediate term, the most promising methods could be tested and improved to meet NGA's needs. In the long term, new methods can be brought into the GEOINT cycle at several points. Of long-term interest are tracking multiple moving objects, automating the detection and recognition of key movement patterns, and doing both under uncertainty. For example, inputs from traffic cameras can be used to detect unusual vehicle activity that might indicate a suicide bombing, to search out simultaneous attackers in a multivehicle attack, and to use maps to rapidly alert possible target buildings that a suspect vehicle is approaching.

Promising Methods and Techniques

Evidential Reasoning over Homogeneous Spatiotemporal Frameworks

If all sensors use a common spatiotemporal framework, possibly using the GPS and common triangulation algorithms, traditional evidential reasoning techniques, such as Bayes’ rule, possibility theory, or Dempster-Shafer theory, can be used for data fusion (Pagac et al., 1998). For illustration, consider a collection of visual, acoustic, and seismic sensors monitoring a common area for single-sensor signatures of a target object, such as an armored vehicle. Readings from each sensor could be analyzed independently for the sensor-specific signature of the target object and produce probabilities of observing the target object of interest given the sensor measurements. Results from individual sensors can then be combined using evidential reasoning techniques to fuse the information across sensors (Klein, 2004).
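Of the evidential reasoning techniques mentioned, the simplest is Bayes' rule under an assumption of sensor independence. The sketch below fuses hypothetical single-sensor likelihoods for the armored-vehicle example; the detection characteristics are invented for illustration, and real fusion would have to justify the independence assumption.

```python
def fuse_bayes(prior, likelihoods):
    """Combine independent sensor evidence for a hypothesis H (e.g.,
    'an armored vehicle is present') with Bayes' rule. Each entry is
    (P(reading | H), P(reading | not H))."""
    p_h, p_not = prior, 1.0 - prior
    for l_h, l_not in likelihoods:
        p_h *= l_h
        p_not *= l_not
    return p_h / (p_h + p_not)

# Hypothetical single-sensor signature detection characteristics.
visual   = (0.70, 0.20)   # P(cue | vehicle), P(cue | no vehicle)
acoustic = (0.60, 0.30)
seismic  = (0.80, 0.10)

posterior = fuse_bayes(0.10, [visual, acoustic, seismic])
print(round(posterior, 3))  # prior of 0.10 rises to about 0.862
```

Dempster-Shafer theory and possibility theory generalize this pattern when sensors report belief over sets of hypotheses rather than crisp probabilities.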

Geospatial Framework Synchronization

Traditional evidential reasoning techniques may not be effective if sensors are not synchronized in space and time, possibly because of natural or deliberate jamming of GPS signals, limited power to monitor GPS signals continuously, natural drift of clocks, or other physical causes. For example, consider the detection of the signature of an object moving from east to west by two sensors. If the sensors are not synchronized in space and time, fusion of cross-sensor information could incorrectly infer that the object is moving from west to east instead of the reverse. Event ordering in time based on message interchanges between nodes in a sensor network is a promising approach to synchronizing temporal frameworks across sensors. Similar approaches may have to be developed for synchronizing spatial frameworks, possibly by enabling each sensor to estimate the relative positions of nearby sensors via techniques such as independent surveys, direct observation of the positions of nearby sensor nodes, and estimation of the geospatial position of neighbors from information available in network communication messages. An alternative approach is to develop a new generation of geopositioning sensors and clocks so accurate and synchronous that the residual errors do not affect geospatial intelligence results as they do with GPS.
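
Event ordering through message interchange is the idea behind Lamport's logical clocks, which can be sketched in a few lines; an actual sensor network protocol would extend this with physical time estimation.

```python
# Sketch of Lamport logical clocks: each node keeps a counter that advances
# on local events and jumps past any timestamp it receives, yielding a
# consistent event order without synchronized physical clocks.
class LamportClock:
    def __init__(self):
        self.time = 0

    def tick(self):                    # local event
        self.time += 1
        return self.time

    def send(self):                    # timestamp attached to an outgoing message
        return self.tick()

    def on_receive(self, msg_time):    # advance past the sender's clock
        self.time = max(self.time, msg_time) + 1
        return self.time
```

A receiving node's clock always ends up strictly ahead of the message timestamp, so "send happened before receive" is preserved in the ordering.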

Robust Frameworks for Geospatial Computations and Reasoning

Despite advances in technology for measuring time and geospatial position, inaccuracies and errors have not been eliminated. Residual errors in measurements of position and time can lead to erroneous conclusions by traditional geospatial algorithms and data models that do not account for errors in position estimates or the resolution of sensor measurements. Recent research has explored robust spatiotemporal frameworks (Guting, 1994), modeling both the error and the resolution. In addition, this research has provided methods to estimate errors in the results of simple geometric algorithms, such as those for determining the point of intersection of two lines, from the errors in the positions of the endpoints of the intersecting lines. Geospatial error propagation models have the promise of characterizing geospatial errors to provide the analyst with more informative geospatial intelligence. Similar algorithms need to be developed to estimate the position (or track) of a moving object, given measurements from multiple sensors whose own geospatial locations are imprecise. Additional challenges include modeling geospatial uncertainty, imprecision, and scale within S-DBMS and ST-DBMS.
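
One simple way to see how endpoint errors propagate into a computed intersection point is Monte Carlo simulation: jitter the endpoints by their assumed error and measure the spread of the resulting intersections. The standard two-line intersection formula and Gaussian error model below are a sketch of the idea, not NGA practice.

```python
import random
import statistics

def intersect(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4 (assumed non-parallel)."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3; x4, y4 = p4
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / d
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

def intersection_spread(endpoints, sigma, n=2000, seed=1):
    """Monte Carlo estimate of how endpoint position error (std dev sigma)
    propagates into the computed intersection point."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n):
        jittered = [(x + rng.gauss(0, sigma), y + rng.gauss(0, sigma))
                    for x, y in endpoints]
        px, py = intersect(*jittered)
        xs.append(px)
        ys.append(py)
    return statistics.stdev(xs), statistics.stdev(ys)
```

For two perpendicular unit lines, the spread of the intersection is on the order of the endpoint error, which is exactly the quantity an analyst would want attached to the result.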

Semantic Interoperability

Problems of interoperability for geospatial data have often been seen as problems of integrating semantics across disparate ontologies. Research in databases and computer science has produced a significant body of work on semantic interoperability, including semantic webs. Creation of semantic webs is also seen as essential to bringing the vision of grid computing into reality, especially for its highly distributed components. De Roure et al. (2001) see promising methods as including semantic grid toolkits such as Globus (Foster and Kesselman, 1997); agent-based approaches; new network philosophies for lightweight communications protocols; methods for dealing with trust and provenance; and methods for dealing with metadata and annotation. Most important are the knowledge technologies, which include knowledge capture tools, dynamic content linkage, annotation-based search, annotation reuse repositories, and natural language processing.

Toponymic Services

There is relatively little research on the relationship between toponymy and advanced search tools. Areas of promise include geographic information retrieval methods and geoparsing, a cross between natural language understanding and gazetteer lookup. Tools are needed to support the modeling of information about named geographic places and access to distributed, independent gazetteer resources. This work has involved semantic webs, resource description frameworks (RDFs), ontologies, and markup languages such as the extensible markup language (XML) and the geography markup language (GML). Related to this is research into the type classification of named places (e.g., feature types, populated places) and the correspondence between different approaches to such classifications. A component of place name research deals with the history and etymology of place names and their cultural context. While the methods used in this area are simple, the research is nevertheless important and can be enhanced with advances in information technology.
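
A toy geoparser conveys the gazetteer-lookup half of the idea: scan text for names held in a gazetteer and return their records. The gazetteer entries below are invented for illustration; real geoparsing must also handle ambiguity (many places share a name) and linguistic context.

```python
# Hypothetical miniature gazetteer: place name -> (lat, lon, feature type).
GAZETTEER = {
    "kabul": (34.5, 69.2, "populated place"),
    "hindu kush": (36.0, 71.5, "mountain range"),
}

def geoparse(text):
    """Naive geoparsing: return (name, record) pairs for every gazetteer
    entry whose name appears as a substring of the lowercased text."""
    lowered = text.lower()
    return [(name, rec) for name, rec in GAZETTEER.items() if name in lowered]
```

Running the parser over a sentence such as "The convoy left Kabul toward the Hindu Kush" would geolocate both named places.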

Reuse and Preservation of Data

Promising methods for the reuse and preservation of data have emerged from research in geospatial databases and from the various Digital Libraries initiatives of the National Science Foundation (NSF) and the National Aeronautics and Space Administration (NASA). There have been considerable advances in the creation of effective metadata standards and of tools for authoring metadata. Next-generation systems will leverage Federal Geographic Data Committee (FGDC)-style metadata to support advanced reuse. Many of the promising methods and techniques reflect those of semantic interoperability, large-scale database management, and toponymic services. Spatial OnLine Analytical Processing (SOLAP), a type of software technology that enables rapid information retrieval from multidimensional data and has been extended to geospatial data and GIS, has some potential to integrate metadata about data quality into the information processing flow (Devilliers et al., 2005).

Database and Sensor Technologies for Moving Objects

Sensor technologies for moving objects include technical measurement solutions for positioning, with GPS and similar technologies, in both indoor and constrained environments. They also include video capture and machine vision methods, both mature research fields. Less mature is the database management research aimed at creating computer systems for managing and exploiting moving object data. Recommendation 2 covers theory and visualization for moving objects. Guting (2005) has recently presented some promising methods and techniques for moving objects databases, including extended query languages and data models (e.g., Transect-Structured Query Language), spatiobitemporal objects, event-based and transaction processing approaches (Worboys and Duckham, 2005; Worboys and Hornsby, 2004), trajectory uncertainty analysis, spatiotemporal predicates, indexing methods (e.g., the time-parameterized R-tree, kinetic B-tree, and kinetic external range trees), and special cases, such as analysis of movement on networks.
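
The kind of query support a moving objects database must provide can be suggested by a minimal trajectory record that interpolates linearly between time-stamped fixes; production systems would add indexing (e.g., time-parameterized R-trees) and uncertainty handling.

```python
import bisect

# Toy moving-object record: time-stamped fixes with linear interpolation,
# illustrating the "where was object X at time t?" query a moving-objects
# database would answer. Fix format: (t, x, y).
class Trajectory:
    def __init__(self, fixes):
        self.fixes = sorted(fixes)
        self.times = [f[0] for f in self.fixes]

    def position_at(self, t):
        """Interpolated position at time t, clamped to the track's extent."""
        i = bisect.bisect_left(self.times, t)
        if i == 0:
            return self.fixes[0][1:]
        if i == len(self.times):
            return self.fixes[-1][1:]
        t0, x0, y0 = self.fixes[i - 1]
        t1, x1, y1 = self.fixes[i]
        w = (t - t0) / (t1 - t0)
        return (x0 + w * (x1 - x0), y0 + w * (y1 - y0))
```

A query halfway between two fixes returns the midpoint of the segment; queries outside the observed interval return the nearest endpoint rather than extrapolating.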

In summary, hard problems related to the exploitation of all forms of intelligence include information fusion across diverse sources, the role of text and place name search in data integration, preserving data in a way that they can be easily reused, and techniques for using multiple sources to detect moving objects. Promising methods include evidential reasoning methods over homogeneous spatiotemporal frameworks, geospatial framework synchronization, robust frameworks for geospatial computations and reasoning, semantic interoperability, toponymic services, methods for reuse and preservation of data, and database and sensor technologies to support moving objects.

SHARE WITH COALITION FORCES: INTEROPERABILITY

Hard Problems

Interoperability will be a key challenge for NGA in the coming years as it pursues its goal of sharing geospatial intelligence not only with other U.S. organizations but also with coalition forces and foreign partners. For example, consider an exchange of navigation maps among coalition forces. One source may focus on terrain maps in which each route segment is navigable by a land vehicle (e.g., a tank), possibly because it primarily serves Army missions. Maps from another source may include land- as well as water-based route segments for amphibious vehicles, possibly because they serve the U.S. Marines. If the maps from the two sources are merged without accounting for differences in semantic meaning, land vehicles may lose egress routes during battles or be directed into deep water bodies. In addition, precise tracking of a moving target becomes difficult if geopositions recorded by two different sources use disparate coordinate systems, data file formats, and map symbols. In general, combining the maps from these two sources will require careful consideration of differences at the semantic (e.g., meaning of route segments), structural (e.g., coordinate system, other metadata), and syntactic (e.g., data format) levels.

Some critical problems faced in spatiotemporal interoperability are the role of real-time sensor inputs, the problems of dealing with incomplete and sparse data, disparate ontologies, uncertainty management, content-based filtering, moving targets, and changing profiles in time and space (e.g., growth, aging, decay). These all have implications for data conflation, for analysis, and for data mining and are components of Recommendation 2.

Issues of syntactic interoperability are already being addressed through techniques such as spatial data standards, especially those of the Open Geospatial Consortium. Similarly, structural interoperability can also be addressed by practical means. Therefore, these were not considered hard research problems. However, because of their importance, they are still included in the following discussion.

Nevertheless, semantic interoperability is considered a hard problem. There can be little progress in pursuing interoperability without a thorough examination of the abstract set of objects or features of interest to GEOINT, so that they can be formally defined and converted into abstract objects that are transferable because their descriptions are complete. While GIScience research has begun this task, there is little motive outside of NGA to target an ontology toward NGA’s needs. At the same time, a generic ontology would have great value to other agencies, software developers, and researchers. As a result, the committee makes the following recommendation.

RECOMMENDATION 11: Research that creates a complete descriptive schema for geospatial objects of importance to GEOINT, as formalized in a GEOINT ontology, should be pursued to ensure effective data interoperability and fusion of multisource intelligence. This ontology should have a set of object descriptions, should contain precise definitions, and should translate into a Unified Modeling Language (UML) diagram or other form suitable for adaptation into spatial data models.

A short-term issue is addressing the syntactic interoperability of geospatial data, such as issues related to data file formats. One option is to carefully diagram and examine a complete catalog of the differences between two spatiotemporal conceptual models, in sufficient detail that a translation procedure between them can be either automated or exhaustively described procedurally. A good pair of models to choose would be two that cause known interoperability problems at NGA.

The structural interoperability challenges are longer-term issues. Examples include interoperability across geospatial intelligence sources that differ in conceptual schemas (e.g., an entity relationship diagram or UML diagram) and in metadata such as coordinate systems, resolution, and accuracy. Long-term challenges center on semantic interoperability: resolving differences in the meaning (e.g., definitions) of geospatial intelligence across sources. This is an extremely difficult and long-standing problem. Thus, it will be important to support high-risk research to explore promising approaches (e.g., the semantic web, ontology translation) that address important subproblems.

Promising Methods and Techniques

Geospatial Intelligence Standards

Initial efforts are addressing syntactic interoperability by developing common standards. The Open Geospatial Consortium (OGC) has provided a sound foundation for work in distributed geoprocessing, real-time processing, sensor-web challenges, geospatial semantic webs, and brokering among multiple distributed ontologies. The first step is to determine whether intelligence needs dictate new standards, require extensions to existing standards, or fit within existing standards. Applied research can evaluate current geospatial data interchange standards, especially those of the OGC, for exchanging geospatial intelligence across U.S. organizations as well as coalition organizations. If current standards do not cover crucial types of geospatial intelligence, it would benefit NGA to encourage the extension of current standards or the development of new standards for exchanging geospatial intelligence data and services from a variety of sources such as sensors, human interpretation, modeling, and simulation. Effective standards are based on a consensus among major stakeholders, including the producers and consumers of geospatial intelligence within the United States and its partner countries. Thus, basic and applied researchers need to address how their results can be incorporated into geospatial and/or computational interoperability standards through the evolution of those standards.

Spatial and Spatiotemporal DBMS Interoperability

Structural differences (e.g., conceptual data models, reference coordinate systems, resolution, accuracy) could be addressed by a combination of automatic and manual methods. For example, geospatial intelligence analysts and their database designers could review the differences between the geospatial conceptual models (e.g., entity relationship diagrams with pictograms) of a pair of sources to develop translation schemes. This requires a careful analysis of issues such as synonyms and homonyms, as well as establishing correspondences between the building blocks of the two conceptual data models. Once a translation scheme is developed and validated, future interchange of geospatial intelligence between the selected source pair can be automated by implementing the scheme in software. However, the manual effort for this approach grows superlinearly with the number of sources, and an alternative approach based on a global conceptual schema becomes more attractive. Of course, the time and effort of developing a global schema and translation procedures can be reduced by the provision of appropriate tools.
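
A translation scheme of the kind described can be sketched as a simple field map between two hypothetical source schemas, together with the scheme-count arithmetic that makes the global-schema alternative attractive as sources multiply. Field names here are invented for illustration; real schemes would also convert units, codes, and geometry.

```python
# Hypothetical translation scheme between two source schemas, expressed as a
# field map from one source's names to a common vocabulary.
ARMY_TO_COMMON = {"route_id": "segment_id", "trafficable": "land_navigable"}

def translate(record, field_map):
    """Rename a record's fields according to a validated translation scheme;
    unmapped fields pass through unchanged."""
    return {field_map.get(k, k): v for k, v in record.items()}

def pairwise_schemes(n):
    """Translation schemes needed if every source pair is mapped directly."""
    return n * (n - 1) // 2

def hub_schemes(n):
    """Schemes needed with a single global (hub) conceptual schema."""
    return n
```

With 10 sources, direct pairwise mapping requires 45 schemes versus 10 against a single global schema, which is the superlinear growth noted above.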

Geospatial Intelligence Ontology

Semantic differences across sources are difficult to resolve largely because of differing ontologies. Availability of a geospatial intelligence ontology (e.g., a concept dictionary, thesaurus, or concept taxonomies) is likely to help the manual tasks of developing global schemas and translation procedures. It may also help formalize geospatial intelligence and make it accessible to larger audiences, facilitating the training of new analysts. Several ontologies have been explored in the GIScience literature and can be examined to provide a framework for further work (e.g., Agarwal, 2005). Prior standards efforts such as the federal spatial data transfer standard (SDTS) include feature lists and definitions, and geometric objects both with and without topology, that could be building blocks for future work. Future work will also build on the ongoing body of research on the semantic web (Berners-Lee et al., 2001). The GML standard and work by the Open Geospatial Consortium already form a significant element of NGA’s research programs. OGC in particular has been closely integrated with NGA’s research, and it would be beneficial to continue this in the future. Research into geospatial intelligence ontologies will build on the more generic work described above, but will produce tools specific to the needs of geospatial intelligence analysts.
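
At its simplest, a concept thesaurus of the kind mentioned above resolves source-specific feature terms to canonical ontology concepts; the vocabulary below is invented for illustration.

```python
# Hypothetical concept thesaurus: source-specific terms -> canonical concepts.
THESAURUS = {
    "road": "road", "roadway": "road", "motorway": "road",
    "river": "river", "watercourse": "river", "stream": "river",
}

def canonical(term):
    """Canonical concept for a source term, or None if the ontology lacks it."""
    return THESAURUS.get(term.lower())

def align_features(features):
    """Relabel (name, source_type) features with canonical concept names."""
    return [(name, canonical(ftype)) for name, ftype in features]
```

Terms the ontology does not cover come back as None, flagging exactly the gaps a schema designer would need to resolve manually.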

In summary, the hard problem associated with sharing data is semantic interoperability. Promising methods and techniques for increasing interoperability include further research and development of geospatial intelligence standards, translation schemes for spatiotemporal conceptual models, and geospatial intelligence ontologies.

SUPPORTING HOMELAND SECURITY

The creation of the Department of Homeland Security (DHS) as a response to the nation’s increased vulnerability to terrorist attack has led to another significant demand for GEOINT from NGA. To quote a recent planning document (MITRE, 2004): “In the war against terrorism, our front-line troops are not all soldiers, sailors, fliers, and marines. They are also police, firefighters, medical first responders, and other civilian personnel. These are groups whose historical access to sources of national intelligence has been near zero; yet their need for real-time and analytical intelligence is now critical.” Extension of NGA’s responsibilities to work far more closely with civilian agencies, including but not limited to DHS, has broadened NGA’s mission. For the most part, DHS’s needs place NGA in the category of an information supplier, although current trends suggest that NGA will increasingly serve as a knowledge supplier. In the supplier role, few options are available for collaboration in research. There remains, however, an opportunity for the intelligence world to collaborate with academics and others in the conduct of research. The few DHS-funded centers in universities are starting points, and a large number of vehicles are already in place to encourage collaboration and the sharing of experience, expertise, and resources. It is in the interest of NGA to explore relationships between these existing research encouragement mechanisms, reviewed in the next chapter, and DHS.

The committee believes that working with DHS involves many of the same issues as sharing GEOINT with coalition forces and ensuring horizontal integration. Therefore, homeland security issues are supported by many of the recommendations in this report. However, while many of the report’s recommendations are applicable to homeland security, the distinction between domestic and foreign intelligence is of great importance. The need for and use of such information within the United States will also be substantially different from that outside the United States. With a new institutional infrastructure for intelligence in the United States, NGA is well placed to clarify and support the role of GEOINT in the new integration-based intelligence environment.

PROMOTING HORIZONTAL INTEGRATION

Hard Problems

Horizontal integration refers to “the desired end-state where intelligence of all kinds flows rapidly and seamlessly to the warfighter, and enables information dominance warfare” (MITRE, 2004). The expanded role of NGA and the new clients for NGA services brought by international collaboration and work with civilian communities and agencies place strains on the mechanisms for protecting the security of assets and technologies available to NGA but not available elsewhere. As GEOINT2 evolves and creates new ways to assimilate geospatial intelligence for a particular problem, new vehicles will be necessary “so that the full value of the information can be realized by delivering it to the broadest set of users consistent with its prudent protection” (MITRE, 2004). Yet this multilevel demand for information brings with it risks. In the past, a culture of separate content producer and protector roles has prevailed. A bias toward withholding knowledge as the default has led to an extraordinary amount of geospatial data being withheld from the potential user community. In at least one case, there is evidence that such overprotection of geospatial data is unnecessary or even damaging (Baker et al., 2004).

New protocols must be established to promote safe data exchange in light of legitimate, changing demands for geospatial intelligence products. Existing solutions include using and sharing similar data, such as commercially available high-resolution imagery, without security classification. Research can contribute reliable security protocols in several ways. Discussions by the committee focused on two reported problems. First, how can GEOINT products be modified so that their content at any given level of clearance is visible only to those at that level? For example, could an image be made such that its display resolution varied depending on the interpreter? Could a GIS dataset be made that hides detail or entire features on the same basis? Such data could be distributed universally in a single form but used differentially under the control of keys associated with different security levels. Such key-based methods are the domain of research in cryptography and steganography, where GEOINT has received less attention than other applications. Second, what alternate data sources in the public or commercial domain can be shared, so that information can flow while sources are protected? Anecdotally, commercial high-resolution remote sensing seems to be filling this need in many contexts. Research could contribute to both of these options.

RECOMMENDATION 12: Research should be directed toward the particular needs of geospatial data for protection with multilevel security to promote safe data exchange, including innovative coding schemes, steganography, cryptography, and lineage tracking. Similarly, the processing of data so that it resembles public domain (e.g., digital orthophotoquadrangle) or commercially produced structures and formats should be pursued.

In the short term, issues of image and map degradation are pertinent. For example, what other than a median filter can be used to gracefully degrade the contents of a high-resolution image so that it can be shared? There are also tasks relating to metadata that can be done immediately: for example, allowing a web-based query to indicate that spatial data covering a particular area of interest exist, while providing only relevant contact information rather than access to the data themselves. In addition, within computer networks, various firewall protection systems and sub-local area networks (LANs) can easily limit access by Internet domain and can alert NGA to users seeking access inappropriately.
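
Graceful degradation of imagery can be sketched as block averaging driven by a clearance-to-resolution policy; both the policy table and the averaging choice are illustrative, and a median filter or wavelet truncation could be substituted at the same point.

```python
# Illustrative clearance-to-resolution policy: higher factor = coarser image.
CLEARANCE_FACTOR = {"national": 1, "coalition": 2, "public": 4}

def degrade(image, factor):
    """Block-average a 2-D grid of pixel values by `factor` in each axis,
    reducing display resolution while preserving gross structure."""
    if factor == 1:
        return [row[:] for row in image]
    h, w = len(image), len(image[0])
    out = []
    for i in range(0, h - h % factor, factor):
        row = []
        for j in range(0, w - w % factor, factor):
            block = [image[i + di][j + dj]
                     for di in range(factor) for dj in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

A single product could then be distributed once and rendered at the resolution each user's clearance level permits.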

Also in the short term, new steganographic methods to support multilevel security could be researched, and protocols developed for location-specific identification and spatially constrained security. For example, a user undeniably situated at a particular location (e.g., from GPS codes) could be granted access to data covering that location. Alternatively, users from or in particular locations could be denied access, perhaps even temporarily. In the longer term, increased research in spatial data licensing (NRC, 2005), geospatial digital rights management, location privacy rights, and geospatial denial and deception methods would be beneficial.

Promising Methods and Techniques

The field of image processing has developed numerous ways to encode and selectively process imagery. However, little work has extended these methods to multispectral and hyperspectral sources or to other spatial data such as digital elevation models. Similarly, little work has been done with place name or vector data (Armstrong et al., 1999). An emerging body of research in location-based services is examining some of the technical issues (Schiller and Voisard, 2005), but policy issues such as location privacy need further study. Location authentication research has focused on GPS technology, yet next-generation systems will use both new systems such as Galileo and new positioning approaches (Rizos and Drane, 2004). Methods are already available to support multiresolution imagery and, to some extent, maps: examples include quadtrees, recursive and adaptive meshes, and resolution-dependent georeferencing (Shekhar and Chawla, 2003).

In summary, the hard problem associated with horizontal integration is the issue of multilevel security. Promising methods and techniques include current research in location-based services, location authentication, and methods for selectively processing multiresolution imagery.

TABLE 4.1 Summary of Hard Problems, by NGA Challenge, with Associated Recommendations

(1) Achieving persistent TPED
  - Assimilation of new, numerous, and disparate sensor networks within the TPED process (Recommendation 1)
  - Spatiotemporal data mining and knowledge discovery from heterogeneous sensor data streams (Recommendation 2)
  - Spatiotemporal database management systems (Recommendation 3)

(7) Compress time line
  - Process automation versus human cognition (Recommendation 4)
  - Visualization (Recommendation 5)
  - High-performance grid computing for geospatial data (Recommendation 6)

(2-6) Exploit all forms of imagery (and intelligence)
  - Image data fusion across space, time, spectrum, and scale (Recommendation 7)
  - Role of text and place name search in data integration (Recommendation 8)
  - Reuse and preservation of data (Recommendation 9)
  - Detection of moving objects from multiple heterogeneous intelligence sources (Recommendation 10)

(8) Sharing with coalition forces, partners, and communities at large
  - GEOINT ontology (Recommendation 11)

(9) Supporting homeland security
  - Covered by other areas

(10) Promoting horizontal integration
  - Multilevel security (Recommendation 12)

SUMMARY

This chapter has presented recommendations on the “hard problems” in geospatial science that NGA should address in order to meet its evolving mission toward GEOINT2. It has also examined promising methods, approaches, and technologies for the solutions to the hard problems. The hard research problems and associated methods are summarized in Table 4.1. Many of the technical problems are ontological issues (i.e., the solution of architecture and interoperability problems lies in the creation of a comprehensive ontology for the collection, handling, and archiving of geospatial information). The chapter also shows that the nature of input networks, and the volume and type of data coming from these networks, are likely to change markedly in the future. By exploiting foreknowledge of these changes, NGA can position itself for the radical shift in geospatial paradigms discussed in Chapter 3.

Nevertheless, responding to these hard problems will be disruptive to NGA both technologically and organizationally. Chapter 5 makes recommendations intended to ease the resulting transitions.

×
Page 49
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 50
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 51
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 52
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 53
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 54
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 55
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 56
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 57
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 58
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 59
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 60
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 61
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 62
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 63
Suggested Citation:"4 Hard Problems and Promising Approaches." National Research Council. 2006. Priorities for GEOINT Research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press. doi: 10.17226/11601.
×
Page 64