4 Hard Problems and Promising Approaches
Pages 31-64

The Chapter Skim interface presents the single most significant chunk of text algorithmically identified within each page of the chapter.

From page 31...
... They are organized into six classes that align with the NGA top 10 challenges: achieving TPED; compressing the time line of geospatial intelligence generation; exploitation of all forms of intelligence (which includes challenges 2-6); sharing geospatial information with coalition and allied forces; supporting homeland security; and promoting horizontal integration.
From page 32...
... and database management systems are inadequate to achieve persistent TPED for many reasons. First, current sensor networks were designed for tracking fixed targets (e.g., buildings, military equipment)
From page 33...
... in order to improve the use and effectiveness of new and nontraditional sensor networks. Particular emphasis should be placed on the relation between sensor networks and space, sensor networks and time, accuracy and uncertainty, and sensor networks and data integration.
From page 34...
... Sensor networks will include ground-based fixed as well as mobile sensors to provide even finer resolution and better persistence. However, it is expensive to provide persistent coverage of large geographic areas over long periods of time.
From page 35...
... are semantic data models, query languages, query processing techniques, and indexing methods for representing spatiotemporal datasets from sensor networks. In particular, research should explore high-performance computing techniques (e.g., data structures, algorithms, parallel computing, grids)
From page 36...
... ) from active and passive remote sensors will have to be used in ways that are quite different from the traditional data models that were generated to deal with geospatial data in two-dimensional space.
From page 37...
... Research is needed on semantic data models, query languages, query processing techniques, and indexing methods for representing spatiotemporal datasets from heterogeneous sensor networks. Extensions beyond the Quad, R, and S trees will be necessary, and new search and query tools based on spatiotemporal zones and patterns will be required.
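As a toy illustration of indexing space and time together rather than space alone, the sketch below buckets observations into fixed space-time grid cells and answers box queries over them. All names and parameters are hypothetical; a production system would use tree-based spatiotemporal indexes (the Quad-, R-, and S-tree extensions mentioned above) rather than a flat grid.

```python
from collections import defaultdict

class SpatioTemporalGridIndex:
    """Toy index that buckets observations into (x, y, t) cells."""

    def __init__(self, cell_size=1.0, time_slice=60.0):
        self.cell_size = cell_size
        self.time_slice = time_slice
        self.cells = defaultdict(list)

    def _key(self, x, y, t):
        return (int(x // self.cell_size),
                int(y // self.cell_size),
                int(t // self.time_slice))

    def insert(self, obj_id, x, y, t):
        self.cells[self._key(x, y, t)].append((obj_id, x, y, t))

    def query(self, x_min, x_max, y_min, y_max, t_min, t_max):
        """Return object ids observed inside a space-time box."""
        results = []
        kx0, kx1 = int(x_min // self.cell_size), int(x_max // self.cell_size)
        ky0, ky1 = int(y_min // self.cell_size), int(y_max // self.cell_size)
        kt0, kt1 = int(t_min // self.time_slice), int(t_max // self.time_slice)
        for kx in range(kx0, kx1 + 1):
            for ky in range(ky0, ky1 + 1):
                for kt in range(kt0, kt1 + 1):
                    # Scan only the cells overlapping the query box,
                    # then refine each observation against exact bounds.
                    for obj_id, x, y, t in self.cells.get((kx, ky, kt), []):
                        if (x_min <= x <= x_max and y_min <= y <= y_max
                                and t_min <= t <= t_max):
                            results.append(obj_id)
        return results
```

A flat grid is simple and insert-fast but degrades when data are spatially skewed, which is one motivation for the tree-based extensions.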
From page 38...
... to manage the data rates and data volumes from persistent TPED sensor networks. The benchmark may contain specific geospatial intelligence data streams and datasets, geospatial analysis tasks, performance measures, and target values for the performance measures.
From page 39...
... It is becoming increasingly important to move toward real-time data generation, processing, and dissemination to reduce latency in intelligence generation and delivery processes. However, the traditional geospatial intelligence generation process relies heavily on manual interpretation of data gathered from geospatial sensors and sources.
From page 40...
... Other potential short-term areas of focus include studying ways to automate the bottleneck steps in geospatial intelligence generation and use, and identifying ways to eliminate unnecessary waiting and dependencies by exploring alternatives that accomplish the same results. In the longer term, research on human behavior and cognition and on human-computer interaction could study ways to speed up the remaining manual tasks that cannot be automated, whether because higher accuracies are needed or for other reasons.
From page 41...
... clients. These trends foreshadow a new paradigm of spatial data visualization.
From page 42...
... RECOMMENDATION 5: Given the importance to NGA of visualization of GEOINT data, research should be supported that investigates new methods of data representation that facilitate flexible and user-centered salient visualization. This applies both to new methods (e.g., cartographic data exploration)
From page 43...
... Given the amount of data and the complexity of the analytical tasks faced by NGA, including visualization, it would be advantageous for NGA to position itself more centrally in the field of high-performance computing and so-called grid computing. As yet, the academic GIScience community has been slow to steer the emerging body of research focused on what is now often called the "cyberinfrastructure" or grid computing toward the needs of geospatial data processing in general and GEOINT in particular.
From page 44...
... If a mathematical formula characterizes "interestingness," the identification of interesting subsets can be automated by developing software that applies the formula to raw sensor data. If interestingness cannot be captured by a computable formula, a computational model could instead be learned from sets of positive and negative examples provided by analysts, using data mining, machine learning, and/or statistical techniques.
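One hedged sketch of the example-driven alternative: a minimal nearest-centroid classifier that labels a data subset "interesting" when its feature vector lies closer to the centroid of analyst-supplied positive examples than to that of the negatives. The feature vectors and function names are hypothetical stand-ins for the far richer statistical models a real system would use.

```python
import math

def centroid(vectors):
    """Mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train_interestingness(positive, negative):
    """Learn one centroid per class from analyst-labeled feature vectors.

    Features are hypothetical (e.g., change magnitude, object density).
    """
    return centroid(positive), centroid(negative)

def is_interesting(model, features):
    """Label a new data subset by whichever centroid is nearer."""
    pos_c, neg_c = model
    return math.dist(features, pos_c) < math.dist(features, neg_c)
```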
From page 45...
... This information can be used to select, design, and implement cognitive load-aware tools so that spatiotemporal data presentation can be automated and human performance improved.

High-Performance Geospatial Computing

It is well known that parallel and distributed computing can be an effective solution to computationally expensive spatial data integration, analysis, and presentation problems (Clarke, 2003)
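As a minimal sketch of the parallel-processing idea, the snippet below fans independent map tiles out to a worker pool and combines the per-tile results. A thread pool stands in for the process pools, clusters, or grids a real high-performance deployment would use, and the per-tile "analysis" is a hypothetical placeholder.

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_tile(tile):
    """Stand-in per-tile analysis: count cells above a threshold.

    A real workload might run feature extraction or change detection.
    """
    return sum(1 for row in tile for value in row if value > 0.5)

def analyze_in_parallel(tiles, workers=4):
    """Fan the independent tiles out to a worker pool, combine results."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(analyze_tile, tiles))
```

The key precondition, as in real spatial decompositions, is that the tiles can be processed independently; cross-tile operations would require communication between workers.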
From page 46...
... In addition, it is often more expensive to transmit geospatial data (e.g., a polygon with thousands of edges) than to process it locally using filter and refining techniques.
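The filter-and-refine idea can be sketched as follows: a cheap bounding-box test filters out most candidate points, and only the survivors pay for the exact, comparatively expensive point-in-polygon test. Function names are illustrative.

```python
def bounding_box(polygon):
    """Cheap summary of a polygon for the 'filter' step."""
    xs = [x for x, _ in polygon]
    ys = [y for _, y in polygon]
    return min(xs), min(ys), max(xs), max(ys)

def in_box(point, box):
    x, y = point
    x0, y0, x1, y1 = box
    return x0 <= x <= x1 and y0 <= y <= y1

def point_in_polygon(point, polygon):
    """Exact ray-casting test, the expensive 'refine' step."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def points_in_polygon(points, polygon):
    """Filter with the bounding box, refine only the survivors."""
    box = bounding_box(polygon)
    candidates = [p for p in points if in_box(p, box)]
    return [p for p in candidates if point_in_polygon(p, polygon)]
```

The same pattern explains the transmission point in the text: shipping only the bounding box (four numbers) instead of a polygon with thousands of edges lets most queries be answered, or rejected, without moving the full geometry.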
From page 47...
... Consequently, as discussed above with respect to Recommendation 1, traditional single-sensor techniques for object recognition, feature extraction, feature tracking, and change detection will require reevaluation and redesign for multisensor environments. The importance of fusion to GEOINT research cannot be overstated, but again, research on fusion is in its relative infancy.
From page 48...
... Longer-term information fusion techniques could integrate information from heterogeneous collections of sensors that are not perfectly synchronized in space and time. Challenges include modeling of motion by dead reckoning as well as clock drifts over time, methods to synchronize spatiotemporal frameworks of different sensors, and techniques to represent and compute with spatiotemporal objects having imprecise, uncertain, or unknown positions.
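A minimal sketch of the dead-reckoning step, assuming constant velocity and a known per-sensor clock offset (both simplifications): it projects one sensor's observation onto a common reference time so that observations from unsynchronized sensors can be compared. All parameter names are hypothetical.

```python
def dead_reckon(position, velocity, obs_time, target_time, clock_drift=0.0):
    """Project an observation to a common reference time.

    position, velocity: (x, y) tuples; obs_time is read from the
    sensor's local clock; clock_drift is that clock's offset from the
    reference clock (local minus reference).
    """
    # Correct the local timestamp to reference time, then extrapolate.
    dt = target_time - (obs_time - clock_drift)
    return (position[0] + velocity[0] * dt,
            position[1] + velocity[1] * dt)
```

Real synchronization must also propagate the growing positional uncertainty of the extrapolation, which is where the imprecise and uncertain spatiotemporal objects mentioned above come in.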
From page 49...
... Important research problems in the area of archiving include advancing algorithms for geocoding and georeferencing directly from locative text and anecdotal information. Data pedigree and source should be checked on acquisition to back-trace intelligence preparation and the chronology of procedures (known as the "flow of provenance")
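A deliberately naive sketch of geocoding from locative text: scan free text for names in a small hypothetical gazetteer and return their coordinates. Real systems must additionally resolve ambiguous names, spatial prepositions ("10 km north of ..."), and transliteration variants.

```python
GAZETTEER = {
    # Hypothetical gazetteer entries: place name -> (lat, lon).
    "baghdad": (33.31, 44.37),
    "mosul": (36.34, 43.13),
}

def geocode_text(text, gazetteer=GAZETTEER):
    """Naive geocoder: match gazetteer names appearing in free text."""
    found = []
    lowered = text.lower()
    for name, coords in gazetteer.items():
        if name in lowered:
            found.append((name, coords))
    return found
```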
From page 50...
... Also in the short term, there is an immediate need for autonomous metadata creation and comparison, to determine quickly if two different data streams contain essentially the same or different content. This could involve research to determine new metadata types that summarize salient geospatial data characteristics, analogous to keywords that summarize salient characteristics in full-text databases.
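One way to picture such autonomous metadata: summarize each stream as a bounding box, a time range, and a keyword set, then declare two streams "essentially the same" when all three overlap. The record layout and threshold below are hypothetical stand-ins for richer geospatial summaries.

```python
def make_metadata(records):
    """Summarize a stream of (x, y, t, keyword) records as metadata."""
    xs, ys, ts, kws = zip(*records)
    return {
        "bbox": (min(xs), min(ys), max(xs), max(ys)),
        "time": (min(ts), max(ts)),
        "keywords": set(kws),
    }

def similar(meta_a, meta_b, keyword_overlap=0.5):
    """Cheap same-content test: overlapping footprints, times, keywords."""
    ax0, ay0, ax1, ay1 = meta_a["bbox"]
    bx0, by0, bx1, by1 = meta_b["bbox"]
    boxes_overlap = ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1
    times_overlap = (meta_a["time"][0] <= meta_b["time"][1]
                     and meta_b["time"][0] <= meta_a["time"][1])
    shared = meta_a["keywords"] & meta_b["keywords"]
    union = meta_a["keywords"] | meta_b["keywords"]
    return (boxes_overlap and times_overlap
            and len(shared) / len(union) >= keyword_overlap)
```

The comparison touches only the compact metadata, never the streams themselves, which is the point of the analogy to keywords in full-text databases.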
From page 51...
... Another hard problem relates to fusing signals across multiple sensors, discussed earlier in this report. Due to the renewed interest in sensor technology by federal agencies and industry, novel sensors and sensor networks are being developed for a variety of applications.
From page 52...
... For example, inputs from traffic cameras can be used as sensors to detect unusual vehicle activity that might indicate suicide bombing, search out simultaneous attackers in a multivehicle attack, and, using maps, rapidly alert possible target buildings that a suspect vehicle is approaching.

Promising Methods and Techniques

Evidential Reasoning over Homogeneous Spatiotemporal Frameworks

If all sensors use a common spatiotemporal framework, possibly using GPS and common triangulation algorithms, traditional evidential reasoning techniques, such as Bayes' rule, possibility theory, or Dempster-Shafer theory, can be used for data fusion (Pagac et al., 1998)
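Of the evidential-reasoning techniques named above, Dempster-Shafer theory combines evidence from two sensors via Dempster's rule, sketched below: masses on intersecting hypothesis sets multiply, and mass assigned to conflicting (empty-intersection) combinations is discarded and the remainder renormalized. The hypothesis labels in the test are invented.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments.

    m1, m2: dicts mapping frozenset hypotheses to mass in [0, 1],
    each summing to 1 over its focal elements.
    """
    combined = {}
    conflict = 0.0
    for h1, v1 in m1.items():
        for h2, v2 in m2.items():
            inter = h1 & h2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2  # incompatible evidence
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    norm = 1.0 - conflict
    return {h: v / norm for h, v in combined.items()}
```

Unlike plain Bayes' rule, the rule lets a sensor assign mass to a set of hypotheses (e.g., "vehicle or building") rather than forcing a full probability distribution, which suits sensors with partial discriminating power.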
From page 53...
... Additional challenges include modeling of geospatial uncertainty, imprecision, and scale within S-DBMS and ST-DBMS.

Semantic Interoperability

Problems of interoperability for geospatial data have often been seen as problems of integrating semantics for disparate ontologies.
From page 54...
... While the methods used in this area are simple, the research is nevertheless important and can be enhanced with advances in information technology.

Reuse and Preservation of Data

Promising methods for the reuse and preservation of data have emerged from research in geospatial databases and from the various Digital Libraries initiatives by NSF and the National Aeronautics and Space Administration.
From page 55...
... Promising methods include evidential reasoning methods over homogeneous spatiotemporal frameworks, geospatial framework synchronization, robust frameworks for geospatial computations and reasoning, semantic interoperability, toponymic services, methods for reuse and preservation of data, and database and sensor technologies to support moving objects.

SHARE WITH COALITION FORCES: INTEROPERABILITY

Hard Problems

Interoperability will be a key challenge for NGA in the coming years as it pursues its goal of sharing geospatial intelligence not only with other U.S.
From page 56...
... This ontology should have a set of object descriptions, should contain precise definitions, and should translate into a unified modeling language (UML) or other diagram suitable for adaptation into spatial data models.
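As a small illustration of turning an ontology entry (an object description with a precise definition) into a typed structure suitable for a spatial data model, consider the hypothetical class below; a real shared ontology would be expressed in OWL or UML rather than code, and the class and attribute names are invented.

```python
from dataclasses import dataclass

@dataclass
class Bridge:
    """Ontology entry rendered as a typed object description."""
    name: str
    span_m: float   # precise definition: clear span, in meters
    crosses: str    # feature the bridge crosses (e.g., a river name)

    def definition(self):
        """Render the precise natural-language definition."""
        return (f"Bridge '{self.name}': structure of span {self.span_m} m "
                f"crossing {self.crosses}")
```

The typed fields play the role of the UML attributes the text describes: once the definition is fixed, it can be mapped mechanically into a spatial data model's schema.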
From page 57...
... The first step is to determine whether intelligence needs dictate new standards, require extended existing standards, or fit within existing standards. Applied research can evaluate current geospatial data interchange standards, especially OGC, for exchanging geospatial intelligence across U.S.
From page 58...
... Prior standards efforts such as the federal Spatial Data Transfer Standard (SDTS)
From page 59...
... The committee believes that working with DHS involves many of the same issues as sharing GEOINT with coalition forces, similar to ensuring horizontal integration. Therefore, homeland security issues are supported by many of the recommendations in this report.
From page 60...
... In the past, a culture has existed of separable roles of content producer and protector. A bias toward knowledge withholding as the default case has led to an extraordinary amount of geospatial data being withheld from the potential user community.
From page 61...
... Promising Methods and Techniques

The field of image processing has developed numerous ways for encoding and selectively processing imagery. However, little work has extended these methods to multispectral and hyperspectral sources, or to other spatial data such as digital elevation models.
From page 62...
... Promising methods and techniques in

TABLE 4.1 Summary of Hard Problems

NGA Challenge (1), Achieving persistent TPED; Hard Problems and Recommendations:
- Assimilation of new, numerous, and disparate sensor networks within the TPED process (Recommendation 1)
- Spatiotemporal data mining and knowledge discovery from heterogeneous sensor data streams (Recommendation 2)
- Spatiotemporal database management systems (Recommendation 3)
(7)
From page 63...
... It has also examined promising methods, approaches, and technologies for the solutions to the hard problems. The hard research problems and associated methods are summarized in Table 4.1.


This material may be derived from roughly machine-read images, and so is provided only to facilitate research.