The Challenge: Providing Geospatial Data, Tools, and Information Where and When They Are Needed
In this era of heightened requirements for prompt and effective response, rapid access to disparate geospatial information sources is essential. As shown in Chapters 2 and 3, the emergency management community relies heavily on the ability to discover and use accurate, up-to-date information in order to respond to disasters and other emergency events. However, the necessary data are scattered among numerous agencies; there are many impediments to rapid access; the skilled personnel needed to work with the data and tools are often not available in sufficient numbers; and the technological environment is changing constantly, causing confusion. This chapter explores these and other related issues in greater depth. Each section of the chapter takes one issue, describes the problem in detail, elaborates on its significance, describes possible solutions, and, where appropriate, offers recommendations. This overview and the first three sections deal with issues that require policy changes; the next three focus on operational changes that could be made to enhance the use of geospatial data and tools; the next two sections, on tools and training, discuss changes that will produce better utilization in the future; and the final section addresses funding.
It is important to note that this study deals with the intersection of two distinct communities—the emergency response community and the geospatial community. The issues discussed may have their roots in one community or the other, but the resolution of these challenges will require both communities to work together, as reflected in the recommendations. Both are professions in their own right, and the emergency management community is often seen as conservative with regard to the adoption of new technologies, which presents a challenge. Without the support—and preferably the leadership—of the emergency management community, the geospatial data community’s own efforts will have little benefit.
The committee heard from many federal, state, and local emergency management professionals during its deliberations and during the study’s workshop, as well as from several representatives of the private sector and nongovernmental organizations (NGOs). All testified to the central importance of geospatial information. The first questions responders ask when a disaster occurs are, Where is it? Where are the victims? Where are the hazards? Where are the resources? The first request from an incident commander is often for a map, and the need is immediate. Responders must act within a “golden hour,” during which delivering victims to appropriate care providers has the best chance of saving lives.
Data on the cost savings from more effective emergency management are almost impossible to compile, in part because many benefits, such as lives saved, are impossible to value and in part because any form of controlled experiment in which costs are compared with and without effective emergency management is impossible to conduct. Nevertheless, some of the more direct cost savings can be quantified in certain limited contexts. For example, the National Governors Association Center for Best Practices published an issue brief, State Strategies for Using IT for an All-Hazards Approach to Homeland Security (July 13, 2006).1 In its section on geographic information systems (GIS), the brief includes the following paragraph:
State and local governments in Virginia combined their efforts in October 2001 to launch the Virginia Base Mapping Program (VBMP) for use in deploying resources and personnel during disasters. At an estimated cost of $8.2 million, this program began delivering DVDs [digital video discs] with GIS technology to 134 cities and counties in February 2003, providing information about transportation systems, private-sector facilities, natural resources, and many other assets. Although measures of lives saved, injuries averted, and property damage avoided are difficult to calculate, it is estimated that in its first year the VBMP saved the state between $5 million and $8 million in operating costs.
Responders and managers need to be able to work with several map layers or themes. The most important layer to them is the search grid, which must be established quickly and applied by all agencies working
on the incident. They also need to be able to locate points on the map and on the ground. While street address normally provides an easy way to do this in urban areas, it is often unsatisfactory in rural areas or when street signs and house numbers have been obliterated. The Global Positioning System (GPS) provides an effective and universal alternative, but requires that maps be overprinted with GPS coordinates, using latitude-longitude, Universal Transverse Mercator coordinates, or the proposed National Grid (the National Grid was endorsed and adopted by the Federal Geographic Data Committee in 2001).2 Further, they need to be able to map an event as it changes in real time and to print and distribute updates quickly. From an emergency management perspective, maps enable the location-specific assessment of hazard, risk, vulnerability, and damage. They are required with different levels of geographic detail throughout the emergency management cycle, from the moment an incident occurs through long-term recovery and into mitigation.
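The grid zone computation underlying such overprinted coordinates can be sketched briefly. The following Python fragment is a simplified illustration, not an operational implementation: it derives only the grid zone designator shared by the Universal Transverse Mercator system and the U.S. National Grid from a latitude-longitude position, and it omits both the map projection step needed for a full grid reference and the special zone exceptions around Norway and Svalbard.

```python
def grid_zone_designator(lat: float, lon: float) -> str:
    """Return the UTM zone number plus latitude band letter (e.g., '18S')
    that prefixes a U.S. National Grid reference.

    Simplified sketch: ignores the Norway/Svalbard zone exceptions and
    assumes -80 <= lat < 84 and -180 <= lon < 180.
    """
    zone = int((lon + 180) // 6) + 1    # 6-degree-wide UTM zones, numbered 1-60
    bands = "CDEFGHJKLMNPQRSTUVWX"      # 8-degree latitude bands; I and O omitted
    band = bands[min(int((lat + 80) // 8), len(bands) - 1)]  # band X extends to 84 N
    return f"{zone}{band}"

# Washington, D.C. (about 38.9 N, 77.0 W) falls in grid zone 18S.
print(grid_zone_designator(38.9, -77.0))   # -> 18S
```

The full National Grid reference appends easting and northing digits to this designator; operational systems perform that conversion with tested projection libraries rather than hand-rolled code.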
For most emergency events, the needed geospatial information and services for planning and response are maintained by a variety of public and private organizations in multiple jurisdictions. Government agencies are stewards of large volumes of data, most of which are held by state or local agencies. However, additional key layers, such as critical infrastructure data, are maintained by the private sector.3 As mentioned previously, estimates from the Department of Homeland Security’s Protected Critical Infrastructure Information (PCII) Program are that the private sector owns and operates 85 percent of the nation’s critical infrastructure.4 Many of these organizations are members of local utility notification centers, also referred to as “One Call” or “Call Before You Dig” agencies. However, data are shared with these and other consortia under very restrictive agreements and may not be used for any other purpose, even during emergencies.
Emergency preparedness and response require data from many sources both public and private, and critical infrastructure information is but one of many themes that must be accessed. There are also needs for property records, street centerlines, floodplain delineations, and other data that are maintained by the public sector. From an emergency preparedness and response perspective, it is critically important for all sources of data to be utilized to ensure that planners and responders have
the best possible common understanding of the operating picture. However, although many of the data that are needed by emergency managers are already developed by other organizations for other purposes in the general course of local government and community development, various issues and challenges prevent easy access to or use of these data for emergency management.
Data on the ownership of land parcels, or cadastral data, provide a particular and in some ways extreme example of the problems that currently pervade the use of geospatial data in emergency management. Vast amounts of such data exist, but they are distributed among tens of thousands of local governments, many of which have not invested in digital systems and instead maintain their land-parcel data in paper form. As with many other data types, it is not so much the existence of data that is the problem, as it is the issues associated with rapid access. In their report Parcel Data and Wildland Fire Management, Stage et al. (2005) argue that cadastral data can provide the most current and accurate information in support of emergency management, but note that access to such information can be limited by a number of factors including the following:
Data distribution agreements. In some cases, local units charge for the data or have data licensing agreements that constrict access to the information.
Data format. The data might be in a format that is not recognized or usable by responding agencies.
Local emergency responders generally have vast personal knowledge of their communities, and as a result the use of geospatial information may sometimes be seen as superfluous to their immediate needs. However, when disasters extend far beyond the boundaries of a community, when local responders are unable to respond adequately and professionals without knowledge of the area must be brought in from elsewhere, or when impacts extend to infrastructure such as underground pipes about which local responders have little personal knowledge, then geospatial data and tools become absolutely indispensable to an effective, coordinated response.
As the committee heard in testimony, geospatial data and tools are essential to all aspects of planning for disaster and to all aspects of community resilience. In this respect, the committee echoes a conclusion of an
earlier National Research Council (NRC) study: “Much of the information that underpins emergency preparedness, response, recovery, and mitigation is geospatial in nature” (NRC, 2003, p. 1). Without knowledge of where the event has occurred, the area it has impacted, the nature of impact in each part of that area, or the locations of shelters and potential responders, and without access to the tools to analyze such information and to present and distribute it in useful form, the eventual impact of the event will necessarily be greater than it need be, whether measured in loss of life, injury, damage to property, or disruption of essential activities. Also, although many of the geospatial data needed for emergency response generally have already been developed by communities for other purposes, there are a variety of issues that currently impede their use for emergency management. Therefore, steps must be taken to explicitly recognize and meet the geospatial needs of the emergency management field. As its first, overarching conclusion, the committee believes that the importance of geospatial data and tools should be recognized and integrated into all phases of emergency management and, specifically, into the national plans and policies reviewed in Section 3.3 and existing emergency management procedures.
RECOMMENDATION 1: The role of geospatial data and tools should be addressed explicitly by the responsible agency in strategic planning documents at all levels, including the National Response Plan, the National Incident Management System, the Target Capabilities List, and other pertinent plans, procedures, and policies (including future Homeland Security Presidential Directives). Geospatial procedures and plans developed for all but the smallest of emergencies should be multiagency, involving all local, state, and federal agencies and NGOs that might participate in such events.
FOCUS ON COLLABORATION
The lack of a consistent policy for collaboration, and of protocols and structures for coordination and communication, has long been an impediment to effective collaboration and to the sharing and reuse of geospatial data and tools among all levels of government. Since the early 1990s, a number of government initiatives and orders have charged federal agencies with coordinating their programs in this specific area.
In 1990 the Federal Geographic Data Committee5 (FGDC) was formed and given the lead responsibility for this coordination by an updated Office of Management and Budget (OMB) Circular A-16.6 In 1994 the FGDC was also charged by Executive Order 12906 to provide leadership in coordinating the federal government’s development of the National Spatial Data Infrastructure (NSDI) and to seek the involvement of other levels of government and sectors in this endeavor.7 Federal-level coordination has produced benefits in the development of more than 20 standards supporting the NSDI, the implementation of the NSDI Clearinghouse Network,8 the Geospatial One-Stop,9 and the emerging Geospatial Profile for the Federal Enterprise Architecture.10
State-level coordination has also produced many improvements. The National States Geographic Information Council (NSGIC) has been an effective mechanism for facilitating coordination among states.11 NSGIC’s activities have leveraged the strong geospatial programs present in a number of states to bring about improvement of coordination activities in many other states. Private-sector and professional organizations have also played important roles in facilitating coordination among various segments of the geospatial community and have likewise produced benefits for participants. However, these efforts have been confined primarily to local jurisdictions and, as such, have proven difficult to replicate across a wider spectrum.
Specific examples of effective collaboration exist in many places both across the nation and internationally. There are excellent resources already available that describe the issues involved in collaboration and suggest approaches to enhancing cooperation across jurisdictions. One such project developed by the Geospatial Information and Technology Association (GITA) is entitled GECCo (Geospatially Enabling Community Collaboration).12 Another resource is the work of the Open Geospatial Consortium as part of the Critical Infrastructure Protection Initiative (CIPI) completed in 2002 and 2003,13 and another is the work done by Emergency Management Alberta.14 In these three examples, a common principle is that agreements must be discussed, negotiated, and formalized
before an emergency situation occurs if the impacts of institutional and social barriers to interoperability are to be reduced.
Many types of agreements are needed, including the following:
Data-sharing agreements among public and private organizations
Proprietary agreements so that geospatial data can be used during emergencies without becoming part of the public domain
A predefined list of geospatial and other technical personnel and vendors required in support of a response to an event
Guidelines for sharing data with the media during and after an event
Agreement on interoperability standards to enable the on-demand access, integration, and exchange of relevant geospatial data
A process to organize, integrate, and distribute both data internal to an organization and data from other organizations
These agreements can take considerable time and energy to put in place, but if they are not established in advance, the results can be at a minimum very frustrating and at worst devastating. However, despite efforts at various levels and within sectors, collaboration between levels of government and with other sectors has been difficult to achieve. The FGDC has been seeking to carry out this role for the geospatial community; however, it has not achieved complete success, owing to a lack of authority, budget, and resources.
The FGDC’s Future Directions Initiative recently provided a high-level look at the nation’s sharing and use of geospatial information and the development of the NSDI.15 The study report finds that geospatial data and information have been identified as valuable assets in conducting the business of government. In the post-9/11 era, there is a heightened appreciation of the importance of geospatial data to support homeland security needs and other critical requirements. There is a clear sense of urgency that the problems associated with intergovernmental and intersector collaboration in geospatial data production, access, and sharing need to be resolved in a timely and comprehensive manner.
The Future Directions Initiative study team found widespread agreement that the NSDI requires strong national leadership, that all sectors should be represented in the leadership and governance process, that stable funding and political support are required, and that an effective NSDI requires a clear national strategy to complete and maintain the framework layers. The team found a broad consensus that a strong and
renewed national focus is needed to drive our country toward the production of highly accessible, accurate, and reliable geospatial data. The team believed that a national approach, incorporating all sectors, is necessary to accelerate the production of geospatial data for the NSDI and to ensure its ongoing maintenance. The increasing ubiquity of geospatial data and tools lends urgency to the need for current, complete, accurate, and nationally consistent data. The study team recommended the establishment of a new governance structure to provide national leadership in the development of the NSDI, with participation from multiple sectors.
The Committee on Planning for Catastrophe also reviewed the current governance structure of the NSDI in light of this study and discussed whether it was adequate to provide effective coordination across state, local, and federal governments and the private and not-for-profit sectors in the particular context of emergency management. The arguments and conclusions of the Future Directions Initiatives study resonated strongly with the committee, which concluded that the proposed changes in the governance structure would provide a much more effective framework for geospatial data and tools in emergency management. Moreover, the committee felt that it was desirable for the needs of emergency management to be addressed within this larger framework and that the emergency management community should be given a sufficiently strong voice to ensure that these needs are met.
A national geospatial governance process such as the one described above would do much to improve the attention given to policy and other institutional issues that make it difficult for the different levels of government and other sectors to work together effectively in the development of geospatial capabilities for emergency management. Whatever the root cause of a disaster—terrorism, natural occurrences, or accident—the methods of preparing for, responding to, recovering from, and mitigating the effects of such events, and ideally preventing reoccurrences, are based on a common approach: the collaborative and coordinated use of geospatial data and tools. This cannot happen without the many mutually dependent agencies and organizations charged with protecting our nation’s citizens and infrastructure being able to share their geospatial data and tools efficiently and effectively for emergency management purposes. Moreover the special circumstances of emergency management—the need for speed and for planning in advance without knowledge of where and when disaster will strike, and the extreme costs in damage and loss of life that may result from a bungled response—all give additional merit to arguments in favor of greater collaboration and effective governance.
The myriad individual and organizational collaboration efforts under way are doing much to resolve specific local needs and to provide a positive, dynamic environment for collaboration. Many problems and issues remain, however, and many of these successful efforts have been costly in terms of the time required to develop and maintain them. Missing is a strong, nationally focused governance process to bring the relevant and affected organizations together within the established framework of the NSDI to ensure collaborative approaches to resolving multijurisdictional and national-level issues. The kind of governance process described in the report of the FGDC Future Directions Initiative is the subject of continued discussion within the NSDI community and could significantly improve the environment for collaboration and data sharing during emergency response. The Department of Homeland Security (DHS) has been assigned responsibilities for coordinating geospatial data and tools for emergency management, as detailed in Section 3.2.2. The committee therefore recommends that DHS play a leading role in ensuring that this proposed strengthening of NSDI governance addresses the needs of emergency management.
RECOMMENDATION 2: The current system of governance of the NSDI should be strengthened to include the full range of agencies, governments, and sectors that share geospatial data and tools, in order to provide strong national leadership. DHS should play a lead role in ensuring that the special needs of emergency management for effective data sharing and collaboration are recognized as an important area of emphasis for this new governance structure.
GEOSPATIAL DATA ACCESSIBILITY
A critical requirement for emergency preparedness, response, and mitigation is to have rapid access to the most accurate, up-to-date geospatial content, whether it be current wind speed and direction, the location of hospitals, damage assessment data, or the results of predictive flood models. Emergency managers and responders need rapid and reliable access to such content on demand. However, there are numerous issues involved in meeting the challenges of this on-demand, rapid-access requirement. Whether the geospatial data are being accessed from archives or from real-time sensor feeds, the following must always be considered if we are to build a national asset not just for emergency management but also for other homeland security functions:
Are geospatial data being collected once and maintained by the organization that can do this most effectively?
Is it possible to combine geospatial data seamlessly from different sources and to share them between many users and emergency applications?
Are geospatial data available for use in emergency management, or do use conditions restrict their availability?
Is it easy to discover which geospatial data are available, to evaluate their fitness for the purpose, and to know which conditions apply to their use?
There are both policy and technology impediments to the achievement of these goals. Some of the issues deal with sharing data among organizations, since there are many reasons why a data-producing agency may be reluctant to make its data available, such as concerns related to privacy, confidentiality, or liability. Other issues are more technical in nature and are focused on the interoperability of data and the need for standards that address not only the content and labeling of data, but also real-time discovery of and access to data through clearinghouses and portals. Finally, although data may be accessible, there may be questions related to their quality. This section describes these various challenges.
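The discovery step described above can be illustrated with a minimal sketch. The catalog records and field names below are hypothetical stand-ins for the metadata served by real clearinghouses and portals; the point is only that a responder can filter holdings by theme, by spatial extent, and by use conditions before deciding whether a data set is fit for purpose.

```python
# Minimal sketch of catalog-style data discovery. Each metadata record
# (structure hypothetical, loosely modeled on FGDC-style metadata) carries
# a theme, a bounding box, a currency date, and use conditions.
from datetime import date

catalog = [
    {"title": "County parcel boundaries", "theme": "cadastral",
     "bbox": (-77.5, 38.5, -76.9, 39.1),   # (west, south, east, north)
     "updated": date(2006, 3, 1), "restrictions": "government use only"},
    {"title": "Regional hydrography", "theme": "water",
     "bbox": (-78.0, 38.0, -76.0, 40.0),
     "updated": date(2004, 7, 15), "restrictions": "none"},
]

def overlaps(a, b):
    """True if two (west, south, east, north) boxes intersect."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def discover(catalog, theme, area_of_interest):
    """Return records of the requested theme that cover the incident area,
    so a responder can evaluate currency and any conditions on use."""
    return [r for r in catalog
            if r["theme"] == theme and overlaps(r["bbox"], area_of_interest)]

incident = (-77.2, 38.8, -77.0, 39.0)
for record in discover(catalog, "cadastral", incident):
    print(record["title"], "-", record["restrictions"])
```

Real clearinghouses implement this pattern at scale through standardized catalog services; the logic of matching theme, extent, and use conditions is the same.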
The unwillingness to share geospatial data is by no means universal, and many entities make their data free and easily accessible for use by the public. Many do not, however, particularly local governments or private utility companies, where some of the most important geospatial data for emergency management often reside. There are a number of reasons for this reluctance to share data, including the following:
The desire to sell data to obtain revenue from a costly and valuable asset;
The considerable effort required to convert data into a form in which they can easily be shared, especially at the local level;
The fear that data may assist terrorists in their activities;
A basic distrust of the entity requesting the data or a basic unwillingness to cooperate;
A concern for liability if the data are improperly used or are of insufficient quality for a specific use;
The fear that once others are aware of the existence of data they may attempt to obtain access to them through freedom-of-information laws; and
The most basic fear that the organization will lose control of its data.
A workshop panelist told the committee that government agencies were much more willing to share data for homeland security than for other purposes, but were adamant in many cases that access be restricted to that purpose. In some cases, agencies went so far as to agree to forward certain data that they deemed sensitive only after an incident had occurred.
Currently, policies for geospatial data sharing within specific levels of government are set by their executive branches. The policies developed for each level of government vary, and enforcement varies within each level from department to department (Sidebars 4.1 and 4.2 contrast two different approaches to policy formulation). Almost none of the policies set for one level of government are imposed on another level, and many local governments have no policies for sharing geospatial data at all. It is impractical to expect that the data-sharing policies of all government entities will be the same. However, it is reasonable to expect that all government and private entities have clearly defined data-sharing policies and guidelines, especially for data relevant to emergency management.
As the committee heard in testimony from many individuals and agencies, the lack of such policies and guidelines results in confusion for data custodians and becomes a nightmare for those wishing to acquire data on a large-scale basis either before or during an event. A significant amount of time and staff effort is required to investigate each data owner’s issues and policies and to keep abreast of changes in these policies. For example, New York State has been aggressive in collecting and maintaining geospatial data since the events of September 11, 2001. A representative of New York State’s Office of Cyber Security and Critical Infrastructure Coordination reported to the committee that it has had a team assigned to geospatial data collection and maintenance for homeland security and emergency response since 2002 and has collected more than 850 sets of geospatial data. However, it was noted that this involved significant effort because each government entity required personal contact to discuss how its data sets would be used and where they would be stored. In one case the office had been unsuccessful in negotiating for certain utility data from a federal agency. In another case, more than two years of effort were required to obtain the use of a local government’s parcel data. Data produced by federally funded research and development centers (FFRDCs) and GOCO (government-owned, contractor-operated) national assets are often not included in data-sharing agreements between government agencies. This presents additional barriers to effective data sharing.
Utility companies, in particular, have been viewed as organizations that create and compile sophisticated databases but will not readily share these data with others except in emergency situations. The first concern of
SIDEBAR 4.1 The Case for Mandatory Data Sharing
The National Pipeline Mapping System
As a means of creating a single source of information, the U.S. Department of Transportation (USDOT) requires all transmission pipeline operators to provide data on an annual basis to the National Pipeline Mapping System (NPMS).a Section 15 of the Pipeline Safety Improvement Act of 2002 required that operators provide “geospatial data appropriate for use in the National Pipeline Mapping System or data in a format that can be readily converted to geospatial data,” together with other information on pipeline operations. The NPMS enforces strict mapping and metadata standards that must be followed by all operators. Its intent is to provide a common national database depicting the location of all pipeline networks and related attribute information that can be accessed whenever needed.
The terrorist attacks of September 11, 2001, raised additional security concerns about the U.S. pipeline infrastructure; therefore, access to the NPMS is restricted to federal, state, and local government agencies (including emergency responders). Participating pipeline operators are able to view only their own data on-line from the USDOT’s web site and cannot view data from any other operator. Operators who do not comply by the annual mid-June deadline are subject to a minimum $1 million penalty that increases the longer the operator remains noncompliant. This action, however drastic it may seem, is one example of how data sharing can be made mandatory rather than voluntary.
The NPMS standards are a good example of program-oriented specifications from a federal agency, providing clear guidance on how to prepare and submit data. If they could be extended to bring them into line with the more general geospatial data standards of the International Organization for Standardization (ISO),b the Open Geospatial Consortium (OGC),c and the FGDC,d it would be easier to integrate NPMS data with other geospatial data in emergency management operations as part of a national fabric of critical infrastructure information within the framework of the NSDI.
SIDEBAR 4.2 The Case for Voluntary Data Sharing
The 50 States Initiative
This project seeks to support the NSDI by coordinating data-sharing methodologies and standards at the local level, providing a common framework in which all states can participate.
The effort is coordinated by the National States Geographic Information Council (NSGIC), and plans to establish 50 state coordinating councils that will contribute routinely to the governance of the NSDI.
Statewide councils will bring consistency to the NSDI by
many utility companies is that their data will become part of the public domain and be available to anyone. From a utility’s perspective, this raises serious liability issues. For example, if an individual or organization undertook an excavation using publicly released information about underground facilities that proved to be out of date and therefore inaccurate, the utility company could be liable for any damages incurred. The risk is very real, especially if the proper procedure of submitting a facility location request to the local utility notification center at least 48 hours in advance of any excavation was not followed. Therefore, adequate measures to ensure that these data are exempt from disclosure under the Freedom of Information Act (FOIA) are a basic requirement. Second, utility companies have spent a great deal of time and money building sophisticated databases to track facility information accurately; these databases are consequently seen, and treated, as corporate assets. The prospect of sharing these data without the protection of a proprietary agreement is therefore often not an option. These two barriers must be addressed if data-sharing and collaboration initiatives are to prosper, perhaps through the creation of suitable incentives for data owners to participate.
However, it is important to note that data sharing does not necessarily mean sharing one’s data in their entirety, but rather can be limited to key data elements (e.g., commodity, location, size, material, ownership— to name a few). Portions of an organization’s data may be proprietary, confidential, or sensitive or may require protection as intellectual property, whereas other portions may be suitable for limited or full data sharing. Where there are legitimate reasons to protect portions of a data set, identifying the critical data elements that are relevant in responding to and planning for emergency events is particularly appropriate. Organizations vary in the level of geographic detail of their databases and in the complexity of the attributes that are maintained to meet regulatory and internal needs. Only a subset of attributes may be needed for emergency management, and lower levels of geographic detail may also be sufficient. Data sharing may be much more palatable to data owners if it involves only subsets of attributes or coarser levels of geographic detail.
As defined in Section 1.3.2, interoperability is about the ability of two or more systems to share data and tools effectively and seamlessly, independent of location, data models, technology platform, terminologies, and so forth. To achieve true interoperability (or the effective sharing of geospatial data and tools), many levels and aspects of interoperability
must be understood. The most obvious, and perhaps the easiest to solve, is technical interoperability, which is concerned primarily with issues of format. More problematic is semantic interoperability, which addresses and overcomes differences in concepts and the meaning given to data by different users and systems. These core semantic differences are reflected in the selection and definitions of technical terms used in publications, communications, and databases. (Institutional, human, and political issues that make it difficult for individuals and organizations to work together and lack of legal interoperability due to inconsistencies between the legal contexts in which different individuals and organizations operate are discussed in other sections of this chapter.) Achieving effective interoperability for emergency management may require radical changes to the ways in which organizations work, especially to their attitudes about information. The following material expands on each of these sources of interoperability problems.
This is the “nuts and bolts” of software and hardware interoperability, where the work of standards organizations can best be leveraged. Technical interoperability is typically achieved by selecting and implementing the appropriate software and Internet standards, common content encodings for transmission, and so forth. Within an enterprise, technical interoperability is quite often the easiest form of interoperability to achieve for any given business process. Yet, in cases where different data formats are encountered in the field during response, it can still cause significant delays in developing useful geospatial products.
Geospatial data represent an extremely rich conceptual domain that requires special attention, perhaps more so than any other type of data. The enormous variety of ways of encoding geospatial data and the large number of classification schemes, vocabularies, terms, thesauri, and data definitions in use by data-producing agencies make it particularly challenging to process requests for geospatial information. Within any organization seeking to integrate geospatial data, it is vitally important that there is agreement on the proper definition and use of metadata. In principle, proper metadata can provide the foundation for semantic interoperability, by defining the meaning of each of the terms that underlie the data production process. In reality, however, the difficulties of overcoming differences of culture, language, and discipline may be far beyond the capacity of current metadata standards and practices. Efforts to
map terminologies across domains have proved challenging, but solutions to this challenge are critical to ensuring accuracy and efficiency in data sharing.
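A minimal sketch of such a terminology crosswalk, with invented status vocabularies for two hypothetical agencies mapped onto a shared vocabulary, might look like this in Python:

```python
# Hypothetical crosswalks between two agencies' road-status vocabularies
# and a shared vocabulary; all terms here are invented for illustration.
AGENCY_A_TO_COMMON = {"rd_closed": "road/closed", "rd_open": "road/open"}
AGENCY_B_TO_COMMON = {"IMPASSABLE": "road/closed", "CLEAR": "road/open"}

def harmonize(records, crosswalk, field="status"):
    """Translate a source vocabulary into the shared one; flag unknowns.

    Terms missing from the crosswalk are marked rather than guessed,
    since silent mistranslation is worse than an explicit gap.
    """
    out = []
    for rec in records:
        rec = dict(rec)  # avoid mutating the caller's data
        rec[field] = crosswalk.get(rec[field], "unmapped:" + rec[field])
        out.append(rec)
    return out

merged = (harmonize([{"road": "I-10", "status": "rd_closed"}],
                    AGENCY_A_TO_COMMON)
          + harmonize([{"road": "US-90", "status": "CLEAR"}],
                      AGENCY_B_TO_COMMON))
```

Even this toy example shows why the mapping must be curated by people who understand both vocabularies: the crosswalk itself encodes the semantic agreement.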
Issues related to interoperability are often addressed through the establishment of standards. For example, the issue of communication of data is being addressed by the Organization for the Advancement of Structured Information Standards (OASIS), which is creating the Emergency Data eXchange Language (EDXL) standard that will be incorporated in the National Information Exchange Model (NIEM). OASIS is a well-recognized standards organization that works on e-business standards, many of which are required in the emergency management world. It therefore maintains an Emergency Management Technical Committee with representatives from many organizations, such as DHS, Warning Systems Inc., the Capital Wireless Integrated Network (CAPWIN), the Open Geospatial Consortium (OGC), Environmental Systems Research Institute (ESRI), and many others. EDXL was initially an outgrowth of work at the Department of Justice and will be a multipart standard. The first part to be approved by the OASIS membership is EDXL-DE, where DE stands for Distribution Element. The primary purpose of the Distribution Element is to facilitate the routing of any properly formatted Extensible Markup Language (XML) emergency message to recipients. The Distribution Element may be thought of as a “container”: it provides the information needed to route “payload” message sets (such as alerts or resource messages), including key routing information such as distribution type, geography, incident, and sender-recipient IDs. There are very few implementations of this new standard as yet.
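The “container” idea can be sketched in code. The Python fragment below wraps an arbitrary XML payload in a simplified EDXL-DE-style envelope; the element names approximate concepts in the published standard, but the sender ID, distribution ID, and payload are invented, and the result omits the official namespace and mandatory elements, so it is not a schema-conforming instance.

```python
import xml.etree.ElementTree as ET

def build_distribution_element(sender_id, distribution_type, payload_xml):
    """Wrap an XML payload in a simplified EDXL-DE-style envelope.

    Element names approximate the EDXL-DE concepts (sender, distribution
    type, content object); a conforming instance requires the official
    namespace and several additional mandatory elements.
    """
    root = ET.Element("EDXLDistribution")
    ET.SubElement(root, "senderID").text = sender_id
    ET.SubElement(root, "distributionID").text = "dist-001"  # hypothetical ID
    ET.SubElement(root, "distributionType").text = distribution_type
    # The payload (e.g., an alert or resource message) travels inside
    # a content object, untouched by the routing layer.
    content = ET.SubElement(root, "contentObject")
    xml_content = ET.SubElement(content, "xmlContent")
    xml_content.append(ET.fromstring(payload_xml))
    return ET.tostring(root, encoding="unicode")

envelope = build_distribution_element(
    "eoc@example.gov", "Report", "<alert>Road closure on Route 9</alert>"
)
```

The routing infrastructure inspects only the envelope; the payload remains opaque, which is what lets one distribution mechanism carry many different message types.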
There is also a need for standards that address not only the content or communications of data, but also the real-time discovery of and access to data through clearinghouses and portals. Part of this solution is agreement on best practices regarding the standards of content and service interfaces that can best help achieve these goals and meet the requirement of providing geospatial content to emergency managers and personnel when and where it is needed.
Examples of how other countries are defining best practices for on-demand access to geospatial data can be seen in various e-government and spatial data infrastructure activities in the European Community,16 the United Kingdom,17 Germany,18 and New Zealand.19 All of these
activities include a variety of geospatial standards using consistent, standards-based implementation architectures.
The committee heard many comments to the effect that use of the most accurate and up-to-date framework and foundation data is essential to successful response and recovery. These data almost always reside at the local government level or with the private sector; at the local government level, they are generally the by-product of routine business processes such as the creation of parcel data for property taxation. Because they serve everyday business needs, they are kept up-to-date and represent the most accurate data available. Furthermore, local governments, which are usually the first responders to emergencies, are the most familiar with these data and understand their strengths and weaknesses (the importance of education in issues of geospatial data quality and uncertainty is addressed in Section 4.8).
The use of geospatial data during the 9/11 recovery efforts was much more effective because the local geospatial professionals working with these data were familiar with them and understood their complexities and high level of accuracy. The committee heard that when federal agencies arrived on-site with their less accurate national data, they realized the benefits of the more accurate local data and converted to them after several days to improve overall coordination. During large disasters, the use of the same framework and foundation data by responders in various parts of the country is vital for close coordination.
Some municipalities and states have aggressively pursued the gathering and improvement of data needed to respond to emergencies, but many others have not. Unfortunately, instead of coordinated multiagency efforts to organize data, assist in improving their accuracy, and create new data to fill gaps, the committee heard about many redundant efforts to gather, recreate, or purchase similar data at various levels of government. This results in significant amounts of funding being needlessly spent and time and effort wasted. As a result, an opportunity to focus on improving the original local government data and reusing them through all levels of government is often missed.
Furthermore, geospatial information is only as good as the available data. The lack of established quality assurance and control (QA/QC) standards and testing processes results in data inconsistency and inaccuracies, which can negatively impact analyses used by decision makers. Determining whether differences in analyses and recommendations are the result of inconsistent or inaccurate data is extremely time-consuming and may be difficult to accomplish in a time-critical response environment.
As Sections 4.2.1-4.2.3 have shown, current arrangements for geospatial data access in support of emergency management resemble a complex patchwork that is time-consuming to establish and maintain and confusing for all involved. There is little agreement or consistency in such technical issues as the formats that will be used, the locations where data will be made available, the security mechanisms that will prevent unauthorized access, or the architecture of the servers. This patchwork makes it difficult for agencies to acquire the data needed to prepare for, respond to, recover from, and mitigate the effects of disasters; makes it difficult to identify gaps in data coverage or to address problems related to data quality; and is one factor among many in determining eventual costs, injuries, and possibly even loss of life. With some exceptions, there are both confusion in the legislative context and an acute lack of consistency across levels of government.
While a few data custodians prefer to not provide their data until after an event has occurred, presentations to the committee made it clear that access is needed prior to an event during the preparedness phase in support of training and planning, and because pressures during the response phase are so great that the process of acquiring data would delay other essential activities.
The level of interoperability necessary to enable systems to exchange and use data and tools, without special effort on the part of the user, can be achieved only through the adoption of policies stressing the importance of interoperability and requiring standards-based software, terminologies, and communications in all emergency management-related activities. While the focus of research and standards development is often on technical and semantic issues, interoperability is a multidimensional problem, with institutional, political, social, and legal ramifications. Indeed, the 9/11 Commission in its final report (National Commission on Terrorist Attacks Upon the United States, 2004, p. 418) concluded:
Recommendation: The President should lead the government-wide effort to bring the major national security institutions into the information revolution. He should coordinate the resolution of the legal, policy, and technical issues across agencies to create a “trusted information network.”
The National Spatial Data Infrastructure provides an excellent template for the development and sharing of geospatial data. It provides standards for the documentation of data resources (geospatial metadata standard); a network and procedures for posting, discovering, and accessing geospatial data (Geospatial One-Stop Portal and NSDI Clearinghouse Network); and base layers of framework data (National Map). While much
progress has been made in implementing the NSDI, many agencies and organizations are not yet managing their geospatial data resources effectively or participating fully in the NSDI. Data description through metadata is often insufficient to support effective discovery, and conflicts exist between the metadata requirements of NSDI and other programs such as the National Pipeline Mapping System (NPMS) and the National Incident Management System (NIMS) Integration Center (NIC). A new effort to develop the necessary policies and guidelines for the support of emergency management, led by DHS but within the framework of the NSDI, would strengthen the efforts of both DHS and the NSDI and bring about more consistency. The recent work of DHS to develop the geospatial data model in conjunction with the FGDC is an example of how this could work and should be extended even further. Such a new initiative will likely require strong backing, in the form of a directive or even legislation, if it is to be effective. The Emergency Management Accreditation Program (EMAP), which supports the continued development of standards for emergency management, could be of great help in developing the needed standards, and the National Emergency Managers Association (NEMA) and the International Association of Emergency Managers (IAEM) also would be critical in their development, but even more so in helping to ensure that standards are adopted and disseminated to the emergency management community.
RECOMMENDATION 3: A new effort should be established, within the framework of the NSDI and its governance structure and led by DHS, to develop policies and guidelines that address the sharing of geospatial data in support of all phases of emergency management. These policies and guidelines should define the conditions under which each type of data should be shared, the roles and responsibilities of each participating organization, data quality requirements, and the interoperability requirements that should be implemented to facilitate sharing.
GEOSPATIAL DATA SECURITY
The security of the data that are gathered for emergency management must be examined from a variety of perspectives. To begin with, there is the need to determine the actual risk to the nation should these data fall into the hands of terrorists or others with harmful intentions. A report20 by the RAND National Defense Research Institute entitled “Mapping the
Risks: Assessing the Homeland Security Implications of Publicly Available Geospatial Information” (Baker et al., 2004) found that fewer than 1 percent of federal data sets are unique and 94 percent of the data sets would not be useful to terrorists.
While this study clearly demonstrates the limited risk of making such data widely available, it has not overcome the real fear exhibited by some agencies about open access to their data, particularly in law enforcement, defense, and local government. Whether or not the perception is justified, it is real. Data that are not secure may be subject to intentional tampering as well as inadvertent corruption. Thus, many agencies believe that data shared for emergency management must be held securely, and if security cannot be guaranteed, many data custodians will be unwilling to provide their data. The committee heard from a panelist from New York State that efforts to collect geospatial data for homeland security purposes were more likely to succeed if security measures were guaranteed to the data custodian. If data are shared and security is subsequently breached, the resulting publicity could damage the agency’s long-term effectiveness and erode its political support.
On the other hand, emergency planners at all levels must be aware that needless restrictions on access to geospatial data can lead to skepticism, if not open hostility, among local officials, the media, and the public. When restricted access reflects understandable agreements with private-sector data holders, the public needs to be informed about the reasons for these limitations and to be assured that key data will be available to those who need them should an emergency occur. Similarly, an informed public needs to appreciate restrictions on the release of unique, clearly sensitive, publicly held data involving critical infrastructure or hazardous materials. However, while narrow, sensible restrictions can inspire confidence and cooperation, overly broad restrictions on access, particularly when equivalent data are readily available through other channels, are unnecessary, as discussed in the RAND report cited above. Equally unwarranted are limitations on publicly collected geospatial data, which should be available to emergency responders and the public, and should include accurate metadata with a conscientious assessment of usability or strong, clear caveats regarding appropriate use.
The issue of security should be addressed whenever agreements are made for the sharing of data. Protection of confidentiality is one major source of concern in such agreements, because data about individuals and property that are useful during emergencies may be perceived as invasions of privacy at other times, when they might be used by criminals, for example. Appendix B presents a possible model for a confidentiality agreement that could be incorporated into negotiations over data sharing where appropriate.
It is important to note that some data elements used for emergency management may contain highly sensitive information (see Sidebar 4.1, for example), while other data elements may already be in the public domain. There is a need to be able to extract elements useful to emergency management without compromising security or criminal investigations. A system of security levels could be established, with appropriate rules governing access at each level, and applied to entire data sets, specific features in data sets, or specific attributes of those features as appropriate.
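One possible shape for such a system of security levels, sketched in Python with invented data sets and attribute names, assigns a default level to each data set and allows per-attribute overrides for the most sensitive fields:

```python
from enum import IntEnum

class Level(IntEnum):
    """Illustrative clearance tiers; a real scheme would be set by policy."""
    PUBLIC = 0
    RESTRICTED = 1
    SENSITIVE = 2

# Hypothetical classification: a default level per data set, with
# per-attribute overrides for especially sensitive fields.
DATASET_LEVEL = {"hydrants": Level.PUBLIC, "pipelines": Level.RESTRICTED}
ATTRIBUTE_LEVEL = {("pipelines", "valve_codes"): Level.SENSITIVE}

def visible_attributes(dataset, attributes, clearance):
    """Return the attributes a user with the given clearance may see.

    Unknown data sets default to SENSITIVE, so the scheme fails closed.
    """
    base = DATASET_LEVEL.get(dataset, Level.SENSITIVE)
    if clearance < base:
        return []  # no access to the data set at all
    return [a for a in attributes
            if clearance >= ATTRIBUTE_LEVEL.get((dataset, a), base)]

allowed = visible_attributes("pipelines",
                             ["diameter", "valve_codes"],
                             Level.RESTRICTED)
```

Here a RESTRICTED user sees the pipeline diameter but not the valve codes, illustrating how rules can apply to entire data sets, specific features, or specific attributes.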
Once agreements are in place or guidelines have been established about the security of data that are shared for use in emergency response, methods must be implemented to carry out secure access to the data effectively. One model mentioned earlier is the National Pipeline Mapping System, which acts as a secure source of pipeline information. Another potential technique would be to utilize the role that One-Call centers play in managing underground critical infrastructure data from a variety of data sources. As mentioned earlier in this chapter, One-Call centers act as secure repositories of local utility information, which can be called upon to mark the locations of underground utilities before excavation work is done. Since these types of organizations have already established relationships with utility data owners and receive facility updates at scheduled intervals, they possess the means to potentially serve as a “one-stop” source for facility data. Such an approach would alleviate the need for data owners to provide their data to multiple agencies and would leverage existing processes that are already in place in the majority of regions throughout the United States. Although participation in a One-Call program is currently voluntary, the ability to participate in a data-sharing program to support national disaster recovery initiatives may serve as incentive to participate. In a recent example of an effort to address this issue, as mentioned earlier, the U.S. Geological Survey (USGS) in partnership with the Department of Homeland Security and the National Geospatial-Intelligence Agency (NGA) has implemented a central “GIS for the Gulf” database within the framework of the Geospatial One-Stop. When a federal emergency is declared, a user name and password will be distributed to all government agencies within the affected area, allowing them to access an extensive collection of geospatial data. 
Mechanisms also exist within this framework for agencies to contribute their own data, which will be available to others during emergencies under the same constraints.
Data security is a key element of any data-sharing effort in support of emergency management, and a mechanism that will protect and reassure
the suppliers of data is therefore a core requirement. Guidelines defining appropriate levels of security for various kinds of data needed for emergency response have to be established and implemented. Again, the emergency management professional organizations such as NEMA and IAEM could be instrumental in helping support the development and adoption of guidelines.
RECOMMENDATION 4: DHS should lead, within the framework of the NSDI, the development of a nationally coordinated set of security requirements for data to be shared for emergency preparedness and response. All organizations should implement these guidelines for all data shared in support of emergency management and should use them where necessary to restrict access to appropriately authorized personnel. In concert with these efforts, the leveraging of existing organizations that could potentially serve as a “clearinghouse” for critical infrastructure data should be explored.
Disasters raise immediate questions about geographic extent or footprint, and about the intensity of impact within the footprint. Emergency responders at all levels need to know not only the areas affected and the severity of damage but also the locations of any people who might require timely rescue or immediate evacuation. When a disaster damages critical parts of the telecommunications or other infrastructures or when calls originating within and outside the area overload circuits, severely injured people might not be able to summon help. Also, because a disaster with a wide footprint readily overwhelms local first responders, agencies outside the region need to know the condition of the transport system, including places where rescue helicopters and other aircraft might safely land. Aerial surveillance of comparatively small sites is also beneficial, especially when wreckage or complex terrain thwarts line-of-sight observation from the ground. Existing geospatial data might describe the road network and pinpoint special-needs populations, but these are baseline data prior to the event, and effective emergency response requires a broad, up-to-date depiction of the devastation. However valuable, isolated reports from victims and rescue teams will not initially provide as complete a picture as images from an aircraft or a remote-sensing satellite with a high-resolution sensor.
Aerial images can expedite disaster response and recovery if they meet three requirements: (1) a strategically positioned platform collecting imagery at the right place and time, (2) an imaging system capable of revealing sufficient geographic detail, and (3) skilled interpreters.
Because the timing of the disaster and the type of damage affect all three considerations, there are no one-size-fits-all solutions. For instance, a disastrous seismic event occurring in early morning a few hundred miles from an airport with reconnaissance aircraft and knowledgeable image specialists allows for timely assessment with photographic or imaging systems relying on reflected solar radiation. By contrast, overhead imaging is likely to be thwarted by an earthquake occurring late on a winter evening, with darkness approaching and a long night ahead. Many imaging satellites pass over only near local noon, at intervals that may be as long as two weeks. Delayed imaging is also inevitable when a severe, slow-moving tropical storm makes low-altitude flying dangerous and creates heavy cloud cover beneath comparatively safe, high-altitude flight paths. Even so, timely aerial coverage after the storm dies out or moves on is highly valuable to officials orchestrating response and recovery.
Four imaging platforms that are potentially useful during or shortly after an event are fixed-wing aircraft, remote-sensing satellites, helicopters, and unmanned aerial vehicles (UAVs). Examination of their diverse applications and limitations reveals significant complementarity. All four platforms are developing rapidly, particularly in those aspects that can potentially benefit emergency management, so they are discussed in some detail from that perspective in the following paragraphs.
The fixed-wing airplane is the most common platform for aerial imaging and mapping. Whether propeller driven or jet powered, conventional aircraft can reach a disaster scene within an hour or two from a base several hundred miles away and provide generally thorough coverage of a scene several hundred square miles in extent. Of particular interest are technological improvements that enable some level of rectification of the imagery (processing of the imagery so that the location of objects in the image is more nearly the same as their actual location on the ground) while the aircraft is in flight and telemetering of the imagery (rectified or not) from the plane to a receiving station on the ground, processes that can potentially reduce the time between image acquisition and delivery to responders by 24 to 48 hours. If turbulence is minimal, low- or medium-altitude aerial imagery might be acquired for multiple, slightly overlapping flight lines. When winds or unstable air make flying hazardous or preclude accurate imaging, aircraft with high-resolution photographic systems or multispectral scanners can capture suitable images from higher, safer altitudes. Cloudy skies are another obstacle, but radar-based imaging systems designed to penetrate cloud cover may also be useful at night depending on the ground cover and building density. When atmospheric conditions preclude conventional approaches and the need is urgent, use of high-altitude aircraft and satellite platforms could be crucial despite the inevitable loss of resolution.
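The simplest form of rectification maps pixel coordinates to ground coordinates with an affine transform, as in a standard “world file.” The Python sketch below illustrates the arithmetic; the transform coefficients (pixel size, corner coordinates) are invented for illustration and ignore terrain relief, which full orthorectification would also correct.

```python
def pixel_to_ground(col, row, transform):
    """Apply a world-file-style affine transform to a pixel coordinate.

    transform = (a, b, c, d, e, f) maps pixel (col, row) to ground (x, y):
        x = a*col + b*row + c
        y = d*col + e*row + f
    The b and d terms handle image rotation; they are zero for a
    north-up image.
    """
    a, b, c, d, e, f = transform
    return (a * col + b * row + c, d * col + e * row + f)

# Assumed example: 0.5 m pixels, north-up image whose upper-left
# corner sits at ground coordinates (300000, 4510000); the negative
# e term reflects that row numbers increase downward.
t = (0.5, 0.0, 300000.0, 0.0, -0.5, 4510000.0)
x, y = pixel_to_ground(1000, 2000, t)
# x = 300500.0, y = 4509000.0
```

In-flight rectification amounts to estimating these coefficients from the aircraft’s GPS position and attitude sensors, so that imagery arrives on the ground already registered to the map base responders are using.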
Emergency response can also benefit from the improved resolution of imaging systems authorized for and available from commercial remote-sensing satellites in low-Earth orbit (altitude of roughly 400 to 800 kilometers). Commercial remote-sensing firms offer panchromatic, color-infrared, true-color, and multispectral satellite imagery with resolutions of 1 meter (3.3 feet) or better. Off-nadir viewing, with the scanner directed sideways rather than straight down, has substantially reduced revisit times, which are significantly shorter than the 18-day cycle for the comparatively coarse imagery collected by Landsat-1, the pioneering civilian remote-sensing satellite launched in 1972, and may be as little as three days. Further improvements are likely, with private-sector imaging firms planning to launch “next-generation” satellites before the end of the decade that will collect imagery with even greater geographic detail, allowing objects as small as half a meter or less to be detected. As the level of detail improves and the number of commercial satellites increases, the remote-sensing industry has the potential to become an increasingly significant source of geospatial data for emergency managers.
By contrast, helicopters afford a much longer “dwell time” than either remote-sensing satellites, which must race along their orbital paths at thousands of kilometers per hour, or fixed-wing aircraft, which cover large areas systematically with carefully configured parallel flight lines but cannot hover over a specific site. Although helicopters can be equipped with aerial cameras and other imaging systems, photogrammetric firms prefer fixed-wing aircraft, which are more cost-efficient for mapping wide areas. Municipal and state governments that own helicopters typically use them only for rescue and law enforcement because commercial photogrammetric surveying is appreciably less expensive for routine mapping. Moreover, police agencies have been reluctant to conduct systematic aerial imaging since 2001, when the Supreme Court, in a strongly worded decision, warned against unconstitutional warrantless searches (Kyllo v. United States). While it is unlikely that helicopters equipped for medical evacuation could be diverted to imaging, other state and municipal helicopters might usefully be equipped with video imaging systems, which could be linked to an emergency operations center. In Los Angeles and other large municipalities where television broadcasters use helicopters for traffic reporting and news coverage, the private sector could be an important partner in real-time overhead imaging.
UAVs, also called drones, can be useful as well, especially during a radiological emergency, when low-altitude surveillance might imperil human pilots. UAVs appropriate for real-time overhead surveillance vary in size and payload. At one extreme are larger versions of the comparatively inexpensive, remote-controlled model aircraft used by hobbyists. Equipped with a small, ultralightweight video camera, a model airplane
could record the scene below on magnetic media. Time aloft is limited, and the operator would have to keep the craft in view to orchestrate a safe landing or trigger a retrievable drop. A UAV with a slightly larger payload might carry more fuel for a longer range or a battery and transmitter for real-time video transmission. GPS could improve navigation and control, and a gyro stabilizer could allow more accurate imaging (see, for example, Oshman and Isakow, 1999). The military, which recognizes the importance of “over-the-hill” surveillance, has been experimenting with mini-UAVs for more than a decade.21 By contrast, the Air Force’s sophisticated high-altitude endurance class Global Hawk, which can attain an altitude of 65,000 feet and keep a payload of 1,960 pounds aloft for 42 hours, can be operated from a base hundreds of miles away.22 Less expensive but no less relevant to emergency response imaging is the Air Force’s medium-altitude endurance class Predator, which can fly above 40,000 feet and sustain 29 hours of flight with a 700-pound payload. UAVs equipped for communication with a satellite can operate hundreds of miles from their base. A smaller, less sophisticated military system is the joint tactical class Hunter, designed for real-time aerial surveillance with a range of up to 200 kilometers, a maximum altitude of 15,000 feet, and 12 hours of flight with a 200-pound payload. Deployment of UAVs at military bases around the country raises the possibility of timely imaging support with flexible UAV platforms and trained interpreters.
Efficient use of overhead imagery during an emergency will depend on the availability of trained personnel at the emergency operations center itself or at a remote site in direct contact with the center. Although identification of obvious obstacles such as a fallen bridge or blocked highway requires little expertise, trained interpreters are needed to recognize seriously weakened structures or signs of trapped occupants, both of which might require timely inspection by responders in the field. Communications are especially important because field personnel, especially those who know the area well, will have knowledge of significant benefit to experienced interpreters at a remote site. Because careful before-and-after comparison is a key strategy in image interpretation, baseline imagery acquired before the event is absolutely essential for change detection as well as for identification of trouble spots on which real-time video surveillance by helicopter or mini-UAV should be focused. Overhead imagery is also valuable for developing response plans, planning and carrying out training exercises, and planning mitigation efforts.
For a description of the five-pound Dragon Eye used by the Marine Corps for over-the-hill surveillance, see http://www.globalsecurity.org/intell/systems/dragon-eye.htm.
There are several issues related to the use of overhead imagery for emergency management. First, there is a critical need for rapid data acquisition during emergency situations, and these temporal requirements often cannot be met, especially for data gathered using remote sensing. Issues of availability and level of geographic detail will continue to preclude the use of remote-sensing imagery in the immediate response phase of the disaster. However, the use of remote-sensing products for the preparedness and mitigation phases, when speed is not as critical, is also important. Three types of barriers inhibit the fuller and more immediate access to overhead imaging systems needed for emergency response and recovery. Contractual barriers exist where private-sector service providers are reluctant to commit staff and resources without guaranteed compensation. Contingency contracts available to municipal and state governments and specifying imaging requirements, payment schedules, and various options and fees can ensure the prompt cooperation of private photogrammetric mapping firms. Statutory barriers include restrictions, real or perceived, on cooperation between different levels of government or between civilian and military agencies. Federal agencies, on the other hand, may have indefinite delivery-indefinite quantity (IDIQ) contracts in place prior to an event that can permit very rapid development of mission assignments for image acquisition. Regulatory barriers include Federal Aviation Administration (FAA) restrictions on UAVs, which could interfere with other air traffic. Demonstrations of the reliability of UAVs and procedures for monitoring their use and licensing operators might overcome the FAA’s understandable concerns, especially in times of emergency when special rules are needed for civilian aircraft (aware of the potential value of UAVs, the FAA is currently developing new policies, procedures, and approval processes for their regulation23). 
Another general issue brought to the committee’s attention has been the lack of a lead federal agency to coordinate the procurement of imagery during response. This has led to duplication of effort and confusion among the various private-industry vendors contracted to acquire imagery; these problems are currently being addressed by the Federal Emergency Management Agency (FEMA) and DHS, with USGS and NGA as the lead agencies.
Based on the experiences of committee members and workshop participants, the committee believes that overcoming barriers to the rapid acquisition of high-resolution imagery, particularly through contingency
contracts, is crucial to its timely and effective use in disaster response. Remotely sensed imagery should be available for all phases of disaster management, and its acquisition should be better coordinated among federal agencies and enabled through the development of IDIQ contracts and through partnerships with federal agencies having such contracts. Advantage should be taken of advances currently being made in remote-sensing technology using the full range of fixed-wing, helicopter, UAV, and satellite platforms.
RECOMMENDATION 5: Standing contracts and other procurement mechanisms should be put in place at local, regional, and national levels by the responsible agencies to permit state and local emergency managers to acquire overhead imagery and other types of event-related geospatial data rapidly during disasters.
COMMUNICATION OF REPORTS TO AND FROM THE FIELD
As noted and documented in previous sections of this report, data and information are critical to the responders and emergency managers dealing with a disaster. The committee heard that information may be known by first responders on the front line but not known by managers in the command post or the emergency operations center (EOC) or by other agencies. Cultural differences between various groups of responders, such as police and fire organizations, can also inhibit communication of information. This information may include such things as knowledge of road closures and inundated areas, specific information about damage to infrastructure, or the location of disaster victims trapped in homes or other buildings. Inability to communicate this information between responders, and between responders and managers, can delay critical action and add unnecessarily to loss of life, personal injury, and property damage.
Flow of information to management groups at higher levels within the responding agencies and the sharing of data with FEMA and other agencies are essential components of the response. The National Response Plan’s Emergency Support Function (ESF) 5, discussed in Section 3.3.3, and the Homeland Security Operations Center (HSOC) serve the overall disaster information function, supporting planning and decision making at both the field or regional and the headquarters levels with information from all sources. All data of interest to those outside the agency collecting them should be provided to ESF 5 and the HSOC. Similarly, ESF 5 and the HSOC should provide necessary data to other responding agencies (e.g., road outage information should be collected by municipalities or the state police and provided to ESF 5 and the HSOC, which would redistribute these data to all responding agencies for whom they are important). Imagery acquired during disasters is provided to FEMA headquarters, the Joint Field Offices (JFOs), and the state EOCs, as well as to the USGS and the U.S. Army Corps of Engineers, which post the imagery for viewing and downloading over the Internet.
A major difficulty in the past has been the inability to move information from the field to the EOC or JFO, and then to the headquarters of the responding agencies and to FEMA. The technical components of these problems hinge on both computer security and the use of different data standards and formats. Data standardization, at least for those data used during events, will greatly facilitate the transmission of critical information. The EDXL standard described earlier in Section 4.2 will be very helpful in this regard. The committee also heard about specific problems resulting from the use of agency firewalls. For example, the FEMA firewall has impeded sharing of data during some incidents, which forced the FEMA GIS team to set up a separate network outside the firewall to share the data.
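To make the role of EDXL concrete, the sketch below assembles a minimal EDXL-DE distribution envelope of the kind an agency might use to wrap a field report. The element names follow the published OASIS EDXL-DE 1.0 schema, but the identifiers, sender address, and timestamp are invented for illustration, and a real message would also carry one or more content objects and require schema validation.

```python
import xml.etree.ElementTree as ET

NS = "urn:oasis:names:tc:emergency:EDXL:DE:1.0"  # EDXL-DE 1.0 namespace

def distribution_envelope(dist_id, sender, sent, status="Actual"):
    """Build a minimal EDXL-DE envelope (header elements only)."""
    ET.register_namespace("", NS)
    root = ET.Element(f"{{{NS}}}EDXLDistribution")
    for tag, text in (("distributionID", dist_id),
                      ("senderID", sender),
                      ("dateTimeSent", sent),
                      ("distributionStatus", status),
                      ("distributionType", "Report")):
        ET.SubElement(root, f"{{{NS}}}{tag}").text = text
    return ET.tostring(root, encoding="unicode")

# Hypothetical identifiers; a real deployment would assign its own.
msg = distribution_envelope("NO-2005-0042", "eoc@neworleans.example",
                            "2005-08-30T14:00:00-05:00")
```

A standardized envelope like this is what allows a receiving agency's systems to route and timestamp incoming reports without bespoke parsing for each sender.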
Difficulties also exist with the use of the imagery provided. While most disaster-related missions require high-resolution data, responders in the JFO, EOCs, and Emergency Response and Recovery Offices (ERROs) are typically unable to work with the huge files involved. To maximize their usefulness these data must be either compressed so that they can be pulled across the network at the field offices, written onto physical media such as CDs or fire wire hard drives and transported physically, or converted into hard-copy maps with reduced resolution. Implementation of a web-services architecture using NSDI practices would help alleviate this problem by enabling the transfer of only those portions of an image file that are actually needed by the user. Metadata, including indexing of imagery and other useful documentation, are especially helpful when data are to be transferred.
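The benefit of transferring only the needed portion of an image can be illustrated with a little coordinate arithmetic. The sketch below, using made-up origin, pixel size, and bounding-box values, computes the pixel window a web service would need to serve for an area of interest, assuming simple north-up imagery with square pixels; a real implementation would sit behind a standard protocol such as OGC WCS or a tiled image server.

```python
def pixel_window(origin_x, origin_y, pixel_size, bbox):
    """Return (col_off, row_off, width, height) in pixels for the part of a
    georeferenced image covering bbox = (min_x, min_y, max_x, max_y).
    Assumes north-up imagery: row index increases as y decreases."""
    min_x, min_y, max_x, max_y = bbox
    col_off = int((min_x - origin_x) / pixel_size)
    row_off = int((origin_y - max_y) / pixel_size)
    width = int(round((max_x - min_x) / pixel_size))
    height = int(round((max_y - min_y) / pixel_size))
    return col_off, row_off, width, height

# A 0.3 m scene 40,000 px on a side is several gigabytes at 3 bytes/px,
# but a 600 m x 600 m neighborhood is only a 2000 x 2000 window (~12 MB).
# Coordinates below are hypothetical projected (meter) values.
win = pixel_window(origin_x=300000.0, origin_y=3320000.0, pixel_size=0.3,
                   bbox=(301000.0, 3318800.0, 301600.0, 3319400.0))
```

The point of the arithmetic is that the window, not the scene, determines the transfer size, which is what makes imagery usable over the constrained links available at field offices.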
Perhaps the single greatest difficulty during disasters is the movement of files (large and small) between agencies working in the National Response Coordination Center (NRCC), the Regional Response Coordination Center (RRCC), and the JFOs when responders from agencies other than FEMA are attempting to communicate with their own agencies and attempting to connect to their home agency networks, and when telephone and Internet communication may be impossible. This issue is of paramount concern to computer security specialists.
Attempts to reduce these difficulties range from the marginally acceptable provision of analog phone lines at FEMA facilities, which let responders use modems to connect to their home facilities, to the use of high-speed digital subscriber lines (DSLs). While the second solution is significantly faster, DSL transmission still requires the use of two computers, one at the origin and one at the destination, and requires that both
they and the network be operational and secured. The security problem may be solved in at least two ways. The first is to give disaster workers from another facility the same responsibility for protecting the network that they have at their own agency. If this strategy cannot satisfy computer security specialists, the gov.net concept (Sewell, 2002), whereby subnets within each of the agencies involved in the response become part of a shared intranet, might provide secure access, though only to essential parts of each agency’s network.
The culture of not sharing data and information, which has already been addressed in Section 4.2, is less tractable but of critical importance to reports to and from the field. Because of strong resistance, the issue will have to be dealt with through directives from higher levels in the organizations involved. The assistance necessary to help affected populations can best be delivered when all responders are able to contribute their information in timely fashion and when procedures are in place to see that the information is redistributed to those that need it and available for those who might need it.
An additional difficulty has been communication (or its absence) between the federal and non-federal partners in disaster responses. Many states and municipalities complain of multiple data calls from federal agencies for nearly the same information. Coordinating data calls through one federal agency such as FEMA or USGS (which is placing liaisons in every state), and coordinating geospatial needs through a single point of contact such as the state GIS coordinator, would help streamline operations. This problem, as well as cultural differences arising from different experience bases, can best be addressed through exercises that build partnerships prior to the onset of a disaster, and also through the use of the incident command system mandated as part of the National Response Plan (NRP). Because the NRP is new, it may take some time (and many exercises) before these problems are resolved.
For a variety of reasons ranging from damage in the disaster area, to firewall issues, to other inter- and intrainstitutional problems and conflicts, data access in the field is often compromised. Plans must be in place for the provision of broadband Internet and intranet services to emergency responders or for data transmission through other means such as physical transport of CDs or other digital media. Likewise, firewall issues that prevent access to essential data by members of multiple agencies located at disaster sites (e.g., a JFO, the FEMA National Response Coordination Center, FEMA’s Regional Response Coordination Centers, or other locations not run by the responding personnel) must be resolved.
ESF 5 and the HSOC must continue to address the role of the collectors and providers of data and information to the responding community. This would be greatly facilitated by the use of standardized databases by all emergency responders, but also requires identification of procedures and deadlines for the provision of data and information to FEMA and DHS, as well as policies and procedures for pushing data and information to the responders and higher headquarters of all agencies. This will require procedures for making data available to the larger response community as well. More broadly, the committee finds that information flow between entities participating in disasters must be improved, particularly between responders in the field, field offices, and emergency operations centers.
This and earlier sections of the report have drawn attention to a host of problems that currently impede communication to and from the field. While many forms of action might help to address these problems, in the committee’s view the best strategy would be to invest in intensive preparedness exercises, in which all aspects of communication can be tested.
RECOMMENDATION 6: Interpersonal, institutional, technical, and procedural communications problems that currently inhibit communication between first responders in the field and emergency operations centers, emergency management agency headquarters, and other coordinating centers should be addressed through intensive preparedness exercises by groups involved in all aspects of disaster management. Such exercises should be tailored to focus on clear objectives with respect to the use of geospatial data and assets. They should involve decision-making representatives from all levels of government, as well as other relevant organizations and institutions, and should be coordinated nationally so that common problems can be identified. They should be realistic in their complexity and should allow participants to work carefully through the geospatial challenges posed by disasters, including the difficulty of specifying requirements, the difficulty of communicating in a context of compromised infrastructure, and the difficulty of overcoming logistical obstacles.
BACKUP, REDUNDANCY, AND ARCHIVING
Ideally, all data could be accessed through a distributed network from local sources. This would guarantee that all responders are working off the latest version of the data. However, numerous experiences have shown that data can be lost, servers and networks can fail, and power and communication systems can go down. In order to guarantee effectiveness
during a disaster, multiple methods of accessing data have to be available and tested regularly.
Standardized methods are needed to ensure appropriate data backup and recovery. Of particular significance is the geographic dispersion of the backed-up data so that copies survive the event and can be used in response and recovery. On September 11, 2001, the New York City Office of Emergency Management lost both its primary and backup data because both were stored close to the World Trade Center (see Section 2.1.1). The committee also learned that backup copies of data for the New Orleans area were stored in close proximity to each other, and as a result, during Hurricane Katrina both the original and the backup copies of some data were destroyed.
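The geographic-dispersion requirement can be checked mechanically during preparedness planning. The sketch below computes great-circle distances between backup sites and flags pairs that are too close together; the coordinates are approximate, and the 400 km threshold is an illustrative planning value, not a standard.

```python
from math import radians, sin, cos, asin, sqrt

def km_apart(lat1, lon1, lat2, lon2):
    """Great-circle distance in km via the haversine formula."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius, km

def dispersed(sites, min_km=400.0):
    """True only if every pair of backup sites is at least min_km apart.
    The threshold is an illustrative planning value."""
    return all(km_apart(*a, *b) >= min_km
               for i, a in enumerate(sites) for b in sites[i + 1:])

new_orleans = (29.95, -90.07)  # approximate coordinates
gulf_coast = (30.40, -89.09)   # a second site also within Katrina's path
inland = (32.78, -96.80)       # roughly Dallas, far from the coast
```

Here `dispersed([new_orleans, gulf_coast])` fails (the sites are only about 100 km apart, well within a single hurricane's damage zone), while pairing the original with the inland site passes.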
As noted in Chapter 3, the data, tools, and procedures used during an event are rarely if ever archived. A number of individuals and organizations told the committee of the need to establish procedures to archive event-created data on a daily basis. This, in turn, would allow researchers at a later time to measure the effectiveness of the geospatial information and to determine methods to improve its contribution to the overall emergency response effort in future events. Since it is likely that some of the data used by responders will be proprietary, confidential, or subject to privacy laws, archiving plans will have to balance the desire for openness with security (see Section 4.3). Archiving plans will also have to include effective strategies for data management.
Backup and archival plans should exist for all geospatial data, tools, and procedures developed as part of disaster response and recovery, in order to ensure the security of essential resources through geographic dispersal, and to provide extremely useful knowledge for improving response to future events. Responsibility for this function should be stated in Emergency Support Function 5 of the National Response Plan. Since FEMA is currently given the responsibility in ESF 5 for coordinating GIS support, the archive and backup function should be stated as part of FEMA’s responsibilities in the JFOs.
RECOMMENDATION 7: DHS should revise Emergency Support Function 5 of the National Response Plan to include backup and archiving of geospatial data, tools, and procedures developed as part of disaster response and recovery. It should assign responsibility for archiving and backup in the JFOs during an incident to the Federal Emergency Management Agency, with an appropriate level of funding provided to perform this function.
TOOLS FOR DATA EXPLOITATION
A variety of geospatial tools exist that can meet a wide range of emergency management needs. The following material describes these tools, how they are used, and issues that inhibit their successful exploitation. Since another recent NRC report has an extensive discussion of technologies and methods for disaster research that includes geospatial data, research, and technology (NRC, 2006), this report focuses on impediments to the take-up of existing tools and the need to adapt them better to the conditions of emergency management.
Visualization tools make it possible to view features of the pre- and post-event world individually or simultaneously. These features may include hazards (fault zones, potential hurricane landfall areas, and flood inundation areas) and risks (potential hurricane damage zones based on projected wind speed and its impact on population, building types, and critical infrastructure). These may be stand-alone tools for operation on a single system or server-based tools designed for Internet or intranet use, allowing an operator in one location to view data stored on a server at a remote location. It is also possible to perform data-mining operations on the data through the display-tool interface to determine critical relationships between hazards, disaster events, and the most appropriate actions.
Analysis tools include a wide range of models performing a hierarchy of functions, from models indicating impact area and expected severity (shaking or wind speed), to those showing expected damage (combining shaking or wind speed with geology and construction type), to models to determine evacuation routes based upon the road network and traffic flow (see Figures 4.1, 4.2, and 4.3). They can include validated atmospheric models such as those used by the National Hurricane Center; atmospheric plume models used by the national laboratories; waterborne plume models used by the National Oceanic and Atmospheric Administration (NOAA); earthquake and wind damage models used by FEMA, the Department of Defense, and others; hydraulic and hydrologic models used by the U.S. Army Corps of Engineers; or empirical models used by the Corps to estimate debris volume from disasters. They also include high-interaction computer graphics, three-dimensional virtual reality media, and other visualization techniques that promote realistic simulations of disastrous events and planned responses.
Decision support systems assist emergency and other managers in making the best decisions based upon conditions as they are known at a particular point in time. These systems are often a combination of display capabilities, one or more models, and visualization and data analysis functions. With high-resolution digital elevation models, hydrology and hy-
draulic models, elevations for the living floors of dwellings, and economic damage models based on the depth of water above the elevation of the living floor, it is possible to model the consequences of the operation of water control structures during high-flow events. Both upstream and downstream inundation with different release rates from the water control structure can be displayed, and analysis of the economic damage that occurs for any release rate option can be calculated. While economic damage is not the only factor involved in release rate decisions, this is an example of how such systems can help the decision-making process. As part of a multiagency effort, the U.S. Army Corps of Engineers is participating in the development of a decision support system that will help communities and regulatory agencies evaluate the consequences of different potential land-use decisions in a river subbasin on communities located downstream.24
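At its core, the economic-damage step just described applies a depth-damage curve to the depth of water above each dwelling's living floor. The sketch below shows that calculation for a single structure; the curve values are purely illustrative, since real curves are calibrated per structure type and region.

```python
def flood_damage(water_surface_elev, floor_elev, structure_value, depth_damage):
    """Estimate damage from water depth above the living floor.
    depth_damage: sorted (depth_m, fraction_of_value) pairs; linear
    interpolation between points. Curve values here are hypothetical."""
    depth = water_surface_elev - floor_elev
    if depth <= depth_damage[0][0]:
        return 0.0  # water below the living floor: no structural damage
    for (d0, f0), (d1, f1) in zip(depth_damage, depth_damage[1:]):
        if depth <= d1:
            frac = f0 + (f1 - f0) * (depth - d0) / (d1 - d0)
            return structure_value * frac
    return structure_value * depth_damage[-1][1]  # beyond curve: cap at max

# Illustrative curve: damage fraction at 0, 1, 2, and 4 m above the floor.
curve = [(0.0, 0.0), (1.0, 0.25), (2.0, 0.45), (4.0, 0.80)]
loss = flood_damage(12.5, 11.5, 200000.0, curve)  # water 1.0 m above floor
```

Summing this computation over all structures below a water control point, for each candidate release rate, yields the damage comparison that supports the release decision described above.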
Geospatial models and tools are currently being used successfully in numerous areas. One excellent example is the National Interagency Fire Center in Boise, Idaho, which coordinates forest firefighting using a range of GIS tools, has a clear organizational structure, and rapidly sends real-time information about topography, vegetation, and weather forecasts—and the resulting fire predictions—to field units using handheld systems. Another successful system presented to the committee was that of the Province of Alberta, in which GIS and geospatial data are an integral part of the full emergency management operations system. These successful systems often concern single disciplines or small jurisdictions where it is possible to avoid the kinds of interagency issues addressed earlier in this report. Yet despite these successes, there are still many impediments to
the better use of geospatial tools in emergency management, as described below.
First, all of these possibilities are easier to realize when the necessary data are readily available and easily accessible. Ideally, they will take advantage of interoperability between the systems used by the different entities involved with a disaster event. At their best, network-enabled approaches will allow (with sufficient security) all emergency responders to access data sets and analytical products that are located on servers managed by other responding entities. However, present implementations of geospatial tools are largely, if not entirely, the work of single agencies and not easily distributed. Although such solutions are workable, they are inefficient in the larger context.
Moreover, during emergency response, data may well be incomplete and of poor quality, and first responders may be working under very difficult circumstances with limited technical resources. Response personnel are unlikely to take on the task of learning new tools under such conditions. Training must therefore take place in advance, so that people working in an emergency feel completely comfortable with the tools they use (see further discussion of this topic in Section 4.8). Furthermore, in the report Making the Nation Safer (NRC, 2002, p. 162), a section on information management and decision support tools makes the following comment:
In a chaotic disaster area, a large volume of voice and data traffic will be transmitted and received on handheld radios, phones, digital devices, and portable computers. Nevertheless, useful information is likely to be scarce and of limited value. Thus, research is needed on “decision-support” tools that assist the crisis manager in making the most of this incomplete information.
In some cases, the modeling capabilities exist even though the needed input data are not readily available. For example, loss estimation tools exist; however, the generalized nature of the default data on the building inventory, infrastructure, and economic structure of places means that only very generalized estimates can be produced, normally at a regional scale. The damage is expressed as the probability of the building being in one of four damage states: slight, moderate, extensive, and total, with a range of generalized damage functions (expressed as repair cost or replacement cost) assigned to each damage class. The loss estimations work best when applied to a class of buildings (e.g., residential) rather than to individual buildings. Moreover, specific loss estimations at the city or county level require supplemental data on building stock and infrastructure, localized data not normally included. Finally, the loss estimations represent “direct” damage only; “indirect” losses such as lost wages, lost business earnings, or loss of building use are not included in the overall estimate.
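The expected-loss computation implied by these damage-state probabilities is straightforward, as the sketch below shows. The probabilities and damage ratios are invented for illustration and are not drawn from any calibrated model.

```python
def expected_loss(replacement_cost, state_probs, damage_ratios):
    """Expected repair cost for one building class.
    state_probs: probability of being in each damage state
    (slight, moderate, extensive, total); damage_ratios: repair cost
    as a fraction of replacement cost for each state (illustrative)."""
    return replacement_cost * sum(p * r
                                  for p, r in zip(state_probs, damage_ratios))

probs = [0.30, 0.20, 0.10, 0.05]   # remaining 0.35: no damage
ratios = [0.02, 0.10, 0.50, 1.00]  # hypothetical, not calibrated values
loss = expected_loss(250000.0, probs, ratios)
```

Because the probabilities describe a class of buildings rather than any single structure, the result is meaningful only when aggregated, which is why such tools work best at regional rather than individual-building scale.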
As one emergency management professional told the committee, a
serious impediment to better use of geospatial data and tools for disaster management is that “uninformed, overwhelmed public officials get sold expensive systems they don’t need and don’t know how to use.” Several testified to the need for a “common denominator” set of tools with designs based on user requirements. Moreover, such tools must be simple, easy to use, and tailored to what users really need (for example, functions to assist navigation through the application, functions for basic query and measurement of location, and tools for the management of saved files). At the same time, users often fail to take advantage of capabilities because they are unfamiliar with them. Typically, users encounter geospatial data and tools only during emergencies, so they do not know what is available or how to make use of it. It can also be hard to get users to adopt new technology, especially in the midst of an event, when novel approaches feel to emergency responders like distractions rather than solutions. To address these challenges, users argue that geospatial data and tools should be integrated as a routine component of emergency planning, training, exercises, and day-to-day incident operations, so that during major disasters they are readily available and easily incorporated.
Although numerous tools exist and may be very useful in the planning stages, they are not as effectively used during response because (1) the necessary data may be of poor quality or not available during response, or (2) the tools have not yet been fully integrated into regular response activities. The committee concludes that efforts should be made to more effectively integrate the use of geospatial tools into all phases of emergency management, as proposed in Recommendation 1. Additional research is needed on how geospatial data and tools can be used for decision support in the special conditions that prevail during emergency response.
RECOMMENDATION 8: The National Science Foundation and federal agencies with responsibility for funding research on emergency management should support the adaptation, development, and improvement of geospatial tools for the specific conditions and requirements of all phases of emergency management.
EDUCATION, TRAINING, AND ACCESSING HUMAN RESOURCES
Presentations to the committee provided ample evidence of the nonuse and underutilization of geospatial data and tools, and previous sections of this report have focused on many of the causes cited by the indi-
viduals and agencies that provided testimony. This section focuses on one of the more important and endemic causes: the lack of appropriate education and training in geospatial data and tools among emergency management personnel and a similar lack of education and training in emergency management among geospatial professionals. These deficiencies exist at all levels, from the bottom of agencies to the top, and must be addressed by programs that raise awareness among leaders as effectively as among their staff. It is important, moreover, to recognize that education in the many issues surrounding the use of geospatial data and tools in emergency management that are identified in the report, as well as in underlying principles of geospatial data, such as the correct interpretation of maps, their time-limited nature, and knowledge of their inherent uncertainties, is at least as important as training in the technology itself.
While academic emergency management programs at both the bachelor’s and the master’s levels are growing in the United States, the committee heard that the emphasis given to geospatial data and tools varies widely. Geospatial data and tools are not always considered essential from a curricular standpoint and must compete for space in the curriculum with many other subject areas. As an earlier NRC study concluded, “The very people who could leverage this information [geospatial data and tools] most effectively, such as policy makers and emergency response teams, often cannot find it or use it because they are not specialists in geospatial information technology” (NRC, 2003, p. 3). Only by requiring that those in emergency management programs take classes in geospatial data and tools (with a primary focus on emergency management applications), and by including modules in other emergency management classes showing how geospatial data and tools can be used in all phases of emergency management, will it become clear to future generations of emergency managers that geospatial data and tools have significant contributions to make.

Furthermore, because geospatial information is often taken at face value, and since the response community is getting greater access to multiple sources of geospatial data, it is critical to ensure that the underlying assumptions, data quality, and uncertainty are conveyed properly. Geospatial data and tools are often so complex that even geospatial professionals sometimes lack the training for proper interpretation of results. Access to technical specialists is key to the proper use and improvement of such mapping products. For example, it is critical to understand the time sensitivity of technical information (e.g., decaying radiation dose from deposited material, relative changes in indoor and outdoor exposure during a hazardous airborne release).
In a similar way, geospatial data and tools must be a component of the in-service training offered to the current generation of emergency management professionals. While such content is currently included in
some of the training offered by FEMA at its Emergency Management Institute in Emmitsburg, Maryland, and by the U.S. Army Corps of Engineers in training provided to members of its emergency management planning and response teams, there is a need to include such content in all relevant training programs nationwide. First responders need relatively rudimentary training in geospatial capabilities: they have to be able to communicate what conditions they encounter and what they need to know to fulfill their mission assignments. Incident command- and management-level personnel (e.g., plans and operations section chiefs) need a more sophisticated understanding of geospatial capabilities. Both users and GIS personnel are extremely busy, stressed, and sometimes emotionally volatile during emergency response. This is not the best time to assess needs or to learn new material. Data sources and tools should be presented to emergency management personnel before an incident in a training situation so that they know what will be most useful. The concepts, as well as the end products, have to be documented.
It is similarly important that geospatial professionals are acquainted with the emergency management process and that geospatial products are designed to be useful to emergency managers. Many geospatial professionals who become involved in emergency response are not routinely associated with the emergency management community in their normal roles, however. It is essential therefore that training be provided as part of emergency preparedness or, at worst, that it be part of the orientation process when geospatial professionals or volunteers join response and recovery teams. Not only should this training explain the emergency management organization in which the geospatial team will operate, but it should also provide a window into the pressures and time constraints under which personnel will be expected to perform. Emergency responders do not always know how best to articulate their information and imagery needs to geospatial professionals. Sometimes they are unsure of what they need because they do not know what is possible. It is important for technical personnel to spend time helping responders frame their questions. Sometimes, it helps to produce a variety of products and let responders identify those that are most helpful. Working with emergency management personnel in this way can help provide geospatial professionals with feedback on how well geospatial data and tools address the needs of responders, especially in terms of ease of use, interpretational and supporting information, and documentation.
The lack of sufficient personnel trained in all aspects of the application of geospatial data and tools to emergency management is a problem at all levels of government, from local to federal, although specific needs vary from agency to agency. In particular, trained personnel with knowledge of imagery products should be included as essential members of the
emergency management team. The past two hurricane seasons have shown that the ability to provide these personnel has already been stretched very close to the breaking point. Annual or semiannual exercises that provide the opportunity for involved agencies to meet, exercise, and discuss potential geospatially related successes and pitfalls in the event of an actual disaster can both raise awareness of the importance of an adequate supply of personnel and provide essential experience to those involved.
Disasters, by definition, overwhelm the ability of local emergency managers to respond sufficiently, and recent disasters have demonstrated the importance of being able to augment local human resources with professionals and volunteers drawn from both neighboring and remote areas. One of the common problems reported to the committee was the lack of a preestablished team of geospatial professionals to support emergency response within a significant number of emergency management organizations. As a result, when a catastrophe occurs, a significant amount of effort and time is wasted locating geospatial professionals, bringing them into the emergency management organization, and providing them with resources to accomplish their mission. By the time they become available, many of the opportunities to apply technologies to solve problems have passed.
As the committee heard repeatedly and as noted in Section 3.2.2, FEMA has only a limited number of permanent geospatial professionals and must rely on reservists to respond to events, which almost inevitably delays deployment. If federal geospatial professionals arrive on scene after state and local staffs have already begun work, it can be difficult to integrate and coordinate the various efforts. There is clearly a need for FEMA to expand the effective size of its permanent staff of geospatial professionals, perhaps through dual use, and to develop strategies that will lead to their more rapid deployment.
In addition to a preestablished team of geospatial professionals, a mechanism for locating additional geospatial professionals to respond to a disaster is essential. After the attacks on the World Trade Center, the members of GISMO (Geographic Information Systems and Mapping Operations), the New York City GIS user group, used their contact lists to help assemble a team of volunteers. Such contact information can be invaluable in a disaster that covers a large area or extends over a long period (see Sidebar 4.3). A national system to facilitate access to additional geospatial professionals from a range of related fields could be organized by the Department of Homeland Security, perhaps in partnership with universities as nodes of expertise, and used during disasters to locate and assemble
“Geocoding” Used to Locate Katrina Survivors—Street Addresses Not Very Useful After Hurricane Hit
By Marsha Walton, CNN
Police, firefighters, and Coast Guard crews may be the first to come to mind when naming the lifesavers during disasters such as Hurricane Katrina. It might be time to add geographers to that list. In the sometimes desperate hours following Katrina’s landfall, experts in geographic information services—GIS—helped search and rescue crews reach more than 75 stranded survivors in Mississippi. One of their most valuable tools was a process called “geocoding,” the conversion of street addresses into GPS coordinates. With streets flooded, street signs missing, and rescue crews unfamiliar with the Gulf Coast area, street addresses were not very useful.
“They would get phone calls, or the Coast Guard would come in with addresses in their hands and say, ‘I need a latitude and longitude for this address.’ So the GIS professionals would do a geocoding, give it to the Coast Guard who got on helicopters and saved lives,” said Shoreh Elhami, director of GISCorps.
Elhami, co-founder of GISCorps, said that since 2004, the organization’s volunteers have responded to disasters such as the Asian tsunami and Hurricane Katrina, as well as efforts to provide humanitarian relief, sustainable development, economic development, health, and education in all parts of the world. The Corps had 20 volunteers on the ground in Mississippi less than 48 hours after Katrina’s landfall. GISCorps is part of URISA, the Urban and Regional Information Systems Association. Elhami said more than 900 qualified volunteers have GIS experience, and range from city and state government officials to academics to people in private industry.
Volunteer Beth McMillan, a field geologist and professor at the University of Arkansas in Little Rock, worked in Pearl River County, Mississippi, a couple of weeks after the storm. “A couple of days after the hurricane hit, I felt so down, and wondered what I could do. I could give a little bit of money, but that doesn’t seem very satisfying. To be able to have a skill that can be used is much more empowering, it doesn’t make you feel so helpless,” said McMillan, back in Little Rock….
Volunteers are never sure of the conditions they might face when deployed to disaster sites or developing countries. Assignments usually last between two weeks and two months. McMillan said her many experiences “roughing it” as a field geologist helped her deal with the living conditions in Mississippi. “They said be prepared for really hot weather, and bring a sleeping bag,” she said. “I slept in an empty U.S. Department of Agriculture building on a cot, with probably several hundred other people. But it did have power, bathrooms, and showers, so conditions were not as bad as they could have been,” she said….
SOURCE: Excerpted from Walton (2005).
teams. Such a system could also promote appropriate training by establishing and enforcing the minimum qualifications needed to be listed.
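The "geocoding" workflow described in Sidebar 4.3, matching a caller's street address against a preloaded address database to obtain coordinates for search and rescue crews, can be sketched in a few lines. The sketch below is purely illustrative: the addresses, coordinates, and lookup structure are invented placeholders, and an operational geocoder would match against a government street centerline or parcel database with fuzzy matching and address interpolation.

```python
def normalize(address: str) -> str:
    """Reduce an address to a comparable form (case, punctuation, spacing)."""
    return " ".join(address.upper().replace(".", "").replace(",", "").split())

# Placeholder address file mapping normalized addresses to (lat, lon).
# In practice this would be a local street centerline or parcel database.
ADDRESS_FILE = {
    normalize("123 Beach Blvd, Biloxi MS"): (30.3960, -88.8853),
    normalize("45 Oak St., Waveland MS"): (30.2866, -89.3762),
}

def geocode(address: str):
    """Return (lat, lon) for an address, or None if it cannot be matched."""
    return ADDRESS_FILE.get(normalize(address))

# An address phoned in by a survivor, in inconsistent formatting,
# still resolves to coordinates a helicopter crew can navigate to.
coords = geocode("123 beach blvd biloxi ms")
```

Even this toy version shows why preloaded, standardized address data matters: the lookup succeeds only if both the stored addresses and the incoming requests can be reduced to the same normalized form before the disaster strikes.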
RECOMMENDATION 9: Academic institutions offering emergency management curricula should increase the emphasis given to geospatial data and tools in their programs. Geospatial professionals who are likely to be involved in emergency response should receive increased training in emergency management business processes and practices.
RECOMMENDATION 10: The Federal Emergency Management Agency should expand its team of permanent geospatial professionals, and develop strategies that will lead to their more rapid deployment both in response to events and in advance of events when specific and reliable warnings are given.
RECOMMENDATION 11: The Department of Homeland Security should establish and maintain a secure list of appropriately qualified geospatial professionals who can support emergency response during disasters.
Along with the lack of an effective governance process (see Section 4.1), funding is usually identified as a major barrier to the effective use of data in preparing for and responding to disaster events. Many organizations, particularly those in states and localities that are comparatively resource poor, lack the funding to build even a basic geospatial capability. Others lack funding for ongoing programs to maintain and update existing geospatial data, for the servers and support services needed to ensure effective access to and use of these data, for converting data to standard formats, and for creating the metadata needed to make data accessible through the NSDI. Still others cannot participate in coordination activities because of shortages of personnel and funds. At the state level, geospatial preparedness is often not seen as sufficiently important to qualify for funds that flow from federal homeland security programs. Several points are evident: needs and requirements differ from place to place across the country; not all of the perceived funding problems can be fixed by an infusion of more money; and funding for geospatial investments has often not been treated as a high priority.
A National Academy of Public Administration (NAPA) study conducted in the late 1990s, Geographic Information for the 21st Century: Building a Strategy for the Nation (NAPA, 1998), identified geospatial information and technology as key components of substantial elements of the U.S. economy. The report cited some of the major sectors of the economy affected by geographic information and stated that geographic information plays a role in about one-half of the economic activities of the United States (NAPA, 1998, p. 11). Although not focused on budget or funding issues, the report also noted that competing priorities at the time, such as the year 2000 computer problem, meant that in the absence of additional major funding only part of the highest-priority efforts would be implemented and that fulfillment of the stated goals for the NSDI was many years away (NAPA, 1998, pp. 66-67).
It is not known with any accuracy how much money is spent by the many units of government on geospatial activities. In part this is due to differences among the programming and budgeting systems of the nation's levels of government, and in part to the fact that much of the nation's resource of geospatial data and tools is acquired and used as part of a mission program rather than as a specifically identified activity. However, the Government Accountability Office (GAO) estimated that federal agencies and their partners spend billions of dollars each year on geospatial data, services, technology, and expertise (GAO, 2004), and Longley et al. (2001, p. 360) gave a global figure of $12 billion to $20 billion for 2000. With this amount of money being appropriated annually to sustain the existing, but in many ways inadequate, resource of geospatial data and tools, it is essential to find ways to use current funding more effectively, in addition to calling for new funding.
As noted in Section 1.1, the past two decades have seen dramatic increases in the use of geospatial data and tools in many aspects of human activity. Data needed for emergency management are often collected, managed, and disseminated for other purposes, particularly in the case of the framework and foundation data defined in Section 1.3.2. Collection of variables particularly important for emergency management might be piggybacked on existing data collection activities at minimal additional cost, and similar economies might be found in the costs of data dissemination.
Many changes will be needed in existing practices if geospatial preparedness is to be funded more adequately. While this has proven difficult, some components of government have made significant progress by
Adopting a clear strategic direction that lays out future objectives;
Initiating changes early within the organization in order to address personnel, structural, or other adjustments that affect employee performance;
Planning a funding bridge to enable transition from current business processes to new ways of conducting business; and
Establishing customer expectations that help to drive the needed changes but are also realistic about the pace at which changes can be made.
The committee heard in testimony that the most critical gaps in current funding appear to be
A lack of funds that can be used in shared arrangements to leverage the funding resources of multiple organizations;
A lack of funds for coordination activities among multiple organizations; and
The lack of a long-term base of funding to sustain geospatial data collection, maintenance, and dissemination over time.
Several previous efforts to address these needs have been explored, and it is useful to examine them as potential guidance for any renewed attempt to address financing issues, whether through new mechanisms, legislation for grant programs, or increases in agency appropriations.
The FGDC conducted a study in 2000 to explore alternative mechanisms for “Financing the NSDI.”25 The study found that an opportunity existed to build national capacity through public-private partnerships that could underwrite information technology investments in geospatial data and tools and could provide the capital financing that local, regional, industry, and interest group consortia need to form and grow. The report recommended ways in which these consortia could pool and align intergovernmental and public-private investments in geospatial data acquisition and maintenance; decision support applications; and supporting hardware, software, and integration services. Financial mechanisms such as government-backed bonding authority for local governments, revolving loan programs, and other debt structures were suggested for use in a range of capital planning strategies. Financing would depend on the use, as underwriting criteria, of consensus interoperability standards from recognized standards development organizations and other NSDI design elements.
Other NRC reports have also discussed funding options for the NSDI and related issues (NRC, 1993, 1994), and several legislative proposals have identified increased funding as a need. As one example among many, in 2003 a committee formed within the Spatial Technologies Industry Association (STIA) identified the need for increased funding and a government-wide legislative base for establishing and maintaining geospatial preparedness for homeland security, national defense, electronic government, and other purposes. The proposal was presented in testimony to the House Committee on Government Reform by STIA President Fred Corle.26 One of its key elements was a major grant program to assist non-federal levels of government in building and maintaining the NSDI and achieving geospatial preparedness. The grant program would provide matching funds as an incentive for geospatial preparedness and would require participants to adhere to standards for interoperable access, sharing, and use as part of the development and implementation of the NSDI.
To make it easier for organizations to find grant programs that can fund various activities, the federal government has recently taken steps to identify grant opportunities through its electronic government initiatives. The FGDC's NSDI Cooperative Agreements Program assists the geospatial data community, through funding and other resources, in implementing the components of the NSDI.27 The program is open to federal, state, local, and tribal governments and to academic, commercial, and nonprofit organizations, and it provides small seed grants to initiate sustainable, ongoing NSDI implementations; it could be used for geospatial preparedness activities. The Department of Homeland Security has a number of grant programs for emergency management in which geospatial activities could be included as part of an applicant's proposal. However, there is no comprehensive grant program that would fund coordinated actions across the nation to better organize, manage, share, and use the geospatial data and technology that exist now and are being acquired for emergency management and other important public and business purposes.
The committee concludes that the funding available to achieve geospatial preparedness for disasters is not sufficient to meet the need. Adequate resources must be made available to support existing mandates and new initiatives that integrate geospatial resources into all phases of emergency management and that facilitate the acquisition and sharing of geospatial data for emergency management. In particular, resources such as grants need to be made available at the state and local levels, where many emergency management activities occur but where resources to support development of the needed geospatial capabilities may be lacking.
RECOMMENDATION 12: To address the current shortfall in funding for geospatial preparedness, especially at the state and local levels, the committee recommends: (1) DHS should expand and focus a specifically designated component of its grant programs to promote geospatial preparedness through the development, acquisition, sharing, and use of standards-based geospatial information and technology; (2) states should include geospatial preparedness in their planning for homeland security; and (3) DHS, working with OMB, should identify and request additional appropriations and identify areas where state, local, and federal funding can be better aligned to increase the nation's level of geospatial preparedness.