Summary of a Workshop on Using Information Technology to Enhance Disaster Management

Summary of Remarks Made by Workshop Participants

On June 22-23, 2005, CSTB’s Committee on Using Information Technology to Enhance Disaster Management held a workshop in Washington, D.C., to obtain perspectives on the use of information technology (IT)1 to enhance the management of natural and human-made disasters. Workshop participants invited as panelists included federal program managers and researchers, state and local officials and first responders, representatives from industry, and academic researchers from a variety of disciplines. In addition to providing perspectives on the current state of the art and practice, workshop presenters also described future opportunities to make better use of information technology to improve disaster management.2

This report provides the committee’s summary of key points made by workshop participants. It does not aim to present comprehensively all the remarks made during the workshop. Reflecting the workshop’s structure, this summary is organized to cover three major topic areas:

- The critical and evolving role of information technology in disaster management,
- Research directions for information technology in disaster management, and
- Collaboration, coordination, and interoperability: pressing issues in a need-to-share world.

Each topic area was covered by two or more panels. Highlights of the comments made by speakers in each topic area are provided in the sections that follow.

One important issue that cut across all three topic areas is the capability for wireless communication, which plays a critical role in disaster management because in almost all situations, first responders will depend on wireless links. As discussed below under Topic 2, there are fundamental constraints to wireless communication, significant unsolved problems, and a number of areas of active research.
As discussed below under Topics 1 and 3, enhancing interoperability among current and future wireless systems is also widely recognized as a pressing issue in disaster management and is the subject of a number of current initiatives.

1 IT is used as a shorthand in this report to cover both computing and communications capabilities. It should be understood in the context of this report as synonymous with the term information and communications technology, which appears at several places in the workshop program.
2 The workshop agenda, including questions posed in advance for the panelists, is given in Appendix A.
TOPIC 1: THE CRITICAL AND EVOLVING ROLE OF INFORMATION TECHNOLOGY IN DISASTER MANAGEMENT

Three panels discussed lessons learned about the effective use of IT in disaster management, technological and organizational barriers to the introduction and adoption of new IT systems, and types of IT that could be of particular use in disaster management.

Panelists made a number of points characterizing the challenges of providing more interoperable communications for disaster management across federal, state, and local agencies:

- Most communications interoperability issues are not technical. Better human organization, willingness to cooperate, and a willingness of government at higher levels to listen to those at local levels who really do the work and who are the actual responders are all critical factors in making better use of information technology for disaster management.
- Adoption of new equipment and systems that provide greater interoperability will take a long time. A speaker estimated that localities have invested more than $60 billion in their public safety communications infrastructure. Such an investment can be replaced only over decades.

Discussing the federal role in improving interoperability, panelists observed that:

- Disaster management—and the supporting IT infrastructure—is firmly rooted at the local level. Local organizations provide most of the infrastructure, personnel, and other resources. More than 90 percent of wireless public safety infrastructure is owned, operated, and maintained by localities. A speaker estimated that the total federal investment in interoperable communications represents something less than 3 percent of what the nation spends on public safety wireless communications. Local first responders make up the vast majority of day-to-day users.
- Even in an event on the scale of the September 11, 2001, attacks on New York City and the Pentagon, the federal presence, which was massive by normal standards, represented a fraction of 1 percent of all personnel involved.
- By virtue of their primary responsibility and reflecting long-standing organizational culture, localities and their police and fire services have, and generally seek to maintain, control of their communications systems. As a result, the federal role in improving interoperability is limited largely to providing guidance, coordination, and technical assistance. The federal government could, for example, provide a road map, a policy framework, and an architectural framework to create a system of systems. It could also support initiatives that motivate local agencies to move toward standards-based systems. A number of federal programs, including the Department of Homeland Security’s SAFECOM, are aimed at providing such support.
- Federal interoperability activities are diverse and themselves require coordination. For example, the Homeland Security Act identifies no fewer than three separate agencies as responsible for aspects of interoperability. Indeed, more than 60 programs deal with interoperability across the federal government. The Department of Homeland Security’s Office for Interoperability and Compatibility has created an umbrella program to coordinate these federal interoperability efforts. Interagency efforts are also underway to address the need for coordination. Still needed are a road map and a more coherent policy framework within which federal agencies can work together.

Commenting on activities at the state and local level, panelists noted that:
- Interoperability and coordination issues are also evident at the state level. A number of states have developed statewide communications interoperability plans, but many thus far involve only the state police.
- Localities have begun to recognize the need to better coordinate planning and communications. This work began in the 1990s and was accelerated by the events of September 11, 2001.
- A panelist also noted that interoperability and coordination issues arise in the context of public alert systems. The technology may exist today to create an all-hazards warning system, but fragmented responsibility and lack of coordination would likely lead to inconsistent and/or overly broad messages instead of the desired authoritative and targeted warnings.

Several speakers pointed to standards in areas such as syntax (the organization and structure of data) and semantics (the meaning of the data) for representing, storing, and transferring information as critical to better use of IT in disaster management, noting that:

- Standards ease interoperability, can foster increased information exchange, and help lower costs.
- Even good standards will have to be changed as circumstances evolve, which places a premium on processes and methods that tolerate extensibility and both incremental and rapid change.
- The lack of common semantics is a huge inhibitor of more effective use of IT.3 The issue of data semantics boils down to the problem of reconciling terminology used by different organizations and systems so that data can be properly integrated. Different first-responder communities (e.g., fire protection, medical services, law enforcement) as well as different levels of government have different names for the same things or different definitions for the same terms.
- Semantics can also differ among neighboring jurisdictions, creating additional impediments to communications and information sharing.
- Implementing standards broadly is a slow process, given the time it takes to build consensus among the relevant communities and the resources and planning required to replace legacy systems.
- There are opportunities to build momentum for adoption of standards as localities seek to reduce costs by pooling resources with regional neighbors.
- One effective approach to bridge-building among systems and communities is to use a distributed architecture that is glued together by common semantics.

Several comments by panelists addressed the importance of coordinated information technology planning and acquisition to achieve greater interoperability and better-integrated disaster management capabilities:

- As jurisdictions upgrade their technology to fulfill their own acquisition plans, a stair-stepping effect occurs, with the result that localities’ systems are frequently incompatible with their neighbors’ systems. Overcoming the effects of these mismatches requires better-coordinated and synchronized acquisition cycles.
- Strategic planning on a multi-jurisdictional, cross-agency basis eases the burden on individual jurisdictions and agencies by giving them a common framework within which

3 A reviewer of this report in draft form cautioned, however, that one should not expect semantic problems to be resolved by significantly changing the behaviors of individual communities and that one should instead expect IT systems to adapt to this reality.
to identify particular technologies that would be able to meet their individual requirements and would also be interoperable with the technologies of other communities and agencies.

- Localities are increasingly working together in regional organizations to leverage expertise, improve training and planning, and coordinate technology acquisition. Effective examples of such coordination cited by speakers include the Automated Regional Justice Information System for the San Diego region and activities under the auspices of the Metropolitan Washington Council of Governments.

Panelists offered several observations about the impacts of new IT on disaster management:

- Responders and emergency managers should focus on the goals to be achieved rather than on acquiring and using technology for its own sake.
- New IT capabilities can have a major impact by changing how information is used in a disaster. For example, systems that are architected appropriately could allow decision makers at the local level to directly access (pull) the information they seek rather than having to depend on (sometimes inappropriate) information being pushed to them.

Regarding successful introduction and adoption of new information technology, the panelists noted the following issues:

- To be useful in a disaster, IT must be in routine use. In a crisis situation, people tend to fall back on what they are comfortable with. Technology that is not included in planning, training, exercises, and standard operating procedures will not be used in an actual disaster. Similarly, it is important to use during routine operations those systems and procedures that would be needed in a crisis.
- The rate at which IT changes continues to outpace the rate at which public safety organizations are adapting to it.
- In a number of instances, visionary leadership has helped overcome the political, economic, and organizational challenges, but these are the exception, not the rule.
- The location of a jurisdiction’s emergency management agency within its government structure varies widely and affects what aspects of emergency management are emphasized and how successfully a jurisdiction acquires and adopts IT.
- First responders and emergency managers must be able to rely on the technology they use to accomplish their work. As a result, they are very reluctant to depend on commercial infrastructure such as the public switched telephone network or cellular telephone systems, which historically have become very congested or collapsed quickly after major disasters.
- Being at the cutting edge in the use of IT is widely understood to have a number of drawbacks, including higher up-front acquisition costs, greater resources required to customize software and systems, and a tendency to customize systems beyond what is necessary.
- The cost of new technology is a major inhibitor of its adoption. The public safety community is primarily dependent on local revenue, which must also cover such local needs as education and roads. Life-cycle costs, including the ongoing expense of maintenance, training, and operations, should be factored in from the beginning and understood by all involved.
- The integration and deployment of technology should not be a one-shot event. Rather, it should be an iterative, ongoing process that involves all stakeholders, especially first
responders. Creating a feedback loop among those who use, acquire, implement, and develop technology is critical to both the development of useful capabilities and their successful adoption.

- Exercises, drills, live simulations (such as those employed by the military), and shadow operations (such as those conducted in conjunction with the 2003 Super Bowl in San Diego) help build the user community’s confidence and trust in the technology and also provide essential feedback to technology developers and providers on actual user requirements and how existing technologies could be improved.

In addition, speakers noted several practical challenges to introducing new technologies:

- Making updates to IT systems can be difficult. For example, technical barriers (such as limited wireless bandwidth that constrains over-the-air updates) and logistical challenges (such as scheduling appointments to update systems) can complicate the process of updating maps and other large databases deployed in the field.
- Training responders to use new technologies presents significant challenges. Responders must be taken away from their daily duties, and motivating them to receive training can be hard, especially when responders do not see an immediate practical use for the training or technology.
- Incident reporting systems and other systems that collect field data provide useful information for developing future technology as well as for assessing current technologies and operations. However, the data provided by responders is sometimes of little or no value. Responders who do not see a connection between data and practical results are unlikely to invest the effort to ensure that the data entered is complete, accurate, and timely.
- The introduction of new information technology that makes users’ actions more readily observed or recalled raises concerns regarding exposure of those users to legal liability or to the professional risk of being second-guessed.

Panelists offered several comments regarding the growing use of data (in addition to voice) communications in disaster management:

- Systems that allow data to be accessed from the field are valuable in a number of settings. For example, access to effective directories of information could enhance the decision-making capabilities of first responders.
- Visual data such as pictures, video, and maps are increasingly complementing and being integrated with voice and text data. However, reliable voice communications are, and will continue to be, the unequivocal highest priority for the public safety community. For the firefighter entering a burning building or the police officer in a high-speed pursuit, for example, entering text or reading a screen would be distracting and dangerous.
- Wider adoption and use of data services will require cultural change. This process has already begun, as exemplified by police officers’ common use of mobile data terminals in patrol cars.
- Wider use of data communications will create new interoperability challenges relating to protocol, syntax, and semantics that go well beyond those associated with voice communications.

Several panelists underscored the importance of information technology for improved situational awareness and command and control, which were characterized as force multipliers
that would greatly improve what could be done with limited resources. They noted that:

- The importance of better situational awareness is illustrated by the observation that responders run the risk of becoming casualties themselves because they do not know enough about an incident scene when they arrive.
- A variety of situational awareness initiatives have been undertaken by various organizations. For example, the U.S. military has a long history of investment in information technology capabilities to provide situational awareness. The U.S. Forest Service’s Situation Awareness Firefighting Equipment (SAFE) program—which includes wearable computer, wireless communications, global positioning satellite, night vision, and software components—is another example.

Several panelists also discussed opportunities to employ sensors and other surveillance capabilities for disaster management. Their observations included the following:

- Sensor systems provide new opportunities to detect hazards and gather other vital information in a disaster. The widespread availability and use of open, Internet Protocol-based technologies makes it easier and cheaper to link already deployed sensors such as video cameras.
- Surveillance capabilities raise privacy and civil liberties concerns that those deploying these technologies will need to carefully address.
- Better sensors and better detection, analysis, and filtering technologies do not, however, obviate the need for humans to be in the loop. Indeed, it is generally believed that only humans, not IT systems, should issue warnings or take similar actions in a disaster.

Panelists also noted that existing and potential future technologies can improve the ability of responders to act in hostile environments and to extend where and when they can operate.
For example, better sensors and IT systems that make use of them could aid night operations and urban search and rescue.

TOPIC 2: RESEARCH DIRECTIONS FOR IT IN DISASTER MANAGEMENT

Five panels discussed current research programs and potential directions for new research in information technology. Commenting on the nature of those efforts and describing lessons learned, several speakers made the following observations:

- IT research for disaster management is of an applied nature, reflecting challenges unique to the application and often requiring interdisciplinary efforts emphasizing coordination and collaboration among researchers and practitioners. As a result, the National Science Foundation’s Digital Government program, which has supported work in disaster management, has employed atypical research management approaches.4
- Field research, which provides feedback and helps build community acceptance, is vital. Panelists cited the Disaster Management Interoperability Services program and the Biological Warning and Incident Characterization project as examples of programs that have had success in carrying out field research that involved the public safety community.
- Successful development is iterative. It is important to provide responders with initial

4 A reviewer of this report in draft form offered as another example of interdisciplinary work the Infrastructure Management and Hazard Response program in NSF’s Engineering Directorate, which seeks to integrate engineering, social, behavioral, political, and economic research.
prototypes to bootstrap the iterative process.

- Testbeds and exercises are particularly critical in developing IT for disaster management because they provide opportunities for feedback from actual users about critical requirements of responders that may not otherwise be apparent. In some cases, large-scale testbeds are required for understanding issues that emerge only at a large scale.
- Simulations present opportunities not only for training but also for observation and assessment of IT capabilities such as decision support tools.
- Operational facilities that permit instrumentation, experimentation, and iteration are needed. Instrumentation is important for both real and synthetic environments.
- The Department of Defense, which is confronting many of the same interoperability challenges that face the public safety community, is in the process of researching, developing, and implementing a network-centric approach to communications and information management that would overcome existing stovepipes among systems and organizations.

Several panelists also made a number of more general observations about what kinds of information technologies and research are appropriate in what circumstances:

- Different information technologies are appropriate in the various phases of the disaster management life cycle, i.e., preparation, response, mitigation, and recovery.
- Researchers tend to look for overarching themes, but experience has shown that it is critical, in applying IT to disaster management, to start with real problems faced by real end users, to find solutions, and then to work back from there to overarching themes. Starting with overarching themes will lead to dead ends and to unimplemented and unimplementable technology.
- False positives are the bane of any system providing critical functionality and will result in technology not being used.
- Even if it seems that a few false positives ought to be tolerated, the reality is often that false positives will not be tolerated, especially when the consequences are great.

Several speakers focused on the topic of wireless and mobile communications.5 They identified some general issues as well as areas of promising research and challenges related to this technology. Panelists noted several attributes of both commercial and governmental wireless technologies that are important in disaster management:

- In addition to being untethered, wireless communications are highly and dynamically reconfigurable without physical linking, which allows reprovisioning of communications infrastructure on the fly.
- Its dynamic nature makes wireless communication especially suitable for reaching areas not served well by fixed infrastructure, as well as places where the fixed infrastructure has been compromised or damaged.

As noted by panelists, several areas of wireless technology merit further research:

- Wireless communication is very challenging at the physical layer. Connectivity is often poor, and bit error rates are high. As a result, protocols that are robust and efficient in the

5 Some of the comments of Nader Moayeri (manager, Wireless Communications Technologies Group, National Institute of Standards and Technology), who spoke in a later panel session, are also included in this section.
face of disconnections are important. The tradeoff between high data rates with limited reception and low data rates with longer-distance reception results in part from a shadowing effect, in which obstacles make reception more difficult at higher frequencies.

- Power consumption and limits on battery capacity can significantly constrain the use of mobile communications.
- Of particular interest for disaster response is the fact that it is hard to communicate through metal-frame buildings or metal-containing debris—either low frequencies (with little capacity) or repeaters must be used. Options for deploying repeaters include pre-installing them in structures (like sprinklers) or having first responders leave them behind (like a trail of bread crumbs).
- Wireless networking is an active area of research. Ad hoc and mesh networking is being deployed and used today (especially by the Department of Defense), but there are many unsolved problems associated with complex wireless networks. For example, how can information be moved reliably through a complex network of unreliable nodes and links? How can such a network be set up and managed?
- Because network capacities are limited in comparison with those of wired networks, wireless networks are much more susceptible to overload if the wrong data is transmitted or is sent to the wrong people at the wrong time. Sending video to someone who does not want or need it not only distracts the human but uses up network bandwidth that cannot be used for something more useful. One approach is content routing, which attempts to move data to where it is needed for analysis or decision making without overloading wireless links. Another strategy is to anticipate the locations where many people will need to look at a particular piece of information, and then move that information to a local server for later asynchronous access.
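The content-routing idea described by panelists can be sketched, in highly simplified form, as a publish-subscribe filter: each message is delivered only to receivers whose declared interests match it, rather than broadcast over constrained wireless links. All names here (Message, Subscriber, the predicates and priorities) are invented for illustration; real content-routing systems make these decisions inside the network, not at a single routing function.

```python
# Sketch of content routing: forward each message only to subscribers
# whose interest predicates match, instead of broadcasting everything.
# All names and data are illustrative, not from any deployed system.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Message:
    topic: str       # e.g., "video", "hazard-report"
    priority: int    # higher = more urgent
    payload: str

@dataclass
class Subscriber:
    name: str
    wants: Callable[[Message], bool]   # interest predicate
    inbox: list = field(default_factory=list)

def route(msg: Message, subscribers: list) -> int:
    """Deliver msg only where it is wanted; return the delivery count."""
    delivered = 0
    for sub in subscribers:
        if sub.wants(msg):
            sub.inbox.append(msg)
            delivered += 1
    return delivered

# A command post wants everything; a firefighter's handheld accepts
# only high-priority, non-video traffic to conserve its wireless link.
command_post = Subscriber("command-post", lambda m: True)
handheld = Subscriber("handheld", lambda m: m.priority >= 2 and m.topic != "video")
subs = [command_post, handheld]

route(Message("video", 1, "live feed"), subs)         # reaches only the command post
route(Message("hazard-report", 3, "gas leak"), subs)  # reaches both

assert len(command_post.inbox) == 2
assert len(handheld.inbox) == 1
```

The key design point mirrors the panelists' observation: the filtering happens before data crosses the bandwidth-limited link, so unwanted video never consumes the handheld's capacity.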
- Network trustworthiness is important. If a network cannot be relied on, because of either technical or security problems, it will not be used. Improved network management capabilities are similarly important.
- Cognitive radios and other devices that hide complexity from users and move that complexity into the devices or the network have great promise, but they require more research.
- Wireless technologies can be used to provide location information independent of the Global Positioning System. Better techniques of that kind are needed for applications indoors, in dense urban environments, and so forth.

Several panelists spoke about general issues and promising research areas in information integration and data fusion. Several comments outlined the general problem area:

- A growing number and variety of sensors and other data sources are generating ever-larger volumes of data, including text, numeric, geospatial, and video data.
- Most information is manually processed today—meaning in practice that most data is ignored and is not analyzed even if it actually contains actionable information. Automation is essential to process, filter, and correct this flood of data and to present it as accurate and actionable information for human decision makers.
- Improved analysis, synthesis, and fusion of data will require progress on both syntax and semantics.

Several comments described particular research topics of interest in information integration and data fusion:

- Considerable research has been done in data semantics. Progress has been made on a number of fronts in several different research communities, but many challenges remain.
For example, given the past lack of success in forging agreement on global schemas, it is widely believed today that a multiple-schema approach is required, one that maps schemas to each other. A major technical challenge is how to reason about the relationships among what may be a large number of different schemas.

- A problem of particular interest for disaster management is the ability to fuse data from disparate sources. For example, how can geospatial data about the location of victims be integrated with online data about the location of medical facilities to provide information needed by emergency managers, first responders, and others?
- Adjudication management is an important area of potential value for data fusion. As information from multiple sources flows up to higher levels, a more complete picture can be created, enabling adjudication at the higher level to correct erroneous information that has arisen at lower levels. Adjudication also helps reduce the volume of information being pushed up, which can overwhelm decision makers.
- Another challenge is dynamic selection and combination of sources of data. Sources have varying quality and credibility, due to both human- and technology-related issues. Data may be redundant or contradictory. Data fusion has to advance to the point that multiple streams of heterogeneous and often confusing data can be converted into actionable information.
- Detection of change is an area potentially ripe for research. It has been examined primarily by database researchers concerned chiefly with detecting anomalies in database joins (a common database operation in which data from multiple sources is combined). With respect to disaster management, the interest is really detection of anomalies and determination of whether a change is significant.
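The victim-to-facility example raised by panelists can be sketched as a minimal fusion step: two independently sourced point datasets joined by great-circle distance. The coordinates, names, and record layout below are hypothetical, and a real system would also have to reconcile schemas, coordinate systems, and data quality as discussed above.

```python
# Minimal data-fusion sketch: pair each reported victim location with
# the nearest medical facility drawn from a separate data source.
# All coordinates and names are hypothetical.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

victims = [
    {"id": "V1", "lat": 38.90, "lon": -77.04},
    {"id": "V2", "lat": 38.85, "lon": -77.30},
]
facilities = [
    {"name": "General Hospital", "lat": 38.91, "lon": -77.03},
    {"name": "West Clinic", "lat": 38.84, "lon": -77.28},
]

def nearest_facility(victim, facilities):
    """Fuse the two datasets: nearest facility for one victim record."""
    return min(
        facilities,
        key=lambda f: haversine_km(victim["lat"], victim["lon"], f["lat"], f["lon"]),
    )

for v in victims:
    f = nearest_facility(v, facilities)
    print(v["id"], "->", f["name"])
```

Even this toy join illustrates the semantic issues panelists raised: it silently assumes both sources use the same datum and field meanings, which is exactly what differing schemas break in practice.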
Several panelists spoke about decision support:

- The type of technology useful for decision making varies depending on whether it is being applied in the phase of preparation for, response to, or recovery from a disaster. To create and deploy the right technology solutions, it is critical to understand the differences between these situations, as well as the different time lines for decision making. Strategic risk management involves discussion, analysis, and long time lines. Effective operational risk management requires doing something now, based on whatever information is immediately available.
- Decision support currently is focused largely on optimization, or determining the optimal plan for achieving a goal. Experience has shown (based primarily on Defense Department efforts) that decision makers need “good enough” solutions produced in the time available rather than optimal solutions that arrive too late.
- It is important that the implications of decision makers’ criteria for assessing outcomes be factored in and presented to them, to prevent the undesirable situation of having a solution that does a good job of meeting ill-advised criteria.
- Decision making is both planned and improvised. Sometimes, for example, the formal organizational structures prove too rigid and obstruct information sharing, prompting improvisation. One technical challenge is how to make it easier and more natural for people to improvise. Building such capabilities requires new models to predict and explain improvised roles, processes, and structures.
- It is critical to find the balance between what machines can do most effectively and what humans can do most effectively. That the balance may change over time as technology advances must also be understood and managed.
- Advances in visualization of information are critical. Humans process visual data very efficiently but text data slowly. (For a computer system, the reverse is generally true.)
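The contrast panelists drew between optimal plans that arrive too late and “good enough” plans available immediately can be illustrated with a simple greedy heuristic. The incidents, severities, and unit names below are invented for illustration; the point is only that the heuristic produces a usable assignment at once, with no guarantee of optimality.

```python
# "Good enough" decision support sketch: greedily assign scarce response
# units to incidents in descending order of severity. The plan is not
# guaranteed optimal, but it is available immediately -- the property
# panelists said matters most in operational response.
# Incidents and units are invented for illustration.

incidents = [("fire-A", 9), ("flood-B", 6), ("spill-C", 4)]  # (name, severity)
units = ["engine-1", "engine-2"]                             # fewer units than incidents

def greedy_assign(incidents, units):
    """Assign one available unit per incident, most severe first."""
    plan = {}
    available = list(units)
    for name, _severity in sorted(incidents, key=lambda i: -i[1]):
        if not available:
            break  # remaining incidents go unserved for now
        plan[name] = available.pop(0)
    return plan

plan = greedy_assign(incidents, units)
print(plan)  # fire-A and flood-B get units; spill-C waits
```

A full optimizer could weigh travel times and unit capabilities, but in an operational setting the greedy answer is on hand the moment it is needed, and it can be revised as better information arrives.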
- Context (including a user’s location, task load, and environment) is critical to decision support and situational awareness. What information people need, or the form in which to present it, cannot be determined in isolation if systems are to provide actionable information without overloading users. Even good information delivered at the right time may not be appropriate if it is delivered in the wrong way. An example that also illustrates the importance of sensitivity to context was provided by a panelist who noted that in a recent training exercise, Marines who were using personal digital assistants to work with map data rather than focusing on immediate dangers were the first ones to be “shot.”

Several panelists focused on sensors, sensor networks, and autonomous devices:

- Unmanned aerial vehicles, especially when combined with improved network communications, have tremendous promise because they can carry weather and other types of cameras and sensors to places that human responders cannot reach safely or at all.
- The distance between a remote device and the human interpreting the device’s information output to make decisions introduces major complications, with respect to both human processing and communications infrastructure. It can be very difficult to build and maintain situational awareness when information is delivered by remote devices, in what perceptual psychologists call a mediated presence. The brain tries to treat the information it is receiving as if it is being seen directly, which can introduce subtle mistakes. This so-called keyhole effect leads to a deconstructed environment for people trying to analyze the information they are receiving. Training can help alleviate, but will not eliminate, fundamental perceptual problems. These must be addressed by research that considers the entire data-information-knowledge cycle.
An important aspect of autonomous, remote-presence devices is that they are active, not passive. If people stop focusing on the task at hand and instead concentrate on managing the technology (e.g., driving the robot, pointing the camera), the result can be tunnel vision.

Sensor data may be especially useful for immediate response and mitigation efforts involving critical infrastructure, such as bridges. Deploying sensors so that they are in place in advance of an event implies integrating them into the design and maintenance cycles of the infrastructure, an effort that is beginning to happen but must become ubiquitous. Further research is required to optimize the deployment of sensors integral to critical infrastructure.

The military is an important source of lessons in how to build computer simulations that can incorporate sensor data. Methods for cost-effective virtual prototyping and virtual exercises are critical to advancing the state of the art. Although the military has done much work in this area, which should be leveraged as much as possible, making this type of research cost-effective for civilian disaster management is a challenge that will require innovative approaches.

TOPIC 3: COLLABORATION, COORDINATION, AND INTEROPERABILITY: PRESSING ISSUES IN A NEED-TO-SHARE WORLD

Two panels considered current and future approaches to interoperability and information exchange. Several presentations focused on interoperability and wireless infrastructure activities being undertaken at the state level. Panelists noted a number of major trends:

New organizational models are being adopted that balance the roles of state-level bodies,
which coordinate communications activities, and local agencies, which retain responsibility for most acquisition and deployment decisions.

Increasingly, states are building and operating statewide public safety communications networks. In some cases, public safety agencies must purchase the subscriber equipment (radios) used to access the network; in others, the state supplies the equipment at no cost. In some states access to the network is free, whereas others charge access fees. According to panelists, these efforts to get localities to acquire and use more interoperable equipment have been relatively successful.

Panelists also offered a variety of perspectives on lessons learned about how to achieve interoperability:

Regardless of the approach, major change will take many years. Systems that can be deployed in the short term to provide even limited capabilities to bridge existing communications systems are a useful interoperability tool.

Achieving the goal of widespread deployment of interoperable systems requires a long-term strategy for migrating from today's systems to the desired capabilities. The migration strategy should be developed and refined in consultation with all the relevant stakeholders.

Mandates, whether funded or unfunded, have not proved effective as a means to achieve interoperability. Effective approaches require considerable consensus building and leave as much autonomy as possible at as low a level as possible.

Several comments were made about the organizational and cultural challenges associated with increased sharing of information for disaster management:

Non-governmental organizations and other private entities (such as hospitals or operators of critical infrastructure) are increasingly seen as major players in disaster management.
Many of these organizations have significant amounts of relevant data and information that is currently made available for disaster management only in an ad hoc, unintegrated manner. Much work will be required to allow such non-traditional sources to supply data for use in emergency operations.

The reliability of sources is a critical issue. Data from a hospital, for instance, may be more reliable than an individual eyewitness report. Indicators of source reliability should be part of the data collected and distributed to emergency managers so that information can be assessed properly. The military adopted this practice long ago, but in current disaster management operations the reliability of data is largely not indicated, at least not in any systematic fashion.

Finally, several speakers focused on opportunities and challenges for building future interoperable networks. Their comments included these:

Adaptive technologies, such as cognitive radios that sense their environment and modify their frequency, waveform, and even their power consumption to fit the situation, will continue to evolve and play a growing role in public safety communications.

Insufficient availability of radio frequency spectrum remains a constraint on realizing future public safety and disaster management networks. A number of approaches may provide practical improvements in spectrum allocation. "Lights and sirens" priority access allows public safety users to signal other users aside, much as an
ambulance does with road traffic. Time- and spatial-sharing can be further exploited. Spectrum rights obtained in secondary markets could be made subject to preemption by public safety communications.

Envisioning, enabling, and building the networks of the future will be facilitated by the availability of test beds and simulation environments that allow disparate technologies to be designed, built, and tested. Test beds and simulation capabilities will also facilitate analysis of such technologies in relation to an information architecture and strategy.

Information systems are increasingly network-centric rather than hierarchical. In network-centric operations, information is shared horizontally, among tactical-level peers, as well as vertically up and down the chain of command. To move in this direction, the Department of Defense has had to overcome cultural obstacles and make a major investment in new technologies. Similar challenges can be expected for the public safety community.

Information systems that provide access to state and federal databases are of growing importance for public safety and disaster management. Databases of interest range from motor vehicle records to weather forecasts to public health information.

The availability of information in the form of Internet Protocol-based data will continue to be a driver for future networks. As the availability of data increases and its usefulness becomes better understood, networks will have to integrate and incorporate it.

The Web services model is catching on as a way of exchanging and integrating data for disaster management. The Disaster Management Interoperability Services toolkit being supplied to the disaster management community by FEMA provides a way of exchanging information among systems and organizations.
It is anticipated that this toolkit will help enable the building of a common operating picture during disasters.

A culture of information technology tool sharing, which will help make use of future networks more effective, is beginning to take root in the disaster management community but needs further nurturing. Public-private partnerships are an important part of this effort, as are laboratories and test beds where vendors, researchers, and public safety and emergency management organizations can work to integrate their products and services. Increased trust among the various stakeholders is critical if their IT systems are to be more closely integrated.
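The earlier remarks on indicating the reliability of sources can be made concrete. One long-standing military convention is the "Admiralty" grading scheme, which pairs a source-reliability letter (A through F) with an information-credibility number (1 through 6). The sketch below shows how such grades might travel with each report so emergency managers can weigh a hospital feed against an eyewitness call; the record layout and function names are invented for illustration, not the schema of any deployed system.

```python
from dataclasses import dataclass

# Grades follow the military "Admiralty" convention; both scales
# ascend from best to worst.
SOURCE_RELIABILITY = {
    "A": "completely reliable", "B": "usually reliable",
    "C": "fairly reliable", "D": "not usually reliable",
    "E": "unreliable", "F": "reliability cannot be judged",
}
INFO_CREDIBILITY = {
    1: "confirmed by other sources", 2: "probably true",
    3: "possibly true", 4: "doubtful",
    5: "improbable", 6: "truth cannot be judged",
}

@dataclass
class FieldReport:
    text: str
    source: str
    reliability: str   # key into SOURCE_RELIABILITY ("A".."F")
    credibility: int   # key into INFO_CREDIBILITY (1..6)

def rank_reports(reports):
    """Order reports so the best-graded ones surface first.

    An "A1" report sorts ahead of an "F3" report because letters
    and numbers both ascend from best to worst."""
    return sorted(reports, key=lambda r: (r.reliability, r.credibility))

reports = [
    FieldReport("Bridge deck cracked", "eyewitness call", "F", 3),
    FieldReport("ER at surge capacity", "county hospital", "A", 2),
]
ranked = rank_reports(reports)
```

Carrying the grade with the data, rather than discarding it at ingest, is what lets downstream consumers assess the information properly, which is the practice the panelists said disaster management operations currently lack.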