The Role of State-of-the-Art Technologies and Methods for Enhancing Studies of Hazards and Disasters
Technical and methodological enhancement of hazards and disaster research is identified as a key issue in Chapter 1, and computer systems and sensors are discussed in Chapter 2 as technological components of societal change having important implications for research on societal response to hazards and disasters. As summarized in Chapters 3 and 4, pre-impact investigations of hazard vulnerability, the characteristics and potential impacts of alternative hazards, and related structural and nonstructural hazard mitigation measures have been the sine qua non of hazards research. Post-impact investigations of disaster response, recovery, and related disaster preparedness measures have been the hallmark of disaster research. Indeed, post-impact investigations have been so prominent historically that special attention was given in the committee’s statement of task to offering strategies for increasing their value. Yet as highlighted in both Figure 1.1 and Figure 1.2, the committee believes that hazards and disaster research must continue to evolve in an integrated fashion. Thus, any discussion of state-of-the-art technologies and methods must ultimately be cast in terms of how they relate to this field as a whole.
Post-impact investigations inherently have an ad hoc quality because the occurrence and locations of specific events are uncertain. That is why special institutional and often funding arrangements have been made for rapid-response field studies and the collection of perishable data. However, the ad hoc quality of post-impact investigations does not mean that their research designs must be unstructured or that the data ultimately produced
from these investigations cannot be made more standardized and machine readable, or stored in accessible data archives. Having learned what to look for over decades of post-disaster investigations, social scientists can now realize the potential for highly structured research designs and replicable data sets that span multiple disaster types and events. As noted in Chapter 1, post-impact studies also provide a window of opportunity for documenting the influence of vulnerability analysis, hazard mitigation, and disaster preparedness on what takes place during and after specific events. However, pre-impact investigations of hazards and their associated risks are critically important on their own terms, less subject to the uncertainties of specific events, arguably more amenable to highly structured and replicable data sets, and no less in need of machine-readable data archives that are accessible to both researchers and practitioners.
So what has been referred to in Chapter 1 as “hazards and disasters informatics” (i.e., the management of data collection, analysis, maintenance, and dissemination) is a major challenge and opportunity for future social science research. This chapter begins with an overview of how social science research on disasters and hazards has been conducted in the past, and consistent with Figure 1.2, a case is made for the essential relatedness in chronological and social time of post-disaster and pre-disaster investigations. This section also illustrates the influence of changes in technologies and methods in hazards and disaster studies. Survey research is highlighted specifically in this regard because of its historical prominence within hazards and disaster research as well as mainstream social science. Consistent with the committee’s statement of task, the second section provides a specific discussion on the challenges of post-disaster investigations and ways to increase their value. The third section discusses “hazards and disaster informatics” issues such as dealing with institutional review boards (IRBs), standardizing data across multiple hazards and events, archiving resulting data so that they accumulate over time, and facilitating access to the accumulating data for those engaged in secondary analysis beyond the original researchers.
The fourth section provides examples of how state-of-the-art technologies and methods enhance hazards and disaster research and, in so doing, relate directly or indirectly to these informatics issues. Although this chapter cannot cover everything in what amounts to the very broad terrain of “nuts and bolts” research matters, special attention is given to increased use of computing and communications technologies, geospatial and temporal methods, statistical modeling and simulation, and laboratory gaming experiments. Sensitivity to the roles of these technologies and methods will help focus attention on hazards and disaster informatics issues and advance solutions to them. The chapter closes with specific recommendations for facilitating future hazards and disaster studies.
DOING HAZARDS AND DISASTER RESEARCH
In examining hazards and disasters through disciplinary, multidisciplinary, and interdisciplinary lenses and perspectives (see Chapters 3 to 6), social science researchers have used a variety of technologies and methods. They have employed both quantitative and qualitative data collection and data analysis strategies. They have conducted pre-, trans-, and post-disaster field studies of individuals, groups, and organizations that have relied on open-ended to more highly structured questionnaires and face-to-face interviews. They have used public access data such as census materials and other historical records from public and private sources to document both the vulnerabilities of social systems to hazards of various types and the range of adaptations of social systems to specific events. They have employed state-of-the-art spatial-temporal, statistical, and modeling techniques. They have engaged in secondary analyses of data collected during previous hazards and disaster studies when such data have been archived for this purpose or otherwise made accessible. They have run disaster simulations and gaming experiments in laboratory and field settings and assessed them as more or less realistic. As research specialists, hazards and disaster researchers have creatively applied mainstream theoretical and methodological tools, thereby contributing to their continuing development and use.
The Commonality of Hazards and Disaster Research
The technologies and methods of hazards and disaster research are indistinguishable from those used by social scientists studying a host of other phenomena (Mileti, 1987; Stallings, 2002). That is as it should be. However, the simultaneity of the core topics of hazards and disaster research within chronological and social time is a source of theoretical complexity, the consideration of which calls for creative applications of the most robust technologies and methods that are available. As noted in Chapter 1 (see Figure 1.2 and its related discussion), chronological time allows partitioning of collective actions by time phases of disaster events (pre-, trans-, and post-impact). The primary explanatory demands of hazards research in chronological time are to document interactions among conditions of vulnerability, disaster event characteristics, and pre-impact interventions in the determination of disaster impacts (see Chapter 3). The primary explanatory demands of disaster research in chronological time are to document interactions among disaster event characteristics, post-impact responses, and pre-impact interventions in the determination of disaster impacts (see Chapter 4). However, such straightforward partitioning in chronological time is not feasible with social time because, as discussed in Chapter 1, pre-, trans-, and post-disaster time phases become interchangeable analytical features of hazards
on the one hand and disasters on the other. In social time, in effect, the respective explanatory demands of hazards and disaster researchers become one and the same.
So in considering how state-of-the-art technologies and methods can enhance studies of hazards and disasters, there must always be sensitivity to the way specific applications and findings within disaster research inform applications and findings within hazards research and vice versa. For example, post-impact field interviews and population surveys seek data on “present” behaviors during the disaster, relationships between these behaviors and “past” experiences with hazards and disasters, and links between present behaviors and past experiences with “future” expectations of vulnerability. Pre-impact field interviews and population surveys seek data on relationships between past experiences with hazards and disasters, future expectations of hazard vulnerability, and links between these experiences and expectations with decisions to locate in harm’s way, adopt hazard mitigation measures, or engage in disaster preparedness. Pre- and post-disaster uses of public access data and other historical materials, as well as searches for unobtrusive data (e.g., meeting minutes, formal action statements, communications logs, memoranda of understanding, telephone messages, e-mail exchanges), are undertaken with these same objectives in mind. Computer simulations and gaming experiments are always subject to reality checks, and with respect to hazards and disasters, these checks are grounded in present behaviors, past experiences, and future expectations.
Thus, taking an integrated approach to research on disasters and hazards requires that any assumed impediments to data production during post-impact investigations—such as the ad hoc selection of events, special pressures of the emergency period, lack of experimental controls, difficulties in sampling population elements, and perishable data (see Stallings, 2002)—also be considered in terms of their consequences for hazards research. In the final analysis, it is because the explanatory demands of disaster and hazards studies are essentially inseparable that these impediments, whatever they may be, are of concern within this entire research community. Also, the impediments are not simply confined to doing either post-disaster or pre-disaster field research. They encompass the way data are collected, maintained, retrieved, and used for purposes above and beyond those of the original studies. The resulting informatics demands on state-of-the-art technologies and methods are major.
Influence of Technology on How Hazards and Disaster Research Is Conducted
Mainstream social science technologies and methods used to study hazards and disasters have changed over the years, and the role of technology has been singularly important. A useful illustration, because of its importance to hazards and disaster research, is technological change in the administration of social surveys. As summarized in Chapters 3 and 4, survey research has provided an excellent source of data for post-impact investigations of the physical and social impacts of disasters, as well as individual and structural responses to these impacts (i.e., disaster research). No less important, survey research has provided an excellent source of data for pre-impact investigations of vulnerability expectations, as well as individual and structural responses to these expectations (i.e., hazards research). Over time, therefore, in hazards and disaster research the survey has been increasingly recognized as a valid form of quantitative data collection (Bourque et al., 2002). Yet like all other methodological tools, the use of surveys is subject to technical, methodological, and societal changes that can affect, both positively and negatively, the ability to collect high-quality data.
Surveys of human populations threatened by hazards or actually experiencing disasters may be conducted using a number of different administration forms. They can be administered in face-to-face interviews, through telephone interviewing, or through self-administration of questionnaires. Each of these forms has its own merits and drawbacks, and new technologies are influencing the way they are implemented. Survey research has changed over the past three decades. In the 1970s, most surveys were administered using traditional face-to-face interviews or through mailed questionnaires. However, the near universal access to telephones by the 1990s made telephone interviewing a more attractive administration format. By 1998, 95 percent of U.S. households had telephones, with most of the remaining households having access to a phone. Telephone coverage is lowest in the South, with approximately 93 percent of households having a phone (Bourque et al., 2002). Moreover, the availability of computers and access to the Internet by the late 1980s and early 1990s for both the general population and, more notably, hazards and disaster management practitioners, has led to increased use of self-administered e-mail and web-based surveys.
Survey research has become increasingly difficult during the more recent past. Response rates for all forms of administration are dropping, and the costs of conducting survey research are increasing. More people live in gated communities, have guard dogs, have answering machines or caller ID, or live in a “cell phone-only” home. All of these trends, along with growing numbers of elderly and non-English-speaking immigrant residents in the general population of the United States (see Chapter 2), are affecting interview completion and response rates. Although rates of nonresponse of all types are increasing, this does not appear to increase bias in the studies (Tourangeau, 2004).
Certainly surveys have become more difficult to implement; however,
there have been significant changes in technology that have increased the choices of administration methods. In the 1970s, computer-assisted telephone interviewing became available. This methodology allowed researchers to load a questionnaire onto a computer from which interviewers could read and enter data directly into a database during the interview process. By the 1980s, similar systems for in-person interviewing became available. This methodology allows complex skip patterns to be programmed into the questionnaire and reduces the need for interviewers to find the correct question. It also eliminates the data entry step, creating a complete data set at the close of interviewing. However, this also means that paper interviews are not available for double entry of data. If errors are made in data entry during the interview process, there is no way to verify accuracy.
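The skip-pattern mechanism described above can be sketched in a few lines. This is a hypothetical illustration rather than an actual CATI system: the question identifiers, wording, and routing rules are invented for the example.

```python
# Minimal sketch of computer-assisted interviewing with programmed skip
# patterns (hypothetical questionnaire). Each question names the next
# question to ask, possibly conditional on the answer, so the interviewer
# never has to locate the right item, and answers land directly in a record.

QUESTIONNAIRE = {
    "q1": {"text": "Did your household evacuate?",
           "next": lambda a: "q2" if a == "yes" else "q4"},
    "q2": {"text": "Where did you go?", "next": lambda a: "q3"},
    "q3": {"text": "How long were you away?", "next": lambda a: "q4"},
    "q4": {"text": "Did you receive an official warning?",
           "next": lambda a: None},
}

def administer(answers):
    """Walk the questionnaire, recording answers directly into a data record."""
    record, qid = {}, "q1"
    while qid is not None:
        answer = answers[qid]          # in real CATI, typed by the interviewer
        record[qid] = answer
        qid = QUESTIONNAIRE[qid]["next"](answer)
    return record

# A respondent who did not evacuate skips q2 and q3 automatically.
skipped = administer({"q1": "no", "q4": "yes"})
```

Because the routing is executed by the program, the complete data set exists the moment interviewing ends; the trade-off noted above is that there is no paper trail against which to verify entries.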
By the time of the Second Assessment in 1994 (Mileti, 1999b), computers had gained widespread use, and access to the Internet had just taken off. With the rise of the Internet, e-mail surveys quickly became available. The earliest e-mail surveys consisted of questions typed into the body of an e-mail; when replying to the e-mail, the participant simply typed in his or her responses. Then in the 1990s, Web-based survey technology became available. In Web surveys, questions are programmed with response options. Although the methodology shows promise as a low-cost survey method, there are questions about its applications in academically sound research. While Internet access is increasing, coverage is not currently sufficient to sample the general population without significant bias. Furthermore, unlike telephone samples, no sampling frame currently exists for all people who have access to the Internet. As a result, a probability sample of all Internet users cannot be drawn at present.
Web surveys may be useful for specific populations in which Internet use is high and there is a list of users in a closed system, such as a university. It is also possible to utilize Web surveys in a mixed-mode fashion. For example, in a survey of health care providers in California regarding their training needs for bioterrorism response, a list of all licensed providers was obtained from the licensing agency in the state. A sample was selected from the list and mailed an invitation to log into a Web site to participate in the survey. Each invitation letter included a unique password so that responses could be tracked.
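The unique-password device described above can be sketched as follows. This is a hypothetical illustration; the function names are invented, and a production system would add safeguards such as token expiration and one-time use.

```python
# Hypothetical sketch: issuing unique, hard-to-guess passwords for mailed
# survey invitations so that each Web response can be tied back to a
# sampled provider without collecting names on the Web form itself.
import secrets

def issue_invitations(sampled_ids):
    """Return a mapping of one-time password -> sampled provider id."""
    invitations = {}
    for provider_id in sampled_ids:
        token = secrets.token_urlsafe(8)   # printed in the invitation letter
        invitations[token] = provider_id
    return invitations

def record_response(invitations, token, responded):
    """Mark a login as a completed response; unknown tokens are rejected."""
    if token not in invitations:
        return None                        # not part of the sample
    responded.add(invitations[token])
    return invitations[token]
```

Because each completed login is tied to a sampled unit, the set of providers who have not yet responded is known at any time, which supports targeted reminder mailings and honest response-rate reporting.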
Notwithstanding problems of administration, technically enhanced and highly structured survey research has been used increasingly to produce quantitative data about hazards and disasters. When combined with more traditional qualitative field research methods, geospatial and temporal methods, considerable use of public access data and historical records, and some simulation and experimental work, the picture that emerges over the past half century is one of an ever-expanding volume of data on hazards and disasters. The production of these data has been and will continue to be
facilitated by state-of-the-art technologies and methods within mainstream social science. However, the data being produced are largely not standardized across multiple hazards and disasters, not archived for continuing access, and underutilized once the original research objectives have been met. Therein lies the “hazards and disasters informatics” problem discussed in the third section of this chapter.
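To make the standardization problem concrete, here is a minimal sketch of what a machine-readable, cross-event study record might look like. The field names are purely illustrative assumptions, not an established archive schema.

```python
# Hypothetical sketch of a standardized, machine-readable study record.
# Field names are invented for illustration; an actual archive would use
# an agreed-upon metadata standard.
import json

REQUIRED_FIELDS = {"event_id", "hazard_type", "phase", "method", "collected"}

def validate_record(record):
    """Reject records missing the fields needed for cross-event comparison."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record missing fields: {sorted(missing)}")
    return record

record = validate_record({
    "event_id": "1994-northridge-eq",
    "hazard_type": "earthquake",
    "phase": "post-impact",          # pre-, trans-, or post-impact
    "method": "household survey",
    "collected": "1994-03",
    "n_respondents": 1200,
})
archived = json.dumps(record)        # machine readable, ready for an archive
```

The point of the sketch is that even a handful of required, consistently named fields would let studies of different hazards and events be compared, accumulated, and reused long after the original research objectives have been met.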
In 1954, a National Research Council (NRC) committee charged with writing a volume similar to this one gave highest priority to exploratory research to define major variables and discover trends (Williams, 1954). It is safe to say that in the ensuing 50 years that goal has been achieved through a host of descriptive and often comparative case studies. With that foundation, and through the National Earthquake Hazards Reduction Program (NEHRP) support during the past 25 years, the transition from descriptive work to the more integrated explanatory work demanded by Figure 1.2 is certainly well under way.
THE CHALLENGES OF POST-DISASTER INVESTIGATIONS AND INCREASING THEIR VALUE
Post-disaster investigations, especially the field work required for the collection of data on disaster impacts as well as activities related to emergency response and disaster recovery, are undertaken in widely varied contexts and often under difficult conditions. As suggested earlier, the selection of events to be studied is necessarily ad hoc. The timing and location of field observations are heavily constrained by the circumstances of the events themselves as is the possibility of making audio and video recordings of response activities. There are special constraints and difficulties in sampling and collecting data on individuals, groups, organizations, and social networks. Unobtrusive data such as meeting minutes, formal action statements, communications logs, Memoranda of Understanding, telephone messages, and e-mail exchanges are sometimes impossible to obtain, and so on.
Post-disaster investigations rely heavily on case studies (the “events”). These case studies have accumulated over time, providing incomplete albeit often sufficient data upon which to base theoretical generalizations about community and societal responses to disasters. In so doing, they have often confirmed and reinforced existing knowledge about response to disasters and hazards (including the continued existence of hazard exposure and specific vulnerabilities). In documenting planned as well as improvised post-disaster responses, they have shed light on hazard mitigation and disaster preparedness practices. In addition, they have served as experience-gaining and training mediums for hazards and disaster researchers.
While the analysis of hazards and disasters in social time requires a
historical perspective and research, post-disaster investigations can be characterized loosely by five principal and frequently overlapping chronological stages: (1) early reconnaissance (days to about two weeks), (2) emergency response and early recovery (days to about three months), (3) short-term recovery (three months to about two years), (4) long-term recovery and reconstruction (two to about ten years), and (5) revisiting the disaster-impacted community and society to document any other longer-term changes (five to at least ten years). Not all of these chronological stages necessarily require field research, and post-disaster investigations may not even take place in the stricken community. For example, studies of post-disaster national response, recovery, and public policy actions may best be completed in capital cities where decision agendas are established and resources are allocated. The level of funding, research foci, methods, availability of data, their quality, and the duration of the study vary greatly across these chronological stages, but resources permitting, the net long-term results can provide important advances in knowledge. Each chronological stage is described briefly below:
Early Reconnaissance: Although of primary interest to physical scientists and engineers because of their need to examine and collect data about the direct physical impacts of a disaster, this stage presents social scientists with opportunities to identify the physical causes of social impacts, learn from scientists and engineers about why and how such physical impacts occurred, observe and document emergency response and immediate relief operations on an almost real-time basis, and define potential responding individuals, groups, organizations, and social networks for more structured follow-on research.
Emergency Response and Early Recovery: Observing planned and improvised actions at the height of the emergency response stage provides knowledge about the analysis and management of disaster agent- and response-generated problems, the availability and allocation of local and externally provided resources, the types and effectiveness of individual and structural responses, and the transition from emergency responses (e.g., search and rescue) to early recovery (e.g., temporary shelter) activities.
Short-Term Recovery: Studying the evolution from the emergency response and early recovery stages to the short-term recovery stage is particularly interesting because researchers can identify more clearly the characteristics of key responding groups and organizations, how these social units influence decisions, and how short-term decisions (e.g., location of temporary housing) influence the allocation of resources for long-term recovery and reconstruction.
Long-Term Recovery and Reconstruction: During this period the sometimes permanent consequences of earlier decisions (or non-decisions) and the application of resources to implement them become visible to researchers, as do the disaster-related workings of the marketplace. It is then possible to reconstruct how the host of earlier commitments (or noncommitments) combined to shape the previously stricken area spatially, demographically, economically, politically, and socially. This stage also provides the opportunity to document how influential leaders, groups, and organizations have affected the outcomes and why.
Revisiting the Stricken Area: After significant time has passed (probably five years to more than a decade) and the disaster-related issues have receded largely from the public’s and decision makers’ agendas, post-disaster investigations of how the “new equilibrium” came to be and how and why the impacted social system is functioning the way it is can help researchers and users understand the anticipated, real, and unintended consequences of the full range of earlier decisions and their implementation. Research at this interval can include, for example, examining the effectiveness of mitigation/ loss prevention measures instituted after the previous disaster and understanding who benefited and who did not from the entire process.
Sometimes operating alone or in partnership with engineers, earth scientists, and representatives from other disciplines, social scientists have been part of the continuing history of post-impact investigations. Within the context of post-earthquake studies, it was the National Academies’ comprehensive study of the March 1964 Alaska earthquake that first included a fully integrated social science component (NRC, 1970). To varying degrees, this model was repeated for subsequent events, such as the National Oceanic and Atmospheric Administration (NOAA) study of the 1971 San Fernando, California, earthquake, and it continues to serve as a model for post-disaster investigations of earthquakes as well as other natural and technological disasters.
As noted in Chapter 1, post-disaster investigations have been seen historically as so important to advancing knowledge that special institutional arrangements have been made and special funding has sometimes been made available (particularly for earthquake research) to enable social scientists and other researchers to enter the field and collect perishable data or conduct more systematic research. As suggested in Box 7.1, support for post-impact investigations of willful disasters is now part of the funding mix at the National Science Foundation (NSF).
A possible model for enhancing the value of post-disaster investigations
National Science Foundation Support for Post-September 11 Research
The National Science Foundation’s (NSF) Division of Civil and Mechanical Systems in the Directorate for Engineering has a long history of supporting post-disaster investigations, particularly those induced by natural and technological hazards. For example, funding from NSF has enabled social science and engineering researchers to carry out post-disaster investigations to gather information (perishable data) that might be lost once the emergency period is over. Such research is funded through NSF’s Small Grants for Exploratory Research Program and with funds made available for rapid response research programs administered by the Earthquake Engineering Research Institute (EERI) and the Natural Hazards Research and Applications Information Center (NHRAIC). The largest of the latter efforts is EERI’s Learning from Earthquakes Program, whose funds are used to support multidisciplinary reconnaissance teams after significant earthquakes in the United States and overseas. NHRAIC’s activity, called the Quick Response Program, supports primarily social science investigations. All three of these NSF funding mechanisms were put into play after the September 11, 2001 attacks on the World Trade Center and the Pentagon, and the plane crash in Pennsylvania, resulting in important social science and engineering studies. Upon completion, the results of these studies were published in a book (NHRAIC, 2003). This book includes social science analyses of the disaster responses following the September 11 attacks, such as individual and collective actions and public policy and private sector roles, as well as engineering analyses of the physical impacts on structures and infrastructure. No less important, the book documents similarities and differences between the September 11, 2001 event and past disasters, offers policy and practice recommendations for willful and other kinds of disasters, and provides guidance for future research.
An appendix includes a list of awards for social science and other studies funded by NSF that were published in the book as well as other awards related to homeland security made directly by NSF or through NHRAIC in fiscal year 2002.
SOURCE: NHRAIC (2003).
of natural, technological, and willful disasters is the Earthquake Engineering Research Institute’s (EERI’s) Learning from Earthquakes Program (LFE). When federal funding through NSF became available to support field investigations of (primarily) earthquakes, such studies were small in scale, of very limited duration, and composed almost exclusively of engineers and earth scientists, and the dissemination of the knowledge gained was limited, for all practical purposes, to the earthquake engineering community. The paradigm shifted in 1973, resulting in more sustained federal support for post-disaster investigations and the inclusion of social scientists. The effectiveness of today’s LFE program can be traced directly to that paradigm shift. Within the normal constraints of NSF funding, combined with the support capabilities of other organizations (such as the Natural Hazards Research and Applications Information Center [NHRAIC], the three earthquake research centers, and independent researchers from universities, nonprofit and consulting organizations), the availability of principal investigators (who are expected to and do contribute their time) continues to advance research and knowledge about earthquakes and other types of disasters.
Drawn from its researcher and practitioner members, EERI (2005) has produced a thoughtful retrospective, The EERI Learning from Earthquakes Program. This retrospective captures succinctly LFE’s significant accomplishments during the past 30 years. Among those accomplishments are 11 subjects, identified and documented by the program, that have benefited directly from investments in social science post-disaster investigations. These 11 subjects have nearly universal application, extending beyond earthquakes to other natural, technological, and willful hazards and disasters. The initial four subjects include (1) strengthening research methods and broadening the mix of social science disciplines involved; (2) applying lessons learned to improve the development of loss estimates and their implications for planning scenarios, emergency operations plans, and training; (3) increasing the understanding of cross-cultural disaster impacts that have demonstrated both commonalities and differences related to key societal variables and levels of development; and (4) providing lessons learned that have been or are being applied to improve emergency response capabilities, recovery and reconstruction plans, search and rescue actions, understanding the epidemiology of casualties, measures to reduce life loss and injuries, managing large-scale shelter and temporary housing services, and organizing mutual aid programs.
The remaining subjects include (5) applying organizational response lessons learned to improve and standardize emergency response procedures; (6) developing clearer and more effective warning procedures and messages, a necessary component of improving warning system technologies; (7) applying lessons learned about fault rupture and other geologic hazards to land-use planning and zoning; (8) carefully examining the adaptive organizational and decision making processes involved in recovery; (9) understanding the need for and measures to organize and manage large-scale temporary shelter programs; (10) improving management related to the flow and on-site handling of inappropriate donations to impacted areas; and (11) adapting scientific data from instrumental networks to support real-time decision making and emergency operations.
It is notable that all of the above subjects relate directly to the social science research summarized in Chapters 3 to 6 of this report. One implication is very clear: The future development and application of social science knowledge on hazards and disasters depends heavily on implementing
recommendations included in these chapters. As noted in Chapter 5, some of these recommendations are disciplinary based, some involve interdisciplinary research among the social sciences, and some require interdisciplinary research that connects the social sciences with natural science and engineering fields. However, the planning and funding needed to implement these research recommendations must be sensitive (1) to the essential relatedness of post-disaster and pre-disaster investigations, (2) to the need for a cross-hazards and cross-societal approach, and (3) to addressing the hazards and disaster informatics issues discussed below.
THE HAZARDS AND DISASTERS INFORMATICS PROBLEM
Informatics refers generally to the management of data—from its original collection and analysis, to its longer-term maintenance, to ensuring its accessibility over time to multiple users. As noted in Chapter 1, the 2003 NEHRP plan (Department of the Interior, 2003) makes it clear that hazards and disaster informatics is an essential planning consideration. The plan speaks, for example, of the need for searchable Web-based data systems, but it is not precise about how these systems should be constructed, the kinds of data that should be included in them, when these data should be collected (pre- or post-disaster), where they should be stored, or how the demands for information from multiple audiences will be met. Hazards and disaster informatics, therefore, is an enormously significant problem. The problem is summarized below in terms of a series of specific trends and related issues: the changing conditions within which hazards and disaster research is conducted, dealing with IRBs, standardizing data across multiple hazards and events, accumulating and storing data, and providing user-friendly data access to researchers and practitioners.
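As a purely hypothetical sketch of the minimum a searchable Web-based data system implies, consider a catalog of study metadata that can be filtered by hazard type and time phase. The records and field names below are invented for illustration and do not describe any existing archive.

```python
# Hypothetical sketch: a searchable catalog over archived study metadata,
# of the kind a Web-based data system might expose. The records are
# illustrative entries, not descriptions of actual archived studies.

CATALOG = [
    {"event": "Northridge earthquake", "year": 1994,
     "hazard": "earthquake", "phase": "post-impact", "method": "survey"},
    {"event": "Hurricane Andrew", "year": 1992,
     "hazard": "hurricane", "phase": "post-impact", "method": "field interviews"},
    {"event": "Bay Area seismic risk study", "year": 1990,
     "hazard": "earthquake", "phase": "pre-impact", "method": "survey"},
]

def search(catalog, **criteria):
    """Return catalog records matching every supplied field exactly."""
    return [r for r in catalog
            if all(r.get(k) == v for k, v in criteria.items())]

# e.g., all pre-impact earthquake studies in the catalog
pre_eq = search(CATALOG, hazard="earthquake", phase="pre-impact")
```

Even this toy example shows why the standardization and archiving issues discussed in this section come first: a query across studies is only meaningful if the studies share consistently named and consistently coded fields.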
The Changing Environment of Research on Hazards and Disasters
Fieldwork remains fundamental to hazards and disaster research as this field enters the twenty-first century. Skillful field researchers continue to gain access to individuals, households, and representatives of organizations in the public and private sectors, and respondents more often than not want to be cooperative and helpful. The result is often an effective blending of field interviews, broader population surveys, spatial and temporal data, census materials and other public access information, and unobtrusive data. Such blending is essential to the development of knowledge about the five core topics of hazards and disaster research identified by this committee. Tierney (2002) notes, however, that six important societal trends—mostly challenging, but sometimes facilitating—are affecting the practice of fieldwork. The first of these, (1) human subjects regulations, is of such impor-
tance that it is discussed separately. The others include (2) legal complexities affecting social science research, (3) organizational perceptions of and attitudes toward social science research, (4) the significant expansion of post-disaster research activities, (5) increasing ethnic and gender diversity within the research community and among those being studied, and (6) the increasing professionalism of hazards and emergency management.
The increasingly litigious environment within the United States will continue to affect studies of hazards and disasters. Because researchers are potential sources of information about legal issues, they may become part of a larger pool of people named in complex, controversial, expensive, and lengthy court proceedings. Tierney provides several recent sobering examples from hazards and disaster research as well as mainstream social science and notes that “courts are increasingly faced with balancing the privilege offered to researchers and research participants with the needs of litigants, often to the detriment of the former” (Tierney, 2002:355).
The approach adopted by the National Institute of Standards and Technology (NIST) on use of research findings and reports in civil actions may have broad applicability to hazards and disaster research. Specifically, NIST’s recent draft report on structural and life safety systems at the World Trade Center (Lew et al., 2005) contains this disclaimer: “No part of any report resulting from NIST investigation into structural failure or from an investigation under the National Construction Safety Team Act may be used in any suit or action for damages arising out of any matter mentioned in such report (15 USC 281a; as amended by P.L. 107-231).” The National Construction Safety Team Act (P.L. 107-231, 15 U.S.C. 7301 et seq.) was enacted by Congress in 2002 as a direct result of the collapse of the World Trade Center. With respect to federally funded social science research on natural, technological, or terrorist-induced hazards and disasters, the NIST disclaimer merits careful consideration and for the same reason: to allow social scientists to conduct the best possible science.
Impression management is a related issue that affects organizational studies, in particular, because of the heightened mass media scrutiny that attends management of and accountability for hazards and disasters, and the possible importance of research findings for assessment of organizational performance (Tierney, 2002:359-362). An increasingly litigious environment and related concerns about impression management are exacerbated by the convergence of field researchers, particularly following disasters of significant magnitude and scope of impact (Tierney, 2002:362-365). Coordination becomes increasingly important to reduce the burden on disaster-impacted communities and regions, as does the need to communicate clearly the purposes and rationale of social science research.
On the more positive side, Tierney (2002:365-370) identifies gender and ethnic diversity as having significant implications for knowledge development and the capabilities of the disaster research community. As summarized in Chapters 3 to 6 of this report, the focus on gender, ethnic, and cross-cultural diversity has had several positive results. These include improved access to and reliable information from and about groups that were outside the mainstream of earlier hazards and disaster research, improved understanding of how hazards and disasters affect a broader spectrum of people, increased attention to the impacts on and roles of more informal and community-based groups compared to highly structured formal (particularly government) organizations, and (as discussed in Chapter 9) a more representative and capable research community.
Finally, one of the most interesting, albeit uneven, trends has been the increasing professionalism of emergency management during the latter decades of the twentieth century (Tierney, 2002:370-372). The largely ex-military background of emergency managers following World War II is explained in large part by the civil defense and Cold War orientations of the nation’s civil emergency management programs. In more recent decades, however, academic instruction and professional development activities have raised the level of knowledge of emergency management practitioners, provided opportunities for continuing education, created closer connections between hazards and disaster researchers and emergency management practitioners, and otherwise contributed to greater prestige and professionalism in the emergency management field. These kinds of developments facilitate access in pre-disaster as well as trans- and post-disaster contexts and increase communications and understandings about the purposes and rationale of social science research.
Dealing with Institutional Review Boards
The current requirements governing research on human subjects extend from experimental research and studies of “at-risk” populations under normal and controlled conditions to the messier, less structured, and often more fluid conditions encountered by hazards and disaster researchers. For some time the trend has been moving in the direction of defining most contacts in the field as being within the domain of human subjects regulatory procedures. This inclusion complicates the process of doing fieldwork and ensuring confidentiality, particularly during post-disaster reconnaissance studies, where highly formalized approaches to informed consent and confidentiality are inconsistent with the fluid, and often unstructured, data collection strategies and techniques that are required in these contexts (Tierney, 2002:353).
Protecting the rights of research participants and the formal necessity of informed consent have been the major historical issues in studies of human subjects since World War II. The experiments performed by the
Nazis during the war focused attention on the need to protect participants in research. Although initially concerned primarily with biomedical research, by the 1960s federal agencies had begun to consider the potential risks of sociobehavioral research. In May 1974, the Department of Health, Education, and Welfare issued regulations requiring review by an IRB of all funded research on human subjects. The IRBs were mandated to determine if research participants were at risk for harm; whether risks were outweighed by benefits (to the individual or society); whether the rights and welfare of research participants were adequately protected; and whether “legally effective informed consent” would be obtained (NRC, 2003c).
In January 1981, following concerns about the impact of the existing regulations on sociobehavioral research, a revised set of regulations was issued by the Department of Health and Human Services. These regulations narrowed the definition of human subjects and allowed certain broad categories of research to be exempted from IRB review or to be subject to an expedited review process. In 1991, the Common Rule was published, which again changed the requirements for exemption and expedited review, allowing IRBs to decide more easily not to exempt certain research from review. Since that time, a number of highly publicized tragic events associated with biomedical studies have occurred. These events underlie what many researchers believe is a tightening of restrictions on exemption and expedited reviews. An NRC panel convened to review the participation of human subjects in social and behavioral research identified three broad areas for improvement of IRB procedures: (1) enhancing informed consent; (2) enhancing confidentiality protection; and (3) improving the effective review of minimal-risk research (NRC, 2003c).
Hazards and disaster researchers are particularly affected by the definition of minimal-risk research. Anecdotal reports of researchers in the field include instances of research not being conducted because of the restrictions placed on researchers. Box 7.2 provides an illustration of the challenge.
IRBs may view research on hazards and disasters, and especially on terrorism, as inherently being of significant risk, thereby requiring full review of studies that otherwise would meet the requirements of an exempted study. Full review takes time and may limit the ability of field researchers to successfully gather potentially perishable data in the immediate post-disaster period. The issue of dealing with IRBs is of such significance that the committee has developed an explicit recommendation at the end of this chapter. Following the committee’s recommendation will not necessarily solve the problem, but it could lead to the development of workable guidelines that will be of educational value to hazards and disaster researchers and the IRBs that oversee their studies.
Impact of IRB Requirements
One example was a proposed study of the perceived effects of convergence behavior in hospital emergency departments following a well-publicized mass casualty event. Researchers proposed to conduct a study using anonymous self-administered questionnaires. The questionnaires were to elicit the respondents’ (professional staff in the emergency department) perceptions of the impact of the convergence of staff, the media, and patients’ families on the ability to respond to the event. While such research is generally considered exempt under the Common Rule, the local university IRB not only would not exempt the study, but required full review of the project. After three rounds of trying to meet the IRB’s changing requests, the researchers decided that too much time had passed since the event to effectively retrieve the perishable information from the respondents (NRC, 2003c).
Standardizing Data Across Multiple Hazards and Events
Over the years, there have been calls to standardize data collection across multiple hazards and disasters. This call was formalized in the previously discussed NEHRP plan (Department of the Interior, 2003). Specifically, the plan recommends that data collection strategies and instruments become standardized so that comparisons can be made over time and across earthquake events. The committee concludes that this formal call for standardization for earthquakes applies equally to hazards and disasters of all types, and to social science as well as natural science and engineering studies of them.
Interpreted from a cross-hazards perspective, the NEHRP plan implies that disasters of major significance can provide findings of relevance to the United States as a whole. Significance can be defined as events having relatively high magnitude and scope of impact (see Chapter 4, Recommendation 4.4). Such high-impact events ensure a presidential declaration of disaster and provide the opportunity to examine much more comprehensively the interrelationships among all dimensions of Figure 1.2 (i.e., interrelationships among conditions of vulnerability, event characteristics, pre-disaster interventions, and post-disaster responses as determinants of physical and social impacts). Yet to optimize the value of research on these relatively rare events, a more coordinated and integrated approach to research design, data collection and analysis, data archiving, and dissemination of findings is needed. Smaller-scale research on less severe but still locally damaging events certainly should continue because findings related to them remain valuable to researchers and practitioners.
The NEHRP plan recognizes that modern post-disaster investigations are far more complex and sophisticated than they were just a few decades ago when teams were small, often funded voluntarily by their members, and of short duration. It also recognizes the need to avoid overwhelming local contacts and organizations in the interest of learning about hazards and disasters, particularly in foreign settings when local officials and residents—many of whom might have experienced losses in the disaster—could be operating under very stressful conditions. Consistent with the above discussion of the changing environment of hazards and disaster research, the committee endorses these assessments. Indeed, the changing environment of research compels the coordination envisioned by the 2003 NEHRP plan.
A few other features of the plan deserve mention because they represent a better understanding of post-disaster contexts, data collection and archiving, and the importance of the timely dissemination of findings in multiple formats and media. Notably, the plan anticipates that studies of significant disasters will last about five years. Social science researchers have long been aware of this need, and at least for some recent earthquakes, such as Loma Prieta (1989), Northridge (1994), and Kobe, Japan (1995), support has been provided for longer-term social science research. Recognizing the need to expand traditional contexts and chronological time frames is consistent with Figure 1.2 and the research summarized in the preceding chapters of this report. The related challenges, of course, are ever-expanding data and the need for more standardized and predictable data collection, archiving, and dissemination. The committee encourages recognition of this informatics problem within the social sciences.
In this regard, the NEHRP plan lists several newly available technologies that can assist with the early collection, rapid transmission, and archiving of field data related primarily to building and lifeline performance. Less is said about the value of technologies and methods to support social science research. Nevertheless, the NEHRP plan recognizes the need to improve the quantity and quality of social science data through new, more standardized protocols on the socioeconomic and health impacts of hazards and disasters. The committee concurs that these improvements are essential for high-quality comparative research.
In sum, the 2003 NEHRP plan provides a focused statement on the need to address hazards and disaster informatics issues that are of central importance to the social sciences as well as the natural science and engineering fields. Following the guidance provided by the plan can help to optimize resources, achieve greater efficiencies, avoid duplication, minimize burdens on those being studied, and yield cumulatively greater comparative knowledge about hazards and disasters. To the greatest extent possible,
standardization of social science data collection is essential for achieving these objectives.
Attempts to standardize data collection efforts and instruments have occurred intermittently within the social science research community and with variable success. Most of the efforts have been made by individual researchers attempting to cross-validate their own studies over time. One example is the NEHRP-supported archival work of Kreps, Bosworth, and Webb over two decades on organizing and role enactment during the emergency periods of multiple types of events (Kreps, 1985; Bosworth and Kreps, 1986; Webb, 2002). Another example is NEHRP-supported work that responds to the long-standing call by disaster epidemiologists and medical personnel for standardization in collecting casualty data. Here work by Shoaf et al. (2000) involves efforts to standardize the collection and reporting of casualty data on earthquakes. The work (available at http://www.ph.ucla.edu/cphdr/scheme.pdf) recommends standards for data on the hazard, the building, and the person and, where possible, makes use of existing standards, such as the International Classification of Diseases manual for coding injuries (Shoaf et al., 2000). It also makes recommendations for expanding and otherwise improving protocols where existing coding schemes are not sufficient.
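The modular, machine-readable records that such protocols point toward can be sketched in miniature. The field names, vocabulary, and injury code below are illustrative assumptions for the sketch, not the actual Shoaf et al. (2000) scheme:

```python
from dataclasses import dataclass, asdict

# Illustrative shared vocabulary; a real protocol would fix such lists
# centrally so that records from different field teams stay comparable.
STRUCTURE_TYPES = {"wood_frame", "unreinforced_masonry", "steel", "concrete"}

@dataclass(frozen=True)
class CasualtyRecord:
    """One standardized record linking hazard, building, and person data."""
    event_id: str        # hazard event identifier (hypothetical format)
    structure_type: str  # building attribute, from the fixed vocabulary
    age: int             # person attribute
    injury_code: str     # ICD-style injury code (e.g., "S02.0")

    def __post_init__(self):
        # Reject records that fall outside the shared vocabulary.
        if self.structure_type not in STRUCTURE_TYPES:
            raise ValueError(f"unknown structure type: {self.structure_type}")

def to_row(record: CasualtyRecord) -> dict:
    """Flatten a record for export to a machine-readable archive."""
    return asdict(record)

row = to_row(CasualtyRecord("northridge_1994", "wood_frame", 47, "S02.0"))
```

Because every record passes through the same vocabulary check and export path, data sets from different events can be pooled and compared without per-study recoding, which is the practical payoff of standardization.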
While previous attempts to standardize social science data on hazards and disasters have generally been intermittent and not coordinated among respective individual researchers and teams working on the same or related topics, the potential for increased standardization in future research is enormous. The above two examples of standardization efforts highlight again a fundamental point: Knowing what to look for in studies of hazards and disasters enhances the possibility of developing modular protocols and data collection instruments.
As highlighted in Chapters 3 to 6, with NEHRP support a fairly solid knowledge base has developed on physical and social vulnerabilities and their associated risks (both objective and subjective) as well as the standard data requirements to produce critically needed loss estimation models. A great deal has been learned at the individual and household levels about risk communication, warning dissemination and response, evacuation, and other forms of protective action. The preparedness and response activities of disaster-relevant organizations have been the foci of post-impact investigations for decades, to the point that over time codification of knowledge has become increasingly possible. Findings at the multiorganizational response network level of analysis have expanded rapidly during the past two decades and they are based on highly structured methods and protocols. And while less is known about the behavior of firms, other community-based organizations, and intergovernmental relationships before, during, and after major
disasters, existing conceptual and methodological tools that have been used to study individuals, households, disaster-relevant organizations, and multiorganizational response networks can be readily applied to these related topics.
So the groundwork has been established through past social science research under NEHRP and other funding sources for standardizing data across multiple hazards and events. Figure 1.2 provides a useful conceptual framework for building on that foundation. However, individual researchers and teams engaged in studies of the same or related topics need to go beyond the traditional practice of reviewing and occasionally discussing one another’s papers and publications. For this to happen, structural mechanisms will be needed to focus, motivate, and support collaborative efforts to produce modular research designs and data collection instruments. As made clear in Chapter 5, such collaboration is essential at both intra- and interdisciplinary levels.
Creating and maintaining data archives have not heretofore been preoccupations of social science research on hazards and disasters. A notable exception with respect to post-disaster investigations is the Disaster Research Center archives. The center was founded during the mid-1960s at the Ohio State University and, since 1985, has based its research activities at the University of Delaware. At its founding, the leadership of the center made the decision to create archives of transcribed field interviews (from audiotapes) and documents from its post-disaster field research and then developed a rudimentary system of cataloguing and retrieving research materials on specific events. The transcribing of field interviews continued until the late 1970s, when it became too expensive; however, since then, audiotapes and documents have continued to be catalogued, stored, and made available to other researchers. The wisdom of that early decision at the Disaster Research Center is documented in Box 7.3 on NEHRP-sponsored secondary research using these archival materials. The archival materials discussed in the box were composed almost exclusively of unstructured field interviews and unobtrusive data until the mid-1980s, when survey research became a more prominent tool at the center.
Archiving highly structured population surveys is much easier to accomplish and the resulting data are much easier to work with. The availability of computer-assisted telephone interviewing (CATI) systems and computerized data entry programs allows for the rapid development of clean data sets that can be made available to both the original researchers and other researchers for secondary data analysis. For example, surveys from the Whittier Narrows and Loma Prieta earthquakes have been housed
Funded under NEHRP for nearly two decades (1982–2001), a series of secondary analyses using the Disaster Research Center (DRC) data archives have been completed by a research program at the College of William and Mary. The archives contain descriptions of planned and improvised post-impact responses to multiple types of disasters. The goal of the archival research program has been to extract qualitative descriptions from the DRC archives that allow for quantitative comparisons of organized responses, social networks that connect them, and the performance of post-disaster roles within organized responses and social networks (see Kreps, 1985, 1991b, 1994; Bosworth and Kreps, 1986; Saunders and Kreps, 1987; Kreps and Bosworth, 1993; Noon, 2001; Webb, 2002). The starting point for the William and Mary research program was the DRC typology of organized disaster responses (Dynes, 1970). That typology distinguishes organizations that are expected to be involved post-impact (established and expanding) from other existing organizations whose involvement is not expected (extending), and from completely new organizations (emergent) whose involvement is totally ad hoc.
The William and Mary research program has employed a structural code and logical metric to measure the origins of emergent organizations as falling along a continuum of formal organizing to collective behavior. The structural code and related metric have also been used to describe the restructuring of existing organizations (established, expanding, and extending) as well as social networks among all four types of organized responses in the DRC typology. Additionally, the research program has developed a methodology to isolate individual role behaviors in organizations and social networks as either consistent or inconsistent with pre-disaster positions, as either continuous or discontinuous with pre-disaster relationships among positions, and as performed either conventionally or improvised.
Both the findings and the methodology of William and Mary archival research have drawn the interest of researchers from the Rensselaer Polytechnic Institute and the New Jersey Institute of Technology who have expertise in disaster research, information science, and decision science. These researchers have three primary interests: first, studying the dynamics of conventional and improvised role enactments during the emergency periods of disasters; second, applying state-of-the-art communications technologies to advance archival methods for analyzing post-disaster roles within organizational and social network contexts; and third, using these advanced archival methods to develop simulations and other decision support tools for emergency management practitioners. These tools can both increase practitioner understanding of post-impact improvisations and improve their ability to plan for improvisation prior to impact (Mendonca and Wallace, 2002, 2004). Maximizing the utility of decision support tools in the future will require standardized data collection protocols and data archiving on, in particular, the responses of established (e.g., law enforcement agencies, fire departments, hospitals and public health agencies, public utilities, departments of public works, military units, mass media) and expanding (e.g., emergency management agencies, Red Cross, Salvation Army) organizations from the original DRC typology, whose involvement is expected in natural, technological, and willful disasters.
at the Earthquake Engineering Research Center Library at the University of California, Berkeley and the Social Science Research Archive at the Institute for Social Science Research, University of California, Los Angeles. Such archiving of general population surveys is consistent with mainstream social science practice. Standardized population-based surveys are particularly useful for examining individual perceptions and behaviors (e.g., perceptions of community vulnerability and personal risk, individual and household preparedness and mitigation measures, sources and uses of warning information, evacuation and other types of protective action, estimates of damages, uses of disaster services, support from relatives and friends). Of essential importance, these survey data are quantified and therefore amenable to comparisons with other quantitative data, using geographic information system (GIS) and other state-of-the-art technologies, on the spatial and physical features of impact (e.g., the proximity of households, neighborhoods, and census tracts to areas of varying physical impact) (Bourque et al., 2002).
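A stripped-down sketch can illustrate the kind of comparison Bourque et al. describe: joining survey records to zones of physical impact by location. Real analyses would use GIS software and actual damage or shaking-intensity layers; the coordinates and circular zones below are invented for illustration.

```python
import math

# Invented survey points (coordinates in km from a nominal epicenter) and
# circular "impact zones" standing in for real GIS damage layers.
households = [
    {"id": "h1", "x": 0.2, "y": 0.1, "prepared": True},
    {"id": "h2", "x": 3.0, "y": 4.0, "prepared": False},
    {"id": "h3", "x": 0.5, "y": 0.4, "prepared": False},
]
impact_zones = [
    {"name": "moderate", "cx": 0.0, "cy": 0.0, "radius": 5.0},
    {"name": "severe", "cx": 0.0, "cy": 0.0, "radius": 1.0},
]

def classify(hh, zones):
    """Assign a household to the smallest impact zone that contains it."""
    for zone in sorted(zones, key=lambda z: z["radius"]):
        dist = math.hypot(hh["x"] - zone["cx"], hh["y"] - zone["cy"])
        if dist <= zone["radius"]:
            return zone["name"]
    return "outside"

# The spatial join: each survey record gains an impact-zone attribute,
# after which preparedness, damage, and other survey variables can be
# compared across areas of varying physical impact.
joined = {hh["id"]: classify(hh, impact_zones) for hh in households}
```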
Issues of standardization and data archiving, when combined, pose perhaps the most significant informatics challenge facing social science hazards and disaster research. Simply put, there historically has been a lack of attention to standardizing quantitative or qualitative data and a lack of support for archiving these data in an orderly way for short- and longer-term uses. The result is a lack of availability of and access to useful information (Thomas, 2001; Goodchild, 2003). For some important problem areas, data are not available in a form that is of use to the research community. Perhaps the most significant case in point is the lack of consistent and standardized data on economic losses attributed to natural and technological hazards and disasters in the United States. We simply do not know with any certainty what hazards and disasters cost this nation on an annual basis. Further, we do not have a standardized reporting method for losses (nor a clear or consistent definition of what “loss” means), despite repeated attempts to develop one (NRC, 1999). Missing as well are archives of general population surveys and field research data on what the committee has termed the hazards and disaster management system.
From Data Standardization and Data Archiving to Data Sharing
Plans and strategies related to the output functions of hazards and disaster informatics are no less important than those related to its inputs. It is reasonable to assume, however, that future advances in data standardization in hazards and disaster research will compel the application of technical tools to support management of archives and mining data from them. Much can be learned about these functions from ongoing research and development activities in the physical and life sciences, in engineering, and in interdisciplinary work in computational science (e.g., software solutions
and professional services that support extraction of data, visual imaging, and Web browsing). Bioinformatics issues, broadly defined, have become sufficiently important within the life sciences that the National Academies has focused attention on them within the context of future research and development initiatives (see, for example, National Research Council [NRC] 2003a, 2002c). The growing technical capabilities and required bandwidth for data transmission through the Internet certainly will facilitate data sharing efforts within all natural science, social science, and engineering fields if related administrative and policy issues can be resolved.
The informatics issues of data standardization, archiving, and sharing are generic, as are potential solutions to them. The solutions developed collaboratively by researchers lead inevitably to questions of how best to disseminate findings from primary researchers and secondary data analysts to management professionals at national, state, and local levels. The dissemination issue is of sufficient importance to the committee’s charge that Chapter 8 is devoted to its consideration. The technical capabilities to disseminate findings in more “user-friendly” ways and through multiple media will continue to increase.
RELATIONSHIP OF STATE-OF-THE-ART TECHNOLOGIES AND METHODS TO HAZARDS AND DISASTERS INFORMATICS ISSUES
This section considers four state-of-the-art technologies and methods that relate directly or indirectly to the above hazards and disasters informatics issues: computing and communications technologies; geospatial and temporal methods; modeling and simulation; and laboratory or field gaming experiments.
Computing and Communications Technologies
Much of the change in qualitative data collection in hazards and disaster research has resulted from improvements in audio and video recordings of data collected in the field. High-fidelity microphones and the ability to digitally record images and sounds have become accessible to all researchers in the last few years. Video and audio data can provide all of the details collected from key informant interviews and, when they are feasible or required by circumstances, focus groups of respondents. They also allow for matching verbal statements with nonverbal cues of research participants. In addition, there is the possibility of gathering data without the presence of a researcher, who potentially can bias the responses of interviewees or focus group members.
As described below in the section on gaming experiments, both audio and video recordings can also be made of participant responses to laboratory or field experiments. New qualitative analysis software such as ATLAS.ti (ATLAS.ti Scientific Software Development GmbH, 2002) and Qualrus (Idea Works, Inc., 2003) allow for effective use of these enhanced forms of data processing. With these existing and pending new versions of qualitative analysis software, researchers can build highly structured protocols for text data, such as transcribed interviews and documents in the Disaster Research Center archives discussed in Box 7.3 (Mendonca and Wallace, 2004). Such protocols can also be applied to video and audio data that are exclusively in these forms. Existing software also allows for exporting coded data into state-of-the-art statistical software packages. Using these same statistical packages, the potential then exists to integrate highly structured visual and audio data with highly structured data produced through general population or subpopulation surveys as discussed earlier in the chapter. In effect, the technical and methodological means to merge qualitative and quantitative information in standardized data sets is substantial, and this potential exists for both pre-disaster and post-disaster investigations. By employing computing and communications technologies, such standardization also facilitates solutions to data archiving, mining, and transmission issues.
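A minimal sketch of that merging step, using hypothetical code labels and survey fields: coded interview segments, as they might be exported from qualitative analysis software, are tallied per respondent and joined to quantitative survey records to form one rectangular data set.

```python
from collections import Counter

# Hypothetical (respondent_id, code) pairs exported from qualitative
# coding of transcribed interviews.
coded_segments = [
    ("r1", "warning_received"), ("r1", "evacuated"),
    ("r2", "warning_received"), ("r2", "warning_received"),
]

# Hypothetical quantitative survey records for the same respondents.
survey = {
    "r1": {"household_size": 4, "damage_estimate": 12000},
    "r2": {"household_size": 2, "damage_estimate": 0},
}

CODES = ("warning_received", "evacuated")  # assumed coding scheme

def merge(segments, survey_rows):
    """Tally qualitative codes per respondent and join them to survey data."""
    tallies = Counter(segments)  # counts each (respondent, code) pair
    merged = {}
    for rid, row in survey_rows.items():
        merged[rid] = dict(row)
        for code in CODES:
            merged[rid][f"n_{code}"] = tallies[(rid, code)]
    return merged

combined = merge(coded_segments, survey)
```

The resulting table treats code frequencies as ordinary quantitative variables alongside survey responses, which is what makes the merged data amenable to standard statistical packages.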
Possibly the greatest influence on both researchers and the population as a whole during the past three decades has been access to computer technology and the Internet. Indeed, changes in computation and communications are arguably among the most rapidly diffusing technologies in America. In the year 2000, for example, 51 percent of households in the United States had access to a computer in the home, compared to only 8 percent in 1984, the first year the question was asked in a U.S. Census Bureau (2001) Current Population Survey. Today, Internet access is practically synonymous with computer access, with nearly 42 percent of households reporting Internet access at home. Households with children are the most “plugged in,” with two-thirds having computers and 53 percent having Internet access. With telephone and cable companies offering access to DSL and broadband Internet, more and more homes have high-speed access to the Internet, allowing more complex forms of information to be accessed (e.g., streaming video).
Changes in computing and communications technologies during the past decade are not simply a matter of increased access. There is also greater computing capacity in smaller and smaller computers. The handheld computers of today, whether powered by a Palm operating system or Windows CE, are as powerful as desktop computers were 10 years ago. Compact memory cards and USB memory drives allow large amounts of data to be stored and easily transferred from computer to computer. Likewise, advances in microprocessor technology have resulted in improved digital imaging as well as audio and video recording. As discussed earlier, all of these advances have improved the ability to conduct research and have greatly facilitated more highly structured data collection in the field. Such technologies also increase enormously the ability to archive, mine, and transmit data among researchers.
Access to the Internet has increased the speed with which field reports become available to other researchers and the general public. The Quindio, Colombia, earthquake of January 1999 was one of the first times that an EERI Learning from Earthquakes (LFE) reconnaissance team filed its initial report from the field. It is now standard practice for field reports to be sent back to research centers from the field via the Internet.
In addition, there have been advances in wireless communication technologies. In 2001, more than 62 percent of Americans owned a cellular phone. Additionally, cellular telephone coverage is becoming ubiquitous in even the least developed countries. Indeed, in less developed countries cellular telephones are popular because people do not have to wait for the installation of standard national telephone services (McFarland, 2002).
A definite asset in conducting research is the almost universal coverage of cellular service and the capability of the newest phones to operate on multiple network formats. This capability allows U.S. researchers to maintain phone service under a single number across the country and internationally. Wireless communication technology has also affected the computing world: it can now be built into notebook computers to take advantage of the more than 25,000 publicly available wireless access points. This wireless access allows the transfer of information from a remote location to other researchers and to centralized data storage points. Wireless computers can connect through publicly available wireless access points, through cellular telephones, or through similar technology built into a wireless modem. Although wireless technology is still limited by the number of access points or the location of cell sites, as cell sites increase so will the usefulness of wireless computing.
Geospatial and Temporal Methods
As noted throughout this report, hazards exist and disasters occur in chronological time and physical space. Whereas maps have been the traditional manner by which geographers represent things in physical space, a new definition of mapping suggests that it allows for more than just placing things on maps; more basically, it allows for understanding the spatial nature of things (Edson, 2001). Spatial analysis is the term used to describe a set of tools and methods for examining the patterns of human activity or physical processes as well as movements across the Earth’s surface (Hodgson and Cutter, 2001:50). In addition to statistical analysis and mathematical modeling, mapping (cartography) and GIS are the tools most commonly used in spatial analysis. Their use is equally relevant to pre-, trans-, and post-disaster investigations; they promote the development of standardized protocols on hazard vulnerability and disaster impacts; and they yield data that can be stored, merged, and disseminated electronically.
GIS is a rich set of tools that can be used for collecting, analyzing, storing, and displaying geographic data. All data must be georeferenced; that is, they must possess some locational attribute such as a coordinate (longitude/latitude) point, a polygon (such as a census tract), or a line (such as a road). Diverse data can then be combined by overlays to see the relationship between the two layers or the many layers that are included in the GIS. The simplest version is the construction of a data layer of housing properties overlain with a data layer depicting the 100-year flood zone to see which properties are inside or outside the zone for a given community. The Federal Emergency Management Agency’s (FEMA’s) HAZUS (NIBS-FEMA, 1999) is a GIS-based decision support tool that helps identify potential losses from a number of different scenarios.
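The flood-zone overlay just described reduces, at its simplest, to a point-in-polygon test on georeferenced data. The sketch below illustrates that operation with a standard ray-casting algorithm in pure Python; the property names, coordinates, and rectangular "flood zone" are invented toy values, and a real GIS would of course handle projections, polygon holes, and edge cases that this sketch ignores.

```python
# Sketch of the flood-zone overlay described above: a ray-casting
# point-in-polygon test decides whether each georeferenced property
# falls inside a 100-year flood zone polygon. All coordinates are toy values.

def point_in_polygon(x, y, polygon):
    """Return True if (x, y) lies inside the polygon (list of (x, y) vertices)."""
    inside = False
    n = len(polygon)
    j = n - 1
    for i in range(n):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Does a horizontal ray cast rightward from (x, y) cross edge (j, i)?
        if (yi > y) != (yj > y):
            x_cross = (xj - xi) * (y - yi) / (yj - yi) + xi
            if x < x_cross:
                inside = not inside
        j = i
    return inside

flood_zone = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]  # simple rectangle
properties = {"12 Oak St": (1.0, 1.0), "98 Hill Rd": (6.0, 2.0)}
at_risk = [name for name, (x, y) in properties.items()
           if point_in_polygon(x, y, flood_zone)]
print(at_risk)  # ['12 Oak St']
```

Each additional data layer in a GIS overlay is, conceptually, another such spatial predicate joined on location rather than on a shared key.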
GIS is widely used in some hazard-prone areas, among them the reverse 911 notification system (E-911) and wildfire hazards monitoring. Increasingly, GIS is becoming the preferred tool for vulnerability assessments and other hazards modeling applications. As noted earlier (Radke et al., 2000; Cutter, 2003a), there are a number of areas in which geographic vulnerability science can enhance the hazards and disasters research community. These include better temporal and spatial estimates of tourists, homeless people, and undocumented workers; better integration of physical processes and social data to predict hazard impacts; and interoperability, whereby data in a variety of formats can be easily shared and exchanged by various systems in a highly decentralized and distributed environment.
The advent of GIS has increased the ability of researchers to study the spatial nature of hazards and their relationship to human populations. Spatial data can be gathered through other means, however. Aerial photography has been used in post-disaster situations to visualize changes
in topography and geography from pre- to post-disaster. An example is the pre- and post-impact comparisons of aerial photography to measure impacts such as those from Hurricane Andrew (Hodgson and Cutter, 2001; Ramsey et al., 2001) or the monitoring of heat from the debris pile at the World Trade Center using thermal sensors (Greene, 2002).
A newer trend is to utilize satellite-based technology to visualize changes associated with disaster impacts. Technologies such as light detection and ranging (LIDAR) and radar can be used to visualize impacts and provide spatial data. Probably their most important attribute is the “ability to quickly gain an overview understanding of the extent of damage” (EERI, 2005:20), so that social scientists can identify geographic areas of interest or types of damage that probably resulted in significant social impacts worthy of study. In recent articles (Adams et al., 2004a, 2004b, 2004c), several examples were provided, including the Bam, Iran, earthquake; the Niigata, Japan, earthquake; Hurricane Charley; the catastrophic Indian Ocean earthquake and tsunami; the World Trade Center; and the search for the Columbia Space Shuttle wreckage. Moreover, such remote sensing technologies are enhancing emergency preparedness activities and related loss estimation and decision support software tools.
Remote sensing technologies have been used in studies of many of the foreign earthquake events of the past five years. To be most effective, it has been noted, standardized damage scales need to be developed to ensure consistent interpretation of remotely sensed data and images. With standardized scales, these technologies can be used to identify areas of significant damage, collapsed structures, inundation zones, and areas of utility outage, and to estimate mortality (based on building damage) (Eguchi, 2005). Post-disaster investigations, especially those occurring during the reconnaissance stage, therefore benefit from state-of-the-art spatial technologies and methods. Real-time data from earthquakes, for example, when translated into ground-shaking maps, can allow identification of the most likely areas of serious impacts.
As illustrated in Box 7.4, GIS and remote sensing technologies and methods have powerful applications. However, with increased access to and usability of these tools, technologies, and methods, the risk exists of inaccuracies being promulgated. One significant constraint on the effective use of maps is the availability and quality of data being utilized. Two characteristics of data inputs are required in analyzing hazardous conditions and disaster events: a temporal dimension and a geographic or spatial dimension. The type of spatial data required depends on the chronological time phase of the application (e.g., an immediate post-impact response versus a longer-term reconstruction response).
In looking at the spatial nature of hazards, the scale, resolution, and extent of data are equally important. Map scale is the relationship between
GIS and Remote Sensing Technologies
Using an innovative approach with geographic information system (GIS) and remote sensing technology, the LandScan global population project has developed a population distribution model that produces the finest resolution population distribution data available for the entire world and the continental United States (Bhaduri et al., 2002). LandScan global at 1 km resolution represents an “ambient population” (average over 24 hours) and is 2,400 times more spatially refined than the previous standard. The LandScan population distribution model involves collection of the best available census counts (usually at subprovince level) for each country and four primary geospatial input data sets—namely, land cover, roads, slope, and nighttime lights—that are key indicators of population distribution. Relationships between any of these datasets and population distribution are not globally uniform. For each region, the population distribution model calculates a "likelihood" coefficient for each LandScan cell, and applies the coefficients to the census counts, which are employed as control totals for appropriate areas. Census tracts are divided into finer grid cells (1 km), and each cell is evaluated for the likelihood of being populated based on the four geospatial characteristics. The total population for that tract is then allocated to each cell weighted to the calculated likelihood (population coefficient) of being populated.
As an expansion of global LandScan, very high-resolution (90 m cell) population distribution data (LandScan USA) are being developed for the United States. LandScan USA includes nighttime (residential) as well as daytime population distributions. LandScan USA is more spatially refined than block-level census data and includes demographic attributes (age, sex, race). Locating daytime populations uses a modeling approach that draws not only on census data, but also on other socioeconomic data, including places of work, journey to work, and other mobility factors. Hourly population distributions at the 90 m cell level have been developed for several major metropolitan areas. The combination of both residential and daytime populations will provide significant enhancements to geospatial applications ranging from homeland security to socioenvironmental studies.
SOURCE: http://www.ornl.gov/sci/gist/; Bhaduri et al. (2002).
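The core allocation step in the LandScan model described in Box 7.4 can be sketched briefly: a tract's census count serves as a control total that is distributed across its grid cells in proportion to each cell's likelihood coefficient. The coefficients below are invented toy values, not actual LandScan outputs, and the real model derives them from land cover, roads, slope, and nighttime lights.

```python
# Hedged sketch of LandScan-style dasymetric allocation: a census tract's
# control total is split across its grid cells in proportion to each cell's
# "likelihood" coefficient. Coefficients here are invented toy values.

def allocate_population(tract_total, cell_likelihoods):
    """Distribute tract_total over cells weighted by likelihood coefficients."""
    total_weight = sum(cell_likelihoods.values())
    if total_weight == 0:
        return {cell: 0.0 for cell in cell_likelihoods}
    return {cell: tract_total * w / total_weight
            for cell, w in cell_likelihoods.items()}

# Four 1-km cells in one tract; e.g., a cell on a lit road scores higher
likelihoods = {"cell_a": 0.6, "cell_b": 0.3, "cell_c": 0.1, "cell_d": 0.0}
cells = allocate_population(10_000, likelihoods)
print(round(cells["cell_a"]))  # 6000
```

Because the weights are normalized, the allocated cell values always sum back to the census control total, which is what makes the census counts usable as control totals in the first place.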
the length of a feature on the map and its length on Earth (Hodgson and Cutter, 2001). Many people mistake spatial scale for another dimension of data, spatial resolution. Spatial resolution is the observational or collection unit (e.g., county, census tract, individual). Finally, spatial extent is the area covered by the study (e.g., city, entire nation, world). The choice of spatial characteristics is driven by the research problem: for some problems, more detailed and fine-grained analyses (based on individual observations within one community) are more appropriate than larger, more generalized analyses
such as those that concentrate on the hazard characteristics of counties (resolution) for the entire United States (spatial extent).
The chronological nature of spatial data involves two important concepts: frequency and lag time. Data often become “old” as the time increases between when they were first collected and when they are ultimately used. Real time or near real time refers to data with no discernible lag time; that is, receipt of the data is almost instantaneous with its collection. The use of sensors to provide data on traffic flows, or of Doppler radar to identify tornado winds, is a good example of real-time or near-real-time application. The frequency of data is another characteristic that has implications for social science research. For example, surveys about hazards and disaster experiences and expectations, as both relate to mitigation activities, heretofore have been done infrequently. Although post-disaster field surveys are done in greater numbers, their frequency depends on the uncertainties of event occurrence. Post-hurricane evacuation behavior surveys are normally, but not always, conducted after major hurricane landfalls.
An example of a frequency concern is the decennial census. Population and housing data are essential for modeling populations and infrastructures at risk from hazards, yet these data are only collected every 10 years (frequency). At the same time, there is often lag time between when they were collected (e.g., 2000) and when they become available for use (e.g., 2002). Thus, data that represent the social or demographic situation in 2000 (the census year) may or may not be applicable to a community in 2005, especially in areas that have experienced rapid growth. Given this time lag, communities often resort to population projections in producing demographic profiles.
The temporal characteristics of data influence the types of research questions that can be addressed. A good example is data production with remote sensing technologies. Remotely sensed data are most often used for purposes of pre-event threat identification (e.g., identification of hurricanes in the mid-Atlantic) and post-event rescue and relief operations. While the collection of remote sensing data can be scheduled on demand, the lag time required for processing such data may negate their utility in immediate emergency response situations such as the attack on the World Trade Center (Thomas, 2001; Bruzewicz, 2003). Thus, both the frequency of data collection and the lag time between the collection of data and their availability influence what hazards and disasters researchers study and how the research questions are framed.
Modeling and Simulation
Models are abstractions of reality, and modeling is the process of creating these abstractions. Because reality is nearly infinitely complex and all empirical data are processed with reference to that complexity, model building involves the simplification of reality as data are transformed into knowledge. The models created are essentially forms of codified knowledge, used to represent the “reality” of things not known from things that are known (Waisel et al., 1998). Modeling is the sine qua non of science. Virtually all scientific activities require modeling in some sense, and any scientific theory requires this kind of representational system (Nersessian, 1992). The structure of a model can be symbolic (i.e., equations), analog (e.g., graphs to model physical networks), or iconic (physical representations such as scale models). Models are usually thought of as quantitative and representable mathematically. However, qualitative models are no less, and arguably more, common. For example, mental models play a very important role in our conceptualization of a situation (Crapo et al., 2000), and verbal and textual models are used in the process of communicating mental models.
Science can be seen as a model-building enterprise because it attempts to create abstractions of reality that help scientists understand how the world works. Technological advances in computing allow the development of complex computer-based models in a wide range of fields. These models can be used to describe and explain phenomena observed in physical systems from micro- to macrolevels, or to provide similar representations of real or hypothetical experiences of individuals and social systems. Models play an essential function in formalizing and integrating theoretical principles that pertain to whatever phenomena are being studied. For example, the computational models used for weather forecasting integrate scientific principles from a variety of natural science and engineering fields. In similar fashion, computational models used for social forecasting integrate theories from a variety of social science as well as interdisciplinary fields such as urban and regional planning, public policy and administration, and public health management.
Computational modeling provides an opportunity for social scientists conducting studies of hazards and disasters to integrate theories and empirical findings from the natural sciences, engineering, and social sciences into models that can be used for decision making. For example, one of the most widely used models in emergency management is that of loss estimation. Loss estimation modeling for disasters has grown in the last decade. Early loss estimation methods were grounded in deterministic models, based on scenarios. Scenario events were chosen and estimates of impacts were based on those events. During the 1970s, for example, NOAA scenarios (NOAA, 1972, 1973) estimated regional physical and social impacts for large earthquakes in the San Francisco and Los Angeles, California, areas and were intended to provide a rational foundation for planning earthquake disaster relief and recovery activities. By the 1990s, technological advances in personal computing technology, relational database management systems, and
the above GIS and remote sensing systems had rendered the development of automated loss estimation tools feasible.
As noted above, HAZUS (NIBS-FEMA, 1999) was developed by FEMA and the National Institute of Building Sciences (NIBS). It is a standardized, nationally applicable earthquake loss estimation methodology, implemented through PC-based GIS software. The HAZUS methodology estimates damage expressed in terms of the probability of a building being in any of four damage states: slight, moderate, extensive, or complete. A range of damage factors (repair cost divided by replacement cost) is associated with each damage state. While the front end of the loss estimation methodology is clearly driven by the earth sciences and engineering, the outputs of the model are much more social science driven. The outputs of interest to urban and regional planners and emergency management professionals are not ground motions, but rather the impacts of ground motion at community, regional, and societal levels. Researchers from the Pacific Earthquake Engineering Research Center have developed a performance-based earthquake engineering model that describes these outputs as “decision variables,” often referred to as “death, dollars, and downtime.”
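The arithmetic linking damage states to dollar losses can be sketched as a probability-weighted sum. The sketch below is illustrative only, not the actual HAZUS implementation: the damage-state probabilities and representative damage factors are invented values standing in for the ranges a real methodology would supply.

```python
# Illustrative (not actual HAZUS) expected-loss arithmetic: damage-state
# probabilities times representative damage factors (repair cost divided by
# replacement cost) times replacement value. All numbers are invented.

DAMAGE_FACTORS = {"slight": 0.02, "moderate": 0.10,
                  "extensive": 0.50, "complete": 1.00}

def expected_loss(replacement_cost, state_probabilities):
    """Probability-weighted repair cost for one building."""
    return replacement_cost * sum(
        p * DAMAGE_FACTORS[state] for state, p in state_probabilities.items())

# Probabilities of each damage state for one scenario event; the remaining
# probability mass corresponds to no damage (factor 0, so it drops out).
probs = {"slight": 0.30, "moderate": 0.15, "extensive": 0.05, "complete": 0.01}
loss = expected_loss(400_000, probs)
print(round(loss))  # 22400
```

Summed over a building inventory, this is the kind of "dollars" output that planners and emergency managers consume, while the probabilities themselves come from the earth science and engineering front end.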
Other computational models, though far less widely used, have the potential for significant use in social science hazards and disaster research. For example, what has come to be known as agent-based modeling is a set of computational methods that allows analysts to engage in thought experiments about real or hypothetical worlds populated by “agents” (i.e., individuals, groups, organizations, communities, societies) who interact with each other to create structural forms that range from relatively simple to enormously complex (Cederman, 2005). Such modeling, which has grown out of work on distributed artificial intelligence, can be used to simulate mental processes and behaviors in exploring how structural forms operate under various conditions (Cohen, 1986; Bond and Gasser, 1988; Gasser and Huhns, 1989). A major strength of agent-based modeling is its focus on decision making as search behavior. Model applications have been used to address issues of communication, coordination, planning, and problem solving, often with the intent of using models as the “brains” of real or artificial agents in interactions with each other. These models can facilitate descriptions and explanations of many social phenomena and test the adequacy and efficiency of various definitions or representation schemes (Carley and Wallace, 2001). The earlier example (Box 7.1) of planned and improvised post-disaster responses illustrates the kind of research topic in hazards and disaster research that can be advanced through use of agent-based modeling techniques.
In that example, conventional and improvised roles are nested within different types of organizations and social networks, which connect roles and organizations. The networks themselves represent more inclusive structural (i.e., relational) aspects of agent-based modeling and inform knowledge of when, where, how, and why role behaviors and organizational adaptations occur following a disaster (Mendonca and Wallace, 2004). It is important in this regard to develop representations both of network adaptation and of how “agent” knowledge, behaviors, and actions affect and are affected by their respective positions within the network. Network models have been used successfully to examine issues such as power and performance, information diffusion, innovation, and turnover. The adequacy of these models is determined using nonparametric statistical techniques (Carley and Wallace, 2001).
From the perspective of a researcher concerned with social phenomena in disaster contexts, two issues stand out (Carley and Wallace, 2001). First, how scalable are agent-based models and representation schemes? That is, can the results from analyses of social networks with two to a relatively small number of members (agents) be generalized to the larger, more complex response systems that are so characteristic of events having high magnitude and scope of impact? Second, are cognitively simple characterizations of individuals as “agents” adequate or valid representations when the actions of groups, organizations, communities, and societies are at issue? Answers to these questions are not possible at this point in knowledge development. However, agent-based modeling techniques are developing rapidly (Gilbert and Abbott, 2005), their development is unambiguously interdisciplinary (Cederman, 2005), and their twin focus on human decision making and structural adaptation (Eguiluz et al., 2005) is a core feature of what has been termed the hazards and disaster management system. Decision support tools are needed in this system, and agent-based modeling techniques can facilitate their development and dissemination (Mendonca and Wallace, 2004).
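The structure of such models can be conveyed with a deliberately minimal sketch: agents on a fixed network pass a warning message to their neighbors at each time step, a toy version of the information diffusion questions agent-based and network models are used to explore. The network, agent names, and single-rule behavior below are all invented for illustration; real applications use far richer agent cognition and adaptive network structure.

```python
# Minimal agent-based sketch (all parameters invented): agents on a fixed
# network relay a warning to their neighbors each time step.

def diffuse(neighbors, informed, steps):
    """Each step, every informed agent informs all of its neighbors."""
    informed = set(informed)
    for _ in range(steps):
        newly = {nbr for agent in informed for nbr in neighbors[agent]}
        informed |= newly
    return informed

network = {            # adjacency list: who talks to whom
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A"],
    "D": ["B", "E"],
    "E": ["D"],
}
print(sorted(diffuse(network, {"A"}, 1)))  # ['A', 'B', 'C']
print(sorted(diffuse(network, {"A"}, 3)))  # ['A', 'B', 'C', 'D', 'E']
```

Even this toy version exhibits the structural point made above: how far and how fast the warning spreads is a property of the network's shape, not of any single agent, which is why network position is central to these models.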
Perhaps the most familiar computational modeling tool to social scientists is simulation. Simulation models often represent an organization or various processes as a set of nonlinear equations and/or a set of interacting agents. In these models, the focus is on theorizing about a particular aspect of social action and structure. Accordingly, reality is often simplified by showing only the entities and relations essential to the underlying theory. Models embody theory about how an individual, household, small group, larger organization, community, or society will act. With a model structure in place, a series of simulations or virtual experiments can be run to test the effect of a change in a particular process, action, or policy. In so doing, models are used to illustrate a theory’s story about how some agent will act under specified conditions. Cumulative theory building evolves as multiple researchers assess, augment, reconstruct, and add variations to existing models (Carley and Wallace, 2001).
The dominant use of computing in the natural sciences, social sciences, and engineering continues to involve statistical models of existing data.
These statistical models range from relatively simple to highly complex configurations of variables, but over time the increasing capacity of computers to process enormous volumes of data has allowed the development of the kinds of computational techniques discussed above. Computational models are more powerful to the extent that simulated data are informed by real data. Ready access by those doing computational modeling to empirical data previously collected on the same topics, ideally archived for secondary analysis, is therefore essential.
Laboratory and Field Gaming Experiments
It is certainly possible for any research program or center to include both field studies and laboratory simulations of responses to hazards and disasters. When the Disaster Research Center (DRC) was established during the mid-1960s, for example, its research program included both field studies and laboratory gaming experiments. The field studies have continued for decades. However, after a very creative early application (see Drabek and Haas, 1969), the simulation work was largely suspended because of its cost and complexity. No formal program in social science hazards and disaster research involving laboratory or field gaming experiments has been sustained since the early 1970s. Certainly, emergency management professionals engage routinely in realistic simulations, either at their own local or regional emergency operations centers or perhaps at FEMA’s Emergency Management Institute in Emmitsburg, Maryland. However, these simulations are designed as training exercises, not as research opportunities for assessing their effectiveness or their grounding in disaster field studies. For the purposes of this chapter, the early combination at the DRC of field studies, data archiving, and simulations continues to serve as a template for future hazards and disaster research.
The use of experimentation has been both touted and criticized by researchers in the social sciences (Drabek and Haas, 1969; Hammond, 2000). Of particular concern is the need to ensure proper scientific conduct of experiments. Increasing realism in experimental situations potentially creates problems of generalizability. The generalizability of “realistic” laboratory or field experiments may be compromised if participants are not experienced in the domain, with the result that the hypotheses postulated may not correspond to the phenomena actually encountered in a real decision environment. Moreover, the events or activities that are controlled in experiments may not be controllable in the real world. Gaming simulations are quasi-experimental designs that can provide both statistical power and the ability to generalize results to a variety of crisis situations.
The advent of computational modeling, as discussed above, has provided another application for gaming simulations (i.e., the testing and
validation of computational models of social phenomena). In these simulations, “agents” in the computational model “play” the same roles as human participants. The actions taken by both human and artificial agents can then be compared. Also, agent-based models, when informed by empirical data, can be used to create a realistic setting for the gaming simulation and, in effect, “play against” the participants, again providing opportunities to investigate cognitive and behavioral phenomena of individuals in social entities of various types.
Both research and practical experience have shown that written plans and procedures serve the valuable purposes of training and familiarizing role incumbents, such as public officials, in crisis-relevant organizations (Salas and Cannon-Bowers, 2000). These plans and procedures serve as a normative model for education and training activities. Gaming simulations can provide a means for evaluating the plans and procedures in laboratory settings or in the field (e.g., emergency operations centers). An additional and equally important benefit of these simulations is that they can provide a laboratory or field venue for experimentation on multiple types of circumstances. Thus, experiments on responses to terrorist events can readily be compared with those related to natural and technological disasters.
A variety of data can be collected prior to a gaming simulation, subject only to the patience of the participants. Biographical data are certainly available, and they may be needed for designing the experiment (Grabowski and Wallace, 1993). For example, data could be collected on cognitive style prior to the exercise and the results used to design the experiment. However, it is important not to deluge participants with an extensive battery of questionnaires, which may create apprehension, alter behavior, or magnify the lack of realism of the simulation. Unobtrusive measures for data collection can also be devised in laboratory or field experiments to record the activities engaged in by the participants. All communications can be recorded, and a digital record kept of phone messages, including sender, receiver, length of message, and content. These data can be collected for each sample run and categorized in a variety of ways. It must be recognized that participants may communicate with outsiders or with insiders who are not part of the experiment but are with the training group.
Unobtrusive measurements can be built into the exercise, such as recording the elapsed time between when an event was initiated and when the appropriate responses were made. To measure the degree of correctness, every initiated event can have a set of appropriate decisions. In addition to maintaining a record of the activities of participants in the game, simulations often lend themselves to observation. Participants in the exercise can be observed in a very structured manner
with pre-designed instruments to be completed by trained observers. Videotaping can also be used, but the recordings usually need to be transcribed for analysis, yielding a great deal of qualitative data that require extensive effort to analyze. Various techniques, such as protocol analysis, have been found useful for research purposes, but because they are so time consuming, the benefits of their use must outweigh the costs. Behavioral coding of group interactions can be done either in real time or from the videotapes. Here, training of the coders is crucial (Fleiss, 1971).
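The timing-and-correctness measure described above amounts to joining an event log against a response log. The sketch below illustrates that scoring; the event IDs, decision labels, and timestamps are invented, and a real exercise would log far richer message content than this.

```python
# Sketch of the unobtrusive timing measure: each injected exercise event
# carries an initiation time and an expected decision; response lag and
# correctness are computed from the logged response. All values are invented.

def score_responses(events, responses):
    """Return per-event (lag_seconds, correct) from event and response logs."""
    scores = {}
    for eid, (t_injected, expected) in events.items():
        t_resp, decision = responses.get(eid, (None, None))
        if t_resp is None:
            scores[eid] = (None, False)      # no response was recorded
        else:
            scores[eid] = (t_resp - t_injected, decision == expected)
    return scores

# Seconds since exercise start, with the decision expected for each event
events = {"E1": (0, "evacuate"), "E2": (60, "shelter")}
responses = {"E1": (45, "evacuate"), "E2": (150, "evacuate")}
print(score_responses(events, responses))
# {'E1': (45, True), 'E2': (90, False)}
```

Because the scoring is mechanical, it can run over every sample run of the exercise without any of the observer effects that questionnaires or visible coders introduce.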
After a gaming experiment is run, participants can complete self-report questionnaires. These can be administered as part of, or immediately after, the activity and digitally recorded (Litynski et al., 1997). Participants can also be asked to describe and rate each other’s behavior on a variety of dimensions and to record their interactions with each other during the course of the exercise. Both preceding and following the exercise, interviews can be conducted with each of the participants.
The foregoing activities will create a wealth of data. Data generated by gaming experiments can usually be analyzed with standard statistical techniques (Cohen, 1977). The degree of realism of a game is extremely important, not only for evaluating the decision aid per se, but also for maintaining interest in and enhancing the educational benefits of the simulation. Such realism can be readily ascertained by having experienced field researchers and emergency management professionals walk through the simulation prior to the actual exercises.
Perhaps the most complex issue with gaming experiments is this: Do participants treat the simulation as realistic? This was certainly the case in the seminal work by Drabek and Haas (1969). Box 7.5 provides a case in which the realism of a gaming simulation could be compared to an actual event that followed shortly after the experiment was run. In this case, some game playing was evident: recovery activity in the simulation tapered off in comparison with the actual event and, in fact, ended abruptly at 4:00 p.m. (Belardo et al., 1983). This suspension was obviously not the case with the actual event. However, gaming simulations can be designed in the laboratory or field as learning experiences, and participants usually understand that training is an important precursor to preparing for incidents with the potential to escalate into disasters.
In conclusion, gaming simulations with hazards and disaster management professionals as participants have an important role in social science research on disasters. The core idea is to build gaming simulations with an eye toward realism. Such realism can be captured through standardized data from previous field studies that are maintained in effectively managed data archives, accessible to multiple researchers, and used to every extent possible in the development of computational models such as those summarized above. The hazards and disaster research community has developed knowledge to the point at which it is feasible to integrate these core informatics activities.

BOX 7.5
Realism in Gaming Simulation

A serendipitous evaluation of a gaming simulation yielded the observation that realism of the crisis environment was replicated in the simulation environment in terms of both organizational- and individual-level responses. The evaluation entailed data collection during a training exercise held by the U.S. Nuclear Regulatory Commission and the Federal Emergency Management Agency (FEMA) at the Robert A.F. Genet Nuclear Facility in New York. Four days after the simulation, an actual incident occurred that involved the activation of all emergency response activities throughout the State of New York. This provided an opportunity to evaluate the benefit of simulations. The realism of the crisis environment was well replicated, both organizationally and in its impact on individuals. Stress levels were found to be similar between the simulation and the actual event. Communications were similar during the beginning of the crisis, but there were some differences during the latter stages of the exercise, particularly with respect to decisions concerning recovery operations. This may have been due to participants in the gaming simulation being aware of the need to end the exercise before the end of the working day.

SOURCE: Belardo et al. (1983).
The research findings and recommendations from Chapters 3 to 6 of this report summarize what has been done in the past under NEHRP support and what the committee believes should be done in the future. The discussions of technologies, methods, and informatics issues in this chapter relate the substance of past and future hazards and disaster research to its actual implementation. Thus, regardless of the topics discussed in previous chapters, social science studies in the next several decades must be responsive to the changing environment of hazards and disaster research. By whatever technological and methodological means are available, they must capture data that are more highly structured and standardized across natural, technological, and willful hazards and disasters. They must analyze, store, and manage data with dissemination and formal rules of data sharing in mind.
Recommendation 7.1: The National Science Foundation and Department of Homeland Security should jointly support the establishment of a nongovernmental Panel on Hazards and Disaster Informatics. The panel should be interdisciplinary and include social scientists and engineers from hazards and disaster research as well as experts on informatics issues from cognitive science, computational science, and applied science. The panel’s mission should be (1) to assess issues of data standardization, data management and archiving, and data sharing as they relate to natural, technological, and willful hazards and disasters, and (2) to develop a formal plan for resolving these issues to every extent possible within the next decade.
As summarized in this chapter, there are continuing issues in the following areas: (1) standardizing data on hazardous conditions, disaster losses, and pre-, trans-, and post-impact responses at multiple levels of analysis; (2) improving metrics in all of these same research areas; (3) developing formal data standards for storing, aggregating, disaggregating, and distributing data sets among researchers; and (4) using computing and communications technologies to enhance quantitative and qualitative data collection and data management. Addressing these issues systematically can, and the committee believes should, lead ultimately to the establishment of both centralized (virtual) and distributed data repositories on hazards and disasters.
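As one hedged illustration of what a standardized, machine-readable record of the kind discussed above might look like, the sketch below defines a minimal disaster-event record with a small controlled vocabulary and validation. The field names and allowed values are hypothetical, not an established standard or a schema proposed by the committee.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical controlled vocabularies; illustrative only, not a formal standard.
HAZARD_TYPES = {"natural", "technological", "willful"}
IMPACT_PHASES = {"pre", "trans", "post"}

@dataclass
class DisasterEventRecord:
    event_id: str
    hazard_type: str        # one of HAZARD_TYPES
    impact_phase: str       # one of IMPACT_PHASES
    level_of_analysis: str  # e.g., "household", "organization", "community"
    estimated_losses_usd: float

    def validate(self):
        """Reject records that fall outside the controlled vocabularies."""
        if self.hazard_type not in HAZARD_TYPES:
            raise ValueError(f"unknown hazard type: {self.hazard_type}")
        if self.impact_phase not in IMPACT_PHASES:
            raise ValueError(f"unknown impact phase: {self.impact_phase}")
        return self

# Serialize a validated record to JSON so any archive or tool can read it.
rec = DisasterEventRecord("EV-001", "natural", "post", "community", 2.5e6).validate()
print(json.dumps(asdict(rec)))
```

The point of the sketch is the workflow, not the particular fields: shared vocabularies plus validation at the point of entry are what make data sets aggregatable and distributable among researchers.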
The range and depth of research inquiries and approaches in hazards and disaster research will perforce result in major increases of data. Thus, the status quo ante of continuing inattention to data management issues is no longer acceptable. Resolving what the committee has termed globally the “hazards and disasters informatics problem” will require careful consideration and planning. This research community is not in a position to simply adopt informatics solutions from other fields of inquiry because such solutions are only now in the process of being developed. Like other research domains, hazards and disaster research has its own unique theories, models, and findings. Yet informatics issues and their resolution are not field specific; they are generic to basic and applied science. The committee believes that the first step in becoming a more active participant in the “science of informatics” is to create the interdisciplinary panel of experts specified in Recommendation 7.1.
The research domain of this community includes natural, technological, and willful hazards and disasters. Thus, the committee believes that it is quite appropriate for the National Science Foundation and the Department of Homeland Security to provide joint support for the work of the recommended interdisciplinary panel. The conceptual framework developed in Chapter 1 (see Figure 1.2 and its related discussion)—placed within the changing societal context described in Chapter 2, the research findings and
recommendations summarized in Chapters 3 to 6, and discussions of research methods, techniques, and informatics issues in this chapter—provides the foundation for the recommended panel. The work of this panel should commence as soon as possible.
Recommendation 7.2: The National Science Foundation and Department of Homeland Security should fund a collaborative Center for Modeling, Simulation, and Visualization of Hazards and Disasters. The recommended center would be the locus of advanced computing and communications technologies that are used to support a distributed set of research methods and facilities. The center’s capabilities would be accessible on a shared-use basis.
There is an immediate need in social science hazards and disaster research to expand the use of state-of-the-art modeling and simulation techniques for studies of willful as well as natural and technological hazards and disasters. The joint support of the National Science Foundation and the Department of Homeland Security is therefore encouraged for purposes of implementing Recommendation 7.2. Three areas of research would be supported by the center, each developed and maintained at distributed research sites:
Modeling and simulation: The center would act as a repository for models constructed by social science researchers at distributed sites and would work to ensure collaboration (including experimentation using the Internet), maintenance, and refinement of models. Ensuring compatibility, which permits “docking” of computational models, would be a major responsibility of the center’s researchers and support staff.
Visualization: Social science researchers are making ever-increasing use of digitized spatial and graphical information, such as global positioning system (GPS)-GIS displays. In addition, human-computer interface technologies are being investigated for possible use as decision tools for hazards management and emergency response. Research on the cognitive processes underlying visualization under conditions of stress and information overload typical of emergency response situations is just one potential topic for this visualization component of the recommended center.
Gaming experimentation: The recommended center would have its own laboratory settings, as well as distributed ones, with data collection technologies for research on individual, small group/team, and “organizational” decision making using exercises, “games,” and other interactive experimental media. Researchers could gather data and control treatments from distributed locations networked to the center.
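The “docking” of computational models noted in the modeling and simulation component above refers to aligning independently built models so that their results can be compared on common inputs. The toy sketch below illustrates only the basic mechanics: two deliberately simple implementations of the same hypothetical evacuation-compliance process are run on identical inputs and checked for agreement within a tolerance. The functions, parameters, and process are illustrative assumptions, not models from the literature.

```python
import math

def model_a(rate, hours):
    """Discrete-step (Euler) implementation of a simple compliance process."""
    remaining, dt = 1.0, 0.001
    for _ in range(round(hours / dt)):
        remaining -= rate * remaining * dt
    return 1.0 - remaining  # fraction of population evacuated

def model_b(rate, hours):
    """Independently coded closed-form implementation of the same process."""
    return 1.0 - math.exp(-rate * hours)

def docked(a, b, tolerance=0.01):
    """Crude docking check: do the two models' outputs agree within tolerance?"""
    return abs(a - b) <= tolerance

fa, fb = model_a(0.5, 6.0), model_b(0.5, 6.0)
print(docked(fa, fb))
```

Real docking exercises compare richer model outputs (often distributions, via statistical tests) rather than single numbers, but the principle of identical inputs and a pre-agreed equivalence criterion is the same.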
As documented in this chapter, computational modeling, visualization, and gaming experiments are important tools for building on and applying knowledge gained from field studies. Heretofore the use of these tools has not been integrated, reducing their potential value. Such integrated use is best accomplished within a center established for that purpose. As noted above, for example, the core idea of gaming simulations is to build them with an eye toward realism. Such realism is enhanced through standardized data from previous field studies. As the resulting data from field studies become more effectively maintained in distributed data archives, they can be used systematically by the proposed center in the development of computational models and simulations and in the design of gaming experiments specifically for hazards and disaster management professionals. The hazards and disaster research community has developed to the point at which the sustained integration of field research, modeling, and experimentation can be accomplished.
Recommendation 7.3: The hazards and disaster research community should educate university Institutional Review Boards (IRBs) about the unique benefits of, in particular, post-disaster investigations and the unique constraints under which this research community performs research on human subjects.
The committee has noted above the difficulties involved in harmonizing the actual practice of research with the demands placed on researchers during field studies by the fluid situations that inevitably follow disasters. In particular, the fine points of consent forms, detailed interview protocols, and other research infrastructure are often unachievable in the hours to weeks after a disaster. Furthermore, such requirements may violate cultural norms in the places studied. At the same time, IRB members may have real but sometimes misplaced concerns about the risks of psychological harm that they believe attach to research on hazards and disasters.
To the extent that they are not already, hazards and disaster researchers must become familiar with federal (in particular, 45 CFR 46.101 et seq.) and local university regulations regarding human subjects research so that they can be knowledgeable resources for their respective IRBs and effective advocates for appropriate deviations from “standard” practices, while maintaining the personal privacy and dignity of research subjects. Members of the research community should seek membership on IRB human subjects review panels or should assist in other policy-making roles.