Suggested Citation:"Report." Institute of Medicine. 2001. Tools for Evaluating the Metropolitan Medical Response System Program: Phase I Report. Washington, DC: The National Academies Press. doi: 10.17226/10221.

INTRODUCTION

The U.S. Department of Health and Human Services’ Metropolitan Medical Response System (MMRS) program has evolved from an idea originally developed in the Washington, D.C., area in 1995. Using the combined personnel and equipment resources of Washington, D.C.; Arlington County, Virginia; and Montgomery and Prince George’s Counties, Maryland, the Metropolitan Medical Strike Team (MMST) received training, equipment, and supplies specifically designed to facilitate an effective response to a mass-casualty terrorism incident involving a weapon of mass destruction (WMD). The first of its kind in the civilian environment, the MMST was intended to be capable of providing initial, on-site emergency health, medical, and mental health services after a terrorist incident involving chemical, biological, or radiological (CBR) materials. The team’s mission includes CBR agent detection and identification, patient decontamination, triage and medical treatment, emergency transportation of patients to local hospitals, coordination of movement of patients to more distant hospitals via the National Disaster Medical System (NDMS), and planning for the disposition of nonsurvivors. Building on the initial efforts of the Washington, D.C., Metropolitan Area MMST, the Office of Emergency Preparedness (OEP) provided funding for the development of a similar team in the city of Atlanta in preparation for the 1996 Summer Olympic Games. The U.S. Congress has subsequently authorized and provided funding for additional contracts with the 120 most populous U.S. cities.

Although the first two MMSTs were essentially enhanced hazardous materials (hazmat) teams, with plans, training, and equipment centered on dealing with chemical agents, some of the other early MMRS cities changed the MMST concept by integrating existing on-duty fire, emergency medical services, and police personnel into an “MMST response.” In addition, their plans incorporated local public health officials, nongovernmental organizations, state agencies (including the National Guard), federal military and nonmilitary officials, and private healthcare organizations. OEP soon amended the initial contracts to focus more attention on coping with a covert release of a biological agent and changed the name of the program to the Metropolitan Medical Response System. The new name emphasizes that the program is intended to enhance the capabilities of existing systems involving not just hazmat personnel, law enforcement, emergency medical services, public hospitals, and the American Red Cross but also public health agencies and laboratories, private hospitals, clinics, independent physicians, and other private-sector organizations. This emphasis on enhancing existing systems rather than building new, and perhaps competing, CBR-specific systems was strongly recommended by a previous Institute of Medicine (IOM) committee as a first principle in efforts to prepare for CBR terrorism.1

It was in this spirit of system improvement and enhancement that OEP approached IOM about OEP’s ongoing need to systematically assess and evaluate the preparedness of the MMRS cities (“MMRS city” is used throughout this report to mean the metropolitan area encompassed by a given MMRS program contract, which might involve several cities and counties) and understand the effectiveness of the overall program approach. Continuing improvement, as in any program, is

1. Institute of Medicine. 1999. Chemical and Biological Terrorism: Research and Development to Improve Civilian Medical Response. Washington, DC: National Academy Press.


critically dependent on regular evaluation of successes and shortcomings, a task rendered more difficult in this case by the low rate of actual CBR terrorism incidents.

CHARGE TO THE COMMITTEE

IOM shall identify and develop performance measures and systems to assess the effectiveness of, and to identify barriers related to, the MMRS development process. Additionally, IOM shall establish appropriate evaluation methods, tools, and processes, based upon the performance measures, to assess the MMRS development process. The products of this work will assist OEP in determining appropriate mechanisms to assess the effectiveness of, and identify barriers related to, the MMRS development process.

In Phase I, an expert committee shall identify, recommend, and develop performance measures and systems to assess the effectiveness of, and identify barriers related to, the MMRS development process at the site, jurisdictional, and governmental levels. [OEP posed 11 more specific questions relevant to this task. The questions, and the committee’s answers, are provided below in a separate section.]

In Phase II, the committee shall use the performance measures developed from Phase I to recommend and then develop appropriate evaluation methods, tools, and processes to assess the MMRS development process.

The evaluation system(s) developed should be geared toward the timely assessment of each deliverable or phase of the development process with emphasis placed on identifying barriers, identifying solutions, and sharing successes of both the technical and administrative components of the MMRS program.

METHODS

In the fall of 2000, IOM assembled a committee whose members provided expertise from the fields of emergency medicine, emergency and disaster management, medical toxicology, urban planning, epidemiology, public safety, public health, hospital administration, infectious diseases, mental health services, and program evaluation. This was accomplished in accordance with the established procedures of the National Academies, including an examination of possible biases and conflicts of interest and provision of an opportunity for public comment.

A wide variety of sources were used to assemble the data and the information necessary to respond to the charge. A comprehensive list of individuals who assisted the committee in this effort will be provided in the final report. An initial organizational and data-gathering meeting of the committee in December 2000 provided an overview of the MMRS program from the viewpoints of both OEP and several of the initial MMRS cities. Other speakers provided an overview of program evaluation principles and practices and some insights into two Federal Emergency Management Agency (FEMA) programs focused on assessing state and local readiness for a variety of potential disasters.

At a subsequent meeting, in February 2001, the committee heard about the legislative and executive origins of the MMRS program and other federal counterterrorism programs.


Representatives from the U.S. Department of Justice (DOJ) and the Centers for Disease Control and Prevention (CDC) described their current programs aimed at enhancing state and local capabilities, and a Public Health Service (PHS) project officer described the different approaches used and the levels of success achieved by the 16 MMRS cities in his geographic area. That meeting also featured briefings on the assessment techniques and procedures used by medical organizations evaluating residency training programs, poison control centers, and individual physician specialists and by FEMA’s National Urban Search and Rescue Team program. Follow-up with the speakers provided more detailed information and points of contact for additional questions.

The sponsors’ project officers shared copies of completed plans from six MMRS cities from their files and put committee members in touch with offices that had relevant data. The committee members themselves contributed both personal contacts and specific information from their own files and experiences. The World Wide Web provided much information about additional organizations and counterterrorism activities, and IOM staff assembled a library of over 350 documents, published and unpublished, bearing on federal, state, and local preparations for managing the consequences of CBR terrorism incidents. These documents and other written materials presented to the committee are maintained by the Public Access Office of the National Research Council Library. Appointments to view these materials may be made by telephoning the library at (202) 334–3543 or by sending electronic mail to nrclib@nas.edu.

The present report was the result of extensive discussion among the committee members at a two-day meeting in May during which draft answers were formulated and initial preparedness indicators compiled. Subsequent versions of each were reviewed and modified via email, and committee members “signed off” on the review draft in late July.

INITIAL OBSERVATIONS

The MMRS program context presents some special challenges for evaluation. First, there is much to be learned from analysis of the local, state and federal responses to the terrorist attacks on the World Trade Center and the Pentagon in September 2001, but the committee believes that CBR terrorism incidents of the scale envisioned by OEP are unlikely to occur on a regular basis. As a result, any evaluation of a response system will have to be indirect, in that it will have to measure the intermediate consequences of the MMRS program rather than the ultimate goal, which is to save lives and minimize morbidity from a terrorism incident.

Second, every city’s MMRS encompasses a web of planning activities, resources, intergovernmental agreements, and exercises at multiple levels of government. This web of activities is illustrated in Figure 1. The many activities in the box beneath “Emergency Capacity” represent only some of the capabilities required for an effective response to CBR terrorism events. Producing those capabilities is the concern of a wide variety of governmental and private-sector institutions through an equally wide variety of mechanisms, including the MMRS program. The MMRS program itself represents an effort to coordinate multiple entities and activities that are independently funded and that receive the authority for their activities from other sources. This complexity means that isolation and quantification of OEP’s role in creating readiness for a CBR terrorism incident will be nearly impossible, regardless of how well one might measure readiness in any given city. It also suggests that caution is


necessary in making changes in any part of the web of activities, for they may have unintended consequences far from the locus of change.

Third, although many of the pieces of a response plan may be thoroughly evaluated, evaluation of response capacity as a whole will, by necessity, be inferential; that is, assumptions must be made about how the component parts should work together.

Fourth, the wide variations in the resources and vulnerabilities of the MMRS municipalities may preclude use of a single yardstick that places all the MMRS cities along one scale of readiness. For example, Washington, D.C., must anticipate attacks on numerous federal facilities and embassies, whereas Baton Rouge, Louisiana, has a variety of chemical plants that are vulnerable to attack. Some cities operate their own emergency medical services; others depend on county or state assets. OEP has dealt with this variation by not attempting to impose a single model or acceptable plan on all its MMRS cities, instead encouraging cities to build their own plans around available structures, resources, and vulnerabilities. This flexible approach, however, substantially reduces the ability to impose universal performance measures and standards and correspondingly complicates the design of fair and comparable evaluation tools.

Finally, the committee has been persuaded both by the preceding observations and by the written and oral explications of OEP that it should approach its tasks with a strong bias toward a formative rather than a summative evaluation. That is, the committee takes as a given that the primary goal of the proposed evaluation is constructive feedback both to OEP staff and to the MMRS cities.


FIGURE 1 MMRS Program Participants, Policy Instruments, Development Activities, Emergency Capacity, and Follow-up Activities


MMRS PROGRAM CONTRACTS

Unlike many federal programs of assistance to state and local governments that provide funds by means of grants or cooperative agreements, OEP chose to use contracts as the mechanism for providing funds to participating MMRS cities. A distinguishing characteristic of contracts is the level of detail provided in the “statement of work.” Unlike grants, which often support desired processes and activities without specifying the expected product in any detail, contracts focus more closely on the products (“deliverables,” in government jargon) and less closely on how the contractor is to produce them. The monetary value of the contracts has varied slightly with the size of the metropolitan area involved but has averaged about $500,000, delivered in installments as the deliverables are produced. For example, contracts awarded to the fiscal year (FY) 2000 MMRS cities are 18 months in duration and call for phased delivery of 12 products:

  1. Meeting with project officer (within 2 weeks of contract award);

  2. MMRS Development Plan (within 3 months of contract award) [the plan for developing a plan];

  3. Primary MMRS Plan (within 6 months of contract award);

  4. Component MMRS plan for forward movement of patients utilizing the NDMS (within 8 months of contract award);

  5. Component MMRS plan for responding to a chemical, radiological, or explosive WMD event (not a biological WMD event) (within 9 months of contract award);

  6. Component plan for MMST if it is a component of the municipality’s MMRS (within 12 months of contract award);

  7. Component plan for managing the health consequences of a biological WMD (within 18 months of contract award);

  8. Component plan for local hospital system (within 18 months of contract award);

  9. MMRS training plan including training requirements and a follow-on training plan (within 18 months of contract award);

  10. MMRS pharmaceutical and equipment plan that includes a maintenance plan and a timetable for procurement of equipment and pharmaceuticals that have been approved by the project officer (within 18 months of contract award);

  11. Monthly progress reports; and

  12. Final report.

The products of these contracts are thus a series of plans for organizing and responding to large-scale acts of CBR terrorism. Current MMRS contracts explicitly demand coordination with county governments and neighboring jurisdictions; the Los Angeles MMRS, for example, involves 88 jurisdictions.

As noted above, the program has changed since its initiation in FY 1997, and that is reflected in the number and nature of the deliverables demanded by contracts let in subsequent years. Contracts awarded in fiscal years 1999, 2000, and 2001 are very similar to one another, although they differ in a number of respects from the FY 1997 contracts. The 1997 cities’ “bioterrorism supplement” was incorporated into the body of the contract in subsequent years, and post-1997


cities were given the option to build the capabilities of an MMST into their existing response organizations rather than create a stand-alone team. Smaller changes clarified OEP intent in a number of places and provided cities with additional information about acceptable actions in others. No substantive requirements were added or deleted, and so, in the interests of brevity, only the provisions of the fiscal year 2000 contract are presented here.

OEP provides considerable guidance on the required elements of an acceptable plan, and those required elements form the basis for the organization of the committee’s collection of preparedness indicators in a subsequent section of this report (see Appendix). The committee used the FY 2000 contract for that purpose, but the major substantive elements have been present in some form in every contract, and no city should be disadvantaged by the minor differences in wording or order of presentation that exist.

ANSWERS TO SPECIFIC QUESTIONS ASKED BY OEP

This IOM project is divided into two phases. In Phase I the committee was asked to identify performance measures and systems to assess the effectiveness of and identify barriers related to the MMRS development process at the site, jurisdictional, and governmental levels. OEP further asked the committee to “include the following considerations”:

QUESTION A: How can OEP determine, at the program level, whether the strategies, resources, mechanisms, technical assistance, and monitoring processes provided to the MMRS development process are effective?

ANSWER A: This question concerns the performance of the OEP staff, as opposed to the performances of the MMRS cities, which are the focus of most of the succeeding questions. That said, the simple answer to this question is straightforward: ask the contractors, that is, the MMRS cities, about the extent to which they used OEP technical assistance and resources (e.g., the Public Health Service officers serving as Regional Emergency Coordinators) in fulfilling the terms of the contract, their perception of the value of OEP’s technical assistance and resources, and to what extent community preparedness was improved by fulfilling the terms of the contract. That answer, however, assumes that fulfilling the terms of the contract is synonymous with the ultimate goal of the program: an enhanced local capability for coping with the consequences of a CBR terrorism incident. That is not the case. Although the contracts call for establishing a stockpile of appropriate pharmaceuticals, equipment, and supplies in the community, the primary demand on the contractor is the production of a series of plans. Although written plans, like a stockpile of equipment and supplies, are a necessary part of preparedness, they are not sufficient. OEP has recognized this in asking IOM not just for some tools to evaluate how OEP helps the MMRS cities fulfill the terms of the contracts but also for advice on the terms of the contracts themselves and for tools to evaluate preparedness at the local level. This Phase I report deals primarily, although not exclusively, with the last of these three tasks, but the final report will address all three in detail. It cannot be overemphasized, however, that whatever the state of local preparedness, many programs and initiatives—those of the federal government, state and local governments, and the private sector—as well as preexisting conditions in each jurisdiction contribute to preparedness. 
It is therefore impossible to disentangle the causal effects of the MMRS program from the effects of these other influences.


QUESTION B: How can OEP identify whether the performance objectives identified in the MMRS contract lead communities to preparedness?

ANSWER B: This question seeks the committee’s opinions on the adequacy of the contract deliverables. Are they the right ones? Should there be more? That is, are the actions demanded of the MMRS communities by their contracts with OEP necessary and sufficient for preparedness? Although this is a question for which input from the contractors themselves would again be helpful, it is probably impossible even for them to unequivocally assert a causal link between the MMRS program and preparedness because of myriad confounding variables such as U.S. Department of Defense (DOD) and DOJ programs and professional society activities. The question also assumes an independent measure of preparedness, which is the major task of the committee. The committee nevertheless believes that several modifications and additions to the contract objectives (deliverables) will likely enhance a community’s response to a CBR terrorism event. These modifications and additions are explained below in the response to Question C.

QUESTION C: What modifications, additions, or subtractions should be made to these performance objectives to assist communities throughout the development process?

ANSWER C: The evaluation of local preparedness that presumably will follow this committee’s report should result in contract modifications appropriate to its findings. In the interim, the committee offers the following observations on the scope and appropriateness of the current objectives (i.e., the deliverables). OEP staff, PHS project officers, and contractors have identified two objectives as being especially important: Deliverable 2 (the MMRS development plan) and Deliverable 8 (Component Plan for Local Hospital and Healthcare System).

The required elements of Deliverable 2 include specification of the proposed leadership and membership of a development team and the roster of a steering committee that will assist in the planning and development of the MMRS. The contract suggests a number of organizations and agencies that should be considered, but variations among communities probably ensure that no list of suggested members would be appropriate for all communities. More importantly, the committee has repeatedly heard that the real value of assembling a steering committee lies in the personal relationships established in the course of preparing the plan. Yet, nowhere in the guidance to the contractor on this deliverable is that stated explicitly.

Also missing from this deliverable is a preliminary assessment of the planning environment, that is, the community’s strengths, weaknesses, opportunities, and threats and any barriers and resources that might be unique to the community. A plan to enhance local capabilities should begin by identifying those capabilities in most need of enhancement. This should be a multidisciplinary effort incorporating multiple voices in the community (e.g., members of the police force, firefighters, emergency medical technicians and paramedics, public health officials, and hospital personnel, among others), with participation attested to by the signatures of all parties. The committee recognizes that this proposed addition to the list of deliverables comes too late for the 122 cities already under contract but believes that it would be the most logical start to any OEP initiative to provide follow-on support to sustain their readiness.

Deliverable 8 does not distinguish between public and private health care facilities, although it is clear by now that MMRS program contractors have had great difficulty involving private hospitals and clinics. The contract’s guidance on this deliverable should include or refer the contractor to some strategies, mechanisms, or incentives that have proved successful in other cities. In addition, the committee has identified two important


elements of coping with a mass-casualty event that are not addressed in the objective: staff callback procedures and replenishment of medical and ancillary (food, laundry, housekeeping, etc.) supplies and services.

The committee also identified several other essential activities or MMRS functions that are not addressed at all in the current contracts:

  • Receipt and distribution of materials from the National Pharmaceutical Stockpile;

  • Refugee holding (providing shelters for healthy people fleeing an area of real or perceived contamination);

  • Volunteer utilization and management;

  • Traffic control at the scene, at health care facilities, and in the community as a whole;

  • Evidence development, collection, and protection;

  • Evacuation and disease-containment decisions and procedures;

  • Post-event follow-up of the health of responders and caregivers; and

  • Plans for post-event amelioration of anxiety and feelings of vulnerability among the community at large.

It might be argued that several of these functions are not medical in nature and therefore do not fall within the scope of this DHHS project. However, all of these functions are essential to the ability of medical personnel to perform their jobs, even if, as seems likely, public safety personnel carry out the required actions. A realistic plan should therefore address these areas.

QUESTION D: How can existing standards be used to validate these performance objectives? If standards do not exist, how can new standards be created or how can the performance objectives be validated?

ANSWER D: Many of the personnel, professions, organizations, and jobs referred to in the plans of MMRS cities are governed by existing standards; some of these are legally mandated (Occupational Safety and Health Administration [OSHA] regulations) and others are voluntary. The following is a partial list of potentially relevant standards that the committee examined:

Joint Commission on Accreditation of Healthcare Organizations (JCAHO)

Standard EC.1.4—Emergency preparedness management plan

Standard EC.2.9.1—Emergency preparedness drills

Standard EC.1.4 (1997)—Security management plan

Standard EC.1.5 (1997)—Hazardous materials and waste management plan

Commission on Accreditation of Ambulance Services Standards

Organization (includes disaster plan, yearly disaster simulations)

Management

Community relations and public affairs

Mutual aid agreements

Human resources

Clinical services

Safety

Equipment and facilities

Communications


National Public Health Performance Standards (CDC)

National Fire Protection Association Standards

NFPA 471—Recommended Practice for Responding to Hazardous Materials Incidents

NFPA 472—Standard for Professional Competence of Responders to Hazardous Materials Incidents

NFPA 473—Standard for Competencies for EMS Personnel Responding to Hazardous Materials Incidents

NFPA 1600—Standard on Disaster/Emergency Management and Business Continuity Programs

OSHA Standard 29CFR1910.120—Hazardous waste operations and emergency response

Nuclear Regulatory Commission/FEMA Criteria for Preparation and Evaluation of Radiological Emergency Response Plans and Preparedness in Support of Nuclear Power Plants (NUREG-0654/FEMA-REP-1)

Department of Transportation National Highway Traffic Safety Administration Emergency Medical Services National Standard Curricula

American College of Emergency Physicians Task Force Recommendations on Objectives, Content, and Competencies for Training of Emergency Medical Technicians, Emergency Physicians, and Emergency Nurses on Caring for Casualties of NBC Incidents

With only a few exceptions, the committee deemed these standards to be of limited utility in assessing the preparedness of local communities for coping with a CBR terrorism incident. Most are qualitative in nature and are “enforced” only by well-publicized and infrequent inspections. None explicitly addresses CBR terrorism or an emergency of the scale described in the MMRS program contract, and attempts to apply these standards to such scenarios in the past have often proved counterproductive (e.g., misinterpretation of OSHA hazardous waste operations standards has led to expectations that hospital emergency department [ED] personnel should have Level A chemical protective suits). Furthermore, each standard applies to only one element, discipline, or agency involved in an MMRS. It is difficult to envision a successful MMRS in which any of the constituent elements fails to meet its own narrow standards, but it is also true that a collection of individually competent elements does not guarantee a successful system. Each of the standards listed above was nevertheless examined for elements that could be incorporated into an MMRS program-specific evaluation, and a number of those have been incorporated into the matrix of preparedness indicators provided later in this report.

QUESTION E: What strategies have communities used to enhance their existing capabilities? What are the most effective means to measure these additional capabilities?

ANSWER E: It is probably fair to say that before 1995 few of the MMRS cities had given much thought to preparedness for CBR terrorism events at all. Certainly, all the nation’s largest


cities had preexisting hazmat teams; those with nuclear power plants in close proximity had plans and equipment for coping with radiation releases; and all-hazards emergency response plans were influenced heavily by the frequency with which they had experienced earthquakes, floods, tornadoes, and so forth. To the extent that any of the cities had begun to address coping with a CBR terrorism incident, they were reacting to the 1995 nerve agent attack on the Tokyo subway and therefore emphasized chemical agents. Although biological agents were not ignored, DOD and DOJ training and equipment programs begun in 1997 reinforced that emphasis on chemical agents, and as a result their equipment money and training were most often directed at firefighters and emergency medical technicians rather than hospital personnel.

Most cities used their MMRS program contracts to expand their capabilities by incorporating CBR-specific training and equipment for city personnel, primarily those involved with public safety, into existing all-hazard plans. In some cities, for example, in Honolulu, the MMRS program appears to have been extremely successful in promoting extensive mutual-aid agreements with surrounding communities and nearby military facilities.

The committee expects that cities with such extensive aid networks will enhance their preparedness not just for CBR terrorism incidents but for all hazards. In retrospect it appears that such “relationship building” across disciplines and communities may be a critical element in meeting the demands of the MMRS program contract and, along with dual-use equipment and procedures, may be the key to sustaining preparedness. Although a lack of pre-MMRS measures or control cities precludes a causal analysis, finding ways to measure these additional capabilities is a central element of the committee’s task and will be fully addressed in the final report. The committee cannot provide a concise answer to this question at this stage of the project.

QUESTION F: Can the relationships between traditional first responders (public safety officials) and their supporting hospitals and public health offices be assessed? If so, how?

ANSWER F: A number of complementary strategies can be used to assess these relationships. The MMRS program contract already demands the minutes of all MMRS-related meetings, presumably including lists of attendees, in the requisite monthly progress reports. More intrusive measures might include independent oral or written queries of key personnel in each of these sectors regarding the joint actions specified in the community’s response plan. An Agency for Healthcare Research and Quality (AHRQ) grant is supporting SAIC, Inc., and its subcontractor, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO), in developing an assessment tool for measurement of hospital-community linkages. Designed as a 20-minute self-assessment survey, the draft version made available to the committee demands short answers from hospital administrators to a variety of questions about the hospital’s interactions not only with the community’s first-responder and public safety organizations but with other hospitals as well.

QUESTION G: What tools and models exist to measure preparedness for natural disasters?

ANSWER G: The committee examined the following assessment tools for possible application in whole or in part to the task of evaluating preparedness for CBR terrorism events:

Capability Assessment for Readiness (CAR)

  • FEMA self-assessment instrument to evaluate state emergency management

  • A 1,801-element survey administered to all states and territories in 1997

Suggested Citation:"Report." Institute of Medicine. 2001. Tools for Evaluating the Metropolitan Medical Response System Program: Phase I Report. Washington, DC: The National Academies Press. doi: 10.17226/10221.
  • “All-hazards” document with only a handful of items related to chemical and biological weapons

Local Capability Assessment for Readiness (LCAR)

  • FEMA’s smaller, local community version of CAR

  • Currently undergoing pilot testing in selected counties

Hazardous Materials Exercise Evaluation Supplement

  • Instructions and checklist for peer reviewers in FEMA’s Comprehensive HAZMAT Emergency Response-Capability Assessment Program

  • Sixteen elements, each with 10 to 50 “points of review”

  • Yes-or-no responses and the time that the specific action was observed

Epidemiologic Capacity Assessment Guide

  • Step two of a three-step process (Step 1 is document collection, and Step 3 is site visit) designed by the Council of State and Territorial Epidemiologists

  • Self-assessment questionnaire

  • Short answers or essays and data on speed of investigation from recent cases

  • Suggestions for interviews of key personnel

State Domestic Preparedness Equipment Program Assessment and Strategy Development Tool Kit

  • Instruments developed by DOJ, the Federal Bureau of Investigation, and CDC to evaluate vulnerability, threat, and public health performance combined with assessments of required and current capabilities in the realms of fire services, hazmat, emergency medical service (EMS), law enforcement, public works, public health, and emergency management

  • A 100-page “Tool Kit” provided for use by the state and local personnel assigned to fill out the forms, but could be the basis of peer interviews

  • State assessment designed to be a compilation of local assessments, so it is really a local instrument

Public Health Assessment Instrument for Public Health Emergency Preparedness (CDC)

  • Ten essential public health services amplified specifically for preparedness for CBR terrorism events

  • Nineteen “indicators,” each with multiple subparts requiring mostly yes-or-no answers

  • Part of DOJ state assessment instrument

Assessment of Community Linkages in Response to a Bioterrorism Event

  • Draft product of JCAHO and SAIC for AHRQ, due out in June 2001

  • Forty-item questionnaire for hospitals (yes-or-no and short answers)

Each of these instruments seeks information about elements of disaster preparedness that are directly relevant to CBR terrorism preparedness. All are written self-reports, and either of the


two most comprehensive assessments, done properly, would take several people many hours or even several days to complete. In addition, the committee believes that self-reports are vulnerable to corruption of indicators. It has long been understood in evaluations of health and social programs that when rewards and punishments result from people’s apparent performance on an indicator, that indicator can sometimes change in ways that have no bearing on the actual outcomes of the governmental program. In the MMRS context, at least two possible forces can lead to “corruption of indicators.” First, to the extent that municipalities may believe that continued federal funding is contingent on contract compliance, self-reports may make the situation appear to be better than it really is. Second, and alternatively, if local officials believe that further funding is dependent on need, self-reporting may actually lead to an underestimate of preparedness. The committee will reexamine the possible utility of these self-assessment instruments in Phase II of the study, but for the present, the committee views them as providing too little additional assurance for the substantial effort involved.

QUESTION H: Do current federal performance measures for natural disasters or other programs (mitigation and response) have application to preparedness for a terrorism incident involving WMD (e.g., FEMA Project IMPACT)?

ANSWER H: According to Jeff Glick, director of FEMA’s Assessment Branch, postdisaster assessments of the performances of federal, state, and local government offices and agencies during natural disasters are ad hoc and very much event specific. He reports that no common template or database of findings is available for possible use in evaluating the preparedness of MMRS program communities. He anticipates that a current effort by FEMA, the National Emergency Management Association, and others to create an emergency management accreditation program will eventually include performance standards as well as preparedness indicators based on LCAR.

QUESTION I: How can casualty assumptions for communities of varying populations be established (percent of population, historical data)?

ANSWER I: Casualty assumptions, including those generated by “plume” models and computer programs showing how a release spreads, all depend heavily on fairly extensive knowledge of the agent—how much, what kind, and how it was dispersed—precisely the facts least likely to be known in a terrorist incident. Modest changes in these “initial terms” lead to casualty predictions that vary by orders of magnitude. Computers can, of course, generate a potentially huge table or series of tables by systematically varying the initial terms, but the committee sees little to be gained by this approach, the rationale for which is presumably the need to estimate requirements for hospital beds, medications, other supplies and equipment, and personnel.
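The sensitivity to initial terms can be illustrated with a deliberately simple sketch of a ground-level Gaussian plume. All numbers here are hypothetical, and a real dispersion model would account for atmospheric stability, plume rise, terrain, and indoor exposure; the point is only that two equally plausible guesses about the unknowns yield concentration estimates differing by orders of magnitude.

```python
import math

def centerline_concentration(q_g_per_s, wind_m_per_s, sigma_y_m, sigma_z_m):
    """Ground-level centerline concentration (g/m^3) for a simplified
    Gaussian plume: ground-level release, no plume rise, no decay."""
    return q_g_per_s / (math.pi * wind_m_per_s * sigma_y_m * sigma_z_m)

# Two equally plausible sets of unknown "initial terms":
mild = centerline_concentration(10.0, 5.0, 200.0, 100.0)    # small release, windy, unstable air
severe = centerline_concentration(1000.0, 1.0, 40.0, 20.0)  # large release, calm, stable air

print(f"concentration spread: {severe / mild:.0f}x")  # prints "concentration spread: 12500x"
```

The four orders of magnitude between the two estimates come entirely from choices the planner cannot know in advance, which is the committee's argument against casualty tables built this way.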

OEP’s arbitrary selection of 1,000 casualties from the release of a chemical agent as a target figure for planning is probably a better approach. The same holds for biological agent incidents, for which the MMRS program contracts demand three levels of planning: one plan for events with fewer than 100 casualties, a second for events with 100 to 10,000 casualties, and a third for events with more than 10,000 casualties.
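The three contract planning levels amount to a simple partition of the estimated casualty count, which a brief sketch makes explicit (the numeric tier labels are ours, not OEP's):

```python
def planning_level(casualties: int) -> int:
    """Map an estimated biological-incident casualty count to the three
    MMRS planning levels: fewer than 100, 100 to 10,000, more than 10,000."""
    if casualties < 100:
        return 1
    if casualties <= 10_000:
        return 2
    return 3

print(planning_level(99), planning_level(100), planning_level(10_001))  # prints "1 2 3"
```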

Given the infinite number of possible incidents, an even better approach may be to turn 180 degrees and seek estimates of capacity rather than estimates of numbers of casualties. That is, ask cities how many patients they can currently care for with current staffing levels and standards of care; how many patients they could care for in a true mass-casualty situation,


allowing for some attrition of regular staff but with help from outside the community and a different standard of care; and finally, how many patients would truly overwhelm the system, with or without outside help.
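The capacity-oriented questions above could be captured in a small record per city. This is a sketch only, and the field names and thresholds are hypothetical, not part of the MMRS contract:

```python
from dataclasses import dataclass

@dataclass
class CityCapacity:
    """Three escalating answers to 'how many patients can you care for?'"""
    routine: int         # current staffing, current standard of care
    surge: int           # outside help, altered standard of care
    breaking_point: int  # load that overwhelms the system even with help

    def status(self, patients: int) -> str:
        if patients <= self.routine:
            return "routine"
        if patients <= self.surge:
            return "surge"
        if patients <= self.breaking_point:
            return "strained"
        return "overwhelmed"

city = CityCapacity(routine=250, surge=2_000, breaking_point=5_000)
print(city.status(1_500))  # prints "surge"
```

Framed this way, the assessment question becomes "where do your thresholds sit?" rather than "how many casualties do you predict?", which sidesteps the unknowable initial terms entirely.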

QUESTION J: How can OEP measure the preexisting systems, methodologies, and plans that public safety, public health, and health services agencies use to communicate during day-to-day operations? How can OEP measure the impact that the MMRS development process has had on the level of this communication or the expectations for this communication, or both?

ANSWER J: The committee suggested above, in its answer to Question C, that a missing but important part of Deliverable 2 is a prospective assessment of community strengths and weaknesses. This would certainly have to include communication among the fire department, the police department, emergency departments (EDs), trauma centers, poison control centers, EMS agencies, hazmat units, medical evacuation (Medevac) units, and other state and local agencies and institutions. Examination of existing mutual-aid agreements, or the lack thereof, should certainly be included.

As noted previously, even with pre- and postcontract measures, which will not actually be possible for at least the 122 cities that have already contracted with OEP, it would be very difficult to separate the effects of the MMRS program from those of other concurrent federal, state, and private-sector initiatives. Asking cities directly is probably better than no assessment, but current status is really all that can be objectively determined. That assessment could be carried out by independent questioning, written or oral, of essential participants in the public safety, public health, and health services sectors, or through evidence from periodic testing of emergency communication systems under adverse conditions and at times of typically low activity. Communications-related areas that might be probed include access to a common radio frequency and how often it is used, the numbers and compatibilities of cell phones, existing agreements and mechanisms for gaining priority use of wireless and landline phones, and Internet and intranet connectivity.

The committee cautions that although it is possible that under some circumstances planning for an extraordinary event might improve the ability to conduct ordinary activities, it is by no means certain. Planning for detection of and coping with epidemics caused by bioterrorism may well make detection of and coping with a meningitis outbreak more efficient, but it is not likely that everyday care of individual patients with infectious diseases will be similarly affected. Certainly, it would be a mistake to judge preparations for a rare mass-casualty event solely by changes, or lack of changes, in everyday effectiveness or efficiency.

QUESTION K: How can financial barriers related to WMD preparedness be identified and measured?

ANSWER K: In the course of its data collection effort, the committee has become aware of some financial barriers (no doubt known to OEP as well) that hinder preparedness in at least some MMRS cities. Most prominent is the difficulty of sharing funding or material purchased with OEP contract dollars with adjacent jurisdictions and private-sector entities, especially hospitals and physicians. In the former case, political and sometimes legal considerations underlie a predictable reluctance of elected officials to spend “their” money on others’ constituents. In the case of private-sector hospitals, financial pressures from the current adverse economic climate in health care, including competition from other local hospitals, have led most hospitals to eliminate the spare or surge capacity needed to cope with disasters of any sort. Even “free” equipment brings an obligation to provide expensive maintenance and staff training.


Indeed, any participation in the local MMRS results in a similar training obligation, and as a result, cities unable to provide financial incentives have had great difficulty securing the participation of private hospitals. The MMRS program is a direct response to a “national security threat” and as such should be funded by the federal government at a level and in a manner that cover both the initial costs and the continuing costs of sustaining preparedness. This does not seem an unreasonable expectation, given the very large increases in funding for counterterrorism programs proposed in the FY 2002 federal budget.

As to the larger methodological question of how OEP can identify and measure such financial barriers, a large piece of the committee’s Phase II task will be to put together a postcontract questionnaire (or final deliverable) that asks cities about these barriers and about any changes in the way they carry out everyday business that might be attributable to the MMRS program (see Questions and Answers A, B, C, E, and J).

PERFORMANCE MEASURES AND PREPAREDNESS INDICATORS

The MMRS contract deliverables are all written plans, and although written plans are certainly necessary elements of preparedness, they are in most cases only the beginning of a continuing process. Some elements of these plans can be carried out only during or after an actual incident or a very realistic exercise, but many require advance preparations, such as the purchase of equipment, the hiring or training of personnel, or even changes in the way everyday business is conducted (for example, citywide electronic surveillance of ED calls). Even though these advance preparations and their documentation are actions, and are necessary for preparedness, they are not the same sort of performances that might be assessed in an actual mass-casualty event (whether it involves CBR terrorism or not) or in a drill or field exercise. Measures related to advance preparations are generally easier and cheaper to collect, however, and can provide a measure of effective response capability or potential (although, in the absence of an act of mass-casualty-producing CBR terrorism, no data can validate the relationship between the selected indicators and actual performance). The committee therefore prefers the more inclusive term “preparedness indicators” to “performance measures.”

The committee’s recommended preparedness indicators are presented in Attachment 2 as a series of tables. A separate table is provided for each of the substantive deliverables of the MMRS program’s fiscal year (FY) 2000 contract (omitted are deliverables calling for a meeting with the project officer, monthly progress reports, and a final report). In each table the far left column, labeled “Plan Elements,” lists the required elements of the deliverable, numbered in accord with the checklist supplied to FY 2000 MMRS cities by OEP under the title “2000 MMRS Contract Deliverable Evaluation Instrument.”

The remaining three columns of the tables present the committee’s suggested preparedness indicators for each plan element. These fall into three categories: inputs, processes, and outputs.

Inputs are the constituent parts called for, implicitly or explicitly, by a given deliverable. The plan itself would be at least one input for nearly every deliverable, assuming that the required plans have been completed by the time the assessment is undertaken. Other inputs could be designated personnel; standard operating procedures; equipment and supplies; or schedules of planned meetings, training, and other future activities.


Processes are evidence of actions taken to support or implement the plan. Evidence that such actions had been taken or are under way might include minutes of meetings, agreements prepared, training sessions conducted, or the numbers or percentages of personnel trained to use CBR detection equipment.

Outputs are indicators of effective capabilities developed through the actions included under processes, that is, indicators of the effectiveness of actions taken to support or implement the MMRS plan. They would include preparations that have been completed, for example, establishment of a stockpile of antidotes and antibiotics appropriate for the agents that pose the greatest threat, with evidence of adequate maintenance and deployment procedures. Another output would be demonstration of critical knowledge, skills, and abilities in tabletop exercises, full-scale drills, or surrogate incidents (hoaxes, nondeliberate chemical releases, naturally occurring epidemics, or isolated cases of rare diseases). Outputs may be evaluated through expert judgment by peer reviewers of answers to written questions or on-site probes. In all cases care should be taken to avoid inappropriate generalization from chemical to biological incidents and vice versa.

The best evidence for preparedness will always be outputs, which are the end products of processes undertaken with inputs. A variety of circumstances, including the timing of the assessment, may make collection of output data impossible or impractical. In this circumstance evidence for preparedness might be sought among inputs and processes. All three types of indicators are, however, merely surrogate or proxy measures of MMRS effectiveness that are based on the judgment of knowledgeable students of the field but that have never been truly validated (and cannot be, short of an actual mass-casualty CBR terrorism incident).
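The evidentiary ordering described above, outputs over processes over inputs, can be made concrete in a small sketch; the type names and example indicators are illustrative, not OEP's:

```python
from enum import IntEnum

class IndicatorType(IntEnum):
    """Evidentiary strength ordering: outputs > processes > inputs."""
    INPUT = 1
    PROCESS = 2
    OUTPUT = 3

def rank_evidence(indicators):
    """Sort (type, description) pairs from strongest to weakest evidence."""
    return sorted(indicators, key=lambda pair: pair[0], reverse=True)

collected = [
    (IndicatorType.INPUT, "written mutual-aid agreement on file"),
    (IndicatorType.OUTPUT, "full-scale drill completed with after-action review"),
    (IndicatorType.PROCESS, "minutes of joint planning meetings"),
]
best_type, best_desc = rank_evidence(collected)[0]
print(best_type.name)  # prints "OUTPUT"
```

An evaluator applying this ordering would report the strongest class of evidence actually collected for each plan element and fall back to processes or inputs only when output data are unavailable.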

The tables in the Appendix present many preparedness indicators, in part because of the committee’s decision to derive indicators for each of the items on OEP’s checklist of required plan elements. In fact, no practical evaluation program could or should use all the indicators listed. The output-based indicators, presented in the far right column of each table, provide the best means of assessing readiness, and whenever possible they should be used in preference to process- or input-based indicators. Indeed, the importance of the output-based indicators, especially those obtained from exercises or careful evaluation of real disasters, cannot be overemphasized. The committee will expand on this point in Phase II of the study, but an important advantage of outputs is that they reflect intangibles not easily captured by the input and process indicators suggested here. For example, a strong MMRS requires a champion committed to continually advocating for the project, individuals willing to cooperate, organizational leadership willing to adopt an interorganizational and systemic approach to the MMRS, and leaders from local, state, federal, and private agencies who trust and are sensitive to one another’s missions, goals, strengths, and weaknesses.

Similarly, process-based indicators should take precedence over input-based indicators. In addition, not every element of the plan need be given equal weight in the evaluation of preparedness; indeed, it may not be necessary to include every element in even a very comprehensive evaluation. This selection and prioritization process will constitute a significant focus of the committee’s work in Phase II of this project, as will determination of the most effective and efficient means of collecting the desired information and, whenever possible, the specification of minimum standards for preparedness.

At a more general level, the committee has been favorably impressed by the catalytic role of the MMRS program in many communities. As noted above, the concurrent efforts of three


federal agencies in the nation’s largest cities make it impossible to unequivocally assign credit for improvements in preparedness. The committee believes, however, that OEP’s emphasis on collaboration, its use of existing agencies and programs, and its promotion of local discretion in addressing preparedness gaps, although difficult to measure, have been an undeniable contribution. One of the challenges of Phase II will be to ensure that the assessment gives that collaboration appropriate weight.
