
Emergency Communications Planning for Airports (2016)

Chapter Eight - Evaluating the Effectiveness of Emergency and Crisis Communications

Suggested Citation: "Chapter Eight - Evaluating the Effectiveness of Emergency and Crisis Communications." National Academies of Sciences, Engineering, and Medicine. 2016. Emergency Communications Planning for Airports. Washington, DC: The National Academies Press. doi: 10.17226/23591.


chapter eight

EVALUATING THE EFFECTIVENESS OF EMERGENCY AND CRISIS COMMUNICATIONS

EVALUATION

The evaluation of airport emergency or crisis communications plans is most often described in qualitative terms during a hot wash and the after-action review/report. The feedback or change loop is often not actually measured. Measuring communications effectiveness is difficult, as communications is just one element that employees, tenants, and passengers experience in emergency response or a mission-critical event. For example, more attention is given to how long a terminal evacuation takes, how thorough it is, and how smoothly terminal repopulation proceeds afterwards (Griffith et al. 2014). Good communications can promote those outcomes but is hard to measure in the process.

During the course of this study, the main metrics encountered were for social media, and those were mostly for marketing, not for emergencies. However, Palttala and Vos (2012) have developed and tested a set of quality indicators, a scorecard, for crisis communications to support emergency management by public authorities. Their tool's framework, based on the concepts of institutional learning, is described as an audit instrument that facilitates learning, supports the continuous improvement process for crisis communications, and gives insight into the performance measures by which efficiency can be measured (pp. 39-40). The scorecard is notable for tracking ongoing communications through all the stages of emergency management (p. 40). It is strongly oriented toward the information needs and reactions of stakeholders, the media, and the public (pp. 41-42).

A notable exception to the lack of quantitative measures of the effectiveness, or at least the effects, of the use of social media during an emergency at an airport came during and immediately after the Nov. 1, 2013, active shooter incident at LAX.
The Los Angeles Visitors and Convention Bureau, having appropriate data analysis tools, analyzed the social media traffic, particularly Twitter, to look for engagement, that is, conversations that resulted when the airport's tweets were responded to by members of the public (LAWA n.d.; M. Grady, personal communication, Nov. 17, 2015). The partnership of the airport and the visitors bureau to evaluate social media effectiveness can be copied by nearly any U.S. airport, as most airports already work with their local visitors or tourism bureaus.

Burns (2013) evaluated LAX's emergency use of Twitter by tracking the increase in the number of Twitter followers the airport had, a number that nearly doubled from October 2013 to November 2013. Oliver (2013) tracked responses to LAX commenting on the quality and usefulness of the airport's use of Twitter. A final important aspect of the evaluation of social media was also illustrated by LAX in November 2013: the airport included expert verification, usually from senior law enforcement officials, in its tweets, even as those senior officials emphasized that the LAX Twitter account was the official source of information (Wilson 2013).

As noted in ACRP Synthesis 60, airports are becoming more transparent in their sharing of experiences dealing with emergencies and crises, including AARs and lessons learned, with other airports; this is another way that an airport can evaluate its emergency and crisis communications plan.

To evaluate the emergency communications plan, Watsonville airport holds two airport-specific exercises a year, and one of them is a surprise drill arranged in partnership with the city fire department. After each exercise there is an after-action review with broad participation by stakeholders, and the plan is tweaked as needed. (WVI Case Example)

The @LAX_Official account tweeted approximately 500 times [on Nov. 1-2, 2013] and generated more than 260 million impressions. The equivalent media value for this activity is more than $2,000,000. The value is based on $8.43 CPM (cost-per-thousand impressions).
(Karz 2013)
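The equivalent-media-value figure in the quotation above follows from simple arithmetic: impressions divided by one thousand, multiplied by the CPM rate. A quick check of the reported numbers (the function name here is ours, not drawn from any cited source):

```python
# Estimate equivalent media value from social media impressions, using the
# CPM (cost-per-thousand-impressions) method cited in the Karz (2013) quote.

def media_value(impressions: int, cpm_dollars: float) -> float:
    """Equivalent media value in dollars: (impressions / 1000) * CPM."""
    return impressions / 1000 * cpm_dollars

# Figures reported for @LAX_Official on Nov. 1-2, 2013:
value = media_value(impressions=260_000_000, cpm_dollars=8.43)
print(f"${value:,.0f}")  # about $2.19 million, i.e., "more than $2,000,000"
```

The calculation confirms that 260 million impressions at an $8.43 CPM is consistent with the "more than $2,000,000" figure Karz reports.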

Perhaps the most useful part of the Palttala and Vos scorecard is the generic outline/matrix that relates the stages of crisis and emergency management activities to specific communications tasks and stakeholder groups (p. 45). The tasks are phrased as statements that can be rated on a fixed scale (p. 46). In the water contamination emergency example, Palttala and Vos use a scale of 1 to 5:

1 = This is completely not taken care of;
2 = The importance has been recognized, but no action is taken;
3 = We have started to manage/act on this;
4 = This is part of the action, but non-systematic; and
5 = This is a systematic (and expected) part of the action.

Applying this scale to the crisis phases/stakeholder matrix allowed the computation of scores (p. 47) that help in spotting strengths and weaknesses in the communications plan. It looks quantitative, but in reality it is not substantially different from the HSEEP ratings of how well an exercise's capability targets are performed, defined as:

P—Performed without challenge
S—Performed with some challenges
M—Performed with major challenges
U—Unable to be performed (FEMA 2015).

The Palttala and Vos scorecard approach may be very helpful for making sure that an airport's emergency or crisis communications plan addresses the right types of activities and the right stakeholders, but it will probably be cumbersome to apply. This scorecard could easily accompany the information that was learned in the airport's after-action report/hot wash meeting and become part of the overall documentation of the event.

APPLICATION OF LESSONS LEARNED

Lessons learned from real incidents and from exercises about communications need to be captured during the evaluation phase and reported in a manner that allows for follow-up.
Unless action items are assigned and progress on them is tracked, the lessons learned are likely to be lost. If the communications lessons learned from an airport's emergency activities are not applied to future behavior and investments, the airport is wasting a major opportunity for self-improvement.

Lessons learned can involve things that went wrong and systemic failures. These are the most important kind of lessons learned, as they can lead to corrective actions through the continuous improvement process. Ideally, such lessons learned are garnered through exercises so that the consequences of actual emergencies can be mitigated by appropriate preparedness measures. In a way, exercises may be visualized as experiments to test plans (here, crisis/emergency communications plans), and failures can be invaluable as learning opportunities (Edmondson 2011).

The airports were specifically asked if they had a formal process for incorporating lessons learned from exercises into their written plans and procedures such as AEPs, SOPs, or communications plans (Question 53). Nearly half (48%) of the surveyed airports have a formal system, but an equal number do not; 4% skipped the question. About one-fifth of the airports (22%) have a written process for capturing and applying lessons learned.

During a recent (2015) after-action review meeting, an important lesson was learned as a result of CCP activation dealing with an aircraft crash. The review revealed that using telephones, rather than other communications media, for "critical information" such as runway opening/closure was essential to avoid confusion and to ensure that vital information was clear to all parties involved. The second critical change was to have a single point of contact in airport operations and in air traffic control, so that messages were not a point of confusion between different employees. (BOI Case Example)

Every event or exercise is followed by post-event debriefs and evaluations of what went right, what went wrong, and how we can improve and incorporate these lessons learned immediately into our plan, processes, and procedures. (DFW Case Example)
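Case examples like these describe an informal loop: capture a lesson, assign a corrective action, and track it to closure. A hypothetical sketch of such a tracker, with field names and a sample item invented for illustration rather than taken from any airport's actual system:

```python
# Hypothetical minimal tracker for after-action improvement items: each
# lesson learned becomes an action item with an owner and a status, so
# progress can be reviewed before the next exercise.

from dataclasses import dataclass, field

@dataclass
class ActionItem:
    lesson: str
    corrective_action: str
    owner: str
    status: str = "open"  # open -> in_progress -> closed

@dataclass
class ImprovementPlan:
    items: list = field(default_factory=list)

    def add(self, item: ActionItem) -> None:
        self.items.append(item)

    def open_items(self) -> list:
        """Items not yet closed, for review at the next after-action meeting."""
        return [i for i in self.items if i.status != "closed"]

plan = ImprovementPlan()
plan.add(ActionItem(
    lesson="Runway status relayed inconsistently during CCP activation",
    corrective_action="Designate single points of contact in ops and ATC",
    owner="Airport Operations",
))
print(len(plan.open_items()))  # 1
```

The point is not the data structure but the discipline it encodes: an item stays visible until someone closes it, which is exactly the follow-up the text says is otherwise lost.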

Ten of the surveyed airports, all among the 22% that reported having a written process for applying lessons learned, reported the use of one or more of five basic tools (Question 54):

• After-action reviews (AARs)
• Improvement plans
• Explicit provisions in the AEP specifying process and individual responsibilities
• HSEEP AAR/Improvement Plan Matrix (DHS 2013)
• Active tracking of the implementation of lessons learned, either by a committee or by assigned individuals.

The survey results show that these tools are sometimes used in combination; this is also the recommendation of HSEEP (DHS 2013). It is important that airports continue the final process of assessment with metrics that can be implemented and used to improve the level of compliance gained in the next exercise; otherwise the planning effort could be viewed as futile and a waste of resources (Smith et al. 2016).
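As one illustration of the kind of metric this implies, the Palttala and Vos 1-to-5 scale described earlier can be tallied across a phase/stakeholder matrix to flag weak spots. The phase and stakeholder names below are invented for the example; the published matrix (p. 45) is more detailed:

```python
# Illustrative sketch of Palttala-and-Vos-style scorecard tallying:
# communications tasks are rated 1-5 per crisis phase and stakeholder
# group, and low-scoring cells point at weaknesses in the plan.
# The phase and stakeholder names are examples, not the published matrix.

from statistics import mean

# ratings[phase][stakeholder] on the 1-5 scale described in the text
ratings = {
    "preparedness": {"public": 4, "media": 3, "responders": 5},
    "response":     {"public": 2, "media": 4, "responders": 5},
    "recovery":     {"public": 3, "media": 2, "responders": 4},
}

def phase_scores(ratings: dict) -> dict:
    """Average rating per crisis phase."""
    return {phase: mean(cells.values()) for phase, cells in ratings.items()}

def weaknesses(ratings: dict, threshold: int = 3) -> list:
    """Phase/stakeholder cells rated below the threshold."""
    return [(p, s, r) for p, cells in ratings.items()
            for s, r in cells.items() if r < threshold]

print(phase_scores(ratings))  # average score for each phase
print(weaknesses(ratings))    # [('response', 'public', 2), ('recovery', 'media', 2)]
```

As the text notes, the output is only as quantitative as the judgments behind the ratings; its value is in making gaps, here the public-facing response and media-facing recovery cells, visible at a glance.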

TRB's Airport Cooperative Research Program (ACRP) Synthesis 73: Emergency Communications Planning for Airports explores emergency communications planning and is specifically designed for use by airport senior management, public information officers, and first responders and emergency managers. The report includes sample communication plan tables of contents, field operations guides, and a checklist of effective communications plans.
