chapter eight

EVALUATING THE EFFECTIVENESS OF EMERGENCY AND CRISIS COMMUNICATIONS

EVALUATION

The evaluation of airport emergency or crisis communications plans is most often described in qualitative terms during a hot wash and the after-action review/report. The feedback or change loop is often not actually measured. Measuring communications effectiveness is difficult, as communications is just one element that employees, tenants, and passengers experience in emergency response or a mission-critical event. For example, more attention is given to how long and how thorough a terminal evacuation is, and to how smoothly terminal repopulation proceeds afterward (Griffith et al. 2014). Good communications can promote those outcomes but are hard to measure in the process.

During the course of this study, the main metrics encountered were for social media, and those were mostly for marketing, not for emergencies. However, Palttala and Vos (2012) have developed and tested a set of quality indicators—a scorecard—for crisis communications to support emergency management by public authorities. Their tool's framework, based on the concepts of institutional learning, is described as an audit instrument that facilitates learning, supports the continuous improvement process for crisis communications, and gives insight into the performance measures by which efficiency can be measured (pp. 39–40). The scorecard is notable for tracking ongoing communications through all the stages of emergency management (p. 40). It is strongly oriented toward the information needs and reactions of stakeholders, the media, and the public (pp. 41–42).

A notable exception to the lack of quantitative measures of the effectiveness, or at least the effects, of the use of social media during an emergency at an airport came during and immediately after the Nov. 1, 2013, active shooter incident at LAX.
The Los Angeles Visitors and Convention Bureau, having appropriate data analysis tools, analyzed the social media traffic, particularly Twitter, to look for engagement—that is, conversations that resulted when the airport's tweets were responded to by members of the public (LAWA n.d.; M. Grady, personal communication, Nov. 17, 2015). The partnership of the airport and the visitors bureau to evaluate social media effectiveness can be copied by nearly any U.S. airport, as most airports already work with their local visitors or tourism bureaus. Burns (2013) evaluated LAX's emergency use of Twitter by tracking the increase in the number of Twitter followers the airport had, a number that nearly doubled from October 2013 to November 2013. Oliver (2013) tracked responses to LAX commenting on the quality and usefulness of the airport's use of Twitter. A final important aspect of evaluating social media was also illustrated by LAX in November 2013: the airport included expert verification, usually from senior law enforcement officials, in its tweets, even as those senior officials emphasized that the LAX Twitter account was the official source of information (Wilson 2013).

    To evaluate the emergency communications plan, Watsonville airport holds two airport-specific exercises a year, and one of them is a surprise drill arranged in partnership with the city fire department. After each exercise there is an after-action review with broad participation by stakeholders, and the plan is tweaked as needed. (WVI Case Example)

    The @LAX_Official account tweeted approximately 500 times [on Nov. 1–2, 2013] and generated more than 260 million impressions. The equivalent media value for this activity is more than $2,000,000. The value is based on $8.43 CPM (cost-per-thousand impressions). (Karz 2013)
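The media-value figure Karz cites follows directly from the CPM definition: impressions divided by 1,000, multiplied by the dollar rate per thousand. A quick check of the arithmetic, using only the numbers reported above:

```python
# Equivalent media value of social media activity at a given CPM
# (cost per thousand impressions), using the figures from Karz (2013).
impressions = 260_000_000   # impressions generated Nov. 1-2, 2013
cpm = 8.43                  # dollars per thousand impressions

media_value = impressions / 1000 * cpm
print(f"${media_value:,.0f}")  # → $2,191,800, consistent with "more than $2,000,000"
```

The same calculation works for any airport that can obtain an impression count from its social media analytics and a CPM rate from its marketing or media-relations partners.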
As noted in ACRP Synthesis 60, airports are becoming more transparent in sharing their experiences in dealing with emergencies and crises, including AARs and lessons learned, with other airports. This is another way that an airport can evaluate its emergency and crisis communications plan.

Perhaps the most useful part of the Palttala and Vos scorecard is the generic outline/matrix that relates the stages of crisis and emergency management activities to specific communications tasks and stakeholder groups (p. 45). The tasks are phrased as statements that can be rated on a fixed scale (p. 46). In the water contamination emergency example, Palttala and Vos use a scale of 1 to 5:

1 = This is completely not taken care of;
2 = The importance has been recognized, but no action is taken;
3 = We have started to manage/act on this;
4 = This is part of the action, but non-systematic; and
5 = This is a systematic (and expected) part of the action.

Applying this scale to the crisis phases/stakeholder matrix allowed the computation of scores (p. 47) that help in spotting strengths and weaknesses in the communications plan. It looks quantitative, but in reality it is not substantially different from the HSEEP ratings of how well an exercise's capability targets are performed, defined as:

P—Performed without challenge
S—Performed with some challenges
M—Performed with major challenges
U—Unable to be performed (FEMA 2015).

The Palttala and Vos scorecard approach may be very helpful for making sure that an airport's emergency or crisis communications plan addresses the right types of activities and the right stakeholders, but it will probably be cumbersome to apply. The scorecard could easily accompany the information learned in the airport's after-action report/hot wash meeting and become part of the overall documentation of the event.

APPLICATION OF LESSONS LEARNED

Lessons learned about communications from real incidents and from exercises need to be captured during the evaluation phase and reported in a manner that allows for follow-up.
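The scoring step of the Palttala and Vos matrix can be sketched in a few lines: average the 1-to-5 ratings along each axis of the phase-by-stakeholder matrix, and low averages flag weak phases or neglected stakeholder groups. The phase names, stakeholder groups, and ratings below are hypothetical, for illustration only; they are not drawn from the published example.

```python
# Sketch of scoring a phase x stakeholder communications matrix on the
# Palttala and Vos 1-5 scale. All names and ratings are hypothetical.
ratings = {
    ("preparedness", "public"): 4,
    ("preparedness", "media"):  3,
    ("response",     "public"): 5,
    ("response",     "media"):  2,
    ("recovery",     "public"): 3,
    ("recovery",     "media"):  1,
}

def average_by(index):
    """Average ratings grouped by phase (index=0) or stakeholder (index=1)."""
    groups = {}
    for key, score in ratings.items():
        groups.setdefault(key[index], []).append(score)
    return {name: sum(s) / len(s) for name, s in groups.items()}

print(average_by(0))  # per-phase averages: a low score flags a weak phase
print(average_by(1))  # per-stakeholder averages: a low score flags a neglected group
```

With these illustrative numbers, recovery-phase communications and media relations would stand out as the weak spots, which is exactly the kind of pattern the scorecard is meant to surface.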
Unless action items are assigned and progress on them is tracked, the lessons learned are likely to be lost. If the communications lessons learned from an airport's emergency activities are not applied to future behavior and investments, the airport is wasting a major opportunity for self-improvement.

Lessons learned can involve things that went wrong and systemic failures. These are the most important kind of lessons learned, as they can lead to corrective actions through the continuous improvement process. Ideally, such lessons learned are garnered through exercises so that the consequences of actual emergencies can be mitigated by appropriate preparedness measures. In a way, exercises may be visualized as experiments to test plans (here, crisis/emergency communications plans), and failures can be invaluable as learning opportunities (Edmondson 2011).

The airports were specifically asked if they had a formal process for incorporating lessons learned from exercises into their written plans and procedures such as AEPs, SOPs, or communications plans (Question 53). Nearly half (48%) of the surveyed airports have a formal system, but an equal number do not; 4% skipped the question. About one-fifth of the airports (22%) have a written process for capturing and applying lessons learned.

    During a recent (2015) after-action review meeting, an important lesson was learned as a result of CCP activation dealing with an aircraft crash. The review revealed that using telephones, rather than other communications media, for "critical information" such as runway opening/closure was essential to avoid confusion and to ensure that vital information was clear to all parties involved. The second critical change was to have a single point of contact in airport operations and in air traffic control, so that messages were not a point of confusion between different employees.
    (BOI Case Example)

    Every event or exercise is followed by post-event debriefs and evaluations of what went right, what went wrong, and how we can improve and incorporate these lessons learned immediately into our plans, processes, and procedures. (DFW Case Example)
Ten (10) of the surveyed airports—all among the 22% that reported having a written process for applying lessons learned—reported the use of one or more of five basic tools (Question 54):

• After-action reviews (AARs)
• Improvement plans
• Explicit provisions in the AEP specifying process and individual responsibilities
• HSEEP AAR/Improvement Plan Matrix (DHS 2013)
• Active tracking of the implementation of lessons learned, either by a committee or by assigned individuals.

The survey results show that these tools are sometimes used in combination; this is also the recommendation of HSEEP (DHS 2013). It is important that airports continue the final process of assessment with metrics that can be implemented and used to improve the level of compliance gained in the next exercise; otherwise the planning effort could be viewed as futile and a waste of resources (Smith et al. 2016).
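The "active tracking" tool in the list above need not be elaborate: a structured list of action items, each with a lesson, an owner, a due date, and a status, is enough to keep lessons learned from being lost. A minimal sketch follows; the field names and sample items are hypothetical, loosely modeled on the BOI case example, not taken from any airport's actual improvement plan.

```python
# Minimal lessons-learned action-item tracker (illustrative sketch only).
from dataclasses import dataclass

@dataclass
class ActionItem:
    """One corrective action captured in an AAR/improvement plan."""
    lesson: str
    owner: str
    due: str              # target date, ISO format
    status: str = "open"  # "open" until the change is verified in an exercise

items = [
    ActionItem("Use telephone for runway open/close notifications", "Airport Ops", "2016-01-15"),
    ActionItem("Single point of contact between ops and ATC", "Airport Ops", "2016-02-01", status="closed"),
]

# Items still needing follow-up before the next exercise
open_items = [i for i in items if i.status == "open"]
print(f"{len(open_items)} open action item(s)")
```

Whether kept in a spreadsheet, a database, or the HSEEP improvement-plan matrix itself, the essential discipline is the same: every lesson gets an owner and a status that someone reviews.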