H
Preparatory Exercises at the Salt Lake City Olympics

Summary of a presentation by Brian Beitler, DTRA

At the 2002 Olympic Winter Games in Salt Lake City, Utah, DTRA played a key role in preparing for the possibility of a terrorist attack involving an atmospheric release of hazardous agents. DTRA and several other groups involved in this work operated out of a central "smart building" fully equipped with computing, communications, and atmospheric monitoring equipment. The building could also protect its inhabitants in case of a nearby release.

The primary dispersion modeling system employed in this work was HPAC (described earlier). The primary capabilities were based at DTRA headquarters in Alexandria, Virginia, but the groups planned for redundancy, with backup systems running at NCAR and Dugway Proving Ground. Meteorological data servers, which provide a critical source of input for the HPAC system, were available from three different locations (Alexandria, Virginia; Dugway Proving Ground; and Salt Lake City, Utah [SLC]). Investigators also had continuous, real-time atmospheric monitoring in SLC throughout the games and drew upon daily SLC forecast discussions and teleconferences with the SLC National Weather Service.

The weather forecasts employed were split into two regimes:

1. A 0-12-hour forecast, generated with the MM5 model, which could be "nudged" with real-time observations from SLC's mesonet system; and
2. A 12-36-hour forecast, generated with their "expert system" in combination with high-resolution forecasts from the RAMS/OMEGA modeling system and the University of Utah's MM5.

They ran an intercomparison test of available modeling systems, simulating the release of a nerve agent from a sprayer. All of the models used to simulate the resulting plume gave slightly different answers.
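One way to combine such differing plume forecasts is a simple ensemble mean, averaging the member models point by point. The sketch below uses purely illustrative numbers (the model values and "truth" are hypothetical, not data from the exercise) to show how the ensemble mean can score better than any single member when individual model errors partially cancel:

```python
import math

# Hypothetical plume concentration forecasts (arbitrary units) from
# three dispersion models at five sampler locations. All numbers are
# illustrative only, not results from the SLC intercomparison.
models = {
    "model_a": [1.0, 2.5, 4.0, 2.0, 0.5],
    "model_b": [1.4, 2.0, 3.2, 2.6, 0.9],
    "model_c": [0.8, 3.0, 4.5, 1.5, 0.4],
}
# Assumed "ground truth" values (e.g., from a mesonet-driven analysis).
truth = [1.1, 2.4, 3.9, 2.1, 0.6]

# Ensemble mean: average the member forecasts point by point.
ensemble = [sum(vals) / len(vals) for vals in zip(*models.values())]

def rmse(forecast, observed):
    """Root-mean-square error of a forecast against observations."""
    return math.sqrt(
        sum((f - o) ** 2 for f, o in zip(forecast, observed)) / len(observed)
    )

for name, fcst in {**models, "ensemble": ensemble}.items():
    print(f"{name}: RMSE = {rmse(fcst, truth):.3f}")
```

With these illustrative numbers, the biases of the individual models partially cancel, so the ensemble mean has the lowest RMSE of the four forecasts.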
In a comparison to "ground truth" obtained by a local mesonet system, investigators found that an ensemble mean of all the model simulations performed better than any single model.

Several important lessons were learned from this work:

· It is valuable to have redundancy in all of the critical systems (monitoring, computing, communications, etc.).
· One of the biggest technical challenges can be dealing with communication issues (file transfer protocol limitations, firewalls, etc.).
· It is impossible to plan for every situation, so flexibility in operations is necessary. It is also important to have contingency and backup plans for disseminating dispersion model forecasts and other data products.
· It is not clear which provides a better measure of "truth": a high-resolution model that is regularly spaced or the actual observations that are irregularly spaced.
· There is a benefit to using high-fidelity weather data. When the local details of topography (e.g., upslope and downslope flows) were included in the SLC forecasts, they produced very complex plumes.

DTRA is currently investigating simpler alternatives for generating transport and dispersion forecasts quickly. For instance, in mountainous areas such as SLC, decisions about whether residents may use wood-burning stoves are based on the ventilation index, a function of boundary layer height and wind speed. A poor ventilation index means a low boundary layer and light winds, so any release will be trapped closer to the ground. This type of simple parameterization may lend itself to "quick look" dispersion forecasting as well.
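The ventilation-index idea can be sketched in a few lines. A common formulation takes the product of mixing (boundary layer) height and transport wind speed; the category thresholds below are hypothetical placeholders, since actual regulatory cutoffs vary by agency and by the units used:

```python
def ventilation_index(mixing_height_m, wind_speed_ms):
    """Ventilation index as the product of boundary layer (mixing)
    height and mean transport wind speed, in m^2/s."""
    return mixing_height_m * wind_speed_ms

def ventilation_category(vi):
    """Map a ventilation index to a dispersion category.
    Thresholds are illustrative only, not official cutoffs."""
    if vi < 1000:
        # Shallow boundary layer and/or light winds: any release
        # stays trapped near the ground.
        return "poor"
    elif vi < 3000:
        return "fair"
    else:
        # Deep, well-mixed layer with stronger winds: rapid dilution.
        return "good"

# Example: a 200 m mixing height with 2 m/s winds gives VI = 400.
vi = ventilation_index(200.0, 2.0)
print(vi, ventilation_category(vi))
```

A lookup of this sort illustrates why the approach appeals for "quick look" forecasting: it needs only two routinely forecast quantities rather than a full dispersion model run.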