Research Required to Support Comprehensive Nuclear Test Ban Treaty Monitoring

3 Monitoring Technologies: Research Priorities

INTRODUCTION

This chapter summarizes specific research issues confronting the major CTBT monitoring technologies and discusses strategies to enhance national monitoring capabilities through further basic and applied research. The technical challenges and operational issues differ among the technologies, but all of them share the basic functional requirements discussed in Section 2.2. Most of the research needs are discussed with respect to the monitoring challenges of detection, association, location, size estimation, and identification. The panel anticipates that new research challenges and issues will emerge as the IMS is deployed and there is experience with the analysis and use of its data. Presumably these needs will be motivated by "problem" events that defy present identification capabilities. Thus, although this report seeks to identify current areas of research priority, the panel emphasizes that a successful CTBT research program should maintain flexibility to shift emphasis and should nurture basic understanding in related areas that may provide unexpected solutions to future monitoring challenges.

3.1 SEISMOLOGY

Major Technical Issues

The CTBT seismic monitoring system will be challenged by the large number of small events that must be processed on a global basis to provide low-yield threshold monitoring of the underground and underwater environments. Although U.S. monitoring efforts will focus on certain areas of the world, those areas are still extensive. In many cases, the locations lack prior nuclear testing for direct calibration of identification; they require calibration efforts to improve location capability; and, for the most part, they have not been well instrumented seismologically.
While historical arrival time observations are available from stations in many regions of the world, readily available waveform signals for determining local structures are less common. Implementation of well-established procedures for calibrating primary functions of the IMS seismic network (in conjunction with additional seismological NTM), such as event detection, association, and location, will provide a predictable level of monitoring capability.
In principle, the stated U.S. goal of high-confidence detection and identification of evasively conducted nuclear explosions of a few kilotons is achievable in limited areas of interest. In practice, doing so will require adequate numbers of appropriately located sensors, sufficient calibration of regional structures, and the development and validation of location and identification algorithms that use regional seismic waves. With the advent of the IMS and planned improvements in U.S. capabilities, many of the current data collection requirements for achieving the current national monitoring objectives will be met. However, additional research is certainly required to use the new data to meet these objectives. Given the current state of knowledge, a number of seismic events within the magnitude range of U.S. monitoring goals would not be distinguishable from nuclear explosions, even if the full IMS-NTM seismic network were in operation. Routine calibration methods will somewhat reduce the upper bound on this population of problem events in certain areas, but even then, research will be essential for significantly improving the overall capabilities of the system. The purpose of the research programs reviewed in this report is to improve monitoring capabilities to the level defined by U.S. monitoring goals. There are several philosophies in the seismological community about how best to advance the capabilities of seismic monitoring systems, and there is extensive experience with global and regional monitoring of earthquakes and global monitoring of large nuclear explosions. Earthquake monitoring has emphasized collecting data from large numbers of stations, usually in the form of parametric data such as arrival times and amplitudes of seismic phases provided by station operators to a central processing facility.
Several thousand global stations contribute data of this type to the production of bulletins and catalogs of the USGS/NEIS and the ISC (see NRC, 1995). Earthquake studies have prompted the development of many global and regional seismic velocity models for use in event location procedures. Many regional seismographic networks process short-period digital seismic waveforms for local earthquake bulletin preparation, and there has been some progress in use of near-real-time digital seismic data for production of the USGS/NEIS bulletins. For these bulletins, the need for prompt publication is usually less than that associated with nuclear test monitoring. When there is a need for a rapid result, as in documenting the location of an earthquake disaster to assist in emergency planning, the USGS/NEIS can and does provide a preliminary location within a few minutes of the arrival of the seismic waves at distant stations. These earthquake-related activities will continue in parallel with CTBT monitoring. In recent years, global and regional broadband networks deployed by universities and the USGS for studying and monitoring earthquakes have developed entirely new analytical approaches, including systematic quantification of earthquake fault geometry and energy release based on analysis of waveforms. Of greatest relevance to CTBT monitoring are the quantitative approaches for event location and characterization being developed for analysis of seismic signals from small nearby earthquakes. When adequate crustal structures and seismic wave synthesis methods are available, it is possible to model complete broadband ground motions for regional events, enabling accurate source depth determination, event location and characterization, and development of waveform catalogs for efficient processing of future events (see Appendix D).
The modeling may include inversion for the source moment tensor.19 Efforts of this type require complete understanding of the nature of all ground motions recorded by the monitoring network. One of the core philosophical issues for seismic monitoring operations is whether it is better to use global and/or regional travel time curves, possibly with station or source region corrections, or to explicitly use models of the Earth's velocity structure and calculate the travel times and amplitudes for each source-station pair. The velocity models, which can include variable crustal and lithospheric structure, can be derived from the same data used in defining local travel time curves, but once they are determined they could also be used to model additional seismic signals that are not employed in standard event processing, such as free oscillations, surface waves, and multiple-body

19 The moment tensor is a representation of the set of equivalent forces at the source that would produce the observed ground motion. The sizes and orientations of the equivalent forces are distinctive for earthquakes and explosions and thus form a potential discriminant.
wave reflections. Velocity models are constantly improving on both global and regional scales and provide better approximations to the Earth with each model generation, with corresponding improvements in event location. Velocity models also have a key advantage completely lacking in travel time curves: they provide the basis for synthesizing the seismic motions expected for a specific path, as involved in the regional wave modeling mentioned above. The synthetic ground motions are useful for improving estimates of the source depth, identifying blockage of certain phase types, and enhancing the identification of the source type. The use of travel time curves has been adequate for teleseismic monitoring of large events but may be too limited for dealing with the regional monitoring required for small events. The nature of the Earth's velocity structure is such that heterogeneity exists on all scales (Figure 3.1), and at some level, interpolation of empirical travel time corrections, as well as the intrinsic interpolation involved in aspherical model construction, will fail to account for actual effects of Earth structure. For the teleseismic monitoring approach, in which the major phases of interest travel down into the mantle (or are relatively long-period surface waves that average over shallow structure), there is no obvious advantage of using travel time corrections versus three-dimensional models in regions with many sources, other than perhaps the operational simplicity of the former. However, for regions with sparse source or station distributions, the velocity model can incorporate information from many independent wavetypes and paths and predict structural effects for other paths and wavetypes that cannot otherwise be calibrated directly.
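The predictive power of even a simple velocity model can be sketched with a toy one-layer crust. The parameter values below (6.0 km/s crustal and 8.0 km/s mantle P velocities, a 35 km thick crust) are illustrative assumptions, not values from this report; the point is that the model yields travel times for both the direct crustal phase (Pg) and the Moho head wave (Pn), and hence the crossover distance beyond which Pn arrives first:

```python
import math

def pg_time(x_km, vc=6.0):
    """Direct crustal P wave (Pg) travel time in seconds,
    for a surface source and receiver x_km apart."""
    return x_km / vc

def pn_time(x_km, vc=6.0, vm=8.0, h=35.0):
    """Moho head wave (Pn) travel time: a mantle leg at vm plus the
    intercept from two crustal legs traveling at the critical angle."""
    intercept = 2.0 * h * math.sqrt(1.0 / vc**2 - 1.0 / vm**2)
    return x_km / vm + intercept

# Crossover distance: beyond this range Pn overtakes Pg as the first arrival
# (physically Pn only exists past the critical distance, ignored in this toy)
crossover = next(d for d in range(1, 2000) if pn_time(d) < pg_time(d))
print(crossover)   # → 186 (km, for these illustrative parameters)
```

A travel time curve records only the observed times; the model additionally predicts how the crossover shifts if, say, the crust thickens, which is the kind of extrapolation to uncalibrated paths discussed above.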
The confidence gained from correctly predicting the energy partitioning in the seismic signal on a given path by waveform modeling directly enhances the source identification. The value of this approach is not controversial, but it is in tension with the magnitude of resources that must be invested to adequately determine the structure in extensive areas of the world. When the entire field of seismology is considered, it is clear that the science is moving toward a three-dimensional parameterized model of Earth's material properties that will provide quite accurate predictions of seismic wave travel times for many applications (including earthquake monitoring and basic studies of Earth's composition and dynamics). One component of a long-term CTBT monitoring research program could involve commitment of resources toward the development of an improved global three-dimensional velocity model, beginning with regions of interest, perhaps in partnership with the National Science Foundation (NSF; see Appendix C). Previous nuclear test monitoring research programs have supported development of reference Earth models. The operational system could be positioned for systematically updating the reference model used for locations by adopting a current three-dimensional model at this time, possibly including three-dimensional ray-tracing capabilities that will become essential as resolution of the model improves. This approach would provide a framework for including the somewhat more focused efforts of the CTBT research program, such as the pursuit of a detailed model of the crust and lithosphere in Eurasia and the Middle East, for event location and identification. Partnership with other agencies and organizations pursuing related efforts could lead to rapid progress on this goal.
Another major coordinated effort could be the development of regional event bulletins complete down to a low magnitude level, such as 2.5 (achieving a global bulletin at this level would require significant enhancement of global seismic monitoring capabilities). This would require extensive coordination between the earthquake monitoring research and CTBT monitoring communities, but it is technically viable and would provide a basis for CTBT monitoring with high confidence. Related research issues are summarized below as specific functions of the monitoring system are considered.

Detection

Figure 3.2 shows the projected detection threshold of the 50-station IMS Primary Network when fully deployed (Claassen, 1996). This calculation is based on a criterion of three or more stations detecting P arrivals with 99 per cent probability. Where available, actual station spectral noise statistics were used, but for stations that do not yet exist the noise levels were assumed to be those of low-noise stations. NTM will enhance the performance of the national monitoring system relative to these calculations in several areas of the
FIGURE 3.1 Heterogeneous S wave velocity variations at a depth of 100 km beneath the western U.S. The scale bar shows the relative velocity variations in per cent. (Source: K. G. Duekers and E. Humphreys, personal communication, 1997)
FIGURE 3.2 Detection threshold predicted for the fully deployed IMS Primary Network. The station locations are indicated as black squares on the map. (Source: Claassen, 1996)

world. This simulation, which will have to be validated by actual operations, suggests that IMS detection thresholds across Eurasia will be at or below magnitude 3.75, with some areas being as low as 3.25. Given the small percentage (5–6 per cent) of detections ultimately associated with events in the first few years of the prototype-IDC REB, and the fact that 30 per cent of the detections in the final event list had to be added by analysts, there are clearly research issues related to improved seismic detection. Original goals for the IDC involved association of 10 and 30 per cent of detected phases from the Primary and Auxiliary stations, respectively. Many unassociated detections are actually signals from tiny local events that are observed at only one station. It would not be useful to locate such small events in most regions. It may be possible to screen out such signals using the waveshape information provided by templates of past events recorded at each station. This could reduce false associations and lighten the load on the association algorithms. Three-component stations have a lower proportion of associated detections (4 per cent) than do arrays (6 per cent) in the prototype system, and further research on combining and adjusting automated detection parameters holds promise of improving this performance. The significant number of detections added or revised by analysts suggests room for improving detection algorithms that run automatically.
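The three-or-more-station criterion behind the Figure 3.2 thresholds can be sketched as a Poisson-binomial calculation: given per-station probabilities of detecting the P arrival from a given event (the values below are hypothetical, not from the Claassen study), the probability that at least three stations detect follows from a short dynamic program:

```python
def prob_at_least_k(p_list, k):
    """Probability that at least k stations detect, given independent
    per-station detection probabilities (Poisson-binomial, via DP)."""
    # dist[j] = probability that exactly j of the stations seen so far detect
    dist = [1.0] + [0.0] * len(p_list)
    for p in p_list:
        for j in range(len(dist) - 1, 0, -1):
            dist[j] = dist[j] * (1 - p) + dist[j - 1] * p
        dist[0] *= (1 - p)
    return sum(dist[k:])

# Hypothetical per-station P-detection probabilities for one small event
p_stations = [0.9, 0.8, 0.7, 0.6, 0.5]
print(round(prob_at_least_k(p_stations, 3), 4))   # → 0.85
```

In a threshold map such a probability is evaluated as a function of event magnitude at each grid point, and the threshold is the magnitude at which it reaches the stated confidence (99 per cent in the calculation above).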
Research on enhancing the signal-to-noise ratio and improving the onset time determination has particular value. Exploration of simultaneous use of multiple detectors at a given station may result in approaches to reduce spurious detections and improve onset determinations. Practice at the prototype IDC has found that 76 per cent of the automatic detections have an onset time within 1.0 second of that picked by an analyst, whereas the goal of the IDC is 90 per cent. A key detection issue is improving how overlapping signals are handled. This includes the problems of both multiple events and multiple arrivals from a single event. Evasion scenarios that involve masking an explosion in an earthquake or quarry blast require detection of superimposed signals. Time series analysis procedures for separating closely spaced overlapping signals with slightly different frequency content potentially can improve detection in such cases. Perhaps the greatest room for research progress on detection involves phase identification. Only 7 per cent of teleseismic phases at Primary Network arrays were misidentified as regional phases by the prototype IDC in December 1995, but 32 per cent of teleseismic phases from three-component stations were misidentified. The corresponding figures for misidentified regional phases were 8 per cent for arrays and 48 per cent (P waves) and 27 per cent (S phases) for three-component stations. Improved polarization analysis for three-component stations is needed. There is also a need for improved slowness and azimuth determinations. In the prototype system, azimuth and slowness measurements currently make up 9 per cent and 5 per cent, respectively, of the total defining parameters used in the REB, and the introduction of methods to improve these parameters will enhance both association and location procedures significantly.
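Automated onset detection of the kind discussed above is commonly built on a short-term-average/long-term-average (STA/LTA) power ratio. The sketch below uses entirely synthetic data and assumed window lengths and threshold; it illustrates only the basic scheme on which research into combined or adaptive detectors builds:

```python
import numpy as np

def sta_lta(signal, n_sta, n_lta):
    """Classic STA/LTA detector: ratio of short-term to long-term average
    power; a trigger is declared when the ratio exceeds a threshold."""
    power = signal.astype(float) ** 2
    sta = np.convolve(power, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(power, np.ones(n_lta) / n_lta, mode="same")
    lta[lta == 0] = 1e-12                  # guard against division by zero
    return sta / lta

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 2000)                      # background noise
trace[1200:1260] += 8.0 * np.sin(np.arange(60) * 0.5)   # synthetic arrival
ratio = sta_lta(trace, n_sta=20, n_lta=400)
trigger = int(np.argmax(ratio > 4.0))   # first sample above threshold,
print(trigger)                          # near the synthetic onset at 1200
```

Running several such detectors with different windows and passbands in parallel, and reconciling their triggers, is one form of the multi-detector strategy mentioned above.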
There is relatively little experience with detectors for T-phases observed at island seismic stations, and effective algorithms must be developed for these noisy environments. Another area of research is the identification of seismo-acoustic detections generated by propagating air waves. Though generated by acoustic waves in the atmosphere, these arrivals appear as an Lg phase long after the first seismic arrival and are therefore often called "lonesome" Lg waves. Consequently, they are incorrectly interpreted as seismic signals to be associated with records from other seismic stations. Analysis of colocated seismic and infrasonic sensors holds promise for solving this problem.

Association

Association is an area in which recent basic research has enhanced CTBT monitoring capabilities, as newly developed Generalized Association (GA) algorithms (see discussion of association in Chapter 2) are being incorporated into routine operations of the U.S. NDC (and the prototype IDC). Active research is under way on incorporating additional information into GA methods, and several approaches have been explored for using relative phase amplitude information and other arrival properties. Use of complete waveforms is also promising, and such methods may find particular value for complex overlapping sequences of aftershocks or quarry blasts. Given the archive of waveform data for older events that will accumulate at the U.S. NDC, innovative use of previous signals as templates and master events should be explored. This is a new arena of monitoring operations, and only limited research has been conducted on such approaches. It is likely that the number of unassociated detections can be reduced by preliminary waveshape screening to recognize local events detected at only one sensor and to remove their signals from further association processing.
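One form of the waveshape screening suggested above is normalized cross-correlation against templates of past events: a correlation near 1 at a single station flags a repeat of a known local source whose detections can be set aside before association. A sketch with synthetic waveforms (the template, record, and noise levels are all invented for illustration):

```python
import numpy as np

def max_norm_xcorr(template, record):
    """Peak normalized cross-correlation of a template against a record;
    values near 1 flag a repeat of the known event."""
    t = (template - template.mean()) / (template.std() * len(template))
    best = -1.0
    for i in range(len(record) - len(template) + 1):
        w = record[i:i + len(template)]
        s = w.std()
        if s == 0:
            continue   # skip flat windows (correlation undefined)
        best = max(best, float(np.sum(t * (w - w.mean())) / s))
    return best

rng = np.random.default_rng(1)
template = rng.normal(0.0, 1.0, 100)   # waveform of a known local event
record = rng.normal(0.0, 0.3, 1000)    # later continuous data
record[400:500] += template            # the same source repeats
print(round(max_norm_xcorr(template, record), 2))
```

Because the statistic is normalized, a fixed screening threshold (e.g., 0.8) can be applied per station without regard to absolute amplitude; production systems compute the same quantity with FFT-based correlation for speed.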
Incorporation of regional propagation information into the GA, such as blockage patterns for a given candidate source region, using a computer knowledge base should enhance association methods, but this strategy requires further development.

Location

Seismic event location is a key area for further research efforts because accurate locations are essential for event identification and On-Site Inspections. Formal procedures exist for assessing the precision in event location (see Appendix C), but these typically do not account for possible systematic error and hence may overestimate event location accuracy. Figure 3.3 presents a
FIGURE 3.3 Estimates of location precision for the IMS Primary and Auxiliary networks for (a) magnitude 4.25 events and (b) events at the detection threshold in Figure 3.2. Neither plot includes an appraisal of uncertainty due to systematic error. The seismic stations are indicated as black squares on the map. (Source: Claassen, 1996)
calculation of the expected distribution of the precision of seismic event locations for the fully deployed Primary and Auxiliary networks of the IMS (Claassen, 1996), both for events with magnitude 4.25 and for events at the detection threshold calculated for the Primary Network (Figure 3.2). As in the case of the detection-level calculations, actual station noise levels were used when available, low noise levels were assumed for future Primary Network stations, and low noise levels raised by 10 dB were used for future Auxiliary Network stations. Location precision is stated in terms of 90 per cent confidence of being within a given area measured in square kilometers. Only P-wave arrival times and azimuths were used, and no extensive calibration was assumed. (Note that this error estimate does not include the uncertainty due to possible systematic error.) Location precision better than 1000 km2 can be attained in most regions of the world for events at and above the detection level, but these results can be misleading because it is important to allow for biases due to systematic error, which can be substantial. For example, during assessment of the first 20 months of the REB, comparison by various countries of the locations produced by their own denser national networks with those in the REB showed that the REB 90 per cent confidence ellipses contained the national network location (which is presumably more accurate) less than half of the time. It has been estimated that the location precision for events in the REB can be improved by a factor of 6 after calibration of the network. Two main research strategies exist for reducing systematic error in event locations: (1) the development of regionalized travel times with reliable path calibrations for events of known location, and (2) the development of improved three-dimensional velocity models that give less biased locations.
(Appendix C discusses this issue at length.) A catalog of calibration events can advance the latter strategy as well, given the difficulty of eliminating trade-offs between locations and heterogeneity in developing three-dimensional models. From the CTBT operational perspective it is not clear that even the next generation of three-dimensional models (or regionalized travel time curves) will account for the Earth's heterogeneity sufficiently to eliminate systematic location errors for either teleseismic or regional observations. Thus, even if improved velocity models are used, it is desirable to calibrate station and source corrections to account for unmodeled effects. Empirical calibration of location capabilities for a seismic network can proceed on various scales, ranging from use of a few key calibration events, such as past nuclear explosions with well-known locations, to more ambitious undertakings such as development of large catalogs of events including well-located earthquakes. Some efforts along these lines are being pursued in the DOE CTBT research program. The challenge is to obtain sufficient numbers of ground truth events with well-constrained parameters. Appendix C considers a systems approach to this problem, motivated by the need for calibration of extensive regions. Quarry blasts, earthquake ruptures that visibly break the surface, aftershocks with accurate locations determined from local deployments of portable arrays, and events located by dense local seismic networks can be used for this purpose because such event locations are in some cases accurate to within 1 to 2 km. There are large uncertainties in how to interpolate calibration information from discrete source-receiver combinations, and research on the statistical nature of heterogeneity may provide guidance in this process.
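The mechanics of empirical path calibration are simple: for a ground truth event, the residual between observed and model-predicted travel times at each station becomes a correction applied to later arrivals on the same paths. The station codes and times below are hypothetical values chosen only to show the bookkeeping:

```python
def station_corrections(observed, predicted):
    """Travel-time residuals from a ground-truth event become empirical
    station corrections for later events on similar paths."""
    return {sta: observed[sta] - predicted[sta] for sta in observed}

# Hypothetical P arrival times (s) for a calibration event of known location
observed  = {"STA1": 61.2, "STA2": 93.8, "STA3": 110.5}
predicted = {"STA1": 59.9, "STA2": 94.6, "STA3": 109.7}  # from a 1-D model

corr = station_corrections(observed, predicted)
print(round(corr["STA1"], 1))   # → 1.3 (model runs 1.3 s fast on this path)

# A later nearby event is then located against adjusted predictions:
adjusted = {sta: predicted[sta] + corr[sta] for sta in corr}
```

The hard research questions noted above are not in this arithmetic but in how far such discrete corrections can be interpolated across source regions with unknown heterogeneity.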
Such uncertainty also serves as motivation for further development of velocity models because the many data sources that can be incorporated into such models can resolve heterogeneities that are not calibrated directly by ground truth events. A major basic and applied research effort of great importance for event location capabilities involves developing regionalized travel time curves and velocity models, particularly for the crustal phases that will be detected for small events. These regionalized models must be merged with global structures to permit simultaneous locations with regional and teleseismic signals. Determining regionalized travel time curves usually involves a combination of empirical measurements and modeling efforts, with the latter being important to ensure appropriate identification of regional phases (variations in crustal thickness and source radiation can lead to confusion about which phase is being measured). Historical data bases may be valuable for determining regional travel time curves in areas where new stations are being deployed. A demonstrated research tool that assists
in this effort involves systematic modeling of regional broadband seismograms for earthquakes and quarry blasts. Confidence in the adequacy of a given regional velocity structure or the associated empirical travel time curve is greatly enhanced if computer simulations demonstrate that the structural model actually accounts for the timing and relative amplitude of phases in the seismogram. By systematic modeling of broadband waveforms from larger earthquakes (for which the source can be determined by analysis of signals from multiple stations), regional Earth structures with good predictive capabilities can be determined. It is then possible either to use the velocity model to predict the times of regional phases or to use travel time curves for which the model has validated the identification of phases. This is of particular value in regions where crustal heterogeneity (e.g., near continental margins and in mountain belts) causes the energy partitioning to change among phases; for example, some crustal paths do not allow the Lg phase to propagate, and anomalously large Sn phases may be observed instead. Waveform modeling approaches can play a major role in determining the local velocity structures required to interpret regional phases for both event location and event identification, so there is value in further development of seismological modeling techniques that can compute synthetic ground motions for complex models. Nuclear test monitoring research programs have long supported basic development of seismic modeling capabilities because they underlie the quantification of most seismic monitoring methods. Present challenges include modeling of regional-distance (to 1000 km) high-frequency crustal phases (up to 10 Hz and higher) for paths in two- and three-dimensional models of the crust.
Such modeling must correctly include surface and internal boundaries that are rough, as well as both large- and small-scale volumetric heterogeneities. Current capabilities are limited, and in only a handful of situations have regional waveform complexities been quantified adequately by synthetics. New modeling approaches and faster computer technologies will be required to achieve the level of seismogram quantification for shorter-period regional seismic waves that now exists for global observations of longer-period seismic waves. In parallel with the development of new modeling capabilities is a need for improved strategies for determining characteristics of the crust based on sparse regional observations, so that realistic velocity models can be developed rather than ad hoc structures. Other promising areas for research include methods of using complete waveform information to locate events and improved use of long-period energy. Correlations of waveforms can provide accurate relative locations for similar sources such as mining explosions. The basic idea is that rather than relying on only the relative arrival times of direct P, one can use relative arrival times of all phases in the seismogram to constrain the relative location. The potential for regional event location based on such approaches is not yet fully established, but preliminary work with waveforms from mining areas is promising. There is also potential for improving event locations using surface waves because there have been significant advances in the global maps of phase velocity heterogeneity affecting these phases.
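The relative-location idea above rests on measuring differential arrival times by cross-correlation rather than by repicking onsets: for two similar waveforms, the lag of the correlation peak gives the time shift between them to a fraction of the dominant period. A sketch with a synthetic wavelet (the sampling rate and shift are arbitrary choices for illustration):

```python
import numpy as np

def differential_time(w1, w2, dt):
    """Lag (s) of the peak cross-correlation between two similar waveforms,
    a far more precise differential time than two independent onset picks."""
    xc = np.correlate(w1, w2, mode="full")
    lag = int(np.argmax(xc)) - (len(w2) - 1)   # positive: w1 lags w2
    return lag * dt

dt = 0.01                                      # 100 samples per second
t = np.arange(0.0, 2.0, dt)
wavelet = np.exp(-((t - 0.5) / 0.05) ** 2) * np.sin(2 * np.pi * 5.0 * t)
shifted = np.roll(wavelet, 23)                 # second event 0.23 s later
print(round(differential_time(shifted, wavelet, dt), 2))   # → 0.23
```

A set of such differential times across common stations and phases constrains the vector separation between the two events, largely canceling the unknown path structure they share.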
Because surface waves with periods of 5–20 seconds are valuable for estimation of source size and event identification, signal processing procedures that accurately time these arrivals (notably using phase-match filters that enhance signal-to-noise by correcting for the systematic dispersive nature of surface waves) have to be developed and will provide information useful for locating events. For applications to small events, existing phase velocity models must be improved and extended to shorter periods. A particularly important aspect of event location requiring further research is determination of depth for small, regionally recorded events. This parameter is of great value for identifying the source but is one of the most challenging problems for regional event monitoring. Based on experience with earthquake monitoring in densely instrumented regions of the world, accurate depth determination is not typically achievable using direct body wave arrival times alone, and more complete waveform information must be used. The complex set of reverberations that exist at regional distances makes identification of discrete seismic "depth phases" difficult, but complete waveform modeling, as well as cepstral methods (involving analysis of the spectrum of the logarithm of the signal's spectrum) applied to entire sets of body wave arrivals, hold potential for identifying such phases. Research
advances in this area potentially can eliminate many events from further analysis in the CTBT monitoring system. Event location techniques that exploit synergy between various monitoring technologies are described in Section 3.6.

Size Estimation

Every event located by the IMS and NTM will have some recorded signals that can be used to estimate the strength of the source for various frequencies and wavetypes. Because the actual seismograms are retrieved from the field, it is possible to measure a variety of waveform characteristics to characterize the source strength. The standard seismic magnitude scales (see Appendix D) for short-period P waves (mb) and 20-second-period surface waves (Ms) are the principal teleseismic size estimators used for event identification and yield estimation for larger events. In December 1995, 88 per cent of the events in the REB were assigned body wave magnitudes, but only 10 per cent were assigned surface wave magnitudes because of low surface wave amplitudes for small events and the sparseness of the network. Operational experience at the prototype IDC is establishing systematic station corrections (average deviations from well-determined mean event magnitudes) that can be applied to reduce biases in the measurements, particularly for small events with few recordings. About 34 per cent of the REB events for December 1995 were assigned a "local magnitude," which was scaled relative to mb. In addition to station corrections, which account for systematic station-dependent biases, it is important to determine regionalized amplitude-distance curves analogous to regional travel time curves. These curves are used in the magnitude formulations and have great variations at regional distances (see Appendix D).
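The magnitude machinery described above reduces to a simple form: each station contributes mb = log10(A/T) + Q(Δ, h), where A is amplitude, T is period, and Q is the distance-depth correction, and each station's empirical bias is removed before averaging. The station names, readings, and correction values below are hypothetical, chosen only to show the computation:

```python
import math

def station_mb(amplitude_nm, period_s, q_correction):
    """Single-station body wave magnitude: mb = log10(A/T) + Q(distance, depth)."""
    return math.log10(amplitude_nm / period_s) + q_correction

def network_mb(readings, corrections):
    """Network mb: mean of station magnitudes after removing each station's
    empirical bias (its average deviation from network mean magnitudes)."""
    mags = [station_mb(a, t, q) - corrections.get(sta, 0.0)
            for sta, a, t, q in readings]
    return sum(mags) / len(mags)

# Hypothetical readings: (station, amplitude in nm, period in s, Q term)
readings = [("S1", 12.0, 1.0, 3.0), ("S2", 8.0, 1.0, 3.2), ("S3", 20.0, 1.0, 2.9)]
corrections = {"S2": -0.15}   # S2 has historically read 0.15 units low
print(round(network_mb(readings, corrections), 2))   # → 4.18
```

For small events recorded at only one or two stations, such corrections carry most of the weight, which is why the text stresses establishing them systematically.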
Although range- and azimuth-dependent station corrections can absorb regional patterns, it is valuable to have a suitable regional structure for interpolation of general trends. Research is needed to establish the level of regionalization required and the nature of the regional amplitude-distance curves. As new IMS and NTM stations are deployed, each region must be calibrated and an understanding attained of the nature of the seismic wave propagation in that region (including effects such as blockage, which may prevent measurement of some phases). The availability of complete waveform information for each event offers the potential for source strength estimation that exploits more of the signal than conventional seismic magnitudes. For example, complete waveform modeling can determine accurate seismic moments (measures of the overall fault energy release) that may be superior to seismic magnitudes because they explicitly account for fault geometry. Routine inversion of complete ground motion recordings for seismic moment and fault geometry is now conducted for all events with magnitudes larger than about 5.0 around the world by the academic community that is studying earthquake processes. Similar capabilities have been demonstrated for events with magnitudes as small as 3.5 in well-instrumented seismogenic areas (see Appendix D). These approaches are closely linked to source identification because they explicitly incorporate and solve for generalized representations of forces exerted at the source. The extent to which a sparse network such as the IMS can exploit such waveform modeling approaches is not fully established. In part, it depends on the extent to which adequate regional velocity models are determined and on their waveform prediction capabilities. Further research can establish the operational role of complete waveform analysis for source strength estimation. Operationally, it is usually convenient to use parametric measurements such as magnitudes.
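The "generalized representations of forces" referred to above are moment tensors (see the footnote earlier in this section). A sketch of the standard isotropic/deviatoric split shows why the representation doubles as a discriminant: an idealized explosion is purely isotropic (volume change), while idealized fault slip is purely deviatoric. The example tensors are textbook idealizations, not data from this report:

```python
import numpy as np

def isotropic_fraction(M):
    """Fraction of a symmetric 3x3 moment tensor that is isotropic
    (volume change) rather than deviatoric (shear faulting)."""
    iso = np.trace(M) / 3.0 * np.eye(3)   # explosion-like component
    dev = M - iso                         # earthquake-like component
    m_iso = np.linalg.norm(iso)
    m_dev = np.linalg.norm(dev)
    return m_iso / (m_iso + m_dev)

# Idealized explosion: equal diagonal moments, no shear terms
explosion = np.diag([1.0, 1.0, 1.0])
# Idealized strike-slip earthquake (double couple): off-diagonal shear only
earthquake = np.array([[0.0, 1.0, 0.0],
                       [1.0, 0.0, 0.0],
                       [0.0, 0.0, 0.0]])
print(isotropic_fraction(explosion))    # → 1.0 (fully isotropic)
print(isotropic_fraction(earthquake))   # → 0.0 (fully deviatoric)
```

In practice the inversion recovers M from waveforms first, and the decomposition is then examined; noise and limited station coverage smear real events between these end members.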
Several promising waveform measurements for regional phases can be treated parametrically. These include waveform energy measurements for short- and long-period passbands; coda magnitudes based on frequency-dependent variations of reverberations following principal seismic arrivals; and signal power measurements of Sn, Lg, and other reverberative phases at regional distances. Extension of surface wave measurements to the short-period signals (5–15 seconds) that dominate regional recordings of small events is also necessary. Research on the utility, stability, and regional variability of these source strength estimators should continue. This includes necessary efforts to characterize the effects of source depth, distance, attenuation, heterogeneous crustal structure, and recording site for each approach. This effort is warranted given the limitations of conventional
mb and Ms measurements for small events recorded by sparse networks.

Identification

Several major research areas related to seismic source identification have been considered in the preceding discussion of event location and strength estimation. Accurate event location is essential for identification, including the reliable separation of onshore and offshore events and the determination of source depth. When location alone is insufficient to identify the source, secondary waveform attributes must be relied on. For events larger than about magnitude 4.5, well-tested methods based on teleseismic data provide reliable discrimination. For small events, experience has shown that a number of ad hoc methods, often different in different regions, can be used to distinguish explosions (e.g., mine blasting) from earthquakes. However, a primary area for both basic and applied research is to systematize such experience for identifying small events and turn it to solving the problems of CTBT monitoring. Some key areas include the following:

Extension of mb:Ms-type discriminants to regional scales for small events: this involves development of regional surrogates for both P-wave and surface-wave magnitudes that retain the frequency, source-depth, and source-mechanism sensitivity of teleseismic discriminants. Improved methods of measuring surface wave source strength for regional signals are necessary, as mentioned above. Quantification of the regional measures by waveform modeling and source theory is needed to provide a solid physical understanding of such empirical discriminants.

Regional S/P-type measurements (e.g., Lg/Pn, Lg/Pg, Sn/Pg) have been shown to discriminate source types well at frequencies higher than 3–5 Hz. Research is needed to establish the regional variability of such measurements and to reduce the scatter in earthquake populations.
Improved path corrections, beyond standard distance curves, that account for regional crustal variability should be developed further because they appear to reduce scatter for frequencies lower than 3 Hz.

Quarry blasts and mining events (explosions, roof collapses, rockbursts) can pose major identification challenges, and research is needed to establish the variability of these sources and the performance of proposed discriminants in a variety of areas. Ripple-fired explosions (with a series of spatially distributed and temporally lagged charges) can often be discriminated from other explosions and earthquakes by the presence of discrete frequencies associated with shot time separation. The discriminant appears to be broadly applicable, although additional testing in new environments is essential. However, it does not preclude a scenario in which a mining explosion masks a nuclear test (see below).

Systematic, complete waveform inversion for source type should be explored for regions with well-determined crustal structures, given the constraints of the large spacing between seismic stations in the IMS and NTM. This can contribute directly to source depth determination and source identification. It is likely that such approaches are the key to solving challenges posed by some evasion scenarios involving masking of nuclear test signals by simultaneous quarry blasts, rockbursts, or earthquakes.

Strategies for calibrating discriminants in various regions must be established, and procedures to correct for regional path effects should be expanded. This includes systematic mapping of blockage effects and attenuation structure. It is also desirable to establish populations of quarry blast signals for each region.
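Regional S/P-type discriminants of the kind listed above reduce, in operation, to amplitude ratios measured in time windows around predicted phase arrivals. A schematic sketch (window contents are synthetic, and the 3–5 Hz band-pass and region-calibrated threshold are assumed to be applied upstream):

```python
import math

def rms(window):
    """Root-mean-square amplitude of a windowed, band-passed trace."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def p_s_ratio_db(p_window, s_window):
    """High-frequency P/S amplitude ratio in decibels, e.g. Pn vs. Lg.
    Windows are assumed already cut around predicted travel times and
    band-passed above ~3-5 Hz, where the discriminant performs best."""
    return 20.0 * math.log10(rms(p_window) / rms(s_window))

def classify(ratio_db, threshold_db=0.0):
    """Explosions tend to be relatively P-rich at high frequency; the
    threshold is region-dependent and must be calibrated."""
    return "explosion-like" if ratio_db > threshold_db else "earthquake-like"

# Synthetic windows representing a P-rich signature.
p_win = [0.8] * 200
s_win = [0.2] * 200
ratio = p_s_ratio_db(p_win, s_win)  # 20*log10(4), about 12 dB
label = classify(ratio)
```

The scatter-reduction research called for above amounts to making the threshold and path corrections in such a scheme region-specific.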
Summary of Research Priorities Associated with Seismic Monitoring

In summary, a prioritized list of research topics in seismology that would enhance CTBT monitoring capabilities includes:

1. Improved characterization and modeling of regional seismic wave propagation in diverse regions of the world.
2. Improved capabilities to detect, locate, and identify small events using sparsely distributed seismic arrays.
3. Theoretical and observational investigations of the full range of seismic sources.
4. Development of high-resolution velocity models for regions of monitoring concern.
proportions of radionuclides will be modified by fractionation and rain-out effects.20 The importance of these mechanisms depends on the type of release or detonation site (see Appendix G). For example, debris from a reactor accident will be highly fractionated because the more volatile fission and activation products are preferentially released. Nuclear device debris will be highly fractionated if the fireball interacts with the surrounding medium. If the medium is soil or rock, there will be extensive localized fallout of the more refractory debris. If the medium is water, there will be localized loss of both refractory debris and the more soluble radioactive gases. Further fractionation takes place during the subsequent transport of the debris and gases by the atmosphere or, in the case of an undersea explosion, by water currents. Substantial fractionation has been observed even in high-altitude tests. Because of the lower temperatures involved, low-yield devices are usually more fractionated than high-yield devices detonated under the same conditions. Rainfall can lead to fractionation, especially of soluble gases such as isotopes of iodine and bromine. Certain waterborne organisms will also take up particular radionuclides. For example, plankton will concentrate plutonium at levels 2,600 times that of surrounding waters. Because of fractionation effects, it is important to collect and analyze samples as soon as possible after detonation. Deep underground detonations will result in the containment of almost all of the radioactive fission fragments underground, with the possible exception of some of the radioactive noble gases. Over time, these gases may escape to the air above the site through cracks and fissures in the rock and soil.
Most of these gases are produced by the decay of precursors and will be released hours to weeks after the detonation; their activity will diminish slowly with time. The amount of gas released at different times will also depend on the atmospheric pressure over the site, with increased release under conditions of low pressure and reduced or no release under high-pressure conditions. The present plan for remote CTBT radionuclide monitoring is to install a series of fixed stations around the world that have fully automated measurement devices with the ability to collect, process, and send the results to a central station on a daily basis. Alternatively, a sample may be sent to the central facility for processing. In both scenarios the data are subsequently evaluated using intelligent algorithms for positive or negative testing of event signals. The technology for these functions is developed, but the technology for the fixed stations has not had extensive field testing to ensure long-term reliability, especially in remote areas. Research can be pursued to develop more rapid and mobile monitoring capabilities to include air, water, and land monitoring. The sooner radionuclide samples can be collected and analyzed after a test, the greater will be the amount of information obtained about the nature of the test device. For instrumental analysis of airborne radioactive particulates collected on filter paper, the lower limit of detection is determined by the amount of naturally occurring and extraneous human-made radioactivity also collected in the sample. If radiochemical separations are employed, the limit of detection depends only on the activities of the separated fission products. The limit of detection of the radioactive noble gases on a separated gas sample is determined only by interference from other radioactive gases in the sample and by the activity of the gas of interest. 
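The detection limits discussed above are conventionally quantified with the Currie formulation, in which the smallest detectable net signal is set by counting statistics of the background. A sketch (the conversion inputs are illustrative, not specifications of any fielded system):

```python
import math

def currie_detection_limit(background_counts):
    """Currie (1968) detection limit L_D in net counts for paired
    sample/background counting at ~95% confidence:
        L_D = 2.71 + 4.65 * sqrt(B)
    """
    return 2.71 + 4.65 * math.sqrt(background_counts)

def minimum_detectable_concentration(background_counts, peak_efficiency,
                                     gamma_yield, live_time_s, air_volume_m3):
    """Translate L_D into a minimum detectable concentration (Bq/m^3)
    for one gamma line, given photopeak efficiency, emission
    probability, counting time, and sampled air volume."""
    ld = currie_detection_limit(background_counts)
    return ld / (peak_efficiency * gamma_yield * live_time_s * air_volume_m3)

# Higher background in the spectral region of interest directly raises
# the detection limit, which is why suppressing naturally occurring and
# spurious activity in the sample matters.
ld_quiet = currie_detection_limit(100.0)   # ~49.2 counts
ld_noisy = currie_detection_limit(400.0)   # ~95.7 counts
```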
The delay time associated with the transportation of the radionuclides to the fixed monitoring locations may be days to weeks. Once a clear radionuclide event is detected, there is a need to backtrack the atmospheric transport history to locate the source of the event. This requires the use of models to propagate the transport and dispersion of radionuclides back in time based on archived measurements of atmospheric properties (wind speeds, temperatures, etc.). Such efforts are limited by the quality and density of available data and by fundamental uncertainties in modeling chemical transport in turbulent atmospheric flows. These same models may be used in a predictive mode (e.g., to direct sampling efforts) using the output from forward calculations that predict atmospheric properties and circulation. Because other monitoring technologies will provide rapid

20 Fractionation is the preferential loss of specific radioisotopes after the initial release. The degree of fractionation increases with time, caused by a combination of physical forces (such as gravity) and chemical processes (such as oxidation). Rainstorms will also "wash" radionuclides out of the atmosphere (i.e., rain-out). This is important for all radionuclides except the noble gases.
data about the time and location of a potential nuclear explosion, forecasts of radionuclide transport could be used to vector aircraft for sampling closer to the source than would be possible with the fixed radionuclide stations. A capability to back-track ocean currents may also be needed for tests in the ocean or near the ocean surface. Basic research is needed to enhance back-tracking capabilities in general, particularly to improve the accuracy of projections beyond 5 days and to provide realistic uncertainty assessments. For On-Site Inspection, radioactive gases leaking from cracks and fissures may be the only surface indicator of the detonation location. Rapid noble gas monitoring equipment mounted in slow-moving aircraft to perform the initial screening of an area may identify the approximate test location; however, the escaping gas may not be present in sufficient amounts to allow detection. Such aerial surveys may require the collection and rapid analysis of many air samples over a wide area. The detection and identification capabilities of the network of radionuclide detectors (particulate and gas) will be determined by minimum detectable concentrations of the system for particulate isotopes, spatial and temporal variations in the radionuclide background at individual stations, atmospheric or waterborne transport processes that disperse the fallout and lower the measurable concentrations of isotopes, radioactive decay processes that reduce the concentration of particular isotopes, and fractionation and rain-out processes that selectively remove particular isotopes from the air, water, or land. Considering the above issues, an assessment of detection capabilities involves calculations of radionuclide transport away from the site of a detonation to downstream detectors.
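Such an assessment couples a dispersion estimate to radioactive decay over the travel time. A toy sketch using a ground-level Gaussian plume for a single isotope (all parameters illustrative; operational assessments use full three-dimensional transport models):

```python
import math

def downwind_concentration(release_bq_per_s, wind_ms, x_m,
                           sigma_y_m, sigma_z_m, half_life_s):
    """Ground-level centerline concentration (Bq/m^3) from a continuous
    ground-level release: Gaussian plume with full ground reflection,
    reduced by radioactive decay over the travel time x/u. The sigmas
    would normally come from stability-class curves; here they are
    supplied directly."""
    travel_time_s = x_m / wind_ms
    decay_factor = math.exp(-math.log(2.0) * travel_time_s / half_life_s)
    chi = release_bq_per_s / (math.pi * wind_ms * sigma_y_m * sigma_z_m)
    return chi * decay_factor

# 133Xe (half-life ~5.24 days) observed 100 km downwind in a 5 m/s wind:
# decay over the ~5.6-hour travel time removes only a few per cent,
# while dispersion dilutes the plume by orders of magnitude.
xe133_half_life_s = 5.24 * 86400.0
c = downwind_concentration(1.0e9, 5.0, 1.0e5, 1000.0, 200.0, xe133_half_life_s)
```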
This modeling requires assumptions regarding fallout/release from different detonation scenarios, atmospheric and ocean transport processes, detector efficiencies, and fractionation and rain-out efficiency. Modeling to date has largely considered the first three factors, independent of radioactive decay processes. Once radionuclides have been measured in the field, identification requires the ability to distinguish the chemical signatures of nuclear explosions from reactor emissions. Preliminary analysis for atmospheric transport of xenon radioisotopes suggests that this task may be difficult more than 2 weeks after an explosion because of radioactive decay processes. In general, the panel concludes that there is a need for a full assessment of the detection and identification capabilities of the radionuclide network, similar to the modeling that has been carried out for the seismic and hydroacoustic systems (e.g., Figures 3.2, 3.3, and 3.4).

Major Technical Issues

Fixed Station Air Particulate Monitoring

The IMS will include surface stations for collecting air samples for radionuclide analysis. All of these stations will be equipped with some type of particulate collection system that involves the collection of atmospheric dust on filter paper. In the proposed U.S. equipment, radioactive material on the filter paper is then counted by using a high-resolution HPGe (high-purity germanium) detector, and the resulting gamma spectrum is analyzed by using computer-based gamma-ray spectroscopy. As shown in Appendix G, only tests conducted above the transition zone in soil (within 100 m of the surface) or at shallow depths in water are likely to result in a significant release of radioactive particulates. In general, this type of monitor will be of little use in detecting events that are contained or are not vented. This particulate detection technology is mature and reliable.
The panel concludes that the automated system developed by DOE seems to be designed around the best existing technology. However, operational problems associated with the automated daily collection of aerosols on filters, sample counting, and transmission and storage of data may be challenging. Also, the amount of spurious radioactivity collected in the dust can produce background radiation that degrades signal-to-noise ratios, thus limiting the sensitivity of gamma-ray counting on a daily basis. A clear plan for designing and implementing a gamma-ray system for daily use must be spelled out in great detail and tested for detection limit capabilities. The use of low-level counting techniques (e.g., shielding, Compton suppression, or other coincidence methods similar to xenon identification)
may also have to be investigated to achieve the best sensitivities available. A major problem for fixed station radionuclide monitoring is the establishment and maintenance of a reliable monitoring network in remote locations. Many of the proposed 80 radionuclide sampling sites will be in remote locations without dependable power supplies and technical support. Existing stations are located mainly in developed countries and at sites that have reliable technical support and available backup monitoring systems. Particulate monitors all employ high-resolution gamma-ray spectroscopy using large-volume HPGe detectors that must be cooled continually to a low temperature with either liquid nitrogen or electric coolers. Should a detector warm up to ambient temperature through the loss of power or liquid nitrogen supply, it would take many hours before the detector could be brought back on-line, and days should the detector's electronics become damaged during the warm-up.

High-Resolution Gamma Detector for Ambient Temperature Operation

A research effort is needed to develop large-volume, high-efficiency, dependable semiconductor detector materials that can be operated at room temperature and are insensitive to routine atmospheric temperature changes. This type of detector material would have a larger band gap than HPGe, which would allow it to operate at room temperature. Because of the larger band gap, the resolution of detectors using this material will be slightly poorer than that of current HPGe detectors but far superior to that of NaI(Tl) detectors. Work on a variety of proposed detectors of this type has been under way for many years. The difficulty in development is incomplete collection of all of the radiation-produced charge pairs. Trapping, especially of the positively charged holes, distorts the resulting gamma spectrum.
Small detectors of this type have been developed and are available commercially; they include detectors made from cadmium telluride, cadmium zinc telluride, and mercuric iodide. Research is also under way to make detectors from gallium arsenide, lead iodide, and other semiconductor materials having band gaps in the range of 1 to 2 eV (electron volts). Successful development of this type of detector would provide a rugged, portable, and easy-to-use gamma-radiation monitor that could be started quickly and battery operated. This would be ideal for use in fixed stations in remote areas and for mobile monitoring applications.

Fixed Station Radioactive Noble Gas Monitoring

It is a CTBT requirement that 40 IMS stations employ some type of radioactive noble gas monitor. As shown in Appendix G, these gases are more likely to be released than radioactive particulates from deep underwater or underground tests. Also, these gases will tend to separate from particulate matter and will not be lost by particulate fallout or rain-out. If rain-out or other atmospheric conditions remove particulates from the air, the automated xenon analysis system would become the most sensitive detector of fission products. Because of the low air sampling rates of current noble gas monitors compared to particulate monitors, CTBT noble gas monitoring is considered to be about 1,000 times less sensitive than particulate monitoring, and there have been questions about how extensively such monitors should be employed in the proposed worldwide network (Ad Hoc Committee on a Nuclear Test Ban, 1995). The prototype technology being developed by Battelle (Pacific Northwest National Laboratory [PNNL] Automatic Radioxenon Analyzer) collects atmospheric air at a rate of 7 m3 per hour (Bowyer et al., 1996; Perkins and Casey, 1996). Xenon gas is separated from the other gases by a series of adsorption and desorption steps.
The xenon gas, containing any radioactive xenon fission isotopes, is then counted by use of coincidence methods (using proven sophisticated electronics) to assay for signals of radioactive xenon. Tests conducted for nuclear power plants have proven that this instrumentation is highly reliable. There are concerns about the NaI(Tl) detectors proposed for this system. They will have limited resolution, making it difficult to distinguish the different radioisotopes of xenon because of overlapping spectral lines, especially in the low-energy region. Furthermore, the system gain of NaI(Tl) detectors is extremely sensitive to ambient temperature since small temperature changes can make significant changes in the system's energy calibration. Thus, these detectors either must use some type of sophisticated
spectrum stabilizer or must be operated at a near-constant room temperature. It may be difficult to achieve this type of temperature control consistently at remote site locations. Here again, it would be useful to have available a high-resolution semiconductor detector able to operate with stability at room temperature. A general problem associated with radionuclide monitoring at a fixed site is the waiting time required for the radioactive gas to reach the stations where it can be collected, counted, and analyzed. However, if radioactive xenon is detected, it is powerful evidence that a nuclear explosion has taken place. The instrumentation for xenon sampling must be field-tested to evaluate its reliability over long periods of time. In addition, identification software must undergo quality assurance and quality control to ensure that the final results do indeed represent an event. Monitoring for radioactive xenon emitted from nuclear power plants can be used for this purpose.

Rapid Response Airborne Monitoring

The IMS will have no airborne radionuclide monitoring capabilities. In the past, airborne detection played an important role in NTM for nuclear test monitoring, and the United States maintained rapid response aircraft equipped with radiation detection equipment that could locate and follow a radioactive plume once a source had been located and identified. The IMS system is focused on a worldwide network of fixed stations, all of which will monitor radioactive particulates and some of which will monitor radioactive noble gases. Although the response from a fixed station seismic, hydroacoustic, and infrasonic monitoring network will be rapid, the response of the fixed station radionuclide monitoring network will be relatively slow.
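The cost of this waiting time is easy to quantify with a decay calculation; the half-lives below are standard values, and the delays are illustrative:

```python
import math

def fraction_remaining(half_life_days, elapsed_days):
    """Fraction of the initial activity surviving after elapsed time."""
    return math.exp(-math.log(2.0) * elapsed_days / half_life_days)

# After a 10-day transport delay to a fixed station, under 30% of the
# 133Xe activity (half-life ~5.24 d) remains, while shorter-lived
# signals such as 135Xe (half-life ~9.1 h) are effectively gone.
f_xe133 = fraction_remaining(5.24, 10.0)
f_xe135 = fraction_remaining(9.1 / 24.0, 10.0)
```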
Even when fission products are detected at a station, much information about the type and location of the nuclear event will be lost because of nuclear decay. With the use of evasive procedures, especially bursts in deep underground boreholes or deep mines or deeply submerged ocean tests, there will be few or no radioactive particulates, and potentially little noble gas will be emitted into the environment to be detected by the fixed station monitors. However, if the approximate location of an event identified as a nuclear explosion is determined by seismic, infrasonic, and/or hydroacoustic methods, rapid response teams using monitoring aircraft and, where politically possible and appropriate, surface or ocean surveillance equipment could be in the field collecting samples. These response teams ideally would be equipped with a rapid radionuclide monitoring capability that is mobile and sensitive and can quickly produce quantitative results. According to available information, no such monitoring systems currently exist or are under development. Of special importance to airborne monitoring is a rapid response noble gas monitor. Noble gases collected soon after a detonation can contain more xenon radioisotopes as well as argon and krypton radioisotopes. Thus, more information can be determined about the explosive type. Depending on test depth and weather conditions, these gases may be the only airborne fission products available for detection. The Ad Hoc Committee on a Nuclear Test Ban of the Conference on Disarmament showed a considerable interest in airborne monitoring triggered by other monitoring techniques (Ad Hoc Committee on a Nuclear Test Ban, 1995). Some experts in attendance felt that the primary value of airborne monitoring was for atmospheric explosions over ocean areas or underwater, where fission product evidence can be lost quickly. 
Monitoring would be directed primarily at remote neutral areas not adequately covered by the network of ground-based stations. They discussed the need for both particulate and noble gas monitors on aircraft that would also serve as airborne laboratories. This element of the IMS was not pursued because of cost considerations. A relatively rapid radioactive noble gas monitor was developed in the late 1970s for monitoring radioactive noble gas in and around nuclear power stations (Jabs and Jester, 1976). This approach could be investigated for use as a portable, sensitive, rapid CTBT noble gas monitor. For most radioisotopes of argon, krypton, and xenon of interest to this work, such a system could achieve a lower limit of detection (LLD) sensitivity of less than 1 mBq/m3. This detection sensitivity approaches that of the PNNL Automatic Radioxenon Analyzer, but with a sample collection and analysis time of less than an hour per sample. The basic concept of this system is the rapid
compression of filtered air into a spherical steel chamber at pressures of 100 to 200 atmospheres. A high-resolution germanium detector is positioned in the center of the chamber. Counting for one-half hour can be used to achieve the LLD given above. By using off-line gamma-ray spectroscopy, a single setup could collect and analyze samples at a rate of one sample per hour or less. Such a system could be operated from an airplane searching for radioactive gases within a few hours of a suspected nuclear event. Crew safety and decontamination factors would have to be considered in any system feasibility studies for other than dedicated airplanes.

Proposed Rapid Particulate and Radioiodine Airborne Monitor

Any rapid response team should have available a capability to quickly collect and analyze airborne particulates containing fission products from a nuclear test. There is another important group of fission products that has not been mentioned in any of the proposed radionuclide monitoring programs: the iodine fission products, including 129I (half-life, t1/2 = 1.6 × 10⁷ years), 131I (t1/2 = 8 days), 132I (t1/2 = 2.3 hours), and 133I (t1/2 = 21 hours). Iodine-132 lasts longer in the environment than suggested by its short half-life because of its precursor 132Te (t1/2 = 3.3 days). These biologically significant fission products are frequently in a volatile form that will pass through paper particulate filters and will not be seen by the radioactive xenon monitoring systems. A well-established procedure for the separate collection of particulates and radioiodines would be the use of a train of stacked filter cartridges. These rugged cartridges allow the passage of a large amount of sampled air in a short time. The most common particulate filter would be a HEPA filter cartridge, whereas the most efficient radioiodine filter would be a silver zeolite cartridge.
Such a filter train would have to be installed on the inlet of the compressed air noble gas monitor previously mentioned to prevent the interior of the pressure vessel from becoming contaminated with fission products. These inlet filter cartridges could be used to provide the samples for subsequent gamma-ray spectroscopy. The cartridges could be changed at the completion of the collection of each compressed air sample and counted using an auxiliary HPGe detection system. Because of its long half-life, 129I is difficult to detect by gamma-ray spectroscopy, but there are extremely sensitive neutron activation analysis procedures for its detection. It is recognized that the IMS technologies are fixed, and they were designed explicitly to avoid potential monitoring of reactor releases. Thus, much of this technical discussion may apply only to NTM.

Rapid Response Waterborne Monitoring

Instrumentation and research to support waterborne radionuclide analysis are being carried out by groups at Lawrence Livermore National Laboratory, Sandia National Laboratories, and the Naval Research Laboratory. The primary motivation for these projects is to enhance the capability to monitor radioactive materials from nuclear explosions and nuclear reactors that have been dumped in the ocean. These systems will involve sampling by aircraft, remote underwater stations, and buoys. The remote systems will involve communication by satellite telemetry, and most will utilize NaI(Tl) detectors. To support these efforts, there appears to be a need for research on deep water collection and concentration techniques for the analysis of dissolved and suspended fission and activation products. The most likely method might be a combination of filtration and mixed bed ion exchange that would avoid chemical fractionation. The dissolved gases could then be extracted from water samples for subsequent noble gas monitoring.
To improve the performance of these systems, there is also a need for a high-efficiency, high-resolution gamma detector that does not require a high-capacity cooling system.

Improvements in Basic Data Used to Make Source Term Estimates

Sensible recommendations for additional work on source term estimates are contained in a study by experts from Lawrence Livermore National Laboratory (LLNL) and other National Laboratories (see Appendix G). Given the absence of available data, a few experiments would improve
the characterization of those source terms that have the highest uncertainties, and only a modest effort would be required to obtain the needed information. These include high-altitude releases using 102Rh and underwater tests using xenon as a tracer. In addition, a more complete literature search should be conducted to obtain information useful for improving source term calculations. There should be an aggressive effort to obtain such data from old reports and notebooks at National Laboratories, universities, and other U.S. sources. Such data should also be sought from foreign countries that have conducted nuclear tests, as well as from other countries that have maintained an atmospheric monitoring program. Data collected should be made available to the broader scientific community for both monitoring and scientific purposes, to the extent consistent with national security considerations and terms of acquisition. The information sought should include data on radionuclide measurements and the sizes of radioactive clouds as a function of time. It should also include any data on chemical fractionation and atmospheric partitioning of debris that takes place during the release of radioactive materials and gases under various blast conditions. Any information on the movement of radionuclides after underwater tests would be extremely useful.

Improvement in Air Trajectory Models for Backtracking Calculations

Many atmospheric scientists and engineers employ various back-trajectory models to identify the source terms of various airborne emissions such as sulfur and organic or inorganic constituents. Organizations such as NOAA, the Canadian Atmospheric Environment Service, EPA, and some U.S. National Laboratories and their European equivalents have successfully used various backward and forward trajectory models (Evans, 1995; Mason and Bohlin, 1995).
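In outline, a back-trajectory model advects the receptor position backward in time through archived wind analyses. A toy sketch (a single wind vector per time step; real systems interpolate three-dimensional analyzed fields in space and time, which is where most of the uncertainty enters):

```python
def back_trajectory(x0_m, y0_m, winds, dt_s):
    """Integrate a receptor position backward through a time series of
    (u, v) wind vectors in m/s, ordered from the observation time
    backward. Returns the list of positions, newest first."""
    x, y = x0_m, y0_m
    path = [(x, y)]
    for u, v in winds:
        # Step backward: remove the displacement the wind produced.
        x -= u * dt_s
        y -= v * dt_s
        path.append((x, y))
    return path

# Four one-hour steps of a steady 10 m/s westerly place the estimated
# source 144 km upwind (west) of the receptor.
path = back_trajectory(0.0, 0.0, [(10.0, 0.0)] * 4, 3600.0)
source_estimate = path[-1]  # (-144000.0, 0.0)
```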
Usually, such models are reliable for only 5 to 7 days of backward or forward trajectories. While there is general agreement among models that predict large-scale atmospheric flows, different chemical transport models used for forecasting or back-tracking often provide different results, particularly if the initial conditions vary slightly. For this reason, there should be a major research effort to improve the algorithms for back-tracking and prediction of chemical transport in the atmosphere. This effort should include an intercomparison of the models as part of program validation. An ideal test would be determination of the location of various nuclear power stations in the Northern Hemisphere based on their emissions of xenon radioisotopes. The automatic radioxenon analyzer being developed by PNNL could be employed to provide data for these tests. Such an investigation followed by a sensitivity analysis would give a clear indication of the accuracy of these models.

Rapid Radiochemical and Instrumental Techniques for Radionuclide Analysis of Filter Papers

To discriminate fission products resulting from nuclear weapons testing from those produced in nuclear reactors, one can establish a list of CTBT discriminant fission products (Wimer, 1997). The basic requirement is to perform an isotope assay of each of these discriminant isotopes in a mixed fission product sample to 10 per cent within 24 hours. Samples would be particulate filter papers collected from field systems such as DOE particulate samplers or from the proposed rapid response teams. Rapid collection and analysis of these samples is again required because of the short half-lives of some products such as 97Zr (17 hours) and 143Ce (33 hours). It should also be possible to establish a list of specific fission products that can provide information about the nature of a nuclear weapon that has been detonated.
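The 24-hour requirement reflects straightforward decay arithmetic: a measured activity must be corrected back to detonation time before isotope ratios mean anything. A minimal sketch for a short-lived product such as 97Zr (half-life ~17 hours):

```python
def decay_correct(measured_bq, half_life_h, elapsed_h):
    """Infer the activity at detonation time from a later measurement:
    A0 = A * 2**(t / t_half). Precursor in-growth is ignored here."""
    return measured_bq * 2.0 ** (elapsed_h / half_life_h)

# 97Zr measured 34 h (two half-lives) after an event: only a quarter of
# the original activity remains, so the inferred A0 is four times larger.
a0 = decay_correct(100.0, 17.0, 34.0)  # -> 400.0
```

Each additional day of delay roughly quadruples the correction factor for 97Zr, which is why rapid collection and analysis matter.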
The distribution of the fission products formed will vary with the fissile material employed (i.e., uranium-235 or plutonium-239) and the mean energy of the neutrons causing the fissioning. Some fission products of the lanthanides are expected to be relevant (Wimer, 1997). Of the lanthanides, the fission products of europium (i.e., 154Eu, 155Eu, 156Eu, and 157Eu) may be the most useful. Other potentially useful lanthanide fission products include 153Sm and 159Gd. Additional fission products that may be useful for characterizing the fissile material include 72Zn, 72Ga, 103Ru, and 111Ag. Aside from instrumental gamma spectroscopy, no rapid procedures currently exist for analysis of selected fission products that are collected on particulate
filter papers. A determination of the ratio of the activity of one or more of these europium radionuclides to that of lanthanum-140 (140La) would clearly show the type of nuclear fuel used in a test device. Research is needed to establish either a radiochemical procedure or some type of instrumental method that allows rapid determination of these critical fission products in the field. It is recommended that this problem be investigated to determine whether an existing radiochemical procedure can be modified for this purpose. If not, such a procedure should be developed rapidly. An ideal procedure would be an automated chemical separation that isolates the discriminant elements into sample forms suitable for HPGe counting. Chemical yield information ideally would be generated during these separations. For certain discriminant fission products, instrumental techniques such as gamma-gamma coincidence counting may be useful for picking them out of the gamma background. Although improved observation of a signal in the presence of high background radiation is possible, detection efficiencies are much lower, and the required count time may extend to several days. Also, the complexity of such instruments cannot be allowed to lower the mean-time-between-failure requirement for unattended operations. Procedures must also be established for rapidly transporting filter paper samples found to contain recently produced fission products from fixed stations to a laboratory that can perform this type of analysis.

Other Areas that Should be Investigated to Improve Radionuclide Detection Sensitivity

Other nonradiochemical or instrumental procedures for low-level counting include Compton suppression and gamma-gamma coincidence methods. These techniques, although mature, have never been fully employed for CTBT monitoring.
The primary advantage of such coincidence methods is the enhanced signal-to-noise ratio, as in the beta-gamma coincidence employed in the radioxenon analyzer being developed by PNNL. It is also well known that the larger the detector volume, the greater the detection efficiency, especially for higher-energy gamma lines.

Summary of Research Priorities Associated with Radionuclide Monitoring

The top priorities are:
- Research to improve models for back-tracking and forecasting the airborne transport of radionuclide particulates and gases.
- Research and data surveys to improve the understanding of source term data.21
- Understanding of atmospheric rain-out and underground absorption of radionuclides from nuclear explosions.
- Assessment of the detection capabilities of the IMS radionuclide network.

A secondary priority is:
- Research on rapid radiochemical analysis of filter papers.

A long-term research priority is:
- Development of a high-resolution, high-efficiency gamma detector capable of stable ambient-temperature monitoring.

There is also a need to develop infrastructure for rapid airborne and waterborne monitoring of noble gases and particulates and to develop a rugged system for gas sampling during On-Site Inspections. Although these are primarily systems development and implementation tasks, they would facilitate solutions to many of the radionuclide monitoring problems discussed above.

3.5 OTHER TECHNOLOGIES

The primary additional monitoring technology involves satellite systems that can monitor the optical, electromagnetic, and nuclear radiation from nuclear explosions, including x rays, gamma rays, and electromagnetic pulses (EMPs). Satellites also provide imaging capabilities that play a role in monitoring the underground testing environment.
Much of the current system involves sensors that are deployed on the network of Global Positioning

21 Source terms give the amounts of various diagnostic radionuclides likely to be released by explosions of different size, depth, and environment. See Appendix G.
System (GPS) satellites. DOE conducts a substantial research program in this area, and satellites play a key role in NTM monitoring of the atmospheric and space environments. The IMS will provide no satellite data, and the technical capabilities of the U.S. national system are sensitive, so the panel does not address this area in detail. Although U.S. satellite capabilities are substantial, intrinsic limitations of overhead methods preclude sole reliance on this system for monitoring CTBT compliance in all environments. However, the capabilities provided by satellite systems must be taken into account when evaluating the need for research on, and improvements to, the other monitoring technologies. Potential synergies with other monitoring methodologies are discussed in the next section.

The research program supporting CTBT monitoring must also accommodate innovative new monitoring approaches beyond those of the well-established systems described in this chapter. For example, high-resolution GPS sensors could potentially monitor ionospheric oscillations induced by large explosions. Synthetic aperture radar may also be used to monitor changes in the ground surface in areas of concern that have been identified by other means (Meade and Sandwell, 1996). The basic mechanism by which the United States can sustain its technological edge and develop creative new monitoring strategies is the maintenance of a broadly based research program that is driven by verification needs and not constrained by existing operational perspectives; this is discussed further in Chapter 4.

3.6 Opportunities for New Monitoring Synergies

Each of the three technologies—seismic, hydroacoustic, and infrasonic—has primary and complementary roles.
Seismic sensors are the best detectors of signals generated by underground events; seismic and hydroacoustic sensors are sensitive detectors of signals from events in water; and satellite and infrasonic sensors can monitor explosions in the atmosphere. However, energy couples from one medium to another, and this coupling offers the opportunity for detection by multiple technologies. For example, explosions and earthquakes on land can generate hydroacoustic waves at continental margins, and upward-propagating seismic waves that strike the ocean floor can generate pressure waves in the water that are detectable by hydroacoustic sensors. Similarly, the surface motion from a shallow explosion on land or in the oceans can generate pressure pulses in the air that may be detected by infrasonic sensors. Conversely, explosions detonated at low altitudes in the atmosphere near the ocean's surface can generate signals that propagate underwater and could be detected by hydroacoustic sensors, and explosions in water can generate seismic waves that are detectable by seismic instruments on land (e.g., by T-phase stations). Synergy in CTBT monitoring occurs at all stages of the monitoring process (i.e., detection, event association, location, identification) and often involves an interplay among them. For example, in the detection and association areas, joint association of hydroacoustic and seismic waves may define events that would not have been recognized by either technology alone. In other situations, the preliminary event definition may come from a single technology. The preliminary event location and origin time from this initial detection can then focus the processing of data from other stations on limited time windows and azimuths of arrival. This focused, sensitive processing has at least two benefits. First, it allows the use of tuned processing techniques operating at lower signal-to-noise ratios.
These techniques would generate an unacceptable number of spurious detections if used routinely. For example, hydroacoustic processing routinely operated with the lower signal-to-noise ratios used in focused processing would detect a large number of signals from explosions conducted for oil and gas exploration in the oceans. The resulting flood of detections would pose severe challenges to the association process, resulting in both incorrect associations and missed events. A related benefit of focused processing is that it can provide confidence in the monitoring of small signals that otherwise would have been undetected or detected but incorrectly associated with other signals. Conversely, focused processing as the result of a seismic detection may detect and identify small direct and reflected hydroacoustic phases associated with the preliminary definition of the event and thus strengthen confidence in the initial association. Another example of hydroacoustic and seismic synergy is found in the fact that characteristics of
the hydroacoustic signals (i.e., the "bubble pulse") can provide definitive identification information for seismic events detected in the ocean. The use of T-phase stations is yet another form of synergy between hydroacoustic and seismic waves. These seismic instruments, located on islands, record seismic waves generated by the impact of hydroacoustic waves. It is intended that the use of this phenomenon will provide additional coverage of the oceans and a greater possibility of detecting hydroacoustic waves from sources in remote or shadowed oceanic regions. The resulting detections can then be combined with detections from hydroacoustic stations in the association process. Note, however, that the hydroacoustic signals from atmospheric events may be too small to be detected by the T-phase stations. Furthermore, the noise level at these stations is expected to be high. In general, the coupling between seismic and hydroacoustic waves is very poorly understood. For this reason, research on T-phase coupling is given the highest priority in the area of research synergy. Still other forms of synergy involve infrasound, seismic, and hydroacoustic technologies in various combinations. Processing data from the infrasonic arrays can be expected to provide constraints on source locations based on azimuth determinations. However, the origin times are likely to be poorly constrained. The locations and approximate origin times can be used to focus the beams of seismic arrays and thus enhance their phase detection capabilities. If an event is located at sea, hydroacoustic processing can be focused on appropriate time windows. The interplay between seismic and infrasonic sensors provides yet another form of synergy in the detection process.
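The time-window focusing described above reduces to simple arithmetic: a preliminary location and origin time predict a narrow interval in which a hydroacoustic arrival should appear at a given station. The nominal sound speed (~1.48 km/s for the deep-ocean acoustic channel) and the tolerance below are assumed illustrative values, not actual IMS processing parameters.

```python
# Sketch of "focused processing": predict the time window in which a
# hydroacoustic arrival should appear, given a preliminary seismic
# solution. Speed and tolerance are assumed nominal values.

SOFAR_SPEED_KM_S = 1.48   # nominal deep-ocean acoustic channel speed

def hydroacoustic_window(origin_time_s, range_km, tolerance_s=60.0):
    """Predicted (earliest, latest) arrival time at the hydrophone."""
    travel_s = range_km / SOFAR_SPEED_KM_S
    t = origin_time_s + travel_s
    return (t - tolerance_s, t + tolerance_s)

# Event 2960 km from the hydrophone, origin time t = 0 s:
lo, hi = hydroacoustic_window(0.0, 2960.0)
```

Searching only this two-minute window, rather than the full data stream, is what permits the lower detection thresholds without flooding the association process with spurious picks.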
Atmospheric pressure waves generated by near-surface chemical explosions (e.g., mining explosions) cause the surface of the Earth to move, and this motion is recorded by seismic instruments. These surface motions arrive after the seismic waves generated by the same source, appearing as isolated arrivals that may be associated incorrectly with other phases or taken to be the only detected wave from a small event. Collocation of seismic and infrasonic sensors would reveal the coincidence of atmospheric waves and seismic motions, thus reducing the uncertainty of these detections. With this identification, such seismic arrivals could be removed from further processing to reduce the number of phases involved in the association process. For stations near mining districts, this synergy of infrasound and seismology would be a significant benefit. Because estimates of the back-azimuths from infrasonic data can have smaller variances than those from seismic arrays, combined use of these data for event location would also be useful. In addition, near-surface mining explosions can be identified by associating seismic locations with infrasonic detections. However, such identification does not preclude masking, and the absence of radionuclide detections may be necessary to confirm that a surface explosion was not nuclear. Events identified as suspicious on the basis of seismic, hydroacoustic, and infrasound observations may prompt rapid response deployments of radionuclide equipment if such capabilities are developed for NTM. Forward predictions of radionuclide detections by permanent stations can also be made in such cases using atmospheric transport models. Additional synergies between the technologies will emerge from experience with the complete international and national monitoring systems. Most of these synergies will depend on regional properties, and their definition and effectiveness will depend on calibration and operational experience in the region.
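The seismic/infrasonic coincidence idea above amounts to checking whether a late arrival at a collocated station matches the predicted lag of the air-coupled wave behind the seismic wave. The wave speeds and tolerance below are assumed nominal values chosen for illustration (crustal P waves near 6 km/s, near-surface sound near 0.34 km/s).

```python
# Sketch of the seismic/infrasound coincidence test: for a collocated
# station at range d, the air-coupled arrival lags the seismic arrival
# by roughly d/c_air - d/v_p. Speeds are assumed nominal values.

V_P_KM_S = 6.0      # nominal crustal P-wave speed
C_AIR_KM_S = 0.34   # nominal near-surface sound speed

def expected_lag_s(range_km):
    """Predicted delay of the air-coupled wave behind the P wave."""
    return range_km / C_AIR_KM_S - range_km / V_P_KM_S

def is_air_coupled(seismic_t, acoustic_t, range_km, tolerance_s=30.0):
    """Flag a late arrival as the air-coupled wave from a known blast."""
    return abs((acoustic_t - seismic_t) - expected_lag_s(range_km)) <= tolerance_s

# A mining blast 100 km away: the pressure wave lags the P wave by ~277 s.
lag = expected_lag_s(100.0)
```

An arrival satisfying this test could then be removed from the association process, as described above.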
Many potential synergies exist between satellite methodologies and other approaches. For example, the development of ground truth databases for each monitoring method could be augmented by coordinated use of overhead imagery. This has been done in the past for major nuclear test sites, where surface collapse craters could be associated with seismic event locations to calibrate the seismic arrays, eliminating biases due to unknown Earth structure. Future applications along similar lines could be pursued for the calibration of quarries and mines, earthquakes that rupture the surface, volcanic eruptions, and atmospheric disturbances (e.g., lightning). Overhead imagery has remarkable capabilities, but the need to focus attention on a given region must come independently. For example, for On-Site Inspections, overhead imagery in a localized region can narrow down the region for assessment, but this must be guided by the initial location estimate from other methodologies. Where candidate testing sites are identified prior to an actual test, perhaps by seismically recorded quarry blasting of unusually large size in the area, an archive of overhead images could be compiled for comparison with
images taken after a suspicious event is detected. This cannot be done with high resolution on a global scale, but it can be done for limited areas if there is a basis for defining those regions. Such approaches are still limited by the fact that underground testing need not produce detectable surface features such as collapse craters above the source. (The U.S. testing experience at Rainier Mesa has been that these tunnel shots almost never produced craters at the surface.) High-resolution, multispectral satellite imaging capabilities will become more widely available in coming years, which will make some aspects of satellite monitoring available to all nations. This availability should open up research opportunities on a broad scale.

Summary of Research Priorities Associated with Synergy

In summary, a prioritized list of research topics to increase the synergy between CTBT monitoring technologies includes:
- Improved understanding of the coupling between hydroacoustic signals and ocean island-recorded T-phases, with particular application to event location in oceanic environments.
- Integration of hydroacoustic, infrasound, and seismic wave arrivals into association and location procedures.
- Use of seismo-acoustic signals, together with an absence of radionuclide signals, for the identification of mining explosions.
- Exploration of the synergy between infrasound, NTM, and radionuclide monitoring for detecting, locating, and identifying evasion attempts in broad ocean areas.
- Determination of the false alarm rate for each monitoring technology when operated alone and in conjunction with other technologies.

3.7 ON-SITE INSPECTION METHODS

The limited temporal duration of some of the effects associated with underground nuclear explosions (e.g., seismic aftershocks and radioactive gases) places a premium on rapid access to the site.
The limited spatial extent of the anomalies and the limits on their detectability place a premium on the accuracy of the locations determined by remote sensors. Although aerial surveillance may serve to pinpoint some regions of interest and eliminate others, the 1000 km2 location goal for remote sensors still leaves a large area to be covered by an On-Site Inspection. As stated before, improvements in the accuracy of locations determined by remote sensors are essential for effective OSI. A high priority for the overall OSI process is the elimination of significant systematic errors in the location capability of remote sensors, since these errors could completely negate the value of an On-Site Inspection. Reduction in the size of the random error is also important; however, the deterrence value of OSI may be preserved even if the random error is of moderate size, as long as the systematic error is small. Even if such improvements do occur, an effective OSI regime demands coupling an effective reconnaissance mode with rapidly deployable, efficient, focused operations at specific sites. These local operations employ relatively well-known geophysical technologies. What are needed are rapid deployment methods for sensitive instruments, criteria for evaluating the significance of locally collected data, and the signal processing capability to evaluate data quickly. For OSI, radioactive gases leaking from cracks and fissures may be the only indicator of the detonation location. Rapid noble gas monitoring equipment mounted in slow-moving aircraft may be required to perform the initial screening of an area to identify the approximate test location. Such air surveys may require the collection and rapid analysis of many air samples over a wide area. If the general area of radioactive noble gas release has been identified by airborne screening, surface vehicles can be used for sample collection (e.g., Jester and Baratta, 1982; Jester et al., 1982).
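The scale of the airborne screening task can be put in perspective with a back-of-envelope coverage estimate; the effective sampling swath width and airspeed below are hypothetical illustrative values, not characteristics of any actual survey system.

```python
# Back-of-envelope sketch of the OSI airborne survey problem: flight
# time for an aircraft to sweep the ~1000 km2 location-uncertainty area
# in parallel strips. Swath width and speed are hypothetical values.

def survey_hours(area_km2, swath_km, speed_km_h):
    """Hours of on-track flight time to cover the area in parallel strips."""
    return area_km2 / (swath_km * speed_km_h)

# A 1000 km2 area, a 2 km effective sampling swath, 150 km/h airspeed:
hours = survey_hours(1000.0, 2.0, 150.0)
```

Even under these optimistic assumptions, several hours of low-altitude flight (plus sample turnaround) are needed, which underscores the value of shrinking the remote-sensor location error in the first place.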
In the broad sense, similar considerations apply for tests at sea. However, the limitations imposed by the lack of IMS resources (e.g., ships, planes, and airborne gas and water samplers), the absence of long-lived deformation effects, and the possibly decreased accuracy of location estimates for events in the ocean make it difficult to conduct an oceanic OSI. Nevertheless, the possibility that currents may slowly disperse a relatively intense radioactive slurry from test debris may offer some hope of detection if seawater samples can be obtained
from the proper locations. Finally, it should be noted that OSI at sea does not, in general, offer the same deterrence benefits as OSI on land. Its value could be enhanced if it were possible to attribute the nuclear device to a limited number of sources on the basis of the chemical and physical properties of materials in the debris—another area of potentially fruitful research.