The available aeration technologies differ primarily in the complexity of their design and operation, the flowrates they treat, and the radon-removal efficiency they achieve. The most efficient systems are capable of achieving >99% radon removal by increasing the surface area available for mass transfer of radon from water to air. However, these systems usually require more maintenance than simpler technologies. The more complex technologies are most practical for larger communities that must treat large volumes of water and have the larger staff and tax base needed to support the more extensive capital and operation and maintenance requirements. Most aeration technologies require that the water be at atmospheric pressure to allow the release of radon to the air, so the treated water must be repressurized before it is supplied to the community. Descriptions of existing and emerging aeration techniques for removing radon from water are given in appendix C.
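
As a rough illustration of why greater air-water contact improves removal, the sketch below applies an idealized single-equilibrium-stage stripping model; it is not taken from this report or appendix C, and the dimensionless Henry's constant for radon (about 4 near 20 degrees C) and the air-to-water ratios are assumed values chosen for illustration.

# Illustrative sketch only (not from the report): an idealized single
# equilibrium-stage model of radon stripping.  Removal depends on the
# dimensionless Henry's constant H and the volumetric air-to-water ratio.

def single_stage_removal(henry_dimensionless, air_to_water_ratio):
    """Fraction of radon removed when the exiting air and water reach equilibrium."""
    stripping_factor = henry_dimensionless * air_to_water_ratio
    return stripping_factor / (1.0 + stripping_factor)

if __name__ == "__main__":
    H = 4.0  # assumed dimensionless Henry's constant for radon near 20 degrees C
    for ratio in (1, 5, 15, 30):
        removal = single_stage_removal(H, ratio)
        print(f"air:water = {ratio:>2}:1  ->  about {removal:.1%} removal")

In this simplified model, removal above 99% requires either very large air-to-water ratios or several contacting stages, which is consistent with the observation that the most efficient systems increase the interfacial area and contact available for mass transfer.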

Issues/Secondary Effects of Aeration

Intermedia Pollution

In its proposed rule, EPA (1991b) recognized that emissions from aeration systems could degrade air quality and pose some incremental health risk to the general population because of the release of radon to the air. On the basis of EPA's analyses (EPA 1989; EPA 1988b), the increased risk is much smaller than the risk posed by radon in the water. In its initial evaluation with the AIRDOS model, EPA (1988b) used radon concentrations in water of 68,000 Bq m⁻³ (range, 37,000–598,000 Bq m⁻³) based on data from 20 water systems in the United States. Assuming 100% transfer of radon from the water to air, EPA estimated that radon would be emitted into the ambient air at 0.10 Bq y⁻¹. Using an air-dispersion model that included radon and its progeny and assuming ingestion and inhalation exposures within a 50-km radius, EPA calculated a maximal lifetime individual risk of 4 × 10⁻⁵ (0.016 cancer case y⁻¹). Extrapolated to drinking-water plants throughout the United States, that translated to 0.4 and 0.9 cancer cases per year due to off-gas emissions from all drinking-water supplies meeting radon MCLs of 7,400 or 37,000 Bq m⁻³ in water, respectively. EPA used a similar approach to assess the risks associated with dispersion of coal- and oil-combustion products.
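
For orientation, the sketch below converts the activity concentrations quoted above from Bq m⁻³ to the picocurie-per-liter units more commonly used in U.S. drinking-water regulation. The conversion factor (1 pCi/L = 37 Bq m⁻³) is standard; the script itself is illustrative and is not part of EPA's analysis.

# Unit-conversion sketch (illustrative, not part of EPA's analysis):
# 1 pCi = 0.037 Bq and 1 L = 0.001 m3, so 1 pCi/L = 37 Bq per m3.

BQ_PER_M3_PER_PCI_PER_L = 37.0

def bq_m3_to_pci_l(bq_per_m3):
    """Convert a radon activity concentration from Bq m-3 to pCi/L."""
    return bq_per_m3 / BQ_PER_M3_PER_PCI_PER_L

for label, value in [("mean raw-water concentration", 68_000),
                     ("lower MCL considered", 7_400),
                     ("higher MCL considered", 37_000)]:
    print(f"{label}: {value:,} Bq m-3  =  {bq_m3_to_pci_l(value):,.0f} pCi/L")

On that arithmetic, the two MCLs considered correspond to roughly 200 and 1,000 pCi/L, and the mean raw-water concentration used in the evaluation to roughly 1,800 pCi/L.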

In an evaluation with the MINEDOSE model, EPA (1989) used worst-case scenarios from four treatment facilities whose raw-water radon concentrations were 49,000 to 4,074,000 Bq m⁻³. In only one facility was there a significant potential increase in cancer risk associated with radon emissions when a single point source was assumed. However, EPA found that this large water utility actually used a number of wells at various locations rather than a single source, which reduced the risk because the emissions were dispersed over a greater area. The highest raw-water radon concentrations did not always result in the greatest


