5
Summary and Recommendations

Recommended Flood Frequency Distribution

Based on consideration of the available data and consistent with Bulletin 17-B guidelines, the committee recommends the use of the flood frequency distribution given in Table 5.1 for estimation of quantiles and exceedance probabilities of three-day rain flood flows on the American River. Note, however, that this recommendation is made for annual exceedance probabilities greater than 1 in 200. For smaller exceedance probabilities, the committee believes there is compelling evidence that the true probability distribution flattens. If it is necessary to extrapolate the distribution for smaller exceedance probabilities, the recommended distribution provides a basis that is consistent with Bulletin 17-B. However, in view of the possibility that the true distribution flattens, other estimation approaches should be investigated.

Our recommended distribution is based on the systematic record of three-day rain flood flows estimated by the USACE from the USGS flow record for Fair Oaks, and on the historical record for 1848-1904, which includes an estimated large three-day flow associated with the historic flood of 1862. Based on several independent analyses conducted by the committee and the USACE, the committee concludes that the three-day rain flood record is an accurate representation of the magnitude of flood flows over the period of record and that the observed increase in the frequency of large floods since 1950 is not an artifact of the method by which flood peaks were computed. The estimate of the three-day flow associated with the 1862 flood is based on an instantaneous peak flow estimated by Bossen (1941) and a regression model developed by the committee. In its frequency analysis the committee assumes that this flow was the largest three-day flow in the historic period from 1848 to 1905.
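The committee's actual regression model is not reproduced in this summary. A common form for such peak-to-volume transfer relations is a log-log linear regression fitted to paired systematic-record values; the sketch below uses that form with wholly hypothetical data, so the coefficients and the example peak are illustrative only.

```python
import numpy as np

# Hypothetical paired observations (cfs): instantaneous flood peaks and the
# corresponding three-day mean flows.  These are NOT the committee's data.
peaks    = np.array([ 70_000, 105_000, 130_000, 180_000, 260_000])
threeday = np.array([ 40_000,  62_000,  80_000, 110_000, 160_000])

# Fit log10(Q3) = a + b * log10(Qpeak) by ordinary least squares,
# the usual form for peak-to-volume transfer relations.
b, a = np.polyfit(np.log10(peaks), np.log10(threeday), 1)

def predict_threeday(peak_cfs):
    """Predict a three-day mean flow (cfs) from an instantaneous peak (cfs)."""
    return 10 ** (a + b * np.log10(peak_cfs))

# Applied to an assumed historical peak (a placeholder, not Bossen's 1941
# estimate), the model returns a three-day volume on the fitted line.
q3_1862 = predict_threeday(250_000)
```

With such a model, the uncertainty of the historical three-day estimate inherits both the regression scatter and the uncertainty in the reconstructed peak itself.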

The committee used the Expected Moments Algorithm (Cohn et al., 1997) to fit a log-Pearson type III distribution to the systematic and historical data, assuming a fixed skew of -0.1. This skew is a weighted average of a regional skew (-0.1) and the sample skew (-0.06). The committee estimated the regional skew by averaging the sample skews of the log three-day flow series from seven rivers on the west slope of the central Sierra Nevada. The Expected Moments Algorithm matches log-space sample and population moments, and hence is consistent with Bulletin 17-B; however, it makes more effective use of historical and paleoflood information than does the weighted moments method recommended by Bulletin 17-B. Approximate confidence intervals were estimated by Monte Carlo simulation.
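The full Expected Moments Algorithm iterates on the expected moments of the censored historical observations, which is beyond a short sketch. The fragment below is a simplified stand-in: it fits the log-Pearson type III by matching the first two log-space moments of a synthetic systematic record (the Fair Oaks series is not reproduced here), holds the skew fixed at -0.1 as the committee did, and approximates a 90% confidence interval for the 100-year quantile by a parametric-bootstrap Monte Carlo. The data and the bootstrap design are illustrative assumptions, not the committee's procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic stand-in for the 1905-1997 systematic record of log10
# three-day rain flood flows at Fair Oaks (93 water years).
logq = rng.normal(loc=4.33, scale=0.41, size=93)

G = -0.1  # fixed (weighted regional/sample) log-space skew

def lp3_quantile(m, s, skew, p_exceed):
    """LP3 quantile 10**(m + K*s); K is the standardized Pearson III deviate."""
    k = stats.pearson3.ppf(1.0 - p_exceed, skew)
    return 10 ** (m + k * s)

q100 = lp3_quantile(logq.mean(), logq.std(ddof=1), G, 0.01)

# Parametric bootstrap for an approximate 90% interval on Q100: resample
# records from the fitted LP3, refit the first two log moments, re-estimate.
boot = []
for _ in range(1000):
    sim = logq.mean() + logq.std(ddof=1) * stats.pearson3.rvs(
        G, size=logq.size, random_state=rng)
    boot.append(lp3_quantile(sim.mean(), sim.std(ddof=1), G, 0.01))
lo, hi = np.percentile(boot, [5, 95])
```

Because the skew is held fixed rather than re-estimated, the resulting interval understates the uncertainty relative to a full analysis that also accounts for skew and historical-period information.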



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.



TABLE 5.1 Summary of Three-Day Flood Quantile Estimates for the American River at Fair Oaks Using the Expected Moments Algorithm (EMA) [a]

Data and Assumptions:
  Systematic Observations: 1905-1997
  Historical Period: 1848-1904
  Historical Flood: 1862; 147,000 cfs [b]
  Upper Bound for Remainder of Historical Period: 147,000 cfs [b]
  Paleoflood Observations: not included

Estimated Distribution Moments:
  Log(10) Mean: 4.3329
  Log(10) Std. Deviation: 0.4149
  Log(10) Skewness Coefficient: -0.1000

Estimated Three-Day Mean Flood Quantiles and 90% Confidence Intervals [c]:
  Q10  (Pexceed = 0.10)    72,500 cfs  (60,000 cfs; 88,000 cfs)
  Q20  (Pexceed = 0.05)   101,000 cfs  (81,000 cfs; 126,000 cfs)
  Q50  (Pexceed = 0.02)   145,000 cfs  (109,000 cfs; 192,000 cfs)
  Q100 (Pexceed = 0.01)   185,000 cfs  (131,000 cfs; 257,000 cfs)
  Q200 (Pexceed = 0.005)  230,000 cfs  (154,000 cfs; 338,000 cfs)

[a] Flood quantile estimates are based on rain floods only.
[b] Corresponds to the estimated 1862 three-day mean Q.
[c] Based on the LPIII fitted using a log skew of -0.1 to the systematic record and the historical record from 1848 that included the historical 1862 flood.

Sensitivity analysis using the recommended approach indicates that censoring below various flows with exceedance probabilities ranging from about 0.94 to 0.31 does not significantly affect the estimated distribution. The committee chose not to apply the expected probability adjustment to the distribution obtained by application of the Expected Moments Algorithm. In developing the recommended flood frequency distribution, it was decided not to use paleoflood information recently obtained by the U.S. Bureau of Reclamation.
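As a consistency check, the quantiles in Table 5.1 can be approximately reproduced from the reported log-space moments using Pearson type III frequency factors. The sketch below assumes SciPy's standardized pearson3 distribution (zero mean, unit variance) supplies the factor K, so that Q = 10**(mean + K*std); the results should agree with the table to within rounding.

```python
from scipy import stats

# Log10-space moments as reported in Table 5.1.
mean, std, skew = 4.3329, 0.4149, -0.1

quantiles = {}
for p in (0.10, 0.05, 0.02, 0.01, 0.005):
    # Standardized Pearson III deviate = the frequency factor K(p, skew).
    k = stats.pearson3.ppf(1.0 - p, skew)
    quantiles[p] = 10 ** (mean + k * std)
    print(f"P_exceed = {p}:  Q = {quantiles[p]:,.0f} cfs")
```

This check only verifies the moment-to-quantile arithmetic; it says nothing about the confidence intervals, which depend on the Monte Carlo procedure.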
Use of the paleoflood data implies that the log skew is much more negative; as a result, when the paleoflood data are used with the systematic and historical data, the resulting fitted log-Pearson type III distribution does not provide an adequate description of the flood flow frequency relationships for floods with exceedance probabilities from 0.5 up to and beyond 0.002.

Beyond Bulletin 17-B

While its preferred estimate of the frequency distribution of three-day rain flood flows on the American River is consistent with the systematic and historical data, the committee is uncomfortable with extrapolating it much beyond the flow with an exceedance probability of 0.005. Use of the recommended distribution to estimate the exceedance probabilities of two recent PMF estimates yields values that are relatively high, suggesting that for very large flows the upper tail of the "true" distribution flattens relative to the upper tail of our preferred estimated distribution. (The term "flattens" refers to the flood distribution as plotted in Figure 3-3.) The paleoflood information supports this conclusion. However, it was decided not to use the USBR paleoflood information to extrapolate the frequency distribution of three-day volumes beyond an exceedance probability of 0.005 because, given the present understanding of likely global climate variations over that period, the committee was uneasy about climate variability during the 1,350- to 3,500-year period for which there is a paleoflood record.

To further explore the extrapolation issue, the committee developed a partial duration series of basin-average precipitation. An exponential fit to this series crossed the recommended fit to the three-day flow series, clearly an impossibility. While it is possible that the fitted precipitation series is the source of the problem, it seems more likely that the upper tail of the flow distribution flattens. Using the estimated distribution of average basin precipitation and a simple regression model of the rainfall-runoff relationship, the committee estimated a three-day flow distribution that flattens in response to the constraint imposed by precipitation. While this estimated distribution is based on incomplete data and simplifying assumptions, the general approach should be explored as a potential method of extrapolating the flood frequency distribution. The committee did not have time to develop a recommendation regarding extrapolation of the frequency distribution beyond the flow with an exceedance probability of 1 in 200. This is clearly an area in need of research. One complicating factor is the observed post-1950 increase in large floods.
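The precipitation-constrained argument can be caricatured in a few lines: a partial duration series whose exceedances are exponential gives a T-year precipitation that grows only logarithmically in T, and pushing that through a linear rainfall-runoff regression yields a flow quantile curve that flattens relative to the LP3. Every parameter below (event rate, threshold, exponential mean, regression coefficients) is a made-up placeholder, tuned only so the two curves roughly agree near T = 100; none are committee values, and only the qualitative comparison is the point.

```python
import numpy as np
from scipy import stats

# Illustrative placeholders, not fitted values.
lam  = 3.0        # precip PDS: average events per year above threshold
x0   = 6.0        # PDS threshold (inches, basin-average 3-day precipitation)
beta = 2.0        # exponential mean excess above the threshold (inches)
c0, c1 = -30_000.0, 12_000.0   # linear rainfall-runoff regression (cfs, cfs/inch)

def q_from_precip(T):
    """Flow quantile implied by the exponential precip PDS plus regression."""
    precip_T = x0 + beta * np.log(lam * T)   # T-year precipitation, grows ~log T
    return c0 + c1 * precip_T

def q_lp3(T, m=4.3329, s=0.4149, g=-0.1):
    """Flow quantile from the recommended LP3 (moments as in Table 5.1)."""
    k = stats.pearson3.ppf(1.0 - 1.0 / T, g)
    return 10 ** (m + k * s)

# Beyond a few hundred years the log-linear (precipitation-limited) curve
# grows far more slowly than the LP3 -- the distribution "flattens".
for T in (100, 200, 1000):
    print(T, round(q_from_precip(T)), round(q_lp3(T)))
```

The crossing the committee observed corresponds to the point where the fitted LP3 flow curve overtakes the flow implied by the precipitation distribution, which is physically impossible if precipitation limits runoff.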
Post-1950 Increase in Frequency of Large Floods

There is little doubt that the observed frequency of large floods on the American River is much greater in the period from 1950 to the present than it was in the period from 1905 to 1950. Based on the present understanding of climate dynamics, it is not possible to assess the relative contributions of natural and anthropogenic factors to this observed increase. More importantly, it is not possible to predict its likely persistence in time. The committee is very uncomfortable with this situation, but it has little choice given the absence of information. However, even if the post-1950 increase in large floods is due to natural climate variations, Sacramento and the surrounding areas face a severe flood risk. If the increase is due to anthropogenic factors, the already high risk increases.

Implications for Floodplain Certification

Based on the USACE 1998 100-year flood estimate, the Federal Emergency Management Agency (FEMA) issued new floodplain maps for Sacramento. As a result of these new maps, most of the floodprone areas of Sacramento were classified as being in the so-called AR zone. Generally, this designation would have resulted in building restrictions and higher flood insurance rates. In this case, FEMA waived the increases in flood insurance rates but enforced the building restrictions. If adopted, the 100-year flood estimate recommended by this committee may result in removal of the floodprone areas of Sacramento from the AR zone. This would result in suspension of the building restrictions.1 It would also likely reduce the political pressure to achieve a solution to the acute flooding threat facing Sacramento.

If the 100-year flood estimate does indeed imply that floodprone areas of Sacramento along the American River levees are not in the 100-year floodplain, it will be by the thinnest of margins. Because the uncertainties in this estimate are so large, the evidence that these areas are not in the 100-year floodplain is far from compelling. In fact, there is about equal evidence that these areas do or do not belong in the 100-year regulatory floodplain. The worst consequence of falsely designating such floodprone areas to be in the regulatory floodplain would be the requirement for building restrictions that may in the future prove unnecessary. The worst consequence of falsely designating such floodprone areas to be out of the regulatory floodplain would be a prolonged delay in solving acute flood problems, a delay that could have catastrophic results. Given the gross inequality of these two consequences, the committee strongly recommends that authorities carefully consider the situation and the large uncertainties in the estimated 100-year flood, and attempt to develop a flood risk management strategy that addresses the significant risk of flooding in Sacramento.

Research Needs

Flood frequency analysis has been practiced for nearly a century and has seen significant developments in both technological and sociopolitical contexts. Despite progress, much remains to be learned.
Incorporating this improved understanding into nationwide guidelines, such as Bulletin 17-B, will be problematic. In particular, it will raise questions as to whether previously completed flood frequency analyses need to be revised and whether such revisions would significantly change the boundaries of regulatory floodways and floodplains. In this context, one needs to be careful to distinguish between changes in flood frequency curves occasioned by the collection of additional data and those caused by changes in methods of data analysis and prediction. From a scientific point of view, both types of change should be expected as databases grow and knowledge advances, but the latter type is much more difficult to deal with from a sociopolitical point of view. In effect, to what extent should sociopolitical resistance to change overshadow advances in scientific methodology, and vice versa? How can a compromise be reached, and how can it be implemented without thwarting the goal of national consistency underlying the genesis of Bulletin 17-B? Answers to such questions will require both scientific study and informed public debate.

As was pointed out by the NRC Committee on Flood Risk Management in the American River Basin, the need for future research and issue resolution should not be used as an excuse for inaction now. While that committee's comment was directed specifically to the American River situation, this committee believes that the ongoing needs and opportunities being experienced by Sacramento suggest that the time is ripe to begin to seriously reassess policy and strategies for flood risk assessment and management, not only for the Sacramento case but for the nation as a whole. For example, a similar issue has arisen in Tucson regarding temporal changes during this century in both the frequency of floods and the relative contributions of different members of the population of flood-generating mechanisms (Webb and Betancourt, 1992).

The committee recommends the establishment of a new interagency research effort focused on flood risk assessment and management. The impetus for such action is clear: rising property damages and loss of life; 30 years of experience with the National Flood Insurance Program; aging federal policy and technical guidance; improvements in scientific methods of computing and modeling; emerging understanding of paleohydrologic and climate variability issues; and a growing database and availability of information. Virtually all of these issues have arisen in the Sacramento case and can be expected to arise in others as well. It is envisioned that this recommended interagency effort will emphasize research programs oriented toward coordinated flood risk reduction, including meteorologic, hydrologic and hydraulic, and policy and socioeconomic aspects of flood management. Participating agencies should include such entities as the U.S. Geological Survey, the National Weather Service, the Federal Emergency Management Agency, the U.S. Army Corps of Engineers, the U.S. Bureau of Reclamation, the Tennessee Valley Authority, the Federal Energy Regulatory Commission, the National Science Foundation, and appropriate state, regional, and local agencies. Participation, perhaps in an ex officio role, might also be considered for the academic community through a periodic rotation system.

In their deliberations, committee members identified a number of specific issues that should be addressed by the recommended interagency effort. These issues are summarized below:

(1) Enormous progress has been made in the analysis of flood data since the last major revisions were made to Bulletin 17-B. This progress has largely involved regionalization and the collection and use of historical and paleoflood data. In addition, a number of methods have been developed to handle mixed distributions, including aggressive censoring. These and other innovations in flood frequency analysis should be considered in a revision of Bulletin 17-B.

(2) A very strong research need is to better understand interannual- to century-scale climate variability as it relates to the potential for winter/spring floods in the American River basin and surrounding areas. This, of course, is a major undertaking for the earth science community. As indicated in Chapter 4, a framework is needed for formally conducting such analyses to better estimate potentially changing flood frequency distributions and their uncertainty. Historical and paleoclimate and hydrologic data, as well as future model projections, would need to be integrated in this framework. Efforts should be continued to develop more detailed, comprehensive, and systematic documentation of all major and significant floods as part of a national database on floods. These efforts need to tie information on ocean and atmosphere circulation conditions to the information on floods.

(3) A decision-analytic framework that explicitly uses information on the uncertainty of flood frequency estimates in the analysis of the design level of flood protection is also needed. Dynamic and static risk analyses as discussed in Chapter 4 may be needed. Such a framework would consider the length of the record, climatic factors, the length of the planning period, an implicit long-range climate forecast associated with this period, considerations of risk and estimate uncertainty, and a prescription of how the decisions could be periodically re-evaluated.

1. The 100-year flood estimate recommended in this report is for unregulated maximum average three-day rain flood discharges at Fair Oaks. Floodplain designation in Sacramento is based on the 100-year regulated annual maximum instantaneous discharge in Sacramento. Determination of the latter requires modeling of the hydrology and hydraulics of the river and associated flood-mitigation systems.