4 Interpretive Studies

Introduction

Interpretive studies performed by the U.S. Geological Survey (USGS) represent the interface between data-gathering efforts and research activities and their application to problems of local, regional, and national significance. Interpretive studies are one of the crucial links between the research and service missions of the USGS through the support and application of research activities in major mission areas. The conduct of problem-specific interpretive studies at the district level offers the opportunity for cooperators and other interested parties to benefit directly from USGS research through the application of research results to practical local problems. Local client-driven interpretive studies in turn support research efforts by expanding the consistent national database available for research, as well as by providing practical testing of new research results in applied management problems. Feedback from these applications generates and refines the emerging questions and problems that drive the evolving research program toward timely challenges that are well suited to the USGS's expertise. The value added through the intersection of USGS interpretive studies and research is well demonstrated in the agency's work in bridge scour and flood peak estimation on ungaged watersheds. The state of the practice in the development of interpretive studies is largely driven by funded cooperators and serves a valuable technology transfer function for the more mature research products. Strategic opportunities to strengthen the role of interpretive studies have been identified in supporting public policy on risk management for hydrologic hazards, planning responsive monitoring networks, and evaluating rapid-onset events.

[Photo: Typical damage caused by debris flow. Photo courtesy of U.S. Geological Survey.]

Research needs related to interpretive studies include more rigorous treatment of nonstationarity in hydrologic risk assessment, including nonstationarity and persistence in climatic linkages, changing land use and watershed conditions, and the dynamics of channel morphology and sediment transport. The role of the USGS in interpretive studies includes technology transfer, integrating cooperator needs with the research program, and information generation.

Information Generation

Information on hydrologic hazards is not equivalent to the data used to create the information. Data are measurements made to describe selected aspects of physical, ecological, or socioeconomic systems. For example, measured streamflow data can lead to information on hydrologic hazards, but such data by themselves are not hazard information. Instruments, laboratories, and surveys are examples of tools for securing data. Information on hydrologic hazards is created by the interpretation of data, using particular analytical frameworks and concepts. Data interpretation to create information is done by hydrologists and other scientists, engineers, policy makers, and the general public. These various data users apply differing conceptual frameworks to data as they create information relevant to their particular purposes.

The value of information differs according to the decision-making setting. While statistical decision theory defines information value in terms of the economic gains from improving a particular decision outcome, organizational theorists describe a different value for information: decision makers are always scanning and monitoring the organization's environment for signals that would require shifts in programs and policies. To illustrate, tracking national trends in weather-related economic damages might signal a need to make changes to the national flood information program. Such global trend data have little value to the farmer making a planting or harvest decision. Information on hydrologic hazards may be used to predict and/or understand (1) the probability of hazardous events, (2) the adverse consequences of the events, and/or (3) the complex causal pathways linking events to their hazardous consequences.

Flood Frequency Analysis

State of the Practice

Interpretive studies related to flood frequency analysis have traditionally focused on the application of standard hydrologic methods described by the U.S. Interagency Committee on Water Data (1982) to analyze the characteristics and frequency of flooding. Statewide surveys (Clement, 1987; Curtis, 1987; Guimaraes and Bohman, 1991) as well as flood frequency analyses for river basins (Landers and Wilson, 1991) provide resource managers with standard, reliable baseline information on flood frequency that directly supports design and floodplain management activities throughout the nation.
Interpretive studies have also integrated the products of the national research program in hydrologic regionalization and the estimation of flood peaks at ungaged sites (Stedinger and Tasker, 1985, 1986; Tasker and Stedinger, 1989, 1992; Tasker and Slade, 1994; Tasker et al., 1996), culminating in national regression equations for flood peak estimation (Jennings et al., 1994). Flood-related interpretive studies documenting the magnitude and hydrologic characteristics of extreme events expand the national information base on flood risk. These activities utilize hydrologic and hydraulic techniques (Dalrymple and Benson, 1967; Kirby, 1981; Fulford, 1994) as well as paleohydrologic techniques (Costa, 1986; Cohn and Stedinger, 1987). Nearly all mathematical idealizations of hydrologic processes presume that the underlying probability distribution for a random variable, such as flood recurrence, remains constant through time. Stationarity is an assumption underlying most conventional techniques for estimating flood exceedance probabilities.
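The stationarity assumption underlies the standard estimation recipe: fit a log-Pearson Type III distribution to the logarithms of the annual peak series and read off the desired quantile. A minimal Python sketch (the function name is ours; it uses station skew only and the Wilson-Hilferty approximation to the frequency factor, whereas Bulletin 17-B practice also weights in a generalized regional skew):

```python
import math
import statistics
from statistics import NormalDist

def lp3_quantile(annual_peaks_cfs, return_period):
    """Sketch: estimate a flood quantile by fitting a log-Pearson Type III
    distribution to an annual peak series. Assumes the series is a sample
    of independent realizations from a stationary, homogeneous process."""
    logs = [math.log10(q) for q in annual_peaks_cfs]
    n = len(logs)
    mean = statistics.fmean(logs)
    std = statistics.stdev(logs)          # sample standard deviation
    # sample skew coefficient with small-sample correction
    m3 = sum((x - mean) ** 3 for x in logs) / n
    skew = (n ** 2 * m3) / ((n - 1) * (n - 2) * std ** 3)
    # standard normal variate for the non-exceedance probability
    z = NormalDist().inv_cdf(1.0 - 1.0 / return_period)
    if abs(skew) < 1e-6:
        k = z
    else:
        # Wilson-Hilferty approximation to the Pearson III frequency factor
        k = (2.0 / skew) * ((1.0 + skew * z / 6.0 - skew ** 2 / 36.0) ** 3 - 1.0)
    return 10.0 ** (mean + k * std)
```

For a 100-year flood, `return_period=100` returns the discharge whose annual exceedance probability is 0.01; the estimate is only as good as the independence and stationarity assumptions discussed above.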
However, changes in climate, vegetation, and other watershed conditions are known to drastically alter flood responses in many situations. For paleoflood records extending back beyond the past century, the key stationarity questions center on climate variability. Deviations from purely random event generation occur because catastrophic floods are in many cases generated by nonrandom occurrences of atmospheric phenomena (Ely et al., 1993, 1994, 1996). The scientifically interesting goal is to document such occurrences by developing long-term paleoflood datasets at multiple localities. Without such information it is impossible to evaluate the validity of any statistical flood frequency analysis, especially one based only on temporally restricted conventional datasets. Ultimately, the nonstationarity "problem" in flood frequency analysis will have to be viewed in the context of hydroclimatological models for long-period climatic variations. Such models hold the promise of meshing paleoflood studies and other aspects of flood geomorphology with the relatively short time scales of gaged flow records.

Opportunities

Independence, Stationarity, and Homogeneity

The USGS interpretive studies in regional flood frequency analysis present a timely opportunity for technology transfer in reevaluating traditional methods used in flood frequency estimation. Common practice estimates flood frequency by using annual or partial duration series of flood peaks. Various statistical estimators have been developed, all of which assume that historical flood peak series represent a sample of independent realizations drawn from a stationary, homogeneous stochastic process. Emerging research provides growing evidence of deviations from these assumptions. A number of natural and anthropogenic sources of variability may result in nonstationarity in flood peak series.
Secular trends in climatic variability can be demonstrated using traditional gaged records (Bradley, 1998) as well as through the use of dendrochronology, geomorphic interpretation of sediment deposits, and other paleohydrologic techniques (Jarrett, 1991; Enzel et al., 1993; Salas et al., 1994). The gaged record of flood peaks at a site may frequently contain flood events caused by several very different mechanisms, such as rainfall and snowmelt, or tropical and nontropical storm systems (Hirschfield and Wilson, 1960) following varying antecedent watershed conditions. Similarly, systematic climate variability such as the El Niño/Southern Oscillation can result in predictable transitions between persistent climate states that exhibit significant and fundamentally different hydrologic characteristics (Dracup and Kahya, 1994; Canadian Electricity Association, 1994). The distribution of flood peaks under these circumstances may better be described as a mixture distribution of two (or more) distinct populations (Hirschfield and Wilson, 1960; Waylen and Woo, 1982) rather than a
single homogeneous distribution. Systematic hydrologic variability (Landwehr and Slack, 1990; Cayan and Webb, 1992) can produce interannual persistence in hydrologic extremes that significantly alters flood frequencies and flood risks estimated from historical flood peak series (Webb and Betancourt, 1990). Historical gage records and paleohydrologic techniques can similarly be used to identify persistence in hydrologic extremes affecting flood frequency estimation. Booy and Morgan (1985) demonstrated interannual persistence (clustering) of flood peaks in Canadian streamflow records, using a Monte Carlo test for the Hurst phenomenon. Accounting for serial correlation in gaged flood peaks, the estimated recurrence interval of the design discharge for the city of Winnipeg's levee system decreased from 169 years to 70 years. The USGS's activities in regional flood frequency studies represent an opportunity to reevaluate traditional estimates of flood risk, accounting for improved understanding of the spatial and temporal characteristics of hydrologic extremes.

Alternate Methods for Estimation and Regionalization

The Survey's technical and institutional expertise may provide the opportunity to lead a systematic reevaluation of statistical methods for flood frequency analysis. Significant research advances have been made since the adoption of the recommendations in Bulletin 17-B (USIACWD, 1982). The Survey is strategically positioned to coordinate a systematic reevaluation of "standard" flood frequency techniques commonly used in the National Flood Insurance Program and in the federal design and planning process by the Corps of Engineers and other agencies. Along with improved techniques for flood quantile estimation, the explicit computation of quantitative confidence limits represents essential information for planning, resource allocation, and risk-based decision making.
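The effect of the persistence found by Booy and Morgan can be illustrated with a small Monte Carlo experiment in the same spirit: holding the single-year (marginal) flood risk fixed, lag-one serial correlation clusters exceedances into wet episodes and so changes the chance that a design level is exceeded at least once over a planning horizon. A hedged sketch (all parameter values are illustrative, not taken from the Winnipeg study):

```python
import math
import random

def prob_any_exceedance(threshold, rho, n_years=100, n_sims=4000, seed=0):
    """Monte Carlo sketch: probability of at least one exceedance of a
    fixed design level within n_years when standardized log-space annual
    peaks follow an AR(1) process with lag-one correlation rho. The
    marginal distribution is standard normal for any rho, so rho changes
    only the clustering of floods, not the single-year risk."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        z = rng.gauss(0.0, 1.0)
        exceeded = z > threshold
        for _ in range(n_years - 1):
            # AR(1) step that preserves the standard-normal marginal
            z = rho * z + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
            if z > threshold:
                exceeded = True
        hits += exceeded
    return hits / n_sims
```

With positive rho the exceedance-free simulations become more common, which is one face of the same clustering that shortened the estimated recurrence interval of Winnipeg's design discharge.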
Extreme Events and Risk-Based Decision Making

Extreme event information commonly developed through interpretive studies (such as regional flood frequency analyses) also provides the information base to support improved risk-based decision making. For example, the observation of elevation effects in Rocky Mountain flood peaks allowed Jarrett (1993) to reevaluate flood frequencies, recognizing the significance of the linkages between the spatial patterns of orographic rainfall in regions of complex terrain and the associated flooding mechanisms (Barros and Lettenmaier, 1994; Barros and Kuligowski, 1998). This observation, supported by geomorphic paleoflood evidence (Grimm et al., 1995), dramatically lowered the estimate of extreme flood magnitudes in these high-elevation watersheds. While hydrologically interesting, this analysis clearly has direct implications for both the design and the operation of water resource systems in these watersheds (see Box 4.1). Similar integration
BOX 4.1 Elevation Limit to Rainfall Flooding

Prior to recent USGS studies it was widely believed that large rainfall floods could occur at any elevation in the Rocky Mountains. One such rainfall flood was the Big Thompson flood of 1976, which killed 140 people and caused over $35 million in damages. For more than a decade USGS hydrologists have documented the size of many contemporary and prehistoric floods on rivers throughout the Rocky Mountains. Analysis of this large, detailed dataset by USGS hydrologists indicates that there is an elevation limit to rainfall flooding. Floods in river basins above about 5,500 feet in the Northern Rocky Mountains (or 7,500 feet in the Southern Rocky Mountains) are comparatively small and result from snowmelt rather than high-intensity rainfall. This research has led to substantially lower estimates of the 100-year flood (and the probable maximum flood) in high-altitude basins throughout the Rocky Mountains. These results have important implications for floodplain management, implementation of flood warning systems, and the design of hydraulic structures in floodplains. Dam safety guidelines developed before this USGS research suggested that many dams in the Rocky Mountains were underdesigned. The cost of rebuilding spillways throughout Colorado to meet these guidelines was expected to be $184 million. One of the dams thought to have an underdesigned spillway is Olympus Dam in Estes Park, Colorado, located 7,500 feet above sea level. The spillway is designed for a flood of 22,000 cubic feet per second (cfs). The guidelines for spillways in this area would have required a redesign to accommodate a flood of 84,000 cfs. The USGS research showed that no floods in this basin had exceeded 5,000 cfs in the past 10,000 years.
Thus, the USGS was able to demonstrate that the costly spillway reconstruction at this high altitude was not necessary, and spillway criteria for the Rocky Mountain region are being rewritten to reflect these findings. These downward revisions of flood risk mean that spillway modifications will not be necessary at some dams and that reservoir storage set aside for flood control can be used for municipal, industrial, irrigation, recreation, or habitat-related uses. Thus, the USGS findings result not only in a savings of redesign costs but also in more beneficial storage in existing reservoirs.

of risk-based decision making and flood frequency analysis is observed in the ongoing discussion over the appropriate use of expected probability estimators and maximum likelihood estimators (NRC, 1995; Stedinger, 1996) to estimate flood exceedance probabilities and the likelihood of future flood damage.

Cumulative Impacts of Watershed Activities

Changes in catchment hydrologic response owing to land clearance (Bosch and Hewlett, 1982) and urbanization (Seaburn, 1969; Wallace, 1971; Ferguson
and Suckling, 1991) have been well documented. The resulting nonstationarity in streamflow challenges the traditional assumptions and the use of historical streamflow records in flood frequency estimation. The regulation of large river systems (Collier et al., 1996) alters the flow duration characteristics of streams through both the storage and release of runoff and the resulting changes in channel form (Pizzuto, 1994). Nonstationarity from development activities represents a challenging opportunity to account for human-induced changes in estimating flood frequency. The Hydroclimatic Data Network (Slack and Landwehr, 1992) represents a core set of stream gages selected to provide a representative database of unregulated streamflow throughout the nation. The ways in which land clearance and development affect existing streamflow records show the need for maintaining continuous records on relatively unregulated streams as well as for refining techniques for flood frequency analysis that account for land-use changes and other sources of nonstationarity.

Paleohydrology

Recent developments in the field of paleoflood hydrology, and statistical techniques for making use of such information, provide new and often unappreciated opportunities for improving estimates of flood frequency relations at gaged and ungaged sites (O'Connor et al., 1994; Enzel et al., 1996; Ely, 1997; Baker, 1998). The USGS has over the last several decades collected streamflow records at thousands of sites in the United States. This information provides a vast resource for flood frequency and other investigations. Still, when a frequency analysis is required at a particular site, often no streamflow gage is located nearby, requiring the use of regional estimates.
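One way a long paleoflood record can sharpen a site estimate is through censored-data reasoning: geologic evidence may establish how many times a threshold discharge was exceeded over a long period, and those exceedances can be pooled with the gaged record. A deliberately crude binomial pooling sketch (illustrative only; the maximum-likelihood machinery of the paleoflood literature handles uncertain thresholds and dates far more carefully):

```python
def exceedance_estimate(gaged_exceed, gaged_years, paleo_exceed, paleo_years):
    """Crude sketch: pooled binomial estimate of the annual probability
    that a threshold discharge is exceeded, combining a short gaged
    record with a long paleoflood record of threshold exceedances.
    All counts are assumed complete and accurately dated."""
    return (gaged_exceed + paleo_exceed) / (gaged_years + paleo_years)
```

For example, two exceedances in 50 gaged years alone suggest a 1-in-25-year event, while three additional exceedances in a 1,000-year paleoflood record pull the pooled estimate closer to 1 in 200, illustrating how strongly long records can constrain rare-event frequencies.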
In fact, even when a gaged record is available, streamflow records are generally shorter than optimal for estimating the 50- or 100-year flood employed in floodplain planning and the design of bridges, culverts, and flood control structures. The availability of better statistical techniques has invigorated the development of methods for obtaining botanical and physical paleoflood records and has suggested new and revised procedures and approaches for field investigations of historical, botanical, and physical paleoflood information (Stedinger and Baker, 1987). The procedures for collecting this information in the field need to be better documented and made more widely known. Limitations on the interpretation and accuracy of paleodata need to be carefully documented.

Role of the USGS

Interpretive studies provide consistent databases developed with uniform data collection techniques to support research issues. These efforts can, by design, include measurement of land-use and watershed characteristics to support
current research and future interpretive analyses. The USGS also plays a crucial role in rapidly mobilizing resources to document and study extreme hydrologic events when they occur. This function is vital to the ongoing study of hazards from hydrologic extremes. The USGS interpretive studies serve a national technology transfer function, providing cooperators with access to new techniques that support design and hydrologic risk management. The USGS is well qualified to provide national leadership in the development and application of emerging science in climate variability, improved statistical estimation, hydrologic regionalization, and the impacts of changing land uses on the evaluation of flood risks and risk-based decision making.

Flood/River Gaging

State of the Practice

The USGS is the primary federal agency responsible for supplying hydrologic data to federal, state, and local agencies as well as private users. The USGS operates and maintains more than 85 percent of the nation's stream gaging stations, including 98 percent of those used for real-time river forecasting. The national stream gaging program is more fully described in Wahl et al. (1995). One of the crucial roles of the USGS is to maintain current rating curves for stream gages used in forecasting and management of the nation's water resources. Changes in river cross-section geometry (Moody and Meade, 1990) can alter the stage-discharge relationship used to convert gaged stream heights into estimated discharges. The USGS routinely verifies and updates these rating curves and provides revised ratings to the user community. The agency also assures the quality and accuracy of archived historical streamflow data. During flood events, when scour, erosion, and out-of-bank flows can significantly change the cross-section of rated river channels, the USGS performs direct streamflow measurements to allow forecasters and managers to incorporate rapidly changing conditions in their management activities.
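The stage-to-discharge conversion at the heart of a rating curve is commonly approximated by a power law fit to direct discharge measurements. A minimal sketch (the coefficient values are illustrative; in practice they are refit whenever the channel cross-section changes, for example after scour):

```python
def rating_discharge(stage_ft, gage_zero_ft, coeff, exponent):
    """Sketch of a power-law rating curve, Q = C * (h - h0)**b:
    convert a gaged stage (ft) to an estimated discharge (cfs).
    coeff and exponent are site-specific fitted constants;
    gage_zero_ft (h0) is the stage of effectively zero flow."""
    head = stage_ft - gage_zero_ft
    if head <= 0.0:
        return 0.0  # stage at or below the point of zero flow
    return coeff * head ** exponent
```

The sensitivity of such curves to channel change is exactly why direct measurements during floods, as described above, matter: a shifted cross-section silently changes C, b, and h0.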
For example, during the great flood of 1993 in the upper midwestern United States, over 2,000 field visits to collect supplemental discharge measurements or to check and repair gaging equipment were made by USGS hydrologic technicians (National Weather Service, 1994). Faced with growing demands for hydrologic data in a time of limited resources, the USGS continues to develop and apply techniques intended to help optimize the national stream gage network through the use of new cost-effective measurement technologies and optimal network design (Medina and Tasker, 1985). Extensions of hydrologic regionalization can help quantify the value of adding or discontinuing gaging activities at a particular location. Gaging networks can therefore be designed, analyzed, and modified through the use of
formal mathematical optimization techniques that balance the cost and information content of gaging activities (Moss and Tasker, 1991, 1995; Tasker, 1991b).

Opportunities

While the USGS maintains most of the stream gages in the nation, the National Weather Service, as the federal agency responsible for issuing flood forecasts, relies heavily on the accuracy and reliability of these gages for real-time river forecasting. Since most river gages record river stage rather than discharge, one of the significant sources of error in real-time flood forecasting can be the dynamic changes in mobile-bed channel cross-sections during floods. Current practice addresses this problem through supplemental discharge measurements made during flooding events, when channel changes are believed to be significant. These supplemental measurements are not always possible during the particularly hazardous conditions associated with extreme floods (when they might be most valuable). The USGS expertise in scour and sediment transport, fluvial geomorphology, river hydraulics, and hydraulic measurement technologies may provide the basis for supplementing discharge measurements during floods with real-time, process-based adjustments to gage rating curves.

Multiobjective Network Design

The USGS's expertise in, and application of, optimization techniques for the cost-effective design of stream gaging networks provides a methodological foundation to help balance the competing demands for hydrologic data that are increasingly placed on the national stream gage network. For example, long records of unregulated discharge, such as those of the Hydroclimatic Data Network, provide an invaluable database to support research and risk-based decision making related to climate variability and change. In contrast, flood forecasting to protect lives and property may demand gaging of critical catchments associated with widespread land clearance and development.
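The tension between such competing objectives under a fixed budget can be sketched as a toy selection problem. The weighted-sum greedy heuristic, the weights, and all scores below are purely illustrative, not the Survey's network-analysis methodology (which is the formal optimization described by Moss and Tasker):

```python
def select_gages(candidates, budget):
    """Toy sketch of multiobjective gage-network selection: each candidate
    gage has an annual cost and scores for two competing objectives
    (here labeled flood forecasting vs. climate monitoring). Rank by
    weighted benefit per dollar, then fill the budget greedily."""
    w_flood, w_climate = 0.6, 0.4  # assumed objective weights
    ranked = sorted(
        candidates,
        key=lambda g: (w_flood * g["flood"] + w_climate * g["climate"]) / g["cost"],
        reverse=True,
    )
    chosen, spent = [], 0.0
    for g in ranked:
        if spent + g["cost"] <= budget:
            chosen.append(g["id"])
            spent += g["cost"]
    return chosen
```

Changing the weights shifts which gages survive a budget cut, which is the point: the choice of objective weighting is a policy decision, not a purely technical one.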
Growing demands placed on the national gaging program suggest opportunities to employ multiple-objective optimization in the design and modification of gaging networks. These techniques may be usefully applied to consider network design with mixed sensors (such as continuous and peak stage recorders) as well as tradeoffs between potentially competing demands for limited gaging resources (e.g., flood vs. drought needs, or flood warnings vs. climatic monitoring).

New Measurement Technologies

The National Research Council (1992) has noted the need and opportunity to take advantage of new and emerging technologies to support the national stream gaging program. The National Weather Service's deployment of the WSR-88D
Doppler weather radar was identified as one such timely opportunity. The evolving technologies in dual-polarized radar (see Box 4.2) represent an extension that may ultimately be added to the WSR-88D weather radars. Obtaining data, particularly in an era of funding cutbacks, may require new and innovative instrumentation and monitoring techniques. Recent advances in microprocessors, electronics, and satellite communications are leading to a new generation of reliable, precise, and relatively inexpensive field instrumentation. Improved instrumentation for stream gaging may improve the economy and reliability of the nation's gaging network for flood hazard evaluation. Large-scale remote sensing capabilities, such as NASA's Mission to Planet Earth and defense technology conversions, afford many opportunities to enhance the study of extreme events through cooperative research initiatives and interagency coordination. Existing and anticipated multisensor earth observations from space can document inundated land areas, suspended sediment concentrations, basin characteristics, and even approximate peak discharges for individual extreme flood events. When combined with local gage information and postflood field surveys, such monitoring from space can supplement conventional databases on extreme floods, presumably at greatly reduced cost. In addition to more accurate and cost-effective technology, the diverse needs of cooperators suggest the value of multiple levels of reliability in stream gaging. As resources for data collection, processing, publication, and archiving have

BOX 4.2 Improved Rainfall Estimates with Modern Instruments

Modernization and restructuring of the National Weather Service included the deployment of a national network of WSR-88D Doppler radars, which estimate precipitation rates from radar reflectivity. These estimates of precipitation rates can be integrated over time to yield estimates of precipitation amount.
While the spatial and temporal coverage is excellent, approximately 2 kilometers and 6 minutes, respectively, there are accuracy problems with the precipitation amounts, especially in snow and convective situations. Field experiments with alternating horizontally and vertically polarized radar signals have been successful in assessing the size and shape of falling precipitation particles. This "dual polarized" radar yields vastly improved estimates of rainfall rates. There is a distinct possibility that further testing will lead to an upgrade of the existing WSR-88D Doppler network to include dual polarization capability. The resultant rainfall data have the potential to revolutionize rainfall-based hydrology as we know it. Accurate rainfall estimates would be available over most of the continental United States every 15 to 30 minutes with an average horizontal spacing of 2 kilometers. Runoff and other hydrologic models will require extensive rethinking to take full advantage of this improved dataset.
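The conversion described in Box 4.2, integrating scan-by-scan rain rates into accumulations, is simple in form. A rectangular-rule sketch (the 6-minute default mirrors the WSR-88D scan interval noted above; real processing also handles missing scans, bias correction, and gage blending):

```python
def radar_accumulation(rates_mm_per_hr, interval_minutes=6.0):
    """Sketch: integrate a sequence of radar-estimated precipitation
    rates (mm/hr), one per volume scan, into a total accumulation (mm)
    by assuming each rate holds for the full scan interval."""
    hours_per_scan = interval_minutes / 60.0
    return sum(rate * hours_per_scan for rate in rates_mm_per_hr)
```

Two consecutive scans each reporting 10 mm/hr at a 6-minute interval thus accumulate 2 mm, and the same arithmetic applied on a 2-kilometer grid every few minutes is what would feed the rainfall-runoff models mentioned in the box.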
declined, some users have elected to support only the maintenance of real-time measurement and telemetry capability at existing gage sites, without incurring the additional expense to verify and publish provisional observations. For water managers primarily interested in real-time river stages (e.g., for real-time reservoir operation), this may represent a cost-effective tradeoff. Alternatively, the USGS may want to consider offering cooperators data with lower levels of reliability at lower cost. For example, for water managers primarily interested in accurate measurement and monitoring of low flows, cost-effective telemetered sensors are available that can be inexpensively installed on existing structures such as bridge abutments and river intake structures. While these installations would not be expected to survive extreme flooding, the sensors could be removed seasonally or replaced if lost to flooding. Compared to the expense of constructing and maintaining a powered gage house providing standard levels of reliability, this may offer a cost-effective data source for many users. The integration of real-time data collected by state and municipal data collection networks may similarly represent a useful supplement to the Survey's core gaging network. Synthesis of data from a variety of sources may require little more than agreement on common data standards between agencies.

Role of the USGS

The provision of reliable, high-quality streamflow data remains a fundamental and invaluable role of the USGS. Interpretive studies directed at improving the effectiveness of the nation's stream gage network will continue to support this vital mission as demand for streamflow data grows under continuing resource limitations. The USGS's role in continuously improving the design and operation of the national gaging program will remain essential, even as new technologies and interdisciplinary hydrologic methods are integrated into USGS activities.
River Scour

State of the Practice

Interpretive studies of bridge scour conducted with local and regional cooperators provide a consistent information base to support research on scour and sediment transport. Interpretive studies also provide the mechanism to transfer products of the USGS research effort to other federal agencies, cooperators, and resource managers. Scour studies addressing the risk of failure for bridges and highway structures have a more direct risk management focus than flood frequency estimation. Beyond developing design criteria, the application of rapid screening techniques (Holnbeck and Parrett, 1997) provides a cost-effective means for resource managers to target detailed scour investigations toward structures posing the greatest risks. Interpretive studies also provide case studies of failure
modes for forensic analysis, such as the 1995 failure of the Interstate 5 bridges across Los Gatos Creek in California. These investigations support verification of analytical techniques and suggest additional research priorities.

Opportunities

Products of the National Research Program have suggested criteria for risk-based design of bridge and highway structures that can be more cost effective than traditional design criteria. The USGS's interpretive studies on bridge scour provide the opportunity to integrate risk-based criteria in the design of vulnerable structures. Scour-related failures of bridges and highway structures represent opportunities to better understand the critical factors and potential deficiencies in design criteria and estimated scour risks. The evaluation of failures may be enhanced by distinguishing failures in which the original science or design data were inadequate from failures due to hydrologic changes or errors in estimating design conditions. Adjustments of flood frequency estimates resulting from nonstationarity and persistence can result in adjustments to scour-based designs as well.

Scour on Floodplains

Severe flooding on the San Jacinto River in Houston resulted in scour damage to several major interstate pipelines transporting refined petroleum products along the eastern seaboard. The failures included scour in the broad coastal-plain floodplain resulting from major out-of-bank flows. Estimates of scour depth are used in traditional design practices for pipeline river crossings. Deep scour in the unlithified materials of the coastal floodplain represents a low-probability/high-consequence event, for which protection in the form of deep excavation is not generally considered cost effective. This analysis is constrained by the uncertainty associated with estimating scour risks in floodplains.
The need for risk assessment and cost-effective mitigation of this hazard represents an opportunity to extend the USGS's expertise in flood hydrology, geomorphology, and scour mechanics to develop management criteria for this low-probability/high-consequence hazard.

Nonstationarity

The cumulative impacts of watershed activities can significantly alter the sediment budget and flow frequency characteristics of rivers (Graf et al., 1991), resulting in significant changes in scour risks. Beyond the use of new scour equations, the appropriate design flow selected for rivers experiencing, or expected to experience, significant land-use changes represents a risk-based allocation problem balancing higher design and construction costs against the risk that
design discharge quantiles will change in the future. Interpretive studies of scour and failure in watersheds that have experienced land-use changes provide the opportunity to integrate research on watershed processes and the impacts of land-use change on the risks and costs associated with the failure of structures located in the river channel. The USGS's interdisciplinary expertise in flood frequency analysis, fluvial geomorphology, and sediment transport provides the foundation to identify the limits of current practice and to identify new challenges and research needs in design criteria and risk evaluation. The expertise and measurement technology developed to support the scour program may be directly transferable to the dynamic evaluation of discharge and channel cross-section changes during flooding, supporting the national gaging program as well.

Role of the USGS

Interpretive scour studies support improved scour modeling, prediction, and design criteria for bridges and highway structures. The innovative use of acoustic Doppler profiling dramatically expands the information base on channel dynamics under high-flow conditions, supporting refinement of methods for scour prediction and stable design. Regional interpretive studies conducted in partnership with the states represent both technology transfer mechanisms from the USGS research program to local cooperators and opportunities to verify and evaluate improved management and prediction capability.

Drought Frequency and Hazard

State of the Practice

Frequency estimation techniques developed for floods are also applied to the estimation of low-flow frequencies (Vogel and Kroll, 1990). Regionalization of low-flow characteristics supports low-flow frequency analysis at ungaged sites. The USGS also conducts studies directly related to the management of drought risks (Hirsch, 1978), as well as the potential drought impacts of climate change (Tasker, 1991a, 1993; Tasker et al., 1991; Wolock et al., 1993).
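Low-flow frequency estimation of the kind described above typically works from the annual series of minimum 7-day average flows, the basis of the 7-day, 10-year low flow (7Q10) used in water quality regulation. The sketch below illustrates the mechanics on synthetic data; the log-normal fit is a simplification chosen for brevity, as agency practice commonly fits a log-Pearson Type III distribution to the annual minima.

```python
# Sketch of a 7-day, 10-year low flow (7Q10) estimate. The log-normal fit is a
# simplification; standard practice typically uses log-Pearson Type III.
import math
import random
import statistics

def annual_min_7day(daily_by_year):
    """Annual series of minimum 7-day average flows, one value per year."""
    mins = []
    for flows in daily_by_year:
        avgs = [sum(flows[i:i + 7]) / 7 for i in range(len(flows) - 6)]
        mins.append(min(avgs))
    return mins

def q7_10(daily_by_year):
    """Flow with a 10% chance of not being exceeded by the annual 7-day minimum."""
    logs = [math.log(x) for x in annual_min_7day(daily_by_year)]
    mu, sigma = statistics.mean(logs), statistics.stdev(logs)
    z10 = -1.2816  # standard-normal quantile for p = 0.10
    return math.exp(mu + z10 * sigma)

# Synthetic 30-year daily record (lognormal flows), purely for demonstration.
random.seed(1)
record = [[random.lognormvariate(3.0, 0.5) for _ in range(365)] for _ in range(30)]
low_flow = q7_10(record)
```

Regionalization, as noted above, extends such at-site estimates to ungaged basins, typically by regressing the fitted quantiles on basin characteristics.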
Opportunities

Frequency estimation for low flows raises many of the same issues related to assumptions of stationarity, homogeneity, and independence encountered in flood frequency estimation (Booy and Morgan, 1985; Cayan and Webb, 1992; CEA, 1994). The estimation of extreme drought risks represents a significant opportunity for both technology transfer from the National Research Program and integration of hydrologic science with planning and resource management. While
existing gage records provide data for the application of traditional frequency estimation techniques to drought risk, longer secular trends in streamflow may alter the drought risk estimated from the relatively short database of gaged streamflow. For example, tree ring evidence has suggested the unusually wet nature of the available hydrologic record that served as the basis for the "over-allocation" of the Colorado River in 1922 (Stockton et al., 1985). Dendroclimatology used by Cleaveland and Stahle (1989) similarly indicated significantly greater interannual persistence in hydrologic extremes than that recorded in twentieth-century stream gage records on the White River of Arkansas. A substantial opportunity exists to improve the accuracy and to quantify the uncertainty in estimates of hydrologic extremes under nonstationary conditions.

While the western United States experienced an extreme multiyear drought from 1985 to 1991, no comparable drought has been recorded in historical streamflow records in the humid east. A number of Atlantic slope basins nevertheless experienced persistent droughts in the 1930s and 1960s. Given this experience, prudent water management would attempt to determine the likelihood of a "California" drought in the east. Traditional frequency analysis based on the assumed independence of annual low-flow series cannot provide an adequate response. The USGS's expertise in hydroclimatic linkages and flow frequency analysis represents an opportunity to improve the accuracy and quantify the uncertainty of estimates of extreme drought risks.

The spatial extent of extreme droughts may substantially exceed the boundaries of hydrologically "homogeneous" regions identified with traditional regionalization techniques.
For example, the drought of 1988 was rare not only in the magnitude of the precipitation deficit and low streamflows recorded at sites throughout the nation but also in the spatial extent of drought conditions across the midwest (Kunkel and Angel, 1989; Kunkel et al., 1989). The Great Flood of 1993 was similarly extreme in both the magnitude of precipitation, about 30 inches in 4 months, and the spatial extent of summer flooding (Lott, 1993). These events suggest a varying spatial scale associated with the recurrence interval of extreme hydrologic events that may not be well represented using standard methods of hydrologic regionalization and frequency analysis. Moreover, these extreme events are the least likely to be well represented in our relatively short historical streamflow records. The use of paleohydrology to evaluate the magnitude and frequency of hydrologic extremes, as well as the spatial extent (Jarrett, 1990) of extreme hydrologic events, represents a significant opportunity to integrate traditional regionalization approaches with the emerging understanding of hydroclimatic linkages to improve the estimation of extreme hydrologic risks.

Biological Versus Hydrologic Flow Duration

Frequency-based estimates of low-flow quantiles (such as the 7-day, 10-year low flow, or 7Q10) are regularly used in water quality regulation to derive
wasteload allocations and total maximum daily loads. The U.S. Environmental Protection Agency (EPA) also supports the use of "biological" low-flow quantiles, complementing the familiar statistical low-flow estimates (USEPA, 1990, 1991). Estimation of biological low-flow quantiles has been developed to reflect the continuous recovery periods thought to be required by biological populations recovering from low-flow stress. Heuristic techniques have been developed that use the historical streamflow record to identify flow levels that would empirically achieve nominal frequency-duration criteria. Biological low flows for acute, 1-day/3-year (1B3) and chronic, 4-day/3-year (4B3) conditions are recommended and have been used by EPA (CFR, 1992) in setting toxic discharge standards for aquatic life protection.

The uncertain relationship between biological and hydrologic low flows (e.g., 7Q10) represents an opportunity to clarify management criteria and permit decisions through the integration of the Water Resources Division's expertise in hydrologic frequency analysis and the Biological Resources Division's expertise in habitat requirements for aquatic species. The Survey's strengths in these areas suggest a natural opportunity to integrate research on low-flow frequencies and habitat requirements with the science-based regulatory initiatives in instream flow maintenance and wetland restoration within the EPA and USDA.

Role of the USGS

Drought-related interpretive studies expand the information base supporting research (Liu and Stedinger, 1991) and represent a valued mechanism to transfer research products to resource managers and cooperators (Ludwig and Tasker, 1993). The USGS continues to play an integral role in support of drought management, serving as a source of interdisciplinary hydrologic expertise.
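The heuristic, excursion-counting definition of biologically based flows such as the 4B3 discussed above can be sketched as a search for the highest flow threshold whose 4-day running averages drop below it, on average, no more than once per 3 years. This is a deliberate simplification: EPA's DFLOW procedure applies additional rules for grouping excursions into events and capping their counts, which are omitted here, and the synthetic record is invented for the example.

```python
# Simplified sketch of an empirically derived biological design flow in the
# spirit of EPA's 4B3 (4-day duration, 3-year average excursion frequency).
# Excursions are counted as low-flow days divided by the duration; EPA's DFLOW
# tool applies additional event-grouping and capping rules omitted here.
def running_averages(daily, n):
    return [sum(daily[i:i + n]) / n for i in range(len(daily) - n + 1)]

def biological_design_flow(daily, years, duration=4, recurrence=3):
    """Highest threshold with no more than one excursion per `recurrence` years."""
    avgs = running_averages(daily, duration)
    allowed = years / recurrence               # permitted excursions in the record
    best = min(avgs)
    for q in sorted(set(avgs)):                # raise the threshold while allowed
        days_below = sum(a < q for a in avgs)
        if days_below / duration <= allowed:
            best = q
        else:
            break
    return best

# Nine years of steady flow at 100 units with three brief 4-day dips to 10.
record = [100.0] * (9 * 365)
for start in (100, 1100, 2100):
    record[start:start + 4] = [10.0] * 4
design_flow = biological_design_flow(record, years=9)
```

Comparing such excursion-based thresholds with the statistical 7Q10 for the same record is one concrete way to examine the uncertain relationship between biological and hydrologic low flows noted above.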