Article

On the status of flood frequency analysis. Hydrol Process, 16: 3737-3740

Authors:
To read the full-text of this research, you can request a copy directly from the authors.

No full-text available

Request Full-text Paper PDF

To read the full-text of this research,
you can request a copy directly from the authors.

... The former emerging from the random natural variability of the hydrometeorological systems is inherent and irreducible, whereas the latter originates from the incomplete/insufficient knowledge on the system under study. Epistemic uncertainty arises from various sources such as parameter estimation and model structure (Apel et al., 2004;Kunkel, 2013;Singh and Strupczewski, 2002), which can be reduced by increasing the sample size and improving the knowledge of the process and its driver(s), respectively (Serinaldi, 2009;Van Gelder, 2001). In addition, epistemic uncertainty can also originate from selecting a distribution using statistical means (e.g., goodness-of-fit test), following local guideline, or taking a subjective perspective, as the true distribution is not possibly known a priori (Debele et al., 2017;Salas et al., 2013;Singh and Strupczewski, 2002). ...
... Epistemic uncertainty arises from various sources such as parameter estimation and model structure (Apel et al., 2004;Kunkel, 2013;Singh and Strupczewski, 2002), which can be reduced by increasing the sample size and improving the knowledge of the process and its driver(s), respectively (Serinaldi, 2009;Van Gelder, 2001). In addition, epistemic uncertainty can also originate from selecting a distribution using statistical means (e.g., goodness-of-fit test), following local guideline, or taking a subjective perspective, as the true distribution is not possibly known a priori (Debele et al., 2017;Salas et al., 2013;Singh and Strupczewski, 2002). In the nonstationary HFA, the selection of the model structure is particularly critical (Khaliq et al., 2006;, while in the stationary HFA the ignorance or indolence of the nonstationarity might yield considerable uncertainty. ...
Article
The use of the nonstationary hydrological frequency analysis (HFA) has been prompted when nonstationarity is diagnosed in hydrometeorological data. However, the inconclusive identification of the physical process(es) and driver(s) behind the nonstationarity challenges the identification of an appropriate model structure, and consequently might hinder its reliable implementation. To date, no solid consensus on whether the nonstationary HFA is always superior to the stationary HFA has been reached. Therefore, this paper aimed to advance the understanding of the stationary and nonstationary HFAs under nonstationary scenarios by illustratively comparing their performance in real applications, and examining the effects of the nonstationarity on the stationary HFA through a simulation study, especially from the perspective of the uncertainty. The investigation of the effects of the nonstationarity on the stationary HFA was conducted in two fundamental nonstationary scenarios, namely temporal trends in the mean and variance, in which the degree of nonstationarity was quantifiable and known a priori. The HFAs were conducted using the Particle Filter, a Bayesian filtering technique which was recently employed in the stationary HFA and was further extended for the nonstationary HFA in this paper. The illustrative comparison did not demonstrate a consistent superiority of either HFA approach in terms of both fitting efficiency and uncertainty. This result thus implied that the stationarity HFA could outperform the nonstationary HFA in some cases. Besides, the simulation investigation of the stationary HFA revealed that the increase of nonstationarity degree would lead to the deterioration in the analysis accuracy and the elevation of uncertainty. The uncertainty in the stationary HFA was found to primarily originate from the nonstationarity, while the distribution selection would be the other but secondary source of uncertainty. The results from both the illustrative comparison and the simulation investigation suggested that whether the stationary or nonstationary HFA outperforms the other could be associated with the degree and pattern of nonstationarity. Therefore, it is recommended to consider them when developing a strategic framework to select an appropriate approach to deal with nonstationary hydrometeorological data.
... For example, a flood with 100-year return period has been recommended in France for flood hazard mappings, while flood with 104-year return period considered for design and safety analysis of mega water infrastructures [2]. Moreover, several countries have developed frameworks suggesting which probability distribution should be used for flood frequency analysis, such as the lognormal distribution in China [20], the three-parameter log-Pearson type 3 distribution for the United States of America [21], and the generalized extreme value distribution in the United Kingdom [22,23]. However, despite the importance of investigating the robust flood frequency prediction model for improving the design, as well as for the efficient planning and management of the regional water resource projects, the flood frequency analysis in the region has not received much attention. ...
... The Weibull distribution has two versions of two-parameter and three parameter distributions with the shape parameter (α), scale parameter (β) and location parameter (γ, with γ set to zero yields the two-parameter Weibull distribution). Moreover, the generalized extreme value distribution has been popularly used for flood frequency analyses in the United Kingdom [23]; it has three parameters reflecting on shape (k), the scale (σ), and the location (µ). The extreme value type-II is one of the commonly used probability distributions for modeling extreme events with shape parameter (α), scale parameter (β), and location parameter (γ, γ = 0 yields the two-parameter Frechet distribution). ...
Article
Full-text available
The frequency and intensity of flood quantiles and its attendant damage in agricultural establishments have generated a lot of issues in Ethiopia. Moreover, precise estimates of flood quantiles are needed for efficient design of hydraulic structures; however, quantification of these quantiles in data-scarce regions has been a continuing challenge in hydrologic design. Flood frequency analysis is thus essential to reduce possible flood damage by investigating the most suitable flood prediction model. The annual maximum discharges from six representative stations in the Upper Blue Nile River Basin were fitted to the commonly used nine statistical distributions. This study also assessed the performance evolution of the probability distributions with varying spatial scales, such that three different spatial scales of small-, medium-, and large-scale basins in the Blue Nile River Basin were considered. The performances of the candidate probability distributions were assessed using three goodness-of-fit test statistics, root mean square error, and graphical interpretation approaches to investigate the robust probability distribution for flood frequency analysis over different basin spatial scales. Based on the overall analyses, the generalized extreme value distribution was proven to be a robust model for flood frequency analysis in the study region. The generalized extreme value distribution significantly improved the performance of the flood prediction over different spatial scales. The generalized extreme value flood prediction performance improvement measured in root mean square error varied between 5.84 and 67.91% over other commonly used probability distribution models. Thus, the flood frequency analysis using the generalized extreme value distribution could be essential for the efficient planning and design of hydraulic structures in the Blue Nile River Basin. Furthermore, this study suggests that, in the future, significant efforts should be put to conduct similar flood frequency analyses over the other major river basins of Ethiopia.
... For instance, it is pointed out in Beven (2001) that "several different distributions might give acceptable fits to the data available, all producing different frequency estimates in extrapolation". Statistical models do well in capturing information from past observations and use them for predicting future behavior, but statistics alone might not be enough to explain future flood characteristics (Wood, 1976;Klemeš, 1993;Campbell, 2001;Singh and Strupczewski, 2002;Kidson and Richards, 2005;Stedinger and Griffis, 2011). As pointed out by Klemeš (1993): "If any advance has to be done in flood frequency analysis it better come from more information on the physics of the phenomena, not from more mathematics". ...
... The estimation of flood quantiles depends on the method used for RFFA, so an inefficient method may have errors which could prevent any gaining from using regionalization (Rao et al., 2008). Moreover, many assumptions upon which regionalization methods are based are far from satisfied in most applications (Singh and Strupczewski, 2002). However, it is to be expected that the more homogeneous a region is, the greater the gain in using regional instead of at-site estimation (Cunnane, 1988;Rosbjerg, 2007). ...
Article
Full-text available
The risk of floods is expected to increase due to global warming and population growth. Future floods can be tougher in Central American due to its geographical location which is constantly under threat of hurricanes. In addition, there is a high vulnerability associated with lack of development that makes the region one of the highest flood risk areas on the globe. Aiming to contribute to the prevention and mitigation of natural disasters, the “Research Capacity Building in Nature-Induced Disaster Mitigation in Central America (CANDIM)” project directs efforts to build local research capacity in topics related to natural disasters. Within the CANDIM project the main goal of my PhD research will be to gain an understanding of the flood processes so as to be able to assess future flood scenarios in the short and long term, which are useful for flood forecasting and flood management respectively. The research will be done by means of a field laboratory at the Choluteca River floodplain in Tegucigalpa, the capital city of Honduras. The generality of the developed methods will be tested in other flood-prone areas in the Central American region. Prior to the PhD work, the purpose of this introductory essay is to describe the state of the art on techniques used for flood prediction; this includes hydrology, hydraulics and data analysis. As such, this essay also helps to establish guidelines for future specific research directions in my PhD project. This essay also discusses how flood-prediction techniques interact with other sub-activities inherent to flood-risk-management and early-warning systems, thus possibilities for interdisciplinary collaborations are proposed. The survey begins with the study of the flood-risk-management system. Then the topic of flood prediction for flood hazard assessment and for flood forecasting is covered, including commonly used modeling techniques such as rainfall-runoff modeling, flow routing, flood frequency analysis and probable maximum flood. The essay continues with a description of the physical characteristics and data availability of the field laboratory to be used in this project. Following the literature review, future research and interdisciplinary collaboration possibilities are proposed; the essay concludes with a plan for completion of the PhD project.
... However, when design is based on estimations of fitted distributions, there is a possibility that the modeling and sampling errors effect design floods (Alila and Mtiraoui, 2002). Many distributions have been suggested by researchers for fitting flood extremes data (e.g., Cunnane, 1989;Blazkova and Beven, 1997;Singh and Strupczewski, 2002;Saf, 2008). Use of probability distributions in different situations depends on the properties of distributions (Haan, 1994) and the hydroclimatic regime of the region. ...
... Use of probability distributions in different situations depends on the properties of distributions (Haan, 1994) and the hydroclimatic regime of the region. Examples of most frequently used distributions are the Log-Pearson Type III distribution in the US, the generalized extreme value distribution in Great Britain and the Log-Normal distribution in China (Singh and Strupczewski, 2002). There are more studies on flood frequency distributions such as Wallis (1988); Vogel et al., (1993);Haktanir (1991); Mutua (1994); Abdul Karim and Chowdhury (1995). ...
Conference Paper
Full-text available
Coastal regions are highly vulnerable to flood disasters. Due to climate change impacts on storms' frequency and severity, studies on coastal planning and adaptation to the hazards of sea level rise and storm surge are of increasing concern. Evaluation of flood inundation depth in different regions can be useful in mitigating flood hazards. This study aims to provide a methodology for inundation depth analysis, considering both inland and coastal flooding. Lower Manhattan in New York City has been considered as the case study. New York as the financial center of the world has a high population density, complex lifeline, and interconnected water bodies with high vulnerability to flood and wind hazard. For this purpose, GSSHA (grid surface and subsurface hydrological model) developed by USACOE is used. GSSHA uses two-step explicit finite volume scheme to route water for overland flow. In this study, this model is used for evaluation of inundation depth and for evacuation zone mapping. The results show a potential shift in the current evacuation maps. This could be attributed to climate change and climate variability of the storms characteristics in the study region.
... Considering this vulnerability, frequency analysis can be performed on the extreme water levels to estimate floods with different return periods based on estimations of fitted distributions. All around the world, manydistributions have been suggested for fitting extreme data including floods (e.g., Cunnane, 1989;Blazkova and Beven, 1997;Singh and Strupczewski, 2002;Saf, 2008). Using the proper flood frequency, an optimum estimation of extreme water level in different recurrence intervals can be achieved.Log-Pearson Type III distribution is one of the most frequently used distributions in the USA (Singh and Strupczewski, 2002).. ...
... All around the world, manydistributions have been suggested for fitting extreme data including floods (e.g., Cunnane, 1989;Blazkova and Beven, 1997;Singh and Strupczewski, 2002;Saf, 2008). Using the proper flood frequency, an optimum estimation of extreme water level in different recurrence intervals can be achieved.Log-Pearson Type III distribution is one of the most frequently used distributions in the USA (Singh and Strupczewski, 2002).. ...
Conference Paper
Destruction caused by recent hurricanes on natural and built environment along New York City coastlines emphasize the significance of delineating coastal floodplain and mapping the potential area of inundation followed by coastal floods. Floodplain delineation and appropriate planning in these areas could help mitigate some flood damage. In this study, a geographic information system (GIS) based method is proposed and applied for floodplain delineation in the coastal part of the Bronx River watershed, Bronx, New York City. Extreme water level in recurrence intervals of 2, 5, 10, 25, 50, and 100 years is determined by frequency analysis of the extreme sea levels in the region provided by NOAA (National Oceanic and Atmospheric Administration). To determine coastal floodplain followed by increasing water level, HEC-RAS model is used. The GIS (geographic information system) capabilities are used for illustration of inundated regions. The results of the study show that the extent of floodplain perpendicular to the river direction can be increased by up to 500 ft regarding the water level with 100-year return period. Taking into account these results in determining land use of the floodplain beside the river enables mitigation of the damages associated with coastal flooding caused by extreme water levels.
... The distributions suggested for fitting flood extremes data have been many (Singh and Strupczewski, 2002). Oztekin et al. (2007) applied parameter estimation methods to a comprehensive list of different distributions. ...
... Different studies were undertaken on distribution selection for flood data all over the world. The three-parameter log-Pearson type 3 distribution is the most frequently used distribution in the USA, whereas the generalized extreme value distribution in Great Britain, the lognormal distribution in China (Singh and Strupczewski, 2002). Several flood distributions have also been studied, for example in USA (Wallis, 1988;Vogel et al., 1993); UK, Australia, Italy Scotland, Turkey and Kenya (Haktanir, 1991;Mutua, 1994;Abdul Karim and Chowdhury, 1995). ...
Article
Full-text available
The paper discusses how Normal, Lognormal, and log-Pearson type 3 distributions were investigated as distributions for modelling at-site annual maximum flood flows using the Hazen, Weibull, and California plotting positions at Ogun-Oshun river basin in Nigeria. All the probability distributions when matched with Weibull plotting position gave similar values near the center of the distribution but varied considerably in the tails. The Weibull plotting position when matched with Normal, Log-normal and Log Pearson Type III probability distributions gave the highest Coefficient of determinations of 0.967, 0.987, and 0.986 respectively. Hazen plotting position gave minimal errors with the RMSE of 6.988, 6.390, and 6.011 for Normal, Log-normal, and Log-Pearson Type III probability distributions respectively. This implies that, predicting statistically using Hazen plotting position, the central tendency of predicted values to deviate from observed flows will be minimal for the period under consideration. Minimum absolute differences of 2.3516 and 0.5763 at 25-and 50-year return periods were obtained under the Log-Pearson Type III distribution when matched with Weibull plotting position, while an absolute difference of 0.2338 at 100-year return period was obtained under the Log-Pearson Type III distribution when matched with California plotting position. Comparing the probability distributions, Log-Pearson Type III distribution with the least absolute differences for all the plotting positions is the best distribution among the three for Ona River under Ogun-osun river basin study location.
... However, when designing is based on estimations of fitted distributions, there is this possibility that the modeling and sampling errors affect estimations of design floods (Alila and Mtiraoui 2002). Many probability distributions have been used by different investigators for extreme flood data (e.g., Cunnane 1989;Blazkova and Beven 1997;Singh and Strupczewski 2002;Saf 2008). Use of probability distributions in different situations depends on the hydro-climatic regime of the region (Haan 1994). ...
... Use of probability distributions in different situations depends on the hydro-climatic regime of the region (Haan 1994). Examples of the most frequently used distributions are Log-Pearson Type III distribution in US, Generalized Extreme Value distribution in Great Britain and Log-Normal distribution in China (Singh and Strupczewski 2002). There are more studies on flood frequency distributions in different parts of the worlds (Wallis 1988;Vogel et al. 1993;Haktanir 1991;Mutua 1994;Abdul-Karim and Chowdhury 1995). ...
Article
Full-text available
Climate change has resulted in sea level rise and increasing frequency of extreme storm events around the world. This has intensified flood damage especially in coastal regions. In this study, a methodology is developed to analyze the impacts of climate change on sea level changes in the coastal regions utilizing an artificial neural network model. For simulation of annual extreme sea level, climate signals of Sea Surface Temperature, Sea Level Pressure and SLP gradient of the study region and some characteristic points are used as predictors. To select the best set of predictors as neural network model input, feature selection methods of MRMR (Minimum Redundancy Maximum Relevance) and MI (Mutual Information) are used. Future values of the selected predictors under greenhouse gas emission scenarios of B1, A1B and A2 are used in the developed neural network model to project water level for the next 100 years. Sea levels with different recurrence intervals are determined using frequency analysis of historical and projected water level as well, and the impact of climate change in extreme sea level is investigated. The developed methodology is applied to New York City to determine the coastal region vulnerability to water level changes. The results of this study show remarkable increase in sea level in the New York City, which is an indicative of coastal areas vulnerability and the need to take strategic actions in dealing with climate change.
... Four main steps are required in order to carry out a comprehensive hydrological FA: i) de- 136 scriptive and exploratory analysis and outlier detection, ii) verification of FA assumptions, i.e. 137 stationarity, homogeneity and independence, iii) modeling and estimation and iv) evaluation and 138 analysis of the risk. The first step (i) is commonly carried out in univariate hydrological FA as 139 pointed out, e.g. by Rao and Hamed (2000), Kite (1988) and Stedinger et al. (1993) whereas in 140 the multivariate framework it was investigated recently by Chebana and Ouarda (2011b). Con- 141 trary to the univariate setting, exploratory analysis in the multivariate and functional settings is 142 not straightforward and requires more efforts. ...
... Univariate outliers are well defined and 318 their detection is straightforward (e.g. Hosking and Wallis, 1997; Rao and Hamed, 2000). This 319 topic is also relatively well developed in the multivariate setting (e.g. ...
Article
Full-text available
The prevention of flood risks and the effective planning and management of water resources require river flows to be continuously measured and analyzed at a number of stations. For a given station, a hydrograph can be obtained as a graphical representation of the temporal variation of flow over a period of time. The information provided by the hydrograph is essential to determine the severity of extreme events and their frequencies. A flood hydrograph is commonly characterized by its peak, volume, and duration. Traditional hydrological frequency analysis (FA) approaches focused separately on each of these features in a univariate context. Recent multivariate approaches considered these features jointly in order to take into account their dependence structure. However, all these approaches are based on the analysis of a number of characteristics and do not make use of the full information content of the hydrograph. The objective of the present work is to propose a new framework for FA using the hydrographs as curves: functional data. In this context, the whole hydrograph is considered as one infinite-dimensional observation. This context allows us to provide more effective and efficient estimates of the risk associated with extreme events. The proposed approach contributes to addressing the problem of lack of data commonly encountered in hydrology by fully employing all the information contained in the hydrographs. A number of functional data analysis tools are introduced and adapted to flood FA with a focus on exploratory analysis as a first stage toward a complete functional flood FA. These methods, including data visualization, location and scale measures, principal component analysis, and outlier detection, are illustrated in a real-world flood analysis case study from the province of Quebec, Canada.
... Several standard frequency distributions have been extensively studied in the statistical analysis of hydrologic data. Physical processes which generate extreme events are rarely considered for the choice of the model Singh and Strupczewski, 2002). The selection of the most appropriate distribution of annual maximum or peaks over threshold series has received widespread attention. ...
... It turns out that many seemingly small effects in the generative process of the Lognormal distribution can lead to a power-law tail instead (Champernowne, 1953;Mandelbrot, 1997Mandelbrot, , 2003Turcotte, 1997). Singh and Strupczewski (2002) noted that great strides in hydrologic data collection and conceptualization of the processes surrounding floods have been made. However, there is no consideration of the hydrologic processes in the frequency analysis and for model selection. ...
Article
The study of the tail behaviour of extreme event distributions is important in several fields such as hydrology, finance, and telecommunications. Based on two classifications and five graphical criteria, this paper presents a practical procedure to select the class of distributions that provides the best fit to a dataset, especially for the right tail (large extreme events). Some numerical illustrations show that, almost all graphical tools allow discriminating between the regularly varying class and the sub-exponential class of distributions and lead to coherent conclusions.
... Warnings "to recognize the nature of the physical processes involved and their limitations in connection with the use of statistical methods" were also raised by Horton (1931). One of the more pressing problems in FFA considered left unresolved at the end of the twentieth century (e.g., Bobée and Rasmussen 1995;Singh and Strupczewski 2002) regards the need of better integrating the knowledge of hydrological processes within the FFA methods used for engineering problems in order to reduce the gap between practice and research in hydrology. ...
Article
Full-text available
The Two-Component Extreme Value (TCEV) distribution is traditionally known as the exact distribution of extremes arising from Poissonian occurrence of a mixture of two exponential exceedances. In some regions, flood frequency is affected by low-frequency (decadal) climate fluctuations resulting in wet and dry epochs. We extend the exact distribution of extremes approach to such regions to show that the TCEV arises as the distribution of annual maximum floods for Poissonian occurrences and (at least two) exponential exceedances. A case study using coastal basins in Queensland and New South Wales (Australia) affected by low-frequency climate variability, shows that the TCEV produces good fits to the marginal distribution over the entire range of observed values without the explicit need to resort to climate covariates and removal of potentially influential low values. Moreover, the TCEV reproduces the observed dog-leg, a key signature of different flood generation processes. A literature review shows that the assumptions underpinning the TCEV are conceptually consistent with available evidence on climate and flood mechanisms in these basins. We provide an extended domain of the TCEV distribution in the L-moment ratio diagram to account for the wider range of parameter values encountered in the case study and show that for all basins, L-skew and L-kurtosis fall within the extended domain of the TCEV.
... Based on their probability distributions, these methods calculate the average return period to extreme streamflows based on their probability distributions. In classical flood frequency analysis (FFA) and using extreme value theory, by fitting an estimated probability distribution to the observed streamflow values, the probability distribution of the maximum discharges or extreme floods is defined (Fréchet 1927;Gumbel 1958;Khaleghi and Varvani 2018;Singh and Strupczewski 2002;Merz and Bloschl 2008). The FFA methods have some limitations, the main disadvantage of these models is that most do not allow the generation of temperature time series, which is essential in predicting how snow melts. ...
Article
Full-text available
Prediction of rainfall-runoff process, peak discharges, and finally flood hydrograph is essential for flood risk management and river engineering projects. In most previous studies in this field, the precipitation rates have been entered into the models without considering seasonal and monthly distribution. In this study, the daily precipitation data of 144 climatology stations in Iran were used to evaluate the seasonal and monthly pattern of flood-causing precipitation. Then, by determining the rainy seasons and seasonal fit of precipitation with a probabilistic model and using regional precipitation, a semi-distributed conceptual model of rainfall-runoff (MORDOR-SD) was trained and validated using the observed discharge data. Flood prediction was performed using climatic data, modeling of hydrological conditions, and extreme flow data with high performance. According to the results, the Nash–Sutcliffe and Kling–Gupta coefficients were 0.69 and 0.82 for the mean daily streamflow, 0.98 and 0.98 for the seasonal streamflow, 0.98 and 0.94 for the maximum discharges, and 0.57 and 0.78 for low flows, respectively. Moreover, the maximum daily discharges in different return periods were estimated using the results of the MORDOR-SD model, considering the probability distribution function of the probabilistic model of central precipitation (MEWP), the probabilistic model of adjacent precipitation, and probability distribution function of the previous precipitation. Finally, the extreme flows were predicted and compared using different methods including the SCHADEX, regional flood analysis, GRADEX, and AGREGEE. The results showed that the methods GRADEX, AGREGEE, and SCHADEX have the highest performance in predicting extreme floods, respectively.
... However, in practice applications, frequency analysis calculation remains the most common and effective method of determining the design flood. It is a critical basis for the design of hydraulic facilities and flood control management measures, especially within the framework of ongoing climate change, and is essential for identifying extreme values with a specified likelihood of occurrence [10][11]. Hydrologic frequency analysis began about 1880~1890, Herschel and Rafter in the United States first applied the frequency curve (then called the duration curve); in 1896, Horton applied the frequency analysis method to the runoff study; in 1913~1914, Fuller and Hazen published papers describing the application of the frequency method; in 1921, Hazen proposed the use of logarithmic lattice probability paper and began fitting lines on it, which was the first lognormal distribution application [12]. ...
Article
Full-text available
Global warming is causing dramatic climate change, leading to rainfall that triggers a more severe risk of flooding. The conventional moment method and linear moment method’s basic theories for the design of flood frequency analyses were introduced and compared for frequency analysis of rainfall. The rainfall data from 149 long-series representative rainfall stations in Jiangsu Province were processed using the random forest (RF) algorithm to address the missing data encountered during the actual rainfall monitoring process. The frequency was calculated using conventional and linear moment methods, and the differences and advantages of the two methods were analyzed by comparing the calculation results of different sites under each method. The results show that under low design frequencies, both the conventional moment and linear moment methods exhibit minimal errors, rendering them suitable for calculating design rainfall. The linear moment method outperforms the conventional moment method in terms of the unbiasedness of the estimation process and for very large values, and that the parameters estimated by the linear moment method are more accurate. In practical hydrological frequency calculations, different computation methods can be chosen according to specific needs to enhance calculation accuracy.
... These forecasts, although brief, can still offer an opportunity to mitigate the effects of flood events. Nevertheless, the unreliability of meteorological forecasts has resulted in numerous false alarms, causing people to no longer take these predictions seriously [19]. As a consequence of the inaccuracies in flood forecasting using rainfall data, statistical methods such as Normal, Extreme Value Type I, Log Normal, Log Pearson Type III, and others have been employed to predict floods. ...
Article
Full-text available
Hydraulic structures like weirs, dams, spillways, and bridges require precise estimation of flood peaks at the intended return period in order to be planned, built, and maintained. In this paper, the findings of a study conducted on the Robigumoro River and the flow measurements taken are presented. The flood frequency analysis of the Robigumoro River was performed using the Gumbel distribution, which is a probability distribution commonly used for modeling river flows. This analysis is crucial as it aims to safeguard the lives and properties located downstream from the catchment area. The Gumbel distribution was employed to model the highest annual river flow over a span of 20 years (1990-2009). The investigation was carried out by the Ethiopian water and energy office, Abay Basin Development Authority. The Robigumoro River's maximum annual discharge over a 20-year period (1990-2009) was modeled using Gumbel distribution technique. From the trend line equation, R 2 value of 0.935 which shows that Gumbel's distribution is suitable for predicting expected flow in the river. It can be concluded that the Gumbel distribution can accurately forecast expected river flow. The flood peak values were calculated using the same procedure for various return times. This helps with storm management in the research region. The estimated discharges obtained using the Gumbel's distribution and return periods (T) of 2 years, 10 years, 50 years, 100 years, 150 years, 200 years, 300 years and 400 years are 177.327m 3 /s, 320.784m 3 /s, 446.553m 3 /s, 499.722m 3 /s, 530.727m 3 /s, 552.698m 3 /s, 583.38m 3 /s, and 605.577m 3 /s respectively. The accuracy of flood forecasts in the basin indicates their potential use in various applications such as the design of crucial hydraulic structures, river reach planning, construction of bridges, and conservation efforts for Robigumoro watershed.
... Floods historically had a special place in the development of hydrology, mainly because their statistical (frequency) analysis plays an essential role in designing civil engineering structures (Singh and Strupczewski 2002;Rogger et al., 2012a;Zhu et al., 2018;Mishra et al. 2022). It can be argued that both hydrology and water engineering have developed (coevolved) with hydrological extremes when dealing with changing flood risk over time for human societies (Di Baldassarre et al. 2017), a topic particularly relevant when dealing with the threat of changes in flooding dynamics, which has become an essential topic of discussion in recent times (Blöschl et al. 2015;Berghuijs et al. 2017;Sharma, Wasko, and Lettenmaier 2018;Blöschl et al. 2019). ...
Article
Flood frequency analysis lies at the core of hydrology and water engineering as one of the most required estimates for water planning and design of hydraulic structures. For ungauged basins, where no information is available, various flood regionalisation techniques have varying degrees of complexity and resulting performance, depending on the study's goal, the region analysed, and the information available. This study evaluates the use of hydrological models for flood regionalisation in Chile, using 1) A large sample dataset of 101 catchments; 2) the continuous simulation approach with the GR4J model; 3) the leave-one-out strategy for performance testing; and, 4) two regionalisation methods: Nearest neighbour (NN) and physical similarity, together with several alternative objective functions for calibration purposes and regionalisation strategy (in all cases adopting a single criterion, single variable and determinist approach for the parameter’s selection). Our results showed that performance (both in calibration–validation and regionalisation) is highly variable (in terms of reproducing the runoff hydrograph and flood statistics), depending on the catchment’s aridity (e.g., around 66–82% of catchments with NSE above 0 in humid regions but it severely drops to 12–44% of catchments with NSE above 0 when evaluating arid catchments). We also found that flood-specific calibration strategies produce better results for floods but poorer performance in runoff hydrograph reproduction. Finally, we highlight that our regionalisation results were in close agreement with those from one of the currently recommended methods by Chilean engineering for flood regionalisation. This is particularly promising, considering that the continuous simulation approach gives access to the complete time series and not only flood statistics. We end this manuscript by discussing several sources of uncertainty, hoping that these can be accounted for in future studies.
... This is a dangerous data manipulation because physically dependent variables, such as flow depth and velocity, are considered to randomly vary in a wide range of possible combinations among all the bridge sites that form the group under consideration, introducing uncertainty that is higher than reasonably expected. Flood estimation has historically been a big issue in the hydrological community (Singh and Strupczewski 2002;Zaghloul et al. 2020). In this context, flood prediction in ungauged basins (PUB) encompasses higher uncertainty (in comparison with gauged ones) because of the lack of data available (e.g., Hrachowitz et al. 2013). ...
... Additional studies have evaluated the suitability of selecting distributional alternatives for flood frequency analysis, including [39][40][41][42][43]. Given findings reported by the World Meteorological Organization (WMO) (1989) wherein a survey was conducted and concluded that log-normal distribution is one of the widely used distributions for flood frequency analysis [44], this method was selected for flood frequency analyses in this study. ...
Article
Full-text available
Hydrologic drought is a frequent phenomenon in the transboundary Kabul River Basin (KRB), the vital resource shared between the two nations of Afghanistan and Pakistan. While the KRB has vast water resources, these resources are subject to extreme hydrologic events and, as a result, are not adequately managed to deal with the stress during drought conditions in the transboundary setting with no formal agreement or treaty. Rapid population growth and increases in agricultural land will require balanced water distribution to meet the array of needs. The Soil and Water Assessment Tool (SWAT) is used to evaluate distribution options for flow frequencies under existing and proposed large dams in the headwaters of the KRB. The calibrated SWAT streamflow results are employed for statistical analyses of the Standardized Streamflow Index (SSI) and Annual Cumulative Deficit Volume (ACDV) to investigate hydrologic drought time series and identify the role of proposed dams to be used for drought mitigation. Based on the SSI, proposed dams can provide additional storage that will partially address hydrologic droughts in the future. At the same time, restrictions on agricultural land expansion and water intakes are other measures to facilitate balanced water resource availability. This study discusses the intricacies of transboundary conflict and cooperation, water rights, and drought risk management; as well, recommendations for a KRB transboundary Drought Task Force (DTF) between Afghanistan and Pakistan are provided, to develop a science-based policy for using the stored waters in large dams for drought relief, fairly and transparency.
... Studies on the FFA is mainly based on certain popular probability frequency distribution functions, such as Extreme Value distribution (EV), Gumbel's Extreme Value distribution (GEV), Log-Pearson Type III, log-normal (LN), etc. [3]. Careful investigation on these methods has shown that suitability of frequency distribution functions varies with variation in a geographical area, such as GEV in Britain [39], LN distribution in China [40] and LPT-3 distribution in the USA [41,42]. Beside, these analyses can provide only some statistical values, where spatial aspect of flood is absent. ...
Article
Full-text available
Floods are one of the major concerns in the world today. The lower reaches of the river coming from the western side of West Bengal are often affected by floods. Thereby estimation and prediction of flood susceptibility in the light of climate change have become an urgent need for flood mitigation and is also the objective of this study. The historical floods (1978–2018) of the monsoon-dominated lower Dwarkeswar River, as well as the possibility of future floods (2020–2075), were investigated applying peak flow daily data. The possibilities of future flow and floods were estimated using rainfall data from MIROC5 of CMIP5 Global Circulation Model (GCM). Besides, four extreme value distribution functions like log-normal (LN), Log-Pearson Type III (LPT-3), Gumbel’s extreme value distribution (EV-I) and extreme value distribution-III (EV-III) were applied with different recurrence interval periods to estimate its probability of occurrences. The flood susceptibility maps were analyzed in HEC-RAS Rain-on-grid model and validated with Receiver Operating Characteristic (ROC) curve. The result shows that Log-Pearson-Type-III can be very helpful to deal with flood frequency analysis with minimum value in Kolmogorov–Smirnov (K–S = 0.11676), Anderson–Darling (A–D = 0.55361) and Chi-squared test (0.909) and highest peak discharge 101.9, 844.9, 1322.5, 1946.2, 2387.9 and 2684.3 cubic metres can be observed for 1.5, 5, 10, 25, 50 and 75 years of return period. Weibull’s method of flood susceptibility mapping is more helpful for assessing the vulnerable areas with the highest area under curve value of 0.885. All the applied models of flood susceptibility, as well as the GCM model, are showing an increasing tendency of annual peak discharge and flood vulnerability. Therefore, this study can assist the planners to take the necessary preventive measures to combat floods.
... The advantage of the parametric model is related to the robustness of the conclusions even for a misspecified model (van den Goorbergh et al., 2005). In particular for the hydraulic design of major structures, parametric models are preferred to nonparametric ones as reported by Singh and Strupczewski (2002). Note that in Patton (2003Patton ( , 2004 and Fermanian et al. (2004), the variation is taken into account using conditional copula whereas Dias and Embrechts (2010) considered copulas in multivariate time series analysis. ...
Article
To study hydrological events, such as floods and droughts, frequency analysis (FA) techniques are commonly employed. FA relies on some assumptions, especially, the stationarity of the data series. However, the stationarity assumption is not always fulfilled for a variety of reasons such as climate change and human activities. Thus, it is essential to check the stationarity or we should develop models that take into account the non-stationarity in a new risk assessment framework. On the other hand, a majority of hydrological phenomena are described by a number of correlated characteristics. To model the dependence structure between these hydrological variables, copulas are the most employed tool. Generally in the literature, the multivariate model is assumed to be the same over time even though multivariate stationarity is required. Considering the non-stationarity in the dependence structure is important because when the copula parameter changes, the multivariate quantile curve changes accordingly. Different scenarios can be considered when choosing a multivariate non-stationary model since several variables and a dependence structure are involved. The objective of the present study is to construct a model that integrates simultaneously multivariate and non-stationarity aspects along with hypothesis testing. For the copula part, we consider versions called Dynamic copulas and series of association measures are obtained through rolling windows of the corresponding series. Adapted versions of the AIC criterion are employed to select the final model (margins and copula). The procedure is applied to a flood volume and peak dataset from Iran. The obtained model constitutes of a lognormal distribution for the margins with linear trend in the peak series, stationary for the volume series and a quadratic trend in the logistic Gumbel copula parameter for the dependence structure.
... The conventional approach in rainfall frequency analysis chooses a proper parametric probability distribution model to individually fit the observed annual maximum rainfall data of different durations. The choice of a distribution model for the rainfall intensity-frequency relations is largely statistical without much physical justification [51]. ...
Article
Full-text available
Hydro-infrastructural systems (e.g., flood control dams, stormwater detention basins, and seawalls) are designed to protect the public against the adverse impacts of various hydrologic extremes (e.g., floods, droughts, and storm surges). In their design and safety evaluation, the characteristics of concerned hydrologic extremes affecting the hydrosystem performance often are described by several interrelated random variables—not just one—that need to be considered simultaneously. These multiple random variables, in practical problems, have a mixture of non-normal distributions of which the joint distribution function is difficult to establish. To tackle problems involving multivariate non-normal variables, one frequently adopted approach is to transform non-normal variables from their original domain to multivariate normal space under which a large wealth of established theories can be utilized. This study presents a framework for practical normal transform based on the third-order polynomial in the context of a multivariate setting. Especially, the study focuses on multivariate third-order polynomial normal transform (TPNT) with explicit consideration of sampling errors in sample L-moments and correlation coefficients. For illustration, the modeling framework is applied to establish an at-site rainfall intensity–duration-frequency (IDF) relationship. Annual maximum rainfall data analyzed contain seven durations (1–72 h) with 27 years of useable records. Numerical application shows that the proposed modeling framework can produce reasonable rainfall IDF relationships by simultaneously treating several correlated rainfall data series and is a viable tool in dealing with multivariate data with a mixture of non-normal distributions.
... To carry out frequency analysis, Log-Pearson Type III has been fitted on storm surge data. Selecting the best-fitting distribution is not within the scope of this study, so Log-Pearson Type III was selected because it is the most widely used distribution in the United States for flood studies, as indicated by Singh and Strupczewski (2002); it is also suggested by the US Water Resources Council (Karamouz et al. 2017). A Log-Pearson Type III distribution as illustrated in Eq. (1) was fitted to the storm surge data from 1926 to 2017 including Kings Point data and data generated using Battery Park Station data ...
Article
Full-text available
The alteration of a watershed's hydrologic response due to urban development, population growth, global warming, and sea level rise have increased the frequency and intensity of floods. In order to cope with the new challenges in coastal flood management, many efforts were made after Superstorm Sandy. These efforts call for better understanding of flood hazard, better understanding of the operation of infrastructures in a resilience context, ways to mitigate hazard impacts, rebuilding efforts by adaptive design, and developing a unified scale of resilience for measuring performance. In this paper, attempts have been made to address implementation of these measures. The main purpose of this study is to improve resilience in infrastructures, particularly wastewater-treatment plants (WWTPs), which play a pivotal role in urban lifelines in New York City. To do this, first, a flood inundation map has been generated to evaluate the current response of the study area to a 100-year flood and determine the flood inundation depth at WWTPs. Next, a multicriteria decision-making (MCDM) approach was utilized to quantify the resilience index as a system performance indicator. Afterward, two approaches based on resilience-improving measures have been considered to improve this index. The first approach is to provide redundancies between WWTPs to make a platform for WWTPs' cooperation to move the sewage between them. The second approach is to implement adaptive hazard mitigation practices such as best management practices (BMPs) based on a proposed framework of a key initiative in New York City. In order to prioritize various groups of BMPs, five methods of flood mitigation practices, namely resist, delay, discharge, store, and retreat, have been ranked using experts' opinions in a MCDM framework. Thereafter, resilience improvements based on the two aforementioned approaches have been compared by considering financial resources allocation to each WWTP, and the most efficient alternative solutions have been chosen. The methodology outlined in this paper can be utilized in other urban coastal settings to plan for better flood preparedness.
... By the conventional method, rainfall frequency analysis is conducted by choosing a proper parametric probability distribution model to separately fit the observed annual maximum rainfall data of different durations. The choice of a distribution model for the rainfall intensity-frequency relations is largely statistical without much physical justification (Singh and Strupczewski 2002). The use of parametric distribution models has the advantage of automatic compliance of monotonicity condition requiring that the magnitude of rainfall intensity (or depth) of a given duration will increase with the cumulative probability. ...
Article
Full-text available
Establishing the rainfall intensity–duration–frequency (IDF) relations by the conventional method, the use of parametric distribution models has the advantage of automatic compliance of monotonicity condition of rainfall intensity and frequency. However, fitting rainfall data to a distribution separately by individual duration may possibly produce undulation and crossover of IDF curves which does not comply physical reality. This frequently occurs when rainfall record length is relatively short which often is the case. To tackle this problem this study presents a methodological framework that integrates the third-order polynomial normal transform (TPNT) with the least squares (LS) method to establish rainfall IDF relations by simultaneously considering multi-duration rainfall data. The constraints to preserve the monotonicity and non-crossover in the IDF relations can be incorporated easily in the LS-based TPNT framework. Hourly rainfall data at Zhongli rain gauge station in Taiwan with 27-year record are used to establish rainfall IDF relations and to illustrate the proposed methodology. Numerical investigation indicates that the undulation and crossover behavior of IDF curves can be effectively circumvented by the proposed approach to establish reasonable IDF relations.
... The empirical method is one of the most popular methods for engineering design and planning. Thus, many hydrologists have carried out various researches to improve flood frequency analyses [23][24][25][26][27]. The traditional flood frequency analysis was performed under the stationary assumption, which means that floods are the consequence of identically distributed and independent random process. ...
Article
Full-text available
Due to global climate change, it is possible to experience the new trend of flood in the near future. Therefore, it is necessary to consider the impact of climate change on flood when establishing sustainable water resources management policy. In order to predict the future flood events, the frequency analysis is commonly applied. Traditional methods for flood frequency analysis are based on the assumption of stationarity, which is questionable under the climate change, although many techniques that are based on stationarity have been developed. Therefore, this study aims to investigate and compare all of the corresponding effects of three different data sets (observed, RCP 4.5, and 8.5), two different frequency models (stationary and non-stationary), and two different frequency analysis procedures (rainfall frequency first approach and direct discharge approach). As a result, the design flood from the observed data by the stationary frequency model and rainfall frequency first approach can be concluded the most reasonable. Thus, the design flood from the RCP 8.5 by the non-stationary frequency model and rainfall frequency first approach should be carefully used for the establishment of flood prevention measure while considering climate change and uncertainty.
... Frequency analysis can be performed on either extreme rainfall or water level data, depending on the considered flood type (i.e., inland or coastal). Different probability distributions have been suggested to fit to flood extremes' data (e.g., Singh and Strupczewski 2002;Saf 2008;Li et al. 2015). According to Abida and Ellouze (2007) the most common distributions for flood analysis are Gumbel, generalized extreme value, log-Pearson type III and log normal. ...
Article
Full-text available
Due to increasing flood severities and frequencies, studies on coastal vulnerability assessment are of increasing concern. Evaluation of flood inundation depth and extent is the first issue in flood vulnerability analysis. This study has proposed a practical framework for reliable coastal floodplain delineation considering both inland and coastal flooding. New York City (NYC) has been considered as the case study because of its vulnerability to storm surge-induced hazards. For floodplain delineation, a distributed hydrologic model is used. In the proposed method, the severities of combined inland and coastal floods for different recurrence intervals are determined. Through analyzing past storms in the study region, a referenced (base) configuration of rainfall and storm surge is selected to be used for defining flood scenarios with different return periods. The inundated areas are determined under different flooding scenarios. The inundation maps of 2012 superstorm Sandy in NYC is simulated and compared with the FEMA revised maps which shows a close agreement. This methodology could be of significant value to the planners and engineers working on the preparedness of coastal urban communities against storms by providing a platform for updating inundation maps as new events are observed and new information becomes available.
... (Adeboye and Alatise, 2007). The dispersions recommended for fitting surge extremes information have been numerous (Singh and Strupczewski, 2002 This review in this way applies the Gumbel factual dispersion for surge recurrence investigation. To the best of the creator's information, no past reviews in the region have endeavored to model surge releases utilizing the Gumbel dispersion, a stochastic creating structure that deliver irregular results and surge streams fit Gumbel dissemination display. ...
Article
Full-text available
Flood frequency analysis is the most important statistical technique in understanding the nature and magnitude of high discharge in a river. The objective of frequency analysis is to relate the magnitude of events to their frequency of occurrence through probability distribution. The scale and shape parameters of the distribution were estimated using method of moments. The study which was carried in Vijayawada aimed at Prediction of Flood frequency analysis of Krishna river of Prakasam barrage at Vijayawada using Gumbel's, California, Hazen's methods. These are estimated using different flood data from 1990-2014 of Prakasam Barrage which were collected from Water Resources Department of Vijayawada. The magnitude of the flood out comes to be for 20 years and 50 years is1823.33TMC and 1873.34 TMC.
... Other references on the application and extension of nonparametric methods can be found in Refs 13-15. Singh and Strupczewski [16] reported that nonparametric methods are of limited use for the hydraulic design of major structures. ...
Chapter
It is widely acknowledged that both climate and land use changes modify flood frequency, thereby challenging the traditional assumption that the underlying stochastic process is stationary in time, and that the annual maximum flood corresponds to an independent identically distributed (iid) process. In this article, we employ a semiparametric approach to estimate flood quantiles conditional on selected “climate indices” that carry the signal of structured low-frequency climate variation, and influence the atmospheric mechanisms that enhance or retard local precipitation and flood potential. The semiparametric approach that maximizes the local likelihood of the observed annual maximum peak in the climatic predictor state space is applied to estimate conditional flood quantiles for the Blacksmith Fork River near Hyrum (BFH), Utah. The estimated conditional flood quantiles correlate well with the observed annual maximum peaks, thus offering prospects for reconstructing past flood series as well as for short-term forecasting. Keywords: floods; climate; semiparametric methods; design; flood; flood reconstructions; forecasting
... On the other hand, well-evidenced sea level rising will undoubtedly intensify the probability of extreme events in the coastal region (Cheon and Suh, 2016;Fortunato et al., 2016). The rate of global sea level rise is about 2 mm per year during the last century and has the potential to increase in the future due to the impact of global warming (DeConto and Pollard, 2016;IPCC, 2007) Therefore, it is necessary to conduct the flood frequency estimation and investigate the changing behaviors of flooding disaster in the PRD, which is of great importance for hazard prevention, mitigation and sustainable region development (Al-Futaisi and Stedinger, 1999;Singh and Strupczewski, 2002). ...
... Flood frequency analysis is one of the most challenging problems for assessment of the widest possible range of lowprobability flood events under limited data (Costa, 1978;Jin and Stedinger, 1989;Bobée et al., 1993;Singh and Strupczewski, 2002;Naulet et al., 2005;Stedinger and Griffis, 2011;Kjeldsen et al., 2014;Nguyen et al., 2014;Toonen, 2015;Halbert et al., 2016). Conventional flood frequency analyses over time scales of decades and centuries in arid and semiarid regions may provide misleading results because the gauged flood record is too short to accurately assess long-term flood information Baker et al., 1983;Stedinger and Cohn, 1986;Stedinger and Baker, 1987;Hua, 1987;O'Connor and Webb, 1988;Guo, 1991;Fanok and Wohl, 1997). ...
Article
Water depth above the flood deposits has not been taken into account in calculations of the palaeoflood peak stages, which only provide a minimum estimate of palaeoflood stage. Here we present a new method, slackwater flow depth, to assess palaeoflood peak stage and to reduce the underestimation of palaeoflood stage. Palaeoflood slackwater deposits (SWDs) were indentified by palaeohydrological criteria in cliff riverbank on the Yanhe River, middle Yellow River basin. Palaeoflood events recorded in four layers of SWD were dated by optical stimulated luminescence to 9.5–8.5 ka. The estimation of palaeoflood maximum stage was 778.3 m using the slackwater flow depth method and the palaeoflood peak discharge is 15,000 m³/s using the step-backwater method. Palaeoflood results greatly extend the current flood data series in the Yanhe River basin. The regional flood history including gauged flood, historical and palaeoflood data was compiled and evaluated for the major tributaries of the middle Yellow River. The relationship between palaeoflood peak discharges and drainage areas in this region fit well with the global maximum curves. The results of site-specific and regional palaeoflood evaluations demonstrate that the approach estimates the true palaeoflood peak stage and discharges and improves the flood frequency analysis of extreme and rare floods for a particular basin. Meanwhile, the advantages and uncertainties of this method need ongoing discussion in palaeoflood investigations.
... In addition the model employed here can also incorporate subjective adjustment factors resulting from consultation with stakeholders with local hydrological knowledge. The ability to add possibly subjective, expert hydrological information to the Bayesian model (also see Viglione et al., 2013) can perhaps assuage the concern that flood frequency analysis is dominated by statistical analysis rather than hydrology (Singh and Strupczewski, 2002). This point was borne out Flood Risk Manager, 2014; Lane et al., 2011;Odoni and Lane, 2010), but the subjective process is unlikely to be done consistently or transparently if it is not part of the recommended protocol. ...
Article
Full-text available
This paper describes a Bayesian statistical model for estimating flood frequency by combining uncertain annual maximum (AMAX) data from a river gauge with estimates of flood peak discharge from various historic sources that predate the period of instrument records. Such historic flood records promise to expand the time series data needed for reducing the uncertainty in return period estimates for extreme events, but the heterogeneity and uncertainty of historic records make them difficult to use alongside Flood Estimation Handbook and other standard methods for generating flood frequency curves from gauge data. Using the flow of the River Eden in Carlisle, Cumbria, UK as a case study, this paper develops a Bayesian model for combining historic flood estimates since 1800 with gauge data since 1967 to estimate the probability of low frequency flood events for the area taking account of uncertainty in the discharge estimates. Results show a reduction in 95% confidence intervals of roughly 50% for annual exceedance probabilities of less than 0.0133 (return periods over 75 years) compared to standard flood frequency estimation methods using solely systematic data. Sensitivity analysis shows the model is sensitive to 2 model parameters both of which are concerned with the historic (pre-systematic) period of the time series. This highlights the importance of adequate consideration of historic channel and floodplain changes or possible bias in estimates of historic flood discharges. The next steps required to roll out this Bayesian approach for operational flood frequency estimation at other sites is also discussed.
... However it is important to note that although the GL and GP distributions are recommended for the UK, a wide range of statistical distributions exist (see Kidson and Richards (2005) for summary of distributions) which are recommended for use in other countries (e.g. USA: log-Pearson type 3, China: lognormal, and previously in UK: generalised extreme value (Singh and Strupczewski, 2002)). Whichever distribution is selected, the fitting of a statistical distribution removes the variability of the flood peak sample and allows for extrapolation/interpolation of the data. ...
Thesis
Full-text available
This thesis aims to address the role of uncertainty in climate change impact studies, with particular focus on the impacts of climate change on UK flooding. Methods are developed to quantify the uncertainty associated with climate variability, hydrological model parameters and flood frequency estimation. Each is evaluated independently, before being combined to assess the relative importance of the different sources of uncertainty in the ‘top down’ impact study framework over multiple time horizons. The uncertainty from climate variability is addressed through the creation of a resampling methodology to be applied to global climate model outputs. Through resampling model precipitation, the direction of change for both mean monthly flows and flood quantiles are found to be uncertain with large possible ranges. Hydrological model parameter uncertainty is quantified using Monte Carlo methods to sample the model parameter space. Through sensitivity experiments, individual hydrological model parameters are shown to influence the magnitude of simulated flood quantile changes. If a larger number of climate scenarios are used, hydrological model parameter uncertainty is small only contributing up to 5% to the total range of impacts. The uncertainty in estimating design standard flood quantiles is quantified for the Generalised Pareto distribution. Flood frequency uncertainty is found to be most important for nearer time horizons, contributing up to 50% to the total range of climate change impacts. In catchments where flood estimation uncertainty is less important, global climate models are found to contribute the largest uncertainty in the nearer term, between 40% and 80% of the total range, with emissions scenarios becoming increasingly important from the 2050s onwards.
... Most of the studies based on popular theoretical probability distributions (or frequency distribution functions), namely generalised extreme value (GEV), extreme value (EV), Log-normal (LN), Log-Pearson Type III (LPT-3) distribution, etc. The GEV distribution is frequently used distribution in Great Britain (Chow et al. 2010), whereas the LN distribution in China (Singh and Strupczewski 2002) and the LPT-3 distribution has been recommended by federal agencies in the USA (Benson 1968;Wallis and Wood 1985). The validity of the results in the application of these probability distributions is theoretically subject to the hypothesis that the hydrological data series are stationary (Mujumdar and Kumar 2012), independent and identically distributed (Stedinger and Vogel 1993;Khaliq et al. 2006). ...
Article
Full-text available
Estimation of flood intensity for a desired return period is of prime importance for flood management through flood plain zoning. Flood frequency analysis enables estimation of the probability of occurrence of a certain hydrological event of practical importance by fitting a probability distribution to one that is empirically obtained from recorded annual maximum discharge and/or stage data. This case study considers the use of four probability distributions, namely Gumbel’s extreme value distribution (EV-I), extreme value distribution-III (EV-III), log-normal (LN) and Log-Pearson Type III (LPT-3) in flood modelling of monsoon-dominated Ajay River and illustrates the applicability of goodness of fit (GOF) and D-index tests procedures in identifying which distributional model is best for the specific data. Twenty-five years (1985–2009) of existing and estimated annual peak discharge (Q max) data have been used for analyzing the trend of flood occurrence. After identifying the best fit model, the peak gauge height data (h max) are then analysed combining with geographic information systems (GIS) for predicting flood affected area and preparing inundation map at a specific return period (T). Results of the study showed that the LPT-3 distribution is better suited for modelling flood data for Ajay at Nutanhat in West Bengal. The computed Q max for LPT-3 distribution are slightly higher as compared to the results obtained by EV-I, EV-III and LN which are used for vulnerability assessment. The analysis also predicts that the affected area will be ranging from 235 to 290 km2 in near future (at 25- to 200-year T). These findings provide clear picture for the pattern of hydrological fluxes and aftermath in the next decades in lower Ajay River Basin (ARB). Sustainable planning and developmental measures that consider the modelled pattern of hydrological fluxes of the study area were recommended for decision making.
... [3] Usually, the probability distribution of the extreme floods is defined by fitting a given probability distribution to the sample of streamflow observations available at the considered site [Singh and Strupczewski, 2002], using extreme value theory [Fr echet, 1927;Gumbel, 1958]. Flood frequency analysis is based on hypotheses of independent and identically distributed properties with respect to the streamflow series considered. ...
Article
Stochastic flood simulation methods are typically based on a rainfall probabilistic model (used for simulating continuous rainfall series or for estimating probabilities of random rainfall events) and on a rainfall-runoff model. Usually, both of these models are calibrated over observed hydrometeorological series, which may be subject to significant variability and/or nonstationarity over time. The general aim of this study is thus to propose and test a methodology for performing a sensitivity analysis of extreme flood estimations to observed hydrometeorological variability. The methodology consists of performing a set of block-bootstrap experiments: for each experiment, the data used for calibration of a particular model (e.g., the rainfall probabilistic model) is bootstrapped while the model structure and the calibration process are held constant. The SCHADEX extreme flood estimation method has been applied over six catchments located in different regions of the world. The results show first that the variability of observed rainfall hazard has the most significant impact on the extreme flood estimates. Then, consideration of different rainfall-runoff calibration periods generates a significant spread of extreme flood estimated values. Finally, the variability of the catchment saturation hazard has a nonsignificant impact on the extreme flood estimates. An important point raised by this study is the dominating role played by outliers within the observed records for extreme flood estimation.
... Methods such as L-moment diagrams and associated goodness-of-fit procedures have been advocated for evaluating the suitability of various distributional alternatives for modeling flood flows in a region [17]. However, the true distribution and its parameters may still differ significantly from the empirically fitted distribution, particularly when samples are small [18]. ...
Article
Full-text available
Flood frequency analysis using partial series data has been shown to provide better estimates of small to medium magnitude flood events than the annual series, but the annual series is more often employed due to its simplicity. Where partial series average recurrence intervals are required, annual series values are often "converted" to partial series values using the Langbein equation, regardless of whether the statistical assumptions behind the equation are fulfilled. This study uses data from Northern Tasmanian stream-gauging stations to make empirical comparisons between annual series and partial flood frequency estimates and values provided by the Langbein equation. At T = 1.1 years annual series estimates were found to be one third the magnitude of partial series estimates, while Langbein adjusted estimates were three quarters the magnitude of partial series estimates. The three methods converged as average recurrence interval increased until there was no significant difference between the different methods at T = 5 years. These results suggest that while the Langbein equation reduces the differences between the quantile estimates of annual maxima derived from annual maxima series and partial duration series flood frequency estimates, it does not provide a suitable alternative method to using partial series data. These results have significance for the practical estimation of the magnitude-frequency of small floods.
... These methods have been flexible enough to incorporate the mixture of flood information that may be available for a single catchment: site-specific, regional, historical and palaeoflood information. However, concomitant theoretical advance in FFA has been somewhat limited; for example, Singh and Strupczewski (2002) have noted that the fitting of flood frequency models is largely a statistical exercise somewhat divorced from hydrological input. Given the many implicit assumptions in FFA, the safest approach lies in testing and citing the sensitivity of a given analysis to a range of possible techniques at each stage of the modelling process. ...
Article
Full-text available
Flood frequency analysis (FFA) is a form of risk analysis, yet a risk analysis of the activity of FFA itself is rarely undertaken. The recent literature of FFA has been characterized by: (1) a proliferation of mathematical models, lacking theoretical hydrologic justification, but used to extrapolate the return periods of floods beyond the gauged record; (2) official mandating of particular models, which has resulted in (3) research focused on increasingly reductionist and statistically sophisticated procedures for parameter fitting to these models from the limited gauged data. These trends have evolved to such a refined state that FFA may be approaching the 'limits of splitting'; at the very least, the emphasis was shifted early in the history of FFA from predicting and explaining extreme flood events to the more soluble issue of fitting distributions to the bulk of the data. However, recent evidence indicates that the very modelling basis itself may be ripe for revision. Self-similar (power law) models are not only analytically simpler than conventional models, but they also offer a plausible theoretical basis in complexity theory. Of most significance, however, is the empirical evidence for self-similarity in flood behaviour. Self-similarity is difficult to detect in gauged records of limited length; however, one positive aspect of the application of statistics to FFA has been the refinement of techniques for the incorporation of historical and palaeoflood data. It is these data types, even over modest timescales such as 100 years, which offer the best promise for testing alternative models of extreme flood behaviour across a wider range of basins. At stake is the accurate estimation of flood magnitude, used widely for design purposes: the power law model produces far more conservative estimates of return period of large floods compared to conventional models, and deserves closer study.
... It is perhaps not surprising, however, given that with the exception of Chow's (1954) theory of 'multiplicative processes' giving rise to hydrological variables that conform to a log-normal distribution, the majority of FFA distributions in present use articulate no basis in hydrological theory as justification for their application (Singh & Strupczewski 2002). An inspection of the literature confirms that the utility of the plethora of distributions available is judged largely on the basis of empirical goodness of fit (Alila & Mtiraoui 2002). ...
Article
Full-text available
Conventional Flood Frequency Analysis (FFA) has been criticized both for its questionable theoretical basis, and for its failure in extreme event prediction. An important research issue for FFA is the exploration of models that have theoretical/explanatory value as the first step towards more accurate predictive attempts. Self-similar approaches offer one such alternative, with a plausible theoretical basis in complexity theory that has demonstrable wide applicability across the geophysical sciences. This paper explores a simple self-similar approach to the prediction of extreme floods. Fifty river gauging records from the USA exhibiting an outlier event were studied. Fitting a simple power law (PL) relation to events with return period of 10 years or greater resulted in more accurate discharge and return period estimates for outlier events relative to the Log-Pearson III model. Similar success in predicting record events is reported for 12 long-term rainfall records from the UK. This empirical success is interpreted as evidence that self-similarity may well represent the underlying physical processes generating hydrological variables. These findings have important consequences for the prediction of extreme flood events; the PL model produces return period estimates that are far more conservative than conventional distributions.
Article
Full-text available
Bivariate frequency analysis is an emerging method for the assessment of compound floods, especially in coastal regions. Changing climate, which usually leads to changes in characteristics of extreme hydrometeorological phenomena, makes the application of nonstationary methods more critical. In this research, a methodology is developed to apply frequency analysis on extreme sea level using physically-based hydroclimatic variables as covariates based on univariate Generalized Extreme Value (GEV) as the probability distribution function and copula methods. The results show that for extreme sea level, the location parameter of marginal distribution is directly related to the covariate variable of maximum temperature. For precipitation, the scale parameter is related to the covariate variable of minimum temperature, and the shape parameter is time-dependent. The univariate return periods of hurricanes Sandy and Irene are estimated at 85 and 12 years in nonstationary GEV distribution, respectively, while for stationary GEV distribution they are estimated at 1200 and 25 years, and in the bivariate frequency analysis of water level and precipitation, the normal copula function has more flexibility compared to other competitors. Using time-varying copula, the bivariate return periods of Hurricanes Sandy and Irene are 109 years and 136 years, respectively. The results confirm the importance of incorporating rainfall and extreme sea level in coastal flood frequency analysis. Although the proposed methodology can be applied to other hydro-climatological variables, the findings of this research suggest the necessity of considering nonstationarity in the analysis of extreme hydrologic events.
Article
Full-text available
Although climate change impacts vary globally, for the Kabul River Basin (KRB), concerns are primarily associated with frequent flooding. This research describes the influence of headwater reservoirs on projections of climate change impacts and flood frequency, and how riparian countries can benefit from storing of floodwaters for use during dry seasons. Six climate change scenarios and two Representative Concentration Pathways (RCPs) are used in three periods of a quarter-century each. The Soil and Water Assessment Tool (SWAT) is used to assess how the proposed reservoirs will reduce flooding by ∼38% during the wet season, reduce the flood frequency from five to 25 years return period, and increase low flows by ∼110% during the dry season, which reflect an ∼17.5% reduction in the glacier-covered area by the end of the century. The risks and benefits of reservoirs are highlighted in light of the developmental goals of Afghanistan and Pakistan.
Article
Full-text available
Though metrics that quantify the long‐term water balance of a catchment have been known to influence the statistics of flood frequency data, little has been done to understand exactly what the substantive links are that causes baseflow to impact instantaneous annual maximum flows (IAMF). As abrupt changes to baseflow can occur via climate change and human disturbances, understanding the effects of baseflow on IAMF behavior can provide insight into how groundwater alterations and climate change can impact flood risk assessment, particularly for regional data aggregation. We thus seek to quantify baseflow impacts across the flood frequency curve considering a top down approach enumerating the differential impacts of baseflow (as Baseflow Index) across the IAMF flood frequency curve and then analyzing the impact of baseflow at the event‐scale. Results show that baseflow exerts significant influence over the entire flood frequency curve, showing a slight increase in effect with higher return period flows. At the event‐scale, the permeability of the catchment (proxied by baseflow rate during rising limb), significantly impacted the variability in IAMF and had more of an effect than antecedent moisture of the catchment (proxied by antecedent discharge). Furthermore, the effects of each of the event‐scale variables changed across climate regions and return periods, suggesting that changes to climate and groundwater abstractions would greatly affect the flood frequency curve.
Article
Full-text available
An adequate characterization of extreme floods is key for the correct design of the infrastructures and for the flood risk estimation. However, the short length of the rainfall and flow data series along with the low probability of occurrence of this type of event cause that, to date, their adequate estimation still presents significant difficulties. This paper presents a methodology for the estimation of extreme floods based on the continuous generation of precipitation data series using weather generators and the integration of information of various types (systematic and non-systematic). The results obtained in the case study, Rambla de la Viuda, indicate that the joint use of continuous synthetic data series generated by a stochastic weather generator, a hydrological model and the integration of systematic and non-systematic information reduces the uncertainty in the estimation of extreme floods.
Conference Paper
The standard L-moment based regional frequency analysis (RFA) involves four steps: data screening, identification of homogenous region, selection of distribution, and estimation of associated parameters. This study augments the last two steps of RFA using Bayesian statistics in order to limit the subjectivity of distribution selection and account for parameter and model uncertainty. Bayesian model averaging (BMA) was employed along with multiple performance measures to combine distributions and provide a consensus prediction that avoids subjectivity and quantify modeling uncertainty. The differential evolution adaptive metropolis (DREAM) algorithm, the latest addition in MCMC sampling, was adopted to estimate parameters of selected distributions and quantify parameter uncertainty. The results based on extreme precipitation data from 85-station and 10-duration in Eastern U.S. suggest that the coupled BMA and DEARM delivers a less subjective prediction with a better performance than the single best distribution as measured by Taylor diagram, Anderson Darling, and Bootstrap efficiency measures.
Article
Hydrological series lengths are decreasing due to decreasing investments and increasing human activities. For short sequences, a copula-based composite likelihood approach (CBCLA) has been employed to enhance the quality of hydrological design values. However, the Pearson type III (P-III) distribution for short annual precipitation records has not yet been thoroughly investigated using the CBCLA. This study used the CBCLA to incorporate the concurrent and non-concurrent periods contained in data of various lengths into an integrated framework to estimate the parameters of precipitation frequency distributions. The marginal distributions were fitted using the P-III distribution, and the joint probability was constructed using a copula which offers flexibility in choosing arbitrary marginals and dependence structure. Furthermore, the uncertainties in the estimated precipitation design values for the short series obtained from this approach were compared with those obtained from univariate analysis. Then, Monte-Carlo simulations were performed to examine the feasibility of this approach. The annual precipitation series at four stations in Weihe River basin, China, were used as a case study. Results showed that CBCLA with P-III marginals reduced the uncertainty in the precipitation design values for the short series and the reduction in the uncertainty became more significant with longer adjacent series.
Article
Full-text available
Estimation of probability distribution and return periods of flood peak flows is needed for the planning, design, and management of flood control. Much of the research on at-site and regional flood frequency analysis has focused on the determination of the best probability distribution. The first objective of this study was to determine, evaluate and compare the goodness of fit of popular probability distribution functions (PDFs) to sequences of annual maximum stream-flows measured in West Mediterranean river basins of Turkey. Besides Gumbel distribution, which is generally preferred because of its simplicity and generality in extreme hydrologic data, distributions like Pareto, Log-logistic, Pearson Type III, Log-Pearson Type III, Log-normal with two and three parameters, and Generalized Extreme Value distributions are applied to the series of annual floods with time periods ranging from 20 to 61 years for 37 gauging stations. Another objective of the study was to compare and evaluate the parameter estimation methods and goodness of fit tests for the basins. For parameter estimation, the traditionally used method of moments and, recently widely used, that of probability weighted moments were used. To make an evaluation of the suitability of the parameters obtained by both methods to the data, detailed chi-square (parametric) tests were applied twice with equal-length intervals, equal-probability intervals and Kolmogorov-Smirnov (non-parametric) goodness-of-fit tests. The results demonstrated that when chi-square goodness of fit test is applied for both parameter estimation methods (moments and probability weighted moment methods), Gumbel probability distribution was obtained as the best fitting one to the floods in West Mediterranean river basins in Turkey, according to chi-square test with equal-probability and equal-length class intervals for both of the methods. Besides, the application of chi-square goodness of fit test for both parameter estimations with average chi-square approach resulted in Log-Pearson Type III for both with equal class intervals as optimal distribution. Similar results were obtained for chi-square distribution with equal probability approach. Log-Pearson Type III distribution was the best suitable one for each of the parameter estimation methods in Kolmogorov-Smirnov goodness of fit test. These results indicated that it may be more appropriate to use Log-Pearson Type III distribution instead of the widely used Gumbel distribution for probability distribution modeling of extreme values in West Mediterranean river basins.
Article
Poland is relatively scarce in water in some regions of the country. Due to variability of hydrological processes the reliable water resources available at 95% of time are equal to about 22 km 3, twice of the average annual water withdrawals. These resources are sufficient for satisfying objectively motivated needs of people, industry, energy and agriculture, if managed in efficient way. Local and seasonal shortages of water in Poland result from its contamination and underdeveloped infrastructure. To eliminate these problems and ensure protection against floods and droughts, large amounts of financial means must be spent in coming decades in order to implement the EU Water Framework Directive by the year 2030. The efficient water management requires the ability to quantify its objectives, proper forecasting of changing physical and economic conditions and application of reasonable decision-making procedures. The multidisciplinary research potential of Poland is adequate to meet educational and technological needs required for sustainable water management.
Conference Paper
Applications of scale-invariant ideas have been one of the most exciting areas of research in hydrology. The past studies showed positive evidences regarding the scale-invariant behavior in hydrological processes. In this paper, the scale-invariant approach found its way in frequency analysis for design rainfall depth estimation. In practice, intersections among design rainfalls for different durations under a same return period often occurs, which may provide wrong information in engineering planning and design. Furthermore, frequency analysis is currently regarded as inappropriate due to its lack of physical principles. In this paper, the scale-invariant approach is proposed, coupled with a constrained regression analysis, to make corrections on the possible intersections. The aim of this paper is to investigate the feasibility of applying such approach to deal with the intersection problem. A Hong Kong case study is used to demonstrate the application. The results are interpreted as more reasonable than those based on frequency analysis and are capable of providing substantial support for engineering planning, design, and management purpose when only a small sample of data are available.
Chapter
A comparison of different methods for estimating T-year events is presented, all based on the Extreme Value Type I distribution. Series of annual maximum flood from ten gauging stations at the New Zealand South Island have been used. Different methods of predicting the 100-year event and the connected uncertainty have been applied: At-site estimation and regional index-flood estimation with and without accounting for intersite correlation using either the method of moments or the method of probability weighted moments for parameter estimation. Furthermore, estimation at ungauged sites were considered applying either a log-linear relationship between at-site mean annual flood and catchment characteristics or a direct log-linear relationship between 100-year events and catchment characteristics. Comparison of the results shows that the existence of at-site measurements significantly diminishes the prediction uncertainty and that the presence of intersite correlation tends to increase the uncertainty. A simulation study revealed that in regional index-flood estimation the method of probability weighted moments is preferable to method of moment estimation with regard to bias and RMSE.
Article
Temporal rainfall is usually represented as a series of storms with different duration D and average intensity ID, separated by dry periods. Inside each storm, rainfall intensity fluctuates, producing different average values Id in different intervals of duration d ≤ D. We consider storm models of this type to predict the distribution of the annual maximum of Id. Of special importance is the dependence of the upper tails of ID and Id on D and (d, D, ID), respectively. From the analysis of 23 hourly rainfall records, we conclude that all marginal and conditional upper tails of ID and Id have lognormal shape. The upper tail of Id is largely independent of D and is the same as the upper tail of ID for D = d. Both tail distributions scale in a multifractal way with duration. These findings greatly simplify the parameterization of the model and the estimation of rainfall extremes. We compare extreme rainfall results from this model with those produced by methods that use only the marginal upper tail of Id and the number of wet d intervals in one year, and with the empirical distributions from the historical annual maxima.
Article
Flood frequency analysis (FFA) provides information about the probable size of flood flows. Empirical methods are more commonly employed in engineering design and planning, and among empirical methods the at-site frequency analysis is by far the most commonly used method. When applying the methods of at-site flood frequency analysis, it is clear that the role of hydrology seems minor at best and the role of statistics seems to be the lead one, whereas it should be the other way round. FFA entails the estimation of the upper quantiles of an assumed form of a probability density function of the annual or partial duration maximum flows, as the true function is not known. In the paper, the five two-parameter models and their three-parameter counterparts have been assumed successively for describing the annual peak flows for Nowy Targ gauging station on the Dunajec River. The 1 % quantile has been estimated by four optimization criteria. To find the best fitting model, three discrimination procedures have been applied. The best fitting model and, thus, hydrological design value depends on the optimization criterion and the procedure of discrimination. It is characteristic for hydrological size of samples. At the same time, the designers of the hydraulic structures want to have a unique value, not accepting the uncertainty. It seems essential that we should go back and start examining the way in which we have been doing the hydrological frequency analysis.
Article
The probability distribution of the annual maximum peak flows on the River Tiber at the town of Rome (Ripetta) can be evaluated using the information available since the XV century. In this paper the probability distribution of two series of annual peak flows maxima observed at the Ripetta gauge is analysed: the systematic series, including all the data observed since the beginning of the systematic stage record in 1782, and the censored series of the exceptional floods that inundated the town of Rome, starting from the XV century. In order to find a criterion for choosing the “optimum” distribution law for the high return period quantiles, an index that takes into account both the accuracy and the uncertainty of the quantiles estimation is proposed.
Article
Full-text available
The desirable properties of an estimator relative to a hypothetical population may be irrelevant in practice unless the population at issue more or less resembles the hypothetical population. Evidence that floods are distributed with long, stretched upper tails suggests that use of the more common distributions results in a rather precise underestimation of the extreme quantiles and thereby in the underdesign of flood protection measures.
Chapter
Risk, Reliability, Uncertainty, and Robustness of Water Resource Systems is based on the Third George Kovacs Colloquium organized by the International Hydrological Programme (UNESCO) and the International Association of Hydrological Sciences. Thirty-five leading scientists with international reputations provide reviews of topical areas of research on water resource systems, including aspects of extreme hydrological events: floods and droughts; water quantity and quality dams; reservoirs and hydraulic structures; evaluating sustainability and climate change impacts. As well as discussing essential challenges and research directions, the book will assist in applying theoretical methods to the solution of practical problems in water resources. The authors are multi-disciplinary, stemming from such areas as: hydrology, geography, civil, environmental and agricultural engineering, forestry, systems sciences, operations research, mathematics, physics and geophysics, ecology and atmospheric sciences. This review volume will be valuable for graduate students, scientists, consultants, administrators, and practising hydrologists and water managers.
Book
After five decades, the field of Statistical Hydrology continues to evolve and remains a very active area of investigation. Researchers continue to examine various distributions, methods of estimation of parameters, and problems related to regionalization. However, much of this research appears in journals and reports and usually in a form not easily accessible to practitioners and students-producing a gap between research and practice. Flood Frequency Analysis fills this gap by presenting many of these distributions and estimation procedures in a unified format within a single, self-contained book. Focusing on distribution families popular within the hydrologic community, the authors discuss three parameter estimation methods for each distribution: the method of moments, the maximum likelihood method, and the method of probability weighted moments. They present the details behind the procedures to provide the basis for the computations, and they illustrate each procedure with real data. Most of the computations discussed have been programmed for use with personal computers, and executable versions of these programs are available on CD-ROM from the senior author. Only increased use of new methods and distributions can produce a consensus on their validity. With other books on the subject either limited in scope or seriously outdated, Flood Frequency Analysis provides the ideal vehicle for practicing hydrologists and engineers to explore and apply the latest methods and research results, and in doing so, contribute to the advancement of the field.
Book
The book covers entropy-based techniques for parameter estimation in hydrology, dividing the material into 22 chapters. The techniques are presented for 20 distributions used in hydrology, including the distributions of the normal family, the extreme value family, the Pareto family and the logistic family. A comparative assessment of the entropy-based techniques is made with traditional methods of parameter estimation, such as the methods of moments, maximum likelihood, probability-weighted moments, L-moments, and least squares, using field data as well as Monte Carlo simulation experiments.
Article
Asymptotic bias in large quantiles and moments for three parameter estimation methods, including the maximum likelihood method (MLM), moments method (MOM) and linear moments method (LMM), is derived when a probability distribution function (PDF) is falsely assumed. It is illustrated using an alternative set of PDFs consisting of five two-parameter PDFs that are lower-bounded at zero, i.e., Log-Gumbel (LG), Log-logistic (LL), Log-normal (LN), Linear Diffusion (LD) and Gamma (Ga) distribution functions. The stress is put on applicability of LG and LL in the real conditions, where the hypothetical distribution (H) differs from the true one (T). Therefore, the following cases are considered: H=LG; T=LL, LN, LD and Ga, and H=LL, LN, LD and Ga, T=LG. It is shown that for every pair (H; T) and for every method, the relative bias (RB) of moments and quantiles corresponding to the upper tail is an increasing function of the true value of the coefficient of variation (cv), except that RB of moments for MOM is zero. The value of RB is smallest for MOM and the largest for MLM. The bias of LMM occupies an intermediate position. Since MLM used as the approximation method is irreversible, the asymptotic bias of the MLM-estimate of any statistical characteristic is not asymmetric as is for the MOM and LMM. MLM turns out to be the worst method if the assumed LG or LL distribution is not the true one. It produces a huge bias of upper quantiles, which is at least one order higher than that of the other two methods. However, the reverse case, i.e., acceptance of LN, LD or Ga as a hypothetical distribution while LG or LL as the true one, gives the MLM-bias of reasonable magnitude in upper quantiles. Therefore, one should be highly reluctant in choosing the LG and LL in flood frequency analysis, especially if MLM is to be applied.
Article
A warmer Britain may also be wetter, at least seasonally, which might suggest that the frequency and magnitude of riverine flooding may increase. It may be difficult to distinguish the effects of climate warming from the effects of continuing land use change, given the naturally stochastic nature of the inputs into hydrological systems. Prediction of the effects of climate change on riverine flooding is clearly very uncertain. These uncertainties need to be assessed, particularly for more extreme events. Further research on data collection techniques and model improvements to reduce levels of predictive uncertainty is required. -Author
Article
The concept of a robust model is briefly explored. In the context of flood frequency analysis, two necessary properties of a robust model are advanced, namely, resistance and efficiency. Strategies for seeking more robust models are discussed. Because of its versatility, the five-parameter Wakeby distribution can credibly be considered a parent flood distribution. Four regionalized Wakeby parents are employed in simulation studies to search for robust models. These parents were shown by Houghton to be representative of U.S. flood experience in the sense that certain raw flood data characteristics could be reproduced. A limited range of sampling experiments were undertaken. The results suggest that of the site-specific estimators considered, the two-parameter log normal maximum likelihood estimator is most resistant, with Gumbel estimators employing either maximum likelihood or probability-weighted moments displaying comparable resistance. Several estimators which utilize regional flood information were compared. Included were empirical Bayes estimators which are structurally similar to James-Stein rules and regionalized estimators based on the flood index method. These estimators exhibited substantial improvements in aggregate risk performance over their site-specific counterparts, particularly for short record lengths. Regionalized estimators appear to be preferable for short record lengths, while estimators which combine both site and regional flood information are preferable for longer record lengths. When such estimation procedures are considered, other distributional models such as log Pearson type III and Wakeby become practical alternatives to the two-parameter log normal model.
Article
The International Symposium on Flood Frequency and Risk Analysis was held at Louisiana State University, Baton Rouge, May 14-17, 1986. The symposium was sponsored by Louisiana State University, the National Science Foundation, U.S. Army Research Office, the Louisiana district of the U.S. Geological Survey, and the Baton Rouge office of Woodward-Clyde Consultants; it was cosponsored by about 20 professional organizations, including AGU. The symposium was attended by nearly 200 participants, 95 of whom came from the United States, 15 from Canada, and the remaining from 35 other countries. Of the participants from the United States, 24 were from universities, 35 were from state, federal, and local governmental agencies, and 35 were from consulting engineering firms.
Article
A currently used approach to flood frequency analysis is based on the concept of parametric statistical inference. In this analysis the assumption is made that the distribution function describing flood data is known, for example, a log-Pearson type III distribution. However, such an assumption is not always justified and often leads to other difficulties; it could also result in considerable variability in the estimation of design floods. A new method is developed in this article based on the nonparametric procedure for estimating probability distribution function. The results indicate that design floods computed from the different assumed distribution and from the nonparametric method provide comparable results. However, the nonparametric method is a viable alternative with the advantage of not requiring a distributional assumption, and has the ability of estimating multimodal distributions.
Article
A multitude of natural processes occurring in environmental and water resources are diffusive processes, and their observations exhibit a natural lower bound of zero and practically no upper bound. Such processes can be modeled using a dye diffusion equation whose solution yields a concentration distribution. This function, when normalized, leads to a two-parameter probability distribution that is seen to be a superposition of two normal distributions. Parameters of this distribution were estimated by the methods of moments and maximum likelihood for Monte Carlo-generated processes.
Article
The impulse response of a linear convective-diffusion analogy (LD) model used for flow routing in open channels is proposed as a probability distribution for flood frequency analysis. The flood frequency model has two parameters, which are derived using the methods of moments and maximum likelihood. Also derived are errors in quantiles for these parameter estimation methods. The distribution shows that the two methods are equivalent in terms of producing mean values—the important property in case of unknown true distribution function. The flood frequency model is tested using annual peak discharges for the gauging sections of 39 Polish rivers where the average value of the ratio of the coefficient of skewness to the coefficient of variation equals about 2.52, a value closer to the ratio of the LD model than to the gamma or the lognormal model. The likelihood ratio indicates the preference of the LD over the lognormal for 27 out of 39 cases. It is found that the proposed flood frequency model represents flood frequency characteristics well (measured by the moment ratio) when the LD flood routing model is likely to be the best of all linear flow routing models.
Article
Asymptotic bias in large quantiles and moments for four parameter estimation methods, including the maximum likelihood method (MLM), method of moments (MOM), method of L-moments (LMM), and least squares method (LSM), is derived when a probability distribution function (PDF) is falsely assumed. The first three estimation methods are illustrated using the lognormal and gamma distributions forming an alternative set of PDFs. It is shown that for every method when either the gamma or lognormal distribution serves as the true distribution, the relative asymptotic bias (RB) of moments and quantiles corresponding to the upper tail is an increasing function of the true value of the coefficient of variation (cv), except that RB of moments for MOM is zero. The value of RB is the smallest for MOM and largest for MLM. The bias of LMM occupies an intermediate position. The value of RB from MLM is larger for the lognormal distribution as a hypothetical distribution with the gamma distribution being assumed to be the true distribution than it would be in the opposite case. For cv=1 and MLM, it equals 30, 600, 320% for mean, variance and 0.1% quantile, respectively, while for MOM, the moments are asymptotically unbiased and the bias for 0.1% quantile amounts to 35%. An analysis of 39 70-year long annual peak flow series of Polish rivers provides an empirical evidence for the necessity to include bias in evaluation of the efficiency of PDF estimation methods.
Article
For dealing with hydrological non-stationarity in flood frequency modelling (FFM) and hydrological design, it is necessary to account for trends. Taking the case of at-site FFM, statistical parametric techniques are discussed for investigation of the time-trend. The investigation entails (1) an identification of a probability distribution, and (2) development of a trend software. The Akaike Information Criterion (AIC) was used to identify the optimum distribution, i.e. the distribution and trend function, which enabled an identification of the optimum non-stationary FFM in a class of 56 competing models. The maximum likelihood (ML) method was used to estimate the parameters of the identified model using annual peak discharge series. A trend can be assumed in the first two moments of a probability distribution function and it can be of either linear or parabolic form. Both the annual maximum series (AMS) and partial duration series (PDS) approach were considered in the at-site frequency modeling.
Flood Frequency Analysis: Rainfall-Runoff Mod-eling Entropy-Based Parameter Estimation in Hydrol-ogy Non-stationary approach to at-site flood frequency modeling I. Maximum like-lihood estimation
  • Lloyd Eh B C Hydro
  • Canada Vancouver
  • Rao Ar
  • Fl Hamed Kh Boca Raton
  • Singh
Lloyd EH. 1995. Comments In B.C. Hydro Seminar on Proba-bilities of Extreme Hydrological Events, B.C. Hydro, Vancouver, Canada. Rao AR, Hamed KH. 2000. Flood Frequency Analysis. CRC Press: Boca Raton, FL. Singh VP. 1988. Hydrologic Systems, Vol. 1: Rainfall-Runoff Mod-eling. Prentice Hall: Englewood Cliffs, NJ. Singh VP. 1996. Entropy-Based Parameter Estimation in Hydrol-ogy. Kluwer Academic Publishers: Boston. Strupczewski WG, Singh VP, Feluch W. 2001a. Non-stationary approach to at-site flood frequency modeling I. Maximum like-lihood estimation. Journal of Hydrology 248: 123– 142.
B.C. Hydro Seminar on Probabilities of Extreme Hydrological Events
  • Lloyd EH
Risk, Reliability, Uncertainty, and Robustness of Water Resources Systems
  • Klemeš V
Comments In B.C. Hydro Seminar on Probabilities of Extreme Hydrological Events
  • E H Lloyd
Lloyd EH. 1995. Comments In B.C. Hydro Seminar on Probabilities of Extreme Hydrological Events, B.C. Hydro, Vancouver, Canada.
  • V P Singh
Singh VP. 1988. Hydrologic Systems, Vol. 1: Rainfall-Runoff Modeling. Prentice Hall: Englewood Cliffs, NJ.
Comments InB.C. Hydro Seminar on Probabilities of Extreme Hydrological Events B.C. Hydro Vancouver Canada
  • Lloydeh