Article

What do we mean by ‘uncertainty’? The need for a consistent wording about uncertainty assessment in hydrology


Abstract

Uncertainty estimation in hydrological modelling is receiving increasing attention from researchers and practitioners. However, the transfer of the relevant know-how from scientists to end-users is still difficult, notwithstanding the intense research activity of the last 10 years. This problem was rightly pointed out in a recent commentary on HPToday by Beven (2006), who even wondered whether the results of uncertainty analyses will undermine the confidence of stakeholders in the science of hydrology. One of the main reasons that has so far prevented the hydrologic community from efficiently communicating its knowledge about uncertainty estimation is certainly the impracticality of systematically testing the many methods proposed in recent years. As Beven (2006) pointed out, performing extensive validation in hydrology is not easy, and in some cases impossible. However, I think this is not sufficient to justify the unquestionable trouble that applied hydrologists face when they try to identify, from the current literature, the uncertainty estimation method best suited to their needs. The topic of uncertainty assessment in hydrology suffers today from the lack of a coherent terminology and a systematic approach, which would allow us to clearly classify the practical problems to be solved and, consequently, the methods that can be used. Even the term 'uncertainty' itself is sometimes used without any reference to a precise scientific definition of its meaning. The result is that it is extremely difficult (if not impossible) to assess the prerogatives and limitations of the individual methods currently used by researchers to quantify the reliability of hydrological simulations, design variables and forecasts.
The aim of this comment is to offer some personal opinions and ideas in order to better identify (a) the questions concerning uncertainty estimation currently raised in applied hydrology and (b) the peculiarities of the most widely known uncertainty estimation methods. The purpose is to contribute to the discussion about the strategy that could be used to convey the results of uncertainty estimation to stakeholders without undermining their confidence in hydrological studies.


... Additionally, we also need to include uncertainty regarding the regionalisation process and the FFA procedure, namely the statistical extrapolation approach, the distribution choice, and the parameter estimation method. Regarding these sources of uncertainty, notice that individual assessments of uncertainty within the model structure, model parameters or data inputs are not the same as uncertainty assessments of the model outputs (Montanari 2007). And while the former can be combined to form the latter within the context of chaining uncertainty with a Monte Carlo approach (McMillan, Westerberg, and Krueger 2018), by assuming the uncertainty model (statistics) of the input and the system (such as the GLUE framework, Beven and Binley 2014), or with Bayesian analysis (Schoups and Vrugt 2010; Sadegh and Vrugt 2014), the latter can also be analysed independently with a post-process approach, by using the statistics of the model error (Sikorska, Montanari, and Koutsoyiannis 2015; Koutsoyiannis and Montanari 2022). ...
... And while the former can be combined to form the latter within the context of chaining uncertainty with a Monte Carlo approach (McMillan, Westerberg, and Krueger 2018), by assuming the uncertainty model (statistics) of the input and the system (such as the GLUE framework, Beven and Binley 2014), or with Bayesian analysis (Schoups and Vrugt 2010; Sadegh and Vrugt 2014), the latter can also be analysed independently with a post-process approach, by using the statistics of the model error (Sikorska, Montanari, and Koutsoyiannis 2015; Koutsoyiannis and Montanari 2022). Such a distinction is relevant because, while from a purely theoretical perspective there is not enough research to say which approach is better, we can infer that, from a pragmatic perspective, the post-process approaches have limited utility when evaluated in regionalisation studies (Montanari 2007), given the inherent lack of streamflow data for their use (unless we could find ways to regionalise these post-process uncertainty assessments themselves). ...
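The post-process idea described in these snippets (deriving predictive uncertainty from the statistics of the model error) can be sketched minimally in Python. The empirical-quantile error model below is a deliberate simplification of the published methods (which use, e.g., meta-Gaussian transforms), and all data are synthetic:

```python
import numpy as np

def error_quantile_bands(q_obs, q_sim, levels=(0.05, 0.95)):
    """Estimate predictive uncertainty bands for a deterministic model
    from the statistics of its past errors: a bare-bones stand-in for
    the post-process approach, using raw empirical error quantiles."""
    residuals = q_obs - q_sim                # historical model errors
    lo, hi = np.quantile(residuals, levels)  # empirical error quantiles
    return q_sim + lo, q_sim + hi            # shift the simulation by the bounds

# synthetic example: noisy "observations" around a simulated hydrograph
rng = np.random.default_rng(42)
q_sim = 10 + 5 * np.sin(np.linspace(0, 6, 200))
q_obs = q_sim + rng.normal(0, 1.0, size=q_sim.size)
lower, upper = error_quantile_bands(q_obs, q_sim)
coverage = np.mean((q_obs >= lower) & (q_obs <= upper))  # close to 0.90 here
```

In-sample, the band covers roughly 90% of the observations by construction; real applications would condition the error statistics on, e.g., flow magnitude and validate out of sample.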
Article
Flood frequency analysis lies at the core of hydrology and water engineering as one of the most required estimates for water planning and the design of hydraulic structures. For ungauged basins, where no information is available, various flood regionalisation techniques offer varying degrees of complexity and resulting performance, depending on the study's goal, the region analysed, and the information available. This study evaluates the use of hydrological models for flood regionalisation in Chile, using 1) a large sample dataset of 101 catchments; 2) the continuous simulation approach with the GR4J model; 3) the leave-one-out strategy for performance testing; and 4) two regionalisation methods, nearest neighbour (NN) and physical similarity, together with several alternative objective functions for calibration and regionalisation (in all cases adopting a single-criterion, single-variable and deterministic approach for parameter selection). Our results showed that performance (both in calibration-validation and regionalisation) is highly variable in terms of reproducing the runoff hydrograph and flood statistics, depending on the catchment's aridity (e.g., around 66–82% of catchments reach NSE above 0 in humid regions, but this drops severely to 12–44% in arid catchments). We also found that flood-specific calibration strategies produce better results for floods but poorer performance in runoff hydrograph reproduction. Finally, we highlight that our regionalisation results were in close agreement with those from one of the methods currently recommended by Chilean engineering practice for flood regionalisation. This is particularly promising, considering that the continuous simulation approach gives access to the complete time series and not only flood statistics. We end this manuscript by discussing several sources of uncertainty, hoping that these can be accounted for in future studies.
... The challenging character of probabilistic hydrological modelling has been widely acknowledged in the literature (see, e.g., references [6,10,11]). Assumptions are certainly unavoidable when it comes to modelling [6], and probabilistic predictions are not (and should not be expected to be) perfect [11]. ...
... To further reduce uncertainty, one has to optimize the statistical modelling part of the probabilistic methodology, which is commonly related to the modelling of the hydrological model errors (see, e.g., references [6,45,61,62,64,65,69]). These errors are known to be heteroscedastic and correlated (see, e.g., references [6,10,87]). Based on the below-discussed properties of machine-learning quantile regression algorithms, we believe that their use for solving the problem of interest could further reduce uncertainty (to some extent) by increasing the amount of information gained from the available historical records. ...
Article
Full-text available
We conduct a large-scale benchmark experiment aiming to advance the use of machine-learning quantile regression algorithms for probabilistic hydrological post-processing “at scale” within operational contexts. The experiment is set up using 34-year-long daily time series of precipitation, temperature, evapotranspiration and streamflow for 511 catchments over the contiguous United States. Point hydrological predictions are obtained using the Génie Rural à 4 paramètres Journalier (GR4J) hydrological model and exploited as predictor variables within quantile regression settings. Six machine-learning quantile regression algorithms and their equal-weight combiner are applied to predict conditional quantiles of the hydrological model errors. The individual algorithms are quantile regression, generalized random forests for quantile regression, generalized random forests for quantile regression emulating quantile regression forests, gradient boosting machine, model-based boosting with linear models as base learners and quantile regression neural networks. The conditional quantiles of the hydrological model errors are transformed to conditional quantiles of daily streamflow, which are finally assessed using proper performance scores and benchmarking. The assessment concerns various levels of predictive quantiles and central prediction intervals, while it is made both independently of the flow magnitude and conditional upon this magnitude. Key aspects of the developed methodological framework are highlighted, and practical recommendations are formulated. In technical hydro-meteorological applications, the algorithms should be applied preferably in a way that maximizes the benefits and reduces the risks from their use. This can be achieved by (i) combining algorithms (e.g., by averaging their predictions) and (ii) integrating algorithms within systematic frameworks (i.e., by using the algorithms according to their identified skills), as our large-scale results point out.
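The proper scoring of predictive quantiles mentioned in this abstract, and the equal-weight combination of algorithms, can be illustrated with a small NumPy sketch using the standard pinball (quantile) loss. The data and quantile values below are synthetic, not from the study:

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    """Quantile (pinball) score used to assess predictive quantiles:
    lower is better, and it is minimised in expectation by the true
    tau-quantile of the predictive distribution."""
    diff = y - q_pred
    return np.mean(np.maximum(tau * diff, (tau - 1) * diff))

# toy check on synthetic N(0,1) data (not the study's streamflow data)
rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, 100_000)
score_true = pinball_loss(y, np.full_like(y, 1.2816), 0.9)  # near the true 0.9-quantile
score_bad = pinball_loss(y, np.full_like(y, 0.0), 0.9)      # a poor 0.9-quantile

# equal-weight "combiner": average the quantile predictions of several algorithms
preds = np.stack([np.full_like(y, 1.1), np.full_like(y, 1.4)])
combined = preds.mean(axis=0)
```

As expected, the score at the true quantile is markedly lower than at the poor one, which is why such scores can rank competing quantile regression algorithms.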
... Uncertainty can be defined as an attribute of information (Zadeh, 2005;Montanari, 2007). In the context of hydrology, uncertainty is generally meant to be a quantitative indication of reliability for a given hydrological quantity, either observed or inferred by using models. ...
... Given this scenario, the issue of representing, quantifying and reducing uncertainty in hydrological modeling continues to attract the attention of researchers (e.g., Montanari, 2007), and no one involved in hydrological modeling can escape uncertainty analysis. Based on this very notion, this thesis was formulated and performed accordingly. ...
Thesis
In recent years, hydrological models have been increasingly applied by hydrologists and water resources planners as tools to examine watershed-scale processes and to evaluate the hydrologic effects of various management scenarios. The use of watershed models is growing in response to demands for the economic and efficient use of available water and thus for the formulation of appropriate water management strategies to cope with future water challenges. These hydrological models are simplifications of reality, and some degree of risk or uncertainty is therefore always associated with them. It is vital that this uncertainty is recognized and properly accounted for. This situation presented an opportunity for this thesis to analyze different aspects of hydrological modeling with a focus on uncertainty estimation. The thesis primarily addresses calibration, validation, sensitivity analysis and uncertainty estimation for streamflow, with a view to checking the reliability of the model. It uses multiple model evaluation techniques to assess which parts of the model work well and which are weak. The study also makes use of a carefully calibrated and validated hydrological model to comprehensively assess the water balance of the study watershed. Because a distributed hydrological model requires a proper spatial representation of the watershed, spatially distributed satellite rainfall data were also analyzed and then used to predict the basin response, with the objective of assessing the validity of such data. Finally, uncertainties in the model predictions were estimated by two different approaches. A popular semi-distributed hydrological model, the Soil and Water Assessment Tool (SWAT), was used to perform the hydrological analysis of the West Seti River Basin of Nepal.
For physically based models involving a large number of parameters, sensitivity analysis proved helpful in selecting the parameters for calibration. Calibration and validation for daily streamflow at the basin outlet produced good results (Nash-Sutcliffe Efficiency, NSE = 74% and 68%, respectively), while for monthly streamflow the performance improved (NSE = 93% and 85%, respectively). This indicated the effect of the time step on model performance. Further analysis revealed that the model captured peak flows well at the daily time step, but peak flows were underestimated at the monthly time step. Decomposition of the NSE criterion showed that the correlation component dominated the NSE value, while the contributions of the other components (bias and variability) were quite low. The analysis of daily TRMM rainfall data and its comparison with observed data showed considerable bias, though some stations showed high correlation and others low correlation. The mean monthly discharge at the basin outlet generated with TRMM data gave satisfactory results (NSE = 50% and R2 = 0.84), but the results may be improved by applying a proper bias correction to the TRMM data. Finally, the uncertainty analysis of the SWAT-simulated daily discharge based on the meta-Gaussian approach covered most of the observed data inside the 95% confidence interval band (94% and 95% during the calibration and validation periods, respectively), but at the expense of a wide interval. On the other hand, the uncertainty analysis of mean monthly discharge by sequential uncertainty fitting (SUFI-2) produced a narrow 95PPU band, indicating a smaller uncertainty range, but it bracketed relatively few observations (58% and 64% during calibration and validation, respectively) within the 95PPU band. Further analysis revealed that the model failed to bracket most of the peak flows as well as the recession limbs, indicating high uncertainty in peak flow and recession flow estimation.
Given the high uncertainty in the model predictions, the implications for decision-making should be properly accounted for. In conclusion, the approach adopted in this study can pave the way for the application of model evaluation techniques and uncertainty analysis in hydrological modeling.
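The NSE decomposition into correlation, bias and variability components mentioned in the thesis abstract follows the well-known identity NSE = 2*alpha*r - alpha^2 - beta^2 (Gupta et al., 2009). A minimal NumPy illustration with synthetic flows (not the thesis data):

```python
import numpy as np

def nse_decomposition(obs, sim):
    """Decompose the Nash-Sutcliffe efficiency into correlation (r),
    variability (alpha) and bias (beta) terms, following Gupta et al.
    (2009): NSE = 2*alpha*r - alpha**2 - beta**2."""
    r = np.corrcoef(obs, sim)[0, 1]                       # linear correlation
    alpha = np.std(sim) / np.std(obs)                     # variability ratio
    beta = (np.mean(sim) - np.mean(obs)) / np.std(obs)    # normalised bias
    nse = 2 * alpha * r - alpha**2 - beta**2
    return nse, r, alpha, beta

# synthetic daily flows and an imperfect simulation
rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 5.0, 1000)
sim = 0.9 * obs + rng.normal(0, 2.0, 1000)
nse, r, alpha, beta = nse_decomposition(obs, sim)
```

The decomposition is exact: it reproduces the value of the standard NSE formula while showing which component (correlation, variability or bias) dominates it.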
... Uncertainty is a persistent and significant issue in hydrological research, directly impacting the applicability and credibility of model results, yet remains without definitive resolution (Beven and Binley, 1992;Georgakakos et al., 2004;Beven, 2006;Montanari, 2007;Pechlivanidis et al., 2011). The uncertainties in hydrological models manifest in data, model structure, parameters, and initial conditions (Refsgaard et al., 2006;Clark et al., 2016). ...
Article
Full-text available
Hydrological modeling, leveraging mathematical formulations to represent the hydrological cycle, is a pivotal tool for representing the spatiotemporal dynamics and distribution patterns inherent in hydrology. These models serve a dual purpose: they validate theoretical robustness and applicability via observational data and project future trends, thereby bridging the understanding and prediction of natural processes. With rapid advancements in computational methodologies and the continuous evolution of observational and experimental techniques, the development of numerical hydrological models based on physically-based surface-subsurface process coupling has accelerated. Anchored in micro-scale conservation principles and physical equations, these models employ numerical techniques to integrate surface and subsurface hydrodynamics, thus replicating the macro-scale hydrological responses of watersheds. Numerical hydrological models have emerged as a leading trend in hydrological modeling owing to their explicit representation of physical processes, their high spatiotemporal resolution and their reliance on interdisciplinary integration. This article focuses on the theoretical foundation of surface-subsurface numerical hydrological models. It includes a comparative and analytical discussion of leading numerical hydrological models, encompassing model architecture, numerical solution strategies, spatial representation, and coupling algorithms. Additionally, this paper contrasts these models with traditional hydrological models, thereby delineating the relative merits, drawbacks, and future directions of numerical hydrological modeling.
... Determining the statistics of the output is the most straightforward method for evaluating the uncertainty of hydrologic model output (Montanari 2007). However, this approach has two problems: the numerical solution required to obtain the statistical results, and the absence of knowledge regarding the exact statistical properties of the system structure and input. ...
Chapter
Hydrologic models differ regarding hydrologic processes, their parameters, inputs and outputs. Consequently, the results of different models for a particular period for a study area also differ. Uncertainty sources of hydrologic predictions are divided into three groups: uncertainties due to model parameters, uncertainties due to model structure and uncertainties due to observation data (for input and calibration). Another source is the uncertainty generated from boundary conditions, i.e., scenario uncertainty. Different uncertainty analysis techniques are reviewed and summarised to identify which is more appropriate for analysing a particular hydrologic modelling uncertainty. Finally, research prospects on uncertainty analysis of hydrologic modelling are proposed.
... Pappenberger et al. (2006) developed a decision tree model to represent different uncertainty causes. According to Montanari (2007), other uncertainty methods can be categorized into (a) analytical methods (Tung, 1996), (b) approximation methods (Melching, 1992), (c) methods based on the analysis of model errors (Montanari and Brath, 2004), (d) Monte Carlo methods (Kuczera and Parent, 1998), (e) Bayesian methods (Beven and Binley, 1992), and (f) methods based on fuzzy set concept (Maskey et al., 2004). ...
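Category (d) in the taxonomy above, Monte Carlo methods, can be sketched in a few lines: sample parameter sets from assumed ranges, run the model once per set, and summarise the spread of the outputs. The toy model and parameter range below are purely illustrative, not any specific published scheme:

```python
import numpy as np

def monte_carlo_uncertainty(model, param_ranges, x, n=2000, seed=0):
    """Propagate parameter uncertainty through a model by uniform
    Monte Carlo sampling, returning 5%, 50% and 95% output quantiles."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(param_ranges, dtype=float).T
    samples = rng.uniform(lo, hi, size=(n, len(lo)))    # one parameter set per row
    outputs = np.array([model(p, x) for p in samples])  # one model run per set
    return np.quantile(outputs, [0.05, 0.5, 0.95], axis=0)

# hypothetical "model": a linear reservoir response y = k * x
linear_reservoir = lambda p, x: p[0] * x
bands = monte_carlo_uncertainty(linear_reservoir, [(0.5, 1.5)], np.array([10.0]))
```

Real applications replace the toy model with a hydrological model and often weight the samples by a likelihood measure (as in GLUE) or sample from a posterior (Bayesian methods).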
Article
Full-text available
Accurate measurement of continuous stream discharge presents both opportunities and challenges for hydrologists and water resource planners, particularly in mountainous watersheds. This study centers on the development of rating curves using the power law at three headwaters of the lesser Himalayas (Aglar, Paligaad, and Balganga), through the installation of water level recorders for stage measurement and salt dilution for discharge measurement from 2014 to 2016. The stage-discharge relationship, known as the rating curve, is affected by numerous factors in mountainous watersheds that are often challenging to understand or quantify. Significant errors introduced during rating curve development, stemming from observation, modeling, and parameterization, are frequently overlooked. In this study, acknowledging this inherent uncertainty, we employ the maximum-likelihood method to assess uncertainty in the developed rating curves. Our findings reveal substantial inconsistency in the stage-discharge relationship, particularly during high flows. A novel contribution of this study is the introduction of a weighting factor that relates uncertainty to the morphological parameters of the watershed: the higher weighting factor for Paligaad (0.37), compared with Balganga (0.35) and Aglar (0.27), indicates greater uncertainty. The authors contend that precise rating curves and comprehensive uncertainty analyses can reduce construction costs, foster robust decision-making, and enhance the credibility of decisions in hydrology and water resource management.
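The power-law rating curve Q = a * (h - h0)**b at the heart of this abstract can be fitted by linear regression in log space. This least-squares sketch with synthetic stage-discharge pairs only approximates the maximum-likelihood procedure the study actually uses, and the parameter values are invented:

```python
import numpy as np

def fit_rating_curve(stage, discharge, h0=0.0):
    """Fit the power-law rating curve Q = a * (h - h0)**b by ordinary
    least squares in log space (log Q = log a + b * log(h - h0))."""
    x = np.log(stage - h0)
    y = np.log(discharge)
    b, log_a = np.polyfit(x, y, 1)   # slope is b, intercept is ln(a)
    return np.exp(log_a), b

# synthetic stage-discharge pairs generated from a known curve (a=4, b=1.6)
rng = np.random.default_rng(3)
h = np.linspace(0.2, 2.0, 50)
q = 4.0 * h**1.6 * rng.lognormal(0, 0.05, h.size)  # multiplicative noise
a, b = fit_rating_curve(h, q)
```

The residual scatter around the fitted line in log space is exactly the raw material for the uncertainty analysis the study performs, especially at high stages where extrapolation dominates.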
... When a measurement can be repeated, its uncertainty can be described using probability distributions of the measured values or measurement errors compared to a reference (Montanari, 2007;Foody and Atkinson, 2003;Povey and Grainger, 2015). Figure 2 illustrates the relationship between uncertainty and other related terms when uncertainty is described by the probability distribution of the measured value or error. ...
Article
Full-text available
Satellite remote sensing (RS) data are increasingly being used to estimate total evaporation, often referred to as evapotranspiration (ET), over large regions. Since RS-based ET (RS-ET) estimation inherits uncertainties from several sources, many available studies have assessed these uncertainties using different methods. However, the suitability of methods and reference data subsequently affects the validity of these evaluations. This study summarizes the status of the various methods applied for uncertainty assessment of RS-ET estimates, discusses the advances and caveats of these methods, identifies assessment gaps, and provides recommendations for future studies. We systematically reviewed 676 research papers published from 2011 to 2021 that assessed the uncertainty or accuracy of RS-ET estimates. We categorized and classified them based on (i) the methods used to assess uncertainties, (ii) the context where uncertainties were evaluated, and (iii) the metrics used to report uncertainties. Our quantitative synthesis shows that the uncertainty assessments of RS-ET estimates are not consistent and comparable in terms of methodology, reference data, geographical distribution, and uncertainty presentation. Most studies used validation methods using eddy-covariance (EC)-based ET estimates as a reference. However, in many regions such as Africa and the Middle East, other references are often used due to the lack of EC stations. The accuracy and uncertainty of RS-ET estimates are most often described by root-mean-squared errors (RMSEs). When validating against EC-based estimates, the RMSE of daily RS-ET varies greatly among different locations and levels of temporal support, ranging from 0.01 to 6.65 mm d⁻¹, with a mean of 1.18 mm d⁻¹.
We conclude that future studies need to report the context of validation, the uncertainty of the reference datasets, the mismatch in the temporal and spatial scales of reference datasets to those of the RS-ET estimates, and multiple performance metrics with their variation in different conditions and their statistical significance to provide a comprehensive interpretation to assist potential users. We provide specific recommendations in this regard. Furthermore, extending the application of RS-ET to regions that lack validation will require obtaining additional ground-based data and combining different methods for uncertainty assessment.
... Additionally, conceptual models are often more efficient in terms of computational resources and time required for calibration and prediction, making them suitable for practical applications with limited resources (Gupta and Waymire, 2010). Moreover, these models are flexible and can be easily adjusted and adapted to different hydrological systems or regions, as they rely on empirical relationships rather than detailed knowledge of physical processes, which may vary in different contexts (Montanari, 2007). ...
... Because uncertainty is related to unknown errors in many fields of science, it is often described using probability distributions (Montanari, 2007;Foody and Atkinson, 2003). In metrology, measurement uncertainty is a non-negative parameter that determines the probability distribution of the (possible) values attributed to the measurand (i.e., the quantity intended to be measured) (JCGM, 2012;Loew et al., 2017). ...
Preprint
Full-text available
Satellite remote sensing (RS) data are increasingly being used to estimate total evaporation or evapotranspiration (ET) over large regions. Since RS-based ET (RS-ET) estimation inherits uncertainties from several sources, many available studies have assessed these uncertainties using different methods and reference data. However, the suitability of methods and reference data subsequently affects the validity of these evaluations. This study summarizes the status of the various methods applied for uncertainty assessment of RS-ET estimates, discusses the advances and caveats of these methods, identifies assessment gaps, and provides recommendations for future studies. We systematically reviewed 601 research papers published from 2011 to 2021 that assessed the uncertainty or accuracy of RS-ET estimates. We categorized and classified them based on (i) the methods used to assess uncertainties, (ii) the context where uncertainties were evaluated, and (iii) the metrics used to report uncertainties. Our quantitative synthesis shows that the uncertainty assessments of RS-ET estimates are not consistent and comparable in terms of methodology, reference data, geographical distribution, and uncertainty presentation. Most studies used validation against eddy-covariance (EC)-based ET estimates as a reference. However, in many regions such as Africa and the Middle East, other references are often used due to the lack of EC stations. The accuracy and uncertainty of RS-ET estimates are most often described by the root-mean-squared error (RMSE). When validating against EC-based estimates, the RMSE of daily RS-ET varies greatly among different locations and levels of temporal support, ranging from 0.01 to 6.65 mm/day, with a mean of 1.12 mm/day.
We conclude that future studies need to report the context of validation, the uncertainty of the reference datasets, the mismatch in temporal and spatial scales of reference datasets to that of the RS-ET estimates, and multiple performance metrics with their variation in different conditions and statistical significance to provide a comprehensive interpretation to assist potential users. We provide specific recommendations in this regard. Furthermore, extending the application of RS-ET to regions that lack validation will require obtaining additional ground-based data and combining different methods for uncertainty assessment.
... Although considerable progress has been made in addressing various uncertainties of hydrological modeling, model uncertainty (or structural uncertainty) is still a major challenge in hydrology (Montanari 2007; Mustafa et al., 2020) and is more elusive than other uncertainties (Kirchner 2006). Hence, a more consistent framework for addressing the different types of uncertainty in hydrological forecasting is required, one that considers the context-dependence and uniqueness of the study context (Beven, 2000), despite the development of a range of alternative models (e.g., Clark et al., 2015; Blöschl et al., 2019). ...
Preprint
Full-text available
In the present study, we review the methods and approaches used for handling uncertainty in hydrological forecasting of streamflow, floods, and snow. This review has six thematic sections: (1) general trends in accounting for uncertainties in hydrological forecasting, (2) sources of uncertainty in hydrological forecasting, (3) methods used in the studies to address uncertainty, (4) multi-criteria approaches for reducing uncertainty in hydrological forecasting and their applications, (5) the role of remote sensing data sources in hydrological forecasting and uncertainty handling, and (6) the selection of hydrological models for hydrological forecasting. In particular, a synthesis of the literature showed that approaches such as multi-data usage, multi-model development, multi-objective functions, and pre-/post-processing are widely used in recent studies to improve forecasting capabilities. This study reviews the current state of the art and explores the constraints and advantages of using these approaches to reduce uncertainty. The comparative summary provided here offers insights into various methods of uncertainty reduction, highlighting the associated advantages and challenges for readers, scientists, hydrological modelers, and practitioners seeking to improve forecasts. A set of freely accessible remotely sensed data and tools useful for uncertainty handling and hydrological forecasting is also reviewed.
... The uncertainty mentioned in this study refers to the discrepancy in water budget components across the nine simulation realizations, and it is usually defined by a statistical measure. Following Montanari (2007), the coefficient of variation (CV) was used to measure uncertainty. Thus, the CV of each water budget component is calculated from the multi-year average results of the nine simulation realizations. ...
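The coefficient-of-variation measure of uncertainty described in this snippet is straightforward to compute across ensemble realizations; the nine values below are hypothetical, not the study's estimates:

```python
import numpy as np

def cv_uncertainty(component_estimates):
    """Coefficient of variation (std/mean) across ensemble realizations,
    used as a simple dimensionless measure of uncertainty in a water
    budget component."""
    est = np.asarray(component_estimates, dtype=float)
    return np.std(est) / np.mean(est)

# hypothetical multi-year mean annual ET (mm) from nine realizations
et_runs = [255, 268, 260, 271, 259, 266, 263, 258, 270]
cv = cv_uncertainty(et_runs)  # small CV means the realizations agree closely
```

Because the CV is dimensionless, it allows uncertainty to be compared across components with very different magnitudes (e.g., precipitation versus glacial melt).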
Article
Global warming potentially increases precipitation and intensifies water exchange, thereby accelerating the hydrological cycle. The Tibetan Plateau (TP) is an Asian water tower in which the water budget varies, and its anomalies exert stress on resource availability. Few studies have quantified long-term water budgets across the TP owing to the scarcity of ground-based observations and uncertainties in remote sensing data. In this study, water budget components (i.e., precipitation, glacial melting [GM], evapotranspiration [ET], runoff, and soil moisture [SM] state) in the TP are synthetically estimated for the past three decades. The water budget estimation benefits from GM-coupled hydrological ensemble modeling, forced by nine precipitation products, seven of them from satellite methods. The results show that the ensemble modeling effectively captures the dynamics of runoff, ET, and terrestrial water storage. The long-term average annual water input (sum of precipitation and GM) was approximately 438 mm, with a ∼4% contribution from GM, of which annual ET and runoff removed approximately 263 and 173 mm, respectively. From 1984 to 2015, the four water fluxes significantly increased at varying rates (precipitation, 2.3 mm/yr; GM, 0.9 mm/yr; ET, 1.5 mm/yr; runoff, 1.1 mm/yr), which suggests an accelerating hydrological cycle. In particular, increasing GM (∼5.8 mm/yr) in the Nyainqentanglha Mountains in the southern TP induced high-yield runoff (> 800 mm). These estimates aid in developing robust solutions for water management in the TP and neighboring regions. The accelerated hydrological cycle implies potential flooding risk and vulnerability of the hydrological system under climate change.
... All these points have been discussed at length in the hydro-meteorological literature [3,30–36], but the most important point on the list, relevant to the Bayesian decision approaches [37], remains number five, because if decision makers could fully grasp the benefits, in terms of increased decision reliability together with a reduction of expected damages and an increase of expected benefits, they would inevitably turn in favor of probabilistic forecasting. ...
Article
Full-text available
In Water Resources Planning and Management, decision makers, although unsure of future outcomes, must take the most reliable and assuring decisions. Deterministic and probabilistic prediction techniques, combined with optimization tools, have been widely used to meet the objective of improving planning as well as management. Bayesian decision approaches are available to link probabilistic predictions to optimized decision schemes, but scientists are not fully able to express themselves in a language familiar to decision makers, who fear basing their decisions on “uncertain” forecasts in the vain belief that deterministic forecasts are more informative and reliable. This situation is even worse in the case of climate change projections, which bring additional degrees of uncertainty into the picture. Therefore, a need emerges to create a common approach and means of communication between scientists, who deal with optimization tools, probabilistic predictions and long-term projections, and operational decision makers, who must be facilitated in understanding, accepting, and acknowledging the benefits arising from operational water resources management based on probabilistic predictions and projections. Our aim here was to formulate the terms of the problem and the rationale for explaining and involving decision makers with the final objective of using probabilistic predictions/projections in their decision-making processes.
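The Bayesian decision approach this abstract advocates amounts to choosing the action that minimises expected loss under the forecast distribution. Below is a toy sketch with an invented two-action flood problem; the actions, loss values and flow threshold are all illustrative assumptions, not from the paper:

```python
import numpy as np

def best_decision(forecast_samples, actions, loss):
    """Pick the action minimising expected loss under a probabilistic
    forecast, approximating the expectation by a sample average."""
    expected = [np.mean([loss(a, s) for s in forecast_samples]) for a in actions]
    return actions[int(np.argmin(expected))], expected

# hypothetical choice: pre-release water now (fixed cost 1) vs wait
# (damage 10 only if the flow exceeds a flood threshold of 110)
rng = np.random.default_rng(7)
flow = rng.normal(80, 15, 5000)  # samples from a probabilistic flow forecast
loss = lambda a, s: 1.0 if a == "release" else (10.0 if s > 110 else 0.0)
choice, costs = best_decision(flow, ["release", "wait"], loss)
```

A deterministic forecast of the mean flow (80) would give no basis for weighing the small flood probability against the fixed release cost; the probabilistic forecast does, which is precisely the benefit the paper wants to communicate to decision makers.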
... Regarding uncertainty assessment, we could employ the "cascades of uncertainty" framework. While from a purely theoretical perspective there is not enough research to say which approach is better, we can infer that, from a pragmatic perspective, the post-process approaches have limited utility when evaluated in regionalization studies (Montanari 2007), given the inherent lack of streamflow data for their use (unless we could find ways to regionalize these post-process uncertainty assessments themselves). ...
Preprint
Full-text available
Flood frequency analysis lies at the core of hydrology and water engineering as one of the most required estimates for water planning and design of hydraulic structures. For ungauged basins, where no information is available, there are various flood regionalization techniques with varying degrees of complexity and resulting performance, depending on the study's goal, the region analyzed, and the information available. This study evaluates the use of hydrological models for flood regionalization in Chile, using 1) a large-sample dataset of 101 catchments; 2) the continuous simulation approach with the GR4J model; 3) the “leave-one-out” strategy for performance testing; and 4) two parameter regionalization methods, nearest neighbor and physical similarity, together with several alternative objective functions for calibration and regionalization strategies (in all cases adopting a single-criterion, single-variable, deterministic approach to parameter selection). Our results showed that performance (both in calibration-validation and regionalization) is highly variable in terms of reproducing the runoff hydrograph and flood statistics, depending on the catchment's aridity (for example, around 66-82% of catchments reach an NSE above 0 in humid regions, but this drops severely to 12-44% of catchments in arid regions). We also found that flood-specific calibration strategies produce better results for flood reproduction but poorer performance in runoff hydrograph reproduction. Finally, we highlight that our regionalization results were in close agreement with those from one of the methods currently recommended in Chilean engineering practice for flood regionalization. This is particularly promising, considering that the continuous simulation approach gives access to the complete time series and not only flood statistics.
We end this manuscript by discussing several sources of uncertainty, hoping that these can be accounted for in future studies.
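The Nash-Sutcliffe efficiency (NSE) used as the skill threshold in the study above compares squared simulation errors against the variance of the observations: 1 is a perfect fit, 0 means the simulation is no better than predicting the observed mean, and negative values mean it is worse. A minimal sketch:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance-sum of observations."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var_sum = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var_sum

obs = [1.0, 2.0, 3.0, 4.0]
assert nse(obs, obs) == 1.0        # perfect simulation
assert nse(obs, [2.5] * 4) == 0.0  # predicting the mean scores exactly zero
```

This is why "NSE above 0" is a meaningful (if lenient) benchmark in the regionalization results quoted above: it separates simulations that carry some information from those outperformed by a constant.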
... The conventional solution/compromise to a lack of such data is to conduct a sensitivity analysis on selected parameters. Montanari (2007) argues that in ungauged or scarcely gauged catchments, sensitivity analysis provides good uncertainty estimations. ...
... This view is very different from Aristotle's, which suggests that randomness is lack-of-purpose noise (d10, section 2.3.3.1). Regarding uncertainties, Nearing et al. (2016) recalled the extent of many discussions on the role of uncertainties (d3, d10; e.g., Pappenberger and Beven, 2006; Sivapalan, 2009), their nature (d7, d8; e.g., Koutsoyiannis, 2010; Montanari, 2007), and their appropriate handling (d9; Beven et al., 2008; Clark et al., 2012; Mantovan and Todini, 2006; Stedinger et al., 2008; Vrugt et al., 2009). Thus, although fuzzy by nature, randomness can likewise be studied in the light of Aristotelian causality by asking about its source (d7), its form (d8), its propagation (d9), as well as its meaning and purpose (d10). ...
Thesis
Full-text available
Hydrological systems seem simple, "everything flows", but prove ever more complex when one tries to differentiate and characterize flows in detail. Hydrology has thus developed a plurality of models that testify to the complexity of hydrological systems and the variety of their causal representations, from the simplest to the most sophisticated. Beyond a subjective complexity linked to our difficulty in understanding or our attention to detail, hydrological systems also present an intrinsic complexity. This includes the nonlinearity of processes and interactions between variables, the number of variables in the system, or dimension, and how they are organized to further simplify or complicate the system's dynamics. The thesis addresses these aspects of hydrological complexity. An epistemological and historical analysis of the concept of causality explores the human understanding of hydrological systems. Based on empirical approaches applied to the limestone karstic system of the Lhomme at Rochefort in Belgium, the thesis then studies methods to analyze the nonlinearity of the Lhomme river recession and associate it with the geomorphological complexity of the watershed. The thesis also addresses the discrimination of dominant dynamic behaviors in the hydrological continuum of the Rochefort caves subsurface, based on an electrical resistivity model of the subsurface and clustering methods grouping time series according to their similarity. Ref: Delforge, Damien. Causal analysis of hydrological systems : case study of the Lhomme karst system, Belgium . Prom. : Vanclooster, Marnik ; Van Camp, Michel Permalink: http://hdl.handle.net/2078.1/240635
... In this case, the systematic analysis of change drivers (uncertainty sources) offers a set of results around potential scenarios to frame uncertainty (Refsgaard et al. 2007; Rodrigues et al. 2015), while the driver sensitivity analysis is proposed as a part of the results of this study. Montanari (2007), however, advocates that some methods commonly used for uncertainty assessment do not address uncertainty, but only model sensitivity. Moreover, although some studies indicate that climate projection uncertainties surpass hydrological uncertainties (Bates et al. 2008; Nóbrega et al. 2011), Honti, Scheidegger, and Stamm (2014) reinforce that different methods of uncertainty assessment may lead to different conclusions. ...
Article
Full-text available
Climate change and increasing water demand in urban environments necessitate planning water utility companies’ finances. Methods to estimate the direct water utility business interruption costs (WUBIC) caused by droughts have traditionally not been clearly established. We propose a multi-driver assessment method. We project the water yield using a hydrological model driven by regional climate models under radiative forcing scenarios. We project water demand under stationary and non-stationary conditions to estimate drought severity and duration, which are linked with pricing policies recently adopted by the Sao Paulo Water Utility Company. The results revealed water insecurity. The non-stationary trend imposed larger differences in the drought resilience financial gap, suggesting that the uncertainties of WUBIC derived from demand and climate models are greater than those associated with radiative forcing scenarios. As populations increase, proactively controlling demand is recommended to avoid or minimize reactive policy changes during future drought events and a repetition of recent financial impacts.
... However, a hydrological prediction model is only an incomplete representation of the hydrological system (Wetterhall et al. 2013; Ravines et al. 2008); such models accept different hydrological and meteorological inputs and rely on the parameters of a conceptual model, and these complex factors introduce unreliability into hydrological predictions (Freer et al. 1996; Montanari and Grossi 2008; Montanari 2007). The principle of rational decision-making under uncertainty shows that when a deterministic prediction is wrong, the consequences are likely worse than in a situation where no prediction is available (Krzysztofowicz 1999; Wetterhall 2013; Ramos et al. 2013). ...
Article
Full-text available
Investigating the interactions of water resources such as rainfall, river flow and groundwater level can be useful for understanding the water balance behavior of a basin. In this study, using the rainfall, river flow and groundwater level deficiency signatures for a 60-day duration, the accuracy of vine copulas was investigated through joint frequency analysis. First, while investigating the correlation of pair-variables, tree sequences of C-, D- and R-vine copulas were examined. The results were evaluated using the AIC, log-likelihood and BIC statistics. Finally, according to the physics of the problem and the evaluation criteria, the D-vine copula was selected as the best copula and the relevant tree sequence was introduced. Kendall’s tau test was used to evaluate the correlation of pair-signatures; its results showed that the pair-signatures studied are well correlated. Using the D-vine copula and its conditional structure, the joint return period of the groundwater deficiency signature affected by the rainfall and river flow deficiency signatures was investigated. The results showed that the main changes in groundwater level deficiency lie between 0.3 and 2 m, which, given the corresponding rainfall and river flow deficiencies, have return periods of less than 5 years. Copula-based simulations were used to investigate the accuracy of the best copula in the joint frequency analysis of the studied signatures. Using the copula data of the studied signatures, the groundwater deficiency signature was simulated using the D-vine copula and the selected tree sequence. The results showed acceptable accuracy of the D-vine copula in simulating the copula values of the groundwater deficiency signature. After confirming the accuracy of the D-vine copula, the probability of occurrence of the groundwater deficiency signature was obtained from the joint probability of occurrence of the other signatures. This method can be used as a general drought monitoring system for better water resources management in the basin.
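Kendall's tau, used in the study above to screen pair-signatures for dependence before fitting copulas, is a rank correlation: it counts concordant minus discordant pairs, normalized by the total number of pairs. A minimal sketch (ignoring tie corrections, which a full implementation would handle):

```python
def kendall_tau(x, y):
    """Kendall's tau-a rank correlation: (concordant - discordant) / n(n-1)/2."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1   # pair ordered the same way in x and y
            elif s < 0:
                discordant += 1   # pair ordered oppositely
    return (concordant - discordant) / (n * (n - 1) / 2)
```

A tau near +1 or -1 indicates the strong monotone dependence that justifies a copula-based joint frequency analysis; a tau near 0 does not.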
... Uncertainty analysis is where the accuracy of model outcomes is determined. Uncertainty has been rapidly embraced by other fields (in particular, hydrology; Beven, 1996, 2000; Haff, 1996; Bronstert, 2004; Montanari, 2007) and may become an important component of modeling in fluvial geomorphology. However, uncertainty in numerical models has many origins: input data, model simplifications, algorithm structure, calibration process, calibration and validation data, as well as equifinality. ...
Chapter
Numerical models of fluvial geomorphology have become an important tool for investigating and understanding how our landscapes are shaped by water. During the last two decades many types of model have been developed simulating processes ranging from bank erosion during a single flood to entire catchments over thousands of years. These models all have their bespoke purpose as well as advantages and limitations. In this chapter, we present a review of the different types of model that have been developed for modeling fluvial geomorphology. Additionally, we discuss the difficulties and issues faced modeling fluvial processes such as calibration, validation, nonlinearity, and uncertainty. Finally, we discuss what possibilities and challenges lay ahead for modeling of fluvial geomorphology.
... The conventional solution/compromise to a lack of such data is to conduct a sensitivity analysis on selected parameters. Montanari (2007) argues that in ungauged or scarcely gauged catchments, sensitivity analysis provides good uncertainty estimations. ...
Thesis
Full-text available
In most studies on flood risk in Nigeria, priority is given to urban flood risk over rural flood risk. Smallholder farmers, who mostly inhabit rural areas and flood plains, are known to be at great risk of flooding, but to date there have been no studies on the impacts they experience and how they cope with these impacts. To establish the existence of significant change in precipitation in the study area, statistical analysis of precipitation data was conducted using linear regression and the non-parametric Mann-Kendall test. Landsat satellite images for 1986, 2003 and 2016 were obtained from the United States Geological Survey (USGS) website and used to collect and analyze land use data. To determine flood risk in the context of land use change, the HEC-HMS model version 3.5 was used. Land use maps corresponding to the 1986, 2003, and 2016 LULC conditions were analysed and prepared for the calculation of CN values using the Soil Conservation Service curve number (SCS-CN) method. The CN map was prepared by integrating the maps of hydrologic soil groups and land use in ArcGIS software. Based on selected morphological parameters, a flood risk map was developed for the study area. The morphologic flood risk showed that most parts of the basin face moderate flood risk. Higher runoff values were observed under the 2016 LULC conditions, mainly due to intense deforestation relative to 1986 and 2003. The results indicated that runoff and peak discharge are significantly affected by deforestation rather than by changes in the rainfall pattern. Flood inundation maps of the study area were prepared for the flood events of 2012 and 2015. The maximum depth for the 2012 and 2015 flood events was calculated as 10.58 m and 7.39 m, respectively. The socioeconomic aspects of the research were addressed using a household survey. Using snowball sampling techniques, 400 flood-affected households were selected.
The results show that the adaptive capacity of smallholder farmers in the study area remains very low and is hampered by several factors such as poverty and weakened social networks. The synopsis of the examined variables leads to the following conclusion: it is not climatic factors – as one indicator of vulnerability – but social inequality and poverty in the region, as well as the structural economic differences between various groups, that mainly drive vulnerability in the study area.
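The SCS-CN method used in the thesis above converts rainfall to direct runoff via the curve number: potential retention S is derived from CN, initial abstraction is taken as 0.2·S, and runoff follows the standard SCS-CN equation. A minimal sketch in mm, illustrating why the deforestation-driven CN increases reported above raise runoff:

```python
def scs_runoff(p_mm, cn):
    """Direct runoff (mm) from rainfall depth p_mm for a given curve number CN.

    Standard SCS-CN relations (metric): S = 25400/CN - 254, Ia = 0.2*S,
    Q = (P - Ia)^2 / (P - Ia + S) when P exceeds Ia, else 0.
    """
    s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
    ia = 0.2 * s               # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

For a fixed storm depth, a higher CN (less infiltration, e.g. after deforestation) yields more runoff, which is the mechanism behind the higher 2016 runoff values in the abstract.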
... Since there may be multiple models that are consistent with observed data, predictions of future or hypothetical hydrological scenarios can vary widely. Uncertainty of model results refers to the potential variability of the results due to different model structure, input parameters, and forcing variables (Beven, 1993;Montanari, 2007). Models that are able to produce results that are close to the true values and with low uncertainty are preferred over others. ...
Article
Nowadays, mathematical models of hydrological systems are routinely used to guide decision making in diverse areas, such as environmental and risk assessments, the design of remediation strategies for contaminated sites, and the evaluation of the impact of climate change on water resources. Their correct development and use is relevant beyond the realm of hydrology. Continuous improvements in computational power and data collection are leading to the development of increasingly complex models, which integrate multiple coupled physical processes to achieve a better representation of the modeled system. Most of the parameters included in models are difficult to measure directly, so they must be estimated from collected data through a calibration procedure. Furthermore, when models are used to make forecasts about future or hypothetical scenarios, it is important to bound the uncertainty of their results. Therefore, the application of systematic approaches for parameter estimation, sensitivity, and uncertainty analysis, to integrate data and models and quantify potential errors, is more necessary now than it was in the past. Even though methodological frameworks for these purposes exist, their adoption has been slow due to their high computational cost and the technical knowledge required to apply them. We analyze existing methodologies, discuss remaining challenges, and present a survey of emerging trends for the application of parameter estimation and uncertainty analysis in hydrological modeling. Thus, the main objective of this overview article is to contribute to improving the quality of models and to their correct use as support tools for decision-making. This article is categorized under: Science of Water > Hydrological Processes; Science of Water > Methods.
... In general, observations are subject to various uncertainties (Beven, 1993; Montanari, 2007; Vrugt et al., 2008; Etter et al., 2018), particularly observations related to major flood events. As an example, we present the uncertainties found in the observations of the October 2018 flood in the Aude catchment. ...
Thesis
Flood forecasting plays a fundamental role in anticipating and implementing measures to protect people and property. The objective of this thesis is to examine our capacity to improve the simulation and forecasting of major flash flood events in France. First, we examine the limits of the lumped hydrological modeling approach and the contribution of the semi-distributed hydrological model GRSD, with a fine mesh and an hourly time step, to the simulation of major flood events. We propose a modification of the structure of this model so that it is better suited to reproducing the response of catchments to high rainfall intensities. An adaptation of the model structure, based on the computation of rainfall yield, led to the introduction of a new parameter and to the proposal of a new model (GRSDi) better able to simulate the hydrological response to the intense rainfall that occurs in autumn. Second, we explore how a meteorological ensemble forecasting approach, combined with the semi-distributed hydrological model, can help to better forecast flash flood events, as well as the magnitude and timing of peak flows, in both gauged and ungauged catchments. The results allowed us to identify, from a hydrological point of view, the strengths and weaknesses of the proposed products. This work is a step forward towards the use of continuous, semi-distributed conceptual hydrological models for forecasting major flash floods in a Mediterranean context.
... To develop probabilistic maps through the estimation of uncertainties, two main procedures can be employed: Bayesian approaches [54] and the Generalized Likelihood Uncertainty Estimation (GLUE) methodology [55]. GLUE is probably the most common method used for uncertainty estimation in hydrological studies [56]. Di Baldassarre et al. [43] applied the GLUE procedure to incorporate uncertainties from the LISFLOOD-FP raster-based model parameters (channel roughness) in flood inundation. ...
Article
Full-text available
An integrated framework was employed to develop probabilistic floodplain maps, taking into account hydrologic and hydraulic uncertainties under climate change impacts. To develop the maps, several scenarios representing the individual and compounding effects of the models’ input and parameters uncertainty were defined. Hydrologic model calibration and validation were performed using a Dynamically Dimensioned Search algorithm. A generalized likelihood uncertainty estimation method was used for quantifying uncertainty. To draw on the potential benefits of the proposed methodology, a flash-flood-prone urban watershed in the Greater Toronto Area, Canada, was selected. The developed floodplain maps were updated considering climate change impacts on the input uncertainty with rainfall Intensity–Duration–Frequency (IDF) projections of RCP8.5. The results indicated that the hydrologic model input poses the most uncertainty to floodplain delineation. Incorporating climate change impacts resulted in the expansion of the potential flood area and an increase in water depth. Comparison between stationary and non-stationary IDFs showed that the flood probability is higher when a non-stationary approach is used. The large inevitable uncertainty associated with floodplain mapping and increased future flood risk under climate change imply a great need for enhanced flood modeling techniques and tools. The probabilistic floodplain maps are beneficial for implementing risk management strategies and land-use planning.
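The GLUE procedure referenced in the study above can be reduced to a simple loop: sample many parameter sets, score each simulation with an informal likelihood measure (NSE here), discard "non-behavioral" sets below a chosen threshold, and use the survivors to bound predictions. The toy one-parameter model, the synthetic observations, and the threshold value below are all hypothetical; this is a sketch of the idea, not the study's implementation:

```python
import random

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, used here as the informal GLUE likelihood."""
    m = sum(obs) / len(obs)
    return 1.0 - sum((o - s) ** 2 for o, s in zip(obs, sim)) / sum((o - m) ** 2 for o in obs)

def glue(model, param_samples, x, obs, threshold):
    """Keep (parameter, likelihood) pairs whose simulation scores above the threshold."""
    behavioral = []
    for p in param_samples:
        score = nse(obs, [model(p, xi) for xi in x])
        if score > threshold:
            behavioral.append((p, score))
    return behavioral

# Toy "model": runoff proportional to rainfall via a single parameter a.
model = lambda a, xi: a * xi
x = [1.0, 2.0, 3.0, 4.0]
obs = [2.0, 4.0, 6.0, 8.0]  # synthetic observations generated with a = 2

random.seed(0)
samples = [random.uniform(0.0, 4.0) for _ in range(200)]
behavioral = glue(model, samples, x, obs, threshold=0.3)
```

In a full GLUE application the behavioral likelihoods are normalized into weights and used to compute prediction quantiles, which is what produces the probabilistic floodplain bounds described above.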
... (2009) (Tung, 1996); (2) approximation methods, e.g., the first-order second-moment method (Melching, 1992); (3) simulation- and sampling-based (Monte Carlo) methods (Kuczera and Parent, 1998); (4) Bayesian methods, e.g., generalized likelihood uncertainty estimation (GLUE), a variant of the Bayesian approach (Beven and Binley, 1992); (5) methods based on the analysis of model errors (Montanari and Brath, 2004); (6) methods based on fuzzy set theory (Maskey et al., 2004). This classification is also consistent with those found in the literature (Shrestha and Solomatine, 2006; Montanari, 2007). ...
Thesis
Full-text available
Water is a natural resource of vital importance for all living things. The growing population and the resulting needs put pressure on the development of water resources. Knowing the amount of water available is one of the most important elements in the development of water resources. Therefore, the determination of the water movement in the hydrological cycle has gained importance. An important component of the hydrological cycle is streamflow. Streamflow is measured by gauging stations established at particular points along a river. For the efficient use and development of water resources, these measurements must be complete, long-term and reliable. However, the lack of measurements on some rivers is the most important problem encountered in the development of water resources. Hydrological studies may therefore be in danger of being unscientific and uneconomic, since gauging stations cannot be placed everywhere. Considering that smaller water resources will be utilized in the future, it is clear that the streamflow estimation problem will grow. In order to solve this problem, it is important to make accurate and reliable streamflow estimates. In the literature, there are many methods proposed for completing missing data or for estimating streamflow in ungauged basins. Especially in the last decade, researchers have adopted a variety of statistical methods that require the transfer of hydrological information from a gauged source (donor) basin to an ungauged (target) basin for streamflow estimation in ungauged basins. These methods are usually referred to as regional methods.
In order to estimate daily streamflows, the regression analysis (REG) method, drainage area ratio (DAR) method, multiple source stations based DAR (MDAR) method, inverse distance weighted (IDW) method, inverse similarity weighted (ISW) method, standardization with mean (SM) method, standardization with mean and standard deviation (SMS) method, and various variations of these individual methods were applied to the study basins. Various ensemble methods were tried to investigate the applicability of weighted estimates of the individual methods. Two separate studies were conducted, one in the Porsuk River Basin, which has sparse streamflow observations, and another in the Euphrates-Tigris Basin, which is the richest basin of Turkey in terms of surface water. Nash-Sutcliffe efficiency (NSE), root mean square error (RMSE), the RMSE-observations standard deviation ratio (RSR), the coefficient of determination (R2), percent bias (PBIAS) and relative error (RE) were used to evaluate the performance of each method. The drainage area ratio (DAR) method is one of the oldest information transfer methods for obtaining streamflow values at the target station from the source station. This method is straightforward to apply and is in widespread use by hydrologists because it requires no additional information other than the streamflow values at the source station and the drainage areas of the source and target stations. In the traditional application of this method, area-normalized streamflow values are transferred from only a single source station to the target station. In addition to the drainage area, there are some other factors that have a significant influence on the unique streamflow characteristics of a station. Because the DAR method is used with only a single source station, systematic errors can be encountered in the estimation at a target station.
When more streamflow gauging stations are used to estimate streamflow for the target station, this method is referred to as the multiple source stations based drainage area ratio (MDAR) method. The MDAR method assumes that the streamflow estimates at the target station can be computed as the weighted average of the estimates of the multiple source stations selected. The inverse distance weighted (IDW) method is one of the most widely used interpolation methods based on the geographical distance between the source and the target stations. This method can be considered as a variant of the DAR method. In the IDW method used in this study, area normalized streamflow values are directly transferred to a target station from multiple source stations. The IDW method estimates the streamflow value for the target station by taking the geographical distance between the source station and the target station as the weight. The closer the geographical distance between the source station and the target station is, the larger the influence on the target station will be. That is, when the distance decreases, the weight coefficient increases. The IDW method, also called an inverse distance to a power, is a weighted average interpolator and the main factor affecting the accuracy of the IDW method is the value of the power parameter. As the power parameter increases, more influence is given to the source stations close to the target station. In the literature, the value of the power parameter is commonly chosen as 2, which is known as the inverse distance square weighted. Alternatively, the inverse similarity weighted (ISW) method, which is similar to the IDW method, can be applied on the basis of multiple source stations. Unlike the IDW method, the ISW method uses the physical similarity distance instead of the geographical distance between the target and the source station. 
Standardization with mean (SM) and standardization with mean and standard deviation (SMS) are common transfer methods for streamflows. The SM method takes into account the ratio of streamflow to the mean streamflow. In this study, the SM method was applied with annual (SM1) and monthly (SM12) variations. SM1 standardizes the daily time series with the annual mean streamflow, while SM12 standardizes the daily time series with the monthly mean streamflow. The SMS method, in contrast, is based on the assumption that the standardized streamflows at a target and a source station are approximately equal. In this study, the SMS method was applied with annual (SMS1) and monthly (SMS12) variations. The distinction between these variations of the SMS method is similar to the distinction between SM1 and SM12. In the application to the Porsuk River Basin, firstly, the REG, DAR, SM and SMS methods were used to complete the missing data at the target station. The most appropriate completion method was selected for each station. Missing data at each target station were completed using the source station whose data were available. Secondly, assuming each station in turn to be an ungauged basin, the DAR, MDAR, IDW, SM and SMS methods were used to estimate daily streamflow. The statistical parameters (mean and standard deviation of the streamflow) used in the SM and SMS methods were calculated by suggested regression equations based on logarithmic relationships between the statistical parameters and the drainage area. In order to get more consistent estimates and reduce the uncertainties of the individual methods, the estimates from two or three of the individual methods were weighted according to their relative performance and then combined to obtain the recommended ensemble estimates for each station. The performance of these ensemble methods was compared with that of the individual methods. Promising estimation results from the ensemble methods were obtained for each station.
In addition, selected percentiles for different hydrological conditions of the flow duration curves of the observed and the estimated data were compared. As a result, obtained successful results in such a poorly gauged basin are expected to contribute to the streamflow estimation literature. In the application of the Euphrates-Tigris Basin, DAR, MDAR, IDW, ISW, SM and SMS methods were used to estimate daily streamflows. For the application of the methods, 8 streamflow gauging stations from the Middle Euphrates Basin and 7 streamflow gauging stations from the Upper Euphrates Basin, which have complete and long-term data were selected. Each basin has been evaluated separately. Three different power parameters (1, 2 and 3) of the IDW and ISW methods were compared to determine their accuracy and suitability for estimating daily streamflow values. The Regional multivariate stepwise regression method was used to estimate long-term streamflow statistics for use in SM and SMS methods. The mean and standard deviation values of the streamflows on an annual and monthly basis were calculated based on the various hydrometeorological and hydromorphological variables. The source stations were identified for each target station based on their geographical proximity, physical similarity, and correlation. Basin characteristics such as geographical, topographical, and climate variables were considered for determining the physical similarity between the source and target stations. The individual method weighting system based on absolute error, NSE, RSR and PBIAS values has been developed for performance-weighted ensemble methods. The estimation performance of various pairwise and triple combinations of the individual methods (ensemble methods) was compared with that of individual methods. In addition, the daily streamflow data were smoothed with the symmetric two-sided moving average (MA) filtering in order to reduce noise. 
The estimation performance of the statistical methods was tested and compared by using daily streamflow data with preprocessing and without preprocessing. The statistical methods using smoothed daily streamflow data by the MA, which are referred to as DAR-MA, MDAR-MA, IDW-MA, ISW-MA, SM-MA, SMS-MA were proposed. Block Bootstrap simulation method was used to test the reliability of the methods. By using 1000 block bootstrap data obtained for each station, the percentiles of the flow duration curves (values having exceedance probability) were calculated for different hydrological conditions and the 95% confidence interval was determined. It was tested whether the percentiles of the flow duration curves of the estimations obtained by DAR, MDAR, IDW, ISW, SM, SMS, and ensemble methods remain within these intervals. As a result, the methods used in the study have been shown to be easily applicable for decision making and design in water resources projects that have difficulty in obtaining data.
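The IDW streamflow transfer described in the thesis above combines specific (area-normalized) discharge from multiple source stations with weights proportional to inverse geographic distance raised to a power parameter, then rescales by the target drainage area. A minimal sketch with hypothetical station values; the thesis's full method additionally considers physical similarity (ISW) and correlation when selecting source stations:

```python
def idw_streamflow(sources, target_area_km2, power=2.0):
    """IDW transfer of area-normalized streamflow to an ungauged target.

    sources: list of (discharge_m3s, area_km2, distance_km) tuples.
    power:   the IDW power parameter (commonly 2, the inverse distance square).
    """
    weights = [1.0 / d ** power for _, _, d in sources]
    specific = [q / a for q, a, _ in sources]  # discharge per unit drainage area
    weighted = sum(w * s for w, s in zip(weights, specific)) / sum(weights)
    return weighted * target_area_km2

# Hypothetical example: two donors with identical specific discharge (0.01 m3/s/km2)
# give the target 0.01 * area regardless of their distances.
q_target = idw_streamflow([(10.0, 1000.0, 5.0), (20.0, 2000.0, 10.0)], 1500.0)
```

Raising the power parameter concentrates the weight on the nearest donors, which is the accuracy trade-off the thesis evaluates with power values of 1, 2 and 3.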
... The Generalized Likelihood Uncertainty Estimation (GLUE) methodology, one of the uncertainty analysis frameworks in hydrological modeling [16], can be used to analyse model parameter uncertainty and understand model behaviour. There have been arguments that the method is subjective, mainly because of the threshold assumption [16,17], and suggestions to treat the method as a weighted sensitivity analysis instead of a probabilistic method [18]. The threshold assumption is intended as a boundary of acceptability, where values above the threshold are considered to be behavioral [19] and are further used in the uncertainty analysis. ...
Preprint
Full-text available
Rainfall runoff modeling has been a subject of interest for decades due to a need to understand a catchment system for management, for example regarding extreme event occurrences such as flooding. Tropical catchments are particularly prone to the hazards of extreme precipitation and to internal drivers of change in the system, such as deforestation and land use change. A model framework of dynamic TOPMODEL, DECIPHeR v1 (chosen for its flexibility, modularity and portability) and the Generalized Likelihood Uncertainty Estimation (GLUE) method are both used in this study. They reveal model performance for streamflow simulation in a tropical catchment, the Kelantan River in Malaysia, which is prone to flooding and experiences high rates of land use change. Thirty-two years of continuous simulation at a daily time scale, along with uncertainty analysis, resulted in a Nash Sutcliffe Efficiency (NSE) score of 0.42 for the highest-ranked parameter set, while 25.35% of the measurements fall within the uncertainty boundary based on a behavioral threshold of NSE = 0.3. The performance and behavior of the model in the continuous simulation suggest a limited ability of the model to represent the system, particularly along the low flow regime. In contrast, the simulation of eight peak flow events achieves moderate to good fit, with four of the peak flow event simulations returning an NSE > 0.5. Nonetheless, the parameter scatterplots from both the continuous simulation and the peak flow event analyses indicate unidentifiability of all model parameters. This may be attributable to the catchment modelling scale. The results demand further investigation regarding the heterogeneity of parameters and calibration at multiple scales.
... Toy models have also been exploited for other modelling situations in geoscience (see e.g., Koutsoyiannis 2006, 2010; see also the references in Koutsoyiannis 2006), while falling into the broader category of simulation or synthetic experiments, which are increasingly conducted within various hydrological contexts, including some more relevant to the present Chapter (see e.g., Kavetski et al. 2002; Vrugt et al. 2003, 2005; Montanari 2005; Vrugt and Robinson 2007; Vrugt et al. 2008; Renard et al. 2010; Schoups and Vrugt 2010; Montanari and Koutsoyiannis 2012; Montanari and Di Baldassarre 2013; Tyralis et al. 2013; Vrugt et al. 2013; Sadegh and Vrugt 2014; Sadegh et al. 2015; Sikorska et al. 2015; Vrugt 2016; Tyralis and Papacharalampous 2017; see also Chapters 3 and 4 herein). Discussions on the significance of this type of experiments can be found in Montanari (2007). In fact, simplified modelling situations can be useful as starting points for achieving effective real-world modelling, especially in cases where analytical solutions exist (see e.g., Volpi 2012). ...
Thesis
Full-text available
This thesis falls into the scientific areas of stochastic hydrology, hydrological modelling and hydroinformatics. It contributes with new practical solutions, new methodologies and large-scale results to predictive modelling of hydrological processes, specifically to solving two interrelated technical problems with emphasis on the latter. These problems are: (A) hydrological time series forecasting by exclusively using endogenous predictor variables (hereafter referred to simply as “hydrological time series forecasting”); and (B) stochastic process-based modelling of hydrological systems via probabilistic post-processing (hereafter referred to simply as “probabilistic hydrological post-processing”). For the investigation of these technical problems, the thesis forms and exploits a novel predictive modelling and benchmarking toolbox. This toolbox consists of: (i) approximately 6 000 hydrological time series (sourced from larger freely available datasets), (ii) over 45 ready-made automatic models and algorithms mostly originating from the four major families of stochastic, (machine learning) regression, (machine learning) quantile regression, and conceptual process-based models, (iii) seven flexible methodologies (which, together with the ready-made automatic models and algorithms, constitute the basis of our modelling solutions), and (iv) approximately 30 predictive performance evaluation metrics. Novel model combinations coupled with different algorithmic argument choices result in numerous model variants, many of which could be perceived as new methods. All the utilized models (i.e., the ones already available in open software, as well as those automated and proposed in the context of the thesis) are flexible, computationally convenient and fast; thus, they are appropriate for large-sample (even global-scale) hydrological investigations. Such investigations are implied by the (mainly) algorithmic nature of the methodologies of the thesis.
In spite of this nature, the thesis also provides innovative theoretical supplements to its practical and methodological contribution. Technical problem (A) is examined in four stages. During the first stage, a detailed framework for assessing forecasting techniques in hydrology is introduced. Complying with the principles of forecasting and contrary to the existing hydrological (and, more generally, geophysical) time series forecasting literature (in which forecasting performance is usually assessed within case studies), the introduced framework incorporates large-scale benchmarking. The latter relies on big hydrological datasets, large-scale time series simulation by using classical stationary stochastic models, many automatic forecasting models and algorithms (including benchmarks), and many forecast quality metrics. The new framework is exploited (by utilizing part of the predictive modelling and benchmarking toolbox of the thesis) to provide large-scale results and useful insights on the comparison of stochastic and machine learning forecasting methods for the case of hydrological time series forecasting at large temporal scales (e.g., the annual and monthly ones), with emphasis on annual river discharge processes. The related investigations focus on multi-step ahead forecasting. During the second stage of the investigation of technical problem (A), the work conducted during the previous stage is expanded by exploring the one-step ahead forecasting properties of its methods, when the latter are applied to non-seasonal geophysical time series. Emphasis is put on the examination of two real-world datasets, an annual temperature dataset and an annual precipitation dataset. These datasets are examined in both their original and standardized forms to reveal the most and least accurate methods for long-run one-step ahead forecasting applications, and to provide rough benchmarks for the one-year ahead predictability of temperature and precipitation. 
The third stage of the investigation of technical problem (A) includes both the examination-quantification of the predictability of monthly temperature and monthly precipitation at the global scale, and the comparison of a large number of (mostly stochastic) automatic time series forecasting methods for monthly geophysical time series. The related investigations focus on multi-step ahead forecasting by using the largest real-world data sample ever used so far in hydrology for assessing the performance of time series forecasting methods. With the fourth (and last) stage of the investigation of technical problem (A), the multiple-case study research strategy is introduced, in its large-scale version, as an innovative alternative to conducting single- or few-case studies in the field of geophysical time series forecasting. To explore three sub-problems associated with hydrological time series forecasting using machine learning algorithms, an extensive multiple-case study is conducted. This multiple-case study is composed of a sufficient number of single-case studies, which exploit monthly temperature and monthly precipitation time series observed in Greece. The explored sub-problems are lagged variable selection, hyperparameter handling, and the comparison of machine learning and stochastic algorithms. Technical problem (B) is examined in three stages. During the first stage, a novel two-stage probabilistic hydrological post-processing methodology is developed by using a theoretically consistent probabilistic hydrological modelling blueprint as a starting point. The usefulness of this methodology is demonstrated by conducting toy model investigations. The same investigations also demonstrate how our understanding of the system to be modelled can guide us to achieve better predictive modelling when using the proposed methodology.
During the second stage of the investigation of technical problem (B), the probabilistic hydrological modelling methodology proposed during the previous stage is validated. The validation is made by conducting a large-scale real-world experiment at the monthly timescale. In this experiment, the increased robustness of the investigated methodology with respect to the individual predictors it combines and, by extension, to basic two-stage post-processing methodologies is demonstrated. The ability to “harness the wisdom of the crowd” is also empirically proven. Finally, during the third stage of the investigation of technical problem (B), the thesis introduces the largest range of probabilistic hydrological post-processing methods ever introduced in a single work, and additionally conducts, at the daily timescale, the largest benchmark experiment ever conducted in the field. Additionally, it assesses several theoretical and qualitative aspects of the examined problem and the application of the proposed algorithms to answer the following research question: Why and how should process-based models and machine learning quantile regression algorithms be combined for probabilistic hydrological modelling?
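The basic two-stage post-processing idea that the thesis builds on (learn the hydrological model's error distribution in a calibration period, then attach it to new point predictions) can be sketched as follows; the data and quantile levels are hypothetical, and an unconditional empirical error distribution stands in for the flexible quantile regression models used in the work:

```python
import numpy as np

def two_stage_postprocess(sim_cal, obs_cal, sim_test, probs=(0.05, 0.5, 0.95)):
    """Basic two-stage post-processing: estimate the empirical quantiles
    of the hydrological model's errors in a calibration period, then
    shift the test-period point predictions by those error quantiles."""
    errors = np.asarray(obs_cal, float) - np.asarray(sim_cal, float)
    eq = np.quantile(errors, probs)              # error quantiles
    return np.asarray(sim_test, float)[:, None] + eq[None, :]

# Hypothetical calibration-period simulations and observations
sim_cal = [5.0, 6.0, 7.0, 8.0]
obs_cal = [5.5, 5.8, 7.4, 8.1]
bands = two_stage_postprocess(sim_cal, obs_cal, [10.0, 12.0])
```

Each row of `bands` is a (lower, median, upper) probabilistic prediction for one test-period point prediction.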
... Further, structural errors in a model can lead to mismatches between modeled and observed results, particularly as the number of processes and the associated number of adjustable and uncertain model parameters increases (Kirchner, 2006). Most mechanistic models do not address model input uncertainty or model parameter uncertainty, nor do they assess model structural errors in a systematic manner (Montanari, 2007). The extent to which this is an issue depends upon the research question, but when models are applied to conservation management decisions, honest accounting of model uncertainty is essential. ...
Article
Full-text available
Just as organisms do not passively exist in their environment, developing sea turtle embryos affect their own incubation microclimate by producing metabolic heat and other waste products. This metabolic heat ultimately contributes to successful development, but it is unclear how it influences embryonic traits such as hatchling sex, and whether trends exist in the magnitude of metabolic heat produced by different species. In this systematic review, we document all empirical measurements of metabolic heat in sea turtle species and its impact on their development, and explore the methods used to predict nest temperature and how these methods account for metabolic heat. Overall, we found that metabolic heat increases throughout incubation in all seven species. Typically, the temperature at the center of an egg clutch peaked during the final third of incubation and exceeded the adjacent sand temperature by 2.5°C. Metabolic heat was not influenced by species or clutch size (n = 16 studies), but this finding was likely influenced by inconsistency in the methods used to measure metabolic heat, and by localized differences in sand properties, such as moisture, albedo, and thermal conductivity. The influence of metabolic heat on embryo sex determination and survival was dependent on the sand temperature surrounding the nest chamber, and had an appreciable effect only when it caused nest temperatures to exceed key thresholds for sex determination and survival. Methods for modeling nest temperature were either correlative (70% of publications) or mechanistic (30% of publications). Correlative models describe relationships between empirical data to predict sand temperature, while mechanistic models are based on physical laws and do not require extrapolation when predicting novel situations, such as climate change.
Most correlative models (74%) accounted for metabolic heat when predicting nest temperatures, but no mechanistic models did—largely because they were developed for predicting primary sex ratios, which are widely believed to be unaffected by metabolic heat. Consequently, developing a generalisable mechanistic model that incorporates metabolic heat will be critical for predicting incubation success as the sand temperatures at sea turtle beaches increase due to anthropogenic climate change.
... The holistic model should consider uncertainties from the different water sub-systems modeled and uncertainties due to the coupling of the sub-systems (Tscheikner-Gratl et al., 2019). However, there are differences in the perception of uncertainties across the environmental modelling and integrated water modelling communities and across the different water sectors, and a standardized methodological approach to identify, quantify, reduce, report and communicate uncertainties is still missing (Bach et al., 2014; Montanari, 2007; Vanrolleghem et al., 2011). An overview of uncertainty sources for integrated water modelling can be found in the study of Tscheikner-Gratl et al. (2019), whereas a practical approach for the quantification of uncertainty in integrated water models is proposed by Tscheikner-Gratl et al. (2017). ...
Article
Full-text available
Water, the most vital resource and one negatively affected by the linear pattern of growth, is still trying to find its position within the emerging concept of the circular economy. Fragmented, sectoral circularity approaches hide the risk of underestimating both the preservation of, and the impacts to, water resources and natural capital. In this study, a game-changing circularity assessment framework (the MSWCA) is developed. The MSWCA follows a multi-sectoral systems approach, symbiotically managing key water-related socio-economic (i.e. urban water, agro-food, energy, industry and waste handling) and non-economic (i.e. natural environment) sectors. The MSWCA modelling framework enables the investigation of the feedback loops between the nature-managed and human-managed systems to assess the circularity of water and water-related resources. The three CE principles lie at the core of the developed framework, enabling the consideration of physical, technical, environmental and economic aspects. An indicators database is further developed, including all the relevant data requirements as well as existing and newly developed indicators for assessing multi-sectoral systems' circularity. The MSWCA framework is conceptually applied to a fictional city, facilitating its understanding and practical use.
... Further, an updated version should better characterize heterogeneities within each catchment, both for the time series and the attributes. Additionally, since data uncertainties are omnipresent (Montanari, 2007; Blöschl et al., 2019b; Addor et al., 2019), they should be further explored by including additional data sources. ...
Article
Full-text available
We introduce a new catchment dataset for large-sample hydrological studies in Brazil. This dataset encompasses daily time series of observed streamflow from 3679 gauges, as well as meteorological forcing (precipitation, evapotranspiration, and temperature) for 897 selected catchments. It also includes 65 attributes covering a range of topographic, climatic, hydrologic, land cover, geologic, soil, and human intervention variables, as well as data quality indicators. This paper describes how the hydrometeorological time series and attributes were produced, their primary limitations, and their main spatial features. To facilitate comparisons with catchments from other countries, the data follow the same standards as the previous CAMELS (Catchment Attributes and MEteorology for Large-sample Studies) datasets for the United States, Chile, and Great Britain. CAMELS-BR (Brazil) complements the other CAMELS datasets by providing data for hundreds of catchments in the tropics and the Amazon rainforest. Importantly, precipitation and evapotranspiration uncertainties are assessed using several gridded products, and quantitative estimates of water consumption are provided to characterize human impacts on water resources. By extracting and combining data from these different data products and making CAMELS-BR publicly available, we aim to create new opportunities for hydrological research in Brazil and facilitate the inclusion of Brazilian basins in continental to global large-sample studies. We envision that this dataset will enable the community to gain new insights into the drivers of hydrological behavior, better characterize extreme hydroclimatic events, and explore the impacts of climate change and human activities on water resources in Brazil. The CAMELS-BR dataset is freely available at https://doi.org/10.5281/zenodo.3709337 (Chagas et al., 2020).
... It is well-known that quantifying uncertainty in hydrology is a hard problem (Beven, 2016;Clark et al., 2011;Montanari, 2007;Renard et al., 2010), which is a polite way of saying that strictly reliable uncertainty accounting is impossible. We cannot quantify what we do not know. ...
Article
Full-text available
Model evaluation and hypothesis testing are fundamental to any field of science. We propose here that slightly changing the way we think and communicate about inference, from fundamentally a problem of uncertainty quantification to a problem of information quantification, allows us to avoid certain problems related to testing models as hypotheses. We propose that scientists are typically interested in assessing the information provided by models, not the truth value or likelihood of a model. Information theory allows us to formalize this perspective.
... Uncertainty is a subject of ongoing discussions in hydrology (see e.g., Beven, 1993; Vogel, 1999; Beven, 2000, 2001; Beven and Freer, 2001; Krzysztofowicz, 2001a; Pappenberger and Beven, 2006; Koutsoyiannis and Montanari, 2007; Montanari, 2007; Koutsoyiannis et al., 2009; Koutsoyiannis, 2010; Kuczera et al., 2010; Ramos et al., 2010; Weijs et al., 2010; Koutsoyiannis, 2011; Juston et al., 2012; Ramos et al., 2013; Nearing et al., 2016). Hydrological modelling uncertainty is traditionally recognised within the model calibration and validation phases (Montanari, 2011) in the context of the widely accepted evaluation framework proposed by Klemeš (1986). ...
Article
Predictive hydrological uncertainty can be quantified by using ensemble methods. If properly formulated, these methods can offer improved predictive performance by combining multiple predictions. In this work, we use 50-year-long monthly time series observed in 270 catchments in the United States to explore the performance of an ensemble learning post-processing methodology for issuing probabilistic hydrological predictions. This methodology allows the utilization of flexible quantile regression models for exploiting information about the hydrological model’s error. Its key differences with respect to basic two-stage hydrological post-processing methodologies using the same type of regression models are that (a) instead of a single point hydrological prediction it generates a large number of “sister predictions” (yet using a single hydrological model), and that (b) it relies on the concept of combining probabilistic predictions via simple quantile averaging. A major hydrological modelling challenge is obtaining probabilistic predictions that are simultaneously reliable and associated with prediction bands that are as narrow as possible; therefore, we assess both of these desired properties by computing the predictions’ coverage probabilities, average widths and average interval scores. The results confirm the usefulness of the proposed methodology and its greater robustness with respect to basic two-stage post-processing methodologies. Finally, this methodology is empirically proven to harness the “wisdom of the crowd” in terms of average interval score, i.e., the average of the individual predictions combined by this methodology scores no worse, and usually better, than the average of the scores of the individual predictions.
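The two evaluation quantities named in this abstract, coverage probability and the average interval score, can be sketched as follows. The function names are ours, and the formulation assumes a central (1 − alpha) prediction interval:

```python
import numpy as np

def interval_score(obs, lower, upper, alpha=0.1):
    """Winkler/interval score for a central (1 - alpha) prediction
    interval: band width plus penalties, scaled by 2/alpha, for
    observations falling outside the band. Lower scores are better."""
    obs = np.asarray(obs, float)
    lower = np.asarray(lower, float)
    upper = np.asarray(upper, float)
    width = upper - lower
    below = (2.0 / alpha) * (lower - obs) * (obs < lower)
    above = (2.0 / alpha) * (obs - upper) * (obs > upper)
    return float(np.mean(width + below + above))

def coverage(obs, lower, upper):
    """Fraction of observations falling inside the prediction band."""
    obs = np.asarray(obs, float)
    return float(np.mean((obs >= np.asarray(lower)) & (obs <= np.asarray(upper))))
```

Together these two quantities capture the trade-off described above: a reliable band covers roughly (1 − alpha) of the observations, while the interval score additionally rewards narrow bands.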
... Examples of toy experiments from the probabilistic hydrological modelling literature are available in Krzysztofowicz (1999), Beven and Freer (2001), Stedinger et al. (2008), Farmer and Vogel (2016), and Volpi et al. (2017). Toy models have also been exploited for other modelling situations in geoscience (see e.g., Koutsoyiannis, 2006; see also the references in Koutsoyiannis, 2006), while falling into the broader category of simulation or synthetic experiments, which are increasingly conducted within various hydrological contexts, including some more relevant to the present study (see e.g., Kavetski et al., 2002; Vrugt et al., 2003; Montanari, 2005; Vrugt et al., 2005; Vrugt and Robinson, 2007; Vrugt et al., 2008; Renard et al., 2010; Schoups and Vrugt, 2010; Montanari and Koutsoyiannis, 2012; Montanari and Di Baldassarre, 2013; Tyralis et al., 2013; Vrugt et al., 2013; Sadegh and Vrugt, 2014; Sadegh et al., 2015; Sikorska et al., 2015; Vrugt, 2016; Tyralis and Papacharalampous, 2017; Papacharalampous et al., 2019a); see also Montanari (2007) for a discussion on the significance of this type of experiments. In fact, simplified modelling situations can be useful as starting points for achieving effective real-world modelling, especially in cases where analytical solutions exist (see e.g., Volpi et al., 2012). ...
Article
We introduce an ensemble learning post-processing methodology for probabilistic hydrological modelling. This methodology generates numerous point predictions by applying a single hydrological model, yet with different parameter values drawn from the respective simulated posterior distribution. We call these predictions “sister predictions”. Each sister prediction extending into the period of interest is converted into a probabilistic prediction using information about the hydrological model’s errors. This information is obtained from a preceding period for which observations are available, and is exploited using a flexible quantile regression model. All probabilistic predictions are finally combined via simple quantile averaging to produce the output probabilistic prediction. The idea is inspired by the ensemble learning methods originating from the machine learning literature. The proposed methodology offers greater robustness in performance than basic post-processing methodologies using a single hydrological point prediction. It is also empirically proven to “harness the wisdom of the crowd” in terms of average interval score, i.e., the obtained quantile predictions score no worse, and usually better, than the average score of the combined individual predictions. This proof is provided within toy examples, which can be used for gaining insight into how the methodology works and under which conditions it can optimally convert point hydrological predictions to probabilistic ones. A large-scale hydrological application is made in a companion paper.
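The simple quantile averaging used above to combine the sister predictions can be sketched in one function; the member predictions below are hypothetical:

```python
import numpy as np

def quantile_average(prob_predictions):
    """Combine several probabilistic predictions by simple quantile
    averaging: for each time step and each quantile level, average
    the predicted quantiles across ensemble members.
    prob_predictions: shape (n_members, n_times, n_quantiles)."""
    return np.mean(np.asarray(prob_predictions, float), axis=0)

# Hypothetical sister predictions from two members, one time step,
# three quantile levels (e.g., 0.05, 0.50, 0.95)
members = [[[4.0, 5.0, 6.0]], [[6.0, 7.0, 10.0]]]
combined = quantile_average(members)
```

Averaging quantiles level by level keeps the combined quantiles ordered whenever the member quantiles are ordered, which is one reason this simple combiner is attractive.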
... Uncertainty is defined as a measure of the lack of accuracy concerning observed data and the modeling outcome. In hydrology, uncertainty has traditionally been estimated using probability theory (Montanari, 2007). Several attempts have been made to assess uncertainties in design rainfall estimation and rainfall runoff modeling (Ewen et al., 2006;Hailegeorgis et al., 2013;Kavetski et al., 2006;Meshgi and Khalili, 2009;Renard et al., 2010;Tung and Wong, 2014;Wu et al., 2011;Yen and Ang, 1971;Yu and Cheng, 1998). ...
Preprint
Full-text available
Inverse problems are ubiquitous in hydrological modelling for parameter estimation, system understanding, sustainable water resources management, and the operation of digital twins. While statistical inversion is especially popular, its sampling-based nature often inhibits its application to computationally costly models, which has compromised the use of the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, e.g., for spatially distributed (partial) differential equation based models. In this study we introduce multilevel GLUE (MLGLUE), which alleviates the computational burden of statistical inversion by utilizing a hierarchy of model resolutions. Inspired by multilevel Monte Carlo, most parameter samples are evaluated on lower levels with computationally cheap low-resolution models and only samples associated with a likelihood above a certain threshold are subsequently passed to higher levels with costly high-resolution models for evaluation. Inferences are made at the level of the highest-resolution model but substantial computational savings are achieved by discarding samples with low likelihood already on levels with low resolution and low computational cost. Two example inverse problems, using a rainfall-runoff model and groundwater flow model, demonstrate the substantially increased computational efficiency of MLGLUE compared to GLUE as well as the similarity of inversion results. Findings are furthermore compared to inversion results from Markov-chain Monte Carlo (MCMC) and multilevel delayed acceptance MCMC, a corresponding multilevel variant, to compare the effects of the multilevel extension. All examples demonstrate the wide-range suitability of the approach and include guidelines for practical applications.
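The multilevel screening idea behind MLGLUE can be caricatured with two stand-in likelihood functions of very different notional cost; the Gaussian-shaped likelihoods and the common threshold are assumptions for illustration, not the study's models:

```python
import numpy as np

rng = np.random.default_rng(0)

def likelihood_coarse(k):
    """Hypothetical cheap low-resolution model evaluation."""
    return np.exp(-(k - 0.7) ** 2 / 0.1)

def likelihood_fine(k):
    """Hypothetical costly high-resolution model evaluation."""
    return np.exp(-(k - 0.68) ** 2 / 0.08)

samples = rng.uniform(0.0, 2.0, 5000)

# Level 0: screen all samples with the cheap model
threshold = 0.3
coarse = likelihood_coarse(samples)
promoted = samples[coarse > threshold]   # only these reach the fine level

# Level 1: evaluate only the surviving samples with the costly model;
# the inference itself uses the fine-level likelihoods
fine = likelihood_fine(promoted)
behavioral = promoted[fine > threshold]
```

The savings come from the fact that most of the 5000 samples are rejected at level 0 and never touch the expensive model, while the behavioral set is still defined at the highest level.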
Article
Full-text available
Reducing uncertainty in quantifying basin‐scale water‐budget components is one of the most vexing problems for water‐resource managers. Although a vast literature addresses methods and scales of component estimates, uncertainty associated with these estimates is often lacking. Here we review sources of uncertainty and compile reported uncertainty estimates for measurement and estimation methods for six water‐budget components: precipitation, streamflow, evapotranspiration, subsurface discharge, infiltration, and recharge. Quantifying the uncertainty in each water‐budget component is required before total water‐budget uncertainty can be determined. This article is categorized under: Science of Water > Hydrological Processes; Science of Water > Methods.
Article
Accurate runoff prediction is critical for various fields of hydrology, agriculture, and environmental studies. Numerous hydrologic models have been developed and demonstrate good performance in runoff simulation. However, errors are inherent in forecasted runoff, which can cause uncertainty in real-time flood warning systems. In order to improve the predictive performance of hydrologic modeling, this study used a deep learning approach as a post-processor to correct for errors associated with hydrologic data. The proposed model uses a long short-term memory network with a sequence-to-sequence structure as a post-processor to improve runoff forecasting. Specifically, the deep learning approach was used to estimate errors in the forecasted hourly runoff provided by the National Water Model in the Russian River basin, California, United States. Error predictions for hourly runoff with lead times between 1 and 18 hours were developed using observed precipitation and errors from upstream stream gages to improve the predictive performance of the National Water Model. The predictive performance was evaluated using numerous statistical metrics, and the results show that the long short-term memory model with the sequence-to-sequence post-processor improved runoff predictions compared to standalone results from the National Water Model. Percent bias decreased from a range of −60% to 80% down to −15% to 10% when the post-processor model was used, and similarly the root mean square error of the runoff predictions decreased from 120 cms to 20 cms. Thus, this study demonstrates the power of deep learning models to improve hydrologic modeling results, especially those with short forecasting lead times.
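The post-processing idea described here (predict the model's error from observed predictors, then correct the raw forecast) can be illustrated without a neural network; a plain least-squares regression stands in for the LSTM below, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Synthetic setup: the "hydrologic model" has a systematic bias that a
# post-processor can learn from an observed predictor (precipitation)
precip = rng.gamma(2.0, 2.0, n)                  # hypothetical predictor
raw_forecast = 3.0 * precip + rng.normal(0.0, 1.0, n)
observed = 3.5 * precip + 1.0                    # synthetic "true" runoff

# Post-processor: regress the forecast error on the predictor,
# then add the predicted error back to the raw forecast
error = observed - raw_forecast
X = np.column_stack([np.ones(n), precip])
coef, *_ = np.linalg.lstsq(X, error, rcond=None)
corrected = raw_forecast + X @ coef

def rmse(a, b):
    return float(np.sqrt(np.mean((np.asarray(a) - np.asarray(b)) ** 2)))
```

In the cited study the regression is replaced by a sequence-to-sequence LSTM and the predictors include upstream-gage errors, but the correction logic is the same.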
Article
Increasing numbers of studies have highlighted that the remarkable inherent uncertainty in the design flood hydrograph (DFH) can potentially undermine flood management decisions. In order to quantitatively trace the propagation of DFH uncertainty in a reservoir flood control system, we propose a novel methodological framework comprising three core parts. First, a copula-based DFH estimation model integrating Bayes’ theorem is presented to estimate the DFH under model parameter uncertainty. Second, we run an optimal reservoir operation model for flood control (OROMFC) with the uncertain DFH as the input variable to derive reservoir flood control operations (i.e., the output variable). Third, an information theory-based model is designed to trace the propagation of DFH uncertainty in the reservoir flood control system. A reservoir flood control system in the Han River basin in China is selected as the case study. The results indicate that uncertainty in reservoir flood control operations is reduced in comparison with the remarkable uncertainty in the DFH, owing to the performance of the OROMFC. Specifically, uncertainty in reservoir flood control operations in periods close to the peak flow is much smaller than in other periods. This phenomenon further highlights the importance of reservoir flood control operations during the periods prior to the peak flow. Additionally, we find that uncertainty in the flood peak of the DFH is the dominant factor affecting reservoir flood control operations, compared with uncertainty in the flood volume. Moreover, we explore the impact of reservoir flood control capacity on reservoir operations in the context of DFH uncertainty. An interesting linear expression is found and fitted for identifying design flood events that induce reservoir overtopping under a specific reservoir flood control capacity.
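The information-theoretic notion of uncertainty reduction through the reservoir system can be illustrated with discrete Shannon entropy; the five-class inflow distribution and the collapsing release mapping below are purely hypothetical:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in nats) of a discrete probability distribution."""
    p = np.asarray(p, float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Hypothetical: the uncertain design-flood peak takes five equiprobable
# classes; the reservoir operation maps the top three inflow classes to
# the same (capped) release class, collapsing part of the uncertainty
p_inflow = [0.2, 0.2, 0.2, 0.2, 0.2]
p_release = [0.2, 0.2, 0.6]

h_in = shannon_entropy(p_inflow)
h_out = shannon_entropy(p_release)
```

The entropy of the release distribution is smaller than that of the inflow distribution, mirroring the paper's finding that operations reduce the uncertainty inherited from the DFH.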
Article
Full-text available
Runoff changes on the Tibetan Plateau affect the water supply of billions of people in Asia, yet meteorological and hydrological gauging stations in the region are sparse, making runoff and water balance estimation highly challenging. Based on eight long-term precipitation products, this study performed ensemble runoff simulations, incorporating glacier melt processes into a land surface hydrological model, to investigate the spatiotemporal variations of precipitation, runoff, and the runoff components (rainfall runoff, glacier-melt runoff, and snowmelt runoff) over the Tibetan Plateau and to identify their uncertainties. The simulations show that, during 1984-2015, precipitation and runoff over the Tibetan Plateau averaged about 423 mm/a and 212 mm/a, respectively, and both exhibited increasing trends; rainfall, snowmelt, and glacier melt contributed 66%, 12%, and 22% of the total runoff, respectively. Snowmelt runoff was relatively stable, while rainfall runoff and glacier-melt runoff both increased, especially in regions such as Qamdo and Nyingchi in the southern Plateau, where glacier-melt runoff grew rapidly at a rate of 6 mm/a. Given the discrepancies among the precipitation datasets, the simulated rainfall runoff and snowmelt runoff carry large uncertainties, particularly in the western and southern Tibetan Plateau, where the maximum uncertainty exceeds 70%. By accounting for the uncertainty in precipitation data and using hydrological ensemble simulation, this study reveals the contributions of rainfall, glacier melt, and snowmelt to runoff, providing a scientific basis for water resources assessment in the region.
Conference Paper
In this paper we derive an annual erosivity equation for the Manasbal Lake catchment that can also be used for the adjoining Himalayan region. Because annual erosivity varies strongly from region to region and no such study exists for the Manasbal Lake region (Lower Himalayas), this study holds high significance. Another purpose of this paper is to highlight how much easier data analysis has made such work. In civil engineering, and especially in water resource engineering, the data sets we work with (precipitation, humidity, temperature, and so on) are very large. For example, the data used in this study comprise over 340,000 rows, which would have been almost impossible to analyze manually. It is therefore important to recognize the role of data analysis in the future of water resource engineering.
Article
Full-text available
Rainfall-runoff modeling has been a subject of interest for decades because of the need to understand catchment systems for management, for example with regard to extreme events such as flooding. Tropical catchments are particularly prone to the hazards of extreme precipitation and to internal drivers of change in the system, such as deforestation and land use change. A model framework of dynamic TOPMODEL, DECIPHeR v1 (chosen for its flexibility, modularity, and portability), and the Generalized Likelihood Uncertainty Estimation (GLUE) method are used in this study. They reveal model performance for streamflow simulation in a tropical catchment, the Kelantan River in Malaysia, which is prone to flooding and experiences high rates of land use change. Thirty-two years of continuous simulation at a daily time step, together with uncertainty analysis, resulted in a Nash-Sutcliffe Efficiency (NSE) score of 0.42 for the highest-ranked parameter set, while 25.35% of the measurements fall within the uncertainty boundary based on a behavioral threshold of NSE 0.3. The performance and behavior of the model in continuous simulation suggest a limited ability of the model to represent the system, particularly in the low flow regime. In contrast, the simulation of eight peak flow events achieves moderate to good fits, with four of the peak flow event simulations returning an NSE > 0.5. Nonetheless, the parameter scatter plots from both the continuous simulation and the peak flow event analyses indicate that none of the model parameters are identifiable, which may be attributable to the catchment modeling scale. The results call for further investigation of parameter heterogeneity and calibration at multiple scales.
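The NSE score and the GLUE behavioral-threshold selection used in this study can be sketched as follows; the ensemble and observations are toy data, and the 0.3 threshold mirrors the one quoted above.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is perfect; 0 matches a mean-flow benchmark."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def behavioral(simulations, obs, threshold=0.3):
    """Keep the GLUE 'behavioral' ensemble members whose NSE exceeds the threshold."""
    scores = np.array([nse(obs, s) for s in simulations])
    keep = scores > threshold
    return simulations[keep], scores[keep]

# Toy ensemble: each row is a simulation from a different parameter set.
obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
ens = np.array([obs + 0.1,              # slight positive bias -> high NSE
                obs * 0.9,              # mild scaling error -> high NSE
                np.full(5, obs.mean())])  # mean-flow benchmark -> NSE of 0
kept, scores = behavioral(ens, obs, threshold=0.3)
print(len(kept), scores)  # the constant-mean member is rejected
```

In the actual GLUE workflow the behavioral members would then be likelihood-weighted to form prediction limits, against which the 25.35% coverage quoted above would be measured.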
Thesis
Full-text available
Catchment models are conventionally evaluated in terms of their response surface or likelihood surface constructed from model runs using different sets of model parameters. Model evaluation methods are mainly based upon the concept of the equifinality of model structures or parameter sets. The operational definition of equifinality is that multiple model structures/parameters are equally capable of producing acceptable simulations of catchment processes such as runoff. Examining various aspects of this convention, in this thesis I demonstrate its shortcomings and introduce improvements, including new approaches and insights for evaluating catchment models as multiple working hypotheses (MWH). First (Chapter 2), arguing that there is more to equifinality than just model structures/parameters, I propose a theoretical framework to conceptualise various facets of equifinality, based on a meta-synthesis of a broad range of literature across geosciences, system theory, and philosophy of science. I distinguish between process-equifinality (equifinality within real-world systems/processes) and model-equifinality (equifinality within models of real-world systems), explain various aspects of each of these two facets, and discuss their implications for hypothesis testing and modelling of hydrological systems under uncertainty. Second (Chapter 3), building on this theoretical framework, I propose that characterising model-equifinality based on model internal fluxes — instead of model parameters, which is the current approach to accounting for model-equifinality — provides valuable insights for evaluating catchment models. I developed a new method for model evaluation — called flux mapping — based on the equifinality of runoff generating fluxes of large ensembles of catchment model simulations (1 million model runs for each catchment). 
Evaluating the model behaviour within the flux space is a powerful approach, beyond the convention, to formulate testable hypotheses for runoff generation processes at the catchment scale. Third (Chapter 4), I further explore the dependency of the flux map of a catchment model upon the choice of model structure and parameterisation, error metric, and data information content. I compare two catchment models (SIMHYD and SACRAMENTO) across 221 Australian catchments (known as Hydrologic Reference Stations, HRS) using multiple error metrics. I particularly demonstrate the fundamental shortcomings of two widely used error metrics — the Nash–Sutcliffe efficiency and Willmott's refined index of agreement — in model evaluation. I develop the skill score version of the Kling–Gupta efficiency (KGEss), and argue that it is a more reliable error metric than the other metrics. I also compare two strategies of random sampling (Latin Hypercube Sampling) and guided search (Shuffled Complex Evolution) for model parameterisation, and discuss their implications for evaluating catchment models as MWH. Finally (Chapter 5), I explore how catchment characteristics (physiographic, climatic, and streamflow response characteristics) control the flux map of catchment models (i.e. runoff generation hypotheses). To this end, I formulate runoff generating hypotheses from a large ensemble of SIMHYD simulations (1 million model runs in each catchment). These hypotheses are based on the internal runoff fluxes of SIMHYD — namely infiltration excess overland flow, interflow and saturation excess overland flow, and baseflow — which represent runoff generation at the catchment scale. I examine the dependency of these hypotheses on 22 different catchment attributes across 186 of the HRS catchments with acceptable model performance and sufficient parameter sampling. 
The model performance of each simulation is evaluated using the KGEss metric benchmarked against the catchment-specific calendar-day average observed flow model, which is more informative than the conventional benchmark of the overall average observed flow. I identify catchment attributes that control the degree of equifinality of model runoff fluxes. A higher degree of flux equifinality implies larger uncertainties associated with the representation of runoff processes at the catchment scale, and hence poses a greater challenge for reliable and realistic simulation and prediction of streamflow. The findings of this chapter provide insights into the functional connectivity of catchment attributes and the internal dynamics of model runoff fluxes.
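The KGEss metric described above follows the usual skill-score form. A minimal sketch, assuming the standard Kling–Gupta decomposition; the benchmark value is illustrative rather than the calendar-day climatology benchmark used in the thesis.

```python
import numpy as np

def kge(obs, sim):
    """Kling-Gupta efficiency from its correlation, variability, and bias terms."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]   # linear correlation
    alpha = sim.std() / obs.std()     # variability ratio
    beta = sim.mean() / obs.mean()    # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def kge_skill_score(kge_model, kge_benchmark):
    """Skill relative to a benchmark: > 0 means the model beats the benchmark."""
    return (kge_model - kge_benchmark) / (1.0 - kge_benchmark)

# Illustrative series; the benchmark KGE of 0.2 is an assumed value.
obs = np.array([2.0, 3.0, 5.0, 4.0, 6.0])
sim = np.array([2.2, 2.9, 4.8, 4.3, 5.9])
print(kge(obs, sim), kge_skill_score(kge(obs, sim), 0.2))
```

Because the score is normalized by the benchmark's shortfall from a perfect score, a KGEss of 0 means "no better than the benchmark", which makes scores comparable across catchments with benchmarks of different difficulty.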
Article
Full-text available
This paper is derived from a keynote talk given at Google's 2020 Flood Forecasting Meets Machine Learning Workshop. Recent experiments applying deep learning to rainfall‐runoff simulation indicate that there is significantly more information in large‐scale hydrological data sets than hydrologists have been able to translate into theory or models. While there is a growing interest in machine learning in the hydrological sciences community, in many ways our community still holds deeply subjective and non-evidence-based preferences for models based on a certain type of "process understanding" that has historically not translated into accurate theory, models, or predictions. This commentary is a call to action for the hydrology community to focus on developing a quantitative understanding of where and when hydrological process understanding is valuable in a modeling discipline increasingly dominated by machine learning. We offer some potential perspectives and preliminary examples of how this might be accomplished.
Preprint
Full-text available
We suggest that there is a potential danger to the hydrological sciences community in not recognizing how transformative machine learning could be for the future of hydrological modeling. Given the recent success of machine learning applied to modeling problems, it is unclear what the role of hydrological theory might be in the future. We suggest that a central challenge in hydrology right now should be to clearly delineate where and when hydrological theory adds value to prediction systems. Lessons learned from the history of hydrological modeling motivate several clear next steps toward integrating machine learning into hydrological modeling workflows.
Article
Full-text available
The estimation of design rainfalls is necessary to estimate the exceedance probabilities of extreme floods required to design hydraulic structures and to quantify the risk of failure of these structures. New approaches to estimating extreme rainfall events are being developed internationally. This paper reviews methods for estimating design rainfalls, particularly extreme events, in South Africa and internationally, and highlights the need to update methods used for estimating extreme rainfall events in South Africa as a platform for future research.
Article
Full-text available
The impact of input errors on the calibration of watershed models is a recurrent theme in the water science literature. It is now acknowledged that hydrological models are sensitive to errors in precipitation measurements and that those errors bias the model parameters estimated via the standard least squares (SLS) approach. This paper presents a Bayesian uncertainty framework allowing one to account for input, output, and structural (model) uncertainties in the calibration of a model. Using this framework, we study the impact of input uncertainty on the parameters of the hydrological model "abc". Mostly of academic interest, the "abc" model has a response that is linear in its input, allowing the closed-form integration of nuisance variables under proper assumptions. Using those analytical solutions to compute the posterior density of the model parameters, some interesting observations can be made about their sensitivity to input errors. We provide an explanation for the bias identified in the SLS approach and show that in the input error context the prior on the "true" input value has a significant influence on the parameters' posterior density. Overall, the parameters obtained from the Bayesian method are more accurate, and the uncertainty over them is more realistic, than with SLS. This method, however, is specific to linear models, while most hydrological models display strong nonlinearities. Further research is thus needed to demonstrate the applicability of the uncertainty framework to commonly used hydrological models. Citation: Huard, D., and A. Mailhot (2006), A Bayesian perspective on input uncertainty in model calibration: Application to hydrological model "abc", Water Resources Research, 42, W07416, doi:10.1029/2005WR004661.
Article
Full-text available
Rainfall-runoff models have received a great deal of attention from researchers in recent decades. However, the analysis of their reliability and uncertainty has not been treated as thoroughly. In the present study, a technique for assessing the uncertainty of rainfall-runoff simulations is presented that makes use of a meta-Gaussian approach in order to estimate the probability distribution of the model error conditioned on the simulated river flow. The proposed technique is applied to the case study of an Italian river basin, for which the confidence limits of simulated river flows are derived and compared with the respective actual observations. Citation: Montanari, A., and A. Brath (2004), A stochastic approach for assessing the uncertainty of rainfall-runoff simulations, Water Resources Research, 40, W01106, doi:10.1029/2003WR002540.
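A minimal sketch of the meta-Gaussian idea: transform model errors and simulated flows to normal scores, assume bivariate-normal dependence between them, and read off the conditional distribution of the error given a new simulated flow. The band returned below is in normal-score space (mapping it back to flow units would invert the transform), and the data and function names are invented.

```python
import numpy as np
from scipy import stats

def normal_scores(x):
    """Normal quantile transform: map a sample to standard-normal scores by rank."""
    ranks = stats.rankdata(x)
    return stats.norm.ppf(ranks / (len(x) + 1.0))

def conditional_error_band(sim, err, sim_new_score, level=0.95):
    """Mean-centred quantile band of the error's normal score, conditioned on the
    simulated flow's normal score under a bivariate-normal assumption."""
    zs, ze = normal_scores(sim), normal_scores(err)
    rho = np.corrcoef(zs, ze)[0, 1]
    mean = rho * sim_new_score          # conditional mean of ze given zs
    sd = np.sqrt(1.0 - rho ** 2)        # conditional standard deviation
    z = stats.norm.ppf(0.5 + level / 2.0)
    return mean - z * sd, mean + z * sd

# Illustrative synthetic data: errors loosely tied to simulated flow magnitude.
rng = np.random.default_rng(0)
sim = rng.gamma(2.0, 5.0, size=500)
err = 0.1 * sim + rng.normal(0.0, 1.0, size=500)
lo, hi = conditional_error_band(sim, err, sim_new_score=1.0)
print(lo, hi)
```

The key property exploited by the approach is that, after the transform, the error-flow dependence can be summarized by a single correlation, so confidence limits widen or narrow with the simulated flow.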
Article
Full-text available
Several methods have recently been proposed for quantifying the uncertainty of hydrological models. These techniques are based upon different hypotheses, are diverse in nature, and produce outputs that can differ significantly in some cases. One of the favored methods for uncertainty assessment in rainfall-runoff modeling is the generalized likelihood uncertainty estimation (GLUE). However, some fundamental questions related to its application remain unresolved. One such question is that GLUE relies on some explicit and implicit assumptions, and it is not fully clear how these may affect the uncertainty estimation when referring to large samples of data. The purpose of this study is to address this issue by assessing how GLUE performs in detecting uncertainty in the simulation of long series of synthetic river flows. The study aims to (1) discuss the hypotheses underlying GLUE and derive indications about their effects on the uncertainty estimation, and (2) compare the GLUE prediction limits with a large sample of data simulated in the presence of known sources of uncertainty. The analysis shows that the prediction limits provided by GLUE do not necessarily include a percentage of the observed data close to their confidence level. In fact, in all the experiments, GLUE underestimates the total uncertainty of the simulation provided by the hydrological model. Citation: Montanari, A. (2005), Large sample behaviors of the generalized likelihood uncertainty estimation (GLUE) in assessing the uncertainty of rainfall-runoff simulations, Water Resources Research, 41, W08406, doi:10.1029/2004WR003826.
Article
Full-text available
Hydrologic models use relatively simple mathematical equations to conceptualize and aggregate the complex, spatially distributed, and highly interrelated water, energy, and vegetation processes in a watershed. A consequence of process aggregation is that the model parameters often do not represent directly measurable entities and must therefore be estimated using measurements of the system inputs and outputs. During this process, known as model calibration, the parameters are adjusted so that the behavior of the model approximates, as closely and consistently as possible, the observed response of the hydrologic system over some historical period of time. In practice, however, because of errors in the model structure and the input (forcing) and output data, this has proven to be difficult, leading to considerable uncertainty in the model predictions. This paper surveys the limitations of current model calibration methodologies, which treat the uncertainty in the input-output relationship as being primarily attributable to uncertainty in the parameters, and presents a simultaneous optimization and data assimilation (SODA) method, which improves the treatment of uncertainty in hydrologic modeling. The usefulness and applicability of SODA is demonstrated by means of a pilot study using data from the Leaf River watershed in Mississippi and a simple hydrologic model with typical conceptual components.
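SODA combines a global parameter search with ensemble Kalman filtering of the model states; only the inner EnKF analysis step is sketched here, on invented toy data. In the full method this state update would run inside a parameter optimization loop, so that parameter estimates are no longer forced to absorb state errors.

```python
import numpy as np

def enkf_update(states, obs, obs_err_sd, rng):
    """One ensemble Kalman filter analysis step, observing the first state
    variable. `states` has shape (n_members, n_states)."""
    n = states.shape[0]
    h = states[:, 0]                                    # each member's predicted observation
    anomalies = states - states.mean(axis=0)
    cov_xy = anomalies.T @ (h - h.mean()) / (n - 1)     # state-observation covariances
    gain = cov_xy / (h.var(ddof=1) + obs_err_sd ** 2)   # Kalman gain
    perturbed = obs + rng.normal(0.0, obs_err_sd, n)    # perturbed observations
    return states + np.outer(perturbed - h, gain)

# Illustrative: a 200-member ensemble of two states, pulled toward an observation.
rng = np.random.default_rng(1)
prior = rng.normal(0.0, 1.0, size=(200, 2))
posterior = enkf_update(prior, obs=2.0, obs_err_sd=0.1, rng=rng)
print(posterior[:, 0].mean())  # close to the observed value
```

The unobserved second state is also updated, through its sample covariance with the observed one, which is how assimilation corrects internal model states and not just the output.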
Article
Various methods exist for assessing the safety of a structural or mechanical system that has uncertain parameters. These methods are either statistical (probabilistic), in which case the probability of failure is sought, or deterministic (possibilistic), in which case bounds on the response are sought. Well-known statistical methods include the first-order reliability method (FORM) and the second-order reliability method (SORM), while deterministic methods include interval analysis, convex modeling, and fuzzy set theory (although the categorization of the latter approach as a deterministic method is debatable). The development of probabilistic and possibilistic methods has tended to occur independently, with specialized algorithms being developed for the implementation of each technique. It is shown here that a wide range of probabilistic and possibilistic methods can be encompassed by a single mathematical algorithm, so that, for example, existing codes for FORM and SORM can potentially be employed for other methods, thus allowing the designer to readily choose the method most suited to the available data. A second common algorithm is also derived, and the analysis is illustrated by application to a simple system composed of N structural components in series.
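For the special case of a linear limit state g = R - S with independent normal resistance and load, the FORM reliability index has a closed form. A minimal sketch with illustrative numbers, not the paper's unified algorithm:

```python
import math

def form_beta(mu_r, sigma_r, mu_s, sigma_s):
    """Hasofer-Lind reliability index for g = R - S with independent normal R, S."""
    return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

def failure_probability(beta):
    """First-order estimate Pf = Phi(-beta), via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# Illustrative resistance R ~ N(10, 2) and load S ~ N(5, 1.5).
beta = form_beta(10.0, 2.0, 5.0, 1.5)
print(beta, failure_probability(beta))  # beta = 2.0, Pf ~ 0.0228
```

For nonlinear limit states or non-normal variables, FORM instead searches for the design point in standard-normal space, and SORM adds a curvature correction; the point of the paper is that such probabilistic searches and their possibilistic counterparts can share one algorithm.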
Article
No Abstract. Received: 5 May 2006; Accepted: 1 June 2006
Article
The purpose of this analytic-numerical Bayesian forecasting system (BFS) is to produce a short-term probabilistic river stage forecast based on a probabilistic quantitative precipitation forecast as an input and a deterministic hydrologic model (of any complexity) as a means of simulating the response of a headwater basin to precipitation. The BFS has three structural components: the precipitation uncertainty processor, the hydrologic uncertainty processor, and the integrator. A series of articles described the Bayesian forecasting theory and detailed each component of this particular BFS. This article presents a synthesis: the total system, operational expressions, estimation procedures, numerical algorithms, a complete example, and all design requirements, modeling assumptions, and operational attributes.
Article
Although uncertainty about structures of environmental models (conceptual uncertainty) is often acknowledged to be the main source of uncertainty in model predictions, it is rarely considered in environmental modelling. Rather, formal uncertainty analyses have traditionally focused on model parameters and input data as the principal source of uncertainty in model predictions. The traditional approach to model uncertainty analysis, which considers only a single conceptual model, may fail to adequately sample the relevant space of plausible conceptual models. As such, it is prone to modelling bias and underestimation of predictive uncertainty. In this paper we review a range of strategies for assessing structural uncertainties in models. The existing strategies fall into two categories depending on whether field data are available for the predicted variable of interest. To date, most research has focussed on situations where inferences on the accuracy of a model structure can be made directly on the basis of field data. This corresponds to a situation of 'interpolation'. However, in many cases environmental models are used for 'extrapolation'; that is, beyond the situation and the field data available for calibration. In the present paper, a framework is presented for assessing the predictive uncertainties of environmental models used for extrapolation. It involves the use of multiple conceptual models, assessment of their pedigree and reflection on the extent to which the sampled models adequately represent the space of plausible models.
Article
It may be endemic to mechanistic modelling of complex environmental systems that there are many different model structures and many different parameter sets within a chosen model structure that may be behavioural or acceptable in reproducing the observed behaviour of that system. This has been called the equifinality concept. The generalised likelihood uncertainty estimation (GLUE) methodology for model identification allowing for equifinality is described. Prediction within this methodology is a process of ensemble forecasting using a sample of parameter sets from the behavioural model space, with each sample weighted according to its likelihood measure to estimate prediction quantiles. This allows different models to contribute to the ensemble prediction interval at different time steps, and the distributional form of the predictions to change over time. Any effects of model nonlinearity, covariation of parameter values, and errors in model structure, input data, or observed variables, with which the simulations are compared, are handled implicitly within this procedure. GLUE involves a number of choices that must be made explicit and can therefore be subjected to scrutiny and discussion. These include ways of combining information from different types of model evaluation or from different periods in a data assimilation context. An example application to rainfall-runoff modelling is used to illustrate the methodology, including the updating of likelihood measures.
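The prediction-quantile step described above can be sketched as likelihood-weighted quantiles computed independently at each time step; the ensemble, likelihood values, and 5-95% limits below are all illustrative.

```python
import numpy as np

def weighted_quantile(values, weights, q):
    """Quantile of `values` under normalized `weights` (piecewise-constant CDF)."""
    order = np.argsort(values)
    v = np.asarray(values, float)[order]
    w = np.asarray(weights, float)[order]
    cdf = np.cumsum(w) / w.sum()
    return float(v[np.searchsorted(cdf, q)])

def glue_prediction_limits(ensemble, likelihoods, lower=0.05, upper=0.95):
    """GLUE prediction limits per time step from a behavioural ensemble.
    `ensemble` has shape (n_members, n_timesteps)."""
    los = [weighted_quantile(ensemble[:, t], likelihoods, lower)
           for t in range(ensemble.shape[1])]
    his = [weighted_quantile(ensemble[:, t], likelihoods, upper)
           for t in range(ensemble.shape[1])]
    return np.array(los), np.array(his)

# Toy behavioural ensemble: 4 members, 3 time steps (illustrative values).
ens = np.array([[1.0, 2.0, 3.0],
                [1.2, 2.1, 3.3],
                [0.9, 1.8, 2.7],
                [1.1, 2.2, 3.1]])
like = np.array([0.9, 0.6, 0.4, 0.8])  # e.g. rescaled likelihood measures
lo, hi = glue_prediction_limits(ens, like)
print(lo, hi)
```

Because the quantiles are taken per time step, different members can define the bounds at different times, which is exactly the property the abstract describes.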
Article
This paper describes a methodology for calibration and uncertainty estimation of distributed models based on generalized likelihood measures. The GLUE procedure works with multiple sets of parameter values and allows that, within the limitations of a given model structure and errors in boundary conditions and field observations, different sets of values may be equally likely as simulators of a catchment. Procedures for incorporating different types of observations into the calibration, Bayesian updating of likelihood values, and evaluating the value of additional observations to the calibration process are described. The procedure is computationally intensive but has been implemented on a local parallel processing computer. The methodology is illustrated by an application of the Institute of Hydrology Distributed Model to data from the Gwy experimental catchment at Plynlimon, mid-Wales.
Conference Paper
It is a deep-seated tradition in science to view uncertainty as a province of probability theory. The generalized theory of uncertainty (GTU), which is outlined in this paper, breaks with this tradition and views uncertainty in a broader perspective. Uncertainty is an attribute of information. A fundamental premise of GTU is that information, whatever its form, may be represented as what is called a generalized constraint. The concept of a generalized constraint is the centerpiece of GTU.
Beven KJ. 2006. On undermining the science? Hydrological Processes (HPToday) 20: 3141–3146.
Vrugt JA, Diks CGH, Gupta HV, Bouten W, Verstraten JM. 2005. Improved treatment of uncertainty in hydrologic modeling: Combining the strengths of global optimization and data assimilation. Water Resources Research 41: W01017, DOI:10.1029/2004WR003059.
Zadeh LA. 2005. Toward a generalized theory of uncertainty (GTU)—an outline. Information Sciences 172: 1–40.