Article

Uncertainty in the environmental modelling process - A framework and guidance

Authors: Jens Christian Refsgaard, Jeroen P. van der Sluijs, Anker Lajer Højberg, Peter A. Vanrolleghem

Abstract

A terminology and typology of uncertainty is presented together with a framework for the modelling process, its interaction with the broader water management process and the role of uncertainty at different stages in the modelling processes. Brief reviews have been made of 14 different (partly complementary) methods commonly used in uncertainty assessment and characterisation: data uncertainty engine (DUE), error propagation equations, expert elicitation, extended peer review, inverse modelling (parameter estimation), inverse modelling (predictive uncertainty), Monte Carlo analysis, multiple model simulation, NUSAP, quality assurance, scenario analysis, sensitivity analysis, stakeholder involvement and uncertainty matrix. The applicability of these methods has been mapped according to purpose of application, stage of the modelling process and source and type of uncertainty addressed. It is concluded that uncertainty assessment is not just something to be added after the completion of the modelling work. Instead, uncertainty should be seen as a red thread throughout the modelling study, starting from the very beginning, where the identification and characterisation of all uncertainty sources should be performed jointly by the modeller, the water manager and the stakeholders.
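Of the 14 methods listed, error propagation equations and Monte Carlo analysis are the most mechanical, and their relationship is easy to demonstrate. The sketch below is not from the paper: the load model and the input distributions are invented for illustration, and Python merely stands in for whatever tool a study would actually use.

```python
import numpy as np

# Toy model: contaminant load L = c * q (concentration times discharge).
# Illustrative, assumed input distributions (independent, Gaussian):
mu_c, sd_c = 2.0, 0.2      # concentration, mg/L
mu_q, sd_q = 50.0, 5.0     # discharge, L/s

# Error propagation equations (first-order, evaluated at the means):
# var(L) ~= (dL/dc)^2 var(c) + (dL/dq)^2 var(q)
sd_L_linear = np.sqrt((mu_q * sd_c) ** 2 + (mu_c * sd_q) ** 2)

# Monte Carlo analysis: sample the inputs, run the model, summarise the output.
rng = np.random.default_rng(42)
L = rng.normal(mu_c, sd_c, 100_000) * rng.normal(mu_q, sd_q, 100_000)

print(f"first-order sd : {sd_L_linear:.2f} mg/s")
print(f"Monte Carlo sd : {L.std():.2f} mg/s")
print(f"MC 95% interval: [{np.percentile(L, 2.5):.1f}, {np.percentile(L, 97.5):.1f}] mg/s")
```

For this nearly linear model the two estimates agree closely; Monte Carlo additionally delivers the full output distribution, at the cost of repeated model runs.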


... In the level subcomponent, the earlier matrix was criticized for classifying levels into groups for technical examination and for being open to multiple interpretations; four levels, grounded in a sound setup of "measurement scales theory", were proposed instead for expressing the level of uncertainty: shallow, medium and deep uncertainty and recognized ignorance (Kwakkel et al., 2010, p. 312). Despite these changes, the common characteristic of the studies is the use of the location, level and nature dimensions to understand uncertainty (see Refsgaard et al., 2007; Van der Sluijs et al., 2008). The content of these three dimensions is briefly summarized below. ...
... Afterwards, Ascough II et al. (2008), by combining the uncertainty criteria with their types, defined a wider framework in which they proposed criteria for information, variable, decision and linguistic uncertainty. Mosadeghi et al. (2013) used the uncertainty framework comprising the location, level and nature types of uncertainty, which also formed the basis for the works of Walker et al. (2003) and Refsgaard et al. (2007). ...
Article
Full-text available
Over the past few decades, the world has become an increasingly dangerous and complex place, and expectations from spatial planning have changed accordingly. The study defines the concept of uncertainty as an important problem area of spatial planning. Given the lack of native studies on this subject, it aims to reveal how uncertainties in the spatial planning process are handled in the international literature. It consists of two basic steps. In the first step, a three-stage model, "Uncertainty Components of Spatial Planning", is proposed. These stages involve (i) the conceptualization, (ii) the classification and (iii) the evaluation of uncertainty. In the second step, a triangular framework was formed for the conceptualization stage of this model, with the components of (1) identification and modelling, (2) theories and processes, and (3) legal regulations. The theoretical treatment suggested that the concept of uncertainty is used synonymously with the concepts of vagueness and ambiguity in everyday life despite their differences. It was also found that uncertainty is the subject of many international studies whose common point is presenting either a model or a method to evaluate uncertainty. These studies were categorized into three groups according to how they handle uncertainty: (1) in a multidisciplinary context within a general framework, (2) in the field of planning under two subcategories (2a and 2b), and (3) in the field of environment. The studies in the second category showed regular conceptual patterns of their own, but were shallower and more inward-oriented than those in the 1st and 3rd groups, between which there is an apparent interaction. The model proposed focuses only on (i) the conceptualization stage. However, as it reveals the origin, definition and basis of the concept of uncertainty, it may provide an important starting point for future studies. The study is original in introducing the concept of uncertainty to the native literature by elaborating on how it is handled in international studies. Proposals are offered on how to place this concept on a theoretical basis before establishing an evaluation framework for uncertainties within the spatial planning process in Türkiye.
... The lack of a consensus on the most appropriate UE method is to be expected, given that the sources of uncertainty associated with environmental modelling applications are dominated by a lack of knowledge (epistemic uncertainties; e.g. Refsgaard et al., 2007; Beven, 2009) rather than solely by random variability (aleatory uncertainties). Rigorous statistical inference applies to the latter, but it might lead to unwarranted confidence if applied to the former, especially where some data might be disinformative in model evaluation (e.g. ...
... Beven and Alcock (2012) suggest a condition tree approach that records the modelling choices and assumptions made during analyses and, thus, provides a clear audit trail (e.g. ...). The audit trail consequently provides a vehicle that promotes transparency, best practice and communication with stakeholders (Refsgaard et al., 2007; Beven and Alcock, 2012). To encourage best practice, the process of defining a condition tree and recording an audit trail has been made an integral part of the CURE toolbox via a condition tree graphical user interface (GUI). ...
Article
Full-text available
There is a general trend toward the increasing inclusion of uncertainty estimation in the environmental modelling domain. We present the Consortium on Risk in the Environment: Diagnostics, Integration, Benchmarking, Learning and Elicitation (CREDIBLE) Uncertainty Estimation (CURE) toolbox, an open-source MATLAB™ toolbox for uncertainty estimation aimed at scientists and practitioners who are not necessarily experts in uncertainty estimation. The toolbox focusses on environmental simulation models and, hence, employs a range of different Monte Carlo methods for forward and conditioned uncertainty estimation. The methods included span both formal statistical and informal approaches, which are demonstrated using a range of modelling applications set up as workflow scripts. The workflow scripts provide examples of how to utilize toolbox functions for a variety of modelling applications and, hence, aid the user in defining their own workflow; additional help is provided by extensively commented code. The toolbox implementation aims to increase the uptake of uncertainty estimation methods within a framework designed to be open and explicit in a way that tries to represent best practice with respect to applying the methods included. Best practice with respect to the evaluation of modelling assumptions and choices, specifically including epistemic uncertainties, is also included by the incorporation of a condition tree that allows users to record assumptions and choices made as an audit trail log.
... Spatial models are widely used for prediction and decision-making in ecology (DeAngelis and Diaz, 2019), especially to predict intricate interactions within ecosystems, whether focusing on population dynamics, anthropogenic activities (Glaum et al., 2020;Plagányi et al., 2014) or conservation strategies (Bach et al., 2022;Gernez et al., 2023). They enable assessing the effectiveness of various scenarios representing adjustments to a reference state to model the actions implemented by management strategies (Refsgaard et al., 2007). Hence, spatial models are increasingly used to evaluate the effectiveness of restoration or rehabilitation measures, by comparing their impact with that of other management strategies (Possingham et al., 2015). ...
Preprint
Full-text available
Conservation measures are implemented to support biodiversity in areas that are degraded or under anthropogenic pressure. Over the past decade, numerous projects aimed at rehabilitating a fish nursery function in ports, through the installation of artificial structures, have emerged. While studies conducted on these solutions seem promising on a very local scale (e.g., higher densities of juvenile fish on artificial fish nurseries compared to bare port infrastructure), no evaluation has yet been undertaken to establish their contribution to the renewal of coastal fish populations or their performance compared to other conservation measures such as fishing regulation. Here, we used a coupled model of fish population dynamics and fisheries management, ISIS-Fish, to describe a coastal commercial fish population, the white seabream (Diplodus sargus), in the highly artificialized Bay of Toulon. Using ISIS-Fish, we simulated rehabilitation and fisheries management scenarios. We provided the first quantitative assessment of the implementation of artificial structures in ports covering 10% and 100% of the available port area and compared, at the population and fishing fleet levels, the quantitative consequences of these rehabilitation measures with fishing control measures leading to strict compliance with minimum catch sizes. The rehabilitation of the nursery function in ports demonstrated a potential to enhance the renewal of fish populations and catches. When the size of projects is small, the outcomes they provide remain relatively modest in contrast to the impact of regulatory fishing measures. However, we demonstrated that combining fishing reduction measures and rehabilitation projects has a synergistic effect on fish populations, resulting in increased populations and catches. This study is the first quantitative assessment of fish nursery rehabilitation projects in port areas, evaluating their effectiveness in renewing coastal fish populations and fisheries and comparing their outcomes with fishing control measures. Small-scale port-area nursery rehabilitation projects can support fish populations, but are less effective than controlling fisheries.
... We see the use of urban structure types as promising potential representatives for these proxies, especially when combined with other socio-economic data. We argue that, when those proxy distributions are combined with uncertainty assessments, beneficial assessments can be performed even in data-scarce environments (Refsgaard et al., 2007). Yet of course the availability of sufficient data is superior to the use of proxy distributions. ...
Article
Flood risk in urban areas will increase massively under future urbanization and climate change. Urban flood risk models have been increasingly applied to assess the impacts of urbanization on flood risk. For this purpose, different methodological approaches have been developed in order to reflect the complexity and dynamics of urban growth. To assess the state of the art in the application of flood risk models under urbanization scenarios, we conducted a structured literature review and systematically analyzed 93 publications with 141 case studies. Our review shows that hydrological and hydrodynamic flood models are the most commonly used approaches to project flood risk. Future urbanization is mostly considered as urban sprawl through the adjustment of land use maps and roughness parameters. Only a small number of approaches additionally consider transitions of urban structures and densification processes in their urbanization scenarios. High-resolution physically based flood models are constantly being developed and are already well suited for describing quantifiable processes in data-rich contexts. In regions with data paucity, however, we argue that a reduction of the level of detail in flood models, accompanied by a higher-resolved representation of urbanization patterns, should be explored to improve the quality of flood risk projections under future urbanization trends. For this purpose, we also call for the development of integrative model structures, such as causal network models, that have greater explanatory power and enable the processing of qualitative data.
... Generally, using hydrological models for DSM and quantifying the model uncertainties requires separate methods beyond the model simulation (Moges et al., 2020). Many reviews have highlighted the most commonly applied methods to assess uncertainty (Refsgaard et al., 2007; Moges et al., 2020; Herrera et al., 2022). Among the reviewed methods are inverse modelling (i.e. ...
Article
Full-text available
Integrated hydrological models (IHMs) help characterize the complexity of surface–groundwater interactions. The cascade routing and re-infiltration (CRR) concept, recently applied to a MODFLOW 6 IHM, improved conceptualization and simulation of overland flow processes. The CRR controls the transfer of rejected infiltration and groundwater exfiltration from upslope areas to adjacent downslope areas, where that water can be evaporated, re-infiltrated back into the subsurface, or discharged to streams as direct runoff. The partitioning between these three components is controlled by uncertain parameters that must be estimated. Thus, by quantifying and reducing those uncertainties, alongside the uncertainties of the other model parameters (e.g. hydraulic and storage parameters), the reliability of the CRR is improved and the IHM is better suited for decision support modelling, the two key objectives of this work. To this end, the remotely sensed MODIS-ET product was incorporated into the calibration process for complementing traditional hydraulic head and streamflow observations. A total of approximately 150,000 observations guided the calibration of a 13-year MODFLOW 6 IHM simulation of the Sardon catchment (Spain) with daily stress periods. The model input uncertainty was represented by grid-cell-scale parameterization, yielding approximately 500,000 unknown input parameters to be conditioned. The calibration was carried out through an iterative ensemble smoother. Incorporating the MODIS-ET data improved the CRR implementation and reduced uncertainties associated with other model parameters. Additionally, it significantly reduced the uncertainty associated with net recharge, a critical flux for water management that cannot be directly measured and rather is commonly estimated by IHM simulations.
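The abstract names an iterative ensemble smoother but does not show its update step. The following is a minimal, generic sketch of one damped update of the ES-MDA family, with invented array shapes and damping factor; it illustrates the method class, not the study's actual implementation.

```python
import numpy as np

def ensemble_smoother_update(X, D, d_obs, obs_err_sd, alpha, seed=0):
    """One damped ensemble-smoother update (ES-MDA style).

    X : (n_par, n_ens) parameter ensemble
    D : (n_obs, n_ens) simulated observations for each ensemble member
    """
    rng = np.random.default_rng(seed)
    n_obs, n_ens = D.shape
    R = (alpha * obs_err_sd ** 2) * np.eye(n_obs)   # inflated obs-error covariance

    Xa = X - X.mean(axis=1, keepdims=True)          # parameter anomalies
    Da = D - D.mean(axis=1, keepdims=True)          # output anomalies
    C_xd = Xa @ Da.T / (n_ens - 1)                  # parameter-output covariance
    C_dd = Da @ Da.T / (n_ens - 1)                  # output covariance

    # Perturb observations with the inflated noise, one realisation per member.
    D_obs = d_obs[:, None] + rng.normal(0.0, np.sqrt(alpha) * obs_err_sd, (n_obs, n_ens))

    K = C_xd @ np.linalg.inv(C_dd + R)              # Kalman-type gain
    return X + K @ (D_obs - D)                      # updated parameter ensemble
```

In the multiple-data-assimilation variant, this update is repeated with inflation factors whose reciprocals sum to one, re-running the model between updates; at the roughly 500,000-parameter scale reported in the study, localisation of the covariances would also be needed.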
... Equifinality can also be caused by a larger number of parameters than available observations or by correlations between parameters [586,608]; 7. Analysing uncertainties, together with calibration and uncertainty reduction, is crucial for the accurate representation and prediction of systems. Some uncertainties can be removed, such as epistemic uncertainty, which arises from missing or imperfect knowledge [609][610][611][612]. Sources of uncertainty include: input data (from input data and external forces on the system), technical (related to codes, software constraints, etc.), structural (related to the conceptual model and simplification of the system), parameters (related to parameter distribution, parameter estimation, and constraint variables), prediction uncertainty, and calibration data uncertainty [613,614]. Uncertainty can be reduced by adding additional data on current and historical system behaviour (data assimilation, e.g., through high-resolution monitoring and hydrogeophysical screening [615]), by comparing different parameterisation methods and models, and by changing model resolution and other modelling measures based on experience. Statistical methods dedicated to the broad field of uncertainty analysis can be divided into six classes: Monte Carlo sampling, response surface-based methods including polynomial chaos expansion and machine learning, multi-model approaches, Bayesian statistics, multi-criteria analysis, and least-squares-based inverse modelling [614]. ...
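Of the six classes named in the excerpt above, the response-surface idea is the easiest to show compactly: train a cheap emulator on a few expensive simulations, then run the Monte Carlo on the emulator. Below is a hypothetical Python sketch using a Gaussian-process surrogate from scikit-learn; the "expensive" model is a stand-in function.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

def expensive_model(x):
    # Stand-in for a slow contaminant-transport simulation.
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# 1. A small design of 'expensive' runs.
X_train = rng.uniform(0.0, 1.0, (30, 2))
y_train = expensive_model(X_train)

# 2. Fit the response surface (Gaussian-process emulator).
gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.2, 0.2]), normalize_y=True)
gp.fit(X_train, y_train)

# 3. Cheap Monte Carlo on the emulator instead of the full model.
X_mc = rng.uniform(0.0, 1.0, (50_000, 2))
y_mc = gp.predict(X_mc)
print(f"mean = {y_mc.mean():.3f}, sd = {y_mc.std():.3f}")
```

The emulator's own error should be checked on held-out runs before the cheap Monte Carlo results are trusted.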
Article
Full-text available
The pollution of groundwater and soil by hydrocarbons is a significant and growing global problem. Efforts to mitigate and minimise pollution risks are often based on modelling. Modelling-based solutions for prediction and control play a critical role in preserving dwindling water resources and facilitating remediation. The objectives of this article are: (i) to provide a concise overview of the mechanisms that influence the migration of hydrocarbons in groundwater and to improve the understanding of the processes that affect contamination levels; (ii) to compile the most commonly used models to simulate the migration and fate of hydrocarbons in the subsurface; and (iii) to evaluate these solutions in terms of their functionality, limitations, and requirements. The aim of this article is to enable potential users to make an informed decision regarding the modelling approaches (deterministic, stochastic, and hybrid) and to match their expectations with the characteristics of the models. The review of 11 1D screening models, 18 deterministic models, 7 stochastic tools, and machine learning experiments aimed at modelling hydrocarbon migration in the subsurface should provide a solid basis for understanding the capabilities of each method and their potential applications.
... The integration of various assessment approaches, each carrying its own set of uncertainties, leads to compounded uncertainty (Hamel and Bryant, 2017). The sources of uncertainty in integrated modelling can arise from the context and framing of the system under study (i.e., the scope of the assessment), input uncertainty driven by external parameters (e.g., natural variability of climatic parameters), parameter uncertainty (i.e., concerning parameter values, potentially arising from measurement errors), structural model uncertainty (a simplified or incomplete description of the modelled system relative to reality), and model technical uncertainty, i.e., implementation-related uncertainties (e.g., coding and software bugs) (Walker et al., 2003; Refsgaard et al., 2007). The total uncertainty of the model output(s) is attributed to these various sources of uncertainty (Loucks et al., 2005). ...
Article
Full-text available
Integrating ecosystem services and life cycle assessment is gaining increasing attention for the analysis of environmental costs and benefits associated with human activities covering multiple geographical scales and life cycle stages. Such integration is particularly relevant for evaluating the sustainability of nature-based solutions. However, merging these methods introduces additional uncertainties. This paper introduces a novel protocol to assess uncertainties in combined ecosystem services-life cycle assessment, focusing on ecosystem services accounting, life cycle inventory of foreground systems, and life cycle impact assessment characterisation factors. Applied to a nature-based solution case study compared to no-action and energy-intensive scenarios, the uncertainties were analysed using multi-method global sensitivity analysis. The robustness of the analysis results was assessed through convergence plots and statistical tests. Findings reveal significant uncertainties, especially in life cycle impact assessment characterisation factors, with the extent varying by impact category. Uncertainties in foreground life cycle inventory, particularly in land use of nature-based solutions scenario, are also notable. Compared to these, uncertainties associated with indicators of impact on ecosystem services (uncertainty arising from input variability in ecosystem services accounting) are relatively lower. This study underscores the critical role of uncertainty assessment in enhancing the reliability of integrated assessments for nature-based solutions, providing a framework to identify and quantify key uncertainties, thereby supporting more informed decision-making.
... Another source of uncertainty emanates from lake models, mainly due to their attempt to simulate highly complex systems, particularly the biogeochemical processes. These processes cannot be fully described in the model's equations, so simplifications are made, resulting in model inaccuracy and uncertainty (Refsgaard et al., 2007; Puy et al., 2022). Model uncertainty can also come from the model calibration process, which heavily depends on available observations. ...
... These models facilitate the understanding of complicated causal relationships between external and internal drivers in water bodies (Arhonditsis and Brett, 2004; Cheng et al., 2015; Mooij et al., 2010). However, as these models become increasingly intricate, they introduce additional sources of uncertainty by simplifying reality, thereby compromising their accuracy and identifiability (McDonald and Urban, 2010; Refsgaard et al., 2007). Given this prevailing challenge, modelers are confronted with the need for objective and quantitative methods to evaluate the performance of such models (Bennett et al., 2013). ...
... Uncertainty in measurements has attracted attention in recent years in the physical and natural sciences [25,26], but not very prominently with respect to anthropometric data and there is a need for further research in this area [27]. Recently, it has been recognized in the case of ergonomics that systems are sensitive to various uncertainties, including inaccuracy, randomness and vagueness in reported data [26][27][28][29][30]. The two principal categories of uncertainty often cited in the literature are denoted as aleatory uncertainty (statistical variability) and epistemic uncertainty (lack of knowledge), which are the basis of most published taxonomies [25,26,30]. ...
Technical Report
Full-text available
Anthropometric analysis plays an important role in the development of human 'habitability' in aerospace vehicles, long-duration spaceflight and surface missions to Mars and beyond. The design of the physical workspace in aerospace vehicles has constraints imposed by minimizing mass, volume and the limited interior dimensions of the cockpit and the range of possible spatial movements allowed by human physiology. The interior design requires the optimal placement of actuators with appropriate interface geometry for grip accessibility that is limited by human anatomical dimensions, including operator height, posture and arm reach. The nature and variability of flight crew introduces uncertainties that constrain the layout of the human-machine interface and the desired minimization of workspace volume. This paper describes computational issues in using anthropometric data, including statistical variability, but also raises many epistemic uncertainties that can affect the design of a workspace for aerospace systems. Epistemic uncertainties are reflected in human demographic origins, physical layout of the human-machine interface, unknown biophysical factors and measurement error in body dimensions. Analysis and discussion are provided on context, methodology, uncertainty assessment, and interpretation. For completeness, the issue of human biomechanics is introduced to complement statistical interpretations of human factors and to provide a possible future path of research based on dynamic simulation to support orthodox stationary methods.
... The generation of the Groundwater Potentiality Index (GWPI) map is subject to certain limitations stemming from data quality weaknesses and errors in structural specifications and model assumptions (Refsgaard et al., 2007; Rahmati et al., 2016). An essential aspect of improving the interpretation of model predictions lies in understanding the associated uncertainty related to input data (Loosvelt et al., 2012). ...
Article
Full-text available
The mapping of water potential in arid regions holds crucial significance for the sustainable management of water resources in these vulnerable environments. The Anzi sub-basin in Morocco is confronted with water scarcity, presenting a significant challenge. The region's arid climate and limited accessibility to groundwater resources further exacerbate this scarcity, ultimately constraining the region's socioeconomic development, heavily reliant on agriculture. To address this issue, the current study employs mathematical modeling, specifically the frequency ratio (FR) method, and geomatics tools such as remote sensing (RS) and geographic information systems (GIS) to map and assess potential groundwater zones. The study considers multiple factors influencing water availability, including the density of drainage nodes, lineament density, permeability, slope, NDVI, curvature, distance to drainage, topographic wetness index (TWI), sediment transport index (STI), distance to lineament, and rainfall. The investigation identifies 45 wells, of which 70% are used for model application and the remaining 30% for validation. The model application results in the identification of five potential zones: very high, high, moderate, low, and very low, encompassing 3.97%, 10.11%, 23.38%, 35.02%, and 27.53% of the total study area, respectively. Furthermore, the model's validity is confirmed through the ROC (receiver operating characteristic) curve analysis, indicating a reliability value of 82.5% for the FR method in mapping groundwater potential. These findings align with the results obtained from geophysical prospecting in the region. Notably, the locations of high-flow wells generally correspond to moderately to highly conductive anomalies of resistivity, primarily attributed to the presence of quaternary formations, geological faults, and geophysically derived faults. These results are further validated by proton magnetic resonance surveys.
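The frequency ratio statistic at the core of this method is a one-line computation per factor class. The sketch below uses invented class areas and well counts purely to show the mechanics; the eleven conditioning factors and study data of the article are not reproduced.

```python
import pandas as pd

# Hypothetical classes of one conditioning factor (e.g. lineament density).
cls = pd.DataFrame({
    "class":    ["very low", "low", "moderate", "high"],
    "area_km2": [120.0, 90.0, 60.0, 30.0],
    "n_wells":  [3, 7, 14, 21],   # training wells (70% subset) per class
})

# FR = (% of wells in the class) / (% of total area in the class).
cls["FR"] = (cls["n_wells"] / cls["n_wells"].sum()) / (cls["area_km2"] / cls["area_km2"].sum())
print(cls[["class", "FR"]])

# A pixel's GWPI is the sum of the FR values of the classes it falls into,
# accumulated over all conditioning factors; validation then uses the
# remaining 30% of wells, e.g. via a ROC curve.
```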
... The ATNL trends described in this article, however, contain several types of model uncertainty: input uncertainty, structure uncertainty (process descriptions), parameter uncertainty and technical uncertainty (Refsgaard et al. 2007). The uncertainty of the results is also increased by the chaining of the models. ...
Article
Full-text available
The national scale nutrient load modelling system VEMALA-ICECREAM was used to simulate agricultural total nitrogen (TN) loading and its trends for all Finnish watersheds for the period 1990–2019. Across Finland, agricultural TN loading (ATNL) has decreased from 17.4 kg ha⁻¹ a⁻¹ to 14.4 kg ha⁻¹ a⁻¹ (moving 10-year averages) since the 1990s. The main driver of the decrease in simulated ATNL is a reduction in mineral fertilizer use, which has decreased the N surplus in the soils. The TN leached fraction, however, did not show a trend but did have high annual variability due to variations in runoff; this corresponds to an average of 14.4% of the TN applied. The ATNL was considerably higher in the Archipelago Sea catchment compared to other Finnish Baltic Sea sub-catchments, with the lowest ATNL found in the Vuoksi catchment in Eastern Finland. The highest decrease of ATNL was simulated for the Vuoksi and Gulf of Finland catchments. In the Bothnian Sea, Bothnian Bay and Archipelago Sea catchments, the decreasing trend of ATNL was smaller but still significant, with the exception of the Quark catchment, where there was no significant change. The differences in decreasing trends between regions can be explained by the heterogeneity of catchment characteristics, hydrology and agricultural practices in different regions.
... The guidelines for the implementation of Sobol' SA in Python with the SALib package suggest that the Sobol' sequence should be used to produce nested input parameter samples across stocks (Herman and Usher 2017). The computational costs for all Sobol' sensitivity indices would add up to 2^(k−1) · N model evaluations (Refsgaard et al. 2007). Because of the computational time of BSEMs, the authors focus on the analysis of first-order, second-order and total-order effects. ...
Article
Building-Stock Energy Models (BSEMs) have grown in popularity, implementation, scale and complexity. Yet, BSEM quality assurance processes have lagged behind. This article proposes a scalable methodology to apply Uncertainty (UA) and Sensitivity Analysis (SA) to BSEMs and studies the performance of eleven common UA-SA methods (OAT, SRC, SRRC, FFD, Morris, Sobol’, eFAST, FAST-RBD, DMIM, PAWN, DGSM) for three UA-SA targets: screening, ranking and indices. Applying UA and SA to BSEMs requires a two-step input parameter sampling that samples ‘across stocks’ and ‘within stocks’. To make efficient use of computational resources, practitioners should (i) distinguish between the three UA-SA targets and (ii) choose a method based on the intended target. The computational cost varies according to the UA-SA target and method; the most efficient of the tested methods are, for screening, OAT, SRC, SRRC, FFD and Morris; for ranking, SRC, SRRC and Morris; and for indices, Sobol’.
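As a concrete illustration of the variance-based end of this method list, the sketch below runs a Sobol' analysis with the Python SALib package cited in the excerpt above; the three input parameters and the response function are invented placeholders, not a real building-stock model.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Invented stand-in for one building-stock energy response.
def stock_energy_use(x):
    return x[:, 0] ** 2 + 0.5 * x[:, 1] + 0.1 * x[:, 0] * x[:, 2]

problem = {
    "num_vars": 3,
    "names": ["u_value", "setpoint", "infiltration"],   # hypothetical parameters
    "bounds": [[0.2, 2.0], [18.0, 23.0], [0.1, 1.0]],
}

# Sobol'-sequence design: N * (2k + 2) runs when second-order terms are kept.
X = saltelli.sample(problem, 1024, calc_second_order=True)
Y = stock_energy_use(X)

Si = sobol.analyze(problem, Y, calc_second_order=True)
print("S1:", Si["S1"])   # first-order indices
print("ST:", Si["ST"])   # total-order indices
```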
... Uncertainties may occur while setting up and running the model. Uncertainties come either from measurement errors or from the model itself [33][34][35]. For the model to be used efficiently, its performance must be tested. ...
Article
Full-text available
Eutrophication in the Lobo watershed remains a major problem. The work carried out so far has focused on chemical and biological analyses in the lake or in its immediate environment: it did not sufficiently take into account the diffuse transfer of nutrients over the entire watershed. This study aims to assess the nutrient (N and P) loads in the Lobo watershed, an agricultural area, to understand the spatio-temporal impacts of land management practices on eutrophication. The methodology uses two steps: streamflow calibration and nutrient (N and P) estimation using the Soil and Water Assessment Tool (SWAT) watershed model. The nutrient inputs were estimated based on the levels of N and P in every kilogram of nitrogen–phosphorus–potassium (NPK) fertilizer applied by farmers. The average quantities of N and P applied to the crops were 47.24 kg ha⁻¹ and 21.25 kg ha⁻¹. Results show good performance in flow calibration, as evidenced by the evaluation criteria R², Nash–Sutcliffe Efficiency (NSE) and Percent Bias (PBIAS) of 0.63, 0.62 and −8.1, respectively. The yields of inorganic N and soluble P varied from 0 to 0.049 kg ha⁻¹ and from 0 to 0.31 kg ha⁻¹. These results show that the crops’ inorganic nitrogen requirements were higher than the demands for soluble phosphorus. Simulations of organic N transfer revealed values ranging from 0.2 to 5 kg ha⁻¹, while the transport of organic phosphorus was estimated to vary from 0.3 to 1.3 kg ha⁻¹.
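The goodness-of-fit statistics quoted here (R², NSE, PBIAS) are standard and easily reproduced. The sketch below implements them with made-up observed and simulated flow values; the PBIAS sign convention (positive = underestimation) follows common usage in the SWAT calibration literature.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is perfect; values < 0 are worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias: positive = underestimation, negative = overestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def r2(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    return float(np.corrcoef(obs, sim)[0, 1] ** 2)

# Made-up monthly flows purely to exercise the functions.
obs = np.array([12.1, 30.4, 22.8, 8.7, 15.2, 41.0])
sim = np.array([11.5, 33.0, 21.1, 9.9, 16.0, 43.8])
print(f"NSE={nse(obs, sim):.2f}  PBIAS={pbias(obs, sim):.1f}%  R2={r2(obs, sim):.2f}")
```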
... The methodology was found to be adaptable to the needs expressed by stakeholders and to the type and resolution of available data. Most notable was the need for continuous interaction with local stakeholders (e.g., Refsgaard et al., 2007) at each stage of application, which helped build up stakeholder confidence in the output and understanding of the model assumptions and procedures applied. The KULTURisk methodology was found to be a useful decision-making tool, which gives insight into the (physical, social, and economic) benefits of proposed measures for flood risk reduction. ...
... Knowledge of the occurrence of uncertainties affects the methodological choices in sequential ecological and economic research (cf. Refsgaard et al., 2007, who concluded that suitable approaches are needed to address data and human uncertainties). In this research, we chose to use BBN and MCS due to their ability to include uncertainty. ...
... These complexities, non-linearities, and stochastic behavior among different components in the hydrological process will affect the reliability of physically based models and conventional statistical methods, thus paving the way for different sources of uncertainty. For real-life flood forecasting systems, hydrological forecasts are inevitably very uncertain (Beven & Binley 1992; Gupta et al. 1998; Refsgaard et al. 2007). ...
Article
Full-text available
Floods and their associated impacts are topics of concern in land development planning and management, which call for efficient flood forecasting and warning systems. The performance of flood warning systems is affected by uncertainty in water level forecasts, which stems from the inability to measure or calculate a modeled value exactly. Predictive uncertainty is an emerging uncertainty modeling technique that emphasizes total uncertainty, quantified as a probability distribution conditioned on all available knowledge. Predictive uncertainty analysis was done using quantile regression (QR) for two machine learning-based flood models, a hybrid wavelet artificial neural network (WANN) model and a hybrid wavelet support vector machine (WSVM) model, for different lead times. Comparing the QR models of WANN and WSVM revealed that the slope, intercept, spread of forecast and width of the confidence band of the WANN model are larger for each quantile, indicating more uncertainty than in the WSVM model. In both models, uncertainty also showed an increasing trend with lead time. The inferences obtained from the QR models were evaluated using uncertainty statistics such as the prediction interval coverage probability (PICP), average relative interval length (ARIL) and mean prediction interval (MPI).
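A stripped-down version of this workflow can be sketched with Python's statsmodels: fit conditional quantiles of the observed level against a point forecast, then score the resulting band with PICP, ARIL and MPI. The data are synthetic, and the linear quantile model is a deliberate simplification of the hybrid WANN/WSVM models.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)

# Synthetic pairs of point forecast (x) and observed water level (y),
# with errors that grow with the forecast (heteroscedastic).
x = rng.uniform(1.0, 5.0, 300)
y = x + rng.normal(0.0, 0.2 * x)
X = sm.add_constant(x)

# Conditional 5% and 95% quantiles of the observation given the forecast.
lower = sm.QuantReg(y, X).fit(q=0.05).predict(X)
upper = sm.QuantReg(y, X).fit(q=0.95).predict(X)

picp = np.mean((y >= lower) & (y <= upper))   # coverage of the 90% band
mpi = np.mean(upper - lower)                  # mean prediction interval width
aril = np.mean((upper - lower) / y)           # average relative interval length
print(f"PICP={picp:.2f}  MPI={mpi:.2f}  ARIL={aril:.2f}")
```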
... Looking into the future, HSPF outputs for each of the three climatic models projected a significant decrease in the Côa River streamflow, both in terms of monthly mean and annual mean values. Under the assumption that the uncertainties added by the hydrological model can be partially offset when normalizing the simulated flows, taking into account observational flows [47], we were able to evaluate the response of the catchment. Therefore, it is not surprising to observe negative percentages during future periods with respect to the historical period. ...
Article
Full-text available
The increasing gap between water demands and availability is a significant challenge for sustainable water management, particularly in the context of growing irrigation needs driven by climate change. In the Côa region (inner-north Portugal), agriculture plays a vital role in the local economy, ensuring food security and contributing to the conservation of natural resources, though it is also threatened by climate change. The present study assesses how streamflow in the Côa River can be affected by climate change. The HSPF (Hydrological Simulation Program-FORTRAN) hydrological model was coupled with three global-regional climate model chains to simulate historical monthly and annual streamflow (1986-2015), and to predict future (2040-2099) streamflow under RCP8.5. Irrigation scenarios were subsequently developed considering a potential future increase from 10% to 50% per decade. The evaluation of HSPF performance during the historical period revealed good agreement (R² > 0.79) between simulated and observed flows. A general decrease in streamflow is found in the future, particularly in 2070-2099, with annual mean streamflow projected to decrease by −30% until 2099. Interannual variability is also expected to increase. Generally, the simulations indicated higher future flows in winter/early spring, whilst they are expected to decrease over the rest of the year, suggesting drought intensification. An increase in water demands for irrigation, potentially rising from 46 hm³·yr⁻¹ (baseline scenario) up to 184 hm³·yr⁻¹ (50% increase per decade), may lead to unsustainable irrigation. Managing these opposite trends poses significant challenges, requiring a comprehensive and integrated approach from stakeholders and policymakers. Strategies should focus on both demand-side and supply-side measures to optimize water use, improve water efficiency, and preserve water availability.
... Uncertainties may occur while setting up and running the model. Uncertainties come either from measurement errors or from the model itself [19][20][21]. For the model to be used efficiently, its performance must be tested. ...
Preprint
Full-text available
Eutrophication in the Lobo watershed remains a major problem. The work carried out so far has focused on chemical and biological analyses in the lake or in its immediate environment: it did not sufficiently take into account the diffuse transfer of nutrients over the entire watershed. This study aims to assess the nutrient (N and P) loads in the Lobo watershed, an agricultural area, to understand the spatio-temporal impacts of land management practices on eutrophication. The methodology uses two steps: streamflow calibration and nutrient (N and P) estimation using the Soil and Water Assessment Tool (SWAT) watershed model. The nutrient inputs were estimated based on the levels of N and P in every kilogram of NPK-type fertilizer applied by farmers. The average quantities of N and P applied to the crops were 47.24 kg N/ha and 21.25 kg P/ha. Results show good performance in flow calibration, as evidenced by the evaluation criteria R², NSE and PBIAS of 0.63, 0.62 and −8.1, respectively. The yields of inorganic N and soluble P varied from 0 to 0.049 kg N/ha and from 0 to 0.31 kg P/ha. These results show that the crops’ inorganic nitrogen requirements were higher than the demands for soluble phosphorus. Simulations of organic N transfer revealed values ranging from 0.2 to 5 kg N/ha, while the transport of organic phosphorus was estimated to vary from 0.3 to 1.3 kg P/ha.
... Due to the high complexity involved in environmental modeling, simplifications and assumptions are necessary to consider the different processes that interact with each other (Wainwright & Mulligan, 2013). Consequently, different sources of uncertainty arise in environmental modeling, including parameter, model input, measurement uncertainty and conceptual uncertainty (Gong et al., 2013; Refsgaard et al., 2007). The latter, also referred to as structural uncertainty, pertains to the choice of model itself, and has gained renewed interest in the past decades as an important source of predictive uncertainty (Bredehoeft, 2005; Gong et al., 2013; Gupta et al., 2012; Höge et al., 2019; Neuman, 2003; Rojas et al., 2008). ...
Article
Full-text available
Bayesian model selection (BMS) and Bayesian model justifiability analysis (BMJ) provide a statistically rigorous framework for comparing competing models through the use of Bayesian model evidence (BME). However, a BME‐based analysis has two main limitations: (a) it does not account for a model's posterior predictive performance after using the data for calibration and (b) it leads to biased results when comparing models that use different subsets of the observations for calibration. To address these limitations, we propose augmenting BMS and BMJ analyses with additional information‐theoretic measures: expected log-predictive density (ELPD), relative entropy (RE) and information entropy (IE). Exploring the connection between Bayesian inference and information theory, we explicitly link BME and ELPD together with RE and IE to highlight the information flow in BMS and BMJ analyses. We show how to compute and interpret these scores alongside BME, and apply the framework to a controlled 2D groundwater setup featuring five models, one of which uses a subset of the data for calibration. Our results show how the information‐theoretic scores complement BME by providing a more complete picture concerning the Bayesian updating process. Additionally, we demonstrate how both RE and IE can be used to objectively compare models that feature different data sets for calibration. Overall, the introduced Bayesian information‐theoretic framework can lead to a better‐informed decision by incorporating a model's post-calibration predictive performance, by allowing one to work with different subsets of the data and by considering the usefulness of the data in the Bayesian updating process.
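The quantities discussed can be made concrete on a toy one-parameter model: BME is the prior-averaged likelihood, and the log-predictive density of a held-out point (the building block of ELPD) can be approximated by importance-weighting prior samples with their likelihoods. The sketch below is a hypothetical illustration, not the authors' groundwater setup.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Synthetic data from y = 1.5 x + noise; candidate model M predicts y = theta * x.
x = np.linspace(0.1, 1.0, 20)
y = 1.5 * x + rng.normal(0.0, 0.1, x.size)
sigma = 0.1                                   # assumed known observation error

theta = rng.normal(1.0, 1.0, 100_000)         # samples from the prior on the slope
log_lik = norm.logpdf(y, theta[:, None] * x, sigma).sum(axis=1)

# Bayesian model evidence: BME = E_prior[p(data | theta)], via log-sum-exp.
m = log_lik.max()
log_bme = m + np.log(np.mean(np.exp(log_lik - m)))

# Log-predictive density of a held-out point, using likelihood-based
# importance weights as a cheap stand-in for posterior sampling.
w = np.exp(log_lik - m)
w /= w.sum()
x_new, y_new = 0.5, 0.8                       # hypothetical held-out observation
lpd = np.log(np.sum(w * norm.pdf(y_new, theta * x_new, sigma)))
print(f"log BME = {log_bme:.1f}, held-out log-predictive density = {lpd:.2f}")
```

Summing such log-predictive densities over all held-out points yields an ELPD estimate.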
... The most common approach for evaluating the impact of deep uncertainty (see BOX 1) associated with multiple plausible futures is to use scenarios (Bankes, 1993; Bankes et al., 2001; Bankes et al., 2013; Maier et al., 2016; McPhail et al., 2020; Refsgaard et al., 2007). Scenarios have been referred to as the "possible" or "a concise summary of" future "states of worlds" (IPCC, 2012; Lempert, 2013; Mahmoud et al., 2009) or "alternative hypothetical (or diverse) futures" (Groves and Lempert, 2007; van Notten et al., 2005). ...
Preprint
Full-text available
Traditionally, reservoir management has been synonymous with the operation of engineering infrastructure systems, with the majority of literature on the topic focusing on strategies that optimize their operation and control. This is despite the fact that reservoirs have major impacts on society and the environment, and the mechanics of how to best manage a reservoir are often overshadowed by both environmental changes and higher-order questions associated with societal values, risk appetite and politics, which are highly uncertain and to which there are no "correct" answers. As a result, reservoirs have attracted more controversy than any other type of water infrastructure. In this paper, we address these often-ignored issues by providing a review of reservoir management through the lens of wickedness, competing objectives and uncertainty. We highlight the challenges associated with reservoir management and identify research efforts required to ensure these systems best serve society and the environment into the future.
... They all differ in complexity with respect to the number of reservoirs and parameters, which need to be well thought out in order to preserve physical realism and limit equifinality in the model parameters. Careful sensitivity analyses and uncertainty assessment should be considered along with model results to avoid over-interpretation (Refsgaard et al., 2007). Reservoir models can be seen as a compromise between simulation performance and insight into the functioning of a system. ...
Article
Full-text available
Hydrological models are widely used to characterize, understand and manage hydrosystems. Lumped parameter models are of particular interest in karst environments given the complexity and heterogeneity of these systems. There is a multitude of lumped parameter modelling approaches, which can make it difficult for a manager or researcher to choose. We therefore conducted a comparison of two lumped parameter modelling approaches: artificial neural networks (ANNs) and reservoir models. We investigate five karst systems in the Mediterranean and Alpine regions with different characteristics in terms of climatic conditions, hydrogeological properties and data availability. We compare the results of ANN and reservoir modelling approaches using several performance criteria over different hydrological periods. The results show that both ANNs and reservoir models can accurately simulate karst spring discharge but also that they have different advantages and drawbacks: (i) ANN models are very flexible regarding the format and amount of input data, (ii) reservoir models can provide good results even with a few years of relevant discharge in the calibration period and (iii) ANN models seem robust for reproducing high-flow conditions, while reservoir models are superior in reproducing low-flow conditions. However, both modelling approaches struggle to reproduce extreme events (droughts, floods), which is a known problem in hydrological modelling. For research purposes, ANN models have been shown to be useful for identifying recharge areas and delineating catchments, based on insights into the input data. Reservoir models are adapted to understand the hydrological functioning of a system by studying model structure and parameters.
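For readers unfamiliar with the reservoir-model family compared here, the simplest member is a single linear reservoir: storage is filled by recharge and drained at a rate proportional to storage. The sketch below uses synthetic forcing and an arbitrary recession coefficient; it is a minimal illustration of the model class, not one of the study's calibrated karst models.

```python
import numpy as np

def linear_reservoir(recharge, k=0.05, s0=10.0):
    """Single linear reservoir: S(t+1) = S(t) + R(t) - Q(t), with Q = k * S."""
    s, q = s0, np.empty_like(recharge)
    for t, r in enumerate(recharge):
        s += r           # recharge fills the store
        q[t] = k * s     # spring discharge proportional to storage
        s -= q[t]
    return q

rng = np.random.default_rng(5)
recharge = rng.gamma(shape=0.6, scale=4.0, size=365)   # synthetic daily input, mm
q_sim = linear_reservoir(recharge)
print(f"mean simulated discharge: {q_sim.mean():.2f} mm/d")
```

Real karst applications chain several such stores (e.g. soil, epikarst, matrix and conduit) and calibrate the recession coefficients and routing between them against observed spring discharge.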
... In response, efforts have been made to guard against uncertainty in CBA. Analytical methods for addressing uncertainty in CBA are targeted at the preliminary identification and characterization of sources of uncertainty, the assessment of uncertainty levels for the various sources, the propagation of uncertainty through models, the tracing and ranking of sources of uncertainty, or the reduction of uncertainty after an assessment has been carried out (Refsgaard et al., 2007). Where the aim is more confident decisions, methods are recommended that characterize and propagate uncertainty through the analysis workflow and, when possible, use the results as a means to reduce the (sensitivity of the project to) uncertainty. ...
... Uncertainty assessment in the simulation chain aims to consider all sources of error in the methodology. Several tools and methodologies are available to account for uncertainty (Refsgaard et al. 2007, Matott et al. 2009). The main sources of uncertainty in this study arise from both the inflow hydrograph and the reservoir water level probability estimates. Future inflow hydrograph estimates, i.e. future flood quantiles, have two sources of uncertainty: (i) uncertainty in delta changes of precipitation associated with a given return period (red in Fig. 5), and (ii) errors in design peak discharge estimates obtained with the RIBS model (blue in Fig. 5). ...
Article
Full-text available
Climate change will likely increase the frequency and magnitude of extreme precipitation events and floods, increasing design peak flows, which may render current spillway capacity insufficient. Therefore, new methodologies for hydrological dam safety assessment considering climate change are required. This study presents a methodology that considers the impact of climate change on both inflow hydrographs and initial reservoir water levels. Moreover, the uncertainty in the procedure is assessed. The methodology is applied to the Eugui Dam in the River Arga catchment (Spain). An ensemble of 12 climate models is used. The results show an increase in the maximum reservoir water level during flood events and in the overtopping probability under the Representative Concentration Pathway 8.5 (RCP 8.5) scenario, especially in the 2071–2100 time window. The proposed methodology can be useful for assessing future hydrological dam safety, fulfilling the requirements of recent regulations to consider the impact of climate change on dams.
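The probabilistic chain described here can be caricatured in a few lines: sample the two uncertain drivers (flood peak and initial reservoir level), push them through a crude level surrogate, and count overtopping events. Every number and functional form below is hypothetical; the study's actual chain uses climate-model delta changes, the RIBS rainfall-runoff model and real dam geometry.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000   # Monte Carlo sample size

# Hypothetical distributions of the two uncertain drivers.
peak = rng.gumbel(loc=300.0, scale=120.0, size=n)     # flood peak inflow, m3/s
level0 = rng.triangular(586.0, 589.0, 590.5, size=n)  # initial level, m a.s.l.

# Crude surrogate for flood routing: the level rise grows with the amount
# by which the peak exceeds the spillway capacity (all values invented).
crest, spill_cap = 592.0, 450.0                       # m a.s.l., m3/s
rise = 0.004 * np.clip(peak - spill_cap, 0.0, None)   # m of rise per m3/s excess
p_overtop = np.mean(level0 + rise > crest)

print(f"estimated overtopping probability: {p_overtop:.2e}")
```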
... The guidelines (Herman and Usher 2017) for the implementation of Sobol' SA in Python with the SALib package suggest that the Sobol' sequence should be used to produce nested input parameter samples across stocks. The computational costs for all Sobol' sensitivity indices would add up to 2^(k−1) · N model evaluations (Refsgaard et al. 2007). Because of the computational time of BSEMs, the authors focus on the analysis of first-order, second-order and total-order effects. ...
Article
Despite broad recognition of the need for applying Uncertainty (UA) and Sensitivity Analysis (SA) to Building-Stock Energy Models (BSEMs), limited research has been done. This article proposes a scalable methodology to apply UA and SA to BSEMs, with an emphasis on important methodological aspects: the input parameter sampling procedure, the minimum required building stock size and the number of samples needed for convergence. Applying UA and SA to BSEMs requires a two-step input parameter sampling that samples ‘across stocks’ and ‘within stocks’. To make efficient use of computational resources, practitioners should distinguish between three types of convergence: screening, ranking and indices. Nested sampling approaches facilitate comprehensive UA and SA quality checks faster and more simply than non-nested approaches. Robust UA-SAs can be accomplished with relatively limited stock sizes. The article highlights that UA-SA practitioners should only limit the UA-SA scope after very careful consideration, as thoughtless curtailments can rapidly affect UA-SA quality and inferences.
Abbreviations, definitions and indices:
BEM: Building Energy Model.
BSEM: Building-Stock Energy Model.
UA: Uncertainty Analysis; focuses on how uncertainty in the input parameters propagates through the model and affects the model output parameter(s).
SA: Sensitivity Analysis; the study of how uncertainty in the output of a model (numerical or otherwise) can be apportioned to different sources of uncertainty in the model input factors.
GSA: Global Sensitivity Analysis (e.g. Sobol’ SA).
LSA: Local Sensitivity Analysis (e.g. OAT).
OAT: One-At-a-Time.
LOD: Level of Development.
Y: the model output.
Xi: the i-th model input parameter; X∼i denotes the matrix of all model input parameters but Xi.
Si: the first-order sensitivity index, representing the expected amount of variance reduction that would be achieved for Y if Xi were specified exactly. The first-order index is normalized (i.e. always between 0 and 1).
STi: the total-order sensitivity index, representing the expected amount of variance that would remain for Y if all parameters but Xi were specified exactly. It takes into account the first- and higher-order effects (interactions) of parameter Xi and can therefore be seen as the residual uncertainty.
SH: the higher-order effects index, calculated as the difference between STi and Si; a measure of how much Xi is involved in interactions with any other input factor.
Sij: the second-order sensitivity index, representing the fraction of variance in the model outcome caused by the interaction of the parameter pair (Xi, Xj).
M: mean (µ).
SD: standard deviation (σ).
Mo: mode.
n: number of buildings in the modelled stock.
N: number of samples (i.e. matrices of (k+2) or (2k+2) stock model runs; batches of (k+2) or (2k+2) runs are required to calculate Sobol’ indices).
k: number of uncertain parameters.
ME: number of model evaluations (i.e. stocks to be calculated).
Aleatory uncertainty: uncertainty due to inherent or natural variation of the system under investigation (Table 1).
Epistemic uncertainty: uncertainty resulting from imperfect knowledge or modeller error; can be quantified and reduced (Table 1).
... In the context of ES assessment, uncertainty can be defined as the unknown order or nature of things, a lack of confidence about possible outcomes and/or the inability to assign probabilities to these outcomes (Refsgaard et al., 2007; IPBES, 2020). A common typology identifies three dimensions of uncertainty: (i) source of uncertainty (e.g., inputs, drivers and model structure, parameters, model technical implementation, assessment tools), (ii) magnitude, and (iii) nature (lack of information or inherent variability) (Hamel and Bryant, 2017; IPBES, 2020; Walker et al., 2003; Grêt-Regamey et al., 2013; Hou et al., 2013). ...
Article
Full-text available
Agroecosystems are facing new challenges in the context of a growing and increasingly interconnected human population, and a paradigm shift is needed to successfully address the many complex questions that these challenges will generate. The transition to providing multiple services within an agroecosystem is a starting point for heightened multifunctionality, however, there is still hesitation among stakeholders about moving towards multi-service systems, largely because of the lack of knowledge linking productivity and multifunctionality. We reason that much of this reticence could be overcome through a better understanding of stakeholder requirements and innovative transdisciplinary research extended in the dimensions of time and space. We assembled experts in France to identify priority research questions for co-constructing projects with stakeholders. We identified 18 key questions, as well as the obstacles that hinder their resolution and propose potential solutions for tackling these obstacles. We illustrate that research into agroecosystem multifunctionality and service production must be a hugely collaborative effort and needs to integrate knowledge from different sectors and communities. Promoting dialogue, standardization and data-sharing would enhance transdisciplinary progress. Biodiversity is highlighted as a key factor to explore and incorporate into modelling approaches, but major advances must be made in the understanding of dynamic changes in the biodiversity-function-service nexus across landscapes. Resolving these research questions will allow us to translate knowledge into decision objectives, identify adaptation and tipping points in agroecosystems and develop social-ecological economic pathways that are adaptive over time.
Chapter
This chapter discusses the notions of uncertainty and risk in relation to decision-making, with an emphasis on their implications for life cycle assessment. In particular, the chapter presents approaches for dealing with uncertain LCA results in decision situations.
Article
Scientific uncertainty is an integral part of the research process and inherent to the construction of new knowledge. In this paper, we investigate the ways in which uncertainty is expressed in articles and propose a new interdisciplinary annotation framework to categorize sentences containing uncertainty expressions along five dimensions. We propose a method for the automatic annotation of sentences based on linguistic patterns for identifying the expressions of scientific uncertainty that have been derived from a corpus study. We processed a corpus of 5956 articles from 22 journals in three different discipline groups, which were annotated using our automatic annotation method. We evaluate our annotation method and study the distribution of uncertainty expressions across the different journals and categories. The results show that scientific uncertainty expressions are predominantly concentrated in the Results and Discussion section (71.4%), followed by the Background section (12.5%), and that the largest proportion of uncertainty expressions, approximately 70.3%, is formulated as author statements. Our research contributes methodological advances and insights into the diverse manifestations of scientific uncertainty across disciplinary domains and provides a basis for ongoing exploration and refinement of the understanding of scientific uncertainty communication.
Article
Full-text available
The assessment of cropland carbon and nitrogen (C and N) balances plays a key role in identifying cost-effective mitigation measures to combat climate change and reduce environmental pollution. In this paper, a biogeochemical modelling approach is adopted to assess all C and N fluxes in a regional cropland ecosystem of Thessaly, Greece. Additionally, the estimation and quantification of the modelling uncertainty in the regional inventory are realized through the propagation of parameter distributions through the model, leading to result distributions for modelling estimations. The model was applied to a regional dataset of approximately 1000 polygons, deploying model initializations and crop rotations for the five major crop cultivations and for a time span of 8 years. The full statistical analysis on modelling results (including the uncertainty ranges given as ± values) yields for the C balance carbon input fluxes into the soil of 12.4 ± 1.4 t C ha−1 yr−1 and output fluxes of 11.9 ± 1.3 t C ha−1 yr−1, with a resulting average carbon sequestration of 0.5 ± 0.3 t C ha−1 yr−1. The averaged N influx was 212.3 ± 9.1 kg N ha−1 yr−1, while outfluxes of 198.3 ± 11.2 kg N ha−1 yr−1 were estimated on average. The net N accumulation into the soil nitrogen pools was estimated to be 14.0 ± 2.1 kg N ha−1 yr−1. The N outflux consists of gaseous N fluxes composed of N2O emissions of 2.6 ± 0.8 kg N2O–N ha−1 yr−1, NO emissions of 3.2 ± 1.5 kg NO–N ha−1 yr−1, N2 emissions of 15.5 ± 7.0 kg N2–N ha−1 yr−1 and NH3 emissions of 34.0 ± 6.7 kg NH3–N ha−1 yr−1, as well as aquatic N fluxes (only nitrate leaching into surface waters) of 14.1 ± 4.5 kg NO3–N ha−1 yr−1 and N fluxes of N removed from the fields in yields, straw and feed of 128.8 ± 8.5 kg N ha−1 yr−1.
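The core propagation idea, pushing parameter distributions through a model to obtain result distributions, generalises beyond this specific biogeochemical model. A minimal Monte Carlo sketch, with an invented one-line emission model and assumed parameter distributions standing in for the full process model:

```python
import numpy as np

rng = np.random.default_rng(42)

def n2o_flux(fertiliser_n, emission_factor):
    """Toy emission model: annual N2O-N flux (kg N ha-1 yr-1).
    A stand-in for the biogeochemical model; the real study propagates
    distributions through a full process model."""
    return fertiliser_n * emission_factor

n_draws = 10_000
# Assumed parameter distributions, for illustration only.
fert = rng.normal(loc=180.0, scale=15.0, size=n_draws)          # kg N ha-1 yr-1
ef = rng.lognormal(mean=np.log(0.01), sigma=0.4, size=n_draws)  # unitless

flux = n2o_flux(fert, ef)
print(f"N2O-N flux: {flux.mean():.2f} ± {flux.std():.2f} kg N ha-1 yr-1")
```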
Chapter
Cereals are an important and essential part of the population’s diet. They can be stored for a long time at relatively little cost. This is especially important for overcoming hunger in poor areas of the world. Their production accounts for a substantial amount of the greenhouse gas emission of the agricultural sector, which is in many cases directly affected by a changing climate. The growing world population accompanying these environmental pressures create a necessity to ensure food security while also arranging food systems in a sustainable way beneficial to all stakeholders along the value chain. Another important group of annual crops are vegetables. They play an important role in supplying the human body with vitamins and trace elements. In this paper we propose Ant Colony Optimization (ACO) algorithm for creating basic bio-economic farm model (BEFM). We seeking to assess farm profits and risks, considering various types of government incentives and policies and adverse weather events. Any annual crop can be included in the proposed model. It can be extended to incorporate many environmental goals and directions, targeted by recent EU policies.
Article
This study presents an assessment of the uncertainty associated with 100-year flood maps under three scenarios: present, RCP4.5, and RCP8.5. Through intensive Monte Carlo simulations, the study focuses on evaluating the uncertainty introduced by key model input parameters, namely the roughness coefficient, runoff coefficient, and precipitation intensity. Notably, the precipitation intensity incorporates multiple sources of uncertainty, including the RCP scenario, climate model, and probability distribution function. To analyze the uncertainties, a surrogate hydrodynamic/hydrologic model based on a physical 2D model is employed. The findings of this study challenge the traditional use of deterministic flood maps and climate factors, highlighting the necessity of employing probabilistic approaches for the development of accurate and secure flood maps. Moreover, the research findings indicate that the primary source of uncertainty in precipitation is the selection of the probability distribution, followed by the choice of climate model, and to a lesser extent, the specific RCP scenario.
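A stripped-down version of this kind of probabilistic flood analysis can be sketched as follows. The rational-method flow formula, the catchment area, and the input distributions are illustrative assumptions, not the study's surrogate model; the final loop shows one crude way to attribute output variance to individual inputs by freezing them at their means.

```python
import numpy as np

rng = np.random.default_rng(0)
AREA_KM2 = 25.0  # hypothetical catchment area

def peak_flow(c, i_mm_h, area_km2=AREA_KM2):
    """Rational-method peak flow Q = 0.278 C i A (m3/s), a stand-in for
    the surrogate hydrodynamic/hydrologic model used in the study."""
    return 0.278 * c * i_mm_h * area_km2

n = 20_000
c = rng.uniform(0.3, 0.7, n)              # runoff coefficient (assumed range)
i = rng.lognormal(np.log(40.0), 0.35, n)  # rainfall intensity, mm/h (assumed)

q = peak_flow(c, i)
print(f"flow estimate: median {np.median(q):.1f} m3/s, 90% band "
      f"[{np.percentile(q, 5):.1f}, {np.percentile(q, 95):.1f}]")

# Crude source attribution: freeze one input at its mean and see how much
# output variance remains.
for name in ("runoff C", "intensity i"):
    c_used = np.full(n, c.mean()) if name == "runoff C" else c
    i_used = np.full(n, i.mean()) if name == "intensity i" else i
    frozen = peak_flow(c_used, i_used)
    print(f"variance with {name} fixed: {frozen.var() / q.var():.2f} of total")
```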
Chapter
This chapter aims to outline the challenge of language uncertainty and vagueness for the construction of predictive models in archaeology. It includes methods and examples to deal with the issue of uncertainty and vagueness arising from archaeological datasets and elaborates quantitative tools to process and integrate it in a Machine Learning (ML) framework. In particular, the chapter is focused on the combination of a fuzzy set approach with the well-known ensemble algorithm of Random Forest (RF). On this basis, Archaeological Predictive Maps (APM) for two case studies are produced and an uncertainty visualization strategy is defined, based on statistics and cognitive theory methods. A procedure is suggested in order to visually represent and communicate the uncertainty in the final output of the modeling procedure. A four-step methodology is described here, to consistently estimate and process the language vagueness, without increasing the computational cost of an ML environment, so that it is possible to produce APMs, incorporating confidence intervals and subjective values. The goal is to provide archaeologists with the necessary theoretical and methodological infrastructure to critically evaluate and compute various levels of uncertainty and vagueness that are inherent in archaeological databases and to design best practices for establishing scientific transparency of the results, as well as to improve the efficiency of APMs in research and decision-making processes, within cultural heritage management and archaeological research.
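To illustrate the fuzzy-set/Random Forest combination in general terms (not the chapter's actual workflow), the sketch below fuzzifies one vague site attribute, trains a Random Forest on synthetic data, and reads the spread of per-tree votes as a simple per-cell uncertainty proxy. All names, membership shapes and data here are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def fuzzy_near(distance_m, full_at=200.0, zero_at=2000.0):
    """Fuzzy membership for the vague predicate 'near water': 1 within
    full_at metres, 0 beyond zero_at, linear in between (an assumed
    membership shape, not the chapter's)."""
    return np.clip((zero_at - distance_m) / (zero_at - full_at), 0.0, 1.0)

rng = np.random.default_rng(1)
n = 500
dist_water = rng.uniform(0, 3000, n)  # metres to nearest river (synthetic)
slope = rng.uniform(0, 30, n)         # slope in degrees (synthetic)
X = np.column_stack([fuzzy_near(dist_water), slope])
# Synthetic 'site present' labels biased towards flat, near-water cells.
y = (rng.random(n) < 0.8 * X[:, 0] * np.exp(-slope / 10)).astype(int)

rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=1)
rf.fit(X, y)
# Disagreement between the trees is one simple uncertainty measure per cell.
votes = np.stack([tree.predict(X) for tree in rf.estimators_])
print("OOB accuracy:", round(rf.oob_score_, 3))
print("mean vote std (uncertainty proxy):", round(votes.std(axis=0).mean(), 3))
```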
Article
We compute the first probabilistic uranium concentration map of Norway. Such a map can support mineral exploration, geochemical mapping, or the assessment of the health risk to the human population. We employ multiple non-linear regression to fill the information gaps in sparse airborne and ground-borne uranium data sets. We mimic an expert elicitation by employing Random Forests and Multi-layer Perceptrons as digital agents equally qualified to find regression models. In addition to the regression, we use supervised classification to produce conservative and alarmistic classified maps outlining regions with different potential for the local occurrence of uranium concentration extremes. Embedding the introduced digital expert elicitation in a Monte Carlo approach, we compute an ensemble of plausible uranium concentration maps of Norway, discretely quantifying the uncertainty resulting from the choice of the regression algorithm and the chosen parametrization of the used regression algorithms. We introduce digitated glyphs to visually integrate all computed maps and their associated uncertainties in a loss-free manner to fully communicate our probabilistic results to map perceivers. A strong correlation between mapped geology and uranium concentration is found, which could be used to optimize future sparse uranium concentration sampling to lower extrapolation components in future map updates.
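The "digital expert elicitation" idea, reduced to its core, is to train several differently parametrised regressors on the same sparse data and read their disagreement as uncertainty. Everything below (data, algorithms, parametrisations) is synthetic and illustrative only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
n = 400
X = rng.uniform(0, 1, (n, 2))  # e.g. easting/northing, synthetic
y = 3 * X[:, 0] + np.sin(6 * X[:, 1]) + rng.normal(0, 0.1, n)  # 'concentration'

# Two 'digital experts', each with two parametrisations; the spread of
# their predictions is read as (part of) the map uncertainty.
experts = [RandomForestRegressor(n_estimators=k, random_state=k)
           for k in (50, 200)]
experts += [MLPRegressor(hidden_layer_sizes=(h,), max_iter=2000,
                         random_state=h) for h in (16, 64)]

X_new = rng.uniform(0, 1, (5, 2))  # five unsampled locations
preds = np.array([e.fit(X, y).predict(X_new) for e in experts])
print("ensemble mean:", np.round(preds.mean(axis=0), 2))
print("ensemble std :", np.round(preds.std(axis=0), 2))
```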
Article
Full-text available
The degree of success of river water diversion planning decisions is affected by uncertain environmental conditions. The adaptive water management framework incorporates this uncertainty at all stages of management. While the most effective form of adaptive management requires experimental comparison of practices, the use of optimization modeling is convenient for conducting exploratory simulations to evaluate the spatiotemporal implications of current water diversion management decisions under future environmental changes. We demonstrate such an explorative modeling approach by assessing river water availability for diversion in a river basin in Northern Spain under two future environmental scenarios that combine climate and land use change. An evolutionary optimization method is applied to identify and reduce trade-offs with Supporting Ecosystem Services linked to environmental flow requirements for relevant local freshwater species. The results show that seasonal shifts and spatial heterogeneity of diversion volumes are the main challenges for the future diversion management of the Pas River. Basin-scale diversion management should take into account the seasonal planning horizon and the setting of tailored diversion targets at the local level to promote the implementation of adaptive management. The presented assessment can help with strategic placement of diversion points and timing of withdrawals, but it also provides deeper insight into how optimisation can support decision-making in managing water diversion under uncertain future environmental conditions.
Article
Full-text available
Many hydrological applications employ conceptual-lumped models to support water resource management techniques. This study aims to evaluate the workability of applying a daily time-step conceptual-lumped model, HYdrological MODel (HYMOD), to the Headwaters Benue River Basin (HBRB) for future water resource management. This study combines both local and global sensitivity analysis (SA) approaches to focus on which model parameters most influence the model output. It also identifies how well the model parameters are defined in the model structure using six performance criteria to predict model uncertainty and improve model performance. The results showed that both SA approaches gave similar results in terms of sensitive parameters to the model output, which are also well-identified parameters in the model structure. The more precisely the model parameters are constrained in the small range, the smaller the model uncertainties, and therefore the better the model performance. The best simulation with regard to the measured streamflow lies within the narrow band of model uncertainty prediction for the behavioral parameter sets. This highlights that the simulated discharges agree with the observations satisfactorily, indicating the good performance of the hydrological model and the feasibility of using the HYMOD to estimate long time-series of river discharges in the study area. HIGHLIGHTS: Local and global sensitivity analysis (SA) approaches were used for SA and parameter identifiability. Both approaches gave similar results in terms of sensitive parameters for the model output. A group of sensitive parameters depends on the selected objective criterion. Precisely identified parameters reduce the model uncertainties and enhance the model performance. Sensitive, well-defined parameters and model performance increase with catchment size.
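The local-versus-global SA contrast the study draws can be illustrated generically. The toy response function below is not HYMOD; it only serves to show a one-at-a-time derivative estimate next to a crude variance-based global measure, with all parameter names and ranges assumed.

```python
import numpy as np

def toy_response(params):
    """Toy stand-in for a lumped rainfall-runoff response (not HYMOD):
    one scalar output driven by a storage capacity cmax and a recession k."""
    cmax, k = params
    return 100.0 * np.exp(-k) + 0.2 * cmax

base = np.array([250.0, 0.5])  # assumed base point
# Local (one-at-a-time) sensitivities around the base point.
for j, name in enumerate(["cmax", "k"]):
    p = base.copy()
    p[j] *= 1.01
    s = (toy_response(p) - toy_response(base)) / (0.01 * base[j])
    print(f"local dY/d{name}: {s:.3f}")

# Crude global measure: output variance when each parameter varies alone
# over its full assumed range, others fixed (a simplified Sobol-style idea).
rng = np.random.default_rng(7)
ranges = {"cmax": (50.0, 500.0), "k": (0.05, 1.0)}
for j, (name, (lo, hi)) in enumerate(ranges.items()):
    samples = np.tile(base, (2000, 1))
    samples[:, j] = rng.uniform(lo, hi, 2000)
    var_j = np.array([toy_response(p) for p in samples]).var()
    print(f"global variance from {name} alone: {var_j:.1f}")
```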
Article
Full-text available
Promotion of soil organic carbon (SOC) sequestration as a potential solution to support climate change mitigation as well as more sustainable farming systems is rising steeply. As a result, voluntary carbon markets are rapidly expanding in which farmers get paid per ton of carbon dioxide sequestered. This market relies on protocols using simulation models to certify that increases in SOC stocks do indeed occur and generate tradable carbon credits. This puts tremendous pressure on SOC simulation models, which are now expected to provide the foundation for a reliable global carbon credit generation system. There exists an incredibly large number of SOC simulation models, which vary considerably in their applicability and sensitivity. This confronts practitioners and certificate providers with the critical challenge of selecting the models that are appropriate to the specific conditions in which they will be applied. Model validation and the context of said validation define the boundaries of applicability of the model, and are therefore critical to model selection. To date, however, guidelines for model selection are lacking. In this review, we present a comprehensive overview of existing SOC models and a classification of their validation contexts. We found that most models are not validated (71%), and out of those validated, validation contexts are overall limited. Validation studies so far largely focus on the global north. Therefore, countries of the global south, the least emitting countries that are already facing the most drastic consequences of climate change, are the most poorly supported. In addition, we found a general lack of clear reporting, numerous flaws in model performance evaluation, and a poor overall coverage of land use types across countries and pedoclimatic conditions. We conclude that, to date, SOC simulation does not represent an adequate tool for globally ensuring the effectiveness of SOC sequestration efforts and for reliable carbon crediting.
Article
Full-text available
Factor Fixing (FF) is a common method for reducing the number of model parameters to lower computational cost. FF typically starts with distinguishing the insensitive parameters from the sensitive and pursues uncertainty quantification (UQ) on the resulting reduced‐order model, fixing each insensitive parameter at a fixed value. There is a need, however, to expand such a common approach to consider the effects of decision choices in the FF‐UQ procedure on metrics of interest. Therefore, to guide the use of FF and increase confidence in the resulting dimension‐reduced model, we propose a new adaptive framework consisting of four principles: 1) re‐parameterize the model first to reduce obvious non‐identifiable parameter combinations, 2) focus on decision relevance especially with respect to errors in quantities of interest (QoI), 3) conduct adaptive evaluation and robustness assessment of errors in the QoI across FF choices as sample size increases, and 4) reconsider whether fixing is warranted. The framework is demonstrated on a spatially distributed water quality model. The error in estimates of QoI caused by FF can be estimated using a Polynomial Chaos Expansion (PCE) surrogate model. Built with 70 model runs, the surrogate is computationally inexpensive to evaluate and can provide global sensitivity indices for free. For the selected catchment, just two factors may provide an acceptably accurate estimate of model uncertainty in the average annual load of Total Suspended Solids (TSS), suggesting that reducing the uncertainty in these two parameters is a priority for future work before undertaking further formal uncertainty quantification.
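A polynomial surrogate in the spirit of a PCE can be fitted cheaply by least squares. The sketch below uses an invented two-factor "TSS load" function and an ordinary degree-2 monomial basis rather than proper orthogonal polynomials; the 70 training runs mirror the number quoted in the abstract, but nothing else comes from the study.

```python
import numpy as np

def tss_model(x1, x2):
    """Expensive-model stand-in: annual TSS load as a nonlinear function of
    two influential factors (synthetic; the study uses a distributed
    water quality model)."""
    return 50 + 30 * x1 + 10 * x2 ** 2 + 5 * x1 * x2

rng = np.random.default_rng(3)
n_train = 70  # mirrors the 70 model runs mentioned in the abstract
x = rng.uniform(-1, 1, (n_train, 2))
y = tss_model(x[:, 0], x[:, 1])

# Degree-2 polynomial basis (a cheap proxy for a polynomial chaos expansion
# with Legendre polynomials on uniform inputs).
def basis(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

coef, *_ = np.linalg.lstsq(basis(x), y, rcond=None)

# Once fitted, the surrogate is essentially free to evaluate everywhere.
x_test = rng.uniform(-1, 1, (1000, 2))
err = basis(x_test) @ coef - tss_model(x_test[:, 0], x_test[:, 1])
print("surrogate RMSE:", round(float(np.sqrt((err ** 2).mean())), 4))
```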
Article
Traditionally, reservoir management has been synonymous with the operation of engineering infrastructure systems, with the majority of literature on the topic focusing on strategies that optimize their operation and control. This is despite the fact that reservoirs have major impacts on society and the environment, and the mechanics of how to best manage a reservoir are often overshadowed by both environmental changes and higher-order questions associated with societal values, risk appetite and politics, which are highly uncertain and to which there are no “correct” answers. As a result, reservoirs have attracted more controversy than any other type of water infrastructure. In this paper, we address these often-ignored issues by providing a review of reservoir management through the lens of wickedness, competing objectives and uncertainty. We highlight the challenges associated with reservoir management and identify research efforts required to ensure these systems best serve society and the environment into the future.
Article
Full-text available
The aim of this work is to present the food web models developed using the Ecopath with Ecosim (EwE) software tool to describe structure and functioning of various European marine ecosystems (eastern, central and western Mediterranean Sea; Black Sea; Bay of Biscay, Celtic Sea and Iberian coast; Baltic Sea; North Sea; English Channel, Irish Sea and west Scottish Sea; and Norwegian and Barents Seas). A total of 195 Ecopath models based on 168 scientific publications, which report original, updated and modified versions, were reviewed. Seventy models included Ecosim temporal simulations while 28 implemented Ecospace spatiotemporal dynamics. Most of the models and publications referred to the western Mediterranean Sea followed by the English Channel, Irish Sea and west Scottish Sea sub-regions. In the Mediterranean Sea, the western region had the largest number of models and publications, followed by the central and eastern regions; similar trends were observed in previous literature reviews. Most models addressed ecosystem functioning and fisheries-related hypotheses while several investigated the impact of climate change, the presence of alien species, aquaculture, chemical pollution, infrastructure, and energy production. Model complexity (i.e., number of functional groups) increased over time. Main forcing factors considered to run spatial and temporal simulations were trophic interactions, fishery, and primary production. Average scores of ecosystem indicators derived from the Ecopath summary statistics were compared. Uncertainty was also investigated based on the use of the Ecosampler plug-in and the Monte Carlo routine; only one third of the reviewed publications incorporated uncertainty analysis. Only a limited number of the models included the use of the ECOIND plug-in which provides the user with quantitative output of ecological indicators. We assert that the EwE modelling approach is a successful tool which provides a quantitative framework to analyse the structure and dynamics of ecosystems, and to evaluate the potential impacts of different management scenarios.
Article
Full-text available
Due to our limited knowledge about complex environmental systems, our predictions of their behavior under different scenarios or decision alternatives are subject to considerable uncertainty. As this uncertainty can often be relevant for societal decisions, its consideration, quantification and communication is very important. Due to internal stochasticity, often poorly known influence factors, and only partly known mechanisms, in many cases a stochastic model is needed to get an adequate description of uncertainty. As this implies the need to infer constant parameters, as well as the time-course of stochastic model states, a very high-dimensional inference problem for model calibration has to be solved. This is very challenging from a methodological and a numerical perspective. To illustrate aspects of this problem and show options to successfully tackle it, we compare three numerical approaches: Hamiltonian Monte Carlo, Particle Markov Chain Monte Carlo, and Conditional Ornstein-Uhlenbeck Sampling. As a case study, we select the analysis of hydrological data with a stochastic hydrological model. We conclude that the performance of the investigated techniques is comparable for the analyzed system, and that generality and practical considerations may also be taken into account to guide the choice of which technique is more appropriate for a particular application.
Technical Report
Full-text available
The EU Member States, Norway and the European Commission have jointly developed a common strategy for supporting the implementation of Directive 2000/60/EC establishing a framework for Community action in the field of water policy (the Water Framework Directive). The main aim of this strategy is to allow a coherent and harmonious implementation of this Directive. Focus is on methodological questions related to a common understanding of the technical and scientific implications of the Water Framework Directive. One of the main short-term objectives of the strategy is the development of non-legally binding and practical Guidance Documents on various technical issues of the Directive. These Guidance Documents are targeted to those experts who are directly or indirectly implementing the Water Framework Directive in river basins. The structure, presentation and terminology are therefore adapted to the needs of these experts, and formal, legalistic language is avoided wherever possible. In the context of this strategy, an informal working group dedicated to best practices in river basin planning issues of the Directive has been set up. The main objective of this working group, launched in July 2001, is the development of a non-legally binding and practical Guidance Document on four elements of the Water Framework Directive: identification of river basin districts, planning process, public participation and integrated river basin management planning. Spain and the Commission have the responsibility for the secretariat and animation of the working group, which is composed of technical experts from governmental and non-governmental organisations (NGOs). The present document is the final version of the Guidance on the planning process. It presents a general overview of the whole planning cycle and provides some recommendations for its successful implementation. It builds on the input and feedback from a wide range of experts and stakeholders from both EU Member States and candidate countries. We, the water directors of the European Union, Norway, Switzerland and the countries applying for accession to the European Union, have examined and endorsed this Guidance by means of a written procedure in March 2003. We would like to thank the participants of the Working Group and, in particular, the leaders for preparing this high quality document. We strongly believe this and other Guidance Documents developed under the common implementation strategy will play a key role in the process of implementing the Water Framework Directive. This Guidance Document is a living document that will need continuous input and improvements as application and experience build up in all countries of the European Union and beyond. We agree, however, that this document will be made publicly available in its current form in order to present it to a wider public as a basis for carrying forward ongoing implementation work.
Conference Paper
Full-text available
This paper highlights integrated water resources management (IWRM) and the Global Water Partnership (GWP), which seek to balance human, industrial, agricultural and environmental needs. The Global Water Partnership is a working partnership among all those involved in water management: government agencies, public institutions, private companies, professional organizations, multilateral development agencies and others committed to the Dublin-Rio principles. In this paper attention has been given to areas of concern for IWRM and their relation to Sudanese and local water resources issues, as well as to the importance of the GWP and the launching of the Regional Water Partnership, East Africa Region.
Article
Full-text available
The predominant philosophy underlying most environmental modelling is a form of pragmatic realism. The limitations of this approach in practical applications are discussed, in particular, in relation to questions of scale, nonlinearity, and uniqueness of place. A new approach arising out of the concept of equifinality of models (structures and parameter sets) in application is outlined in the form of an uncertain "landscape space" to model space mapping. The possibility of hypothesis testing within this framework is proposed as a means of refining the mapping, with a focus on the differentiation of function within the model space. The approach combines elements of instrumentalism, relativism, Bayesianism and pragmatism while allowing the realist stance that underlies much of the practice of environmental modelling as a fundamental aim. It may be an interim philosophy that is awaiting developments in measurement technique to allow further refinement, but allows some coherent guidance about how to be specific in presenting predictions to end-users.
Article
Full-text available
The difficulties involved in calibrating conceptual watershed models have, in the past, been partly attributable to the lack of robust optimization tools. Recently, a global optimization method known as the SCE-UA (shuffled complex evolution method developed at The University of Arizona) has shown promise as an effective and efficient optimization technique for calibrating watershed models. Experience with the method has indicated that the effectiveness and efficiency of the algorithm are influenced by the choice of the algorithmic parameters. This paper first reviews the essential concepts of the SCE-UA method and then presents the results of several experimental studies in which the National Weather Service river forecast system-soil moisture accounting (NWSRFS-SMA) model, used by the National Weather Service for river and flood forecasting, was calibrated using different algorithmic parameter setups. On the basis of these results, the recommended values for the algorithmic parameters are given. These values should also help to provide guidelines for other users of the SCE-UA method.
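SCE-UA itself involves complex shuffling and simplex steps that are beyond a short sketch. As a runnable illustration of the same class of population-based global calibration, the snippet below recovers the parameters of a toy two-parameter bucket model with SciPy's differential evolution optimizer; the model, forcing and "true" parameters are all invented.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(5)
rain = rng.gamma(2.0, 5.0, 200)  # synthetic daily rainfall forcing

def runoff(params, rain):
    """Two-parameter toy bucket model (not NWSRFS-SMA): a runoff
    coefficient c and a linear-reservoir recession constant k."""
    c, k = params
    store, q = 0.0, np.empty_like(rain)
    for t, r in enumerate(rain):
        store += c * r
        q[t] = k * store
        store -= q[t]
    return q

true_q = runoff([0.6, 0.3], rain)  # synthetic 'observations'

def sse(params):
    return float(((runoff(params, rain) - true_q) ** 2).sum())

# differential_evolution is a population-based global optimizer in the same
# spirit as SCE-UA, used here purely for illustration.
result = differential_evolution(sse, bounds=[(0.0, 1.0), (0.01, 1.0)], seed=5)
print("recovered parameters:", np.round(result.x, 3))
```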
Article
Full-text available
The aim of this paper is to provide a conceptual basis for the systematic treatment of uncertainty in model-based decision support activities such as policy analysis, integrated assessment and risk assessment. It focuses on the uncertainty perceived from the point of view of those providing information to support policy decisions (i.e., the modellers’ view on uncertainty) – uncertainty regarding the analytical outcomes and conclusions of the decision support exercise. Within the regulatory and management sciences, there is neither commonly shared terminology nor full agreement on a typology of uncertainties. Our aim is to synthesise a wide variety of contributions on uncertainty in model-based decision support in order to provide an interdisciplinary theoretical framework for systematic uncertainty analysis. To that end we adopt a general definition of uncertainty as being any deviation from the unachievable ideal of completely deterministic knowledge of the relevant system. We further propose to discriminate among three dimensions of uncertainty: location, level and nature of uncertainty, and we harmonise existing typologies to further detail the concepts behind these three dimensions of uncertainty. We propose an uncertainty matrix as a heuristic tool to classify and report the various dimensions of uncertainty, thereby providing a conceptual framework for better communication among analysts as well as between them and policymakers and stakeholders. Understanding the various dimensions of uncertainty helps in identifying, articulating, and prioritising critical uncertainties, which is a crucial step to more adequate acknowledgement and treatment of uncertainty in decision support endeavours and more focused research on complex, inherently uncertain, policy issues.
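The uncertainty matrix proposed here is essentially a structured table over the location/level/nature dimensions. A minimal sketch of such a record structure, with invented example entries, might look like this:

```python
# A minimal uncertainty matrix as a list of records; the dimension values
# follow the location/level/nature scheme described above, while the
# concrete entries are invented examples.
UNCERTAINTY_MATRIX = [
    {"location": "context",         "level": "recognised ignorance",
     "nature": "epistemic",   "note": "future land-use policy"},
    {"location": "model structure", "level": "scenario uncertainty",
     "nature": "epistemic",   "note": "choice of runoff formulation"},
    {"location": "inputs",          "level": "statistical uncertainty",
     "nature": "variability", "note": "rainfall measurement error"},
    {"location": "parameters",      "level": "statistical uncertainty",
     "nature": "epistemic",   "note": "roughness coefficient range"},
]

def report(matrix):
    """Print the matrix as a simple aligned table."""
    for row in matrix:
        print(f"{row['location']:<16} {row['level']:<24} "
              f"{row['nature']:<12} {row['note']}")

report(UNCERTAINTY_MATRIX)
```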
Article
Full-text available
This paper proposes a subjective scale of scientific uncertainty that allows a source of scientific information to express to a lay audience the subjective level of certainty or uncertainty that it associates with a particular assertion of scientific fact, or to represent the range of expert opinion regarding that certainty or uncertainty. The scale is intended as a tool to help increase the precision and rationality of discourse in controversies in which generalists untrained in natural science must judge the merits of opposing arguments in disputes among scientific experts. It complements the quantitative scale of uncertainty, based on Bayesian statistics, used in the recent report of the Inter‐Governmental Panel on Climate Change. Both of these scales are designed for use in situations where the risk probabilities are not precisely known. The scale takes advantage of the fact that there are many more standards of proof recognized in the US legal system beyond the familiar ‘criminal’ and ‘civil’ standards of ‘beyond a reasonable doubt’ and ‘preponderance of the evidence’, respectively, and that these standards correspond to levels of certainty or uncertainty that constitute acceptable bases for legal decisions in a variety of practical contexts. The levels of certainty or uncertainty corresponding to these standards of proof correspond rather well to the informal scale of certainty used by research scientists in the course of their everyday work, and indeed by ordinary people as they estimate the likelihood of one or another proposition.
Article
Current regimes in resource management are often unsustainable as judged by ecological, economic and social criteria. Many technological resource management regimes are inflexible and not built to adapt to changes in environmental, economic or social circumstances. This inflexibility poses problems in a world characterized by fast change. The water sector is currently undergoing major processes of transformation at local, regional and global scales. Today's situation is challenged by uncertainties, e.g., in water demand (diminishing in industrialized countries, rising in developing countries), by worsening water quality, by pressure for cost-efficient solutions, and by fast changing socio-economic boundary conditions. One expects additional uncertainties, due to climate change, such as a shift in the pattern of extreme events. Hence, new strategies and institutional arrangements are required to cope with risk and change in general. When one considers processes of transformation and change, the human dimension is of particular importance. Institutions and rule systems may cause resistance to change but can also enable and facilitate necessary transformation processes. This paper explores conceptual approaches in social learning and adaptive management. It introduces agent-based modelling, and the link between analytical modelling and participatory approaches as promising new developments to explore and foster changes towards sustainability and the required transformations in technological regimes and institutional settings.
Article
Contents: Introduction; How to Obtain and Install SIMLAB; SIMLAB Main Panel; Sample Generation; How to Execute Models; Sensitivity Analysis
Article
This paper reviews the role of uncertainty in the identification of mathematical models of water quality and in the application of these models to problems of prediction. More specifically, four problem areas are examined in detail: uncertainty about model structure, uncertainty in the estimated model parameter values, the propagation of prediction errors, and the design of experiments in order to reduce the critical uncertainties associated with a model. The main body of the review deals in turn with (1) identifiability and experimental design, (2) the generation of preliminary model hypotheses under conditions of sparse, grossly uncertain field data, (3) the selection and evaluation of model structure, (4) parameter estimation (model calibration), (5) checks and balances on the identified model, i.e., model 'verification' and model discrimination, and (6) prediction error propagation.
Article
‘Risk governance’ is by now a widely used expression, combining two concepts that are apparently separate, but belong instead to spheres of investigation and practical interest that are strictly intertwined and partially overlapping. Many discussions about technological innovation or development occur nowadays in the public arena and are broadly framed, including considerations about health, safety, environment, distributional and ethical issues, thus touching on the interests and values of many stakeholders. Many opportunities are emerging for incorporating societal concerns as well as ‘non-standard’ knowledge in the governance of risks. Yet the full realisation of a kind of participative governance is enormously difficult: it requires a shift of mentality, broad changes in professional and institutional practices, and the design and implementation of new instruments and procedures.
Article
Risk assessment of technological hazards usually requires expert judgment for the quantification of uncertain parameters. One method of incorporating these judgments into the analysis is to encode them as probabilities or probability distributions. This paper discusses a formal procedure for developing these probability distributions. The steps in the procedure include selection of experts, selection and definition of issues, preparation for probability elicitation, probability elicitation methods, postelicitation processing of judgments, and documentation. The procedure is illustrated with examples from technological risk assessment, including reactor safety and nuclear-waste disposal studies.
Article
To meet the challenges of sustainability and catchment management requires an approach that assesses resource usage options and environmental impacts integratively. Assessment must be able to integrate several dimensions: the consideration of multiple issues and stakeholders, the key disciplines within and between the human and natural sciences, multiple scales of system behaviour, cascading effects both spatially and temporally, models of the different system components, and multiple databases. Integrated assessment (IA) is an emerging discipline and process that attempts to address the demands of decision makers for management that has ecological, social and economic values and considerations. This paper summarises the features of IA and argues the role for models and information systems as a prime activity. Given the complex nature of IA problems, the broad objectives of IA modelling should be to understand the directions and magnitudes of change in relation to management interventions so as to be able to differentiate between associated outcome sets. Central to this broad objective is the need for improved techniques of uncertainty and sensitivity analysis that can provide a measure of confidence in the ability to differentiate between different decisions. Three examples of problems treated with an IA approach are presented. The variations in the way that the different dimensions are integrated in the modelling are discussed to highlight the sorts of choices that can be made in model construction. The conclusions stress the importance of IA as a process, not just as a set of outcomes, and define some of the deficiencies to be overcome.
Article
Although it is widely acknowledged that our understanding of environmental systems cannot be reduced to single predictions and unique explanations, determinism remains a common strategy in physical geography. This paper argues for explicit assessments of uncertainty in environmental data and models as a necessary, although not a sufficient, condition for balancing uncertain scientific arguments against uncertain social, ethical, moral and legal arguments in managing environmental systems. In particular, this paper aims to: (1) demonstrate the importance of assessing uncertainty within a realist research framework; (2) consider the nature of scientific uncertainty as the basis for developing methodologies that question belief; and (3) explore some important aspects of a methodology for evaluating uncertainties in environmental research.
Article
This paper presents the present philosophy and practice used in probability encoding by the Decision Analysis Group at Stanford Research Institute. Probability encoding, the process of extracting and quantifying individual judgment about uncertain quantities, is one of the major functions required in the performance of decision analysis. The paper discusses the setting of the encoding process, including the use of sensitivity analyses to identify crucial state variables for which extensive encoding procedures are appropriate. The importance of balancing modeling and encoding techniques is emphasized and examples of biases and unconscious modes of judgment are reviewed. A variety of encoding methods are presented and their applicability is discussed. The authors recommend and describe a structured interview process that utilizes a trained interviewer and a number of techniques designed to reduce biases and aid in the quantification of judgment.
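One common encoding step in such interviews is to convert elicited quantiles into a parametric distribution. A minimal sketch, assuming hypothetical elicited 5th/50th/95th percentiles and a lognormal form (neither of which comes from the paper):

```python
import numpy as np
from scipy import stats
from scipy.optimize import least_squares

# Elicited 5th/50th/95th percentiles for an uncertain quantity
# (hypothetical numbers standing in for an interview outcome).
probs = np.array([0.05, 0.50, 0.95])
quantiles = np.array([2.0, 10.0, 40.0])

def residuals(theta):
    """Mismatch between the lognormal's quantiles and the elicited ones."""
    mu, sigma = theta
    return stats.lognorm.ppf(probs, s=sigma, scale=np.exp(mu)) - quantiles

fit = least_squares(residuals, x0=[np.log(10.0), 1.0],
                    bounds=([-10.0, 1e-6], [10.0, 5.0]))
mu, sigma = fit.x
print(f"encoded lognormal: median {np.exp(mu):.2f}, sigma {sigma:.2f}")
print("check 5/50/95%:",
      np.round(stats.lognorm.ppf(probs, s=sigma, scale=np.exp(mu)), 2))
```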
Article
Over the past few decades Integrated Assessment (IA) has emerged as an approach to link knowledge and action in a way that is suitable to accommodate uncertainties, complexities and value diversities of global environmental risks. Responding to the complex nature of the climate problem and to the changing role of climate change in the international climate policy process, the scientific community has started to include stakeholder knowledge and perspectives in their assessments. Participatory Integrated Assessment (PIA) is in its early stage of development. Methodology varies strongly across PIA projects. This paper analyzes four recent IA projects of climate change that included knowledge or perspectives from stakeholders in one way or another. Approaches and methods used turn out to differ in whether stakeholders are involved actively or passively, whether the approach is bottom-up or top-down, and whether the different functions in the IA process are open or closed to stakeholder input. Also, differences can be seen in the degree to which boundaries are pre-set that limit the roles and domains of competencies attributed to each scientific or non-scientific participant (so-called boundary work). The paper discusses pros and cons of the various approaches identified, and outlines heuristics and considerations to assist those who plan, design or fund new IA processes with stakeholder input on which approaches to choose in view of the objectives for stakeholder involvement, the role that the IA plays in the overall risk management process, and considerations regarding boundary work.
Article
The main purpose of public participation in integrated water resources modelling is to improve decision-making by ensuring that decisions are soundly based on shared knowledge, experience and scientific evidence. The present paper describes stakeholder involvement in the modelling process. The point of departure is the guidelines for quality assurance for ‘scientific’ water resources modelling developed under the EU research project HarmoniQuA, which has developed a computer based Modelling Support Tool (MoST) to provide user-friendly guidance and a quality assurance framework that aim at enhancing the credibility of river basin modelling. MoST prescribes interaction, which is a form of participation above consultation but below engagement of stakeholders and the public, in the early phases of the modelling cycle and under review tasks throughout the process. MoST is a flexible tool which supports different types of users and facilitates interaction between modeller, manager and stakeholders. The perspective of using MoST for engagement of stakeholders, i.e. higher-level participation throughout the modelling process as part of integrated water resources management, is evaluated.
Article
The following techniques for uncertainty and sensitivity analysis are briefly summarized: Monte Carlo analysis, differential analysis, response surface methodology, Fourier amplitude sensitivity test, Sobol' variance decomposition, and fast probability integration. Desirable features of Monte Carlo analysis in conjunction with Latin hypercube sampling are described in discussions of the following topics: (i) properties of random, stratified and Latin hypercube sampling, (ii) comparisons of random and Latin hypercube sampling, (iii) operations involving Latin hypercube sampling (i.e. correlation control, reweighting of samples to incorporate changed distributions, replicated sampling to test reproducibility of results), (iv) uncertainty analysis (i.e. cumulative distribution functions, complementary cumulative distribution functions, box plots), (v) sensitivity analysis (i.e. scatterplots, regression analysis, correlation analysis, rank transformations, searches for nonrandom patterns), and (vi) analyses involving stochastic (i.e. aleatory) and subjective (i.e. epistemic) uncertainty.
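The stratification property that makes Latin hypercube sampling attractive for Monte Carlo analysis is easy to demonstrate. The sketch below compares the marginal stratum coverage of plain random sampling and LHS using SciPy's qmc module; the sample size and dimension are arbitrary choices.

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(11)
n, dim = 100, 2

random_sample = rng.random((n, dim))
lhs_sample = qmc.LatinHypercube(d=dim, seed=11).random(n)

# LHS stratifies each marginal: every 1/n slice of each axis is hit exactly
# once, which usually stabilises Monte Carlo estimates for a given n.
def strata_coverage(x):
    """Fraction of the n equal-width strata of axis 0 containing a point."""
    return len(np.unique((x[:, 0] * n).astype(int))) / n

print("strata covered, random:", strata_coverage(random_sample))
print("strata covered, LHS:   ", strata_coverage(lhs_sample))
```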
Article
This article presents the US Geological Survey computer program UCODE, which was developed in collaboration with the US Army Corps of Engineers Waterways Experiment Station and the International Ground Water Modeling Center of the Colorado School of Mines. UCODE performs inverse modeling, posed as a parameter-estimation problem, using nonlinear regression. Any application model or set of models can be used; the only requirement is that they have numerical (ASCII or text only) input and output files and that the numbers in these files have sufficient significant digits. Application models can include preprocessors and postprocessors as well as models related to the processes of interest (physical, chemical and so on), making UCODE extremely powerful for model calibration. Estimated parameters can be defined flexibly with user-specified functions. Observations to be matched in the regression can be any quantity for which a simulated equivalent value can be produced, thus simulated equivalent values are calculated using values that appear in the application model output files and can be manipulated with additive and multiplicative functions, if necessary. Prior, or direct, information on estimated parameters also can be included in the regression. The nonlinear regression problem is solved by minimizing a weighted least-squares objective function with respect to the parameter values using a modified Gauss–Newton method. Sensitivities needed for the method are calculated approximately by forward or central differences and problems and solutions related to this approximation are discussed. Statistics are calculated and printed for use in (1) diagnosing inadequate data or identifying parameters that probably cannot be estimated with the available data, (2) evaluating estimated parameter values, (3) evaluating the model representation of the actual processes and (4) quantifying the uncertainty of model simulated values. UCODE is intended for use on any computer operating system: it consists of algorithms programmed in perl, a freeware language designed for text manipulation and Fortran90, which efficiently performs numerical calculations.
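The core of this kind of inverse modelling, weighted nonlinear least squares with Gauss-Newton-style regression statistics, can be imitated on a toy problem. The exponential "application model" and noise level below are invented, and SciPy's least_squares stands in for UCODE's machinery; the linearised covariance step mirrors the diagnostic statistics the abstract describes.

```python
import numpy as np
from scipy.optimize import least_squares

x_obs = np.linspace(0.0, 10.0, 25)
true_params = np.array([4.0, 0.35])

def app_model(params, x):
    """Toy exponential decline standing in for an application model with
    text-file I/O; UCODE itself wraps arbitrary external models."""
    a, b = params
    return a * np.exp(-b * x)

rng = np.random.default_rng(2)
y_obs = app_model(true_params, x_obs) + rng.normal(0, 0.1, x_obs.size)

fit = least_squares(lambda p: app_model(p, x_obs) - y_obs, x0=[1.0, 0.1])

# Linearised parameter covariance from the Jacobian, as in Gauss-Newton
# regression statistics.
dof = x_obs.size - fit.x.size
s2 = (fit.fun @ fit.fun) / dof
cov = s2 * np.linalg.inv(fit.jac.T @ fit.jac)
print("estimates: ", np.round(fit.x, 3))
print("std errors:", np.round(np.sqrt(np.diag(cov)), 3))
```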
Article
Operational flood management and warning requires the delivery of timely and accurate forecasts. The use of distributed and physically based forecasting models can provide improved streamflow forecasts. However, for operational modelling there is a trade-off between the complexity of the model descriptions necessary to represent the catchment processes, the accuracy and representativeness of the input data available for forecasting and the accuracy required to achieve reliable, operational flood management and warning. Four sources of uncertainty occur in deterministic flow modelling: random or systematic errors in the model inputs or boundary condition data, random or systematic errors in the recorded output data, uncertainty due to sub-optimal parameter values and errors due to incomplete or biased model structure. While many studies have addressed the issues of sub-optimal parameter estimation, parameter uncertainty and model calibration, very few have examined the impact of model structure error and complexity on model performance and modelling uncertainty. In this study a general hydrological framework is described that allows the selection of different model structures within the same modelling tool. Using this tool a systematic investigation is carried out to determine the performance of different model structures for the DMIP study Blue River catchment using a split sample evaluation procedure. This investigation addresses two questions. First, different model structures are expected to perform differently, but is there a trade-off between model complexity and predictive ability? Secondly, how does the magnitude of model structure uncertainty compare to the other sources of uncertainty? The relative performance of different acceptable model structures is evaluated as a representation of structural uncertainty and compared to estimates of the uncertainty arising from measurement uncertainty, parametric uncertainty and the rainfall input. The results show, first, that model performance is strongly dependent on model structure. Distributed routing and, to a lesser extent, distributed rainfall were found to be the dominant processes controlling simulation accuracy in the Blue River basin. Secondly, the sensitivity to variations in acceptable model structure is of the same magnitude as the uncertainties arising from the other evaluated sources. This suggests that for practical hydrological predictions there are important benefits in exploring different model structures as part of the overall modelling approach. Furthermore, model structural uncertainty should be considered in assessing model uncertainties. Finally, our results show that combinations of several model structures can be a means of improving hydrological simulations.
Article
This paper describes progress on HarmoniRiB, a European Commission Framework 5 project. The HarmoniRiB project aims to support the implementation of the EU Water Framework Directive (WFD) by developing concepts and tools for handling uncertainty in data and modelling, and by designing, building and populating a database containing data and associated uncertainties for a number of representative basins. This river basin network aims at becoming a ‘virtual laboratory for modelling studies’, and it will be made available for the scientific community. The data may, e.g. be used for comparison and demonstration of methodologies and models relevant to the WFD.
Article
Quality assurance (QA) is defined as protocols and guidelines to support the proper application of models. In the water management context we classify QA guidelines according to how much focus is put on the dialogue between the modeller and the water manager as: (Type 1) Internal technical guidelines developed and used internally by the modeller's organisation; (Type 2) Public technical guidelines developed in a public consensus building process; and (Type 3) Public interactive guidelines developed as public guidelines to promote and regulate the interaction between the modeller and the water manager throughout the modelling process. State-of-the-art QA practices vary considerably between different modelling domains and countries. It is suggested that these differences can be explained by the scientific maturity of the underlying discipline and differences in modelling markets in terms of volume of jobs outsourced and level of competition. The structure and key aspects of new generic guidelines and a set of electronically based supporting tools that are under development within the HarmoniQuA project are presented. Model credibility can be enhanced by a proper modeller-manager dialogue, rigorous validation tests against independent data, uncertainty assessments, and peer reviews of a model at various stages throughout its development.
Article
Integrated environmental resources management is a purposeful activity with the goal to maintain and improve the state of an environmental resource affected by human activities. In many cases different goals are in conflict, and the notion “integrated” indicates clearly that resources management should be approached from a broad perspective, taking all potential trade-offs and different scales in space and time into account. However, we are still far from putting into practice an integrated resources management that takes full account of the complexity of human-technology-environment systems. The tradition of resources management and of dealing with environmental problems is characterized by a command and control approach. The increasing awareness of the complexity of environmental problems and of human-technology-environment systems has triggered the development of new management approaches. The paper discusses the importance of focusing on the transition to new management paradigms based on the insight that the systems to be managed are complex adaptive systems. It provides arguments for the role of social learning processes and the need to develop methods combining approaches from hard and soft systems analysis. Soft systems analysis focuses on the importance of subjective perceptions and socially constructed reality. Soft systems methods and group model building techniques are quite common in management science, where the prime target of management has always been the social system. Resources management is still quite slow to take up such innovations.
Article
Some scientists argue, with reference to Popper’s scientific philosophical school, that models cannot be verified or validated. Other scientists and many practitioners nevertheless use these terms, but with very different meanings. As a result of an increasing number of examples of model malpractice and mistrust to the credibility of models, several modelling guidelines are being elaborated in recent years with the aim of improving the quality of modelling studies. This gap between the views and the lack of consensus experienced in the scientific community and the strongly perceived need for commonly agreed modelling guidelines is constraining the optimal use and benefits of models. This paper proposes a framework for quality assurance guidelines, including a consistent terminology and a foundation for a methodology bridging the gap between scientific philosophy and pragmatic modelling. A distinction is made between the conceptual model, the model code and the site-specific model. A conceptual model is subject to confirmation or falsification like scientific theories. A model code may be verified within given ranges of applicability and ranges of accuracy, but it can never be universally verified. Similarly, a model may be validated, but only with reference to site-specific applications and to pre-specified performance (accuracy) criteria. Thus, a model’s validity will always be limited in terms of space, time, boundary conditions and types of application. This implies a continuous interaction between manager and modeller in order to establish suitable accuracy criteria and predictions associated with uncertainty analysis.
Article
Although uncertainty about structures of environmental models (conceptual uncertainty) is often acknowledged to be the main source of uncertainty in model predictions, it is rarely considered in environmental modelling. Rather, formal uncertainty analyses have traditionally focused on model parameters and input data as the principal source of uncertainty in model predictions. The traditional approach to model uncertainty analysis, which considers only a single conceptual model, may fail to adequately sample the relevant space of plausible conceptual models. As such, it is prone to modelling bias and underestimation of predictive uncertainty. In this paper we review a range of strategies for assessing structural uncertainties in models. The existing strategies fall into two categories depending on whether field data are available for the predicted variable of interest. To date, most research has focussed on situations where inferences on the accuracy of a model structure can be made directly on the basis of field data. This corresponds to a situation of ‘interpolation’. However, in many cases environmental models are used for ‘extrapolation’; that is, beyond the situation and the field data available for calibration. In the present paper, a framework is presented for assessing the predictive uncertainties of environmental models used for extrapolation. It involves the use of multiple conceptual models, assessment of their pedigree and reflection on the extent to which the sampled models adequately represent the space of plausible models.
Article
In performance assessment studies of radioactive waste disposal in crystalline rocks, one source of uncertainty is the appropriateness of conceptual models of the physical processes contributing to the potential transport of radionuclides. The Alternative Models Project (AMP) evaluates the uncertainty of models of groundwater flow, an uncertainty that arises from alternative conceptualisations of groundwater movement in fractured media. The AMP considers three modelling approaches for simulating flow and advective transport from the waste canisters to the biosphere: Stochastic Continuum, Discrete Fracture Network, and Channel Network. Each approach addresses spatial variability via Monte Carlo simulation, whose realisations are summarised by the statistics of three simplified measures of geosphere performance: travel time, transport resistance (a function of travel distance, flow-wetted surface per volume of rock, and Darcy velocity along a flowpath), and canister flux (Darcy velocity at repository depth). The AMP uses a common reference case defined by a specific model domain, boundary conditions, and layout of a hypothetical repository, with a consistent set of summary statistics to facilitate the comparison of the three approaches. The three modelling approaches predict similar median travel times and median canister fluxes, but dissimilar variability. The three modelling approaches also predict similar values for minimum travel time and maximum canister flux, and predict similar locations for particles exiting the geosphere. The results suggest that the problem specifications (i.e. boundary conditions and gross hydrogeology) constrain the flow modelling, limiting the impact of this conceptual uncertainty on performance assessment.
Article
Quality assurance in model based water management is needed because of some frequently perceived shortcomings, e.g. a lack of mutual understanding between modelling team members, malpractice and a tendency of modellers to oversell model capabilities. Initiatives to support quality assurance focus on single domains and often follow a textbook approach with guidelines and checklists. A modelling process involves a complex set of activities executed by a team. To manage this complex, usually multidisciplinary process, to guide users through it and enhance the reproducibility of modelling work a software product has been developed, aiming at supporting the full modelling process by offering an ontological knowledge base (KB) and a Modelling Support Tool (MoST). The KB consists of a generic part for modelling, but also parts specific for various water management domains, for different types of users and for different levels of modelling complexity. MoST's guiding component filters relevant knowledge from the KB depending on the user profile and needs. Furthermore, MoST supports different types of users by monitoring what they actually do and by producing customized reports for diverse audiences. In this way MoST facilitates co-operation in teams, modelling project audits and re-use of experiences of previous modelling projects.
Article
This paper describes a methodology for calibration and uncertainty estimation of distributed models based on generalized likelihood measures. The GLUE procedure works with multiple sets of parameter values and allows that, within the limitations of a given model structure and errors in boundary conditions and field observations, different sets of values may be equally likely as simulators of a catchment. Procedures for incorporating different types of observations into the calibration, Bayesian updating of likelihood values and evaluating the value of additional observations to the calibration process are described. The procedure is computationally intensive but has been implemented on a local parallel processing computer. The methodology is illustrated by an application of the Institute of Hydrology Distributed Model to data from the Gwy experimental catchment at Plynlimon, mid-Wales.
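GLUE is straightforward to prototype: sample many parameter sets, score them with an informal likelihood, keep the "behavioural" ones, and derive prediction bounds from that ensemble. In the sketch below, the bucket model, the priors and the 0.7 Nash-Sutcliffe cutoff are all illustrative assumptions, not the paper's distributed model or likelihood choices.

```python
import numpy as np

rng = np.random.default_rng(8)
rain = rng.gamma(2.0, 4.0, 150)  # synthetic rainfall forcing

def bucket(params, rain):
    """Toy rainfall-runoff model (the original applies GLUE to the
    Institute of Hydrology Distributed Model)."""
    c, k = params
    store, q = 0.0, np.empty_like(rain)
    for t, r in enumerate(rain):
        store += c * r
        q[t] = k * store
        store -= q[t]
    return q

obs = bucket([0.55, 0.25], rain) + rng.normal(0, 0.3, rain.size)

# 1. Sample many parameter sets from broad priors.
n_sets = 5000
params = np.column_stack([rng.uniform(0.0, 1.0, n_sets),
                          rng.uniform(0.01, 1.0, n_sets)])
sims = np.array([bucket(p, rain) for p in params])

# 2. Informal likelihood (Nash-Sutcliffe efficiency) and behavioural cutoff.
nse = 1 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()
behavioural = nse > 0.7  # assumed threshold

# 3. Prediction bounds from the behavioural ensemble (GLUE proper weights
# each simulation by its likelihood; plain percentiles are used for brevity).
lower = np.percentile(sims[behavioural], 5, axis=0)
upper = np.percentile(sims[behavioural], 95, axis=0)
print(f"{behavioural.sum()} behavioural sets; mean band width "
      f"{np.mean(upper - lower):.2f}")
```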
Article
321 pages, figures, bibliography. Scenarios deals with how managers can set out and negotiate a successful course into the future for the organization in the face of significant uncertainty. Uncertainties about the future are often felt to be uncomfortable and thus "swept under the table" by collapsing them into a single line forecast. This is tantamount to abdication of managerial responsibility. At worst it means a wild jump in the dark. Facing up to uncertainty changes the perspective on the future completely. The secret of success moves from "finding the best strategy" to "finding the best process". Thinking about scenarios - the different plausible future environments that can be imagined - is the key to thinking the process through and to keep thinking about it as the plans for the future unfold. Scenario planning is dynamic. The focus of attention needs to be on the ongoing "strategic conversation", penetrating both the formal and informal exchange of views through which the strategic understanding develops - and actions result. Scenarios deals first with the principles of organizational learning and then moves on to describe practical and down-to-earth ways in which the organization can develop its skill in conducting an ongoing scenario-based strategy process. The methods described are based on many years of practical experience of managers in both large and small organizations; and they are grounded in solid logic.