Thesis · PDF Available

Uncertainty Propagation Through Large Nonlinear Models

Authors:
  • BlueFox Data

Abstract

Uncertainty analysis in computer models has seen a rise in interest in recent years as a result of the increased complexity of (and dependence on) computer models in the design process. A major problem, however, is that the computational cost of propagating uncertainty through large nonlinear models can be prohibitive using conventional methods (such as Monte Carlo methods). A powerful solution to this problem is to use an emulator, which is a mathematical representation of the model built from a small set of model runs at specified points in input space. Such emulators are massively cheaper to run and can be used to mimic the "true" model, with the result that uncertainty analysis and sensitivity analysis can be performed for a greatly reduced computational cost. The work here investigates the use of an emulator known as a Gaussian process (GP), which is an advanced probabilistic form of regression, hitherto relatively unknown in engineering. The GP is used to perform uncertainty and sensitivity analysis on nonlinear finite element models of a human heart valve and a novel airship design. Aside from results specific to these models, it is evident that a limitation of the GP is that non-smooth model responses cannot be accurately represented. Consequently, an extension to the GP is investigated, which uses a classification and regression tree to partition the input space, such that non-smooth responses, including bifurcations, can be modelled at region boundaries. This new emulator is applied to a simple nonlinear problem, then to a bifurcating finite element model. The method is found to be successful, and it actually reduces the computational cost, although it is noted that bifurcations that are not axis-aligned cannot realistically be dealt with.
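To make the emulation idea concrete, the following is a minimal sketch of GP-based uncertainty propagation in Python, assuming scikit-learn as the GP library and a toy stand-in for the expensive model; the library choice, function names and values are illustrative assumptions, not prescribed by the thesis.

# Minimal sketch of GP emulation for uncertainty propagation.
# The "expensive model" below is a cheap stand-in for a nonlinear FE code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_model(x):
    # placeholder for a costly simulation, e.g. a finite element run
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(0)

# small design of training runs at specified points in input space
n_train = 30
X_train = rng.uniform(0.0, 1.0, size=(n_train, 2))
y_train = expensive_model(X_train)

gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[0.2, 0.2]),
    normalize_y=True,
)
gp.fit(X_train, y_train)

# Monte Carlo uncertainty propagation through the cheap emulator:
# thousands of emulator calls replace thousands of model runs.
X_mc = rng.normal(loc=0.5, scale=0.1, size=(100_000, 2))
y_mc, y_std = gp.predict(X_mc, return_std=True)
print("output mean:", y_mc.mean(), "output variance:", y_mc.var())

Note that the predictive standard deviation returned by the GP quantifies the emulator's own uncertainty about the model, which is one of the properties that makes the GP attractive here.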
... Similarly, the prior specification of w and σ² from (7) must be conditioned on the training data to give posterior distributions of the hyperparameters. This step is somewhat long-winded and is omitted here, but details can be found in [30]. ...
... The full steps of this integration are lengthy and will be omitted here, but can be found in [30]. The marginalisation finally results in a posterior Student's t-distribution over f(x), and a Student's t-process over the whole function, which is given by, ...
... The integrals are, however, presented generally here, and are applicable to any pdf of inputs. Analytical derivation of these integrals for the case of a multivariate normal p(x) can be found in [30]. ...
Article
A major problem in uncertainty and sensitivity analysis is that the computational cost of propagating probabilistic uncertainty through large nonlinear models can be prohibitive when using conventional methods (such as Monte Carlo methods). A powerful solution to this problem is to use an emulator, which is a mathematical representation of the model built from a small set of model runs at specified points in input space. Such emulators are massively cheaper to run and can be used to mimic the “true” model, with the result that uncertainty analysis and sensitivity analysis can be performed for a greatly reduced computational cost. The work here investigates the use of an emulator known as a Gaussian process (GP), which is an advanced probabilistic form of regression. The GP is particularly suited to uncertainty analysis since it is able to emulate a wide class of models, and accounts for its own emulation uncertainty. Additionally, uncertainty and sensitivity measures can be estimated analytically, given certain assumptions. The GP approach is explained in detail here, and a case study of a finite element model of an airship is used to demonstrate the method. It is concluded that the GP is a very attractive way of performing uncertainty and sensitivity analysis on large models, provided that the dimensionality is not too high.
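The article estimates uncertainty and sensitivity measures analytically from the GP; a cruder but instructive alternative is to Monte Carlo sample the cheap emulator directly. The sketch below implements the standard pick-freeze estimators of Sobol' indices in plain NumPy, with emulator standing in for a fitted GP predictive mean; this sampling route is an assumption for illustration, not the paper's analytical derivation.

# Pick-freeze (Sobol'/Saltelli) sensitivity estimates on a cheap emulator.
import numpy as np

def sobol_indices(emulator, d, n=100_000, seed=0):
    # emulator: vectorised callable mapping (n, d) arrays to (n,) outputs,
    # e.g. the predictive mean of a trained Gaussian process
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, d))
    B = rng.uniform(size=(n, d))
    yA, yB = emulator(A), emulator(B)
    var = np.var(np.concatenate([yA, yB]))
    S, ST = np.empty(d), np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                            # resample only input i
        yABi = emulator(ABi)
        S[i] = np.mean(yB * (yABi - yA)) / var         # first-order, Saltelli (2010)
        ST[i] = 0.5 * np.mean((yA - yABi) ** 2) / var  # total effect, Jansen (1999)
    return S, ST

# usage with a toy surrogate in place of a fitted GP mean
S, ST = sobol_indices(lambda X: np.sin(3 * X[:, 0]) + 0.5 * X[:, 1], d=2)
print(S, ST)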
... The full steps of this integration are lengthy, but can be found in [28]. The marginalisation finally results in a posterior Student's t-distribution over f(x), and a Student's t-process over the whole function f(x), which is given by ...
... This leads to a direct specification of the likelihood function p(y|X,h,T). Details will not be given here but can be found in [28,32]. ...
... This is a large amount of data for two dimensions, so the effect of reducing the number of training points is also investigated in Section 5.3 as a matter of interest. Full details of the model (including the input deck used for the software, LS-Dyna) can be found in [28]. Fig. 10 shows the movement of the leaflet over 12 ms for a mid-range pressure load. ...
Article
Full-text available
Sensitivity analysis allows one to investigate how changes in input parameters to a system affect the output. When computational expense is a concern, metamodels such as Gaussian processes can offer considerable computational savings over Monte Carlo methods, albeit at the expense of introducing a data modelling problem. In particular, Gaussian processes assume a smooth, non-bifurcating response surface. This work highlights a recent extension to Gaussian processes which uses a decision tree to partition the input space into homogeneous regions, and then fits separate Gaussian processes to each region. In this way, bifurcations can be modelled at region boundaries and different regions can have different covariance properties. To test this method, both the treed and standard methods were applied to the bifurcating response of a Duffing oscillator and a bifurcating FE model of a heart valve. It was found that the treed Gaussian process provides a practical way of performing uncertainty and sensitivity analysis on large, potentially-bifurcating models, which cannot be dealt with by using a single GP, although an open problem remains how to manage bifurcation boundaries that are not parallel to coordinate axes.
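The following is an illustrative sketch of the treed-GP idea in Python, assuming scikit-learn: a decision tree partitions the input space into regions, and an independent GP is fitted in each. This is a simplification of the fully Bayesian treed GP used in the paper, and the data and names are invented for the example.

# Treed-GP sketch: tree partition of input space, one GP per region.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))
# a response with a jump (bifurcation-like discontinuity) at x = 0.2
y = np.where(X[:, 0] < 0.2, np.sin(5 * X[:, 0]), 2.0 + np.sin(5 * X[:, 0]))

tree = DecisionTreeRegressor(max_leaf_nodes=4, min_samples_leaf=20).fit(X, y)
leaves = tree.apply(X)                      # region label for each training point
gps = {leaf: GaussianProcessRegressor(kernel=RBF(0.3), normalize_y=True)
             .fit(X[leaves == leaf], y[leaves == leaf])
       for leaf in np.unique(leaves)}

def predict(X_new):
    out = np.empty(len(X_new))
    for leaf, gp in gps.items():
        mask = tree.apply(X_new) == leaf    # route each point to its region's GP
        if mask.any():
            out[mask] = gp.predict(X_new[mask])
    return out

print(predict(np.array([[0.0], [0.5]])))

Because the tree splits on one coordinate at a time, region boundaries are necessarily axis-aligned, which is exactly why bifurcation boundaries that are not parallel to the coordinate axes remain an open problem.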
... The concept is, therefore, to remove both σ_f² and β from the optimisation problem by marginalising over them. The analytical formulations were summarised in [22], and a much more detailed discussion is available in [31]. Although the expressions are closed-form, they become quite complex and computationally expensive to solve. ...
Article
The process of manufacturing pultruded FRP (Fiber Reinforced Polymer) profiles involves unavoidable imperfections that affect their structural performance. This is even more relevant for the stability of axially loaded slender elements, due to the importance of imperfections and notches in initiating the buckling phenomenon. Thus, they become a predominant factor in the design of lightweight FRP beam-like structures. A Bayesian approach is proposed to estimate the presence and location of manufacturing imperfections in pultruded GFRP (Glass Fiber Reinforced Polymer) profiles. Specifically, the Treed Gaussian Process (TGP) procedure is applied. This approach combines regression Gaussian Processes (GP) and Bayesian-based Recursive Partitioning. The experimental and numerical modal shapes of a wide-flange pultruded profile were investigated. The experimental data were compared with the numerical results of several Finite Element Models (FEM) characterised by different crack sizes.
... The process of estimating θ (i.e. constructing θ̂) from observed data is referred to as training and is very well described in [10] from a classical perspective or in [26,27] from a Bayesian standpoint. Once the emulator is trained, its posterior distribution can be sampled many times at an affordable cost to provide data for various analyses. ...
Article
Full-text available
This paper investigates the influence of the uncertainty in different micromechanical properties on the variability of the macroscopic response of cross-laminated timber plates, by means of a probabilistic sensitivity analysis. Cross-laminated timber plates can be modelled using a multiscale finite element approach which although suitable, suffers from high computational cost. Investigating parametric importance can incur considerable time penalty since conventional sensitivity analysis relies on a large number of code evaluations to produce accurate results. In order to address this issue, we build a statistical approximation to the code output and use it to perform sensitivity analysis. We investigate the effect of a collection of parameters on the density and Young's moduli of wood. Additionally, the influence on the response of cross-laminated timber plates subject to bending, in-plane shear and compression loads is investigated due to its relevance within the engineering community. The presented results provide a practical insight into the importance of each micromechanical parameter, which allows research effort to be focused on important wood properties.
... (3)-(5) complete the prior specification of the problem; the posterior distribution of the outputs is then found by conditioning the prior distribution on the training data y (the vector of output points corresponding to the input training set), and integrating out (or marginalising over) the hyperparameters σ² and β. The calculation is straightforward but very time consuming; a detailed walkthrough can be found in [20]. The integrals involved are usually all Gaussian, and although the expressions are almost always very complicated, the results can be given in closed form. ...
Article
Full-text available
Structural Health Monitoring (SHM) is the engineering discipline of diagnosing damage and estimating safe remaining life for structures and systems. Often, SHM is accomplished by detecting changes in measured quantities from the structure of interest; if there are no competing explanations for the changes, one infers that they are the result of damage. If the structure of interest is subject to changes in its environmental or operational conditions, one must understand the effects of these changes in order that one does not falsely claim that damage has occurred when changes in measured quantities are observed. This problem – the problem of confounding influences – is particularly pressing for civil infrastructure where the given structure is usually openly exposed to the weather and may be subject to strongly varying operational conditions. One approach to understanding confounding influences is to construct a data-based response surface model that can represent measurement variations as a function of environmental and operational variables. The models can then be used to remove environmental and operational variations so that change detection algorithms signal the occurrence of damage alone. The current paper is concerned with such response surface models in the case of SHM of bridges. In particular, classes of response surface models that can switch discontinuously between regimes are discussed.
... The main innovation in terms of the GP algorithm is to integrate out (or marginalise over) the hyperparameters σ_f² and β, thus removing them from the optimisation problem. The calculation is technically straightforward but very time consuming; a detailed walkthrough can be found in [11]. The integrals involved are usually all Gaussian, and although the expressions are almost always very complicated, the results can be given in closed form. ...
Chapter
For some years, there has been interest in locating cracks in beams by detecting singularities in mode shape curvatures. Most of the work in the past has depended on the estimation of spatial derivatives (smoothed or otherwise) of the experimentally measured mode shape. This problem is made difficult by the fact that numerical differentiation is notorious for amplifying measurement noise, coupled with the fact that very precise estimates of mode shapes are difficult to obtain. One recent approach, introduced by one of the authors, circumvented the noise issue via a method which did not need numerical differentiation. Briefly, the method applied a Gaussian process regression to the data, using a covariance function that could switch between spatial regions; the switch point—which indicated the crack position—could be determined by a maximum likelihood algorithm. The object of the current paper is to present an alternative approach which uses Treed Gaussian Processes (TGPs). The idea is that separate Gaussian Processes, with standard covariance functions, can be fitted over different spatial regions of the beam, with any switching points learned as part of a decision tree structure. The paper also revisits the idea of using differentiated mode shapes, on the premise that the Gaussian process can 'see through' the noise and perceive the underlying structure.
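A rough sketch of the switch-point idea, assuming scikit-learn GPs and synthetic mode-shape data: fit independent GPs on either side of a candidate crack position and pick the position that maximises the summed log marginal likelihood. The chapter's TGP learns such splits within a decision tree instead; the data and kernel settings below are invented for illustration.

# Maximum-likelihood switch-point search for a crack position along a beam.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 120)[:, None]          # positions along the beam
crack = 0.6
# synthetic mode shape with a slope discontinuity at the crack
mode = np.sin(np.pi * x[:, 0]) + 0.3 * np.maximum(x[:, 0] - crack, 0.0)
y = mode + 0.005 * rng.standard_normal(len(x))

def split_log_likelihood(s):
    # fit one GP on each side of the candidate switch point s
    total = 0.0
    for mask in (x[:, 0] < s, x[:, 0] >= s):
        gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-4,
                                      normalize_y=True).fit(x[mask], y[mask])
        total += gp.log_marginal_likelihood_value_
    return total

candidates = np.linspace(0.2, 0.8, 25)
best = candidates[np.argmax([split_log_likelihood(s) for s in candidates])]
print("estimated crack position:", best)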
Article
Structural health monitoring (SHM) is the discipline of diagnosing damage and estimating safe remaining life for structures and systems. Often SHM is accomplished by detecting changes in measured quantities from the structure of interest; if there are no competing explanations for the changes, one infers that they are the result of damage. If the structure of interest is subject to changes in its environmental or operational conditions, one must understand the effects of these changes in order that one does not falsely claim that damage has occurred when one observes measurement changes. This problem – the problem of confounding influences – is particularly pressing for civil infrastructure where the given structure is usually openly exposed to the weather and may be subject to strongly varying operational conditions. One approach to understanding confounding influences is to construct a data-based response surface model that can represent measurement variations as a function of environmental and operational variables. The models can then be used to remove environmental and operational variations so that change detection algorithms signal the occurrence of damage alone. The current chapter is concerned with such response surface models in the case of SHM of bridges. In particular, classes of response surface models that can switch discontinuously between regimes are discussed.
Conference Paper
Full-text available
This paper forms the second in a short sequence considering the system identification problem for hysteretic systems. The basic model for parameter estimation is assumed to be the Bouc-Wen model, as this has proved particularly versatile in the past. Previous work on the Bouc-Wen system has shown that the system response is more sensitive to some parameters than others and that the errors in the associated parameter estimates vary as a consequence. The objective of the current paper is to demonstrate the use of a principled Bayesian approach to parameter sensitivity analysis for the Bouc-Wen system. The approach is based on Gaussian process emulation and is encoded in the software package Gem-SA. The paper considers a five-parameter Bouc-Wen model, and the sensitivity analysis is based on data generated by computer simulation of a single-degree-of-freedom system.
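For orientation, a minimal Bouc-Wen simulation of the kind that would generate training data for the GP emulator in Gem-SA. The additive restoring-force form and the parameter values here are illustrative assumptions, not those of the paper (formulations often partition the restoring force as k(αx + (1−α)z) instead).

# Single-degree-of-freedom Bouc-Wen hysteretic system, simulated with SciPy.
import numpy as np
from scipy.integrate import solve_ivp

m, c, k = 1.0, 0.1, 10.0                  # mass, damping, stiffness
A, beta, gamma, n = 1.0, 2.0, 1.0, 1.5    # hysteresis parameters (illustrative)

def force(t):
    return np.sin(2.0 * t)                # harmonic excitation

def bouc_wen(t, state):
    x, v, z = state
    # Bouc-Wen evolution of the hysteretic variable z
    dz = A * v - beta * abs(v) * abs(z) ** (n - 1) * z - gamma * v * abs(z) ** n
    dv = (force(t) - c * v - k * x - z) / m
    return [v, dv, dz]

sol = solve_ivp(bouc_wen, (0.0, 20.0), [0.0, 0.0, 0.0], max_step=0.01)
x_max = np.max(np.abs(sol.y[0]))          # a scalar output for sensitivity analysis
print("peak displacement:", x_max)

Running this model over a design of parameter values (varying, say, c, k, beta, gamma and n) yields the training set from which the emulator-based sensitivity analysis proceeds.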
Conference Paper
Full-text available
A Flight Parameter Sensor Simulation (FPSS) model has been developed to assess the conservatism of the landing gear loads calculated using a hard landing analysis process. Conservatism exists due to factors of safety that are added to the hard landing analysis process to account for uncertainty in the measurement of certain flight parameters. The FPSS model consists of: (1) an aircraft and landing gear dynamic model to determine the 'actual' landing gear loads during a hard landing; (2) an aircraft sensor and data acquisition model to represent the aircraft sensors and flight data recorder (FDR) systems to investigate the effect of signal processing on the flight parameters; (3) an automated hard landing analysis process, representative of that used by airframe and equipment manufacturers, to determine the 'simulated' landing gear loads. Using a technique of Bayesian sensitivity analysis, a number of flight parameters are varied in the FPSS model to gain an understanding of the sensitivity of the difference between 'actual' and 'simulated' loads to the individual flight parameters. A demonstration of this technique is provided with the following flight parameters: aircraft pitch angle, ground speed, vertical descent velocity and tyre-runway friction coefficient. This demonstration shows that the friction coefficient and vertical descent velocity and their interactions have the greatest contribution to the difference between the 'actual' and 'simulated' spin-up drag dynamic response loads.
Article
Two types of sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies. These plans are shown to be improvements over simple random sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
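A minimal sketch of one such plan, Latin hypercube sampling, in plain NumPy: each dimension is divided into n equal-probability strata and each stratum is sampled exactly once, which typically reduces the variance of the sample-mean estimator relative to simple random sampling.

# Latin hypercube sampling versus simple random sampling.
import numpy as np

def latin_hypercube(n, d, rng):
    samples = np.empty((n, d))
    for j in range(d):
        # one point per stratum [i/n, (i+1)/n), visited in a random order
        perm = rng.permutation(n)
        samples[:, j] = (perm + rng.random(n)) / n
    return samples

rng = np.random.default_rng(3)
X = latin_hypercube(1000, 2, rng)
f = lambda x: np.sin(3 * x[:, 0]) + x[:, 1] ** 2   # toy integrand
print("LHS estimate:", f(X).mean(),
      "SRS estimate:", f(rng.random((1000, 2))).mean())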