Figure 6 - uploaded by Francois M. Hemez
3D model of the LANL drop test.

Source publication
Article
Full-text available
This paper gives an overall presentation of a research project pursued at Los Alamos National Laboratory for the validation of numerical simulations for engineering structural dynamics. An impact experiment used to develop and test the model validation methodology is presented. Design of experiments techniques are implemented to perform parametric...

Citations

... As shown in Figure 1, simulation model validation is conducted at critical and sensitive locations in the domain, identified with the design of experiments technique (green area). This technique supports validation within the domain of interest and helps establish the domain of validity of the simulation model [12][13][14][15][16][17][18]. If the outcome of validation is satisfactory, predictions are then made, when appropriate, at locations other than the validation locations. ...
Article
Full-text available
Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility—verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to all technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.
... As a consequence of uncertainties, highly nonlinear structural responses are not predictable and significant mismatches are observed between simulations and actual real-world experiments. The issue becomes even more pronounced for nonlinear dynamic problems as the real-world structure and the corresponding model (Oberkampf et al. 2000;Hemez et al. 2001;Marshall and Nurick 1998) are very sensitive to their own disjoint sets of uncertainties. From a computational design point of view, it is therefore very difficult to enforce a dynamic behavior. ...
Article
Full-text available
A methodology to enforce a given structural dynamic behavior during an impact while accounting for uncertainty is presented. The approach is based on locating structural fuses that weaken the structure locally and help enforce a deformation mode. The problem of enforcing the crushing of a tube impacting a rigid wall is chosen. In order to find the positions of the fuses, the method identifies distinct structural dynamic behaviors using designs of experiments and clustering techniques. The changes in behavior are studied with respect to variations of the fuse positions and random parameters, such as the thickness. Based on the probabilistic distributions, a measure of the likelihood of occurrence of global buckling is defined. The positions of the fuses are defined using an optimization problem in terms of the likelihood of global buckling and the amount of absorbed energy in the tube. A first formulation of the problem considers variability in the tube’s thickness only. A second formulation also accounts for uncertainties in the positions of the fuses.
... Failure probabilities of structures subject to short duration non-stationary earthquake loading have been estimated using stochastic finite elements and quadratic response surfaces [17]. Model updating for a non-linear impact problem has been studied with response surfaces [15]. Metamodeling error estimation was studied in the context of a single degree of freedom oscillator impacting a nonlinear material [36]. ...
Article
This study examines the performance and reliability of passive and semi-active damping in equipment isolation systems for earthquake protection. Performance and reliability measures are the peak accelerations sustained by the equipment and the peak displacements of the isolation system. A new hybridization of two previously studied semi-active control rules regulates the damping in the semi-active isolation system. A parameter study identifies suitable values for the stiffness and damping parameters of the passive isolation system and feedback control constants for the semi-active equipment isolation, and compares the performance for a set of historical earthquakes and for a set of different building models. The reliability of passive and semi-active equipment isolation systems is assessed separately for four historic earthquakes and with regard to uncertainties in the isolation system. The reliability assessment makes use of a polynomial metamodel of the responses as a function of the relevant random variables. Results illustrate the performance limitations of passive isolation systems in protecting shock- and vibration-sensitive equipment from near-fault ground motions and show the improvements associated with semi-active equipment isolation in terms of the mean and variability of the peak responses. Correlations between the isolation system variables and the responses provide guidance for improved behavior.
... To date, structural dynamics researchers have utilized response surface metamodels in structural dynamics for sensitivity analyses [5], uncertainty quantification [6], and model updating [7]. However, use of low order models will have an increasingly important part to play in applications such as structural health monitoring and damage prognosis, where damage identification is considered a fundamental first step [8]. ...
Article
Full-text available
Metamodels have been used with success in many areas of engineering for decades but only recently in the field of structural dynamics. A metamodel is a fast-running surrogate that is typically used to aid an analyst or test engineer in the fast and efficient exploration of the design space. Response surface metamodels are used in this work to perform parameter identification of a simple five degree of freedom system, motivated by their low training requirements and ease of use. In structural dynamics applications, response surface metamodels have been utilized in a forward sense, for activities such as sensitivity analysis or uncertainty quantification. In this study a polynomial response surface model is developed, relating system parameters to measurable output features. Once this relationship is established, the response surface is used in an inverse sense to identify system parameters from measured output features. A design of experiments is utilized to choose points, representing a fraction of the full design space of interest, for fitting the response surface metamodel. Two parameters commonly used to characterize damage in a structural system, stiffness and damping, are identified. Changes are first identified and located successfully in a linear 5DOF system; parameter identification is then attempted for a nonlinear 5DOF system, with limited success. This work demonstrates that using response surface metamodels in an inverse sense shows promise for system parameter identification in both linear and weakly nonlinear systems and that the method has potential for damage identification applications.
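The inverse use of a response surface described above can be sketched in a few lines. This is a hedged illustration, not the paper's actual 5DOF study: the single-DOF system, the quadratic polynomial basis, the parameter ranges, and the grid-search inversion are all assumptions made for the example.

```python
import numpy as np

# Hypothetical 1-DOF mass-spring-damper: the "measurable output features"
# are the damped natural frequency and damping ratio, as functions of the
# stiffness k and damping c to be identified.
m = 1.0
def features(k, c):
    wn = np.sqrt(k / m)
    zeta = c / (2.0 * np.sqrt(k * m))
    return np.array([wn * np.sqrt(1.0 - zeta**2), zeta])

# Design of experiments: a full-factorial grid over the (k, c) design space.
ks = np.linspace(50.0, 150.0, 6)
cs = np.linspace(0.5, 3.0, 6)
X = np.array([(k, c) for k in ks for c in cs])
Y = np.array([features(k, c) for k, c in X])

# Forward step: fit a quadratic response surface feature = beta . basis(k, c).
def basis(k, c):
    return np.array([1.0, k, c, k * c, k**2, c**2])
A = np.array([basis(k, c) for k, c in X])
beta, *_ = np.linalg.lstsq(A, Y, rcond=None)

# Inverse step: given measured features, find the (k, c) that minimizes the
# surface-to-measurement residual over a dense candidate grid.
y_meas = features(100.0, 2.0)          # pretend these were measured in a test
kk, cc = np.meshgrid(np.linspace(50, 150, 201), np.linspace(0.5, 3.0, 201))
cand = np.stack([kk.ravel(), cc.ravel()], axis=1)
pred = np.array([basis(k, c) for k, c in cand]) @ beta
best = cand[np.argmin(np.sum((pred - y_meas)**2, axis=1))]
print(best)   # close to (100.0, 2.0)
```

A gradient-based optimizer could replace the grid search; the grid keeps the inverse step transparent.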
... The numerical simulation of the propagation of an impact wave through a layer of non-linear, crushable foam material is compared to physical measurements [8]. Details about the experimental set-up, finite element modeling, and sources of uncertainty can be obtained from Reference [9]. In the following, the main source of uncertainty analyzed in this work is briefly described, and analysis results are discussed to illustrate the trade-offs between fidelity-to-data, robustness-to-uncertainty, and prediction looseness. ...
Article
Full-text available
In computational physics and engineering, numerical models are developed to predict the behavior of a system whose response cannot be measured experimentally. A key aspect of science-based predictive modeling is the assessment of prediction credibility. Credibility, which is demonstrated through the activities of Verification and Validation, quantifies the extent to which simulation results can be analyzed with confidence to represent the phenomenon of interest with accuracy consistent with the intended use of the model. This paper argues that assessing the credibility of a mathematical or numerical model must combine three components: 1) Improving the fidelity to test data; 2) Studying the robustness of prediction-based decisions to variability, uncertainty, and lack-of-knowledge; and 3) Establishing the expected prediction accuracy of the models in situations where test measurements are not available. A recently published Theorem that demonstrates the irrevocable trade-offs between "The Good, The Bad, and The Ugly," or robustness-to-uncertainty, fidelity-to-data, and confidence-in-prediction, is summarized. The main implication is that high-fidelity models cannot, at the same time, be made robust to uncertainty and lack-of-knowledge. Similarly, equally robust models do not provide consistent predictions, hence reducing confidence-in-prediction. The conclusion of the theoretical investigation is that, in assessing the predictive accuracy of numerical models, one should never focus on a single aspect. Instead, the trade-offs between fidelity-to-data, robustness-to-uncertainty, and confidence-in-prediction should be explored. The discussion is illustrated with an engineering application that consists in modeling and predicting the propagation of an impact through a layer of hyper-foam material. A novel definition of sensitivity coefficients is suggested from the slopes of robustness-to-uncertainty curves. 
Such a definition makes it possible to quantify the sensitivity of a performance metric to arbitrary uncertainty, whether it is represented with probability laws or any other information theory. This publication has been approved for unlimited, public release on November 18, 2003 (LA-UR-03-8492, Unclassified).
... It is borrowed from a study documented in Reference [6], where a simulation of the propagation of an impact through a layer of non-linear, crushable foam is compared to physical measurements. The experimental setup, finite element modeling, and sources of uncertainty are discussed in Reference [7]. ...
Article
Full-text available
A key aspect of science-based predictive modeling is to assess the credibility of predictions. To gain confidence in predictions, one should demonstrate consistency between physical observations, expert judgments, and the predictions of equally credible models. This suggests a relationship between fidelity-to-data, robustness-to-uncertainty, and confidence in prediction. The purpose of this work is to explore the interaction between these three aspects of predictive modeling. The concepts of fidelity, robustness, and confidence are first defined in a broad sense. A Theorem is then proven that establishes that these three objectives are antagonistic. This means that high-fidelity models cannot, at the same time, be made robust to uncertainty and lack-of-knowledge. Similarly, equally robust models cannot provide consistent predictions, hence reducing confidence. The conclusion of this theoretical investigation is that, in assessing the predictive accuracy of numerical models, one should never focus on a single aspect only. Instead, the trade-offs between fidelity-to-data, robustness-to-uncertainty, and confidence in prediction should be explored.
... The discussion is illustrated using an engineering application currently dealt with at Los Alamos National Laboratory. References [2][3][4] offer additional details regarding the particular analysis techniques and results to which the discussion refers. ...
... In comparison, integrating equations (2)-(3) with the same absolute accuracy requires Δt ~ O(ε) at most. Combining equations (2) and (3) effectively defines an approximated system in which the original differential equations (1) are solved "on average" rather than exactly. ...
Article
Full-text available
INTRODUCTION Today's computational resources make it possible, more than ever before, to model and analyze complex phenomena characterized by complex geometries and boundary conditions, multi-physics, nonlinear effects, and variability. An example of such a resource is the U.S. Department of Energy's Accelerated Strategic Computing Initiative (ASCI), which has developed several platforms able to sustain over 3 × 10^12 operations per second (3 TeraOps) by distributing computations over arrays of more than 6,000 processors. The next generation of ASCI computers is expected to reach 30 TeraOps by the year 2004, with the goal of approaching 100 TeraOps a few years later. Examples of problems requiring ... This publication is unclassified; approved for unlimited, public release on May 14, 2001 (LA-UR-01-2492). Technical Staff Members: hemez@lanl.gov, doebling@lanl.gov.
... Hemez, Wilson and Doebling [17] ... Both of these methods are compared to the more traditional general sensitivity analysis, which examines how much each output feature changes when each input parameter is changed one at a time (a simple, linear finite-differencing method). All methods resulted in the same six parameters being screened out of the original set of twelve thought to be important. ...
Article
Full-text available
The need for low order models capable of performing damage identification has become apparent in many structural dynamics applications where structural health monitoring and damage prognosis programs are implemented. These programs require that damage identification routines have low computational requirements and be reliable with some quantifiable degree of accuracy. Response surface metamodels (RSMs) are proposed to fill this need. Popular in the fields of chemical and industrial engineering, RSMs have only recently been applied in the field of structural dynamics and to date there have been no studies which fully demonstrate the potential of these methods. In this thesis, several RSMs are developed in order to demonstrate the potential of the methodology. They are shown to be robust to noise (experimental variability) and succeed in solving the damage identification problem, both locating and quantifying damage with some degree of accuracy, for both linear and nonlinear systems. A very important characteristic of the RSMs developed in this thesis is that they require very little information about the system in order to generate relationships between damage indicators and measurable system responses for both linear and nonlinear structures. As such, the potential of these methods for damage identification has been demonstrated and it is recommended that these methods be developed further.
... The above model is fit to the data shown in Fig. 3 using the general approach described earlier to minimize the function given in Eq. (9). The Jacobian (sensitivity) matrix is estimated at each optimization iteration by the straightforward method of finite differences. ...
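The finite-difference Jacobian inside an iterative fit, as the excerpt describes, can be sketched with a Gauss-Newton loop. The exponential model, the synthetic data, and the starting guess are hypothetical; only the structure (residual, forward-difference Jacobian estimated at each iteration, least-squares step) mirrors the description.

```python
import numpy as np

# Synthetic, noise-free "data" from a hypothetical model y = a * exp(-b * t).
t = np.linspace(0.0, 2.0, 21)
y = 3.0 * np.exp(-1.5 * t)

def model(p):
    a, b = p
    return a * np.exp(-b * t)

def fd_jacobian(p, h=1e-6):
    # Forward finite differences: perturb one parameter at a time.
    J = np.empty((t.size, p.size))
    f0 = model(p)
    for j in range(p.size):
        dp = np.zeros_like(p)
        dp[j] = h
        J[:, j] = (model(p + dp) - f0) / h
    return J

p = np.array([2.0, 1.0])               # initial guess for (a, b)
for _ in range(50):
    r = y - model(p)                   # residual to be minimized
    J = fd_jacobian(p)                 # Jacobian re-estimated each iteration
    step, *_ = np.linalg.lstsq(J, r, rcond=None)
    p = p + step                       # Gauss-Newton update
print(p)   # converges to approximately [3.0, 1.5]
```

In a real application a damped (Levenberg-Marquardt) step would guard against divergence far from the solution.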
... For example, to include the uncertainty in the impact velocity, it could be treated as an additional random variable, with its uncertainty described by its own pdf. By drawing random samples of the velocity from its pdf and including these random values in the simulation process, the effects of the uncertain velocity can be propagated to the simulation outputs. Other uncertainties in the experimental conditions can be treated in the same way. This approach to assessing the consequences of uncertainties in the experimental conditions provides a good basis for designing validation experiments, for example, as discussed in Ref. [9] for structural dynamics applications. ...
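The sampling idea in the excerpt can be illustrated with a minimal Monte Carlo sketch. The normal pdf assumed for the velocity and the closed-form "peak response" standing in for the full simulation code are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def peak_response(v):
    # Hypothetical stand-in for the simulation code: peak force of an
    # elastic impact, F = v * sqrt(k * m), with assumed stiffness and mass.
    k, m = 2.0e5, 10.0
    return v * np.sqrt(k * m)

# Impact velocity as a random variable with an assumed normal pdf (m/s).
v_samples = rng.normal(loc=5.0, scale=0.25, size=10_000)

# Push each random draw through the "simulation" and summarize the
# induced uncertainty in the output.
F = peak_response(v_samples)
print(F.mean(), F.std())
```

With more expensive simulation codes, the same loop is typically run on a metamodel rather than on the code itself.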
Article
Full-text available
We present an approach for assessing the uncertainties in simulation code outputs in which one focuses on the physics submodels incorporated into the code. Through a Bayesian analysis of a hierarchy of experiments that explore various aspects of the physics submodels, one can infer the sources of uncertainty and quantify them. As an example of this approach, we describe an effort to describe the plastic-flow characteristics of a high-strength steel by combining data from basic material tests with an analysis of Taylor impact experiments. A thorough analysis of the material-characterization experiments is described, which necessarily includes the systematic uncertainties that arise from sample-to-sample variations in the plastic behaviour of the specimens. The Taylor experiments can only be understood by means of a simulation code. We describe how this analysis can be done and how the results can be combined with the results of analyses of data from simpler material-characterization experiments.
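A drastically simplified sketch of combining evidence from a hierarchy of experiments, assuming a conjugate normal model with known observation variances. The prior, the data values, and the noise levels are all hypothetical, and a real analysis of Taylor impact data would place the simulation code inside the likelihood; this only shows how evidence from two experiment types combines.

```python
import numpy as np

# Prior belief about a single hypothetical material parameter,
# a flow-stress scale in MPa: mean and variance.
mu0, var0 = 1000.0, 100.0**2

def normal_update(mu, var, data, noise_var):
    # Conjugate normal-normal update with known observation variance:
    # posterior precision is the sum of prior and data precisions.
    n = len(data)
    post_var = 1.0 / (1.0 / var + n / noise_var)
    post_mu = post_var * (mu / var + np.sum(data) / noise_var)
    return post_mu, post_var

# Stage 1: basic material-test data (low noise).
mu1, var1 = normal_update(mu0, var0, [980.0, 1010.0, 995.0], 30.0**2)
# Stage 2: estimates inferred from Taylor impact experiments (higher noise).
mu2, var2 = normal_update(mu1, var1, [1002.0, 990.0], 50.0**2)
print(mu2, np.sqrt(var2))
```

Each stage tightens the posterior, mirroring how the hierarchy of experiments progressively constrains the submodel parameters.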
... Generally, these methods consist of studying the effects of the uncertainties affecting the input on the variability of the output, but this can be done in various ways. For example, Gaul et al. [5], [6] use fuzzy parameters to quantify the overall uncertainty of the model's output, whereas Hemez et al. [7], [8] build meta-models by spanning the space of the most influential parameters and using a specific technique to reduce the computational effort drastically. ...
Article
Still today, some of the phenomena which occur in structural dynamics cannot be described, even using a properly updated model: some uncertainties remain. This paper presents the basic concepts of a new theory which uses the concept of Lack of Knowledge (LOK) to model the uncertainty using two bounds of the energy in each sub-structure. The inverse problem, which consists in determining the Lack of Knowledge for each substructure from test results on the whole structure, has been studied on a simple example. The overlined quantities are related to the theoretical, deterministic model.