Figure 1 - uploaded by Stijn Donders
The demonstration model: an elastic FBS assembly of car body, front cradle and rear cradle, with non-uniquely defined coupling stiffness values.

Source publication
Article
Full-text available
Finite Element (FE) analysis is widely employed in today's Computer Aided Engineering (CAE) to model real-world structures in an early design stage. FE models are deterministic, implicitly assuming that all design parameters are precisely known and that the manufacturing process produces identical structures. This is typically not valid. Parameter...

Contexts in source publication

Context 1
... front cradle connections are the main parameters of interest, as they largely determine the propagation of the engine vibrations to the driver seat. The 4 connections between front cradle and car body are modelled as uncorrelated Gaussian parameters, with a mean value of 2.00·10^5 N/m and a standard deviation of 0.10·10^5 N/m in the Z-Z direction; see Figure 10. The stiffness in the X-X and Y-Y directions is assumed to be a third of this value. ...
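For illustration, the connection scatter described in this excerpt can be sampled as follows (a minimal NumPy sketch; the variable names are hypothetical, and the numerical values are taken from the excerpt):

```python
import numpy as np

rng = np.random.default_rng(42)

mean_zz = 2.00e5  # N/m, mean Z-Z connection stiffness (from the excerpt)
std_zz = 0.10e5   # N/m, standard deviation of the Z-Z stiffness

# 4 uncorrelated Gaussian stiffness parameters, one per connection
k_zz = rng.normal(mean_zz, std_zz, size=4)

# X-X and Y-Y stiffness assumed to be one third of the Z-Z value
k_xx = k_zz / 3.0
k_yy = k_zz / 3.0
```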
Context 2
... the possibilistic analysis, the parameter scatter described above is modelled with the membership function in Figure 10. The distribution is set to zero outside the boundary [-6σ, +6σ] with respect to the nominal value; if this truncation is omitted, the input ranges over (-∞, +∞), so that all input parameter values would be possible. ...
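The membership function itself is only shown in the paper's Figure 10; assuming a symmetric triangular shape truncated at ±6σ, a sketch could look like this (a hypothetical helper, not the paper's exact function):

```python
def membership(x, nominal, sigma):
    """Triangular membership: 1 at the nominal value, decreasing linearly
    to 0 at nominal ± 6*sigma, and exactly 0 outside that boundary."""
    half_width = 6.0 * sigma
    return max(0.0, 1.0 - abs(x - nominal) / half_width)
```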
Context 3
... distribution is set to zero outside the boundary [-6σ, +6σ] with respect to the nominal value; if this truncation is omitted, the input ranges over (-∞, +∞), so that all input parameter values would be possible. As can be seen in Figure 10, the input range has been subdivided into 5 interval levels of membership. For this problem with 4 inputs and 5 intervals, the Transformation Method requires 1 + 5·2^4 = 81 deterministic parameter computations to obtain the fuzzy FRF in Figure 11. ...
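The computation count follows from evaluating the model at every combination of interval extrema, plus one nominal run; a sketch using only the standard library:

```python
from itertools import product

n_inputs = 4   # uncertain connection stiffnesses
n_levels = 5   # interval levels of membership below the peak

# One nominal computation plus 2^n vertex combinations per interval level
n_runs = 1 + n_levels * 2 ** n_inputs  # 81 for this problem

def interval_vertices(intervals):
    """All 2^n combinations of the extrema of n intervals."""
    return list(product(*intervals))

# e.g. the vertex set for one interval level of the 4 stiffness inputs
vertices = interval_vertices([(1.8e5, 2.2e5)] * n_inputs)  # 2^4 = 16 points
```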
Context 4
... can be seen in Figure 10, the input range has been subdivided into 5 interval levels of membership. For this problem with 4 inputs and 5 intervals, the Transformation Method requires 1 + 5·2^4 = 81 deterministic parameter computations to obtain the fuzzy FRF in Figure 11. ...
Context 5
... validity of this range has been checked with 500 Monte Carlo simulations, uniformly spread over the same parameter range. Figure 12 compares the global TM envelopes (i.e. the boundary at µ_A = 0 in Figure 11) with the global MC envelopes [20]. The V.A.F. ...
Context 7
... is equal to 94.0 for the lower envelopes and 90.4 for the upper envelopes. The shape conformity is therefore high but not perfect, as can also be seen in Figure 12. The O.I.P. criterion (see Section 3.2) has a value of 99.7%. ...
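The O.I.P.-style check can be sketched as the percentage of Monte Carlo FRF samples that stay between the TM envelopes at every frequency line (a hypothetical helper; the paper's exact criterion is defined in its Section 3.2):

```python
import numpy as np

def percentage_inside(mc_frfs, lower_env, upper_env):
    """Percentage of MC FRF samples (rows) that are bounded by the TM
    envelopes at every frequency line (columns)."""
    inside = np.all((mc_frfs >= lower_env) & (mc_frfs <= upper_env), axis=1)
    return 100.0 * inside.mean()
```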
Context 8
... means that almost all Monte Carlo data samples are bounded by the TM envelopes. It can be concluded that the fuzzy FRF in Figure 11 is a reliable visualization of all possible outputs for the given problem. The fuzzy FRF in Figure 11 allows the engineer to identify worst-case scenarios on the FRF characteristics. ...
Context 9
... can be concluded that the fuzzy FRF in Figure 11 is a reliable visualization of all possible outputs for the given problem. The fuzzy FRF in Figure 11 allows the engineer to identify worst-case scenarios on the FRF characteristics. A sensitive range has been found around 20 Hz. Figure 13 shows a magnification of the FRF in Figure 11. ...
Context 10
... fuzzy FRF in Figure 11 allows the engineer to identify worst-case scenarios on the FRF characteristics. A sensitive range has been found around 20 Hz. Figure 13 shows a magnification of the FRF in Figure 11. For nominal parameter values, an anti-resonance is found at 22 Hz, and a resonance with an FRF amplitude of 10^-2 kg^-1 is found at 22.5 Hz. ...
Context 12
... amplitude at 22.5 Hz is denoted as the stochastic load effect A_22.5(X). The resistance effect is the maximal amplitude that is allowed (i.e. the "Limit Value" in Figure 13), which is set to A_22.5,max = 1.0·10^-1 kg^-1. The Limit State Function (LSF) is then equal to g(X) = A_22.5,max - A_22.5(X) = 1.0·10^-1 - A_22.5(X) (eq. ...
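In code, this limit state reads as follows (values from the excerpt; by convention g(X) < 0 marks a failure):

```python
A_MAX = 1.0e-1  # kg^-1, limit value on the 22.5 Hz FRF amplitude

def limit_state(a_225):
    """g(X) = A_22.5,max - A_22.5(X): positive for safe designs,
    negative when the amplitude exceeds the limit value."""
    return A_MAX - a_225

# e.g. the nominal resonance amplitude of 1.0e-2 kg^-1 is safe (g > 0)
```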
Context 13
... lower-dimensional plots already provide insight into the nature of the problem. Figure 14 plots x_1 (the right front connection stiffness between Front Cradle and Car Body) against x_2 (the left front connection stiffness) [20]. Failures appear to be concentrated in the parameter domain where x_1 and x_2 take values above their mean values. ...
Context 14
... other parameter combinations are plotted in this manner, comparable results are obtained. With LMS OPTIMUS [21], the Monte Carlo results are again visualized in Figure 15. The values of x_1 and x_2 (the filled squares in Figure 14) are plotted against the amplitude A_22.5(X) in (eq. ...
Context 15
... LMS OPTIMUS [21], the Monte Carlo results are again visualized in Figure 15. The values of x_1 and x_2 (the filled squares in Figure 14) are plotted against the amplitude A_22.5(X) in (eq. 5); an amplitude larger than 0.10 results in a failure. ...

Citations

... These methods have been applied to a variety of systems. See, for example, [18][19][20]. ...
Article
Full-text available
Uncertainty has always been an important consideration when designing, analyzing and testing engineered systems, but computational investigations of the effects of uncertainty are only now beginning to become feasible. Often the limiting factor is the computational expense required to assess the influence of uncertainty on the system. This work provides an overview of techniques that seek to reduce this expense. Sampling methods such as Monte Carlo Simulation (MCS), Latin Hypercube Sampling (LHS) and Low-Discrepancy Sequences will be discussed, as well as reliability methods such as MV, AVM, FORM and SORM. Response surface approximations such as Kriging and Polynomial Chaos will also be discussed, highlighting the fact that all of these uncertainty quantification techniques can be understood in the context of a response surface. The strengths and weaknesses of these uncertainty propagation techniques will be discussed and they will be compared by applying them to two low-order aerospace problems. The examples illustrate a case where most of the methods are not so satisfactory, and another where almost any would perform surprisingly well. Most of these methods are implemented in the Design Analysis Kit for Optimization and Terascale Applications or DAKOTA package, an open source design and optimization toolkit that was created by Sandia National Laboratories beginning in 2001, which was used to perform many of the analyses discussed in the paper.
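As a concrete example of one of the sampling schemes mentioned, a minimal Latin Hypercube sample on the unit hypercube can be generated as follows (a generic sketch, not DAKOTA's implementation):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """One sample per equal-probability stratum in every dimension,
    with the stratum order shuffled independently per dimension."""
    rng = np.random.default_rng(seed)
    # Row i starts in stratum [i/n, (i+1)/n) in every dimension
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims)))
    u /= n_samples
    for j in range(n_dims):
        rng.shuffle(u[:, j])  # decouple the strata across dimensions
    return u
```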
... The above approaches not only support the optimization of the body design for specific functional performance attributes, but also make it possible to assess the variability of these attributes, taking into account the inherent production variability of individual component characteristics such as sheet metal thickness, position and quality of spot welds, or the dynamic properties of rubber mounts. Furthermore, the modelling process itself may be subject to uncertainty due to unknown (or unmeasurable) exact values of some of the parameters, or even fundamental model uncertainties [12,13]. ...
Article
All industrial sectors are confronted with the conflicting challenges to design better products in a shorter time, and this at a lower product, production and design cost. Globalization of markets as well as providers, increasing consumer awareness and more and more strict regulations on environmental and safety impact ("sustainable products") put industry competitiveness under large pressure. This requires rethinking the way the design issue is addressed, leading to major innovations in the design process. As an illustration, the case of vehicle body design will be discussed. Automotive companies are launching new variants (and redesigning existing models) at an unprecedented pace. Since the vast majority of these vehicles are built on common platforms, body engineering is nearly always on the critical path of the vehicle development process. Functional performances such as crash, structural rigidity, and production feasibility are addressed early in the development process with computer simulations. But attributes such as interior acoustics, harshness and vibration comfort pose major challenges for optimization as part of the core digital development process, due to the size and complexity of the FE models involved. As a consequence, these are traditionally only considered close to the availability of prototypes. Performance issues discovered at this late stage will lead to additional design cycles together with costly modifications.
... The above approaches not only support the optimization of the body design for specific functional performance attributes, but also make it possible to assess the variability of these attributes, taking into account the inherent production variability of individual component characteristics such as sheet metal thickness, position and quality of spot welds, or the dynamic properties of rubber mounts. Furthermore, the modelling process itself may be subject to uncertainty due to unknown (or unmeasurable) exact values of some of the parameters, or even fundamental model uncertainties [12,13]. ...
Conference Paper
Full-text available
The pressures on product innovation and at the same time product customization and cost control are the key drivers for today's vehicle development process. The requirement to develop multiple innovative body variants for each new vehicle platform while shortening the overall time-to-market puts body development on the critical path in the vehicle development process. The only possible answer to this challenge is to evaluate and optimize all critical functional attributes like weight, crashworthiness, durability and NVH as early as possible in the process, calling for a fully CAE-driven approach. This paper discusses a number of new technologies that support such an approach in relation to NVH. Morphing the FE mesh of a predecessor model enables the analyst to formulate design recommendations in the concept phase. Advanced contribution analysis allows quick detection of weak points. Novel model reduction techniques shorten calculation time, enabling more modifications to be evaluated and making real optimization studies feasible. A first approach is based on a modal projection of the modifications and is used to carry out sensitivity analyses and assess small modifications. The second one is based on wave-based substructuring and allows implementing large modifications in a very efficient way. A complete process, based on the combination of these approaches, is presented and illustrated using industrial examples.
... Without going into the details on all these applications, a few examples will be shown to illustrate this. The problem here is actually very similar to that of uncertainty or reliability in FE-based design simulation, where considerations on the uncertainty or variability of the FE model parameters lead to a structural response description subject to uncertainty [29][30][31]. The techniques used to analyse the FE modelling uncertainty problem are hence also applicable to evaluate the effect of uncertainty in the modal parameters on the structural response. ...
... Uncertainty was introduced on the modal eigenfrequencies and the modal damping ratios. The Transformation Method [31,32] is used to visualize the uncertainty on a typical measured amplitude frequency response function. ...
Conference Paper
Full-text available
Like all experimental procedures, Experimental Modal Analysis (EMA) is subject to a wide range of potential testing and processing errors. The modal identification methods are sensitive to these errors, yielding modal results which are uncertain up to certain error bounds. The question hence is what these error bounds on test data and modal parameters are, how they can be reduced, but also how these errors will affect the actual use of the data. The paper reviews the main elements of the test data and modal modelling uncertainty and assesses the impact of the uncertainty on some typical modelling problems. Some recent methods for uncertainty analysis in modelling are addressed.
... This means that a fixed number of failures is required to obtain a given accuracy and to reduce the influence of a single failure case on the total estimate. This influence can be quantified as the coefficient of variation (c.o.v.), the ratio of the standard deviation to the mean value of the estimated failure probability [13]. ...
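For a crude Monte Carlo estimate of a failure probability, this coefficient of variation has a simple closed form (a standard result sketched here, not necessarily the paper's exact expression):

```python
import math

def cov_failure_estimate(n_samples, p_f):
    """c.o.v. of the crude Monte Carlo failure-probability estimate:
    std/mean = sqrt((1 - p_f) / (n * p_f)); small p_f needs many samples."""
    return math.sqrt((1.0 - p_f) / (n_samples * p_f))

# e.g. estimating p_f = 1e-2 to roughly 10% c.o.v. takes about 1e4 samples
```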
Article
The need for improvements in engineering designs, especially with composite materials, is nowadays a major request of the aerospace industry. Deterministic approaches are unable to take into account all the variabilities that characterize composite properties without oversizing structures. The necessity of assessing the probability of failure of a particular design requires a new methodology based on reliability analysis and design optimization through probabilistic models of physical properties. This paper intends to give a brief description of the most used methods for reliability analysis and point out the advantages of applying these methods to shift from a deterministic to a probabilistic approach. Thus a methodology will be outlined that could serve as a guideline to develop a more efficient and optimized design process, with particular focus on computational issues that could arise through the application of reliability analysis. Some application examples will be given to better explain the new methodology and the necessary theoretical background needed to understand the results; particular interest will be given to design optimization and its advantages in engineering designs.

1. Introduction to the probabilistic approach

As designs grow more critical and competitive, the industrial community is continuously requesting improvements of current technologies. These needs lead to the introduction of new materials, especially in the aerospace industry, and even if their use is now rapidly increasing, current design methods do not directly account for the random nature of input parameters. This is mainly due to the lack of knowledge of real physical properties and to the variability related to the manufacturing process. These materials might have a wide field of applications, but the trust that the deterministic engineer usually puts in these materials is somewhat limited by a methodological gap; closing this gap would help to deal with variabilities in structural design. In fact, these materials have more intrinsic variables than metals and, to account for them, overdimensioning through large knockdown factors on strength properties is common practice. This unveils the trend to design in such a way that the structure remains safe even in the extreme case that all possible unfavorable events would occur simultaneously. The result is a substantial weight increase without a quantifiable increase in structural reliability. Probabilistic methods could offer a powerful tool to assess the inherent risk of aerospace designs through the characterization of these variabilities, thus allowing the design engineer to reduce weight and costs without losing reliability. The upcoming integration of these methods in the continuously improving Computer Aided Engineering (CAE) process will yield in the near future a new landscape in structural design.

2. Testcase: the composite structure
... These methods have not been considered; in the remainder of this paper, the term Transformation Method (TM) refers to the Reduced form. The Transformation Method can be extended to the frequency domain [14] to predict fuzzy responses of FE models; reported applications include uncertainty prediction on structural [14] and vibro-acoustic analyses [15], also at the assembly level [16]. The Short Transformation Method (STM) is a computationally attractive alternative to the original TM [17], for dynamic analysis of structures with uncertain parameters that affect the eigenfrequency values in a (more or less) monotonic way. ...
Article
Full-text available
This paper aims to assess the effect of input uncertainty on FRF predictions obtained with Finite Element analysis. The Transformation Method (TM), presented by M. Hanss, uses a fuzzy approach based on interval arithmetic: the uncertain response is reconstructed from a set of deterministic responses, combining the extrema of each interval in every possible way. The TM is applicable if the output is monotonic in the inputs, otherwise an error is introduced. In this paper, the Short Transformation Method (STM) is proposed as an attractive alternative to the original TM, for dynamic analyses where the eigenfrequency values are monotonic in the uncertain inputs. A full set of deterministic responses is only performed at the lowest interval level. For higher levels, a smaller set is evaluated. The STM allows reconstructing the fuzzy FRF from a much lower number of deterministic computations, with only a small reduction in the accuracy of FRFs. Both methods are demonstrated on a car front cradle with uncertain design parameters.
Conference Paper
Finite Element (FE) analysis is the most common approach in today's Computer Aided Engineering (CAE) to model real-world structures in an early design stage. Since deterministic FE models lack predictive capabilities in the mid-frequency range due to their inability to model the effect of uncertain input parameters, a reliable method to assess the effect of input uncertainty on the frequency response prediction is of great interest. In this paper, the Transformation Method (TM) has been applied to assess the effect of parameter uncertainty on the pressure response of a vibro-acoustic system, namely a car body with an acoustic cavity. The TM follows a fuzzy approach based on the concept of α-cuts, i.e. discretisation of the parameter's magnitude into a number of intervals. Response uncertainty is then assessed by performing a set of deterministic computations. For each interval, a deterministic response is computed for all possible combinations of the interval's extrema. The fuzzy response can then be reconstructed from the set of deterministic responses. To assess the accuracy of the obtained fuzzy response, it has been validated against response samples from a sufficiently high number of Monte Carlo simulations.
Article
Full-text available
This paper presents a novel approach in the field of experimental and numerical investigation of mechanical properties of composite structures. It takes into account test data variability resulting from structural dynamic property measurements and uses it to quantify uncertainties in model parameter updating. The main goal of the conducted research is to investigate the dynamic properties of fibre reinforced composite structures. Non-destructive experimental and numerical simulation methods are used hereto. In the experimental part, different test configurations were taken into account. The excitation was performed by means of random and harmonic, single and multi point stimuli, while the response measurement was done through contact and non-contact acceleration, velocity and dynamic strain sensing. The test results are applied in two ways: for the structural identification of the object and for non-deterministic updating of the numerical model according to a range of experimental models obtained from test. The sources of the test data variabilities were related to the excitation and measurement technique applied for the investigated object. Non-deterministic model updating and verification & validation included uncertainties of its parameters by means of interval and stochastic methods. A number of variable test modal models were statistically assessed to investigate the impact of the variability sources on the modal model parameters. The presented research was conducted in the context of the FP6 Marie Curie project UNVICO-2.
Conference Paper
A systematic approach to design for six-sigma, simultaneously addressing variability and uncertainty present in design parameters, is presented. The first step is the ability to integrate and federate best-of-class virtual prototyping simulation software programs, based on finite element modeling techniques. The second step is the assessment of the effect of design parameter variability on the dynamic response. The dynamic response variability is calculated using reliability-based methods such as the First and Second Order Reliability Methods, the Importance Sampling Monte Carlo (ISMC) method and the Transformed Importance Latin Hypercube Sampling (TILHS) method. The Transformation Method allows visualizing worst-case scenarios on the FRF characteristics.