History of the design variable (h) obtained with Bayesian statistics.

Source publication
Article
Full-text available
This paper focuses on a method to solve structural optimization problems using particle swarm optimization (PSO), surrogate models and Bayesian statistics. PSO is a random/stochastic search algorithm designed to find the global optimum. However, PSO needs many evaluations compared to gradient-based optimization. This means PSO increases the analysi...

Similar publications

Article
Full-text available
Assessments of damage following the 2010 Haitian earthquake were validated by comparing three datasets. The first, for 107,000 buildings, used vertical aerial images with a 15-25 cm spatial resolution. The second, for 1,241 buildings, used Pictometry images (oblique angle shots with a resolution of about 10 cm taken in four directions by aircraft)....
Article
Full-text available
This research aimed to study environmental and genetic effects on reproductive and productive traits of goats exploited for milk production. (Co)Variance components and genetic trait parameters for age at first calving, calving interval, and milk production in univariate and multivariate analysis, using Bayesian statistics under animal model, were...
Article
Full-text available
The general path model (GPM) is one approach for performing degradation-based, or Type III, prognostics. The GPM fits a parametric function to the collected observations of a prognostic parameter and extrapolates the fit to a failure threshold. This approach has been successfully applied to a variety of systems when a sufficient number of prognosti...
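The general path idea lends itself to a compact illustration. The sketch below (assuming made-up degradation observations and a hypothetical failure threshold of 12; it is not the implementation from the cited work) fits a quadratic path to the prognostic parameter and extrapolates it to the threshold crossing to estimate the time of failure:

# Hypothetical sketch of the general path model idea: fit a parametric (here quadratic)
# degradation path to observations of a prognostic parameter, then extrapolate the fit
# forward until it crosses a failure threshold to estimate remaining useful life.
import numpy as np

t_obs = np.arange(0.0, 50.0, 5.0)                        # observation times [h]
path = 0.002 * t_obs**2 + 0.01 * t_obs                   # made-up degradation data
path += np.random.default_rng(3).normal(0, 0.05, t_obs.size)

coeffs = np.polyfit(t_obs, path, deg=2)                   # parametric path fit
threshold = 12.0                                          # assumed failure threshold

t_future = np.arange(t_obs[-1], 500.0, 0.5)               # extrapolation horizon
predicted = np.polyval(coeffs, t_future)
crossing = t_future[np.argmax(predicted >= threshold)]    # first threshold crossing
print("estimated time of failure:", crossing, "h; RUL:", crossing - t_obs[-1], "h")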
Article
Full-text available
This paper studies drug offences at schools and in their surrounding environments to explain why some, but not all, schools had drug offences. We used a geographical approach that integrated spatial data on crime, the census, and the built environment to identify potential risk factors for drug offences at schools. Based on all recorded drug offences at schools (2001-2008) in the...
Article
Full-text available
Optimisation of routine monitoring programmes for internal contamination. To optimise the protection of workers against ionising radiation, French regulation imposes the use of limits and dose constraints and the progressive reduction of exposure, in line with the recommendations of the International Commission on Radiological Protection. To verify...

Citations

... n-dimensional space. The movement of the particles and of the swarm is dictated by the relationship between the range of the movement and the particle's position [52]. The velocity and position of any particle can be calculated using equations 1 and 2. ...
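The cited equations 1 and 2 are not reproduced in the excerpt above; for reference, the canonical PSO velocity and position updates take the form below, where $\omega$ is the inertia weight, $c_1$ and $c_2$ are the cognitive and social coefficients, $r_1, r_2 \sim U(0,1)$, $p_i$ is the best position found by particle $i$, and $g$ is the best position found by the swarm (the symbols are generic, not taken from the cited paper):

$$v_i^{(k+1)} = \omega\, v_i^{(k)} + c_1 r_1 \bigl(p_i - x_i^{(k)}\bigr) + c_2 r_2 \bigl(g - x_i^{(k)}\bigr), \qquad x_i^{(k+1)} = x_i^{(k)} + v_i^{(k+1)}$$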
Article
Full-text available
Hardware‐based sensing frameworks such as cooperative fuel research engines are conventionally used to monitor research octane number (RON) in the petroleum refining industry. Machine learning techniques are employed to predict the RON of integrated naphtha reforming and isomerisation processes. A dynamic Aspen HYSYS model was used to generate data by introducing artificial uncertainties in the range of ±5% in process conditions, such as temperature, flow rates, etc. The generated data was used to train support vector machines (SVM), Gaussian process regression (GPR), artificial neural networks (ANN), regression trees (RT), and ensemble trees (ET). Hyperparameter tuning was performed to enhance the prediction capabilities of GPR, ANN, SVM, ET and RT models. Performance analysis of the models indicates that GPR, ANN, and SVM with R² values of 0.99, 0.978, and 0.979 and RMSE values of 0.108, 0.262, and 0.258, respectively performed better than the remaining models and had the prediction capability to capture the RON dependence on predictor variables. ET and RT had an R² value of 0.94 and 0.89, respectively. The GPR model was used as a surrogate model for fitness function evaluations in two optimisation frameworks based on genetic algorithm and particle swarm method. Optimal parameter values found by the optimisation methodology increased the RON value by 3.52%. The proposed methodology of surrogate‐based optimisation will provide a platform for plant‐level implementation to realise the concept of industry 4.0 in the refinery.
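The surrogate-in-the-loop workflow summarized in this abstract can be sketched as follows, assuming synthetic stand-in data for the Aspen HYSYS samples and generic PSO settings (none of the variable ranges, values, or names below come from the cited work):

# Hypothetical sketch: GPR surrogate used as the fitness function inside a minimal PSO loop.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Placeholder training data standing in for the process samples
# (columns: temperature [K], feed flow rate [kg/h]); the RON response is made up.
X_train = rng.uniform([700.0, 100.0], [800.0, 200.0], size=(50, 2))
y_train = 90 + 0.02 * (X_train[:, 0] - 750) - 0.01 * (X_train[:, 1] - 150)

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=[50.0, 50.0]),
                                     normalize_y=True).fit(X_train, y_train)

# Minimal PSO maximizing the surrogate's RON prediction within assumed bounds.
lo, hi = np.array([700.0, 100.0]), np.array([800.0, 200.0])
n, iters, w, c1, c2 = 20, 50, 0.7, 1.5, 1.5
x = rng.uniform(lo, hi, size=(n, 2))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), surrogate.predict(x)
gbest = pbest[np.argmax(pbest_f)]

for _ in range(iters):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = surrogate.predict(x)
    better = f > pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmax(pbest_f)]

print("surrogate-optimal conditions:", gbest, "predicted RON:", pbest_f.max())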
... First of all, these two incentives directly affect individuals and then spread, indirectly guiding the surrounding individuals and groups [13]. Jongbin showed from a psychological perspective that the dissatisfaction caused by a given amount of negative feeling outweighs the satisfaction produced by the same amount of positive feeling [14]. For example, if a person loses one hundred yuan, the negative feeling produced will be stronger than the positive feeling brought about by gaining one hundred yuan. ...
Article
Full-text available
Today, with the rapid development of the Internet, the new carriers and platforms for innovation in college students' ideological and political education need to be improved. In order to improve the timeliness of research on college students' ideological and political education, the particle swarm algorithm is studied and its core idea is applied in view of this characteristic. On the one hand, college students reflect on their past behaviors and conduct self-evaluation; on the other hand, there is competition and cooperation between multiple academic groups. Combined with the particle swarm optimization algorithm, the fitness values between the two attributes are handled through model optimization in order to optimize the work. Positive and negative incentive measures are introduced in the experimental research, and the particle swarm optimization evaluation function and behavior weighting factor are analyzed to hypothetically describe the working method. At the same time, it is pointed out that educational working methods should be adjusted in time according to changes in student groups and individual behaviors, so as to achieve good work results. The research shows that positive incentives are better than no incentives, and that the introduction of negative incentives can only prevent college students from becoming negative role models because it cannot bring them to their best state of consciousness.
... BO is extensively used by the machine learning community (Snoek et al. 2012;Swersky et al. 2013;Bergstra et al. 2011) and is a fairly recent addition to the structural optimization field. Im and Park (2013) used particle swarm optimization with a surrogate model for structural optimization problems. Fan et al. (2019) adopted Kriging (Gaussian process) surrogates for reliability-based design optimization of crane bridges. ...
Article
Full-text available
Bayesian optimization (BO) is a popular method for solving optimization problems involving expensive objective functions. Although BO has been applied across various fields, its use in structural optimization area is in its early stages. Origami folding structures provide a complex design space where the use of an efficient optimizer is critical. In this work for the first time we demonstrate the ability of BO to solve origami-inspired design problems. We use a Gaussian process (GP) as the surrogate model that is trained to mimic the response of the expensive finite element (FE) objective function. The ability of this BO-FE framework to find optimal designs is verified by applying it to well-known origami design problems. We compare the performance of the proposed approach to traditional gradient-based optimization techniques and genetic algorithm methods in terms of ability to discover designs and computational efficiency. BO has many user-defined components/parameters and intuitions for these for structural optimization are currently limited. In this work, we study the role of hyperparameter tuning and the sensitivity of Bayesian optimization to the quality and size of the initial training set. Taking a holistic view of the computational expense, we propose various heuristic approaches to reduce the overall cost of optimization. Our results show that Bayesian optimization is an efficient alternative to traditional methods. It allows for the discovery of optimal designs using fewer finite element solutions, which makes it an attractive choice for the non-convex design space of origami fold mechanics.
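A minimal sketch of the generic BO loop this abstract refers to is given below; it is not the authors' BO-FE framework, and a cheap analytic function stands in for the expensive finite element objective (all settings are assumptions):

# Hypothetical sketch of a basic BO loop: a GP surrogate plus expected-improvement
# acquisition, with a cheap analytic function standing in for the expensive FE model.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_objective(x):            # stand-in for the finite element response
    return np.sin(3 * x) + 0.5 * x**2

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(5, 1))    # small initial training set
y = expensive_objective(X).ravel()
grid = np.linspace(-2, 2, 400).reshape(-1, 1)

for _ in range(15):                    # BO iterations
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6,
                                  normalize_y=True).fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    imp = y.min() - mu                 # improvement over current best (minimization)
    z = imp / np.maximum(sigma, 1e-9)
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
    x_next = grid[np.argmax(ei)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_objective(x_next).ravel())

print("best design found:", X[np.argmin(y)].item(), "objective:", y.min())

Each iteration refits the GP on the accumulated designs and evaluates the true objective only at the point maximizing expected improvement, which is what keeps the number of expensive solutions low.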
... Kim et al. [32] used a Bayesian statistical procedure to calibrate key model parameters with unknown priors and to test data uncertainty for the analysis of piston insertion into the housing of a pyrotechnically actuated device. Im and Park [33] presented a particle swarm optimization procedure based on surrogate models and employed Bayesian statistics to obtain more reliable results for the structural optimization of a hub sleeve. Jo et al. [34] developed two adaptive variable-fidelity kriging surrogate models and integrated them with Bayesian-based and difference-based dynamic fidelity indicators, formulated as probabilistic model validation metrics that quantify model-form uncertainty, into an efficient global optimization design framework. ...
Article
Full-text available
A Bayesian framework-based approach is proposed for the quantitative validation and calibration of the kriging metamodel established by simulation and experimental training samples of the injection mechanism in squeeze casting. The temperature data uncertainty and non-normal distribution are considered in the approach. The normality of the sample data is tested by the Anderson–Darling method. The test results show that the original difference data require transformation for Bayesian testing due to the non-normal distribution. The Box–Cox method is employed for the non-normal transformation. The hypothesis test results of the calibrated kriging model are more reliable after data transformation. The reliability of the kriging metamodel is quantitatively assessed by the calculated Bayes factor and confidence. The Bayes factor and confidence level results indicate that the kriging model demonstrates improved accuracy and is acceptable after data transformation. The influence of the threshold ε on both the non-normally and normally distributed data in the model is quantitatively evaluated. The threshold ε has a greater influence and higher sensitivity when applied to the normal data results, based on the rapid increase within a small range of the Bayes factors and confidence levels.
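A small sketch of the normality-check and transformation step described here, assuming made-up right-skewed difference data; the Bayes-factor validation itself is not reproduced:

# Hypothetical sketch of the normality-check / transformation step: Anderson-Darling
# test on the simulation-experiment difference data, followed by a Box-Cox transform
# when normality is rejected at the 5% level.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Made-up, right-skewed "difference" data standing in for temperature residuals.
diff = rng.lognormal(mean=0.5, sigma=0.6, size=60)

ad = stats.anderson(diff, dist='norm')
print("A-D statistic:", round(ad.statistic, 3),
      "5% critical value:", ad.critical_values[2])

if ad.statistic > ad.critical_values[2]:      # normality rejected
    transformed, lam = stats.boxcox(diff)     # Box-Cox requires strictly positive data
    ad2 = stats.anderson(transformed, dist='norm')
    print("Box-Cox lambda:", round(lam, 3),
          "A-D statistic after transform:", round(ad2.statistic, 3))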
... There has been much attention to the implementation of PSO in control theory. Particle swarm optimization has been successfully applied to a wide variety of problems such as neural networks [11], structural optimization [12], shape topology optimization [13] and fuzzy systems [14]. ...
Article
Full-text available
This paper introduces the application of an optimization technique, the Particle Swarm Optimization (PSO) algorithm, to the problem of tuning a Proportional-Integral-Derivative (PID) controller for a linearized ball and beam control system. After describing the basic principles of Particle Swarm Optimization, the proposed method concentrates on finding the optimal PID controller settings in the cascade control loop of the ball and beam control system. The ball and beam control system aims to balance a ball at a particular position on the beam as defined by the user. The efficiency of the Particle Swarm Optimization algorithm for tuning the controller is compared with a classical trial-and-error method, based on time response performance. Both tuning methods were developed in a simulation study using Matlab m-file scripts. The evaluations show that the evolutionary Particle Swarm Optimization (PSO) algorithm gives a much better response than the trial-and-error method.
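The kind of time-response fitness such a PSO tuner would minimize can be sketched as follows, using a crude double-integrator stand-in for the ball and beam plant (the paper's linearized model, gain ranges, and PSO outer loop are not reproduced, and all values are assumptions):

# Hypothetical sketch: the time-response cost a PSO tuner would minimize.
# The ball and beam plant is crudely approximated as a double integrator.
import numpy as np

def pid_step_cost(kp, ki, kd, dt=0.01, t_end=5.0, setpoint=0.1):
    pos = vel = integ = prev_err = 0.0
    cost = 0.0
    for _ in range(int(t_end / dt)):
        err = setpoint - pos
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv        # PID control effort
        vel += u * dt                                  # double-integrator dynamics
        pos += vel * dt
        cost += err**2 * dt                            # integral of squared error
        prev_err = err
    return cost

print("ISE cost for one candidate gain set:", pid_step_cost(kp=50.0, ki=1.0, kd=20.0))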
... It is sometimes very difficult to find a meta-model with sufficient accuracy throughout the design space [55]. The theory of this method is based on restricting the variable boundaries to the neighborhood of the current design point. The modified boundary, called a "trust region", is determined at each stage based on the solution positions. ...
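The trust-region idea described in this excerpt amounts to shrinking the variable bounds to a box around the current design point before refitting the local meta-model; a minimal sketch, with assumed bounds and radius, is:

# Hypothetical sketch of the trust-region bound update: restrict the variable bounds
# to a box around the current design point for the next meta-model stage.
import numpy as np

def trust_region_bounds(x_current, global_lo, global_hi, radius):
    lo = np.maximum(global_lo, x_current - radius)
    hi = np.minimum(global_hi, x_current + radius)
    return lo, hi

lo, hi = trust_region_bounds(np.array([0.4, 0.7]),
                             np.zeros(2), np.ones(2), radius=0.2)
print(lo, hi)   # local bounds used when tuning the next local meta-model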
Article
Full-text available
This paper presents a novel optimization technique for an efficient multi-fidelity model building approach to reduce computational costs for handling aerodynamic shape optimization based on high-fidelity simulation models. The wing aerodynamic shape optimization problem is solved by dividing optimization into three steps—modeling 3D (high-fidelity) and 2D (low-fidelity) models, building global meta-models from prominent instead of all variables, and determining a robust optimized shape associated with tuned local meta-models. The adaptive robust design optimization aims to modify the shape optimization process. The sufficient infilling strategy—known as the adaptive uniform infilling strategy—determines search space dimensions based on the last optimization results or the initial point. Following this, 3D model simulations are used to tune local meta-models. Finally, the global gradient-based optimization method—Adaptive Filter Sequential Quadratic Programming (AFSQP)—is utilized to search the neighborhood for a probable optimum point. The effectiveness of the proposed method is investigated by applying it, along with conventional optimization approach-based meta-models, to a Blended Wing Body (BWB) Unmanned Aerial Vehicle (UAV). The drag coefficient is defined as the objective function, which is subject to minimum lift coefficient bounds and stability constraints. The simulation results indicate an improvement in meta-model accuracy and a reduction in the computational time of the method introduced in this paper.
... Because PAPSO is more efficient at solving the integer programming problem, a mapping from real coding to natural number coding is established in order to solve the discrete module-partition optimization problem by means of a continuous optimization method. Although the basic arithmetic operation is relatively simple, it still shows a strong ability to perform global searching with a reasonably fast speed of convergence [25]. This method is intended to solve a two-objective optimization problem, which concerns both a criterion function based on the designer's experience and a noise function based on customers' preferences, towards a more robust module partition scheme. ...
... The component vector of the ith particle in generation k+1 is described as $X_i^{(k+1)}$. Finally, the velocity of a particle is defined by Eq. (13) [25]. $X_i^{(k)}$ denotes the position of the ith particle in generation k, and $V_i^{(k)}$ denotes the moving speed of the ith particle in generation k. λ denotes a constraining factor that limits the velocity's magnitude, and the interval of λ is [0,1]. ...
(Fig. 5: Illustration of an electric-traction drum shearer)
Article
Full-text available
Against the sweeping trend of mass customization, the importance of product platform design is becoming increasingly recognized by the manufacturers. Module design is the foundation of product platform design, and module partition determines the effectiveness of module design. Traditionally, the vast majority of existing module partition methods ignored the design factor of customer preferences. This study proposes to employ the basic principles of robust design to make the module partition schemes less sensitive to the dynamically changing customer preferences by considering them as a noise factor. A criterion function and a noise function are each established based on the component-component correlation matrix and component-function contribution matrix, respectively. The criterion and noise functions, when combined, lead to a unique multi-objective optimization problem. Furthermore, an improved Pareto archive particle swarm optimization (PAPSO) algorithm is introduced to solve the multi-objective optimization problem in order to prevent the premature selections of non-optimal solutions. A case study is presented to showcase how the proposed new method is followed to conduct the module partition on an electric-traction drum shearer. The improved algorithm demonstrates highly competitive performance in comparison to the existing multi-objective optimization algorithms.
... PSO is very robust in exploring and exploiting the search space of multi-echelon supply chain (SC) problems, which usually have many decision variables and are multi-objective in nature [7][9]. PSO relies both on each particle's own cognitive behaviour and on the shared social behaviour of the swarm or school to initially locate local optima and gradually move towards the global optimum [10][11][12]. ...
Article
Full-text available
In the current globalized scenario, business organizations are increasingly dependent on cost-effective supply chains to enhance profitability and better handle competition. Demand uncertainty is an important factor in the success or failure of a supply chain. An efficient supply chain limits the stock held at all echelons while avoiding stock-out situations. In this paper, a three-echelon supply chain model consisting of a supplier, a manufacturing plant and a market is developed and optimized using a particle swarm intelligence algorithm.
... This discrete set can be obtained via sampling or experiments. The point interpolation method (PIM) is based on scattered nodes within the domain of interest, using polynomial basis functions [39,40]. The general obstacle of PIM is a singular moment matrix, which, if it occurs, will cause the PIM process to break down. ...
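A one-dimensional sketch of the PIM construction and its failure mode, using an assumed set of scattered nodes and a polynomial (Vandermonde) moment matrix whose conditioning is checked before solving:

# Hypothetical 1D sketch of the point interpolation method (PIM): build the polynomial
# moment matrix from scattered nodes and interpolate; a (near-)singular moment matrix
# is detected via its condition number, which is the failure mode noted above.
import numpy as np

def pim_interpolate(nodes, values, x):
    n = len(nodes)
    P = np.vander(nodes, N=n, increasing=True)      # moment matrix P[i, j] = x_i**j
    if np.linalg.cond(P) > 1e12:                    # near-singular: PIM breaks down
        raise np.linalg.LinAlgError("moment matrix is (near-)singular")
    a = np.linalg.solve(P, values)                  # polynomial coefficients
    return np.vander(np.atleast_1d(x), N=n, increasing=True) @ a

nodes = np.array([0.0, 0.3, 0.7, 1.0])              # scattered nodes in the domain
values = np.sin(np.pi * nodes)                      # sampled field values
print(pim_interpolate(nodes, values, 0.5))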
... It can be seen from circuit evolution that the EA is able to find a structural bit-string code conforming to the target circuit, starting from a disordered structural bit-string code and following a rule. In addition, the advantages of EHW, such as self-organization and self-repair, are fully exploited and reflected during circuit evolution [16,19-22]. ...
Article
Full-text available
Since digital circuits have been widely applied in various fields, electronic systems are increasingly complicated and require greater reliability. Faults may occur in electronic systems operating in complicated environments, and if they are not repaired immediately in the field, the systems will not run normally, leading to serious losses. The traditional method for improving system reliability, based on redundant fault-tolerant techniques, has been unable to meet these requirements. Therefore, on the basis of the evolvable-hardware-based and reparation-balance-technology-based electronic circuit fault self-repair strategy proposed in our preliminary work, this paper investigates the optimal design of rectification circuits (RTCs) for electronic circuit fault self-repair based on global signal optimization. First, the basic theory of RTC optimal design based on global signal optimization is proposed. Second, relevant considerations and suitable ranges are analyzed. Then, the basic flow of RTC optimal design is studied. Finally, a typical circuit is selected for simulation verification, and a detailed simulation analysis is made of five circumstances that occur during RTC evolution. The simulation results show that, compared with an RTC based on the conventional design method, an RTC based on the global signal optimization design method has lower hardware cost, faster circuit evolution, higher convergence precision, and a higher circuit evolution success rate. Therefore, the global-signal-optimization-based RTC optimal design method applied in electronic circuit fault self-repair technology is shown to be feasible, effective, and advantageous.