
... Metaheuristic algorithms have been used extensively in various engineering optimization problems, considering either single-objective [35][36][37][38][39] or multi-objective cases [2, 40, 41]. Parameter tuning is an essential step in adjusting a metaheuristic algorithm, because every algorithm has input parameters that determine its effectiveness [42][43][44][45][46]. Suitable settings of these parameters improve an algorithm's ability to find a better solution in less time. ...
Article
In this article, seven recently developed metaheuristic algorithms are utilized for the optimal design of dome-shaped trusses with natural frequency constraints. These algorithms are social network search, jellyfish search optimizer, equilibrium optimizer, teaching-learning-based optimization, grey wolf optimizer, colliding bodies optimization, and the improved grey wolf optimizer. This study focuses on the assessment and tuning of these algorithms to enhance both computational efficiency and solution quality. In particular, we introduce a hyperparameter tuning procedure that significantly improves the performance of the algorithms, making them more suitable for structural optimization problems. The performance of the selected algorithms is studied through five benchmark examples for the design optimization of dome structures, from small-scale to large-scale cases, with frequency constraints. Many tuning scenarios are studied for each design example, providing comprehensive statistical results that determine a sufficient number of structural analyses, as well as the number of particles, for an efficient configuration of each algorithm. The study thus also offers insight into the efficient setting of hyperparameters for each algorithm, leading to better solutions at lower computational cost. Furthermore, the optimization results are compared with solutions obtained in previous studies to demonstrate the suitability of the selected algorithms.
... Depending on the complexity of the parameterized structure, a number of classes of algorithms can be applied; simple methods were developed first (Rosen, 1961; Schittkowski, 1983), followed by more advanced methods such as metaheuristic algorithms (Afshari et al., 2019; Gunantara, 2018; Kaveh et al., 2020; Kicinger et al., 2005; Khalil et al., 2023), including evolutionary strategies (ES) (Rechenberg et al., 1965; Schwefel, 1965; van Ameijde et al., 2022), evolutionary programming (EP) (Murawski et al., 2000), and genetic algorithms (GA) (Zhang et al., 2021). In the context of GA, Talaslioglu (2019c) recommends the Pareto Archived Genetic Algorithm (PAGA) for optimizing geometrically nonlinear dome structures. ...
Article
Full-text available
Biological structures and organisms are determined and optimized to adapt to changes and constraints imposed by the environment. The multiple functionalities and properties exhibited by such structures are currently a source of inspiration for designers and engineers. Thus, biomimetic design has been increasingly used in recent years with the intensive development of additive manufacturing to deliver innovative solutions. Due to their multifunctional properties combining softness, high stiffness, and light weight, many potential applications can be seen in the medical, aerospace, and automotive sectors. This paper introduces a biomimetic design and geometric modeling method of 3D-printed lightweight structures based on L-systems generated and distributed along their principal stress lines. Numerical simulations and parametric optimization were conducted with three case studies to demonstrate the relevance and applicability of this method in adapting mechanical structures to various load cases as well as ensuring a proper stiffness-to-weight ratio.
Article
Full-text available
Particle swarm optimization (PSO) is one of the most well-regarded swarm-based algorithms in the literature. Although the original PSO has shown good optimization performance, it still severely suffers from premature convergence. As a result, many researchers have modified it, producing a large number of PSO variants with slightly or significantly better performance. Mainly, the standard PSO has been modified through four strategies: modification of the PSO controlling parameters; hybridization of PSO with other well-known meta-heuristic algorithms such as the genetic algorithm (GA) and differential evolution (DE); cooperation; and multi-swarm techniques. This paper attempts to provide a comprehensive review of PSO, including the basic concepts of PSO, binary PSO, neighborhood topologies in PSO, recent and historical PSO variants, remarkable engineering applications of PSO, and its drawbacks. Moreover, this paper reviews recent studies that utilize PSO to solve feature selection problems. Finally, eight potential research directions that can help researchers further enhance the performance of PSO are provided.
Article
Full-text available
The equilibrium of a membrane shell is governed by Pucher’s equation, which relates the external load, the shape of the shell, and the Airy stress function. Most existing funicular form-finding algorithms take a discretized stress network as the input and find the shape; when the resulting shape does not meet the user’s expectation, there is no direct clue on how to revise the input. This paper utilizes the method of radial basis functions, typically used to smoothly approximate arbitrary scalar functions, to represent C∞-smooth shapes and stress functions of shells. Thus, the boundary value problem of solving Pucher’s equation can be converted into a least-squares regression problem, without needing to discretize the governing equation. When the provided shape or stress function admits no solution, the algorithm suggests how users can tweak the input in order to find an approximate solution. The external load in this method can easily incorporate vertical and horizontal components; the latter are not always negligible, especially in seismic hazard zones. This paper identifies that peripheral walls are preferable for allowing membrane shells to carry horizontal loads in various directions without deviating from their original shapes. When there are insufficient supports, the algorithm can also suggest the potential stress eccentricities, which could inform the design of reinforcing beams.
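The core mechanism the abstract describes — representing a smooth surface with radial basis functions and fitting it by least-squares regression — can be sketched independently of Pucher's equation. Everything below (function names, the Gaussian kernel width, the test surface) is our own illustration, not the authors' code.

```python
import numpy as np

def rbf_design_matrix(points, centers, eps=1.5):
    """Gaussian RBF features: phi_ij = exp(-eps^2 * ||x_i - c_j||^2)."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-(eps ** 2) * d2)

# Sample a known smooth surface and recover it from fitted RBF weights.
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(200, 2))
z = np.sin(np.pi * pts[:, 0]) * np.cos(np.pi * pts[:, 1])  # "shape" samples

centers = rng.uniform(-1.0, 1.0, size=(60, 2))
Phi = rbf_design_matrix(pts, centers)
w, *_ = np.linalg.lstsq(Phi, z, rcond=None)   # least-squares regression

z_hat = Phi @ w
print("max abs fit error:", np.max(np.abs(z_hat - z)))
```

The same least-squares machinery would fit the stress function; the paper's contribution is coupling the two through the governing equation, which this sketch deliberately omits.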
Article
Full-text available
In this study, the application and efficiency of electrocoagulation (EC) in removing nickel and iron from real power-plant wastewater were evaluated. An Fe electrode (St 12) was used as both anode and cathode, connected in a parallel monopolar configuration. Tests were conducted in two phases. Phase I mainly focused on varying the parameter ranges to establish a feasible operating window and approach the optimum efficiency. A stirred batch reactor was used to define the test parameter ranges, including initial pH, electrical current (A), electrode distance (cm), and electrolysis time (min). Phase II was then conducted using response surface methodology to design and optimise the operational parameters. The outcomes indicated that operating at pH = 8.1, d = 1 cm, I = 1.5 A, and t = 18 min can result in 99.9% removal efficiency. Remarkably, the high correlation between experimental and predicted values (R² = 99.5% and 99.6% for iron and nickel, respectively) demonstrated that the EC process is a promising method for removing heavy metals from power plant wastewater.
Article
Full-text available
The new method presented in this paper falls into the category of sampling methods and model management for surrogate-based optimization. It was introduced in order to reach the global optimum with a limited number of computer experiments. Particle Swarm Optimization (PSO) is used as a smart sampling tool to construct the metamodel; with their stochastic nature, such methods can also overcome the problem of local minima. To improve the efficiency and accuracy of the Kriging metamodel, a knowledge database with smart sampling methods was integrated into the optimization model management, avoiding unnecessary finite element calculations and enriching the sample set at each optimization iteration. This makes it possible to reduce the sampling size while increasing the accuracy of the metamodel. For validation of the developed method, benchmark functions with diverse features were chosen and successfully minimized. Finally, a practical engineering optimization problem in polymer extrusion was solved with the suggested Kriging Swarm Optimization (KSO) algorithm. In this procedure, Finite Element Analysis (FEA) was used to simulate non-isothermal, non-Newtonian flow, and the polymer extrusion results were used to gather information from design-space samples for the Kriging model.
Article
Full-text available
The purpose of the research was to develop a method for optimizing plane steel frames over discrete sets of parameters while accounting for possible emergency actions. The search for a solution was carried out using a genetic algorithm. The computational scheme included the following main steps: finding the optimal configuration under the conditions of normal use of the building; analyzing this design in static and dynamic formulations under assumed local damage, with estimation of a dynamic coefficient for each considered emergency action; and optimal design of the structure with analyses of the damaged rod systems in a quasi-static formulation using the obtained dynamic coefficients. The transient dynamic analysis was executed in a physically and geometrically nonlinear formulation using the associated flow rule. The static analysis was implemented within the framework of deformation plasticity theory, accounting for the influence of normal forces on the bending of the rods. An example of the optimal synthesis of a three-span plane frame with sudden damage to any of its column supports is considered. The proposed methodology can be recommended for the design of buildings and structures requiring a high level of safety.
Article
Full-text available
The efficiency of thin-walled structures involves a number of trade-offs. The best of the existing structural variants must be selected according to the optimization criteria, and the search is conducted by varying the parameters in parametric optimization. As a rule, the aim of building-structure optimization is to reduce material consumption, labor input, and cost; the given (total) cost most fully describes a particular construction variant. There are two types of optimization parameters: fixed and varying. The result of optimizing thin-walled beams is the combination of parameters, for each design situation, that provides the required strength and the minimum of the objective function, the factory cost of production.
Article
Full-text available
This article describes the construction of a genetic parametric-optimization algorithm for trusses that can account for the cost of manufacturing the nodal joints of the rods. Accounting for nodal joints during parametric design synthesis makes it possible to find bearing systems with a rational production cost. Parametric optimization was performed using a modified genetic algorithm with constraints on the strength, stiffness, and stability of the bearing system, employing multipoint crossover and mutation operators and weakly interacting populations. The objective function takes into account the specific manufacturing of nodal joints for trusses whose members have paired-angle sections. The cost of the nodes is calculated from the labor and materials used in fabricating the welded joints. A computational scheme for optimizing steel trusses of rods with paired-angle sections and welded nodal joints is developed. The proposed iterative procedure is based on an efficient evolutionary algorithm for the parametric synthesis of bearing systems that takes into account the cost of materials and labor for producing the structure. An example of the optimal design of a secondary truss in a frame building is considered, and the solutions obtained with and without accounting for the cost of nodes are compared. The example confirms the efficiency of the proposed computational procedures.
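One ingredient named in the abstract, the multipoint crossover operator, has a compact generic form. The sketch below applies it to binary-encoded design variables; the encoding and helper names are hypothetical, not taken from the paper.

```python
import random

def multipoint_crossover(parent_a, parent_b, points, rng):
    """Swap alternating segments between two equal-length genomes
    at `points` randomly chosen cut sites."""
    assert len(parent_a) == len(parent_b)
    cuts = sorted(rng.sample(range(1, len(parent_a)), points))
    child_a, child_b = list(parent_a), list(parent_b)
    swap, prev = False, 0
    for cut in cuts + [len(parent_a)]:
        if swap:  # exchange this segment between the children
            child_a[prev:cut], child_b[prev:cut] = (
                child_b[prev:cut], child_a[prev:cut])
        swap = not swap
        prev = cut
    return child_a, child_b

rng = random.Random(0)
a, b = [0] * 12, [1] * 12
ca, cb = multipoint_crossover(a, b, points=3, rng=rng)
print(ca, cb)  # complementary runs of 0s and 1s
```

With all-zero and all-one parents, each child position comes from exactly one parent, which makes the segment exchange easy to see.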
Article
Full-text available
Computation-intensive design problems are becoming increasingly common, especially for large-size manufacturers such as those in the aerospace, automotive, and electronics industries. The computation burden is often caused by expensive analysis and simulation processes. Approximation or metamodeling techniques are often used to model these computation-intensive processes in order to improve efficiency. This work will review the current state-of-the-art on metamodeling-based techniques in support of product design. The review is organized from a practitioner's perspective according to the role of metamodeling in supporting design. Challenges and future development of metamodeling will also be analyzed and discussed.
Article
Full-text available
This paper summarizes the discussion at the Approximation Methods Panel that was held at the 9th AIAA/ISSMO Symposium on Multidisciplinary Analysis & Optimization in Atlanta, GA on September 2–4, 2002. The objective of the panel was to discuss the current state-of-the-art of approximation methods and identify future research directions important to the community. The panel consisted of five representatives from industry and government: (1) Andrew J. Booker from The Boeing Company, (2) Dipankar Ghosh from Vanderplaats Research & Development, (3) Anthony A. Giunta from Sandia National Laboratories, (4) Patrick N. Koch from Engineous Software, Inc., and (5) Ren-Jye Yang from Ford Motor Company. Each panelist was asked to (i) give one or two brief examples of typical uses of approximation methods by his company, (ii) describe the current state-of-the-art of these methods used by his company, (iii) describe the current challenges in the use and adoption of approximation methods within his company, and (iv) identify future research directions in approximation methods. Several common themes arose from the discussion, including differentiating between design of experiments and design and analysis of computer experiments, visualizing experimental results and data from approximation models, capturing uncertainty with approximation methods, and handling problems with large numbers of variables. These are discussed in turn along with the future directions identified by the panelists, which emphasized educating engineers in using approximation methods.
Article
Full-text available
Optimization integrated with either a one-step solver (low-fidelity model) or an incremental nonlinear finite element solver (high-fidelity model) has gained increasing popularity in the design of sheet metal forming processes to improve product quality and shorten lead time. However, the one-step solver directly incorporated into optimization may result in inadequate design precision, while the incremental method often leads to prohibitively low computing efficiency. To take full advantage of both solvers, we present a variable-fidelity algorithm that integrates the one-step solver with the incremental solver for optimizing the sheet metal forming process. In the variable-fidelity method, we first determine the difference between the two solvers at some predefined experimental points, and then construct a correction function, using surrogate models, to compensate the responses of the one-step solver at other points. Different surrogate models, such as response surface methodology (RSM), Kriging (KRG), radial basis functions (RBF), and support vector regression (SVR), are considered and compared for the best modeling accuracy. The corrected low-fidelity model can then stand in for the high-fidelity model in the optimization process. We adopt the artificial bee colony (ABC) algorithm to obtain the global optimum. To demonstrate the capability of the variable-fidelity method combined with the ABC algorithm, the optimal design of draw-bead restraining forces for an automobile inner panel is presented. The results show that optimization with the variable-fidelity method significantly improves the computational efficiency and the formability of the workpiece.
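The additive-correction idea at the heart of such variable-fidelity schemes — fit a surrogate to the discrepancy between high- and low-fidelity responses at a few sample points, then correct the cheap model elsewhere — can be sketched with toy stand-ins for the two solvers. Everything here is illustrative; the paper uses FE solvers and RSM/Kriging/RBF/SVR surrogates rather than this polynomial fit.

```python
import numpy as np

def f_high(x):   # stand-in for the expensive incremental solver
    return np.sin(2 * x) + 0.2 * x ** 2

def f_low(x):    # stand-in for the cheap but biased one-step solver
    return np.sin(2 * x)

# Fit a quadratic surrogate to the discrepancy at a few "experiments".
x_train = np.linspace(0.0, 2.0, 5)
delta = f_high(x_train) - f_low(x_train)
coeffs = np.polyfit(x_train, delta, deg=2)

def f_corrected(x):
    """Low-fidelity model plus the fitted discrepancy surrogate."""
    return f_low(x) + np.polyval(coeffs, x)

x_test = np.linspace(0.0, 2.0, 50)
err = np.max(np.abs(f_corrected(x_test) - f_high(x_test)))
print("max correction error:", err)
```

Here the discrepancy is exactly quadratic, so the corrected model reproduces the "high-fidelity" response almost exactly; in practice the surrogate only approximates the discrepancy, and the optimizer (the ABC algorithm in the paper) queries `f_corrected` instead of the expensive solver.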
Article
Full-text available
A new optimisation methodology for the design of coat-hanger dies is presented, with two approaches to optimising the velocity distribution across the die exit. In the first approach, we predict the optimal shape of the coat-hanger die; in the second, to keep the same geometry and avoid designing a new die, we optimise the regulation temperature in a heterogeneous way. The method couples three-dimensional finite element simulation software with an optimisation strategy based on the Sequential Quadratic Programming algorithm and a global response surface method with Kriging interpolation.
Article
Full-text available
This paper addresses the difficulty of the previously developed Adaptive Response Surface Method (ARSM) for high-dimensional design problems. The ARSM was developed to search for the global design optimum for computation-intensive design problems. This method utilizes Central Composite Design (CCD), which results in an exponentially increasing number of required design experiments. In addition, the ARSM generates a complete new set of CCD samples in a gradually reduced design space. These two factors greatly undermine the efficiency of the ARSM. In this work, Latin Hypercube Design (LHD) is utilized to generate saturated design experiments. Because of the use of LHD, historical design experiments can be inherited in later iterations. As a result, ARSM only requires a limited number of design experiments even for high-dimensional design problems. The improved ARSM is tested using a group of standard test problems and then applied to an engineering design problem. In both testing and design application, significant improvement in the efficiency of ARSM is realized. The improved ARSM demonstrates strong potential to be a practical global optimization tool for computation-intensive design problems. Inheriting LHD samples, as a general sampling strategy, can be integrated into other approximation-based design optimization methodologies.
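A minimal sketch of the Latin Hypercube Design sampling the improved ARSM relies on: each variable's range is split into n equal bins, one sample is placed per bin, and the bin order is shuffled independently per dimension. The function name and jittering choice are ours; library implementations such as `scipy.stats.qmc.LatinHypercube` offer more options.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    rng = np.random.default_rng(rng)
    # One point per stratum, jittered within its bin...
    u = (rng.random((n_samples, n_dims))
         + np.arange(n_samples)[:, None]) / n_samples
    # ...then permute the strata independently in each dimension.
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])
    return u  # samples in the unit hypercube [0, 1)^d

X = latin_hypercube(10, 3, rng=42)
# Stratification check: each dimension gets exactly one sample per decile.
print(sorted(np.floor(X[:, 0] * 10).astype(int).tolist()))
```

Unlike Central Composite Design, the number of points here is chosen freely rather than growing exponentially with dimension, which is precisely the efficiency gain the abstract describes.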
Article
Full-text available
This article reviews Kriging (also called spatial correlation modeling). It presents the basic Kriging assumptions and formulas—contrasting Kriging and classic linear regression metamodels. Furthermore, it extends Kriging to random simulation, and discusses bootstrapping to estimate the variance of the Kriging predictor. Besides classic one-shot statistical designs such as Latin Hypercube Sampling, it reviews sequentialized and customized designs for sensitivity analysis and optimization. It ends with topics for future research.
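The basic Kriging predictor the review contrasts with linear regression can be written in a few lines. The sketch below assumes a fixed Gaussian correlation and a known zero mean (simple Kriging), skipping the trend terms and hyperparameter estimation a real implementation needs.

```python
import numpy as np

def corr(a, b, theta=10.0):
    """Gaussian correlation: R_ij = exp(-theta * (a_i - b_j)^2)."""
    return np.exp(-theta * (a[:, None] - b[None, :]) ** 2)

x_obs = np.array([0.0, 0.3, 0.6, 1.0])
y_obs = np.sin(2 * np.pi * x_obs)

# Small nugget on the diagonal for numerical stability.
R = corr(x_obs, x_obs) + 1e-10 * np.eye(len(x_obs))
weights = np.linalg.solve(R, y_obs)

def predict(x_new):
    """Kriging predictor: r(x)^T R^{-1} y."""
    return corr(x_new, x_obs) @ weights

# Unlike a regression metamodel, the predictor interpolates the
# observations exactly (up to the nugget).
print(predict(x_obs) - y_obs)
```

This exact-interpolation property is the key contrast with classic linear regression metamodels that the article draws, and the nugget term is where its extension to random (noisy) simulation begins.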
Article
Full-text available
Simulation is a widely applied tool to study and evaluate complex systems. Due to the stochastic and complex nature of real-world systems, simulation models for these systems are often difficult to build and time-consuming to run. Metamodels are mathematical approximations of simulation models, and have frequently been used to reduce the computational burden associated with running such simulation models. In this paper, we propose to incorporate metamodels into Decision Support Systems to improve their efficiency and enable larger and more complex models to be effectively analyzed. To evaluate the different metamodel types, a systematic comparison is first conducted to analyze the strengths and weaknesses of five popular metamodeling techniques (Artificial Neural Network, Radial Basis Function, Support Vector Regression, Kriging, and Multivariate Adaptive Regression Splines) for stochastic simulation problems. The results show that Support Vector Regression achieves the best performance in terms of accuracy and robustness. We further propose a general optimization framework, GA-META, which integrates metamodels into the Genetic Algorithm to improve the efficiency and reliability of the decision-making process. This approach is illustrated with a job shop design problem. The results indicate that GA-Support Vector Regression achieves the best solution among the metamodels.
Conference Paper
Full-text available
Evolutionary computation techniques such as genetic algorithms, evolutionary strategies and genetic programming are motivated by the evolution of nature. A population of individuals, which encode the problem solutions, is manipulated according to the rule of survival of the fittest through “genetic” operations such as mutation, crossover and reproduction, and a best solution is evolved through the generations. In contrast to evolutionary computation techniques, Eberhart and Kennedy developed a different algorithm by simulating social behavior (R.C. Eberhart et al., 1996; R.C. Eberhart and J. Kennedy, 1996; J. Kennedy and R.C. Eberhart, 1995; J. Kennedy, 1997). As in the other algorithms, a population of individuals exists. This algorithm is called particle swarm optimization (PSO) since it resembles a school of flying birds. In a particle swarm optimizer, instead of using genetic operators, the individuals are “evolved” by cooperation and competition among themselves through generations. Each particle adjusts its flying according to its own flying experience and its companions' flying experience. We introduce a new parameter, called inertia weight, into the original particle swarm optimizer. Simulations have been done to illustrate the significant and effective impact of this new parameter on the particle swarm optimizer.
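The inertia-weight update this abstract introduces is compactly expressible: v ← w·v + c1·r1·(pbest − x) + c2·r2·(gbest − x), then x ← x + v. The following sketch runs the gbest variant on a toy sphere function; the coefficient values are common defaults, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, dim = 20, 2
w, c1, c2 = 0.7, 1.5, 1.5   # inertia weight and acceleration coefficients

def sphere(X):               # toy objective: minimum 0 at the origin
    return (X ** 2).sum(axis=1)

X = rng.uniform(-5, 5, (n, dim))
V = np.zeros((n, dim))
pbest, pbest_f = X.copy(), sphere(X)
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    # Inertia-weight velocity update, then position update.
    V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
    X = X + V
    f = sphere(X)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = X[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best value found:", pbest_f.min())
```

With w < 1 the velocities contract over time, shifting the swarm from exploration to exploitation, which is the balancing role the inertia weight plays.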
Article
Modern additive manufacturing enables the fabrication of innovative lattice-structure designs with enhanced mechanical properties. Characterizing the designability of lattice geometries and the corresponding mechanical performance, particularly compressive strength, is vital to commercializing lattice structures for lightweight components. In this paper, a novel hybrid type of lattice structure was developed, inspired by the overlapping pattern of scales on the dermal layers of species such as fish and by the circular patterns observed in bamboo. Lattice cells were designed with 20%, 30%, 40%, and 50% overlapping areas based on the proposed circular area, with unit-cell wall thickness varying between 0.4 mm and 0.6 mm. The designed structures, with a constant overall size of 40 × 40 × 40 mm and varied circular diameters, overlapping areas, and unit wall thicknesses, were modeled in Fusion 360 software. The lattice structures were fabricated on a vat-polymerization 3D printer using the stereolithography (SLA) technique. The 3D-printed specimens were then tested for quasi-static compressive response in a universal testing machine (UTM) following ASTM standards, and the test results were evaluated and compared with simulation results. Finally, the best lattice structure was selected for energy-absorption applications in the aerospace and defense sectors.
Article
Some biological carapaces in the natural world have distinctive structures with excellent impact resistance at low weight, properties that can compensate for the shortcomings of traditional protective equipment in terms of wearing flexibility and energy absorption. A bioinspired hierarchical armor is designed in this article based on the internal filling layer of the biological carapace. The outer layer of the armor is a hard front plate of ultra-high-molecular-weight polyethylene, while the soft layer is silicone rubber filled with a viscous fluid. The impact responses of fluid-filled armors with different soft layers are studied under low-velocity impact loads. The resistance behavior of the armor with a multicell filling layer is analyzed stage by stage using experimental data and high-speed camera images. The research found that the synergistic effect of the hard layer, the soft layer, and the substrate allows the armor to absorb sufficient impact energy and reduce blunt injuries while retaining a degree of wearing flexibility. Furthermore, the fluid in the soft layer remedies the local excessive intrusion observed in previous bioinspired protection studies. The study can serve as a reference for the design of new protective equipment.
Article
What is the optimum corner radius for a regular hexagonal honeycomb under in-plane, quasi-static compression loading? Drawing inspiration from social insect hexagonal cell nests, where a non-zero corner radius appears to be a design feature, a 400-point design of experiments study is conducted using 2D plane strain Finite Element Analysis (FEA) to study the influence of cell size, beam thickness and corner radius on effective modulus and maximum corner stress in the honeycomb cells. An experimental study is conducted to examine these relationships beyond the bounds of the small-deformation, linear elastic FEA analysis. The study finds that corner radii always increase the effective modulus of the honeycomb, even after accounting for the additional mass associated with the corner fillet. A key finding of this work is that while a corner radius also reduces the maximum corner stress, there exists a clear optimum, beyond which stresses rise again as the corner radius increases in magnitude. This optimum corner radius is shown to be a function of the beam thickness (t) to beam length (l) ratio (t/l), with the optimum value increasing with increasing t/l. The experimental study shows that the presence of a corner radius shifts the failure mechanism from nodal fracture to plastic hinging for honeycombs with thick beams, and may have benefits for energy absorption applications. This work makes the case for treating the corner radius as an independent design feature for optimization in the wider context of cellular materials, and has implications for the study of the geometry of insect nests.
Article
Real-time hybrid simulation (RTHS) integrates numerical modeling of analytical substructures with physical testing of experimental substructures, enabling global system responses to be efficiently evaluated through component testing in size-limited laboratories. Traditional practice of RTHS focuses on response evaluation of structures without considering their uncertainties. A cross-validation (CV)-Voronoi based adaptive sampling strategy is explored in this study for global meta-modeling of multiple response quantities of interest through RTHS for engineering systems with uncertainties. Starting from the Kriging meta-model built on initial samples, the CV-Voronoi based adaptive sampling sequentially identifies the sample points for RTHS tests in the laboratory, and the observed responses of interest are used to update the Kriging meta-model. Multiple response distributions under structural uncertainties are thus acquired through a limited number of experiments. RTHS tests of a two-degree-of-freedom system with self-centering viscous dampers (SC-VDs) are conducted in this study to experimentally evaluate the effectiveness of the CV-Voronoi based adaptive sampling for multiple response estimation. The accuracy of the multi-response meta-models is further evaluated through comparison with validation tests. A stopping criterion is finally proposed for more efficient implementation of the adaptive sampling strategy. It is demonstrated that the CV-Voronoi based adaptive sampling strategy provides a viable technique for accurate global meta-modeling and estimation of multiple responses with a limited number of RTHS tests in the laboratory.
Article
This paper investigates reliability-based topology optimization (RBTO) and proposes a novel quantile-based topology optimization (QBTO) method. With this method, the traditional RBTO model is transformed into an equivalent quantile-based formulation, which avoids the issues of RBTO with Monte Carlo simulation (MCS), i.e., stagnation of the optimizer due to near-zero sensitivities of the probabilistic constraint with respect to element densities, the huge computational cost of calculating sensitivities, and the discontinuity of the failure indicator function. Specifically, in QBTO, the sensitivities need to be calculated only at the sample corresponding to the quantile instead of at all MCS samples, which drastically reduces the computational effort. Furthermore, a sequential update strategy for the Kriging metamodel is developed to efficiently calculate the quantile by evaluating the true constraint at a few samples rather than all MCS samples. The high accuracy and efficiency of QBTO are validated on truss, beam, and bridge problems.
Article
As a powerful optimization technique, multi-objective particle swarm optimization algorithms have been widely used in various fields. However, performing well in terms of convergence and diversity simultaneously is still a challenging task for most existing algorithms. In this paper, a multi-objective particle swarm optimization algorithm based on two-archive mechanism (MOPSO_TA) is proposed for the above challenge. First, two archives, including convergence archive (CA) and diversity archive (DA) are designed to emphasize convergence and diversity separately. On one hand, particles are updated by indicator-based scheme to provide selection pressure toward the optimal direction in CA. On the other hand, shift-based density estimation and similarity measure are adopted to preserve diverse candidate solutions in DA. Second, the genetic operators are conducted on particles from CA and DA to further enhance the quality of solutions as global leaders. Then the search ability of MOPSO_TA can be improved by performing hybrid operators. Furthermore, to balance global exploration and local exploitation of MOPSO_TA, a flight parameters adjustment mechanism is developed based on the evolutionary information. Finally, the proposed algorithm is compared experimentally with several representative multi-objective optimization algorithms on 21 benchmark functions. The experimental results demonstrate the competitiveness and effectiveness of the proposed method.
Article
Cellular structures in nature have attracted great attention in the field of structural optimization. This paper proposes a Kriging-assisted topology optimization method for design of functionally graded cellular structures (FGCS), which are infilled by smoothly-varying lattice unit cells (LUCs). Specifically, LUCs are depicted by level set functions and the shape interpolation method is employed to generate sample LUCs. Then, one Kriging metamodel is constructed to predict the mechanical properties of LUCs within FGCS, so as to reduce the computational expense involved in finite element analysis of LUCs. Meanwhile, the other Kriging metamodel is created to predict the values of shape interpolation function, and a Kriging-assisted morphological post-process method is put forward to achieve the smooth transition between adjacent graded LUCs. In the proposed method, the effective densities, mechanical properties, and geometrical configurations of LUCs are coupled by Kriging metamodels, so that the multiscale design of FGCS can be realized at a low computational burden by optimizing the distribution of elemental densities, and this also paves the way for design of FGCS with irregular geometries by morphological post-process and geometry reconstruction. Numerical examples are presented to validate the accuracy and effectiveness of the proposed method for design of FGCS. What is more, the design of a pillow bracket with a slightly complex geometry is provided to illustrate the engineering application of the proposed method. The results indicate that the proposed method is effective and universal for designing FGCS with smoothly-varying LUCs.
Article
Tubular solar still is a simple, lightweight desalination unit with a large condensing surface compared with other types of solar stills. Regrettably, it suffers from low water yield like other solar stills. In this work, two main research themes are studied. The first is enhancing the water yield and thermal efficiency of the tubular solar still by equipping the absorber plate with an electrical heater powered by a solar photovoltaic panel. The performance of the modified solar still is evaluated based on its water yield as well as its energy and exergy efficiencies. The second is developing a fine-tuned artificial intelligence model to predict the thermal efficiency and water yield of the solar still. The fine-tuned model consists of a traditional artificial neural network model optimized by a metaheuristic optimizer called the humpback whale optimizer. The prediction accuracy of the developed model is compared with that of the standalone artificial neural network model and of a model optimized with a traditional particle swarm optimizer. The results showed that the conventional tubular solar still produces an average accumulated water yield of 2.58 L/m²/day, while the modified tubular solar still produces 3.41 L/m²/day, a 31.85% improvement. The daytime energy efficiency of the modified tubular solar still is 38.61%, whereas that of the conventional one is only 30.67%. Moreover, the newly developed model has the highest prediction accuracy among the investigated models: the model optimized with the humpback whale optimizer has the highest correlation coefficient, ranging between 0.983 and 0.999; the model optimized with the particle swarm optimizer has a moderate correlation coefficient, between 0.969 and 0.987; and the standalone model has the lowest, ranging between 0.594 and 0.937.
These results reveal the vital role of the electrical heater in enhancing the thermal performance of the solar still and the important role of the humpback whale optimizer in improving the prediction accuracy of traditional neural network models.
Article
The paper presents an integrated analytical, numerical and experimental study on a kind of one-dimensional finite aperiodic metastructure, with mass-in-mass unit cells optimized for broadband wave attenuation via a genetic algorithm. The study begins with the analytic wave solution of the aperiodic structure and numerically validates the correctness of the wave solution in both the frequency domain and the time domain. Then, the paper outlines the optimization scheme for connecting multiple separated narrow bandgaps into a wider continuous one, including the objective function for the wave attenuation, the design variables of the inner masses, and their constraints. The numerical studies show successful broadband wave attenuation with low vibration transmissibility in one direction and in two opposite directions, respectively, based on the genetic algorithm method. The maximal attenuation bandwidth of the optimized aperiodic structure increases by about 90% compared with the conventional repetitive local resonance, without any mass increase. Finally, the paper gives experimental studies of a 3D-printed lattice structure with an adjustable inner mass in each cell. The measured vibrations of the fabricated structure experimentally validate the optimized aperiodic structure with broadband wave attenuation, as well as the effectiveness of the proposed optimization method.
Article
This manuscript references more than 150 papers on the energy absorption of auxetic structures and shows how several authors have approached the subject and how research in this field has progressed. It can be noted from the present paper that additive manufacturing has been an ally of researchers in sample manufacturing, and that numerical analysis has also been widely used by the authors. In addition, this manuscript provides context for auxetic structures, discussing some cell models. The results obtained here serve as additional guidelines to assist engineers and designers in the development of auxetic structures.
Article
Over the past few decades, modeling, simulation, and optimization tools have received attention for their ability to represent and improve complex systems. The use of metamodeling techniques in optimization-via-simulation problems has grown considerably in recent years, promoting more robust and agile decision-making by determining the best scenario in the solution space. The objective of this article is to conduct a systematic literature review of metamodeling-based simulation optimization (MBSO). The main contributions of this paper are to systematically gather, analyze, and discuss the knowledge disseminated in this area, support future research, and expand the literature related to MBSO techniques. Research questions were planned to assist MBSO researchers and practitioners by presenting the most frequent contexts, applications, methods, tools, and metrics found in metamodeling studies on optimization via simulation. We considered papers published in scientific journals and listed in the Web of Science, Scopus, ACM Digital Library, IEEE Xplore, and Science Direct databases. The conclusion relates the gaps, opportunities, and future perspectives found during the development of this research, suggesting that this research area has been growing over the past 15 years.
Article
Fused Deposition Modelling (FDM) is a fast-emerging technology due to its capacity to generate usable components with multiple geometrical designs in a fair period of time without the use of any tooling or human interaction. The features and reliability of FDM-fabricated parts depend strongly on a small number of processing variables and their settings. The current study examines the relationship between five significant processing parameters, i.e., raster angle, part orientation, air gap, layer thickness, and raster width, and their effect on the dimensional accuracy of the fabricated part. Twenty-seven experiments were conducted and configured using Taguchi's design, and ten recently formulated nature-inspired optimization algorithms were utilized to predict the optimal setting of the experiment. A comparative inspection of these algorithms on FDM-printed parts was performed in this study, which reported part orientation as the most significant factor.
Article
With the development of additive manufacturing (AM), research interest is currently focused on lattice structures development due to their interesting mechanical properties. It implies the opportunity at the engineering level to be able to specify – beyond the shapes – mechanical properties distributed in the space to be manufactured. The article aims at introducing a design and optimization framework for AM, which highlights variable-density lattice structures. By processing both a topology optimization within a rough design space and a design of experiments driven parametric optimization, the development process of suitable and specific strength structures for AM becomes seamless and efficient.
Article
Compact heat exchangers have become one of the most effective techniques for heat removal. In this paper, a numerical scheme to simulate conjugate heat transfer in an additively manufactured heat exchanger is presented. A lattice structure, treated as a porous medium, is used as an effective cooling technique for heat removal. The objective of these lattices is to transmit heat from the hot part to the cold part while letting the system benefit from two phenomena, heat conduction and convection. An optimization procedure using the response surface method is proposed for the presented compact heat exchanger to improve its efficiency. This method was introduced in order to reach the global optimum with a limited number of computer experiments. Two optimization variables are identified: the edge thickness of the lattices and the inlet velocity. A constrained optimization problem is formulated to maximize the heat flux through the heat exchanger while, at the same time, limiting the increase of the pressure drop and the decrease of the exit temperature.
Article
The quality of a product obtained by the hydroforming process is influenced by geometrical, material, and process parameters. In this paper, to predict an acceptable T-shaped tube with minimum wall thickness variations that fulfils the industrial requirements, a methodology is proposed based on the coupling of three-dimensional incremental finite element simulation, using an explicit dynamic approach, with an automatic surrogate model. The surrogate model is based on an adaptive moving target zone and on both the Moving Least Squares (MLS) and Kriging techniques. The optimization results are presented and compared in terms of efficiency. Five quality criteria are used: an objective function defining the thickness variation and four nonlinear constraint functions to reduce the risk of necking and to fulfil the industrial requirements. The proposed approach provides a numerical estimation of the "best" tool dimensions and the ideal punch stroke in order to obtain a final feasible workpiece.
Article
In this paper, the authors present an efficient procedure for optimal placement of poles in rational approximations by Müntz-Laguerre functions. The technique is formulated as the minimization of a quadratic criterion and the linear equations involved are efficiently expressed using the orthonormal basis functions. The presented technique has direct application in rational approximation and model order reduction of large-degree or infinite-dimensional systems.
Article
The objective of this article is to determine a wire coating-hanger melt distributor geometry to ensure a homogenous exit velocity distribution that will best accommodate a wide material range and multiple operating conditions (i.e., die wall temperature and flow rate change). The computational approach incorporates finite element (FE) analysis to evaluate the performance of a die design and includes a nonlinear constrained optimization algorithm based on the Kriging interpolation and sequential quadratic programming algorithm to update the die geometry. Two optimization problems are then solved, and the best solution is taken into account to manufacture the optimal distributor. The Taguchi method is used to investigate the effect of the operating conditions, i.e., melt and die wall temperature, flow rate and material change, on the velocity distribution for the optimal die. In the example chosen, the wire coating die geometry is optimized by taking into account the geometrical limitations imposed by the tool geometry. Finally, the FE analysis and optimization results are validated by comparison with the experimental data obtained with the optimal die. The purpose of the experiments described below is to investigate the effect of material change. POLYM. ENG. SCI., 2012. © 2012 Society of Plastics Engineers
Article
The production of high-strength clinched joints is the ultimate goal of the manufacturing industry. The determination of optimum tool shapes in the clinch forming process is needed to achieve the required high strength of clinched joints. The design of the tools (punch and die) is crucial since the strength of the clinched joints is closely related to the tool geometry. To increase the strength of clinched joints, an optimisation procedure using the response surface methodology, based on an adaptive moving target zone, is presented. The cost function studied here is defined in terms of the maximum value of the tensile force computed during the simulation of sheet separation. Limitations on the geometrical parameters due to feasibility issues are also taken into account. Kriging interpolation is used to provide an approximation to the optimisation problem and to build the response surfaces.
Article
The requirement of repeated evaluation of structural responses in a typical sensitivity-based Finite Element Model Updating (FEMU) procedure limits its application to large structures. The least-squares method (LSM) based response surface method (RSM) is applied as a potential alternative for response approximation in the iterative model updating procedure. However, the LSM is a major source of error in response prediction, and the moving least-squares method (MLSM) is found to be more efficient in this regard. An attempt has been made in the present study to explore the effectiveness of MLSM-based RSM in FEMU. A comparative assessment is performed between the MLSM-based and the conventional LSM-based RSM for model parameter updating. The comparative study is illustrated with the help of two example problems using artificially generated input responses. It is generally observed that the MLSM-based RSM identifies the model parameters better than the LSM-based approach.
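The moving variant of least squares is easy to see in code (a minimal NumPy sketch under an assumed Gaussian weight function and linear basis; not the paper's actual formulation):

```python
import numpy as np

def mls_predict(X, y, x0, h=1.0):
    # Moving least squares: a weighted least-squares fit with a linear basis
    # [1, x], re-solved at every query point x0 with Gaussian weights, so the
    # fit "moves" with the evaluation point instead of being one global model.
    w = np.sqrt(np.exp(-np.sum((X - x0) ** 2, axis=1) / h ** 2))
    B = np.hstack([np.ones((len(X), 1)), X])      # basis evaluated at the samples
    coef, *_ = np.linalg.lstsq(B * w[:, None], y * w, rcond=None)
    return float(np.concatenate(([1.0], x0)) @ coef)
```

On data from an exactly linear response, the prediction reproduces the line at any query point regardless of the bandwidth `h`; the benefit over a global LSM fit appears for responses that are only locally well approximated by the basis.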
Article
Clinching is a very cheap and efficient cold forming process that makes it possible to join two sheets without any additional part such as a rivet or a bolt. This paper deals with the optimization of the final mechanical strength of a joined component using a global optimization technique based on a Kriging meta-model. The optimization process includes both the joining-stage and the mechanical-strength simulations. It is shown that accounting for the plastic strain, residual stresses, and damage occurring during clinching is essential if one wants to study the final mechanical strength of the clinched component. The global optimization leads to a 13.5% increase of the mechanical strength for tensile loading and a 46.5% increase for shear loading. The global optimization procedure also makes it possible to study the influence of input geometrical variability on the final mechanical strength, which is essential from an industrial point of view.
Article
Self-expanding Nitinol (nickel–titanium alloy) stents are tubular, often mesh-like structures, which are expanded inside a diseased (stenosed) artery segment to restore blood flow and keep the vessel open following angioplasty. The super-elastic and shape memory properties of Nitinol reduce the risk of damage to the stent both during delivery into the body and due to accidents while in operation. However, as Nitinol stents are subjected to a long-term cyclic pulsating load due to the heart beating (typically 4 × 10⁷ cycles/year), fatigue fracture may occur. One of the major design requirements in medical implants is the device lifetime or, in engineering terms, fatigue life. In order to improve the mechanical properties of Nitinol stents, first, a reliable finite element analysis (FEA) procedure is established to provide quantitative measures of the stent's strain amplitude and mean strain, which are generated by the cyclic pulsating load. This allows prediction of the device's life and optimization of stent designs. Second, the objective is to optimize the stent design by reducing the strain amplitude and mean strain over the stent. An optimization-based simulation methodology was developed in order to improve the fatigue endurance of the stent. The design optimization approach is based on the Response Surface Method (RSM), used in conjunction with Kriging interpolation and a Sequential Quadratic Programming (SQP) algorithm.
Article
One of the important recent advances in the field of hurricane/storm modeling has been the development of high-fidelity numerical models that facilitate accurate but also computationally intensive simulations of hurricane responses. For efficient implementation in probabilistic hurricane risk assessment, which typically requires simulation of a large number of hurricane scenarios within the coastal regions of interest, combination with metamodeling approaches has recently been proposed. In this work, kriging is investigated for this purpose, focusing on implementation for real-time assessment, i.e., for evaluating risk during an incoming event (prior to the hurricane/storm making landfall), and on facilitating the development of efficient standalone tools. An important characteristic of this application is that the model output is very high dimensional, since the hurricane response is calculated over a large coastal region and potentially at different time instances. This makes it impractical to establish a different metamodel for each output. Considering, though, the potential (spatial and/or temporal) correlation between the different outputs, combination with principal component analysis (PCA) is proposed here. This analysis extracts a much smaller number of latent outputs to approximate the initial high-dimensional output. A separate metamodel is then developed for each latent output. Comparisons between kriging with and without PCA, and between kriging and moving least squares response surface approximations, are discussed in terms of both computational efficiency (speed) and memory requirements. Both are important considerations for standalone tools that can efficiently run on the personal laptops (or even smartphones) of emergency response managers.
The impact of the number of latent outputs considered is investigated in this context, and the optimal selection of basis and correlation functions for the kriging is also discussed. Finally, for calculating risk, the prediction error stemming from the metamodel is explicitly addressed (i.e., the assessment does not rely only on the mean kriging response). The proposed approach is demonstrated for real-time hurricane risk assessment for the Hawaiian Islands, focusing on the region around Oahu.
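The PCA-plus-kriging idea can be sketched in a few lines (an illustrative NumPy toy, not the authors' implementation: the Gaussian correlation, fixed length scale, and simple interpolating weights are assumptions made for the sketch):

```python
import numpy as np

def fit_pca_kriging(Theta, Y, r=2, length=1.0, nugget=1e-8):
    """Fit one simple kriging metamodel per PCA latent output."""
    # PCA: project the centred high-dimensional outputs onto r latent coordinates.
    mean = Y.mean(axis=0)
    _, _, Vt = np.linalg.svd(Y - mean, full_matrices=False)
    basis = Vt[:r]                       # r principal directions, shape (r, m)
    Z = (Y - mean) @ basis.T             # latent training outputs, shape (n, r)

    def corr(A, B):
        # Gaussian correlation between two sets of input points.
        d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=2)
        return np.exp(-d2 / length ** 2)

    K = corr(Theta, Theta) + nugget * np.eye(len(Theta))
    weights = np.linalg.solve(K, Z)      # one weight vector per latent output

    def predict(theta_new):
        k = corr(np.atleast_2d(theta_new), Theta)   # correlations to training inputs
        z = k @ weights                              # predicted latent outputs
        return (mean + z @ basis).ravel()            # back to the full output space
    return predict
```

The key memory saving is that only `r` weight vectors are stored instead of one metamodel per output coordinate, while the PCA basis maps latent predictions back to the full spatial field.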
Article
The design of anisotropic laminated composite structures is very susceptible to changes in loading, fiber orientation angle, and ply thickness. Thus, the optimization of such structures using a reliability index as a constraint is an important problem to be dealt with. This paper addresses the structural optimization of laminated composite materials with a reliability constraint using a genetic algorithm and two types of neural networks. The reliability analysis is performed using one of the following methods: FORM, modified FORM (FORM with multiple checkpoints), standard (direct) Monte Carlo, or Monte Carlo with importance sampling. The optimization is performed using a genetic algorithm. To overcome the high computational cost, Multilayer Perceptron or Radial Basis Function artificial neural networks are used. It is shown, through two examples, that this methodology can be used without loss of accuracy and with large computational time savings, even when dealing with non-linear behavior.
Article
This paper describes a new numerical method for the design of multi-step stamping tools, in which the optimization approach is based on the Response Surface Method (RSM) with Kriging interpolation together with the Sequential Quadratic Programming (SQP) algorithm. The present work attempts to provide a reliable methodology for the optimum design of the forming tools in order to produce a desired part by multi-step stamping within a severe tolerance (0.1 mm). The numerical method is proposed to reduce the number of forming steps and thereby increase the process productivity. To reach this goal, an integrated optimization approach using the commercial finite element code ABAQUS together with an optimization algorithm was developed. The optimization algorithm consists of constructing an explicit form of the objective function in terms of the design variables. To search for the global optimum of the objective function, the SQP algorithm is used. A thin metallic part formed by a manual press without a blank-holder is considered to demonstrate the effectiveness of the optimization approach in obtaining the optimal tool shape in a few iterations.
Article
Viscosity is an important characteristic of the flow properties and processability of polymeric materials. A flat die was developed by Maillefer Extrusion to perform rheological characterisations. In this paper, the rheological parameters of the melt are identified through optimisation by a response surface method. The objective is to minimise the differences between the pressures measured in the flat die and the pressures computed by a one-dimensional finite-difference code. An objective function is defined as the global relative error between measured and computed pressures, and is minimised by varying the rheological parameters. For this minimisation, two methods are used: the local response surface and the global response surface. The identified rheological parameters permit calculation of the viscosity. This calculated viscosity is then compared with an experimental viscosity measured on a capillary rheometer to validate the method.
Article
Simulation metamodeling has been a major research field during the last decade. The main objective has been to provide robust, fast decision support aids to enhance the overall effectiveness of decision-making processes. This paper discusses the importance of simulation metamodeling through artificial neural networks (ANNs) and provides general guidelines for the development of ANN-based simulation metamodels. Such guidelines were successfully applied in the development of two ANNs trained to estimate the manufacturing lead times (MLT) for orders simultaneously processed in a four-machine job shop. The design of intelligent systems such as ANNs may help to avoid some of the drawbacks of traditional computer simulation. Metamodels offer significant advantages regarding time consumption and simplicity when evaluating multi-criteria situations. Their operation is remarkably fast compared with the time required to run conventional simulation packages.
Article
In order to increase the efficiency and accuracy of design optimization, many efforts have been made to study metamodel techniques for effectively representing expensive and complex models. The performance of metamodel-based optimization is largely determined by the sampling method. Mainly because the appropriate sample size is difficult to know a priori, so-called intelligent or sequential sampling has gained popularity in recent years. In this paper, a new kind of sampling mode, boundary-based strategies, is studied. The major characteristic of these intelligent methods is that each new sample is generated based on the boundary and other specified samples. In order to evaluate the efficiency and accuracy of such strategies, a comparative study is implemented with nonlinear mathematical test functions. To validate the feasibility of boundary-based sampling methods, the proposed sampling-strategy-based metamodeling technique is used to optimize real-world problems. Comparison of model predictions and optimization data shows good agreement.
Article
Functions with poles occur in many branches of applied mathematics which involve resonance phenomena. Such functions are challenging to interpolate, in particular in higher dimensions. In this paper we develop a technique for interpolation with quotients of two radial basis function (RBF) expansions to approximate such functions as an alternative to rational approximation. Since the quotient is not uniquely determined, we introduce an additional constraint: the sum of the squared RBF-norms of the numerator and denominator should be minimal, subject to a norm condition on the function values. The method was designed for antenna design applications, and we show by examples that the scattering matrix for a patch antenna as a function of some design parameters can be approximated accurately with the new method. In many cases, e.g. in antenna optimization, the function evaluations are time consuming, and therefore it is important to reduce the number of evaluations but still obtain a good approximation. A sensitivity analysis of the new interpolation technique is carried out, and it gives indications of how efficient adaptation methods could be devised. A family of such methods is evaluated on antenna data, and the results show that much performance can be gained by choosing the right method.
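The building block of the quotient construction is a plain RBF expansion; a minimal interpolant can be sketched as follows (an illustrative NumPy toy with an assumed Gaussian kernel and shape parameter `eps`; the paper's quotient of two expansions and its norm-minimising constraint are not reproduced here):

```python
import numpy as np

def rbf_interpolant(X, y, eps=1.0):
    # Gaussian RBF interpolant s(x) = sum_j c_j * exp(-eps^2 ||x - x_j||^2);
    # the coefficients c are fixed by forcing s to match the data at every centre.
    def phi(A, B):
        d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=2)
        return np.exp(-(eps ** 2) * d2)
    c = np.linalg.solve(phi(X, X), y)
    return lambda x: phi(np.atleast_2d(x), X) @ c
```

A quotient approximation as in the paper would build two such expansions, numerator and denominator, and determine their coefficients jointly; a single smooth expansion like this one struggles precisely on the pole-bearing functions the paper targets.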
Article
Refractory titanium alloys are often used in the aerospace industry for applications requiring high-temperature strength and high mechanical resistance. However, the main problems encountered when machining this kind of material are the low material removal rate and the short cutting tool life. Due to their hard nature, an effective manufacturing strategy is required by industry to optimize cutting parameters and improve machining efficiency. In this work, a new optimization technique for the dry machining of refractory titanium alloys has been developed. The optimisation procedure is a new strategy based on both the response surface method with Kriging interpolation and a sequential quadratic programming algorithm. In comparison with previous models taken from the literature, two functions have been formulated in the machining problem to describe the nonlinear and complex relationship between variables and responses. The first, the objective function, represents the volume of material removed during the cutting tool life and has to be maximised. The second, the constraint function, represents the surface roughness and has to be kept below a critical value. Thanks to this new approach, a robust optimization algorithm has been carried out to optimize cutting conditions taking both productivity and quality into account.
Article
A new method is presented for flexible regression modeling of high dimensional data. The model takes the form of an expansion in product spline basis functions, where the number of basis functions as well as the parameters associated with each one (product degree and knot locations) are automatically determined by the data. This procedure is motivated by the recursive partitioning approach to regression and shares its attractive properties. Unlike recursive partitioning, however, this method produces continuous models with continuous derivatives. It has more power and flexibility to model relationships that are nearly additive or involve interactions in at most a few variables. In addition, the model can be represented in a form that separately identifies the additive contributions and those associated with the different multivariable interactions.
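The product spline basis this abstract refers to (the MARS approach) is built from hinge functions; the following sketch (hypothetical NumPy code; real MARS also performs adaptive forward/backward knot and interaction selection) illustrates just the basis and a least-squares fit with one fixed knot:

```python
import numpy as np

def hinge(x, knot, sign=1.0):
    # Truncated linear ("hinge") basis max(0, sign * (x - knot)); MARS models
    # are sums of products of such one-sided splines.
    return np.maximum(0.0, sign * (x - knot))

def fit_fixed_knot(x, y, t):
    # Least-squares fit of y ~ b0 + b1*h(x; t, +) + b2*h(x; t, -) for a FIXED
    # knot t; MARS itself searches knots and interactions adaptively, so this
    # only illustrates the basis functions it builds from.
    B = np.column_stack([np.ones_like(x), hinge(x, t, 1.0), hinge(x, t, -1.0)])
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return coef, B
```

Because the hinge pair is continuous at the knot, the fitted model is continuous as well, which is the property that distinguishes this approach from recursive partitioning's piecewise-constant fits.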
Article
"Kriging" is the name of a parametric regression method used by hydrologists and mining engineers, among others. Features of the kriging approach are that it also provides an error estimate and that it can conveniently be employed also to estimate the integral of the regression function. In the present work, the kriging method is described and some of its statistical characteristics are explored. Also, some extensions of the nonparametric regression approach are made so that it too displays the kriging features. In particular, a "data driven" estimator of the expected square error is derived. Theoretical and computational comparisons of the kriging and nonparametric regressors are offered.
Flat rod systems: optimization with overall stability control
  • I.N. Serpik
  • A.V. Alekseytsev
  • P.Y. Balabin
  • N.S. Kurchenko
I.N. Serpik, A.V. Alekseytsev, P.Y. Balabin, and N.S. Kurchenko, Flat rod systems: optimization with overall stability control, Mag. Civ. Eng., vol. 76, no. 8, pp. 181-192, 2017. DOI: 10.
Optimization design of flash structure for forging die based on Kriging-PSO strategy
  • Y Zhang
  • Z An
  • J Zhou
Y. Zhang, Z. An, and J. Zhou, Optimization design of flash structure for forging die based on Kriging-PSO strategy, Lecture Notes in Computer Science, vol. 6145 LNCS, no. PART 1, pp. 373-381, 2010. DOI: 10.1007/978-3-642-13495-1_46.