Article

Simultaneous Optimization of Multiple Responses Represented by Polynomial Regression Functions

Authors: A. I. Khuri and M. Conlon

Abstract

An algorithm is developed for the simultaneous optimization of several response functions that depend on the same set of controllable variables and are adequately represented by polynomial regression models of the same degree. The data are first checked for linear dependencies among the responses. If such dependencies exist, a basic set of responses among which no linear functional relationships exist is chosen and used in developing a function that measures the distance of the vector of estimated responses from the estimated “ideal” optimum. This distance function permits the user to account for the variances and covariances of the estimated responses and for the random error variation associated with the estimated ideal optimum. Suitable operating conditions for the simultaneous optimization of the responses are specified by minimizing the prescribed distance function over the experimental region. An extension of the optimization procedure to mixture experiments is also given and the method is illustrated by two examples.
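The distance-function idea described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's exact algorithm: it fits the same quadratic model to two toy responses, estimates the response covariance from the residuals, takes the individual maxima as the estimated "ideal" optimum, and minimizes a Mahalanobis-type distance over the region. The Khuri–Conlon metric additionally accounts for the prediction variance at each point and the random error in the estimated ideal optimum, which this sketch omits; all names and data are illustrative.

```python
# Minimal, illustrative sketch of a distance-function approach (not the paper's
# exact algorithm): fit the same quadratic model to each response, estimate the
# response covariance from the residuals, take the individual optima as the
# "ideal" point, and minimize a Mahalanobis-type distance over the region.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Toy data: 30 runs, 2 controllable variables, 2 responses (both to be maximized)
n = 30
X = rng.uniform(-1.0, 1.0, size=(n, 2))

def design_matrix(X):
    """Full second-order model terms: 1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

Z = design_matrix(X)
Y = np.column_stack([
    10 + 2 * X[:, 0] - X[:, 1] - 3 * X[:, 0]**2 + rng.normal(0, 0.3, n),
    5 - X[:, 0] + 2 * X[:, 1] - 2 * X[:, 1]**2 + rng.normal(0, 0.3, n),
])

B, *_ = np.linalg.lstsq(Z, Y, rcond=None)        # one coefficient column per response
resid = Y - Z @ B
Sigma_hat = resid.T @ resid / (n - Z.shape[1])   # estimated covariance of the responses

def y_hat(x):
    return design_matrix(np.atleast_2d(x))[0] @ B

# "Ideal" optimum: the best value each response attains individually over the region
phi = np.array([
    -minimize(lambda x, j=j: -y_hat(x)[j], np.zeros(2), bounds=[(-1, 1)] * 2).fun
    for j in range(2)
])

Sigma_inv = np.linalg.inv(Sigma_hat)

def distance(x):
    d = y_hat(x) - phi
    return float(np.sqrt(d @ Sigma_inv @ d))

res = minimize(distance, np.zeros(2), bounds=[(-1, 1)] * 2)
print("compromise setting:", res.x, "predicted responses:", y_hat(res.x))
```

The minimizer of the distance gives a compromise operating condition; in the actual method the weighting also reflects the design-dependent prediction variance rather than the residual covariance alone.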

... Thus, the statistical analysis and SPO implementation are accomplished utilizing Minitab 18, which is traditional statistical software. Consequently, the value of this function for the PF in the single destination can be described as follows [66][67][68][69]: ...
... So, the first result for SPO from ANOVA shows that the mesh size and Reynolds number tend to be the most accurate predictors without specifying the type of duct. In addition, the interaction between Reynolds number and duct type is not influential since the P-value in the ANOVA of the current work is higher than 5% [64][65][66][67][68][69]. This interacted behavior is also inactive since the least hydraulic diameter employed reduces Reynolds number. ...
Article
Full-text available
This study presents a detailed Computational fluid dynamics (CFD) analysis, focusing on optimizing laminar flow within non-circular ducts, specifically those with square, rectangular, and triangular configurations. The study centers on the effective use of mesh quality and size in these ducts, a factor previously underrepresented in CFD studies, which have predominantly emphasized turbulent rather than laminar flows. With the help of a finite element approach, this study compares the performance of these non-circular ducts, employing Reynolds numbers ranging from 1600 to 2000 and mesh sizes of 6, 12, and 18 mm. A ribbed duct style, arranged in a hybrid manner, is adopted to further this study. The analysis applied the Single predictive optimization (SPO) technique to identify the K-ε-Standard as the preferred viscosity model and a hybrid rib distribution as optimal within the triangular duct configuration. A Reynolds number of 1600 and a mesh size of 18 mm emerged as the most effective values for this duct style. The attained results of the Analysis of variance (ANOVA) indicated the F-Criterion's insignificance for laminar Reynolds levels, rendering the laminar viscosity model less relevant within the test section. Additionally, the implementation of the Six sigma procedure (SSP) markedly enhanced both the performance factor (PF) and turbulence intensity, which were observed at 4.90% and 146.77%, respectively. This improvement was most notable in the triangular duct, characterized by rib heights of 66 mm (semi-circle), 66 mm (rectangular), and 38.126 mm (triangular).
... Derringer and Suich [21] developed and presented a method to construct an overall desirability using a desirability function - a popular method which necessarily requires the decision maker's preference information, and fails otherwise. [22] developed the distance function method, while the method of process capability index was developed by [23]; both methods were either too problem-specific or situation-specific. [24] presented a priority-based approach for MRSO which takes the response of highest importance as the objective function and treats the rest as constraints - a method earlier suggested independently by [25], and then by [26]; but this method ultimately depends on knowing which response is of highest importance among the lot. ...
... The distance function approach was proposed by [22]. The distance function is ...
Article
Full-text available
Multi-response surface optimization (MRSO) is a problem that is peculiar to an industrial setting, where the aim of a process engineer is to set his process at operating conditions that simultaneously optimize a set of process responses. In Statistics, several methods have been proffered for tackling problems of this nature. Some of such methods are those of: overlapping contour plots, constrained optimization problem, loss function approach, process capability approach, distance function approach, game theory approach, and the desirability function approach. These methods are, however, not without teething flaws, as they are either too problem-specific or require very complex and inflexible routines; little wonder the method of desirability function has gained popularity, especially because it overcomes the latter limitation. In this article, we have proposed and implemented a multivariate-based technique for solving MRSO problems. The technique fused the ideas of response surface methodology (RSM), multivariate multiple regression and Pareto optimality. In our technique, RSM was implemented on an all-maximization problem as a case-study process; in which case, first-order models (FOMs) for the responses were fitted using 2k factorial designs until the FOMs proved to be inadequate, while a uniform precision rotatable central composite design was used to obtain second-order models (SOMs) for the respective responses in the event of model inadequacy of the FOMs. With the implementation of the proposed technique to the case study, optimal operating conditions were obtained, with observations stemming thereof summarized as axioms. The first, second and third axioms respectively stated that: (1) the mid-point of all optimal operating conditions obtained via the proposed technique is Pareto optimal, (2) the mid-point of all optimal responses at the Pareto optimal operating condition is Pareto optimal, and (3) the region bounded by each of the optimal operating conditions from each second-order model (SOM) is a Pareto front.
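As a companion to the Pareto-optimality ideas in this abstract, the following is a minimal, hypothetical sketch of non-dominated filtering of candidate operating conditions in an all-maximization setting. It is the standard dominance test, not the authors' full procedure, and the candidate data are toy values.

```python
# Illustrative non-dominated (Pareto) filter for candidate operating conditions
# in an all-maximization setting. Data are toy values, not the case study's.
import numpy as np

def pareto_mask(points):
    """Return a boolean mask of non-dominated rows (all objectives maximized)."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if not mask[i]:
            continue
        # a point is dominated if another point is >= in all objectives and > in at least one
        dominated = np.all(pts >= pts[i], axis=1) & np.any(pts > pts[i], axis=1)
        if dominated.any():
            mask[i] = False
    return mask

# Toy predicted responses (rows = candidate settings, columns = responses to maximize)
candidates = np.array([[0.80, 0.60], [0.70, 0.75], [0.65, 0.55], [0.85, 0.50]])
print(candidates[pareto_mask(candidates)])
```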
... [4]. On the other hand, Khuri and Conlon (1981) used a method based on a multivariate regression model [5]. In addition, Su and Tong (1997) tried to solve the multiple response problem with the help of Principal Component Analysis (PCA) [6]. ...
Article
Full-text available
Multiple response optimization is a widely used method for increasing product quality and optimizing cost and time in industry. However, technological developments and processes are becoming more and more complex, which means that more than one response, rather than a single response, is relevant in product or process optimization. The Response Surface Methodology (RSM) can be used to optimize a single response or multiple responses. It is known that when there are numerous responses, it is difficult and complex to optimize them simultaneously. Data Envelopment Analysis (DEA) is a statistical approach in which multiple inputs and multiple outputs, regardless of their number, can be optimized simultaneously. For this reason, in this study the Data Envelopment Analysis (DEA) technique was applied in combination with the Response Surface Methodology (RSM), which enabled us to optimize more than one response concurrently.
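To make the DEA idea concrete, here is a hedged sketch of the classical input-oriented CCR envelopment model solved as a linear program with SciPy. This is a textbook formulation, not necessarily the exact DEA variant used in the cited study, and the data and dimensions are hypothetical.

```python
# Illustrative input-oriented CCR DEA model solved as a linear program.
# Each decision-making unit (DMU) gets an efficiency score theta in (0, 1].
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, o):
    """Efficiency of DMU `o` (envelopment form):
       min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0."""
    X = np.asarray(inputs, dtype=float)    # shape (n_inputs, n_dmus)
    Y = np.asarray(outputs, dtype=float)   # shape (n_outputs, n_dmus)
    m, n = X.shape
    s = Y.shape[0]
    # decision vector v = [theta, lam_1, ..., lam_n]
    c = np.r_[1.0, np.zeros(n)]
    # input constraints:  X @ lam - theta * x_o <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # output constraints: -Y @ lam <= -y_o
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Toy example: 4 DMUs, 2 inputs, 2 outputs (columns are DMUs)
X = [[2.0, 3.0, 4.0, 5.0],
     [3.0, 2.0, 5.0, 4.0]]
Y = [[10.0, 12.0, 11.0, 14.0],
     [ 4.0,  5.0,  6.0,  5.0]]
print(np.round([ccr_efficiency(X, Y, o) for o in range(4)], 3))
```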
... An appropriate solution must meet some tradeoff requirement between the several conflicting quality characteristics. Many researchers have studied the multiresponse optimization problems (Khuri and Conlon 1981;Ortiz et al. 2004;Shah, Montgomery, and Carlyle 2004;Vining 1998;Wang et al. 2016;Yadav et al. 2014). Khuri and Conlon (1981) proposed an algorithm to optimize multiple responses which depended on the same controllable variables. ...
... Many researchers have studied the multiresponse optimization problems (Khuri and Conlon 1981;Ortiz et al. 2004;Shah, Montgomery, and Carlyle 2004;Vining 1998;Wang et al. 2016;Yadav et al. 2014). Khuri and Conlon (1981) proposed an algorithm to optimize multiple responses which depended on the same controllable variables. Vining (1998) presented a mean squared error method that considered the process economics and the correlation among different responses. ...
Article
In practice, engineers seek to find reasonable solutions for complex and unstructured problems, which are common in many areas. The workable solutions for these problems are never a one-shot experiment and data analysis procedure. Rather, the proper solution for these problems requires an inductive-deductive process which involves a series of experiments. To teach engineers the sequential learning strategy in solving complex problems, this article presents a case study on the startup of an ethanol–water distillation column that illustrates the scientific process of response surface methodology. The goal of this experiment is generally to find a good, robust solution that produces high grade concentration of ethanol with maximum profit. This case illustrates the sequential application of response surface methodology and consists of an initial fractional factorial design, a steepest ascent design, a full factorial design, and a central composite face-centered cube design. The analysis of the data in the previous steps gives engineers a guidance about the design of experiment in the next step. This study uses the desirability function approach to obtain a compromise optimization between the concentration of ethanol and the profit, which gives a robust solution to the complex problem. Finally, we conduct appropriate confirmation experiments to verify the optimization results. The case study emphasizes the importance of sequential nature and provides a useful guidance for engineers to solve complex problems.
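The desirability-function compromise mentioned in this case study can be illustrated with a small sketch in the Derringer-Suich spirit: individual "larger-is-better" desirabilities are combined through a geometric mean. The limits, weights, and candidate points below are hypothetical, not the case study's values.

```python
# Illustrative Derringer-Suich-style desirability for two responses to be maximized:
# ethanol concentration and profit. Limits and candidate values are hypothetical.
import numpy as np

def d_larger_is_better(y, low, high, weight=1.0):
    """Desirability for a maximized response: 0 below `low`, 1 above `high`."""
    y = np.asarray(y, dtype=float)
    return np.clip((y - low) / (high - low), 0.0, 1.0) ** weight

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    ds = np.asarray(ds, dtype=float)
    return ds.prod(axis=0) ** (1.0 / len(ds))

# Hypothetical predicted responses at three candidate operating conditions
concentration = np.array([88.0, 91.0, 93.5])     # % ethanol
profit        = np.array([120.0, 150.0, 135.0])  # arbitrary units

D = overall_desirability([
    d_larger_is_better(concentration, low=85.0, high=95.0),
    d_larger_is_better(profit, low=100.0, high=160.0),
])
print("overall desirability:", np.round(D, 3), "best candidate:", int(np.argmax(D)))
```

Because the geometric mean is zero whenever any single desirability is zero, the compromise setting can never completely sacrifice one response, which is the property that makes this approach popular for multi-response problems.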
... The optimal composition of HPMC and PVP was determined according to the distance function method defined in Equation (2) (Khuri & Conlon, 1981): ...
Article
Full-text available
Model-informed drug development is an important area recognized by regulatory authorities and is gaining increasing interest from the generic drug industry. Physiologically based biopharmaceutics modeling (PBBM) is a valuable tool to support drug development and bioequivalence assessments. This study aimed to utilize an artificial neural network (ANN) with a multilayer perceptron (MLP) model to develop a sustained-release matrix tablet of metformin HCl 500 mg, and to test the likelihood of the prototype formulation being bioequivalent to Glucophage® XR, using PBBM modeling and virtual bioequivalence (vBE). The ANN with MLP model was used to simultaneously optimize 735 formulations to determine the optimal formulation for Glucophage® XR release. The optimized formulation was evaluated and compared to Glucophage® XR using PBBM modeling and vBE. The optimized formulation consisted of 228 mg of hydroxypropyl methylcellulose (HPMC) and 151 mg of PVP, and exhibited an observed release rate of 42% at 1 h, 47% at 2 h, 55% at 4 h, and 58% at 8 h. The PBBM modeling was effective in assessing the bioequivalence of two formulations of metformin, and the vBE evaluation demonstrated the utility and relevance of translational modeling for bioequivalence assessments. The study demonstrated the effectiveness of using PBBM modeling and model-informed drug development methodologies, such as ANN and MLP, to optimize drug formulations and evaluate bioequivalence. These tools can be utilized by the generic drug industry to support drug development and biopharmaceutics assessments.
... Many studies applying Design of Experiments (DOE) to optimize processes, especially processes involving multiple responses, have neglected the quality of the model obtained (Ch'ng et al., 2005;Derringer & Suich, 1980;Khuri & Conlon, 1981;Pang et al., 2020;Pinto & Pereira, 2021). This one examines the use of genetic programming to obtain models with higher predictability than those obtained by Ordinary Least Squares (OLS), and the use of models with multiple responses in the optimization process. ...
Article
Full-text available
Purpose: This work aims to analyze and compare the performance between the Ordinary Least Squares (OLS) method executed in Minitab (v. 17) and the genetic programming performed in Eureqa Formulize (v. 1.24.0). Theoretical reference: Obtaining a model that mathematically describes the relationship between the independent variable and the response variable is essential to optimizing the process. The model can be described as an approximate representation of the real system or process, while the modeling process is a balance between simplicity and accuracy (X. Chen et al., 2018; Gomes et al., 2019; Sampaio et al., 2022; A. R. S. Silva et al., 2021). Method: An Evaluation of the best method for constructing mathematical models was performed using the Adjusted Coefficient of Determination (Radj2) and Akaike's Information Criterion. Results and conclusion: The comparison between the use of the methods showed the superiority of genetic programming over OLS in the construction of mathematical models. Originality/Value: Genetic Programming produces mathematical models that are sometimes differentiated when several replicates are performed, but always with similar explanatory power and with biased characteristic that does not affect in any way the quality of prediction of the dependent variable being studied.
... In the past decades, a large number of different metamodels (or surrogate models) have been proposed. The common metamodels include radial basis function (RBF) (Park et al. 1991;Dyn et al. 1986), polynomial regression (PR) (Peixoto et al. 1990;Khuri 1981), Kriging (KRG) (Kleijnen et al. 2009;Oliver 1990), Gaussian process (Kocijan 2004;Bonilla et al. 2007), support vector regression (Smola 2004;Wu et al. 2003), neural networks (Bishop 1994;Hagan and Demuth 1999) and multivariate adaptive regression splines (Friedman 1991). However, it is worth noting that the most appropriate metamodel for approximations of different objective functions or responses in engineering problems is different. ...
Article
Full-text available
The optimization of the engineering system is becoming more and more significant as time progresses, and a large number of metamodel-based multi-objective optimization methods have been proposed during the past decades. For a metamodel-based multi-objective optimization method, the accuracy of the solution is greatly influenced by the prediction accuracy of the metamodels. This study developed a novel metamodel-based multi-objective optimization method using an adaptive multi-regional ensemble of metamodels (AMEM) to get an optimization method with higher accuracy and efficiency. Two strategies were employed in this new metamodel-based multi-objective optimization method. One strategy is that several stand-alone metamodels, i.e., polynomial regression (PR), radial basis function (RBF), and Kriging (KRG) were combined into an ensemble of metamodels (EM) using the weight sum approach. The other strategy is that new design points were dynamically added into various local regions where the Pareto optimal designs are located, and each basis model and its weight factors were regenerated simultaneously, which adaptively improved the accuracy of the constructed adaptive multi-regional ensemble of metamodels. Besides, two novel approaches for selecting new design points from Pareto optimal set were proposed in this study. The performance of this novel metamodel-based multi-objective optimization method was evaluated using twelve mathematical functions and two practical engineering optimization problems. The twelve mathematical functions were typically selected from previous studies, and the two practical engineering optimization problems are crashworthiness optimization of triply periodic minimal surface (TPMS)-filled tube (one design variable) and buffer performance optimization of airbags used in the re-entry capsule (two design variables). To compare the advantages of this method, the optimization of these problems was also implemented by common metamodel-based methods. The results showed that the adaptive multi-regional ensemble of metamodels-based multi-objective optimization method is more accurate than the common methods in both mathematical functions and engineering optimization problems. In addition, the adaptive multi-regional ensemble of metamodels-based multi-objective optimization method has better efficiency than the common method, especially in practical engineering optimization problems.
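The weighted-sum "ensemble of metamodels" idea can be sketched briefly. In this hypothetical example, a quadratic polynomial and a KernelRidge regressor with an RBF kernel stand in for the PR/RBF/KRG basis metamodels of the cited work, and the weights are simply taken inversely proportional to cross-validated RMSE; the adaptive multi-regional refinement is not reproduced.

```python
# Illustrative weighted-sum ensemble of two simple metamodels, with weights
# inversely proportional to cross-validation error. Data are synthetic.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(60, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, 60)

models = {
    "poly2": make_pipeline(PolynomialFeatures(2), LinearRegression()),
    "rbf":   KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.5),
}

# Cross-validated RMSE of each basis metamodel
errors = {}
for name, model in models.items():
    mse = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()
    errors[name] = np.sqrt(mse)
    model.fit(X, y)

# Weights inversely proportional to CV error (better model gets larger weight)
inv = {k: 1.0 / e for k, e in errors.items()}
weights = {k: v / sum(inv.values()) for k, v in inv.items()}

def ensemble_predict(X_new):
    return sum(w * models[k].predict(X_new) for k, w in weights.items())

print("weights:", {k: round(w, 3) for k, w in weights.items()})
print("prediction at origin:", ensemble_predict(np.array([[0.0, 0.0]])))
```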
... The hyperparameter response function F_Di is the misclassification rate for the corresponding metric i. Thus, the genetic algorithm selects the optimal hyperparameter that maximizes the six validation metrics [37][38][39]. In the context of a CNN model, a fitness function, based on six metrics, determines how well a particular set of model parameters performs on a skin lesion classification task. ...
Article
Full-text available
Examining and predicting skin cancer from skin lesion images is challenging due to the complexity of the images. Early detection and treatment of skin lesion disease can prevent mortality as it can be curable. Computer-aided diagnosis (CAD) provides a second opinion for dermatologists as they can classify the type of skin lesion with high accuracy due to their ability to show various clinical identification features locally and globally. Convolutional neural networks (CNNs) have significantly improved the performance of CAD systems for medical image segmentation and classifications. However, tuning CNNs are challenging since the search space of all possible hyperparameter configurations is substantially vast. In this paper, we adopt a genetic algorithm to automatically configure a CNN model for an accurate, reliable, and robust automated skin lesion classification for early skin lesion diagnosis. The optimized CNN model uses four public datasets to train and be able to detect abnormalities based on skin lesion features in different orientations. The model achieves the best scores for each of the DICE coefficients, precision measure, and F-score. These scores compare better than other existing methods. Considering the success of this optimized model, it could be a valuable method to implement in clinical settings.
... Due to the complexity of struvite formation conditions, it is often difficult to reconcile multiple response optimality conditions. At present, many studies have been conducted on multi-objective optimization methods, including the Distance Function Method (Khuri and Conlon, 1981), the Multivariate Loss Method (Ames et al., 1997), and the Dual-response Surface Method (Tang and Xu, 2002), but they are not concise enough for optimization with few objectives. The Desired Function Approach (DFA) is a common multi-objective optimization method based on converting multiple responses into a single response, avoiding the problem of non-uniformity in optimality conditions (Kumar et al., 2022;Maamoun et al., 2020). ...
Article
Full-text available
The struvite method has been widely used to recover N and P from wastewater. However, the drawbacks of alkali consumption and small crystallization are not negligible. Therefore, alkaline porous carrier materials are greatly desired to enhance struvite crystalline precipitation and maintain suitable pH values. In this study, using red mud (RM) as a carrier, MgO was loaded onto RM by co-precipitation method to prepare an N and P recovery material (MgO-RM). A response surface methodology based on the Box-Behnken design was used to explore the effects of the factors on N and P recovery. The multi-objective optimization of the recovery process was carried out using the desirability function approach to achieve an economically feasible recovery. Characterizations, including SEM-EDS, BET, FTIR, XRD, and XPS, were carried out to explore the recovery mechanism. The results demonstrated that the nano-sized MgO was well deposited on the RM surface, resulting in a larger specific surface area and greater reactivity of the MgO-RM. Using RM as a carrier significantly increased the struvite crystal size, and the MgO-RM could maintain the pH value of the solution in a suitable range for struvite growth. Under optimal conditions (dosage=3.5 g/L, N/P=1.8, and pH value=3.4), the maximum N and P recovery capacity by MgO-RM was 57.23 mg/g and 128.05 mg/g. The recovery process may involve coupled reactions between physical adsorption, ion exchange, coordination exchange, and chemical precipitation. Struvite produced by chemical precipitation is the main recovery mechanism. Keywords: Struvite; Red mud; Multi-objective optimization; Response surface methodology; Nitrogen and phosphorus recovery
... However, there is no commonly agreed definition of the cost matrix. Some researchers, such as Khuri and Conlon (1981), suggest using the GD for multi-response optimization. The GD takes the generalized martingale distance between the response and the target as the objective function. ...
Article
Full-text available
In production design processes, multiple correlated responses with different distributions are often encountered. The existing literature usually assumes that they follow normal distributions for computational convenience, and then analyzes these responses using traditional parametric methods. A few research papers assume that they follow the same type of distribution, such as the t-distribution, and then use a multivariate joint distribution to deal with the correlation. However, these methods give a poor approximation to the actual problem and may lead to the recommended settings that yield substandard products. In this article, we propose a new method for the robust parameter design that can solve the above problems. Specifically, a semiparametric model is used to estimate the margins, and then a joint distribution function is constructed using a multivariate copula function. Finally, the probability that the responses meet the specifications simultaneously is used to obtain the optimal settings. The advantages of the proposed method lie in the consideration of multiple correlation patterns among responses, the absence of restrictions on the response distributions, and the use of nonparametric smoothing to reduce the risk of model misspecification. The results of the case study and the simulation study validate the effectiveness of the proposed method.
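The probability-of-conformance idea in this abstract can be sketched with a simplified Gaussian copula: marginals are handled through empirical quantiles and the dependence through a correlation matrix estimated on normal scores, with the joint probability obtained by Monte Carlo. This is a simplification of the cited semiparametric approach, and all data and specification limits are hypothetical.

```python
# Illustrative Gaussian-copula sketch: probability that two correlated responses
# simultaneously meet their specifications. Numbers are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Toy observed responses (correlated, non-normal marginals)
n = 500
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
y1 = np.exp(0.3 * z[:, 0] + 2.0)           # right-skewed response
y2 = 50 + 5 * z[:, 1] + 0.5 * z[:, 0]      # roughly normal response
Y = np.column_stack([y1, y2])

# Fit the Gaussian copula: rank-transform to normal scores, estimate correlation
U = (stats.rankdata(Y, axis=0) - 0.5) / n
Zs = stats.norm.ppf(U)
R = np.corrcoef(Zs, rowvar=False)

# Monte Carlo from the copula, map back through the empirical marginal quantiles
m = 100_000
sim_z = rng.multivariate_normal(np.zeros(2), R, size=m)
sim_u = stats.norm.cdf(sim_z)
sim_y = np.column_stack([np.quantile(Y[:, j], sim_u[:, j]) for j in range(2)])

# Probability of simultaneously meeting (hypothetical) specification limits
specs = [(5.0, 12.0), (45.0, 60.0)]
ok = np.all([(sim_y[:, j] >= lo) & (sim_y[:, j] <= hi)
             for j, (lo, hi) in enumerate(specs)], axis=0)
print("estimated conformance probability:", ok.mean())
```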
... Since directly optimizing multiple responses is generally intractable, many creative approaches have been proposed to form MRO into a single objective optimization model. These approaches include the goal programming approach (Kazemzadeh et al., 2008), generalized distance approach (Khuri & Conlon, 1981), compromise programming approach (Costa et al., 2012a), probabilistic approach (Peterson et al., 2009), loss function approach (Taguchi, 1986;Pignatiello, 1993), and desirability function approach (Derringer, 1994). ...
Article
Full-text available
This paper proposes a robust method for multi-response optimization (MRO) considering the location effect, dispersion effect, and model uncertainty simultaneously. We propose a multi-objective optimization model for MRO that simultaneously maximizes the satisfaction degrees of the local and dispersion effects. Specifically, a robust desirability function is used to model the overall satisfaction degree of multiple responses with respect to the two effects. This desirability function evaluates a solution’s performance considering the confidence intervals predicted by the regression models of location and dispersion effects, and thus, can address the problem of model uncertainty. The proposed multi-objective model yields a set of non-dominated solutions that approximate the Pareto front instead of one single solution, which provides more flexibility for decision makers to select the final best compromise solution based on their preferences. To solve the model, a hybrid multi-objective optimization algorithm called NSGAII-DMS that combines non-dominated sorting genetic algorithm II (NSGA-II) with direct multi-search (DMS) is proposed. NSGAII-DMS uses the search mechanism of NSGA-II during the early evolutionary phase to quickly find a set of non-dominated solutions and then uses the search mechanism of DMS to further tune the found non-dominated solutions. Two test examples have shown that the proposed multi-objective MRO model can produce a set of robust solutions by considering the location effect, dispersion effect, and model uncertainty. Further analyses illustrate that NSGAII-DMS shows significantly better search performance than several well-known benchmark multi-objective optimization algorithms, which include NSGA-II, SPEA2, MOEA/D, and DMS.
... Therefore, the latest developments on split-plot design in a robust design framework are attributable, inter alia, to Robinson, Pintar, Anderson-Cook, and Hamada (2012) and Tan and Wu (2013), where two Bayesian approaches are introduced in order to apply the split-plot in a robust design context and also evaluate non-normally distributed response variables. Nevertheless, a further challenge is the consideration of the multiple response case starting from the seminal papers of Derringer and Suich (1980) and Khuri and Conlon (1981). In this direction, many authors have contributed towards improving this relevant topic in order to solve the problem of finding an unique solution which could be an optimal compromise following process issues, noises, weighting and computational aspects (Vining and Myers 1990;Lin and Tu 1995;Tang and Xu 2002;Del Castillo 2007). ...
Article
Full-text available
This paper deals with a proposal for joint modeling and process optimization for split-plot designs analyzed through mixed response surface models. It addresses the following main issues: i) the building of a joint mixed responsesurface model for a multiple response situation, by defining only one response through which specific coefficients areincluded for studying the association among the responses; ii) the considering of fixed as well as random effects within a joint modeling and optimization context; iii) the achievement of an optimal solution by involving specific as well as common coefficients for the responses. We illustrate our contribution through a case-study related to a split-plot design on electronic components of printed circuit boards (PCBs); we obtain satisfactory results by confirming the validity of thiscontribution, where the qualitative factor PCB is also studied and optimized.
... Combining these four project health factors into one single output value provides the necessary output side of the optimization. The approach of converting multiple responses to a single response is based on the idea described by Khuri and Conlon [14]. However, a complex vector-distance based model did not seem necessary for the simple goal of combining result variables. ...
Article
Full-text available
This article examines the use of multivariate optimization, as a method, to improve the success of project management tasks. The optimization approach is founded on 8 selected continuous project processes of the execution phase of the PMBOK project management framework. Using a custom-developed, online portal, 103 data sets were collected from project management practitioners, indicating their individual distribution of effort and focus on the selected project processes, as well as, the current health status of their project. Based on this dataset, stepwise regression combined with optimization applying a sequential quadratic programming method was used to define the distribution of project process relevance thereby maximizing the project health.
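The regression-plus-SQP workflow described here can be sketched in a few lines. The sketch below uses a plain linear regression instead of the stepwise selection in the cited study and SciPy's SLSQP solver for the sequential quadratic programming step; the number of processes, the data, and the coefficients are hypothetical.

```python
# Illustrative sketch: fit a regression of "project health" on the shares of
# effort across project processes, then find the effort distribution that
# maximizes predicted health using SLSQP. Data and coefficients are hypothetical.
import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)

n_processes = 8
n_projects = 103
effort = rng.dirichlet(np.ones(n_processes), size=n_projects)   # shares sum to 1
true_w = np.array([0.3, 0.1, 0.2, 0.05, 0.15, 0.05, 0.1, 0.05])
health = effort @ true_w + rng.normal(0, 0.02, n_projects)       # toy health score

model = LinearRegression().fit(effort, health)

def neg_health(x):
    return -model.predict(x.reshape(1, -1))[0]

res = minimize(
    neg_health,
    x0=np.full(n_processes, 1.0 / n_processes),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * n_processes,
    constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}],
)
print("suggested effort distribution:", np.round(res.x, 3))
```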
... Various techniques have been reported in the literature to simultaneously optimize multiple output responses, including the distance function approach [45], loss function approach [46], and desirability function approach [47]. The desirability function approach was initially proposed by Harrington [48] in the form of exponential functions and later modified by Derringer and Suich [47] and Del Castillo et al. [49]. ...
Article
Full-text available
This paper presents a systematic and efficient design approach for the two degree-of-freedom (2-DoF) capacitive microelectromechanical systems (MEMS) accelerometer by using combined design and analysis of computer experiments (DACE) and Gaussian process (GP) modelling. Multiple output responses of the MEMS accelerometer including natural frequency, proof mass displacement, pull-in voltage, capacitance change, and Brownian noise equivalent acceleration (BNEA) are optimized simultaneously with respect to the geometric design parameters, environmental conditions, and microfabrication process constraints. The sampling design space is created using DACE based Latin hypercube sampling (LHS) technique and corresponding output responses are obtained using multiphysics coupled field electro–thermal–structural interaction based finite element method (FEM) simulations. The metamodels for the individual output responses are obtained using statistical GP analysis. The developed metamodels not only allowed to analyze the effect of individual design parameters on an output response, but to also study the interaction of the design parameters. An objective function, considering the performance requirements of the MEMS accelerometer, is defined and simultaneous multi-objective optimization of the output responses, with respect to the design parameters, is carried out by using a combined gradient descent algorithm and desirability function approach. The accuracy of the optimization prediction is validated using FEM simulations. The behavioral model of the final optimized MEMS accelerometer design is integrated with the readout electronics in the simulation environment and voltage sensitivity is obtained. The results show that the combined DACE and GP based design methodology can be an efficient technique for the design space exploration and optimization of multiphysics MEMS devices at the design phase of their development cycle.
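A compressed sketch of the DACE-style workflow follows: Latin hypercube sampling of the design space and a Gaussian-process metamodel of one output response. The response function below merely stands in for the FEM simulations, and the parameter names and ranges are hypothetical; the desirability-based multi-objective step is omitted.

```python
# Illustrative DACE-style workflow: Latin hypercube sampling plus a Gaussian-process
# metamodel of a single output response. The "simulation" is a hypothetical stand-in.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Latin hypercube sample of two geometric design parameters, scaled to real ranges
sampler = qmc.LatinHypercube(d=2, seed=3)
unit = sampler.random(n=40)
lower, upper = np.array([100.0, 2.0]), np.array([400.0, 10.0])   # e.g. length, width (um)
X = qmc.scale(unit, lower, upper)

# Stand-in for the simulated output response (e.g. pull-in voltage)
y = 80.0 - 0.1 * X[:, 0] + 3.0 * X[:, 1] + 0.002 * X[:, 0] * X[:, 1]
y = y + np.random.default_rng(3).normal(0, 0.5, len(y))

gp = GaussianProcessRegressor(
    kernel=ConstantKernel(1.0) * RBF(length_scale=[100.0, 5.0]),
    normalize_y=True,
    alpha=1e-2,
)
gp.fit(X, y)

x_new = np.array([[250.0, 6.0]])
mean, std = gp.predict(x_new, return_std=True)
print(f"predicted response: {mean[0]:.2f} +/- {std[0]:.2f}")
```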
... Most of the time, such a relationship is unknown but can be approximated by a polynomial model. In most cases, a second-degree polynomial model, with the following form, is used [17]: ...
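The model form referenced (and truncated) above is presumably the standard full second-degree polynomial in k factors:

```latex
y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i
      + \sum_{i=1}^{k} \beta_{ii} x_i^{2}
      + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon
```

where the coefficients are estimated by least squares and \(\varepsilon\) denotes the random error.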
Article
This paper describes an optimization approach of a novel configuration of a once-through multi-stage flash (MSF-OT) desalination plant. The system integrates a thermal vapor compression (TVC) unit within the conventional MSF-OT configuration. Three objectives controlling the operating cost of the installation were considered. The first is to maximize plant capacity production. The second is to minimize thermal energy consumption. The third is to minimize the feed seawater flow rate, which reduces electrical energy and chemical additives consumption. Solving the multi-objective optimization problem, using solvers of MATLAB software, led to obtaining a large number of optimal operating parameters of the MSF-OT/TVC plant. The comparison between the current operating state of the desalting installation using the MSF-OT process, and optimal operating states of the studied installation, showed a significant improvement in parameters controlling the operating cost of the installation. This improvement corresponded to a reduction of feed flow rate and motive steam flow rate of 23% and 7.3%, respectively.
... A number of techniques are available for the generation of surrogate models. Polynomial regression (Khuri and Conlon, 1981), artificial neural networks (Skinner and Broughton, 1995; Cichocki and Unbehaven, 1993), radial basis function (RBF) (Dyn et al., 1986), and Kriging are among the most common. This paper employs artificial neural networks (ANN). ...
Experiment Findings
Full-text available
The problem seeks the identification of parameters with a reduced number of finite element simulations. The methodology is based on the construction of a surrogate model using artificial neural networks. Towards that end, we introduced a sampling scheme that identifies the next point in the search space during the optimization process. In other words, the sample space is built in an interleaved manner at the same time that the surrogate model is built. Optimization is achieved through the differential evolution algorithm to identify the parameters of the material that make the nanoindentation curve obtained with the surrogate model coincide and the original curve obtained with experimental data. Specifically, we focused on the nanoindentation of an aluminum matrix in a die-cast aluminum alloy.
... China are tested to affirm the reliability of the proposed model compared with six other widely used prediction models, which are the GVM(1,1), NGBM(1,1), FANGBM(1,1), PR(n) (Polynomial regression) (Khuri and Conlon, 1981), ARIMA (autoregressive integrated moving average) (Chatfield and Prothero, 1973) and ANN (artificial neural network) (Park et al., 1991). The other purpose of numerical analysis is to demonstrate the efficacy of the PSO algorithm in comparison with other advanced algorithms, namely, GA (genetic algorithm) (Holland, 1975) and WOA (whale optimization algorithm) (Mirjalili and Lewis, 2016). ...
Article
The hydroelectricity consumption of China is increasing, and hydropower exerts a crucial influence on sustained economic growth. A new method for estimating the hydroelectricity consumption of China is developed through the grey modeling technique. Taking into account the inherent error occurring in the leap from differential to difference in most existing grey serial models, this study constructs an unbiased NGBM(1,1) model based on the nonlinear grey Bernoulli model (NGBM(1,1)), which outperforms other grey benchmark models by adjusting the nonlinear parameter. The structural parameters of the model are deduced from the differential equation directly and therefore, the inherent error is eliminated. Moreover, the nonlinear parameter for the novel model is determined by Particle Swarm Optimization (PSO). Based on hydroelectricity consumption from 2010 to 2018, the novel model is built to predict its volume in the later one-quarter phase of the 13th Five Year Plan of China (2019–2020). The results show that hydroelectricity consumption maintains a continuous increase, exceeding 270 million tonnes oil equivalent (Mtoe) in 2020, while the growth rate decreases to 0.77%. In accordance with these forecasts, suggestions on associated hydropower are provided for policy-makers.
... Multi-objective optimization was further performed, with the aid of GAs. In order to optimize successfully the two output parameters, drug loading efficiency, and particle size, generalized distance function (Khuri & Conlon, 1981) was applied. Zaki, Varshosaz, and Fathi (2015) have also demonstrated the superiority of coupled GA-ANN over RSM in terms of preparation of agar nanospheres and modeling their properties. ...
Chapter
Modeling like a magic predicts the knowns and unknowns inside a “black box,” so there is a need for it in many processes. Nowadays, all of the branches in science employ models to define the characteristics of a system under study. The release process of encapsulated food ingredients could also be modeled using mathematical models to elucidate the underlying mechanisms which help develop new products and minimize the trial and error in processing. Generally, three main mechanisms play important roles in controlled release: diffusion, swelling, and erosion. The mechanistic modeling of controlled release in the literature is usually based on one of these mechanisms. In this chapter, diffusion, swelling, and erosion mechanisms are first described; then controlled release based on each of the mentioned mechanisms is explained; and finally, application of mechanistic modeling in the release process is discussed.
... This effect destabilizes the mathematical models producing errors in the regression coefficients. As a result, estimated models are unable to represent the objective or constraint functions properly [31]- [34]. ...
Article
Full-text available
DMAIC (define, measure, analyze, improve and control) is one of the most utilized methods for guiding practitioners in the decision-making process of quality improvement projects. Industrial processes commonly deal with multiple critical-to-quality (CTQ) characteristics. When these characteristics are correlated, multivariate statistical techniques should be applied. This paper aims to propose a domain- specific Six Sigma method, the MDMAIC (multivariate DMAIC). The new stepwise procedure helps practitioners not only to reduce problem dimension but also to take account of the correlation structure among CTQs during the decision-making process. Principal component analysis has been applied for assessing the measurement system, analyzing process stability and capability, as well as modeling and optimizing multivariate manufacturing processes. A hardened steel turning case has been presented for proposal validation. The result analysis has shown that the MDMAIC was very successful in leading the practitioner during the steps and phases of the quality improvement project. The multivariate capability index of the enhanced process emphasized the substantial economic improvement.
... A variety of choices is possible for the distance measure ρ; Khuri and Conlon [14] have suggested the following two distance functions: ...
Article
Full-text available
This study investigates the simultaneous optimization of multiple correlated responses that involve mixed ordinal and continuous responses. The proposed approach is applicable when the responses either all have an ordinal categorical form or are continuous but have different marginal distributions, or when a standard multivariate distribution of the responses is not applicable or does not exist. These multiple responses have rarely been the focus of studies despite their high occurrence during experiments. Copula functions have been used to construct a multivariate model for the mixed responses. To resolve the computational problems of estimation under a high dimension of responses, we have estimated the parameters of the model using a pairwise likelihood estimation method. We adapted the generalized distance approach to determine settings of the factors that simultaneously optimize the means of the continuous responses and the desired cumulative categories of the ordinal responses. A simulation study was used to evaluate the performance of the estimators from the pairwise likelihood approach. Finally, we present an application of the proposed method to a real data example from a semiconductor manufacturing process.
... Response surface design: The response surface design [120,214] suggests modeling the function to be optimized using an approximate known non-linear function, which speeds up the search for local minima. ...
Thesis
Full-text available
In 2015, 360 million people, including 32 million children, were suffering from hearing impairment all over the world. This makes hearing disability a major worldwide issue. In the US, the prevalence of hearing loss increased by 160% over the past generations. However, 72% of the 34 million impaired American persons (11% of the population) still have an untreated hearing loss. Among the various current solutions alleviating hearing disability, hearing aid is the only non-invasive and the most widespread medical apparatus. Combined with hearing aids, assisting listening devices are a powerful answer to address the degraded speech understanding observed in hearing-impaired subjects, especially in noisy and reverberant environments. Unfortunately, the conventional devices do not accurately render the spatial hearing property of the human auditory system, weakening their benefits. Spatial hearing is an attribute of the auditory system relying on binaural hearing. With 2 ears, human beings are able to localize sounds in space, to get information about the acoustic surroundings, to feel immersed in environments... Furthermore, it strongly contributes to speech intelligibility. It is hypothesized that recreating an artificial spatial perception through the hearing aids of impaired people might allow for recovering a part of these subjects' hearing performance. This thesis investigates and supports the aforementioned hypothesis with both technological and clinical approaches. It reveals how certain well-established signal processing methods can be integrated in some assisting listening devices. These techniques are related to sound localization and spatialization. Taking into consideration the technical constraints of current hearing aids, as well as the characteristics of the impaired auditory system, the thesis proposes a novel solution to restore a spatial perception for users of certain types of assisting listening devices. The achieved results demonstrate the feasibility and the possible implementation of such a functionality on conventional systems. Additionally, this thesis examines the relevance and the efficiency of the proposed spatialization feature towards the enhancement of speech perception. Via a clinical trial involving a large number of patients, the artificial spatial hearing shows to be well appreciated by disabled persons, while improving or preserving their current hearing abilities. This can be considered as a prominent contribution to the current scientific and technological knowledge in the domain of hearing impairment.
... Initially, the theory and application of the Taguchi method were limited to single-response optimization. Lately, however, the Taguchi method has been extended to multi-response problems, and several researchers have developed multi-response optimization methods. Khuri and Conlon (1981) proposed a procedure that can optimize multiple responses simultaneously using a distance function to measure deviation from the ideal optimum value. Garg (2010) developed a method for multi-response cases using the Taguchi method and utility functions. ...
Article
Full-text available
Multi-response optimization in the Taguchi method can be done by using the VIKOR and TOPSIS approaches, which are based on the concept that the best chosen alternative not only has the shortest distance from the positive ideal solution, but also the longest distance from the negative ideal solution. The basic concept of these two methods is to rank the existing samples by looking at the utility (S), regret (R), and solution-distance values as the best alternatives for each sample. This study aims to identify significant process variables for the brightness and soreness response variables in the envelope-making process by using the VIKOR and TOPSIS approaches, and to compare the results of VIKOR and TOPSIS optimization. The results showed that the two methods produced process optimizations with different variable settings: the VIKOR method produced the setting A3B2C1D1, while the TOPSIS method produced the setting A1B1C3D3. Comparing the values of the two methods, the VIKOR method produced a better estimated value for the brightness parameter, and TOPSIS produced a better estimated value for the silence parameter.
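For readers unfamiliar with TOPSIS, the closeness-coefficient calculation it relies on can be sketched as below. The decision matrix, weights, and benefit labels are hypothetical, and the sketch omits the VIKOR-specific S, R, and Q measures.

```python
# Illustrative TOPSIS ranking: alternatives are ranked by relative closeness to
# the positive ideal solution. Data, weights, and criterion types are hypothetical.
import numpy as np

def topsis(decision, weights, benefit):
    """decision: (alternatives x criteria); benefit[j] is True if larger is better."""
    D = np.asarray(decision, dtype=float)
    w = np.asarray(weights, dtype=float)
    # vector normalization and weighting
    V = w * D / np.linalg.norm(D, axis=0)
    # positive and negative ideal solutions
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)          # closeness coefficient, higher is better

# Toy data: 4 parameter settings, 2 responses (both "larger is better")
decision = [[82.0, 1.4], [85.0, 1.1], [80.0, 1.6], [84.0, 1.3]]
closeness = topsis(decision, weights=[0.6, 0.4], benefit=np.array([True, True]))
print("ranking (best first):", np.argsort(-closeness))
```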
... In this technique, the author sought to assure small variation in the mean and standard deviation of quality loss across the multiple responses, along with a small overall average loss. Khuri and Conlon (1981) used polynomial regression models for optimizing various responses related to the selection of the best alternatives. In this model, the authors used tools and techniques for finding the best solution among the given alternatives. ...
Article
Full-text available
In the present scenario, a proficient Multi-Criteria Decision Making (MCDM) approach is needed for the optimization of green electro-chemical machining, a commonly used non-traditional machining process. Electrochemical machining (ECM) has established itself as one of the major alternatives to conventional methods for machining hard materials and versatile contours without residual stresses and tool wear. ECM has wide-ranging applications in the automotive, petroleum, aerospace, textile, medical, and electronics industries. The optimization of green electro-chemical machining is an MCDM problem constrained by several performance criteria and alternatives. These criteria and alternatives are commonly of two types, qualitative and quantitative. Such problems are analyzed using many MCDM tools and techniques, so to find the best alternative solution of an MCDM problem, the quantitative criteria values should be transformed into an equivalent single performance index called the Multi-attribute Performance Index (MAPI). The present study highlights an application of VIseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR), meaning Multicriteria Optimization and Compromise Solution, a method adapted from MCDM techniques for obtaining a precise result. A detailed methodology of the VIKOR method is described in this paper through a case study.
... Therefore, a predicted strength variable follows a normal probability distribution with the abovementioned mean vector and covariance matrix (Khuri & Conlon, 1981; Cornell & Khuri, 1987). ...
Article
Full-text available
The performance of healthcare systems has always been a significant concern of policy makers. Performance in healthcare systems is assessed by multiple criteria, and it is a complex task to optimize the performance, considering the fact that outputs are mostly interrelated and some external factors affect performance of the system. This study aims to apply the Multi-Response Optimization (MRO) method for reliability-based performance optimization of healthcare systems. In this study, we consider all types of response variables so that strength features of the system are optimized against external stresses. The proposed approach in this research is based on the stress-strength model, and stresses are assumed to be normally distributed. Also, we present the implementation results of the proposed approach to a hospital in Iran for reliability-based optimization of the bed capacity planning. The results indicate that the proposed approach is powerful in tackling the inherent complexity of healthcare systems.
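For a single normally distributed stress-strength pair, the reliability that strength S exceeds stress s has the classical closed form below (a standard result for independent normal variables, not necessarily the exact multi-response formulation used in the cited work):

```latex
R = P(S > s) = \Phi\!\left( \frac{\mu_S - \mu_s}{\sqrt{\sigma_S^{2} + \sigma_s^{2}}} \right)
```

where \(\Phi\) is the standard normal cumulative distribution function and \((\mu_S,\sigma_S^2)\), \((\mu_s,\sigma_s^2)\) are the strength and stress means and variances.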
... d voltage, electrolyte flow rate and electrolyte concentration, and lower value of tool feed rate. Table 3 also exhibits a comparison of the derived optimal solutions with those achieved while employing the other popular multi-objective optimization approaches, i.e. desirability function method (Derringer and Suich, 1980), distance function method (Khuri and Conlon, 1981; Jing and Yongfan, 2017) and mean squared error method (Vining, 1998). In all the cases, the achieved MRR and SR values are observed to be worse compared to those derived using the multivariate quality loss function approach. ...
Article
Full-text available
Due to various added advantages over the conventional material removal processes, non-traditional machining (NTM) processes have now been widely applied in different manufacturing industries. To achieve the desired response values, it is always recommended to operate these NTM processes at their optimal parametric settings. Various single response optimization techniques are already available to determine the optimal combinations of NTM process parameters for achieving maximum or minimum value of a single response. In this paper, a multivariate quality loss function approach is adopted for simultaneous optimization of responses for three NTM processes. It is observed that this approach outperforms the other multi-response optimization techniques, like desirability function, distance function and mean squared error methods with respect to the achieved response values. With modification of the corresponding objective function and constraints of the developed non-linear programming problem, it can be effectively applied to any non-traditional as well as conventional machining process as a multi-objective optimization tool.
... KRG and SVR both refer to a kernel function, rendering the calculation cost unacceptably high [31,34]. PRS is easy to train but generally unable to describe complex relationships [35]. RF accommodates big data with a relatively short learning time, and this method is therefore selected for use in this paper [36,37]. ...
Article
Load prediction of tunnel boring machines (TBMs) is crucial for the design and safe operation of these complex engineering systems. However, to date, studies have mostly used only geological data, but the operation of TBMs also has an important effect on the load, especially its dynamic behavior. With the development of measurement techniques, large amounts of operation data are obtained during tunnel excavation. Mining these heterogeneous in-situ data, including geological data and operation data, is expected to improve the prediction accuracy and to realize dynamic predictions of the load. In this paper, a dynamic load prediction approach is proposed based on heterogeneous in-situ data and a data-driven technique. In this approach, the integration of heterogeneous in-situ data is conducted as follows: i) the geological data are extended to match the scale of the operation data using an interpolation method; ii) the categorical data and numerical data are fused through a proposed encoding method; and iii) the geological data are combined with the operation data according to the location of each operation datum. A data-driven technique, Random forest, is used to construct the prediction model based on the integrated heterogeneous in-situ data. The approach is applied to a collection of heterogeneous in-situ TBM data from a tunnel in China, and the results indicate that the approach can not only accurately predict the dynamic behaviour of the load but can also precisely estimate the statistical characteristics of the load. This work also highlights the applicability and potential of data-driven techniques in the design and analysis of other complex engineering systems similar to TBMs.
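A hedged sketch of the kind of pipeline described, one-hot encoding categorical geological descriptors, concatenating them with numerical operation data, and fitting a random forest, is given below. The feature names, values, and synthetic target are hypothetical, not the paper's dataset.

```python
# Illustrative random-forest pipeline on heterogeneous (categorical + numerical)
# in-situ data. Feature names, values, and the synthetic target are hypothetical.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder
from sklearn.ensemble import RandomForestRegressor
from sklearn.pipeline import Pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 1000
data = pd.DataFrame({
    "rock_class": rng.choice(["II", "III", "IV"], size=n),   # categorical geology
    "ucs_mpa":    rng.uniform(30, 150, size=n),               # numerical geology
    "thrust_kn":  rng.uniform(5000, 20000, size=n),           # operation data
    "cutter_rpm": rng.uniform(3, 8, size=n),
})
target = (0.01 * data["thrust_kn"] + 50 * data["cutter_rpm"]
          + data["rock_class"].map({"II": 100, "III": 300, "IV": 600})
          + rng.normal(0, 30, n))

model = Pipeline([
    ("prep", ColumnTransformer(
        [("cat", OneHotEncoder(), ["rock_class"])], remainder="passthrough")),
    ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
])

X_train, X_test, y_train, y_test = train_test_split(data, target, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```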
... Among these quality characteristics, breaking elongation loss and abrasive surface damage of knitted fabric are significantly reduced by 7.08 and 9.93% respectively. The solutions of the same problem as derived using the multivariate loss function approach are also compared with those obtained while employing desirability function [18], mean squared error [19] and distance function [20] methods, as exhibited in Table 4. This comparison of the derived results also confirms the superiority of multivariate loss function approach over the other state-of-the-art multiobjective optimization methods in identifying the optimal combination of slub yarn parameters. ...
Article
Recent advancements in the textile industry have given rise to several spinning techniques, such as ring spinning and rotor spinning, which can be used to produce a wide variety of textile apparel to fulfil the end requirements of customers. To get the best out of these processes, they should be operated at their optimal parametric settings. However, in the presence of multiple yarn characteristics that are often conflicting in nature, it becomes a challenging task for spinning industry personnel to identify the parametric mix that simultaneously optimizes all the responses. Hence, in this paper, the applicability of a new systematic approach in the form of a multivariate quality loss function technique is explored for optimizing multiple quality characteristics of yarns while identifying the ideal settings of two spinning processes. It is observed that this approach performs well against other multi-objective optimization techniques, such as the desirability function, distance function and mean squared error methods. With slight modifications to the upper and lower specification limits of the considered quality characteristics and to the constraints of the non-linear optimization problem, it can be successfully applied to other processes in the textile industry to determine their optimal parametric settings.
... 18) The simulation parameters correlated with experiments were estimated based on the standardized Euclidean distance between the simulated and experimental values, and penalty functions. 19,20) In this study, the W ...
Article
The influence of granule size on simulation parameters and residual shear stress in tablets was determined by combining the finite element method (FEM) with the design of experiments (DoE). Lactose granules were prepared by wet granulation with a high-shear mixer and sorted into small and large granules using sieves. To simulate the tableting process with the FEM, the parameters representing each granule were optimized using a DoE and a response surface method (RSM). The compaction behavior of each granule simulated by FEM was in reasonable agreement with the experimental findings. Small granules yielded a higher coefficient of friction between powder and die/punch (μ) and a lower internal friction angle (αy). RSM revealed that die wall force was affected by αy, whereas the pressure transmissibility rate of the punches was affected not only by αy but also by μ. The FEM revealed that the residual shear stress was greater for small granules than for large granules. These results suggest that the inner structure of a tablet made of small granules was less homogeneous than that of one made of large granules. To evaluate the contribution of the simulation parameters to residual stress, the parameters were assigned to a fractional factorial design and an ANOVA was applied. The result indicated that μ was the critical factor influencing residual shear stress. This study demonstrates the importance of combining simulation and statistical analysis to gain a deeper understanding of the tableting process.
Article
This article presents a case study re-analysis of a complex response-factor data set involving a split-plot design with blocking for two quality responses. The analysis presented herein makes use of multivariate predictive distributions to both optimize and quantify the risk of meeting specifications. This article shows how a modern approach using predictive distributions can provide deeper insight and improved process optimization over the use of classical response surface methodology tools such as “overlapping means” plots and (mean-based) desirability functions. It is shown how the R and the Stan programming languages are used to facilitate the analysis.
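The "risk of meeting specifications" idea can be approximated with a simple Monte Carlo draw from a multivariate predictive distribution at a candidate setting; the predictive moments and specification limits below are assumed for illustration (the case study itself used R and Stan).

```python
# Estimate P(all responses within specs) from an assumed predictive distribution.
import numpy as np

rng = np.random.default_rng(0)
pred_mean = np.array([50.0, 3.2])          # predictive means of two responses (assumed)
pred_cov = np.array([[4.0, 0.6],
                     [0.6, 0.09]])         # predictive covariance (assumed)
lower = np.array([47.0, 3.0])              # specification limits (assumed)
upper = np.array([55.0, 3.5])

draws = rng.multivariate_normal(pred_mean, pred_cov, size=100_000)
p_conform = np.mean(np.all((draws >= lower) & (draws <= upper), axis=1))
print(f"Estimated probability of meeting all specifications: {p_conform:.3f}")
```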
Book
Full-text available
Six Sigma has come a long way since its introduction in the mid-1980s. Our association with the subject began in the 1990s when a number of multinational corporations in Singapore began to deploy Six Sigma in pursuit of business excellence. Prior to this, some of us had been working on statistical quality improvement techniques for more than two decades. It was apparent at the outset that the strength of Six Sigma is not in introducing new statistical techniques as it relies on well-established and proven tools; Six Sigma derives its power from the way corporate mindsets are changed towards the application of statistical tools, fromtop business leaders to those on the production floor. We are privileged to be part of this force for change through our involvement in Six Sigma programs with many companies in the Asia-Pacific region.
Article
This paper presents a multi-objective optimization algorithm that combines the Normal Boundary Intersection method with response surface models of equimax-rotated factor scores in order to simultaneously optimize multiple sets of means and variances of manufacturing process characteristics. The algorithm uses equimax factor rotation to separate means and variances into individual, uncorrelated functions and then combines them into a mean squared error function. These functions are optimized using the Normal Boundary Intersection method, generating a Pareto frontier. The optimal solutions found are filtered according to 95% non-overlapping confidence ellipses for the predicted values of the responses and subsequently assessed by a fuzzy decision-maker index established between the volume of each confidence ellipsoid and the Mahalanobis distance between each Pareto point and its individual optimum for a given weight. To illustrate the practical implementation of this approach, two cases involving the multi-objective optimization of hardened steel turning were considered: (a) AISI 52100 hardened steel turning with CC6050 mixed ceramic inserts and (b) AISI H13 hardened steel turning with CC 670 mixed ceramic tools. For both cases, the best setup of cutting speed (V), feed rate (f) and depth of cut (d) was determined to find the minimal process cost (Kp) and the maximal tool life (T), both responses with minimal variance. The suitable results achieved in these case studies indicate that the proposal may be useful for similar manufacturing processes.
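As a rough illustration of combining mean and variance models into mean-squared-error objectives, the sketch below traces trade-off solutions with a simple weighted-sum scalarization rather than Normal Boundary Intersection; all model coefficients and targets are invented.

```python
# Simplified stand-in for the procedure above: MSE_i = (mean_i - target_i)^2 + var_i,
# with a weighted-sum sweep used in place of NBI to approximate a Pareto set.
import numpy as np
from scipy.optimize import minimize

def mse_cost(x):      # MSE of process cost (assumed mean/variance models)
    mean = 20 + 3*x[0] - 2*x[1]
    var = 1 + 0.5*x[0]**2
    return (mean - 15)**2 + var

def mse_life(x):      # MSE of tool life (assumed mean/variance models)
    mean = 40 - 4*x[0] + 5*x[1]
    var = 2 + 0.3*x[1]**2
    return (mean - 50)**2 + var

pareto = []
for w in np.linspace(0, 1, 11):
    f = lambda x: w * mse_cost(x) + (1 - w) * mse_life(x)
    res = minimize(f, x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
    pareto.append((w, res.x, mse_cost(res.x), mse_life(res.x)))
```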
Chapter
Simultaneous optimisation of multiple quality characteristics or responses is the so-called 'multiple response optimisation (MRO)' problem. As the responses are often correlated, trade-off solutions are inevitable in MRO problems, so multiple efficient or non-dominated solutions are sought for a given problem. In addition, the solution quality of MRO problems depends on the accuracy of the empirical models or response surface (RS) models: the higher the RS model accuracy, the better the solution quality. However, a point-estimate prediction from an empirical RS model always carries inherent uncertainties, two prominent sources being model-parameter and response uncertainties. Obtaining an implementable best process-setting condition is therefore always a challenging task. Researchers have suggested various solution approaches to derive improved and efficient solutions in the context of MRO. To further contribute to this field of study, this chapter proposes a multi-objective particle swarm optimisation-based solution approach for mean-response optimisation that accounts for the above-mentioned uncertainties. To assist the decision-maker (DM), a ranking strategy based on two multi-criteria decision-making (MCDM) techniques is suggested. Step-wise implementation of the proposed approach is illustrated using varied mean-response MRO cases. Implementation results are also contrasted with the existing best-reported solutions (derived using various approaches). Comparative results indicate that the proposed approach can derive the best efficient solutions with higher non-dominated frequency and mean responses closer to target values.
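One MCDM ranking technique of the kind alluded to, TOPSIS, is compact enough to sketch; the decision matrix, weights and criterion directions below are illustrative only.

```python
# Minimal TOPSIS sketch for ranking candidate settings against an ideal point.
import numpy as np

X = np.array([[0.80, 12.0, 3.1],            # rows: candidate settings, cols: criteria
              [0.75, 10.5, 2.8],
              [0.90, 14.0, 3.6]])
weights = np.array([0.5, 0.3, 0.2])          # assumed criterion weights
benefit = np.array([True, False, False])     # maximize 1st criterion, minimize others

R = X / np.linalg.norm(X, axis=0)            # vector normalization
V = R * weights
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_plus  = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)     # relative closeness to the ideal
ranking = np.argsort(-closeness)             # best candidate first
```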
Thesis
Full-text available
The high consumption of polymeric materials and the consequent increase in solid waste generation demand the creation of alternative recycling routes. Many studies in the materials field have focused on reusing waste as a dispersed phase in composite materials, mainly cementitious ones, combining environmental, economic and technological concerns. Among polymeric materials, high-density polyethylene (HDPE) is one of the most widely produced, yet it is not recycled to a significant extent and research on it remains incipient. This doctoral thesis therefore aims to optimize the incorporation of polymeric waste into cementitious composites through mixture designs involving a cementitious matrix and HDPE and quartz aggregates, analyzing their influence on the microstructure and physico-mechanical properties. The DoE methodology was used to plan a mixture experiment involving HDPE, quartz and cement particles, and the desirability method was used for its optimization. The water/cement ratio of 0.5 was kept constant, the aggregate/cement ratio ranged from 3.75 to 5.25, and a minimum content of 30% HDPE particles was set for the experiment. Low adhesion between the HDPE particles and the cementitious matrix was observed. Compressive strength (up to 80%), flexural strength (up to 75%), ultrasonic pulse velocity (up to 55%), dynamic modulus (up to 90%), and bulk and apparent density (up to 50% for both) decreased, while porosity (up to 60%) and water absorption (up to 200%) increased, as the proportion of incorporated HDPE particles grew. All response variables were optimized, with the proportion of 0.21 cementitious matrix, 0.24 HDPE and 0.55 quartz as the individual optimum, except for the bulk and apparent density variables, for which the optimum was 0.16 cementitious matrix and 0.84 HDPE. Six different scenarios were defined for multi-objective optimization, varying the importance assigned to the response variables, and satisfactory individual and composite desirability values were obtained in all cases, which shows the feasibility of incorporating HDPE particles for different objectives. The results reveal a new HDPE recycling route for non-structural micro-concrete applications.
Article
The mixing solution of wollastonite-based brushite cement is a phosphoric acid solution containing metallic cations and borax. This work complements a previous study devoted to the influence of the H3PO4 concentration, Ca/P and liquid-to-solid (l/s) ratios on the setting and hardening process of the binder by providing new insight into the role of aluminum and boron. Boron retards the setting and decreases the heat released during the process. It also helps reduce the macroporosity of the hardened material but yields poor compressive strength. With aluminum in the mixing solution, the mechanical properties are greatly improved thanks to the precipitation of an amorphous aluminophosphate, which increases the density of the cement matrix; however, aluminum alone leads to fast setting. A joint addition of boron and aluminum to the mixing solution makes it possible to obtain a material with optimized properties in both the fresh and hardened states.
Article
To solve multiple response optimization problems that often involve incommensurate and conflicting responses, a robust interactive desirability function approach is proposed in this article. The proposed approach consists of a parameter initialization phase and calculation and decision-making phases. It considers a decision maker's preference information regarding tradeoffs among responses and the uncertainties associated with predicted response surface models. The proposed method is the first to consider model uncertainty using an interactive desirability function approach. It allows a decision maker to adjust any of the preference parameters, including the shape, bound, and target of a modified robust function with consideration of model uncertainty in a single and integrated framework. This property of the proposed method is illustrated using a tire tread compound problem, and the robustness of the adjustments for the approach is also considered. The new method is shown to be highly effective in generating a compromise solution that is faithful to the decision maker's preference structure and robust to uncertainties associated with model predictions.
Article
Full-text available
This work develops a proposal for comparing different multi-response optimization methodologies applied to response surfaces (RSM) in experimental designs, as tools for solving problems found mainly in the industrial sector. The following methodologies are studied: desirability function (DES), MOORA (MOO), TOPSIS (TOP), MULTIMOORA (MMO), MOORA AD (MAD), TOPSIS AD (TAD) and multilayer neural networks (using the Neuralnet (NEU) and Nnet (NET) packages). Each of these techniques is applied to three cases of commercial or industrial interest with different experimental designs (Taguchi, Box-Behnken and Central Composite Design) in a Monte Carlo simulation study in which the compared techniques, the type of experimental design and different correlation scenarios are treated as factors. The techniques are compared by means of a metric that evaluates the distance of each estimated response from its ideal or desired value, in order to analyze the advantages and disadvantages of each method. The results obtained are consistent across all of the cases considered, and it is concluded that the Neuralnet (NEU) neural networks are the best method, followed by the desirability function (DES) and the Nnet (NET) neural networks. In addition, the proposed MOORA AD (MAD) method was found to perform excellently in one particular case study. For future comparative studies, it is recommended to use more types of experimental designs and to apply more of the available multi-response optimization techniques, in order to obtain more information about the scenarios and conditions that show the best performance and to make more specific implementation suggestions. All development was carried out in R (R Core Team, 2019) in order to promote the use of free software for research or commercial development.
Thesis
Full-text available
This work focuses on the application of the response surface methodology to optimize microwave components. Components are treated as black boxes receiving input parameters and providing outputs. The relationship between the inputs (factors) and outputs (responses) is described by a polynomial obtained by means of a statistical study conducted in accordance with the response surface methodology. We perform the mathematical and statistical analyses required to validate the polynomial models and then carry out a multicriteria optimization, which consists in transforming all the responses into individual desirability functions. An overall desirability function, defined from these individual desirability functions associated with the various objectives, must then be maximized. Applying a minimization algorithm (BFGS) to the models provided by the designs of experiments, over a restricted domain of input factors, yielded a large number of local minima from which a high-quality solution was identified. When there are correlations between factors, it is impossible to use a classical design to solve the optimization problem: the geometry of the experimental field loses its regularity, and optimal designs must be used. In the latter part of this thesis we study the criteria on which the construction of an optimal design is based, illustrated by optimization examples for two kinds of resonators.
Article
Purpose The purpose of this paper is to address three key objectives. The first is the proposal of an enhanced multiobjective optimisation (MOO) solution approach for the mean and mean-variance optimisation of multiple “quality characteristics” (or “responses”), considering predictive uncertainties. The second objective is comparing the solution qualities of the proposed approach with those of existing approaches. The third objective is the proposal of a modified non-dominated sorting genetic algorithm-II (NSGA-II), which improves the solution quality for multiple response optimisation (MRO) problems. Design/methodology/approach The proposed solution approach integrates empirical response surface (RS) models, a simultaneous prediction interval-based MOO iterative search, and the multi-criteria decision-making (MCDM) technique to select the best implementable efficient solutions. Findings Implementation of the proposed approach in varied MRO problems demonstrates a significant improvement in the solution quality in worst-case scenarios. Moreover, the results indicate that the solution quality of the modified NSGA-II largely outperforms those of two existing MOO solution strategies. Research limitations/implications The enhanced MOO solution approach is limited to parametric RS prediction models and continuous search spaces. Practical implications The best-ranked solutions according to the proposed approach are derived considering the model predictive uncertainties and MCDM technique. These solutions (or process setting conditions) are expected to be more reliable for satisfying customer specification compared to point estimate-based MOO solutions in real-life implementation. Originality/value No evidence exists of earlier research that has demonstrated the suitability and superiority of an MOO solution approach for both mean and mean-variance MRO problems, considering RS uncertainties. Furthermore, this work illustrates the step-by-step implementation results of the proposed approach for the six selected MRO problems.
Article
Full-text available
Principal component analysis reduces the dimensionality of data containing a large set of variables to a smaller set without losing the information in the original data. Response variables with multiple correlations are frequently encountered in experimental studies. Principal component analysis can be used to obtain a single overall response variable that contains most of the information in the set of response variables. Once the overall response variable is obtained, the response surface method can be used to optimize this response variable, which is affected by several independent variables. In this study, principal component analysis was applied to bread production process data containing correlated response variables such as phytic acid, phytate phosphorus and total phosphorus. The response surface method was applied to optimize the new response variable constructed via principal component analysis, and the optimum conditions were obtained.
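A minimal sketch of the workflow described above, using placeholder response data rather than the bread-making measurements:

```python
# Collapse correlated responses into one overall response via the first
# principal component; that score would then be modelled with a response surface.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

Y = np.random.rand(20, 3)                    # placeholder for e.g. phytic acid, phytate P, total P
Z = StandardScaler().fit_transform(Y)        # standardize the correlated responses
pca = PCA(n_components=1).fit(Z)
overall_response = pca.transform(Z).ravel()  # PC1 scores: the single overall response
print("variance explained:", pca.explained_variance_ratio_[0])
# `overall_response` would next be fitted against the design factors with a
# second-order response surface model and optimized.
```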
Article
Multiple response optimization (MRO) consists in the search for the best settings in a problem with conflicting responses. MRO follows these steps: experimental design; experimental data gathering; building of mathematical models; statistical validation of the models; agglutination of the model responses into a single function to be optimized; optimization of the agglutinated function; and experimental validation of the best conditions. This work selected two MRO cases from the literature in order to compare two methods of building mathematical models and two agglutinating functions, and to assess the best of the four possible combinations. The methods used to build the mathematical models were ordinary least squares, performed in Minitab (v. 17), and genetic programming, performed in Eureqa Formulize (v. 1.24.0). The best model-building method was assessed using the Akaike Information Criterion. Response agglutination was performed using the desirability and modified desirability functions. In all MRO cases, the optimization step was performed with the generalized reduced gradient method in Microsoft Excel. The average percentage distance between predicted and experimental results was used both to assess the best agglutination function and to verify how the model-building method affects the ability to estimate a best condition close to the one obtained in the experimental validation step. The results suggest that the better strategy for multiple response optimization is the joint use of genetic programming for building the mathematical models and the modified desirability function for response agglutination.
Article
This study used Response Surface Methodology to investigate the effects of moisture content, temperature and applied pressure on the density, durability and impact resistance of millet bran briquettes. Furthermore, this study analyzed the optimum conditions for preparing the millet bran briquette using Generalized Distance Function. It was found that the density, durability and impact resistance of the millet bran briquettes increased as the temperature increased and as the moisture content decreased. As pressure increased, density, durability and impact resistance of the briquettes initially increased and then decreased. High-quality briquettes made of millet bran could be produced within the range of moisture content from 5% to 10%, temperature from 80 °C to 110 °C and pressure from 110 MPa to 130 MPa. The optimum moisture content, temperature and pressure were 5.4%, 101.9 °C, and 122.7 MPa, respectively. The density, durability and impact resistance under optimum conditions were 1.21 g cm⁻³, 95.7% and 99.64%, respectively. Millet bran possessed good fuel quality and could be successfully used as professional feedstock for producing solid biofuel.
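A generalized distance criterion of the kind mentioned here is commonly written as the scaled distance of the predicted responses from their individual optima and minimized over the experimental region; the sketch below uses invented response models, with the reported optima reused only as illustrative targets.

```python
# Minimize a scaled distance of predicted responses from their individual optima.
import numpy as np
from scipy.optimize import minimize

def responses(x):
    m, t, p = x                                      # moisture, temperature, pressure (coded)
    density    = 1.1 + 0.05*t - 0.04*m - 0.02*p**2   # assumed fitted models
    durability = 90 + 3*t - 2*m + 1.5*p - 1.0*p**2
    impact     = 95 + 2*t - 1.5*m + 1.0*p
    return np.array([density, durability, impact])

y_opt = np.array([1.21, 95.7, 99.64])                # individual optima used as targets

def distance(x):
    return np.sqrt(np.sum(((responses(x) - y_opt) / y_opt) ** 2))

res = minimize(distance, x0=[0.0, 0.0, 0.0], bounds=[(-1, 1)] * 3)
print(res.x, responses(res.x))
```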
Article
Full-text available
This work aims to evaluate the performance of the agglutination functions Desirability, Modified Desirability and Commitment Programming, applied to the optimization of problems with multiple responses. The performance of the agglutinating functions was measured through the average percentage deviation. To apply the proposed method, three problems containing multiple responses were selected from the literature. The results obtained for the selected case studies indicate better performance of the Modified Desirability agglutination function for the simultaneous optimization of multiple responses, especially when these responses are modelled by equations containing quadratic terms, regardless of the number of terms, the type of responses and the number of variables they may contain. The methodology and results presented in this study are intended to contribute to the research, development and evaluation of advanced techniques for the optimization of problems with multiple responses, which are relevant in the search for specifications that increase the efficiency of industrial processes.
Article
Full-text available
Although multi-response optimization (MRO) applications have come into widespread use in recent years, design-of-experiments applications for food products are still mostly carried out as single-response optimization problems. In this study, the Taguchi parameter design method was used to improve the quality of the Tavuk Adana Kebap (chicken Adana kebab meatball) product manufactured by a food company. To determine the optimal levels of the cooking temperature, cooking time and fan speed parameters together with the best values of the microbiological load, core temperature and weight quality characteristics, the three quality characteristics were optimized simultaneously using a desirability function and a quality loss function. Based on the solution obtained, recommendations were made regarding the optimal levels of the production parameters for the Tavuk Adana Kebap product.
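The quadratic quality loss underlying such analyses can be sketched directly; the loss constants, targets and measurements below are assumed for illustration.

```python
# Quadratic (Taguchi-style) quality loss for nominal-the-best characteristics,
# with the losses of several characteristics summed after choosing cost constants.
def quality_loss(y, target, k=1.0):
    """L(y) = k * (y - target)^2 for a nominal-the-best characteristic."""
    return k * (y - target) ** 2

# Example with assumed values: core temperature (°C) and weight (g)
losses = [quality_loss(74.0, target=75.0, k=2.0),
          quality_loss(98.5, target=100.0, k=0.5)]
total_loss = sum(losses)   # settings minimizing this total are preferred
```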
Presentation
Content: • Introduction into Desirability Functions, • Optimization of Desirability Functions, • Existing Methods, • Our Method, • Numerical Results and Comparisons, • Outlook to Future Studies, • Generalizations, • Conclusion
Article
Full-text available
During the model-building process, it is difficult to construct a multiple regression model (MRM) when the response variable (Y) is a vector of random variables (Y1, Y2, Y3, ..., Yn) in an experiment. A single-response MRM cannot handle multi-response data (MRD) fitted separately (one model for each response) because of the linear dependency (LD) among the responses. The multi-response regression model (MRRM) proposed by Breiman and Friedman (1997) performs better at detecting the effects and patterns of the explanatory factors (X1, X2, X3, ..., Xk) introduced into the MRRM system on all the response variables together. This study applied the MRRM to an agricultural experiment covering 450 m² in western Sulaimani, Kurdistan Region, Iraq.
Article
Simultaneous procedures for variable selection in multiple linear regression have recently been given by Aitkin. One of these procedures, proposed for the case when the regression equation is to be used for descriptive purposes, is an application of each of a number of simultaneous procedures concerned with the multivariate general linear model and given by Gabriel with applications in the MANOVA context. The applications of Gabriel's procedures to multivariate linear regression are presented here and illustrated as generalizations of Aitkin's technique.
Article
The purpose of this paper is to present the theory and develop an algorithm associated with the exploration of a dual response surface system. The approach is to find conditions on a set of independent or “design” variables which maximize (or minimize) a “primary response” function subject to the condition that a “constraint response” function takes on some specified or desirable value. A method is outlined whereby a user can generate simple two dimensional plots to determine the conditions of constrained maximum primary response regardless of the number of independent variables in the system. He thus is able to reduce to simple plotting the complex task of exploring the dual response system. The procedure that is used to generate the plots depends on the nature of the individual univariate response functions. In certain situations it becomes necessary to apply the additional constraint that the located operating conditions are a certain “distance” from the origin of the independent variables (or the center of the experimental design). The methods derived and discussed in the paper are applicable only to quadratic response functions.
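A numerical analogue of this dual response problem, maximizing a primary response while holding a constraint response at a specified value, can be set up as follows; the quadratic functions and target are illustrative, and a general-purpose solver replaces the paper's plotting procedure.

```python
# Maximize a primary quadratic response subject to a constraint response
# held at a specified value (equality constraint), over a bounded region.
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def primary(x):            # response to maximize (assumed coefficients)
    return 60 + 4*x[0] + 3*x[1] - 2*x[0]**2 - x[1]**2 + 0.5*x[0]*x[1]

def constraint_resp(x):    # response to hold at a target value (assumed coefficients)
    return 30 + 2*x[0] - 1.5*x[1] + x[0]**2

target = 32.0
con = NonlinearConstraint(constraint_resp, target, target)
res = minimize(lambda x: -primary(x), x0=[0.0, 0.0],
               bounds=[(-2, 2), (-2, 2)], constraints=[con])
print(res.x, primary(res.x), constraint_resp(res.x))
```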
Article
In a 1959 paper, A. E. Hoerl discussed a method for examining a second order response surface. This paper provides a mathematically simpler derivation of the technique and proofs of some stated properties.
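Ridge analysis of a fitted second-order surface ŷ = b0 + x'b + x'Bx reduces to solving (B − μI)x = −b/2, with μ taken beyond the largest eigenvalue of B to trace the path of constrained maxima on spheres of decreasing radius; a small numerical sketch with invented coefficients:

```python
# Trace a few points of the ridge-analysis path for an assumed quadratic surface.
import numpy as np

b0 = 50.0
b = np.array([2.0, -1.0])            # first-order coefficients (assumed)
B = np.array([[-1.5, 0.4],
              [0.4, -0.8]])          # symmetric matrix of second-order terms (assumed)

lam_max = np.linalg.eigvalsh(B).max()
for mu in lam_max + np.array([0.2, 0.5, 1.0, 2.0]):
    x = np.linalg.solve(B - mu * np.eye(2), -b / 2.0)   # stationary point on its sphere
    y = b0 + x @ b + x @ B @ x
    print(f"mu={mu:5.2f}  radius={np.linalg.norm(x):5.2f}  y_hat={y:6.2f}")
```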
Article
The problem of variable selection in multivariate regression is considered. Procedures for isolating subsets of the independent variables which adequately describe the regression relationship between the set of dependent variables and the set of all independent variables are discussed. Each procedure leads to a set of subsets of variables which contains all adequate subsets with probability not less than a specified lower bound. A graphical aid to the assessment of the relative importance of subsets of independent variables is noted.
Article
For rectangular confidence regions for the mean values of multivariate normal distributions the following conjecture of O. J. Dunn [3], [4] is proved: Such a confidence region constructed for the case of independent coordinates is, at the same time, a conservative confidence region for any case of dependent coordinates. This result is based on an inequality for the probabilities of rectangles in normal distributions, which permits one to factor out the probability for any single coordinate.
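The result proved here is usually quoted in the following form (notation added for clarity):

```latex
% For a multivariate normal vector (X_1,\dots,X_k) with zero means and any
% constants c_1,\dots,c_k > 0,
\[
  P\bigl(|X_1|\le c_1,\ \dots,\ |X_k|\le c_k\bigr)
  \;\ge\; \prod_{i=1}^{k} P\bigl(|X_i|\le c_i\bigr),
\]
% so rectangular confidence regions built as if the coordinates were
% independent remain conservative under dependence.
```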
Article
A problem facing the product development community is the selection of a set of conditions which will result in a product with a desirable combination of properties. This essentially is a problem involving the simultaneous optimization of several response variables (the desirable combination of properties) which depend upon a number of independent variables or sets of conditions. Harrington, among others, has addressed this problem and has presented a desirability function approach. This paper will modify his approach and illustrate how several response variables can be transformed into a desirability function, which can be optimized by univariate techniques. Its usage will be illustrated in the development of a rubber compound for tire treads.
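A minimal sketch of the desirability idea: each predicted response is mapped onto a 0-1 desirability and the geometric mean of the desirabilities is maximized. The one-sided "larger-is-better" transform shown is one standard form; the response models, limits and exponents are invented.

```python
# Transform responses to desirabilities and maximize their geometric mean.
import numpy as np
from scipy.optimize import minimize

def d_larger_is_better(y, low, high, s=1.0):
    """0 below `low`, 1 above `high`, smooth ramp in between."""
    return np.clip((y - low) / (high - low), 0.0, 1.0) ** s

def responses(x):                      # hypothetical fitted response models
    y1 = 120 + 10*x[0] - 6*x[1] - 4*x[0]**2
    y2 = 60 + 5*x[1] - 3*x[0]*x[1]
    return y1, y2

def overall_desirability(x):
    y1, y2 = responses(x)
    d1 = d_larger_is_better(y1, low=100, high=135)
    d2 = d_larger_is_better(y2, low=55, high=70)
    return (d1 * d2) ** 0.5            # geometric mean of the two desirabilities

res = minimize(lambda x: -overall_desirability(x), x0=[0.0, 0.0],
               bounds=[(-1, 1), (-1, 1)])
print(res.x, overall_desirability(res.x))
```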
Article
Nelder and Mead [2] have developed a simple, robust direct-search procedure for finding the minimum of a function. It consists of evaluating a function of n variables at the (n + 1) vertices of a general simplex. The simplex is then moved away from the largest function value by replacing the vertex having this value with one located by reflection through the centroid of the other vertices. Extension or contraction is then applied depending on the contours of the response surface. This continues until either the specified number of trials has been used, the function values differ among themselves by less than a specified amount, or the coordinates of the function are changing by less than a specified amount.
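The simplex search described here is available off the shelf, for example in SciPy; a quick illustrative call on an arbitrary two-variable function:

```python
# Nelder-Mead simplex minimization of a simple test function.
from scipy.optimize import minimize

def f(x):
    return (x[0] - 1.0) ** 2 + 3.0 * (x[1] + 2.0) ** 2

res = minimize(f, x0=[0.0, 0.0], method="Nelder-Mead",
               options={"xatol": 1e-6, "fatol": 1e-6, "maxiter": 500})
print(res.x)   # should approach (1, -2)
```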
Article
A different approach to the general problem of optimization of nonlinear, restrained systems is presented. This approach, called the Created Response Surface Technique, is based, essentially, on steepest ascents up a succession of created response surfaces within the solution space. The most important characteristic of the approach is that it automatically avoids restraint boundary violations during the optimization. This article discusses application to fully developed steady-state mathematical models. By the use of appropriate experimental designs, the method could be used for the characterization and optimization of mathematically incomplete, nonlinear, restrained systems. In addition, it seems possible that the Created Response Surface concept could be applied to attain dynamic process control in certain nonlinear, restrained control situations. However, the mathematical implications of the new approach must be studied further.
Article
This paper serves as a sequel to the earlier review paper on mixtures [4] by discussing the literature on mixture experiments dating from 1973 to the present. The sections consist of techniques for analyzing mixture data, additional design configurations for covering the whole simplex, and designs for exploring restricted regions of the simplex. The material from each of twenty-two papers is given a brief exposure, and suggestions are made of topics for future research.
Article
A review of the literature concerning experiments with mixtures is presented. Beginning with the work of Scheffe in 1958, the review covers approximately twenty papers in near chronological order, the last appearing in October 1971. Although priority in the treatment of designs for mixtures is claimed by Quenouille [27], and while others may argue that the work of Claringbold [6] was the impetus for Scheffé's research, the choice of starting with Scheffé is made because most of the authors of the reviewed papers recognized Scheffé as the person most instrumental in pioneering the work which has appeared over the past fourteen years.
Article
Multiple regression analysis was used to obtain prediction equations to measure individual and combination effects of CaCl2 and cysteine on gel texture parameters (hardness, cohesiveness, and springiness) and on compressible water of dialyzed whey protein concentrate (WPC) systems. Predicted maximum gel hardness occurred at 11.1 mM CaCl2 or 9.7 mM cysteine. Cysteine levels at 30 mM or above drastically reduced gel hardness. Cohesiveness tended to decrease with addition of either reagent. Reagent addition generally decreased gel springiness while compressible water increased. Response surface contour predictions suggested that increases in CaCl2 or cysteine in reagent combination systems decreased hardness and increased compressible water. Predicted cohesiveness and springiness maxima were at 13.9 mM cysteine and 18.4 mM CaCl2 and at 10.3 mM cysteine and 10.0 mM CaCl2, respectively. Significant x1-x2 interaction terms in the mathematical model for combined reagent effects were observed on hardness, cohesiveness, and compressible water in the WPC gel systems.
Article
A procedure is described which employs an experimental adaptation of the gradient projection method to find conditions on a set of process design variables which optimize a primary process response, subject to maintaining a set of secondary process responses within specified ranges. This approach consists of performing experiments to establish an improving direction, either the gradient or gradient-projection direction, and then performing experiments along the chosen direction to determine a stopping point. This procedure is repeated until optimum process conditions are found. Several factors affect the progress of this experimental search, including the type of experimental design, the spacing of design points in the experimental region, the number of replicates employed, and the magnitude of random error inherent in the process under study. Properly utilized, the proposed technique offers a viable strategy for process design and development experimentation when multiple process responses are encountered.
Article
Response surface methodology, an experimental strategy initially developed and described by Box and Wilson, has been employed with considerable success in a wide variety of situations, especially in the fields of chemistry and chemical engineering. It is the purpose of this paper to review the literature of response surface methodology, emphasizing especially the practical applications of the method. A comprehensive bibliography is included.
Article
Experience has shown that unless special care is exercised in analyzing multiresponse data serious mistakes can be made. In this paper some problems associated with fitting multiresponse models are identified and discussed. In particular, three kinds of dependencies are considered: dependence among the errors, linear dependencies among the expected values of the responses, and linear dependencies in the data. Since ignoring such dependencies can lead to difficulties, a method is described for detecting and handling them. The concepts involved are illustrated with a chemical example.
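The dependency check described above amounts to an eigenanalysis of the cross-product matrix of the mean-centered responses, with eigenvalues near zero flagging linear dependencies; a synthetic sketch in which one response is constructed as an exact linear combination of the others:

```python
# Detect linear dependencies among responses via eigenanalysis of D'D,
# where D holds the mean-centered multiresponse data (synthetic example).
import numpy as np

rng = np.random.default_rng(1)
y1 = rng.normal(size=30)
y2 = rng.normal(size=30)
y3 = 2.0 * y1 - 0.5 * y2            # deliberately dependent response
Y = np.column_stack([y1, y2, y3])

D = Y - Y.mean(axis=0)              # mean-center each response
eigvals, eigvecs = np.linalg.eigh(D.T @ D)
print(np.round(eigvals, 6))         # one eigenvalue ~0 reveals the dependency
print(np.round(eigvecs[:, 0], 3))   # its eigenvector ~ (2, -0.5, -1) up to sign and scale
```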
Article
Data augmentation is shown to provide a rather general formulation for the study of biased prediction methods using multiple linear regression. Variable selection is a limiting case and Ridge regression is a special case of data augmentation. A way is proposed for obtaining predictors, given a credible criterion of good prediction.
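The data-augmentation view of ridge regression is easy to verify numerically: appending sqrt(lambda)*I pseudo-rows to X with zero pseudo-responses and running ordinary least squares reproduces the ridge estimate (synthetic data; intercept handling is glossed over).

```python
# Ridge regression recovered as ordinary least squares on augmented data.
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 40, 3, 2.0
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.3, size=n)

# Ridge via the normal equations: (X'X + lam*I)^{-1} X'y
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Ridge via data augmentation + OLS
X_aug = np.vstack([X, np.sqrt(lam) * np.eye(p)])
y_aug = np.concatenate([y, np.zeros(p)])
beta_aug, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)

print(np.allclose(beta_ridge, beta_aug))   # True
```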
Article
Matrices of random variables that are symmetric but not necessarily positive definite can arise in various ways. Assuming normality and a certain ‘invariant’ covariance structure, or alternatively assuming a Wishart structure, we derive exact likelihood ratio tests of certain hypotheses on the latent vectors. In the former case, these lead to exact confidence regions for the unknown vectors; approximations to these regions are given. The approximate validity of the regions in non-normal and non-invariant cases (for example, in the case of rounding-off errors) is investigated. An application to the analysis of a quadratic response surface is described.
An Introduction to Multivariate Statistical Analysis / A Response Surface Method for Experimental Optimization of Multi-Response Processes
  • T. W. Anderson
ANDERSON, T. W. (1958), An Introduction to Multivariate Statistical Analysis, New York: John Wiley. BILES, W. E. (1975), "A Response Surface Method for Experimental Optimization of Multi-Response Processes," Ind. Eng. Chem., Process Des. Dev., 14, 152-158.
On the Experimental Attainment of Optimum Conditions / The Created Response Surface Technique for Optimizing Nonlinear, Restrained Systems
  • G. E. P. Box
BOX, G. E. P., and WILSON, K. B. (1951), "On the Experimental Attainment of Optimum Conditions," J. Roy. Statist. Soc., Ser. B, 13, 1-45. CARROLL, C. W. (1961), "The Created Response Surface Technique for Optimizing Nonlinear, Restrained Systems," Operations Research, 9, 169-185.
Some Remarks on Multivariate Statistical Methods for Analysis of Experimental Data
  • R. Gnanadesikan
GNANADESIKAN, R. (1963), "Some Remarks on Multivariate Statistical Methods for Analysis of Experimental Data," Ind. Qual. Cont., 19, 22-32.
The Relationship Between Variable Selection and Data Augmentation and a Method for Prediction
  • D. M. Allen
ALLEN, D. M. (1974), "The Relationship Between Variable Selection and Data Augmentation and a Method for Prediction," Technometrics, 16, 125-127.
Methods for Statistical Data Analysis of Multivariate Observations / The Desirability Function / A Review of Response Surface Methodology: A Literature Review
  • R. Gnanadesikan
  • E. C. Harrington
  • W. J. Hill
GNANADESIKAN, R. (1977), Methods for Statistical Data Analysis of Multivariate Observations, New York: John Wiley. HARRINGTON, E. C. (1965), "The Desirability Function," Ind. Qual. Cont., 21, 494-498. HILL, W. J., and HUNTER, W. G. (1966), "A Review of Response Surface Methodology: A Literature Review," Technometrics, 8, 571-590.