Article

Optimization of Multiple Responses considering Location and Dispersion Effects


Abstract

An integrated modeling approach for simultaneously optimizing both the location and dispersion effects of multiple responses is proposed. The proposed approach aims to identify the settings of input variables that maximize the overall minimal satisfaction level with respect to both the location and dispersion of all the responses. It overcomes a common limitation of existing multiresponse approaches, which typically ignore the dispersion effect of the responses. Several possible variations of the proposed model are also discussed, and properties of the proposed approach are revealed via examples.
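As a hedged sketch of the maximin idea described in the abstract (not the authors' exact model), the following Python snippet searches a single input variable for the setting that maximizes the minimum desirability across the means and standard deviations of two responses. All response models, targets, and limits below are invented for illustration.

```python
# Hedged sketch of maximin optimization over location and dispersion effects.
# All models, targets, and limits below are hypothetical.

def desirability(y, low, target, high):
    """Two-sided linear desirability: 1 at the target, 0 outside [low, high]."""
    if y <= low or y >= high:
        return 0.0
    if y <= target:
        return (y - low) / (target - low)
    return (high - y) / (high - target)

def stb_desirability(s, target, upper):
    """Smaller-the-better desirability for a dispersion measure."""
    if s <= target:
        return 1.0
    if s >= upper:
        return 0.0
    return (upper - s) / (upper - target)

# Hypothetical fitted models for the means and standard deviations
# of two responses as functions of a single input variable x.
def mean1(x): return 2.0 + 3.0 * x - x ** 2
def mean2(x): return 5.0 - 2.0 * x
def std1(x):  return 0.5 + 0.3 * abs(x - 1.0)
def std2(x):  return 0.4 + 0.2 * x ** 2

def overall(x):
    """Overall minimal satisfaction over all location and dispersion effects."""
    return min(desirability(mean1(x), 0.0, 4.0, 8.0),
               desirability(mean2(x), 0.0, 3.0, 6.0),
               stb_desirability(std1(x), 0.3, 1.0),
               stb_desirability(std2(x), 0.3, 1.0))

# Identify the input setting that maximizes the overall minimal satisfaction
# via a coarse grid search on x in [0, 2].
grid = [i / 100.0 for i in range(0, 201)]
best_x = max(grid, key=overall)
```

The grid search is a deliberately simple stand-in for whatever solver one prefers; the point is the max-min objective, which forces every location and dispersion effect to reach at least the reported satisfaction level.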


... These methods model the location and dispersion effects as two separate "responses" to be optimized. Based on the idea of dual-response optimization, Kim & Lin (2006) proposed an MRO method that simultaneously optimizes the location and dispersion effects of multiple responses. The desirability function is adopted to obtain the degrees of satisfaction in terms of the means and standard deviations of the responses. ...
... A real MRO problem from a production process of colloidal gas aphrons (CGAs) is used as the first test example. This problem was first reported by Jauregi et al. (1997) and then used by Kim & Lin (2006), He et al. (2017), and Lee et al. (2018) to evaluate the performance of MRO methods. The CGAs are microbubbles composed of a gaseous inner core covered by a thin surfactant film produced by intensively stirring a surfactant solution (Jauregi et al., 1997). ...
... The specifications, including the targets (T_{j,μ} and T_{j,σ}), upper limits (y_{j,μ}^max and y_{j,σ}^max) and lower limits (y_{j,μ}^min and y_{j,σ}^min) for the means and standard deviations of the responses y_j (j = 1, 2, 3), are shown in Table 2. The values in Table 2 are set based on the work by Kim & Lin (2006). The only difference is that we relaxed the upper limits of the standard deviations of the responses, i.e., y_{1,σ}^max, y_{2,σ}^max, and y_{3,σ}^max are changed from 0.10, 0.10, and 2.00 to 0.20, 0.20, and 3.00. ...
Article
Full-text available
This paper proposes a robust method for multi-response optimization (MRO) considering the location effect, dispersion effect, and model uncertainty simultaneously. We propose a multi-objective optimization model for MRO that simultaneously maximizes the satisfaction degrees of the location and dispersion effects. Specifically, a robust desirability function is used to model the overall satisfaction degree of multiple responses with respect to the two effects. This desirability function evaluates a solution’s performance considering the confidence intervals predicted by the regression models of location and dispersion effects, and thus can address the problem of model uncertainty. The proposed multi-objective model yields a set of non-dominated solutions that approximate the Pareto front instead of one single solution, which provides more flexibility for decision makers to select the final best compromise solution based on their preferences. To solve the model, a hybrid multi-objective optimization algorithm called NSGAII-DMS that combines non-dominated sorting genetic algorithm II (NSGA-II) with direct multi-search (DMS) is proposed. NSGAII-DMS uses the search mechanism of NSGA-II during the early evolutionary phase to quickly find a set of non-dominated solutions and then uses the search mechanism of DMS to further tune the found non-dominated solutions. Two test examples show that the proposed multi-objective MRO model can produce a set of robust solutions by considering the location effect, dispersion effect, and model uncertainty. Further analyses illustrate that NSGAII-DMS shows significantly better search performance than several well-known benchmark multi-objective optimization algorithms, including NSGA-II, SPEA2, MOEA/D, and DMS.
... In addition, the sensitivity of the process setting conditions should be considered before the final implementation of any efficient Pareto solution (Kim and Lin, 2006). It is desirable to adopt and implement robust solutions that are less sensitive to deviations in setting conditions. ...
... MRO solutions also depend on whether only the means or the mean-variance of responses are considered. Readers may refer to solution approaches proposed for mean MRO problems (Carlyle et al., 2000; Chapman et al., 2014a; Derringer, 1994) and mean-variance MRO problems (Bera and Mukherjee, 2013; Kim and Lin, 2006; Sharma and Mukherjee, 2020). ...
... In the context of SCOO, a few of the popular response surface-based solution approaches include the contour plot, desirability function, generalised distance, loss function, and process performance indices (Bera and Mukherjee, 2018; Mukherjee and Ray, 2006). To simultaneously optimise the mean-variance of responses, Kim and Lin (2006) and Lee et al. (2018) suggest developing separate RS models for the mean and variance of each response and optimising them using a composite desirability function approach. Kazemzadeh et al. (2008) propose a goal programming framework to resolve mean-variance MRO problems. ...
Article
Full-text available
The primary objective of this study is to propose a robust multiobjective solution search approach for a mean-variance multiple correlated quality characteristics optimization problem, so-called 'multiple response optimization (MRO) problem.' The solution approach needs to consider response surface (RS) model parameter uncertainties, response uncertainties, process setting sensitivity, and response correlation strength to derive the robust solutions iteratively. A fine-tuned, non-dominated sorting genetic algorithm-II (NSGA-II) is used to derive efficient multiobjective solutions for varied mean-variance MRO problems. The final solutions are ranked based on different multi-criteria decision-making (MCDM) techniques.
... However, the Pareto function is of major interest in the context of information scarcity (Kim & Lin, 2006) (Eq. 9). ...
... Figure 28 gives a graphical explanation of how the Maximin aggregation function (Kim & Lin, 2006) aggregates the solutions. The function first determines the minimum z_i for each solution, where z_i is computed from y_i via a desirability function. ...
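As a small hedged illustration of that aggregation step (all numbers invented), maximin scores each candidate solution by its worst per-response desirability z_i and then selects the candidate whose worst case is best:

```python
# Hedged illustration of maximin aggregation with invented numbers: each
# candidate solution carries per-response desirabilities z_i in [0, 1].
candidates = {
    "A": [0.9, 0.4, 0.7],
    "B": [0.6, 0.6, 0.8],
    "C": [1.0, 0.2, 0.9],
}

# Score each solution by its worst (minimum) desirability, then pick the
# solution whose worst case is best.
scores = {name: min(z) for name, z in candidates.items()}
best = max(scores, key=scores.get)  # "B" wins with a worst case of 0.6
```

Here "B" is chosen: although it never excels, its worst response (0.6) beats the worst responses of "A" (0.4) and "C" (0.2), which is exactly the behavior the maximin function is designed to produce.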
Thesis
Optimization is a valuable process for design in general and architectural design in particular. Many generative design optimization tools exist in architecture, yet these tools are rarely used by architects. Architectural design poses ill-structured problems; the creativity and interpretation of designers are essential to solving them. In architecture, therefore, the acceptability of solutions to designers is as important as the numerical optimality of their performance. Although designers' preferences are crucial to acceptability, existing tools do not integrate them into the optimization process. The lack of possible involvement for the designer when using these tools is a major cause of architects' reluctance to adopt them. This thesis aims to define a set of recommendations that help developers propose decision-support systems that are more attractive to architects, by allowing greater integration of the designer into the optimization process and thus greater involvement when using these tools. To define this set of recommendations, the research began by exploring different design processes. From this exploration, a design framework based on four models (Morphogenesis, Observation, Interpretation, Aggregation: MOIA) is defined. Next, the typologies of tools used by architects are explored, and the best-known generative design optimization workflows are studied using MOIA as a reference. The research then adopts an experimental approach focused on designer acceptability. Five different experiments are carried out.
Two of the experiments compare different existing generative design optimization workflows using designer acceptability as the criterion. The other three experiments compare different aggregation functions using designers' judgment as the reference: the Pareto function, Maximin, and Derringer & Suich. The results of these experiments can be summarized in four points. First, visual programming is recommended for future generative optimization tools; it helps architects describe sophisticated parametric models without coding, since designers are generally not trained to code. Second, the graphical aspect of the tool can strongly influence the designer's decision: the performance of solutions should be presented graphically, and the representation method should depend on the number of objectives. Third, using an interactive optimization algorithm that allows designers to select solutions based on their subjective judgment of form can increase the acceptability of the workflows. Fourth, the availability of information is key to choosing a suitable aggregation function: the interpretation required by different aggregation functions is not always the same, and when the necessary information is available, cardinal functions with high negentropy are preferred to ordinal functions with low negentropy.
The methodology developed in this research made it possible to define a set of recommendations for building tools that are more attractive to designers, thereby encouraging the practice of optimization during design processes. The recommendations focus on giving designers the opportunity to be more involved in the process.
... The ordinary least-squares (OLS) method is widely used in process optimisation due to its computational simplicity and its efficiency under the normality assumption (Kim and Lin 2006; He, Zhu, and Park 2012; Ouyang et al. 2020). Vining and Bohn (1998) showed that the OLS method is often inadequate. ...
... Location and dispersion effects play a vital role in the determination of the optimal input setting. It is natural that the prediction mean and variance are used to measure the location and dispersion effects, respectively (Kim and Lin 2006). Therefore, the optimisation strategy based on the ε-constraint method and the two effects can be given as follows ...
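The snippet above is truncated before the formulation itself; as a generic hedged illustration (not the paper's actual models), an ε-constraint strategy of this kind minimizes a fitted dispersion model subject to the location model staying near its target. Both models, the target, and the tolerance below are invented.

```python
# Hedged sketch of an epsilon-constraint strategy over location and
# dispersion models; all models and numbers below are hypothetical.

TARGET, EPS = 10.0, 0.5   # hypothetical target for the mean and tolerance

def mean_model(x):        # hypothetical fitted location (mean) model
    return 6.0 + 5.0 * x - x ** 2

def var_model(x):         # hypothetical fitted dispersion (variance) model
    return 0.2 + (x - 2.5) ** 2

# Coarse grid search on x in [0, 5]: keep settings whose predicted mean is
# within EPS of the target, then pick the smallest predicted variance.
feasible = [i / 100.0 for i in range(0, 501)
            if abs(mean_model(i / 100.0) - TARGET) <= EPS]
best_x = min(feasible, key=var_model)
```

Tightening or relaxing EPS traces out the trade-off between the two effects, which is the usual way an ε-constraint formulation exposes the efficient frontier.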
Article
Many industrial process optimisation methods rely on empirical models that relate output responses to a set of design variables. One of the most crucial problems in process optimisation is how to efficiently implement model selection and model estimation. This paper presents a Bayesian hierarchical modelling approach to process optimisation based on the seemingly unrelated regression (SUR) models. This approach can estimate a set of predictors to be included in a model based on a Bayesian hierarchical procedure (i.e. model selection) and then give model prediction based on a Bayesian SUR model (i.e. model estimation). Meanwhile, a two-stage optimisation strategy considering practitioners’ preference information is proposed in process optimisation, which initially finds a set of non-dominated input settings and then determines the best one based on the similarity to an ideal solution method. The performance and effectiveness of the proposed method are illustrated with both simulation studies and a case study. The comparison results demonstrate that the proposed method can be a good alternative to existing process optimisation methods.
... Most of the existing desirability function methods focus on optimizing the means of multiple responses assuming that each has constant variance. Kim and Lin (2006) tackled this assumption and suggested simultaneous optimization of both the location and dispersion effects of multiresponse (i.e., mean and variability of multiple responses). This approach first constructs desirability functions for the mean and standard deviation of each response. ...
... Recently, Lee, Jeong, and Kim (2009) suggested a posterior preference articulation approach for optimizing the mean and variability of the multiresponse. This approach first builds d_j^l(x) and d_j^r(x) as in Kim and Lin (2006), then uses these functions to generate a set of nondominated solutions. A decision maker then selects the most preferred solution by considering tradeoffs between the mean and variance of the multiresponse. ...
Article
Full-text available
A multistage process consists of sequential stages where each stage is affected by its preceding stage, and it in turn affects the stage that follows. The process described in this article also has several input and response variables whose relationships are complicated. These characteristics make it difficult to optimize all responses in the multistage process. We modify a data mining method called the patient rule induction method and combine it with desirability function methods to optimize the mean and variance of multiresponse in the multistage process. The proposed method is explained by a step-by-step procedure using a steel manufacturing process example.
... Such an example was solved in Kim and Lin.9 The purpose of this example is to find optimal settings of the 3 input variables for which the means and standard deviations of the 3 responses are optimized. ...
... In this case, we determined these specification values based on specification information from Kim and Lin.9 Table 3 summarizes the specification values. ...
Article
Full-text available
A desirability function approach has been widely used in multi-response optimization due to its simplicity. Most of the existing desirability function-based methods assume that the variability of the response variables is stable; thus, they focus mainly on the optimization of the mean of multiple responses. However, this stable variability assumption often does not apply in practical situations; thus, the quality of the product or process can be severely degraded due to the high variability of multiple responses. In this regard, we propose a new desirability function method to simultaneously optimize both the mean and variability of multiple responses. In particular, the proposed method uses a posterior preference articulation approach, which has an advantage in investigating tradeoffs between the mean and variability of multiple responses. It is expected that process engineers can use this method to better understand the tradeoffs, thereby obtaining a satisfactory compromise solution.
... However, this is not unusual in works where the so-called Taguchi (Quality Engineering or Robust Parameter Design) method is used to optimize multiple responses, aiming to minimize response bias and variance around the response target value(s) simultaneously. Examples include Kim & Lin (2006) and Yang et al. (2023). The Pareto solution set and the respective variable values shown in Figure 8 are obtained by optimizing the mean responses. ...
Article
Generating a set of optimal solutions is a recommended practice when solving a multiresponse problem. However, it is known that some optimal solutions may yield unexpected outcomes when implemented in practice. Thus, to avoid wasting resources and time in implementing a theoretical optimal solution that does not produce the expected outcomes, a new approach to select a solution from the Pareto front is proposed. This approach employs a desirability-based function to aggregate all the desired response characteristics, namely the responses’ Bias, Resilience, Quality of Predictions, and Robustness. Two case studies illustrate the usefulness of the proposed approach.
... When it comes to pharmaceuticals, desirability includes minimizing product damage or loss, whether during transit or due to transportation time delays. The desirability of a route is influenced by the level of risk associated with its physical appearance, including population density or the number of households along the route.45,46 Criteria for selecting the transportation route included the route's winding slope (O1) and route surface. In our ZODP, the route selection in the preceding section was driven by the highest overall desirability level. ...
Article
Full-text available
The facility location problem is extended by a new two-stage zero-one programming system (2S-ZOPS). It is a type of design optimization issue that exists in logistics implementations such as supply chain planning in healthcare or agriculture. Along with concerns regarding the timeliness of PD delivery between logistics centers and customers, recent studies have considered the zero-one location design model. This research discussed a route selection model for the 2S-ZOPS that did not exist in the published studies, taking into account the level of risk associated with physical appearance. The mathematical models were developed in response to a PD supply chain design problem in Thailand’s National Health Insurance Program. By combining the virus optimization algorithm (VOA) with a large neighborhood search (LNS), we created a hybrid metaheuristic method for solving the 2S-ZOPS. Experiments with real-world data demonstrated that the hybrid algorithm was efficient in terms of time consumption and solution quality, saving approximately 6% of total costs. The presented practice benefits not only the healthcare industry but also various other businesses.
... The traditional DRSM usually assumes that the quality characteristics are normally distributed without contamination.16-18 If the normality assumption is violated or the data are contaminated, the optimal operating conditions for the controllable factors may be far removed from the actual optimal conditions.19,20 Park and Cho19 proposed adopting the sample median as the location effect and the median absolute deviation (MAD) or interquartile range (IQR) as the dispersion effect. ...
Article
To achieve better product quality, the dual‐response surface has been widely used to simultaneously achieve targets for the mean and reduced variance. The general assumption is that the experimental data are normally distributed without censoring. However, reliability data are usually non‐normally distributed and censored. Consequently, the classical dual‐response surface method may not be suitable for handling reliability data. This study proposed a dual‐response surface method based on robust parameter design (RPD) for reliability data. First, a percentile lifetime and an interquartile range (IQR) were used for the location and dispersion effects, respectively. A two‐stage method was then developed to identify the effects of significant factors. During the first stage, the Boruta algorithm was employed for factor effect selection, and during the second stage, the least absolute shrinkage and selection operator (LASSO) dimension reduction method was used to determine the final significant factor effects. Finally, a modified standard particle swarm optimization (SPSO) algorithm was adopted to determine the optimal solution. The proposed method was demonstrated using an industrial thermostat experiment. Compared with other methods, the proposed method performed better over various warranty periods, particularly for the 10th percentile.
... In addition, traditional methods usually assume that every response follows a normal distribution with its mean a function of the input factors and its variance constant. However, this ignores two very common situations in manufacturing: the response does not always follow a normal distribution, and the variance of the response also varies with the process variables (a phenomenon commonly referred to as heteroscedasticity) (Kim and Lin, 2006; Tan and Ng, 2009). Therefore, it is important to take into account the complex correlation among non-normal responses and to adequately capture the variability of the distribution. ...
Article
Full-text available
In production design processes, multiple correlated responses with different distributions are often encountered. The existing literature usually assumes that they follow normal distributions for computational convenience, and then analyzes these responses using traditional parametric methods. A few research papers assume that they follow the same type of distribution, such as the t-distribution, and then use a multivariate joint distribution to deal with the correlation. However, these methods give a poor approximation to the actual problem and may lead to the recommended settings that yield substandard products. In this article, we propose a new method for the robust parameter design that can solve the above problems. Specifically, a semiparametric model is used to estimate the margins, and then a joint distribution function is constructed using a multivariate copula function. Finally, the probability that the responses meet the specifications simultaneously is used to obtain the optimal settings. The advantages of the proposed method lie in the consideration of multiple correlation patterns among responses, the absence of restrictions on the response distributions, and the use of nonparametric smoothing to reduce the risk of model misspecification. The results of the case study and the simulation study validate the effectiveness of the proposed method.
... The desirability values of each objective can be combined into an overall desirability value using the geometric mean, weighted geometric mean, minimum, or maximin approaches (Bera & Mukherjee, 2012; Derringer, 1994; Harrington, 1965; Kim & Lin, 2006). Moreover, Jeong and Kim (2003) formulated the IDFA to handle a preference articulation process, in which decision makers can tighten or relax the shape, including the bound and target functions, to reach a satisfactory compromise (Jeong & Lee, 2020). ...
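As a hedged numeric illustration of those aggregation options (the desirabilities and weights below are invented), the same per-response scores can yield quite different overall values depending on the chosen aggregator:

```python
# Hedged illustration with invented numbers: aggregating per-response
# desirabilities d_i into an overall desirability value.
d = [0.8, 0.5, 0.9]          # hypothetical per-response desirabilities
w = [0.5, 0.3, 0.2]          # hypothetical weights (sum to 1)

geometric = (d[0] * d[1] * d[2]) ** (1.0 / 3.0)
weighted = d[0] ** w[0] * d[1] ** w[1] * d[2] ** w[2]
minimum = min(d)             # the quantity the maximin approach maximizes
```

Note the design consequence: the geometric means blend good and poor responses, while the minimum is driven entirely by the worst response, which is why maximin-style aggregation is the more conservative choice.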
Article
This study presents a new integrated approach based on the fuzzy analytic hierarchy process (FAHP) and zero-one desirability programming (ZODP) for selecting strategic marketing information system projects. The FAHP is employed to prioritize the criteria, and the ZODP is used to select a set of appropriate projects. The proposed integrated FAHP-ZODP approach maximizes the overall desirability value under various objectives, limited resources, and preferences. The findings of this study support decision makers in project selection when conflicting attributes are involved, ensure effective decision making with an understanding of the marketing environment, and assist project management under certain and uncertain environments.
... By contrast, to achieve a global optimum, a surrogate model is coupled with metaheuristics. The surrogate models include the response surface approach [35,36], Kriging [37], neural networks [38], fuzzy systems [39], and the adaptive-network-based fuzzy inference system [40]. Among them, the adaptive-network-based fuzzy inference system (ANFIS) is an accurate predictor. ...
Article
Full-text available
Compliant mechanisms are crucial parts in precise engineering but modeling techniques are restricted by a high complexity of their mechanical behaviors. Therefore, this paper proposes an optimal design method for compliant mechanisms. The integration method is a hybridization of statistics, finite element method, artificial intelligence, and metaheuristics. In order to demonstrate the superiority of the method, one degree of freedom is considered as a study object. Firstly, numerical datasets are achieved by the finite element method. Subsequently, the main design parameters of the mechanism are identified via analysis of variance. Desirability of both displacement and frequency of the mechanism is determined, and then, they are embedded inside a fuzzy logic system to combine into a single fitness function. Then, the relationship between the fine design variables and the fitness function is modeled using the adaptive network-based fuzzy inference system. Next, the single fitness function is maximized via the moth-flame optimization algorithm. The optimal results determined that the frequency is 79.517 Hz and the displacement is 1.897 mm. In terms of determining the global optimum solution, the current method is compared with the Taguchi, desirability, and Taguchi-integrated fuzzy methods. The results showed that the current method is better than those methods. Additionally, the proposed method outperforms other metaheuristic algorithms such as TLBO, Jaya, PSOGSA, SCA, ALO, and LAPO in terms of faster convergence. The result of this study will be considered for application to multiple-degrees-of-freedom compliant mechanisms in future work.
... If it is verified that they are uncorrelated, the two vectors may be treated with an appropriate analyzer which handles multiple responses. The robust multi-factorial screening and optimization of multiple responses is a challenging task because it requires specialized solvers (Gabrel et al., 2014; Kim and Lin, 2006). In dealing with 'saturated-unreplicated-and-censored' OA-datasets, the robust solver should be equipped to assess the symmetry status of the residual errors across all the examined factor-settings. ...
Article
Full-text available
Reliability enhancement is indispensable in modern operations. It aims to ensure the viability of complex functionalities in competitive products. We propose a full-robust screening/optimization method that promotes the rapid multi-factorial profiling of censored highly-fractionated lifetime datasets. The method intends to support operational conditions that demand quick, practical and economical experimentation. The innovative part of this proposal includes the robust split and quantification of structured lifetime information in terms of location and dispersion tendencies. To accomplish the robust data-reduction of lifetimes, maximum breakdown-point estimators are introduced to stabilize potential external-noise intrusions, which might be manifested as outliers or extremities. The novel solver provides resilience by robustifying the location (median) and dispersion (Rousseeuw-Croux Qn) estimations. The proposed profiler fuses dichotomized and homogenized lifetime information in a distribution-free manner. The converted and consolidated lifetime dataset is non-parametrically pre-screened to ensure error balances across effects. Consequently, any strong effects that maximize the lifetime response are diagnosed as long as the error symmetry has been previously established. We discuss problems that may be encountered in comparison to other multi-factorial profilers/optimizers upon application to densely-fractionated-and-saturated experimental schemes. We comment on the lean and agile advantages of the proposed technique with respect to several traditional treatments for the difficult case that implicates small and censored survival datasets. The robust screening procedure is illustrated on an industrial-level paradigm that concerns the multi-factorial reliability improvement of a thermostat; the trial units have been subjected to conditions of censoring and use-rate acceleration.
... The commonly used methods to deal with model uncertainty are Bayesian inference, interval estimation (He et al., 2017; Ouyang, Ma, Byun, Wang, & Tu, 2016), and ensemble modeling techniques (Kim & Lin, 2006), among others. Therefore, one possibility for modifying the proposed method is to incorporate an ensemble modeling technique or interval estimation into the framework of online RPD. ...
Article
With the rapid development of the Internet of Things and sensor technology, some noise factors can be measured or estimated during operation and production. This paper develops a new multi-response optimization method that facilitates online parameter design by using the extra information available about observable noise factors. A Bayesian multivariate regression model and a Bayesian vector autoregressive model are used to consider the uncertainty of both the response model and the noise model. The Monte Carlo procedure is employed to obtain the predictions of multiple correlated noise factors from their posterior predictive distribution. The proposed method provides a convenient way to continuously update process settings during production, which helps to further reduce the influence of the variability in the noise factors on product or process quality. Two examples are used to illustrate the effectiveness of the proposed method. The results show that the proposed method outperforms the offline parameter design and another online parameter design that does not consider model parameter uncertainty.
... Ko, Kim, and Jun (2005) proposed a loss function-based method that considers the robustness and quality of optimisation results simultaneously. Ouyang et al. (2016a) improved the optimisation approach of Kim and Lin (2006) to develop a weighted optimisation method that comprehensively accounts for location and dispersion performance; they pointed out that the weighted MSE method and the data-driven method proposed by Ding, Lin, and Wei (2004) can balance well between the different objectives based on the idea of an 'efficient curve'. ...
Article
In robust parameter design, it is common to use computer models to simulate the relationships between input variables and output responses. However, for the contaminated experimental data, the model uncertainty between computer models and actual physical systems will seriously impair the robustness of the optimal input settings. In this paper, we propose a new weighted robust design approach concerning the model uncertainty from outliers based on the robust Gaussian process model with a Student-t likelihood (StGP). Firstly, to reduce the impact of outliers on the output means and variances, the StGP modelling technique is adopted to estimate the relationship models for contaminated data. Secondly, the Gibbs sampling technique is employed to estimate model parameters for better mixing and convergence. Finally, an optimisation scheme integrating the quality loss function and confidence interval analysis approach is built to find the feasible optimisation solution. Meanwhile, the hypersphere decomposition method and data-driven method are applied to determine the relative weights of objective functions. Two examples are used to demonstrate the effectiveness of the proposed approach. The comparison results show that the proposed approach can achieve better performance than other approaches by considering the model uncertainty from outliers.
... Numerous industrial experiments involve more than one quality characteristic of interest. A common problem in quality design is how to optimize multiple response surfaces simultaneously, which is called a multi-response surfaces (MRS) optimization problem (Kim & Lin, 2006). In MRS optimization problems, there exists a series of research issues, such as correlation among multiple responses (Chiao & Hamada, 2001; Zhang, Ma, Ouyang, & Liu, 2016), robustness measurement of multivariate processes (He, Zhu, & Park, 2012), making trade-offs among multiple optimization goals (Shin, Samanlioglu, Cho, & Wiecek, 2011), uncertainty of model parameters or the model form itself (Ng, 2010; Ouyang, Ma, Wang, & Tu, 2017), prediction performance of the process model (Ouyang, Ma, Byun, Wang, & Tu, 2016), and reliability assessment for optimization results (Peterson, 2004; Peterson, Miro-Quesada, & Del Castillo, 2009). ...
Article
This paper proposes a new Bayesian modeling and optimization method for multi-response surfaces (MRS). The proposed approach not only measures the conformance probability (i.e., the probability of all responses simultaneously falling within their corresponding specification limits) through the posterior predictive function but also takes into account the expected loss (i.e., bias, quality of predictions, and robustness) with the expected loss function. It is also shown that Bayesian SUR models provide more flexible and accurate modeling than standard multivariate regression (SMR) models with the same covariate structure across different responses. Moreover, the proposed approach accounts for the correlation structure of the response data, the variability of the process distribution, and the uncertainty of model parameters, as well as the prediction performance of the response model. A polymer experiment and a simulation experiment are used to demonstrate the effectiveness of the proposed approach. The comparison results show that the Bayesian SUR model has a higher conformance probability and a lower expected loss than the Bayesian SMR model.
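The conformance probability described in this abstract can be estimated by Monte Carlo sampling from a predictive distribution. The sketch below is a simplified illustration, not the paper's Bayesian SUR machinery: it assumes the posterior predictive has already been summarized as a multivariate normal, and all means, covariances, and specification limits are invented for demonstration.

```python
import numpy as np

def conformance_probability(mean, cov, lower, upper, n_draws=100_000, seed=0):
    """Monte Carlo estimate of P(all responses fall within their spec limits),
    drawing from a multivariate-normal approximation of the predictive
    distribution."""
    rng = np.random.default_rng(seed)
    draws = rng.multivariate_normal(mean, cov, size=n_draws)
    inside = np.all((draws >= lower) & (draws <= upper), axis=1)
    return inside.mean()

# Two correlated responses with spec limits [0, 10] and [2, 8] (illustrative).
p = conformance_probability(
    mean=[5.0, 5.0],
    cov=[[1.0, 0.5], [0.5, 1.0]],
    lower=[0.0, 2.0],
    upper=[10.0, 8.0],
)
```

Because the second response has a standard deviation of 1 and limits three standard deviations from its mean, the joint probability here lands near 0.997; tightening either limit pulls it down quickly.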
... The reason for this assumption is that quality improvement for miniature and precise products usually focuses on the location and dispersion effects simultaneously. These two effects are widely regarded as equally important in most of the literature (Kim & Lin, 2006; Lin & Tu, 1995). The squared bias and the variance are typical representatives of the location and dispersion effects, respectively. ...
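The squared-bias/variance view of location and dispersion can be made concrete with the mean squared error criterion of Lin and Tu (1995), MSE = (mu_hat - T)^2 + sigma_hat^2. A minimal sketch, in which the fitted mean and standard-deviation surfaces are made-up quadratics over a single input:

```python
import numpy as np

def mse_criterion(mu_hat, sigma_hat, target):
    """Dual-response MSE criterion (Lin & Tu, 1995): the squared bias captures
    the location effect, the variance the dispersion effect."""
    return (mu_hat - target) ** 2 + sigma_hat ** 2

# Illustrative fitted response surfaces for the mean and standard deviation
# (the quadratic forms are invented purely for demonstration).
def mu(x):
    return 10.0 + 2.0 * x - 0.5 * x ** 2

def sigma(x):
    return 1.0 + 0.3 * (x - 1.0) ** 2

# Grid search over a single coded input for the setting minimizing MSE.
grid = np.linspace(-2, 4, 601)
mse = mse_criterion(mu(grid), sigma(grid), target=11.0)
x_star = grid[np.argmin(mse)]
```

Note the compromise: the setting minimizing MSE deliberately trades a little bias for a smaller standard deviation, rather than sitting exactly where the mean hits the target.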
Article
This paper proposes an ensemble radial basis function neural network that selects important RBF subsets based on a Pareto chart using bootstrap samples. Then, the analysis of variance method is used to determine the choice of unequal/equal weights. The effectiveness of the proposed technique is illustrated with a micro-drilling process. The comparison results show that the proposed technique can not only improve the model prediction performance but also generate a reliable scheme for quality design. Keywords: RBF model, Ensemble model, Model selection, Pareto chart, Process optimization
... The reason for this assumption is that quality improvement for miniature and precise products usually focuses on the location and dispersion effects simultaneously. These two effects are widely regarded as equally important in most of the literature (Lin and Tu, 1995; Kim and Lin, 2006). The squared bias and the variance are typical representatives of the location and dispersion effects, respectively. ...
Article
This paper was published in the Journal of Management Science and Engineering, which is supported by the National Natural Science Foundation of China (NSFC), in December 2016. More information about this paper can be found at https://www.sciencedirect.com/science/article/pii/S2096232019300769. Abstract: This paper proposes an ensemble radial basis function neural network that selects important RBF subsets based on a Pareto chart using bootstrap samples. Then, the analysis of variance method is used to determine the choice of unequal/equal weights. The effectiveness of the proposed technique is illustrated with a micro-drilling process. The comparison results show that the proposed technique can not only improve the model prediction performance but also generate a reliable scheme for quality design.
... A deeper study of this criterion can also be found in [23]. ...
Article
In multiobjective optimization, objectives are typically conflicting, i.e., they cannot reach their individual optima simultaneously. Identifying the best one among Pareto-optimal solutions is not a simple task for the decision maker (DM), since the Pareto-optimal set can potentially contain a very large number of solutions. To ease this task, it is possible to resort to aggregate objective functions that take into consideration the DM's preferences and objectives; however, accurately specifying meaningful weights can be a challenge for many practitioners. Moreover, for the same DM's preferences, different criteria give different results. A new post-Pareto analysis methodology, based on the sum of ranking differences, is proposed to rank and detect possible groupings of similar solutions of the Pareto front that match the DM's preferences. In this way, the proposed technique provides the DM with a smaller set of optimal solutions. The proposed method was tested in two practical benchmark problems, in the design of a brushless DC motor and in the optimization of a die press model (TEAM Workshop Problem 25).
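The sum-of-ranking-differences idea can be sketched in a few lines; this is a bare-bones version (ignoring ties and the validation steps of the full SRD methodology), with a row-mean consensus as the reference and purely illustrative scores:

```python
import numpy as np

def ranks(v):
    """Rank values in ascending order (1 = smallest); assumes no ties."""
    r = np.empty(len(v), dtype=float)
    r[np.argsort(v)] = np.arange(1, len(v) + 1)
    return r

def sum_of_ranking_differences(solutions, reference):
    """Sum of ranking differences (SRD): for each candidate column, rank its
    values across the objectives and sum the absolute deviations from the
    ranking of a reference column; smaller SRD = closer to the reference."""
    ref_rank = ranks(reference)
    return np.array([np.abs(ranks(col) - ref_rank).sum() for col in solutions.T])

# Rows: objective values of each Pareto-optimal solution; columns: solutions.
# All numbers are illustrative only.
scores = np.array([
    [0.9, 0.5, 0.1],
    [0.8, 0.6, 0.2],
    [0.7, 0.4, 0.3],
])
reference = scores.mean(axis=1)  # consensus reference used for the ranking
srd = sum_of_ranking_differences(scores, reference)
```

Here the middle solution ranks the objectives exactly like the consensus reference (SRD = 0), so it would be offered to the DM first.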
... It should be noted that this technique (multi-response surface regression) involves a set of methods that seek to understand the relationship between the inputs and outputs of the system. In fact, the designer seeks to establish an optimal relation between these variables (Kim & Lin, 2006). In other words, using response surface analyses, the relationship between strategic objectives in different perspectives of the BSC method is determined, and it becomes clear which strategic objectives have a significant relationship. ...
Article
Full-text available
Purpose The purpose of this paper is to propose a quantitative methodology for setting targets in the framework of Balanced Scorecard (BSC) in order to achieve vision and goals. Design/methodology/approach Response Surface Methodology is proposed to find the significant relationships that should be included in the strategy map and the optimal values of performance measures are assessed by using the desirability function-based approach of RSM. The proposed method was created by reviewing the existing literature, modeling the problem, and applying it in an oil company. In fact, RSM is used to execute the design matrix, analyze the collected data, extract models, analyze the results, and optimize the procedures that generate multiple responses. Findings By applying this methodological design, a clearer picture of the relationships between strategic objectives is obtained and the influence of strategic objectives on one another is determined. Afterward, optimal values for performance measures are determined. Research limitations/implications This paper proposes a framework for constructing a strategy map and setting quantitative targets to translate the goals and strategies into corresponding performance measures and targets. Also, this paper presents a case study to demonstrate the applicability and effectiveness of the proposed approach. However, RSM-based techniques require a greater amount of data to generate more accurate results. Although the advent of the Information Age has forced organizations’ decision makers to provide sufficient information and data for business analysis, the data requirements of RSM-based techniques are met. Practical implications In practice, the process of setting targets for performance measures can be challenging in terms of reaching a consensus between managers and decision makers. 
The findings of this paper can offer a new approach for performance evaluation based on the BSC which allows the organization’s decision makers to reach a more accurate picture of the relationship model between organization goals and those objectives within the BSC. It also demonstrates how decision makers can be guided in the process of defining performance target values in the BSC method. Originality/value Reviewing the literature on setting quantitative targets within the framework of the BSC showed no prior study in which RSM is used. This approach has two main contributions: the associations among strategic objectives are investigated and obtained in an effective way which analytically identifies the direction and degree of the relations among the performance measures. Considering the performance evaluation structure based on the BSC, quantitative targets have been determined to help in achieving the long-term goals of the organization. The application of the proposed method in a company showed that the contributions of this research are not only theoretical, but practical as well.
... Some variations of this approach are discussed by Kim and Lin [30]. ...
Article
A diversity of multiresponse optimization methods has been introduced in the literature; however, their performance has not been thoroughly explored, and only a classical desirability-based criterion has been commonly used. With the aim of helping practitioners select an effective criterion for solving multiresponse optimization problems developed under the response surface methodology framework, and thus find compromise solutions that are technically and economically more favorable, the working ability of several easy-to-use criteria is evaluated and compared with that of a theoretically sound method. Four case studies with different numbers and types of responses are considered. Less-sophisticated criteria were able to generate solutions similar to those generated by sophisticated methods, even when the objective is to depict the Pareto frontier in problems with conflicting responses. Two easy-to-use criteria that require less-subjective information from the user yielded solutions similar to those of a classical desirability-based criterion. The impact of the preference parameters' range and increment on the optimal solutions was also evaluated.
... posterior distribution of the response considers the uncertainty in the model parameters. Kim and Lin (2006) presented an integrated modeling approach to reduce the effect of model parameter uncertainty by maximizing the overall minimal satisfaction level. Myers et al. (2009) derived an unbiased estimator for the variance of the model prediction by considering the covariance of the model parameters. ...
Article
Response surface-based design optimization has been commonly used to seek the optimal input settings in process or product design problems. Yet most existing research assumes that there is no model parameter uncertainty in the modeling process and that the optimal settings can be implemented at their precise values. These two assumptions are far from reality. Consequently, the optimal settings often turn out to be suboptimal in some manner. This paper proposes a new loss function method to deal with model parameter uncertainty and implementation errors (MPUIE). An interpretable expression for the new optimization strategy is derived, which provides insights into the impact of MPUIE on the determination of the optimal settings. A random simulation example and a real-life case study are used to demonstrate the effectiveness of the proposed approach. The approach gives optimal settings that are the result of making tradeoffs among different directions from the center of the experimental design region or, more generally, in a manner that mitigates the adverse effects of MPUIE.
Article
Mathematical model development for multivariate decision-making problems (i.e., multiple parameter and response optimization problems) has received significant attention and has been investigated by many researchers and practitioners. Although many different optimization models and methods, including priority and weight allocation problems, have been proposed in the literature, there is significant room for improvement. The majority of existing optimization methods may have difficulties in determining reliable priorities and weights for practical industrial situations. In addition, because they often focus on tradeoffs only among multiple output responses, these methods may not incorporate the multiple priority and/or weight effects directly from input factors to output responses (i.e., quality characteristics). To address these problems, the primary objective of this research is to propose a new multi-objective optimization approach by integrating a game theory principle (i.e., the Stackelberg leadership game) into a robust parameter design model to determine the optimal factor settings. First, a multi-response robust design optimization (RDO) problem was formulated using a mean squared error model and response surface methodology. A Stackelberg leadership game is then integrated into this RDO problem, where multiple responses play the role of game participants. Second, the integrated RDO model using the Stackelberg game-based multi-response (SGMR) formulation approach is analyzed by decomposition into various sequential optimization models in terms of different leader–follower relationships. Finally, non-dominated solutions are obtained from the proposed model by evaluating the overall quality loss (OQL). A pharmaceutical numerical example was performed to demonstrate the proposed integrated model and to examine whether this model can determine the most efficient solutions when the relative importance among multiple responses is unidentified. 
In addition, a comparative study of the conventional multi-objective optimization methods and the newly proposed RDO model is presented; the proposed RDO model provided significantly better solutions than the conventional methods.
Preprint
Full-text available
An important problem in manufacturing or product and process design is the simultaneous optimization of several responses. Common approaches for multiple response optimization problems often begin with estimating the relationship between the responses as outputs and the control factors as inputs. Among these methods, response surface methodology (RSM) has attracted the most attention in recent years, but in certain cases the relationship between the responses and control factors is far too complex to be efficiently estimated by regression models and RSM, especially when several responses must be optimized simultaneously. An alternative approach proposed in this paper is to use an artificial neural network (ANN) to estimate the response functions. Because of the high mean squared error (MSE) in the neural network training step, heuristic algorithms are used instead of gradient-descent-based algorithms. In the optimization phase, particle swarm optimization (PSO) and a desirability function are used to determine the optimal settings for the control factors. Two case studies from the literature illustrate the strength of the proposed approach in optimizing multiple response problems.
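The desirability-function step can be sketched as follows. This is the standard two-sided Derringer-Suich form rather than this paper's exact implementation, the PSO search is omitted, and all bounds and targets are invented for illustration:

```python
import math

def desirability_target(y, low, target, high, s=1.0, t=1.0):
    """Two-sided Derringer-Suich desirability: 1 at the target value,
    decreasing to 0 at the lower/upper acceptance bounds; the exponents
    s and t shape how quickly satisfaction drops off."""
    if y < low or y > high:
        return 0.0
    if y <= target:
        return ((y - low) / (target - low)) ** s
    return ((high - y) / (high - target)) ** t

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities: one fully
    unacceptable response (d = 0) zeroes out the overall score."""
    return math.prod(ds) ** (1.0 / len(ds))

# Illustrative: two responses evaluated at one candidate factor setting.
d1 = desirability_target(9.0, low=5.0, target=10.0, high=12.0)
d2 = desirability_target(7.0, low=4.0, target=6.0, high=9.0)
D = overall_desirability([d1, d2])
```

An optimizer (PSO, a genetic algorithm, or plain grid search) would then maximize D over the control-factor space, with the predicted responses supplied by the fitted models.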
Article
It is known that development parameters such as injection and production rate, injection temperature, and well location and spacing play an important role in determining the recoverable geothermal energy in a licensed region during the exploitation stage, but they are seldom considered in geothermal resource assessment. In this study, a robust assessment method for recoverable geothermal energy that considers the optimal development parameters is developed. To maximize the recoverable geothermal energy while avoiding environmental problems, including reservoir pressure decline, thermal breakthrough, and surface subsidence, an efficient approach to optimizing development parameters is proposed using a fully coupled thermo-hydro-mechanical reservoir model, experimental design and proxy models, and a multiple response optimization method. Based on a case study of two doublets in the Tongzhou district, China, the robustness and reasonableness of the developed assessment method are demonstrated. The results show that the optimal development parameters can significantly increase the recoverable geothermal energy, but the optimal development parameters differ between doublets due to the complexity of geological structures and the heterogeneity of reservoir parameters. Therefore, it is highly recommended to maximize the recoverable geothermal energy by selecting the optimal development parameters.
Article
Full-text available
To address the problem of supply chain robust optimization with multiple performance responses, a robust optimization method is proposed that combines principal component analysis with the dual response surface method based on Kriging metamodels. First, through principal component analysis, the location and dispersion characteristics of multiple correlated performance responses are transformed into a principal component composite score of supply chain performance. Second, Kriging metamodels of the location and dispersion characteristics of the multiple performance responses are constructed, along with a Kriging-based principal component composite score model of overall performance. The optimal operating conditions are then obtained by solving the constructed robust optimization strategy, realizing robust optimization of a supply chain system with correlated multiple responses. Finally, a supply chain simulation case illustrates the effectiveness of the proposed optimization method, and the robustness of different methods is also discussed. The comparison results show that the proposed method is more robust.
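The principal-component composite score can be sketched in a simplified form that skips the Kriging metamodels: standardize the responses, project them onto the principal components, and weight each component score by its explained-variance ratio. The data below are synthetic.

```python
import numpy as np

def pca_composite_score(Y):
    """Collapse correlated responses into one score: project standardized
    responses onto the principal components and weight each component score
    by its explained-variance ratio."""
    Z = (Y - Y.mean(axis=0)) / Y.std(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(eigvals)[::-1]          # descending by variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    weights = eigvals / eigvals.sum()
    return (Z @ eigvecs) @ weights

# Two synthetic responses driven by a common latent factor, plus small noise.
rng = np.random.default_rng(0)
base = rng.normal(size=(50, 1))
Y = np.hstack([base + 0.1 * rng.normal(size=(50, 1)),
               base + 0.1 * rng.normal(size=(50, 1))])
score = pca_composite_score(Y)
```

With strongly correlated responses, the first component dominates the weights and the composite score essentially tracks the shared factor (up to an arbitrary sign, since eigenvector orientation is not unique).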
Article
Full-text available
Minimizing the impact of mixed uncertainties (i.e., aleatory and epistemic uncertainty) on quality improvement for a complex product of compliant mechanism (CPCM) is a fascinating research topic for enhancing robustness. However, most existing work on CPCM robust design optimization neglects the mixed uncertainties, which might result in an unstable or even infeasible design. To solve this issue, a response surface methodology-based hybrid robust design optimization (RSM-based HRDO) approach is proposed to improve the robustness of the quality characteristic of the CPCM by considering the mixed uncertainties in the robust design optimization. A bridge-type amplification mechanism is used to demonstrate the effectiveness of the proposed approach. The comparison results show that the proposed approach not only retains its superiority in robustness but also provides a robust scheme for optimizing the design parameters. Keywords: response surface methodology (RSM), hybrid robust design optimization (HRDO), uncertainty, complex product of compliant mechanism (CPCM).
Article
Full-text available
Airfoils play significant roles in aerodynamic engineering as they are important devices in designing aircraft and engines. For airfoils, geometric design variables often deviate from nominal values that deteriorate airfoil quality. Thus, the design optimization of airfoils under the noises of design variables is important. However, the existing literature does not consider functional responses that can more adequately describe airfoil performances than univariate and multivariate responses. To fill in this gap and further improve airfoil quality, we develop a surrogate-based robust parameter design methodology for the design optimization of airfoils with functional responses. A new optimization criterion that involves both location and dispersion effects associated with the functional feature of quality losses, noises of design variables, and emulator uncertainty is proposed. A Gaussian process (GP) model with functional responses is employed as a surrogate/emulator. An efficient method for computing the proposed criterion is developed. The proposed methodology is applied to a typical airfoil design optimization problem to demonstrate that better and more robust solutions are obtained with the proposed criterion for airfoils, and the computing time is significantly reduced with the proposed estimation method.
Article
Multiresponse problems are common in product or process development. A conventional approach for optimizing multiple responses is to use a response surface methodology (RSM), and this approach is called multiresponse surface optimization (MRSO). In RSM, the method of steepest ascent is widely used for searching for an optimum region where a response is improved. In MRSO, it is difficult to directly apply the method of steepest ascent because MRSO includes several responses to be considered. This paper suggests a new method of steepest ascent for MRSO, which accounts for tradeoffs between multiple responses. It provides several candidate paths of steepest ascent and allows a decision maker to select the most preferred path. This generation and selection procedure is helpful to better understand the tradeoffs between the multiple responses, and ultimately, it moves the experimental region to a good region where a satisfactory compromise solution exists. A hypothetical example is employed for illustrating the proposed procedure. The results of this case study show that the proposed method searches the region containing an optimum where a satisfactory compromise solution exists.
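For a single response, the classical path of steepest ascent simply follows the fitted first-order coefficient vector from the design centre; the multiresponse method above generalizes this idea to several candidate paths. A minimal single-response sketch with invented coefficients in coded units:

```python
import numpy as np

def steepest_ascent_path(coefs, radii):
    """Points along the path of steepest ascent for a fitted first-order
    model y = b0 + b'x: move from the design centre in the direction of the
    coefficient vector, one point per requested design radius."""
    b = np.asarray(coefs, dtype=float)
    direction = b / np.linalg.norm(b)
    return np.outer(radii, direction)

# Illustrative first-order coefficients for two coded input variables.
path = steepest_ascent_path([3.0, 4.0], radii=[0.5, 1.0, 1.5, 2.0])
```

Each row of `path` is one experimental run along the path; in a multiresponse setting, one such path would be generated per candidate direction and the decision maker would pick the preferred one.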
Article
To solve multiple response optimization problems that often involve incommensurate and conflicting responses, a robust interactive desirability function approach is proposed in this article. The proposed approach consists of a parameter initialization phase and calculation and decision-making phases. It considers a decision maker's preference information regarding tradeoffs among responses and the uncertainties associated with predicted response surface models. The proposed method is the first to consider model uncertainty using an interactive desirability function approach. It allows a decision maker to adjust any of the preference parameters, including the shape, bound, and target of a modified robust function with consideration of model uncertainty in a single and integrated framework. This property of the proposed method is illustrated using a tire tread compound problem, and the robustness of the adjustments for the approach is also considered. The new method is shown to be highly effective in generating a compromise solution that is faithful to the decision maker's preference structure and robust to uncertainties associated with model predictions.
Article
Most manufacturing industries produce products through a series of sequential stages, known as a multistage process. In a multistage process, each stage affects the stage that follows, and the process often has multiple response variables. In this paper, we suggest a new procedure for optimizing a multistage process with multiple response variables. Our method searches for an optimal setting of input variables directly from operational data according to a patient rule induction method (PRIM) to maximize a desirability function, to which multiple response variables are converted. The proposed method is explained by a step‐by‐step procedure using a steel manufacturing process as an example. The results of the steel manufacturing process optimization show that the proposed method finds the optimal settings of input variables and outperforms the other PRIM‐based methods.
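The PRIM search referred to above can be sketched as a single-box "peeling" loop: repeatedly shave off a small slice of one input variable, keeping the cut that most increases the mean response inside the box. This is a much-simplified version of PRIM (no pasting step, no multistage structure), applied to synthetic data in which a single desirability-like response stands in for the converted multiple responses.

```python
import numpy as np

def prim_peel(X, y, alpha=0.1, min_support=0.1):
    """Greedy top-down peeling: at each step, try shaving the alpha-fraction
    tail on each side of each input, and keep the cut that maximizes the mean
    response inside the box; stop when support or improvement runs out."""
    lo, hi = X.min(axis=0).astype(float), X.max(axis=0).astype(float)
    inside = np.ones(len(y), dtype=bool)
    while inside.mean() > min_support:
        best = None
        for j in range(X.shape[1]):
            for side, q in (("lo", alpha), ("hi", 1 - alpha)):
                cut = np.quantile(X[inside, j], q)
                keep = inside & ((X[:, j] >= cut) if side == "lo"
                                 else (X[:, j] <= cut))
                if keep.sum() == 0 or keep.sum() == inside.sum():
                    continue
                if best is None or y[keep].mean() > best[0]:
                    best = (y[keep].mean(), keep, j, side, cut)
        if best is None or best[0] <= y[inside].mean():
            break  # no cut improves the in-box mean
        _, inside, j_best, side_best, cut_best = best
        if side_best == "lo":
            lo[j_best] = cut_best
        else:
            hi[j_best] = cut_best
    return lo, hi, inside

# Synthetic data: the response is high only near the upper corner of the box.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(400, 2))
y = (X[:, 0] > 0.7) * (X[:, 1] > 0.7) + rng.normal(0, 0.05, 400)
lo, hi, inside = prim_peel(X, y)
```

The returned `lo`/`hi` bounds form an interpretable rule of the kind PRIM produces ("keep x1 and x2 above their cut points"), which is the appeal of the method for operational data.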
Article
Robust design is an effective Quality by Design method to reduce product variation by selecting levels of design factors. For a number of situations, a nonstandard design region with linearly limited resources is needed to conduct an experiment. In the literature, little attention has been given to the development of robust design models for the nonstandard design region with a combination of linearly limited resources and a limited number of design points. In this paper, a selection scheme of D‐optimal experimental design points is proposed to generate design points using the modified exchange algorithm for the nonstandard design region while specifying linearly limited resources and the limited number of design points. The modified exchange algorithm is able to generate global design points with less time complexity than the improved Fedorov algorithm. In addition, robust design models linking a D‐optimal experimental design with quality considerations are proposed in order to obtain optimum settings of design factors for the product. Comparative studies are also presented. Finally, a real‐life experimental study shows that the proposed models with the desirability function and the sequential quadratic programming technique achieve greater variance reduction than the traditional counterparts.
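The point-exchange idea behind D-optimal design algorithms can be sketched with a plain greedy exchange; this is not the paper's modified exchange algorithm, and the linearly limited resources are omitted. It is applied to the textbook problem of choosing three points for a quadratic model on [-1, 1], whose D-optimal design is {-1, 0, 1}:

```python
import numpy as np

def d_criterion(X):
    """Log-determinant of the information matrix X'X (D-optimality)."""
    sign, logdet = np.linalg.slogdet(X.T @ X)
    return logdet if sign > 0 else -np.inf

def exchange_d_optimal(candidates, n_points, n_passes=20, seed=0):
    """Greedy point exchange: start from a random subset of the candidate
    rows, then repeatedly swap a design point for a candidate whenever the
    swap increases det(X'X), until no swap improves."""
    rng = np.random.default_rng(seed)
    idx = list(rng.choice(len(candidates), size=n_points, replace=False))
    best = d_criterion(candidates[idx])
    for _ in range(n_passes):
        improved = False
        for i in range(n_points):
            for j in range(len(candidates)):
                if j in idx:
                    continue
                trial = idx.copy()
                trial[i] = j
                val = d_criterion(candidates[trial])
                if val > best:
                    idx, best = trial, val
                    improved = True
        if not improved:
            break
    return np.sort(idx), best

# Candidate set: model matrix [1, x, x^2] over a 1-D grid (illustrative).
x = np.linspace(-1, 1, 21)
F = np.column_stack([np.ones_like(x), x, x ** 2])
design_idx, logdet = exchange_d_optimal(F, n_points=3)
```

On this tiny grid the greedy exchange recovers the known optimum {-1, 0, 1}; the algorithms discussed in the paper add constraint handling and faster update formulas on top of this basic loop.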
Article
Purpose The purpose of this paper is to address three key objectives. The first is the proposal of an enhanced multiobjective optimisation (MOO) solution approach for the mean and mean-variance optimisation of multiple “quality characteristics” (or “responses”), considering predictive uncertainties. The second objective is comparing the solution qualities of the proposed approach with those of existing approaches. The third objective is the proposal of a modified non-dominated sorting genetic algorithm-II (NSGA-II), which improves the solution quality for multiple response optimisation (MRO) problems. Design/methodology/approach The proposed solution approach integrates empirical response surface (RS) models, a simultaneous prediction interval-based MOO iterative search, and the multi-criteria decision-making (MCDM) technique to select the best implementable efficient solutions. Findings Implementation of the proposed approach in varied MRO problems demonstrates a significant improvement in the solution quality in worst-case scenarios. Moreover, the results indicate that the solution quality of the modified NSGA-II largely outperforms those of two existing MOO solution strategies. Research limitations/implications The enhanced MOO solution approach is limited to parametric RS prediction models and continuous search spaces. Practical implications The best-ranked solutions according to the proposed approach are derived considering the model predictive uncertainties and MCDM technique. These solutions (or process setting conditions) are expected to be more reliable for satisfying customer specification compared to point estimate-based MOO solutions in real-life implementation. Originality/value No evidence exists of earlier research that has demonstrated the suitability and superiority of an MOO solution approach for both mean and mean-variance MRO problems, considering RS uncertainties. 
Furthermore, this work illustrates the step-by-step implementation results of the proposed approach for the six selected MRO problems.
Chapter
Most manufacturing industries produce products through a series of sequential processes; this is called a multistage process. It is often difficult to optimize a multistage process due to the correlation between stages. Therefore, the relationships among the multiple processes should be considered in multistage process optimization. The processes also often have multiple responses, so it is important to optimize the multiple responses of a multistage process. Recently, data mining techniques have been widely applied to process optimization. The proposed method attempts to optimize the multiple responses of a multistage process using a particular data mining method, called the patient rule induction method. The proposed method obtains an optimal setting of input variables directly from operational data, in which the multiple responses are optimized simultaneously. The proposed approach is explained and illustrated by a step-by-step procedure with a case example.
Article
A common problem generally encountered during manufacturing process improvement involves the simultaneous optimization of multiple 'quality characteristics', or so-called 'responses', and determining the best process operating conditions. Such a problem is also referred to as a 'multiple response optimization (MRO) problem'. The presence of interaction between the responses calls for a trade-off solution. The term 'trade-off' denotes an explicit compromise solution considering the bias and variability of the responses around the specified targets. The global exact solution in such nonlinear optimization problems is usually unknown, and various trade-off solution approaches (based on process response surface models or without them) have been proposed by researchers over the years. Considering the prevalent and preferred solution approaches, the scope of this paper is limited to response surface (RS)-based solution approaches and closely related solution frameworks for MRO problems. This paper contributes by providing a detailed step-by-step RS-based MRO solution framework. The applicability and steps of the solution framework are also illustrated using a real-life in-house pin-on-disc design of experiment study. A critical review of solution approaches, with details on inherent characteristic features, assumptions, limitations, application potential in manufacturing, and selection norms (indicative of the application potential) of the suggested techniques/methods to be adopted for implementation of the framework, is also provided. To instigate research in this field, scopes for future work are also highlighted at the end.
Article
Full-text available
In product and process optimization, it is common to have multiple responses to be optimized. This is called multi-response optimization (MRO). When optimizing multiple responses, it is important to consider the variability as well as the mean of the multiple responses. The authors call this problem extended MRO (EMRO), where both the mean and the variability of the multiple responses are optimized. In this article, they propose a data mining approach to EMRO. Analyzing large volumes of operational data has recently been getting attention due to the development of data processing techniques. Traditional MRO methods take a model-based approach. However, this approach has limitations when dealing with a large volume of operational data. The authors propose a particular data mining method by modifying the patient rule induction method for EMRO. The proposed method obtains an optimal setting of the input variables directly from the operational data, where the mean and standard deviation of the multiple responses are optimized. The authors explain a detailed procedure of the proposed method with case examples.
Article
The impact of natural process variability and model uncertainty on the reproducibility of nondominated solutions cannot be ignored if one is to assure that a product or process will perform as expected when theoretical results are implemented in production environments. To help the decision-maker make more informed decisions when selecting a nondominated solution, two metrics are used to assess the predicted variability of nondominated solutions: one (the predicted standard error) quantifies the uncertainty in the estimated value of each response, and the other (the quality of predictions) quantifies the uncertainty associated with each generated solution. Supplementary material is provided to help practitioners calculate the metric values.
Article
Health care professionals often use regression methods to quantitatively describe the functional relationships between predictors and outcomes. However, little research investigates the appropriateness of tool application for response-surface-based design of experiments, the choice of regression estimators under different environments, and the impact of the determination of optimum conditions, or optimal factor-level settings, to achieve desirable target outcomes. Robust parameter design (RPD) is an established methodology for determining optimum conditions for a process to achieve specified process targets while minimizing variability of the outcomes. Underlying assumptions for RPD modeling and process conditions should be taken into account when selecting a regression estimator for developing fitted models. If these assumptions are incorrect, then direct use of the estimates obtained can be problematic, and the results may be catastrophic, particularly when applied to the health care field. Many approaches to RPD in the existing literature use ordinary least squares to obtain response functions by assuming normality and moderate variability in the underlying process. Given that some biologic processes are often highly variable and inherently asymmetric, a conditions-based approach to selecting a regression estimation technique needs to be explored. This paper examines alternative approaches to regression estimation when the process data indicate that asymmetry or a high degree of process variability exists. The performance of selected alternative regression methods is compared using Monte Carlo simulation and numerical analysis.
Article
Full-text available
In this paper we present a comparative study between the Generalized Reduced Gradient (GRG) and Genetic Algorithm (GA) methods for optimizing multiple-response processes. Results from designed experiments were used to compose the objective function to be minimized. The case studies in this work were selected from the literature. A Microsoft Excel spreadsheet was used for parameter optimization with GRG, and the Scilab software was used for the GA. Ten replicates were performed and the mean of the results was computed. Performance measures based on the mean percentage error were used to assess the methods. On these measures, the GA showed better results than the GRG, indicating that the GA can generate better responses than the GRG.
Article
Semiconductors are fabricated through unit processes including photolithography, etching, diffusion, ion implantation, deposition, and planarization processes. Chemical mechanical planarization (CMP), which is essential in advanced semiconductor manufacturing, aims to achieve high planarity across a wafer surface. Selectivity and roughness are the main response variables of the CMP process. Since the response variables are often in conflict, it is important to obtain a satisfactory compromise solution by reflecting the CMP process engineer's preference information. In this study, we present a case study in which the satisfactory compromise solution is obtained. The recently developed posterior preference articulation approach to multi-response surface optimization is employed for this purpose. The performance of response variables of CMP process have been shown to be better at the obtained setting than at the existing setting of process variables.
Chapter
In manufacturing and other industries, quality plays a central role in increasing market share. Hence, producers focus their attention on designing products or services of high quality that meet customers' expectations. Quality characteristics of products are usually expressed by variables called response variables. Today's complex systems have several performance attributes among their responses, and designers try to select the best combination of controllable factors that satisfies all quality characteristics simultaneously. Since quality characteristics often conflict, for example in measurement units, scale, and optimality directions, different approaches exist for model building and optimization of multi-response surface problems. The study of methods for simultaneous analysis and improvement of the outputs is therefore of great importance.
Article
In this paper, considering the uncertainty associated with the fitted response surface models and the satisfaction degrees of the response values with respect to the given targets, we construct robust membership functions of the responses in three cases and explain their practical meanings. We translate the feasible regions of multiple response optimization (MRO) problems into α-level sets and incorporate the model uncertainty through confidence intervals simultaneously to ensure the robustness of the feasible regions. We then develop a robust fuzzy programming (RFP) approach to solve MRO problems. The key advantage of the presented method is that it accounts for the location effect, dispersion effect, and model uncertainty of the multiple responses simultaneously and thus can ensure the robustness of the solution. An example from the literature illustrates the practicality and effectiveness of the proposed algorithm. Finally, some comparisons and discussions are given to further illustrate the developed approach.
Article
Cassava (Manihot esculenta) is the most important staple food in sub-Saharan Africa. However, the shelf-life of the crop is short and, for this reason, the roots are usually processed into more stable products like cassava flour by village-based enterprises. Most of these enterprises use small-scale locally built pneumatic dryers, but such dryers still need further development, so the objective of this research was to improve their energy performance. Experiments were conducted at two cassava processing centres, one in Tanzania and one in Nigeria. Sensors were installed on the dryers, product samples were collected and the mass and energy balance of the equipment analysed, allowing the dryers' minimum air mass flow rates to be calculated. The air mass flow rates of both dryers were then reduced to a level approximating the minimum value. In Tanzania, the air mass flow rate of the dryer was reduced by 24%, while in Nigeria it was reduced by 14%. In both locations, the modifications decreased the dryers' heat input without jeopardising evaporation rates, and so not affecting the final moisture content of the dry products. Air temperatures at the dryer outlets decreased and relative humidity increased, while enthalpy remained unchanged. The energy required to evaporate 1 kg of water decreased by 20% in Tanzania and by 13% in Nigeria. The modification also improved energy efficiency by 25% in Tanzania and by 14% in Nigeria. However, in Nigeria, where yellow cassava flour was being used, the dryer modifications resulted in greater product colour losses.
Article
Full-text available
Inadequate control of a production line can have catastrophic ramifications, leading to low-quality products. For this reason, enhancing product quality in the first stages of development requires considerable attention. In this regard, robust parameter design has emerged as an outstanding tool. This article presents a robust parameter design for a product, into which the concept of response surface methodology has been infused. By encompassing the variance and mean of the response variable, an integrated approach is presented that combines three phases of product development: system design, parameter design, and tolerance design. Moreover, to survive in a competitive market, companies must produce appealing products. We therefore adopt a feasible scheme to enhance customer satisfaction in the production process through the implementation of the desirability function concept. To confirm the suitability of the presented method, it has been applied in an actual case study. The results show that the proposed model is more effective than the dual-response and mean-square-error techniques.
Conference Paper
Robust design, an important technique of continuous quality improvement, has been widely applied to the optimal design of products and processes. In this paper, a new approach integrating an improved desirability function with dual response surface models is proposed to tackle the problem of multi-response robust design. We build desirability functions for the mean and the variance by combining the desirability function with the dual response surface models. Furthermore, we assign objective weights to the mean and variance desirability functions using entropy weight theory. The overall desirability function, which considers both location and dispersion effects, is then optimized by a hybrid genetic algorithm to obtain the optimum parameter settings. An example is given to verify the effectiveness of the proposed method. The results show that the proposed approach can achieve more robust and feasible parameter settings.
Article
Full-text available
The p-hub median problem aims at locating p hub facilities in a network and allocating non-hub nodes to the hubs such that the overall transportation cost is minimized. One issue of major importance in this problem is the need to deal with uncertain factors such as weather conditions and traffic volume, which lead to uncertainty in travel time between origin and destination points. In today's competitive markets, in which customers look for robust delivery services, it is important to minimize the upper bound of uncertainty in the network routes. In this paper, a robust bi-objective uncapacitated single allocation p-hub median problem (RBUSApHMP) is introduced in which travel time has a non-deterministic nature. The problem aims to select the location of the hubs and the allocation of the other nodes to them so that overall transportation cost and maximum uncertainty in the network are minimized. To do this, a desirability-function-based approach is suggested that ensures both objectives of interest fall within their specification limits. Due to the complexity of the model, a heuristic based on scatter search and variable neighborhood descent is developed. To evaluate the performance of the proposed method, a computational analysis on the Civil Aeronautics Board and Australian Post data sets was performed. The results obtained using the proposed hybrid metaheuristic are compared to the optimum solutions obtained using GAMS. The results indicate excellent performance of the suggested solution procedure in optimizing the RBUSApHMP.
Conference Paper
Full-text available
In this work, a comparative study was carried out between the Generalized Reduced Gradient (GRG) and Genetic Algorithm (GA) optimization methodologies for processes with multiple responses. To estimate the parameters that minimize the objective function, responses generated by design of experiments were aggregated and incorporated into the objective function. Optimization of the process parameter values using GRG was performed in a Microsoft Excel spreadsheet, and the genetic algorithm used the optim_ga function of the Scilab software. Ten replicates were performed and the mean of the results was computed. The methods were compared using performance measures based on the mean percentage distance. On these measures, the GA showed better results than the GRG, indicating that the GA can generate better responses than the GRG.
Article
Full-text available
A goal attainment approach to optimize multiresponse systems is presented. This approach aims to identify the settings of control factors to minimize the overall weighted maximal distance measure with respect to individual response targets. Based on a nonlinear programming technique, a sequential quadratic programming algorithm, the method is proved to be robust and can achieve good performance for multiresponse optimization problems with multiple conflicting goals. Moreover, the optimization formulation may include some prior work as special cases by assigning proper response targets and weights. Fewer assumptions are needed when using the approach as compared to other techniques. Furthermore, the decision-maker's preference and the model's predictive ability can easily be incorporated into the weights' adjustment schemes with explicit physical interpretation. The proposed approach is investigated and compared with other techniques through various classical examples in the literature.
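The weighted maximal distance measure described above can be sketched in a few lines. The objective below is a Chebyshev-type scalarisation of the deviations from the individual response targets; the predicted responses, targets, and weights passed in are illustrative placeholders, not values from the paper:

```python
def goal_attainment_objective(preds, targets, weights):
    """Overall weighted maximal deviation of the predicted responses
    from their individual targets (Chebyshev-type distance measure)."""
    return max(w * abs(p - t) for p, t, w in zip(preds, targets, weights))

# Minimising this objective over the control factors pushes every weighted
# deviation down simultaneously; the weights can encode the decision-maker's
# preferences and the predictive ability of each response model.
```

With suitable choices of targets and weights, minimising this single measure reproduces several earlier formulations as special cases, which is the point the abstract makes.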
Article
Since the early 1980s, industry has embraced the use of designed experiments as an effective means for improving quality. For quality characteristics not normally distributed, the practice of first transforming the data and then analyzing them by standard normal-based methods is well established. There is a natural alternative called generalized linear models (GLMs). This paper explains how GLMs achieve the intended goal of transformation while at the same time giving a wider class of models that can handle a range of applications. Moreover, the same iterative strategy for data analysis that has been developed for normal data over the years, namely, the alternation between model selection and model checking, extends easily to analyses with GLMs. The paper illustrates the ability of GLMs to handle many different types of data by the re-analysis of three quality-improvement experiments.
Article
The optimization of dual response systems in order to achieve better quality has been an important problem in the robust design of industrial products and processes. In this paper, we propose a goal programming approach to optimize a dual response system. The formulation is general enough to include some of the existing methods as special cases. For purposes of illustration and comparison, the proposed approach is applied to two examples. We show that by tuning the weights assigned to the respective targets for the mean and standard deviation, past results can easily be replicated using Excel Solver, the use of which further enhances the practical appeal of our formulation.
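In the spirit of the goal programming formulation above, a minimal sketch: hypothetical fitted models for the mean and standard deviation of a single response, a weighted goal-deviation objective, and a crude grid search standing in for Excel Solver. All coefficients, targets, and weights are invented for illustration:

```python
# Hypothetical quadratic fits for the mean and standard deviation of one
# response in a single coded design variable x (coefficients invented).
def mean_model(x):
    return 50.0 + 10.0 * x - 2.0 * x ** 2

def std_model(x):
    return 4.0 + (x - 1.0) ** 2

def goal_objective(x, t_mean=60.0, t_std=4.0, w_mean=1.0, w_std=1.0):
    """Weighted sum of absolute deviations from the mean and std targets;
    tuning w_mean and w_std trades one goal off against the other."""
    return w_mean * abs(mean_model(x) - t_mean) + w_std * abs(std_model(x) - t_std)

# Crude grid search over the coded experimental region [-2, 2]
# (the paper uses Excel Solver; any optimiser of this objective works).
grid = [i / 1000.0 for i in range(-2000, 2001)]
best_x = min(grid, key=goal_objective)
```

Re-running with different weight pairs traces out the compromise solutions the abstract refers to.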
Article
In modern quality engineering, dual response surface methodology is a powerful tool. In this paper, we introduce a fuzzy modeling approach to optimize the dual response system. We demonstrate our approach in two examples and show the advantages of our method by comparing it with existing methods.
Article
Vining and Myers adapted the dual response approach to achieve the goals of Taguchi's philosophy. This excellent approach contains some deficiencies that will be highlighted in this paper. A more satisfactory and substantially simpler optimization procedure on a dual response approach is proposed that allows more general response models to be entertained in reality. The new method is demonstrated using the example given in Vining and Myers in which it leads to a solution with 25% smaller mean square error.
Article
A single data transformation may fail to satisfy all the required properties necessary for an analysis. With generalized linear models (GLMs), the identification of the mean-variance relationship and the choice of the scale on which the effects are to be measured can be done separately, overcoming the shortcomings of the data-transformation approach. GLMs also provide an extension of the response surface approach. In this paper, we set out the current status of the GLM approach to the analysis of data from quality-improvement experiments and discuss its merits.
Article
Statistically designed experiments have been employed extensively to improve product or process quality and to make products and processes robust. In this paper, we consider experiments with correlated multiple responses whose means, variances, and correlations depend on experimental factors. Analysis of these experiments consists of modeling distributional parameters in terms of the experimental factors and finding factor settings which maximize the probability of being in a specification region, i.e., all responses are simultaneously meeting their respective specifications. The proposed procedure is illustrated with three experiments from the literature.
Article
Practitioners often must choose optimum operating conditions for several responses simultaneously. Rarely is the resulting "optimum" truly optimal for each of the individual responses taken individually. Instead, the optimum represents some explicit compromise among the conflicting conditions. This paper proposes a mean squared error method which allows the practitioner to specify the directions of economic importance for the compromise optimum while seriously considering the variance-covariance structure of the multiple responses. An example illustrates the methodology.
Article
Generalized linear models provide a useful tool for analyzing data from quality-improvement experiments. We discuss why analysis must be done for all the data, not just for summarizing quantities, and show by examples how residuals can be used for model checking. A restricted-maximum-likelihood-type adjustment for the dispersion analysis is developed.
Article
The purpose of this paper is to present the theory and develop an algorithm associated with the exploration of a dual response surface system. The approach is to find conditions on a set of independent or "design" variables which maximize (or minimize) a "primary response" function subject to the condition that a "constraint response" function takes on some specified or desirable value. A method is outlined whereby a user can generate simple two-dimensional plots to determine the conditions of constrained maximum primary response regardless of the number of independent variables in the system. The user is thus able to reduce the complex task of exploring the dual response system to simple plotting. The procedure used to generate the plots depends on the nature of the individual univariate response functions. In certain situations it becomes necessary to apply the additional constraint that the located operating conditions are a certain "distance" from the origin of the independent variables (or the center of the experimental design). The methods derived and discussed in the paper are applicable only to quadratic response functions.
Article
This chapter discusses the basic methods used in response surface methodology (RSM) for the design and analysis of multiresponse experiments. The formal development of RSM was initiated by the work of Box and Wilson, which introduced the sequential approach in an experimental investigation. In the early development of RSM, only single-response variables were considered. Data obtained in a multiresponse experiment (multiresponse data) are multivariate in character. It is therefore necessary that multivariate techniques be deployed for the analysis of such data. The other issues discussed in the chapter include (1) plotting of multiresponse data, (2) estimation of parameters of a multiresponse model, (3) inference for multiresponse models, (4) designs for multiresponse models, and (5) multiresponse optimization. There are several techniques for graphing multiresponse data. Some techniques plot projections of the data on subspaces of dimensions three or fewer. The use of multiresponse techniques in response surface methodology is considered a relatively novel endeavor. Both the design and analysis of multiresponse experiments have received precious little attention even though they are sorely needed.
Article
A problem facing the product development community is the selection of a set of conditions which will result in a product with a desirable combination of properties. This essentially is a problem involving the simultaneous optimization of several response variables (the desirable combination of properties) which depend upon a number of independent variables or sets of conditions. Harrington, among others, has addressed this problem and has presented a desirability function approach. This paper will modify his approach and illustrate how several response variables can be transformed into a desirability function, which can be optimized by univariate techniques. Its usage will be illustrated in the development of a rubber compound for tire treads.
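The transformation described above can be illustrated with a minimal sketch of a two-sided (target-is-best) desirability and the geometric-mean composite. The bounds and shape exponents below are placeholders, not values from the paper:

```python
import math

def desirability_target(y, low, target, high, s=1.0, t=1.0):
    """Two-sided (target-is-best) desirability: 1 at the target, falling
    to 0 at the lower and upper acceptability bounds; the exponents s
    and t shape the two branches of the curve."""
    if y <= low or y >= high:
        return 0.0
    if y <= target:
        return ((y - low) / (target - low)) ** s
    return ((high - y) / (high - target)) ** t

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities; any fully
    unacceptable response (d = 0) drives the composite to 0."""
    if any(d == 0.0 for d in ds):
        return 0.0
    return math.exp(sum(math.log(d) for d in ds) / len(ds))
```

Because the composite is a single scalar, it can then be maximised over the independent variables by ordinary univariate or direct-search techniques, as the abstract notes.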
Article
Recently several authors have discussed how one can solve the three dual response problems proposed by Taguchi (target is best, larger is better, and smaller is better) without resorting to signal-to-noise ratios. The most recent articles have all relied on nonlinear programming techniques to obtain the actual solution. While these techniques certainly work, it is also possible to solve the dual response problems using the more familiar technique of direct function minimization. We demonstrate this using the Nelder-Mead simplex procedure. We also propose slightly different formulations of the problems, which seem to be more realistic. Examples for each type of dual response problem are presented.
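A sketch of the "target is best" case as a single function suitable for direct minimisation: the predicted standard deviation plus a quadratic penalty that holds the predicted mean at its target. The fitted models, target, and penalty weight are invented for illustration, and a crude grid search stands in for Nelder-Mead to keep the sketch dependency-free:

```python
# Hypothetical fitted response surfaces in two coded design variables.
def pred_mean(x1, x2):
    return 70.0 + 5.0 * x1 + 3.0 * x2

def pred_std(x1, x2):
    return 3.0 + x1 ** 2 + 0.5 * x2 ** 2

def penalized(x1, x2, target=75.0, rho=100.0):
    """'Target is best': predicted std plus a quadratic penalty that
    enforces pred_mean == target as rho grows large."""
    return pred_std(x1, x2) + rho * (pred_mean(x1, x2) - target) ** 2

# Direct minimisation over the region [-2, 2]^2; a simplex method such
# as Nelder-Mead would minimise exactly this function without derivatives.
pts = [(i / 50.0, j / 50.0) for i in range(-100, 101) for j in range(-100, 101)]
best = min(pts, key=lambda p: penalized(*p))
```

The larger-is-better and smaller-is-better cases swap the roles of the two model terms in the same penalized objective.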
Article
G. Taguchi and his school have made significant advances in the use of experimental design and analysis in industry. Of particular significance is their promotion of the use of statistical methods to achieve certain objectives in a mean response while simultaneously minimizing the variance. Unfortunately, many statisticians criticize the Taguchi techniques of analysis, particularly those based on the signal-to-noise ratio. This paper shows how the dual response approach of optimizing a “primary response” function while satisfying conditions on a “secondary response” function can be used to achieve the goals of the Taguchi philosophy within a more rigorous statistical methodology. An example illustrates this point.
Article
An algorithm is developed for the simultaneous optimization of several response functions that depend on the same set of controllable variables and are adequately represented by polynomial regression models of the same degree. The data are first checked for linear dependencies among the responses. If such dependencies exist, a basic set of responses among which no linear functional relationships exist is chosen and used in developing a function that measures the distance of the vector of estimated responses from the estimated “ideal” optimum. This distance function permits the user to account for the variances and covariances of the estimated responses and for the random error variation associated with the estimated ideal optimum. Suitable operating conditions for the simultaneous optimization of the responses are specified by minimizing the prescribed distance function over the experimental region. An extension of the optimization procedure to mixture experiments is also given and the method is illustrated by two examples.
Article
Colloidal gas aphrons (CGAs) were first reported by Sebba (J. Colloid Interface Sci., 35 (4) (1971) 643) as micro bubbles (10–100 μm), composed of a gaseous inner core surrounded by a thin surfactant film, which are created by intense stirring of a surfactant solution. Since then, these colloidal dispersions have been used for diverse applications (clarification of suspensions, removal of sulphur crystals, separation of organic dyes from wastewater, etc.). However, there have been no reports, as yet, of their direct application for protein recovery. In this study, CGAs are created from an anionic surfactant (AOT) and are characterised in terms of stability and gas hold-up for a range of process parameters relevant to their proposed use for protein recovery, at a later stage. A statistical experimental design was developed in order to study the effect of different factors (surfactant concentration, salt concentration, pH, time of stirring and temperature) on the stability and gas hold-up of CGAs. The analysis of results from the experimental design provides predictive statistical models. Stability was found to depend mainly on salt and surfactant concentration. Several interactions are shown to be significant including the time-temperature interaction. Gas hold-up was found to depend mainly on salt and surfactant concentration and time of stirring. Also, results from power measurements are presented and the minimum energy for the formation of CGAs, for one set of solution properties, is determined.
Article
Virtually every manufactured product has more than one characteristic by which its overall quality is determined. Since those quality characteristics may be correlated, some changes in the levels of the controllable product design and process design variables may improve one of the quality characteristics while adversely affecting one or more of the other quality characteristics. In this paper we define a quadratic loss function for use with multiple quality characteristics. We discuss several strategies that could be employed for robust quality engineering of products and processes when there is more than one quality characteristic of interest.
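A quadratic loss of this kind rests on the standard identity E[(Y-T)'C(Y-T)] = (mu-T)'C(mu-T) + trace(C Sigma): squared bias of the response means through a cost matrix C, plus a term that prices the variances and covariances. A small sketch, where the cost matrix and moments passed in are illustrative rather than taken from the paper:

```python
def expected_quadratic_loss(mu, target, cost, cov):
    """E[(Y-T)' C (Y-T)] for a response vector Y with mean mu and
    covariance cov: bias term (mu-T)' C (mu-T) plus trace(C @ cov)."""
    k = len(mu)
    d = [m - t for m, t in zip(mu, target)]
    bias = sum(d[i] * cost[i][j] * d[j] for i in range(k) for j in range(k))
    trace = sum(cost[i][j] * cov[j][i] for i in range(k) for j in range(k))
    return bias + trace
```

Off-diagonal entries of the cost matrix are what let the loss penalise joint deviations of correlated quality characteristics, the situation the abstract highlights.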
Chapter
The optimization of dual response systems in order to achieve better quality has been an important problem in the robust design of industrial products and processes. In this chapter, we propose a goal programming approach to optimize a dual response system. The formulation is general enough to include some of the existing methods as special cases. For purposes of illustration and comparison, the proposed approach is applied to two examples. We show that by tuning the weights assigned to the respective targets for the mean and standard deviation, past results can easily be replicated using Excel Solver, the use of which further enhances the practical appeal of our formulation.
Conference Paper
The authors suggest a framework for the multicriteria optimization of simulation models by first discussing the unique difficulties of this problem area along with important problem characteristics, and then discussing how these problem characteristics would affect the choice of a particular technique. The problem of manufacturing system optimization is addressed. Various techniques, along with their advantages and disadvantages, are discussed and categorized according to the timing of the articulation of the required preference (tradeoff) information with respect to the optimization.
Conference Paper
In this paper, we develop a ranking and selection procedure for making multiple comparisons of systems that have multiple performance measures. The procedure combines multiple attribute utility theory with ranking and selection to select the best configuration from a set of K configurations using the indifference zone approach. We demonstrate our procedure on a simulation model of a large project that has six performance measures.
Article
A modelling approach to optimize a multiresponse system is presented. The approach aims to identify the setting of the input variables to maximize the degree of overall satisfaction with respect to all the responses. An exponential desirability functional form is suggested to simplify the desirability function assessment process. The approach proposed does not require any assumptions regarding the form or degree of the estimated response models and is robust to the potential dependences between response variables. It also takes into consideration the difference in the predictive ability as well as relative priority among the response variables. Properties of the approach are revealed via two real examples—one classical example taken from the literature and another that the authors have encountered in the steel industry.
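A minimal sketch of the maximin idea with an exponential desirability shape. The exact functional form used by the authors may differ; here z is the deviation from target, standardised so that z = 0 at the target and |z| = 1 at the specification limit, and t is a shape parameter:

```python
import math

def exp_desirability(z, t=1.0):
    """Exponential-form desirability of a standardised deviation z:
    1 at the target (z = 0), 0 at or beyond the specification limit
    (|z| >= 1); t = 0 reduces to the linear form 1 - |z|."""
    if abs(z) >= 1.0:
        return 0.0
    if t == 0.0:
        return 1.0 - abs(z)
    return (math.exp(t) - math.exp(t * abs(z))) / (math.exp(t) - 1.0)

def overall_satisfaction(zs, t=1.0):
    """Maximin criterion: overall satisfaction is the minimum of the
    individual desirabilities, so the worst response governs."""
    return min(exp_desirability(z, t) for z in zs)
```

Maximising overall_satisfaction over the input variables raises the satisfaction level of the least-satisfied response, which is the maximin behaviour the abstract (and the article above) describes.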
Derringer, G., 1994. A balancing act: Optimizing a product's properties. Quality Progress 6, 51–58.
Harper, D., Kosbe, M., Peyton, L., 1987. Optimization of Ford Taurus Wheel Cover Balance. Fifth Symposium on Taguchi Methods, pp. 527–539.
Law, A., Kelton, W., 2000. Simulation Modeling and Analysis, third ed. McGraw-Hill, Singapore.
Lee, Y., Nelder, J., 2003. Robust design via generalized linear models. Journal of Quality Technology 35 (1), 2–12.
Khuri, A., 1996. Multiresponse surface methodology. In: Ghosh, S., Rao, C.R. (Eds.), Handbook of Statistics: Design and Analysis of Experiments (Volume 13), pp. 377–406.
Jauregi, P., et al., 1997. Characterisation of colloidal gas aphrons for subsequent use for protein recovery.