Article

Process capability: A criterion for optimizing multiple response product and process design


Abstract

In this research, we consider the maximization of process capability as the criterion in product/process design that is used for selecting preferred design factor levels, and we propose several approaches for single and multiple response performance measure designs. All of these approaches assume that the relationship between a process performance measure and a set of design factors is represented via an estimate of a response surface function. In particular, we develop: (i) criteria for selecting an optimal design, which we call MCpk and MCpm; (ii) mathematical programming formulations for maximizing MCpk and MCpm, including formulations for maximizing the desirability index (Harrington, 1965) and for maximizing the standardized performance criteria (Barton and Tsui, 1991) as special cases of the formulation for maximizing MCpk; (iii) formulations for considering cost when maximizing MCpk and MCpm; (iv) a means for assessing propagation of error; (v) a robust design method for assessing design factor effects on residual variance; (vi) a means for assessing the optimality of a proposed solution; and (vii) an original application in the screening of printed circuit board panels.
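The MCpk criterion named in item (i) can be sketched as a geometric mean of univariate Cpk values, which is the aggregation attributed to this index later on this page. A minimal illustration; the function names, example data and specification limits are assumptions for illustration, not the paper's notation:

```python
# Sketch: MCpk as the geometric mean of per-response Cpk values.
import math

def cpk(mean, std, lsl, usl):
    """Univariate Cpk: distance from the mean to the nearer spec limit, in 3-sigma units."""
    return min(usl - mean, mean - lsl) / (3.0 * std)

def mcpk(responses):
    """Geometric mean of per-response Cpk values.

    responses: list of (mean, std, lsl, usl) tuples, one per quality characteristic.
    A single near-zero Cpk drives MCpk toward zero, so one inadequate response
    makes the whole design unacceptable -- the behaviour the index is meant to capture.
    """
    cpks = [cpk(*r) for r in responses]
    return math.prod(cpks) ** (1.0 / len(cpks))

# Two responses: one well centred (Cpk = 2.0), one marginal (Cpk = 1/3).
print(round(mcpk([(10.0, 1.0, 4.0, 16.0), (5.0, 1.0, 4.0, 9.0)]), 3))  # 0.816
```

Because the aggregation is multiplicative, raising an already-capable response cannot compensate for a response whose Cpk is near zero.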


... However, the traditional process capability index, defined as the ratio of tolerance to dispersion, i.e., Cp = T/6σ, is not suitable for assessing process alternatives during preliminary process planning, in that: (1) Cp is suitable only for assessing an individual quality characteristic in one manufacturing step [9] and works well only at the micro level of the manufacturing system [10]; and (2) Cp is suitable only for the batch production mode, because it requires that the outputs, i.e., the quality characteristics of the manufacturing system, follow a statistical distribution so that their standard deviation (σ) can be determined [9]. Hence, we propose a quality measure index, composite process capability (CCp), as an approximate assessment criterion for the process quality of the process alternatives. ...
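The Cp index quoted in the excerpt above is simply the tolerance band T = USL − LSL over the natural process spread 6σ. A one-line worked example (the limits and σ are illustrative):

```python
# Sketch: the basic Cp index, tolerance over six-sigma spread.
def cp(lsl, usl, sigma):
    """Cp = (USL - LSL) / (6 * sigma)."""
    return (usl - lsl) / (6.0 * sigma)

# Tolerance band of 12 units and sigma = 1.5 -> Cp = 12 / 9 ≈ 1.333
print(round(cp(4.0, 16.0, 1.5), 3))  # 1.333
```

Note that Cp ignores the process mean entirely, which is one reason the excerpt finds it unsuitable outside batch production with stable distributions.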
Conference Paper
Full-text available
This paper proposes an approach to preliminary process planning for quality. It aims to determine key process alternatives with sufficient process capability by applying quality planning and assessment methods during preliminary process planning. The approach comprises four steps: (1) identification of quality characteristics; (2) planning of process quality by combining quality function deployment (QFD) with process FMEA; (3) selection of process alternatives; and (4) assessment of process quality through a quality measure index called composite process capability (CCp). Key process alternatives with sufficient CCp can serve as guidelines for detailed process planning, reducing subsequent modification of process plans.
... Nonetheless, in recent years optimizing multi-response variables has become a central subject for statisticians seeking to tackle industrial problems that involve the simultaneous optimization of several quality characteristics. [7,8,13,17,11], among other authors, have argued that the simultaneous consideration of multi-response variables should begin with the development of an appropriate response surface model for each response, after which one can attempt to find a set of operating conditions that optimizes all the process responses simultaneously, or at least keeps them within desired ranges. Over the years, beginning with [18][19][20], scholarly articles have adhered strictly to the first part of this routine when tackling multi-response optimization, with the variations among techniques emerging in the second part. ...
... Subsequently, in independent work, [37] further extended this index so that it could be applied to multi-response optimization. The maximization of process capability as a criterion for multi-response optimization was further considered by [19]. ...
Article
Full-text available
Multi-response surface optimization (MRSO) is a problem peculiar to industrial settings, where the aim of a process engineer is to set the process at operating conditions that simultaneously optimize a set of process responses. In statistics, several methods have been proposed for tackling problems of this nature, among them: overlapping contour plots, constrained optimization, the loss function approach, the process capability approach, the distance function approach, the game theory approach, and the desirability function approach. These methods are not without flaws, however, as they are either too problem-specific or require very complex and inflexible routines; little wonder the desirability function method has gained popularity, especially because it overcomes the latter limitation. In this article, we propose and implement a multivariate-based technique for solving MRSO problems. The technique fuses the ideas of response surface methodology (RSM), multivariate multiple regression, and Pareto optimality. In our technique, RSM was implemented on an all-maximization problem as a case-study process: first-order models (FOMs) for the responses were fitted using 2^k factorial designs until the FOMs proved inadequate, after which a uniform precision rotatable central composite design was used to obtain second-order models (SOMs) for the respective responses. With the implementation of the proposed technique on the case study, optimal operating conditions were obtained, and the resulting observations were summarized as axioms.
The first, second and third axioms state, respectively, that: (1) the mid-point of all optimal operating conditions obtained via the proposed technique is Pareto optimal; (2) the mid-point of all optimal responses at the Pareto optimal operating condition is Pareto optimal; and (3) the region bounded by each of the optimal operating conditions from each second-order model (SOM) is a Pareto front.
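The Pareto-optimality notion underlying these axioms can be made concrete with a minimal dominance filter for the all-maximization case (the function name and example points are illustrative assumptions, not the article's data):

```python
# Sketch: keep the non-dominated points when every response is to be maximized.
def pareto_front(points):
    """Return the points not dominated by any other point.

    A point p is dominated if some other point q is at least as good in every
    coordinate and differs from p (hence strictly better in at least one).
    """
    def dominated(p):
        return any(q != p and all(qi >= pi for qi, pi in zip(q, p)) for q in points)
    return [p for p in points if not dominated(p)]

# (2, 2) dominates (1, 2) and (2, 1); (0, 3) is incomparable with (2, 2).
print(pareto_front([(1, 2), (2, 1), (2, 2), (0, 3)]))  # [(2, 2), (0, 3)]
```

An operating condition is Pareto optimal exactly when its response vector survives this filter against all candidate conditions.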
... Such an ability is usually measured by process capability indices [22][23]. Some scholars have suggested that process capability can be estimated when the PCI is introduced into the parameter and tolerance design stages [24][25][26]. In previous studies, the approach to parameter and tolerance parallel design for multivariate quality characteristics is described as follows. ...
... The PCI is used to measure the quality level of a production process. Plante [24] showed that it is worthwhile to apply the MPCI to multiple-response optimization design of both product and process. Jeang [25] analyzed the manufacturing cost and quality loss cost, and subsequently implemented parameter and tolerance parallel design using the MPCI. ...
Article
Full-text available
The process capability index (PCI) is widely used in the on-line quality control stage for measuring and controlling the quality level of a production process. Calculating a PCI requires a large number of samples, but in the off-line quality control stage a production process typically yields only a few individual observations. From the perspective of quality loss and tolerance cost, this paper proposes a parameter and tolerance economic design approach for multivariate quality characteristics based on a modified PCI computed from individual observations. Response surface models of the mean and variance are constructed using individual observations, and exponential models are fitted to the tolerance cost data of the design variables. A modified PCI is proposed that considers three types of quality characteristics. The optimal design variables and tolerances are obtained from a comprehensive optimization model constructed on the proposed PCI. An example of an isobutylene-isoprene rubber (IIR) inner tube is used to: (i) demonstrate the implementation of the proposed approach, (ii) improve the PCI value and reflect the sensitivity of the deviation between process mean and specification, and (iii) reduce the risk of increased quality costs caused by replicated experimental design and other unknown factors.
... Del Castillo et al. modify the desirability function to allow the assignment of importance weights to the responses. Fogliatto & Albin (2000; 2001) propose the desirability function as a means to combine quantitative and qualitative response variables in an optimization procedure based on the Analytic Hierarchy Process. They also introduce accuracy of prediction as an optimization criterion in MRO. ...
... Finally, Wu (2005) proposes modifications to the loss function to accommodate situations where the loss associated with deviation from target is asymmetric and where responses present correlated losses. The resulting function may be viewed as a variation of that proposed by Plante (2001), presented below. ...
Conference Paper
Full-text available
In multiresponse experiments (MRE), the same experimental unit is evaluated with respect to more than one response simultaneously. Optimization of MREs typically involves determining the point in the design region where the responses perform best with respect to given criteria. Utility functions are used to transform response outcomes at each experimental treatment into performance measures. Here we investigate the optimization of MREs in which some response outcomes are profiles rather than individual values. A profile response gives one or more profiles as the observed outcomes at each treatment, and the objective is to identify the outcome that is closest to a target profile.
We propose the use of the Hausdorff distance, a similarity metric from the field of image recognition, in combination with a desirability function to obtain a utility function that gives the distance of a profile response outcome from its desired target. A case example from the food industry illustrates our propositions.
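The symmetric Hausdorff distance used above treats two profiles as finite point sets and reports the worst-case nearest-neighbour gap between them. A minimal sketch; the profile data are illustrative assumptions, not the food-industry example:

```python
# Sketch: symmetric Hausdorff distance between two profiles sampled as 2-D points.
import math

def hausdorff(a, b):
    """max over both directions of the largest point-to-set distance."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    def directed(xs, ys):
        # For each point in xs, find its nearest neighbour in ys; take the worst case.
        return max(min(dist(x, y) for y in ys) for x in xs)
    return max(directed(a, b), directed(b, a))

target   = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)]
observed = [(0.0, 0.2), (1.0, 1.1), (2.0, -0.1)]
print(round(hausdorff(target, observed), 3))  # 0.2
```

A desirability transform can then be applied to this distance (e.g. mapping distance 0 to desirability 1 and some cutoff to 0) to obtain the utility function described in the abstract.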
... Besides not accounting for the correlation among the response variables or the uncertainty in the parameters of the fitted models, this technique may select as the optimal process operating condition one in which the indices are very large for some responses, even though the other responses take values that are too low or their means fall outside specifications. Plante (2001) proposes the process capability indices MCpk and MCpm as criteria for multiple-response optimization. Starting with the MCpk index: it equals the geometric mean of the individual Cpk performance measures and, according to the author, the fundamental philosophy behind MCpk has deep quality implications; that is, if one or more of the performance measures is inadequate (an index close to zero), then the whole product becomes unacceptable, regardless of the capability indices of the remaining performance measures. ...
... The optimization is similar to that of MCpk; however, the MCpm index, like the Cpm index, requires target values for each quality characteristic, so it is not as flexible as the MCpk index. Plante (2001) recognizes that on some occasions not all quality characteristics are equally important, so it may matter more to achieve higher capability indices for one or more characteristics at the expense of some others. For this reason, his approach allows the relative importance of the different quality characteristics to be taken into account. ...
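The importance weighting described in this excerpt can be sketched as a weighted geometric mean of the individual Cpk values, so that heavily weighted characteristics dominate the aggregate. The function name, weights and Cpk values below are illustrative assumptions:

```python
# Sketch: weighted geometric mean of per-response Cpk values.
import math

def weighted_mcpk(cpks, weights):
    """Weighted geometric mean: prod(c_i ** (w_i / sum(w)))."""
    total = sum(weights)
    return math.prod(c ** (w / total) for c, w in zip(cpks, weights))

# Characteristic 1 (Cpk = 2.0) weighted three times characteristic 2 (Cpk = 1.0):
print(round(weighted_mcpk([2.0, 1.0], [3.0, 1.0]), 3))  # 1.682
```

With equal weights this reduces to the plain geometric mean, recovering the unweighted MCpk behaviour described above.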
Article
Full-text available
The application of a Bayesian methodology for the simultaneous optimization of multiple responses is presented. A previous proposal is extended with a modification that makes it more flexible, allowing for the case in which not all the variables considered in the study are equally important. To incorporate these differences among the analyzed response variables, the methodology uses the weighted geometric mean of the probabilities that each variable meets its own specification. With this change, the multiple optimization variables are converted into a single one. An example is discussed to show how the proposed modification helps achieve different optimal scenarios to consider in decision making.
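The weighted criterion in this abstract can be sketched with a small Monte Carlo estimate of each conformance probability. Note the assumptions: independent normal distributions stand in for the Bayesian predictive distribution of the cited methodology, and all specifications, weights and sample sizes are illustrative:

```python
# Sketch: weighted geometric mean of per-response conformance probabilities,
# with each probability estimated by Monte Carlo under an assumed normal model.
import math
import random

random.seed(0)  # reproducible illustration

def p_in_spec(mu, sigma, lsl, usl, n=20000):
    """Monte Carlo estimate of P(LSL <= Y <= USL) for Y ~ N(mu, sigma^2)."""
    hits = sum(lsl <= random.gauss(mu, sigma) <= usl for _ in range(n))
    return hits / n

def weighted_criterion(specs, weights):
    """Weighted geometric mean of the conformance probabilities."""
    total = sum(weights)
    probs = [p_in_spec(*s) for s in specs]
    return math.prod(p ** (w / total) for p, w in zip(probs, weights))

# First response weighted twice as heavily as the second:
val = weighted_criterion([(10.0, 1.0, 7.0, 13.0), (5.0, 1.0, 4.0, 8.0)], [2.0, 1.0])
print(round(val, 2))
```

Maximizing this single scalar over the operating conditions is what collapses the multiple-response problem into one objective, as the abstract describes.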
... Although the PCI is one of the quality measurements employed in on-line quality management, several authors have pointed out that the PCI should be addressed at the beginning of the design stage, rather than at the production stage, i.e., in off-line quality management (Shina 1991, Shina and Saigal 2000, Chen et al. 2001, Plante 2001). This provided the motivation to develop the proposed PCI expression for estimating the degree of the producer's process capability prior to the production process. ...
... If the Cpm expression is considered as a measurement for process design in off-line PCI applications, the associated process mean and process variance, in contrast to on-line PCI applications, become two unknown variables that process design engineers need to determine. Thus, Plante (2001) considered maximizing the process capability Cpm as the criterion for selecting preferred design variable levels in product/process design. Given the nature of the Cpm expression, process designers would set the process mean as close to the design target as possible within feasible process ranges, and minimize the process variance as much as possible within the process capability limits. ...
Article
Full-text available
Conventional process capability analysis is used to measure and control the quality level of a production process in practice for on-line quality management. This type of management has a deficiency: defects occurring in the production process are only passively detected and corrected afterwards. Additionally, the conventional process capability expression has difficulty distinguishing among candidate process alternatives before process realisation. There is, therefore, considerable motivation for developing a process capability expression that can be used to evaluate alternatives at the beginning of process design, i.e., in off-line application. The conventional Cpm expression is built by measuring mean deviation and process variance for on-line application. However, if Cpm is used for capability analysis during process design, an erroneous Cpm value results and an inappropriate process design follows. Thus, a process capability expression revised from the conventional Cpm, considering the balance between tolerance cost and quality loss, has been developed. This development is the main contribution of this research; with it, appropriate mean and tolerance values can be determined simultaneously, prior to actual production, so as to maximise the proposed process capability value. Production then proceeds with the pre-determined mean and tolerance values. The expectation after process realisation is that the produced responses will be the best of all the alternatives in terms of quality and cost, and that the process capability value obtained after actual production will be close to the proposed process capability value maximised beforehand.
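The Cpm index discussed in this entry differs from Cp by penalizing deviation of the process mean from the design target as well as the process variance. A minimal sketch with illustrative numbers:

```python
# Sketch: Cpm = (USL - LSL) / (6 * sqrt(sigma^2 + (mean - target)^2)).
import math

def cpm(mean, std, lsl, usl, target):
    """Taguchi-style capability: variance plus squared off-target bias in the denominator."""
    tau = math.sqrt(std ** 2 + (mean - target) ** 2)
    return (usl - lsl) / (6.0 * tau)

# Same spread, but an off-target mean lowers Cpm:
print(round(cpm(10.0, 1.0, 4.0, 16.0, 10.0), 3))  # 2.0
print(round(cpm(12.0, 1.0, 4.0, 16.0, 10.0), 3))  # 0.894
```

This is why, as the snippet above notes, maximizing Cpm pushes a designer to centre the mean on the target and shrink the variance simultaneously.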
... As already mentioned, several options have emerged for the problem of simultaneous optimization of multiple responses, the most common based on the loss function [3][4], on process capability indexes [5][6], on the desirability function [7][8], and on the overlapping of graphical response surfaces; techniques that optimize the probability of concordance have emerged recently [9][1]. Some of these techniques emerged to overcome a weakness of other options, or to complement earlier versions, evolving on the same idea while incorporating changes that make them more complete. ...
... [26] proposed the use of an alternative optimization criterion to the MSE for multi-response optimization. [27] considered the maximization of process capability as a criterion in process design to obtain the best operating condition. Using the ideas set out in these articles as a point of departure, the current literature includes several other approaches; see [28][29][30][31]. ...
Article
Full-text available
Response surface methodology (RSM) – the method most preferred by quality engineers – is a natural and effective tool for achieving the desired process quality. Most of the current literature on process quality does not address how much better or worse a process is, nor the degree of process performance. On the other hand, although process performance criteria can predict process capability, they cannot provide significant information on process quality in terms of reject rates and losses. This paper therefore combines these two concepts and defines a criterion based on process capability indices for the upside-down normal loss function (UDNLF). The proposed approach determines the optimal settings of a given process by minimizing the expected UDNLF, which is defined in terms of these indices. The procedure and its merits are illustrated with an example.
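The upside-down normal loss function named above can be sketched in the form commonly attributed to Spiring, L(y) = K·(1 − exp(−(y − T)²/(2γ²))): loss is zero at the target T and saturates at a maximum loss K far from it, unlike the unbounded quadratic loss. K, γ and the evaluation points below are illustrative assumptions:

```python
# Sketch: upside-down normal loss function (UDNLF), bounded by the maximum loss K.
import math

def udnl(y, target, K=1.0, gamma=1.0):
    """Loss is 0 at the target and approaches K as |y - target| grows."""
    return K * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

print(udnl(10.0, 10.0))            # 0.0 at the target
print(round(udnl(13.0, 10.0), 3))  # close to the maximum loss K = 1.0
```

The boundedness is what lets the expected UDNLF be related to capability-style indices: far-off outcomes contribute a finite, known worst-case loss.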
... As aforementioned, several options have emerged to work with the problem of simultaneous optimization of multiple responses, such as the most common ones based on the loss function (Ames et al. 1997; Ch'ng et al. 2005), on process capability indexes (Plante 2001; Derringer and Suich 1980), on the desirability function (Ortiz et al. 2004; Lee and Kim 2007), on the overlapping of graphical response surfaces, and the recently emerged techniques that optimize the probability of concordance (Chiao and Hamada 2001; Peterson 2004). Some of these techniques emerged to overcome an existing weakness of other options or to complement their earlier versions, evolving on the same idea and incorporating changes that make them more complete. ...
Chapter
This chapter proposes the modification of a technique for the simultaneous optimization of multiple response variables that uses a Bayesian predictive distribution, in order to incorporate different weights for the response variables according to their importance in the cost or functionality of the product. To achieve this, the desirability function has been incorporated into the original proposal. Through the simulation of different scenarios in a case study taken from the literature, this research shows that the proposed optimum process operating conditions always moved towards regions where the response variables with the highest weights had the best results, at the expense of performance in the variables with the lowest weights.
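The weighting behaviour this chapter describes can be sketched with a one-sided Derringer-Suich-style desirability and a weighted geometric-mean aggregation: raising a response's weight pulls the overall optimum toward conditions that favour that response. All limits, weights and shapes below are illustrative assumptions:

```python
# Sketch: larger-the-better desirability plus weighted geometric-mean aggregation.
import math

def desirability_max(y, low, high, s=1.0):
    """0 below `low`, 1 above `high`, power curve (shape s) in between."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** s

def overall(ds, weights):
    """Weighted geometric mean of individual desirabilities."""
    total = sum(weights)
    return math.prod(d ** (w / total) for d, w in zip(ds, weights))

d1 = desirability_max(8.0, 0.0, 10.0)   # 0.8
d2 = desirability_max(3.0, 0.0, 10.0)   # 0.3
# Response 1 weighted three times response 2:
print(round(overall([d1, d2], [3.0, 1.0]), 3))
```

Because the aggregation is multiplicative, any individual desirability of zero makes the overall desirability zero, regardless of the weights.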
Chapter
The Project Portfolio Problem (PPP) has been solved through different approaches. The success of some of them depends on properly applying the decision-maker's preferences and on correctly identifying the organization's resource practices and conditions. However, only a small number of PPP classes have been solved with these approaches, and there is a need to extend them. Accordingly, the present research develops a strategy based on ant colony optimization that incorporates the decision-maker's preferences into the solution of a PPP case under conditions of synergy, cannibalization, redundancy, and interactions between projects. The algorithm was tested experimentally, and the results show its good performance over a random set of instances.
... Plante [37] proposed calculating multivariate PCIs as the geometric mean of all the univariate PCIs that describe the capability of each single product characteristic. ...
Article
Full-text available
This paper offers a review of univariate and multivariate process capability indices (PCIs). PCIs are statistical indicators widely used in industry to quantify the capability of production processes by relating the variability of the measured product characteristics to the admissible variability. Univariate PCIs involve a single product characteristic, while multivariate PCIs deal with the multivariate case. When analyzing process capability, industrial decision makers may choose one PCI among all those existing in the literature depending on different criteria. In this article, we describe, cluster, and discuss univariate and multivariate PCIs. To cluster the PCIs, we identify three classes of characteristics: the first class includes characteristics related to the process data input; the second class includes characteristics related to the approach used to calculate the PCIs; and the third class includes characteristics related to the information that the PCIs provide. We discuss the strengths and weaknesses of each PCI using four criteria: calculation complexity, globality of the index, relation to the proportion of nonconforming parts, and robustness of the index. Finally, we propose a framework that may help practitioners and decision makers in industry to select PCIs.
... It is also important to highlight that these metrics are not unique and can be adjusted or complemented by others, like those introduced by Chiao and Hamada [79], Kim and Cho [80], Ko et al. [77], and Plante [81]. Graphical approaches are also alternatives [82]. ...
Article
This paper explores the benefits of the Gaussian process model as an alternative modeling technique for problems developed within the Response Surface Methodology framework. Three case studies with different types and numbers of responses were investigated, and the compromise solutions obtained with three modeling techniques were evaluated. The results provide evidence of the Gaussian process model's usefulness for stochastic responses, namely when responses are correlated. Copyright © 2015 John Wiley & Sons, Ltd.
... Another general solution approach is to convert the multiobjective problem with its various constraints into a single scalar measure and solve it as a single-objective problem (Xu et al., 2004). The single measure has been defined as: the distance from the ideal design point (Khuri and Conlon, 1981; Vining, 1998); a desirability function (Harrington, 1965; Derringer and Suich, 1980; Del Castillo et al., 1996; Kim and Lin, 2000); the weighted sum of response objective functions (Montgomery et al., 1972); a quality loss function (Pignatiello, 1993; Plante, 1999); a standardized performance index (Barton and Tsui, 1991); or a process capability criterion (Plante, 2001). The weighted sum method is particularly recommended when the weights can be clearly linked to the relative importance of the responses that represent the quality characteristics of the product to be optimised. ...
Conference Paper
This work is part of a research project named Ninive, funded by the Italian Ministry of Education, University and Research and by the Lombardy Region. The main objective of the Ninive project is the development of an innovative plaster with high performance in terms of thermal insulation and acoustic absorption. The main components of the plaster under development are an alumina-based binder, polypropylene fibres, cellular glass, and Cosmos, a material produced through an inertization process, developed at the University of Brescia, applied to municipal solid waste incinerator fly ashes. Due to its properties and to the presence of Cosmos, a recycled material, as an ingredient, this plaster can enhance the sustainability of buildings: it can reduce energy needs and, more generally, the environmental impact of the whole building life cycle. In this paper, the design of experiments for tuning the plaster composition is presented. In the case of the Ninive project, the plan of experiments is of great interest due to the high number of variables affecting the results and to the heterogeneity of the characteristics of the plaster to be optimized: not only the acoustic absorption and thermal insulation coefficients, but also indicators of environmental impact, ease of application on the wall, and production cost. Moreover, since some of these characteristics conflict with each other, the optimization required an innovative approach based on the definition of a global performance objective function using the AHP (Analytic Hierarchy Process) technique.
... The most popular approach is to reduce the multiple responses to a single-objective optimization problem using an aggregated measure [20]. Techniques for reducing the responses into a single scalar include desirability functions [21,22], loss functions [23,24], distance functions [3], and the process capability approach [5,25,26]. The desirability function approach is simple and flexible; however, it focuses on location effects and ignores dispersion effects. ...
Article
Full-text available
In product or process development, it is very common to face situations where the simultaneous optimization of several quality characteristics is required. A common approach to multiresponse optimization problems is the desirability function approach based on polynomial regression models. However, the desirability function approach does not consider both location and dispersion effects. In addition, when dealing with complex manufacturing processes, artificial neural networks can estimate the relationship between responses and input variables effectively. In this paper, an artificial neural network approach is presented that utilizes a process capability index to combine multiple responses into a single value function. A genetic algorithm is then applied to optimize the resulting function. To improve the quality of the results, the decision maker can interactively incorporate preference parameters into the problem. The performance of the proposed approach is evaluated against two existing approaches, and the results indicate that it is an effective and efficient method for solving multiresponse optimization problems.
... Among the optimization methods contemplating multiple responses, the desirability function [3], the multivariate integration [4] and the capability indices MCpm and MCpk [5] and [6] are listed as examples. Rao [7] describes the Global Criterion Method (GCM) as an interesting strategy. ...
Article
The need for efficient and controlled processes has increased the demand for applying optimization methods to the most diverse industrial processes. For these cases, the Global Criterion Method is described in the literature as a technique suited to multi-objective optimization. However, if the problem presents correlations between the responses, this technique does not consider such information. In this context, Principal Component Analysis is a multivariate tool that can be used to represent correlated responses by uncorrelated components. Given that neglecting the correlation structure between the responses increases the likelihood that the optimization method finds an inappropriate optimum point, the objective of this work is to combine the GCM and PCA in a strategy able to deal with problems having multiple correlated responses. This strategy was used to optimize the 12L14 free machining steel turning process, an important machining operation. The optimized responses included the mean roughness, total roughness, cutting time and material removal rate. As input parameters, the cutting speed, feed rate and depth of cut were considered. Response Surface Methodology was employed to build the objective functions. The GCM based on principal components was successfully applied, presenting better practical results and a more appropriate location of the optimal point in comparison to the conventional GCM.
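The core idea of replacing correlated responses by uncorrelated principal-component scores before forming a global criterion can be sketched as follows (synthetic data; the GCM distance with exponent p = 2 is an illustrative assumption, not the paper's exact formulation):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two strongly correlated responses measured over 20 experimental runs.
y1 = rng.normal(10.0, 1.0, 20)
y2 = 2.0 * y1 + rng.normal(0.0, 0.1, 20)
Y = np.column_stack([y1, y2])

# Standardize, then take eigenvectors of the correlation matrix:
# projecting onto them yields uncorrelated principal-component scores.
Z = (Y - Y.mean(axis=0)) / Y.std(axis=0, ddof=1)
_, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
scores = Z @ eigvecs                       # uncorrelated PC scores
score_corr = np.corrcoef(scores, rowvar=False)

def global_criterion(f, f_star, p=2):
    """GCM distance of current response values f from ideal targets f_star."""
    f, f_star = np.asarray(f, float), np.asarray(f_star, float)
    return float(np.sum(((f_star - f) / f_star) ** p))
```

In the combined strategy, the GCM distance would be formed on models of the uncorrelated scores rather than on the raw correlated responses.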
... In the last three decades, several approaches for the optimization of multiple responses have been proposed in the literature (Pal and Gauri, 2010). Among these techniques one can find desirability functions (Ch'ng et al. (2005) and Del Castillo et al. (1996)), loss functions (Ko et al. (2005) and Ames et al. (1997)) and capability indices (Ch'ng et al. (2005) and Plante (2001)), but according to Miró-Quesada et al. (2004) there are various drawbacks with some of these techniques, as they do not consider the correlation between the responses or the uncertainty in the parameter estimates. In contrast, the Bayesian reliability approach by Peterson (2004) considers the correlation between the responses and the uncertainty in the parameter estimates in a formal way. ...
Conference Paper
Full-text available
This study shows a methodology employed for the simultaneous optimization of multiple responses. The methodology is Bayesian and is based on previously presented work that proposes optimizing the probability that the response variables simultaneously meet their respective specifications, with a modification that makes it more flexible by allowing for the case where not all the variables considered in the study are equally important. To incorporate these differences in the relevance of the analyzed response variables into the methodology, the weighted geometric mean (WGM) of the probability that each variable meets its specification is used, so that the multiple variables to optimize become one. An example is presented to illustrate how the proposed modification may change the solution to the optimization problem when the differences in the importance of the analyzed response variables are taken into account.
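A rough Monte Carlo sketch of the weighted-geometric-mean aggregation of conformance probabilities (the specifications, weights and simulated responses here are hypothetical, not those of the study):

```python
import numpy as np

rng = np.random.default_rng(42)

def conformance_probability(samples, low, high):
    """Monte Carlo estimate of P(low <= Y <= high) from process/posterior draws."""
    samples = np.asarray(samples)
    return float(np.mean((samples >= low) & (samples <= high)))

def weighted_geometric_mean(probs, weights):
    """WGM of per-response conformance probabilities; weights are normalized."""
    probs, weights = np.asarray(probs, float), np.asarray(weights, float)
    return float(np.prod(probs ** (weights / weights.sum())))

# Two simulated responses: one comfortably inside spec, one marginal.
y1 = rng.normal(0.0, 1.0, 100_000)
y2 = rng.normal(0.0, 1.0, 100_000)
p1 = conformance_probability(y1, -3.0, 3.0)   # roughly 0.997
p2 = conformance_probability(y2, -1.0, 1.0)   # roughly 0.683
# Giving the marginal response more weight pulls the aggregate toward it.
wgm = weighted_geometric_mean([p1, p2], [0.2, 0.8])
```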
... Multiple linear regression models relating the responses Y1, Y2 and Y3 to the experimental control factors were developed. For this purpose, a computational implementation of the least squares method was used. ...
... If appropriate, other metrics may supplement these ones. Examples are the expected proportion of conformance (Chiao and Hamada, 2001) and the system capability index (Plante, 2001). ...
Article
Criteria to solve multiresponse problems developed under the RSM framework are rarely evaluated in terms of their ability to depict Pareto frontiers, and their solutions do not provide information about response properties. This manuscript contributes to positioning some optimization criteria in relation to each other based on their ability to capture solutions in convex and nonconvex surfaces, in addition to the robustness, quality of predictions and bias of the generated solutions. Results show that an appealing compromise programming-based method can compete with leading methods in the field. It does not require preference information from the decision-maker, is easy to implement, and can generate solutions to satisfy decision-makers with different sensitivity to bias and variance based on performance metric values, as well as evenly distributed solutions along the Pareto frontier. The validity of these results is supported by three examples.
... In this manner the multivariate process capability index is modeled as a mathematical program, and its optimization yields the optimum factor settings. In the method presented by Plante [32], a process capability index was used for uncorrelated normal responses. Eq. (3) represents the PCI used in that study and Eq. ...
Article
Most of the research developed for single-response and multi-response optimization problems is based on the normality assumption of the responses, while this assumption does not necessarily hold in real situations. In real-world processes, each product can contain correlated responses which follow different distributions: for instance, multivariate non-normal responses, multi-attribute responses or, in some cases, mixed continuous-discrete responses. In this paper a new approach is presented, based on a multivariate process capability index and the NORTA inverse transformation, for the multi-response optimization problem with mixed continuous-discrete responses. In the proposed approach, assuming the distribution function of the responses is known in advance from historical data, we first transform the multivariate mixed continuous-discrete responses using the NORTA inverse transformation to obtain multivariate normally distributed responses. Then the multivariate process capability index is computed for each treatment. Finally, to determine the optimum treatment, the geometric mean value of the multivariate Process Capability Index (PCI) is computed for each factor level and the most capable levels are selected as the optimum setting. The performance of the proposed method is verified through a real case study of a plastic molding process as well as simulation studies with numerical examples.
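One common way to aggregate per-response capability into a single score, shown here purely as an illustration (with made-up specifications, and not necessarily matching the exact index used in the cited study), is the geometric mean of univariate Cpk values:

```python
def cpk(mean, std, lsl, usl):
    """Univariate Cpk: minimum distance from the mean to a specification
    limit, expressed in 3-sigma units."""
    return min(usl - mean, mean - lsl) / (3.0 * std)

def geometric_mean_cpk(stats, specs):
    """Aggregate capability: geometric mean of per-response Cpk values."""
    vals = [cpk(m, s, lsl, usl) for (m, s), (lsl, usl) in zip(stats, specs)]
    prod = 1.0
    for v in vals:
        prod *= v
    return prod ** (1.0 / len(vals))

# Two responses: one well centred (Cpk = 1.0), one off-centre (Cpk = 0.5).
overall = geometric_mean_cpk([(10.0, 1.0), (4.5, 1.0)],
                             [(7.0, 13.0), (3.0, 9.0)])
```

The geometric mean penalizes any single incapable response more severely than an arithmetic mean would, which is why it is attractive as a single optimization objective.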
... A univariate measure of desirability is then built and maximized. Plante [9,10] extended this work and proposed multicriteria models to perform the parameter design. ...
Article
Full-text available
This paper proposes a procedure for the concurrent optimization of design centering and tolerances. The novelty of the proposal is that it takes into account the covariance structure of the variables, which is estimated from the manufacturing process. The procedure can be interpreted as a process of recentering the design factors such that the tolerances can be maximized without incurring quality losses. The optimal parameters and tolerances are found by solving an optimization problem where the sum of the tolerances is maximized and where the covariance matrix of the variables is used in the optimization process. By taking the covariance matrix into account, the solution is more compatible with the manufacturing process, allowing larger tolerances at the same cost. The added complexity due to the use of the dependence structure of the process is solved by applying a singular value decomposition to the covariance matrix. This decomposition transforms the original dependent variables into uncorrelated factors, making the optimization problem more tractable. Some examples are provided illustrating the benefits of the proposal.
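The decorrelation step described above, applying a singular value decomposition to the covariance matrix to obtain uncorrelated factors, can be sketched with an illustrative covariance matrix (the numbers are assumptions, not from the paper):

```python
import numpy as np

# Hypothetical covariance matrix of two correlated design variables.
cov = np.array([[4.0, 2.4],
                [2.4, 3.0]])

# For a symmetric positive-definite matrix, the SVD coincides with the
# eigendecomposition; the left singular vectors define uncorrelated
# factor directions.
U, s, _ = np.linalg.svd(cov)

# Rotating into the factor space diagonalizes the covariance: the
# factors are uncorrelated, with variances equal to the singular values.
factor_cov = U.T @ cov @ U
```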
... The response surface models generally estimate the behavior of the average response characteristic. The equation of its variance, however, can be derived from the CCD array itself, using the principle of propagation of error [15]. In this case, the response variance, ...
Article
The machining process for vertical turning of martensitic gray cast iron is of great importance to the automotive industry, mainly in the manufacturing of piston rings. The aim of this paper is to demonstrate the development process of coated carbide tools to maximize the productivity of the process, considering the maximum life of the cutting tool and the minimum machining cost per part. Using a full-factorial design of experiments, we tested two different geometries: a square tool with special geometry—formed by two edges and two ends cutting simultaneously—and a hexagonal tool. Considering that the special square geometry provided the maximum life, full quadratic models for the responses of interest were constructed using a central composite design for feed (f) and rotation (n). Applying the generalized reduced gradient algorithm, the proposed optimization goals were achieved with feed f = 0.37 mm/rev and rotation of 264 rpm for the special square tool. Confirmation experiments prove the effectiveness of this solution.
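The propagation-of-error principle mentioned in the citing passage above, deriving a response-variance equation from a fitted model, can be sketched with hypothetical coefficients (the model, its coefficients and the noise variances are illustrative assumptions, not values from this study):

```python
# First-order propagation of error (delta method): for y = f(x1, x2),
# the transmitted response variance is approximately
#   Var(y) ~ (df/dx1)^2 * Var(x1) + (df/dx2)^2 * Var(x2).

def response(f, n):
    """Hypothetical fitted model y = b0 + b1*f + b2*n + b12*f*n."""
    return 100.0 + 5.0 * f - 0.2 * n + 0.01 * f * n

def transmitted_variance(f, n, var_f, var_n):
    """Delta-method variance of the response under noise in f and n."""
    dy_df = 5.0 + 0.01 * n        # partial derivative of the model wrt f
    dy_dn = -0.2 + 0.01 * f       # partial derivative of the model wrt n
    return dy_df ** 2 * var_f + dy_dn ** 2 * var_n

v = transmitted_variance(f=0.37, n=264.0, var_f=0.01 ** 2, var_n=4.0 ** 2)
```

Minimizing this transmitted variance alongside the mean model is the essence of the robust (propagation-of-error) formulations discussed in the abstract of the original article.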
Article
Full-text available
In multiresponse experiments (MREs) the same experimental unit is evaluated with respect to more than one response simultaneously. Optimization of MREs involves determining the point in the design region where responses perform best with respect to given criteria. Utility functions are used to transform responses outcomes at each experimental treatment into performance measures. In this paper, we investigate MREs where some response outcomes are profiles rather than individual values. A functional response gives one or more profiles as observed outcomes at each experimental treatment, and the objective is to identify the outcome that is closest to a target profile. We propose the use of the Hausdorff Distance, a similarity metric from the field of image recognition, in combination with a desirability function to obtain a utility function that gives the distance of a functional response outcome to its desired target.
Article
Quality-based design optimization is a process consisting of factor identification, finding an empirical/physical model between responses and factors, and finding a factor setting leading to the best quality levels. Response surface methodology is a well-grounded data-driven approach that applies a collection of statistical-mathematical techniques to achieve the design requirements. Today's complex systems are usually affected by nuisance factors and have more than one output characteristic with some degree of correlation. This study aims to develop two-stage stochastic programming for a multi-response optimization model based on the degree of conformance as the most comprehensive metric for such design problems. The proposed approach can include model imprecision in the optimization procedure using the probabilistic properties of the estimated parameters. This method can be applied in highly sensitive processes or costly operations that require a more accurate setting to avoid extra rework or scrap. A simulated annealing algorithm with a boosted scenario-checking modification has been applied to cope with the expensive computations for problem instances with a large sample size of uncertain parameters. The proposed model was also applied to two cases from the literature, and the results support its superiority compared with existing approaches.
Chapter
The first section of this chapter introduces statistical process control (SPC) and robust design (RD), two important statistical methodologies for quality and productivity improvement. Section 11.1 describes in depth SPC theory and tools for monitoring independent and autocorrelated data with a single quality characteristic. The relationship between SPC methods and automatic process control methods is discussed and differences in their philosophies, techniques, efficiencies, and design are contrasted. SPC methods for monitoring multivariate quality characteristics are also briefly reviewed. Section 11.2 considers univariate RD, with emphasis on experimental design, performance measures and modeling of the latter. Combined and product arrays are featured; performance measures examined include signal-to-noise ratios (SNR), PerMIAs, process response, process variance, and desirability functions. Of central importance is the decomposition of the expected value of squared-error loss into variance and off-target components, which sometimes allows the dimensionality of the optimization problem to be reduced. This section also deals with multivariate RD, demonstrates that the objective function for the multiple characteristic case is typically formed by additive or multiplicative combination of the univariate objective functions, and lists RD case studies originating from applications in manufacturing, reliability, and tolerance design. Section 11.3 discusses the mainstream methods used in the prognostics and health management (PHM) framework, including updated research from the literatures of both statistical science and engineering.
Additionally, this section provides an overview of the systems health monitoring and management (SHMM) framework, discusses its basic structure, and lists several applications of SHMM to complex systems and to critical components within the context of a big data environment. Keywords: Control chart; Minimum mean square error; Robust design; Desirability function; Statistical process control; Health monitoring and management
Article
This paper presents a multi-objective optimization algorithm that combines the Normal Boundary Intersection method with response surface models of equimax-rotated factor scores in order to simultaneously optimize multiple sets of means and variances of manufacturing process characteristics. The algorithm uses equimax factor rotation to separate means and variances into individual and uncorrelated functions and afterwards combines them in a mean squared error function. These functions are then optimized using the Normal Boundary Intersection method, generating a Pareto frontier. The optimal solutions found are then filtered according to 95% non-overlapping confidence ellipses for the predicted values of the responses, and subsequently they are assessed by a fuzzy decision-maker index established between the volume of each confidence ellipsoid and the Mahalanobis distance between each Pareto point and its individual optima for a given weight. In order to illustrate the practical implementation of this approach, two cases involving the multi-objective optimization of the hardened steel turning process were considered: (a) the AISI 52100 hardened steel turning with CC6050 mixed ceramic inserts and (b) the AISI H13 hardened steel turning with CC 670 mixed ceramic tools. For both cases, the best setup for cutting speed (V), feed rate (f) and depth of cut (d) was adjusted to find the minimal process cost (Kp) and the maximal tool life (T), both responses with minimal variance. The suitable results achieved in these case studies indicate that the proposal may be useful for similar manufacturing processes.
Article
Multiple-response surface optimization (MRSO) aims to find a setting of the input variables that simultaneously optimizes the multiple responses. The process capability approach incorporates a decision maker’s (DM’s) preference information in the problem-solving process while considering the dispersion effects of the responses systematically through the framework of the process capability index. However, the existing process capability methods are based on an unrealistic assumption that all the preference information of the DM is given in advance. Although a few methods incorporate the DM’s preference information while solving the problem, they require too much preference information from the DM. In this article, a new interactive capability method, the pairwise comparison-based interactive process capability approach (PC-IPCA), is proposed to alleviate the DM’s burden in the solving process. The proposed method is illustrated through a well-known MRSO problem. The advantages of the proposed method compared with existing process capability methods are discussed.
Article
A common problem generally encountered during manufacturing process improvement involves simultaneous optimization of multiple 'quality characteristics' or so-called 'responses', and determining the best process operating conditions. Such a problem is also referred to as a 'multiple response optimization (MRO) problem'. The presence of interaction between the responses calls for a trade-off solution. The term 'trade-off' denotes an explicit compromised solution considering the bias and variability of the responses around the specified targets. The global exact solution in such types of nonlinear optimization problems is usually unknown, and various trade-off solution approaches (based on process response surface models or without using process response surface models) have been proposed by researchers over the years. Considering the prevalent and preferred solution approaches, the scope of this paper is limited to response surface (RS)-based solution approaches and similar closely related solution frameworks for MRO problems. This paper contributes by providing a detailed step-by-step RS-based MRO solution framework. The applicability and steps of the solution framework are also illustrated using a real life in-house pin-on-disc design of experiment study. A critical review of solution approaches, with details on inherent characteristic features, assumptions, limitations, application potential in manufacturing and selection norms (indicative of the application potential) of suggested techniques/methods to be adopted for implementation of the framework, is also provided. To instigate research in this field, scopes for future work are also highlighted at the end.
Article
The determination of tolerance allocations among design parameters is an integral phase of product/process design. Such allocations are often necessary to achieve desired levels of product performance. Parametric and nonparametric methods have recently been developed for allocating multivariate tolerances. Parametric methods assume full information about the probability distribution of design parameter processes, whereas, nonparametric methods assume that only partial information is available, which consists of only design parameter process variances. These methods currently assume that the relationship between the design parameters and each of the performance measures is linear. However, quadratic response functions are increasingly being used to provide better approximations of the relationships between performance measures and design parameters. This is especially prevalent where there is a multivariate set of performance measures that are functions of a common set of design parameters. In this research we propose both parametric and nonparametric multivariate tolerance allocation procedures which consider the more general case where these relationships can be represented by quadratic functions of the design parameters. We develop the corresponding methodology and nonlinear optimization models to accommodate and take advantage of the presence of interactions and other nonlinearities among suppliers.
Article
The main objective of this research is to optimize the performance of a plastic pipe extrusion process with two main quality responses, pipe diameter and thickness, using the Min-Max model in fuzzy goal programming. First, the variables control charts are constructed at the initial factor settings of the extrusion process, where the results reveal that the process is in statistical control. However, the actual capability index values for diameter and thickness are estimated at 0.7094 and 0.7968, respectively. The process capability for a complete product, MCpk, is calculated as 0.752. These values indicate that the extrusion process is incapable. To improve process performance, the L18 array is utilized for experimental design with three 3-level process factors. Then, the Min-Max model is used to determine the combination of optimal factor settings. The estimated capability index values for diameter and thickness at the combination of optimal factor settings are found to be 1.504 and 1.879, respectively. The integrated process capability index, MCpk, is calculated as 1.681. Confirmation experiments showed that the Min-Max model enhances process capability for both responses. In conclusion, the Min-Max model may provide valuable assistance to practitioners in optimizing performance while considering both product and process preferences.
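The Min-Max idea, maximizing the smallest goal membership across responses, can be sketched in miniature (the membership limits, candidate settings and Cpk values below are illustrative assumptions, not the study's actual data):

```python
def membership(value, worst, best):
    """Linear fuzzy membership: 0 at 'worst', 1 at 'best', clipped to [0, 1]."""
    m = (value - worst) / (best - worst)
    return max(0.0, min(1.0, m))

# Hypothetical candidate factor settings with predicted
# (diameter_Cpk, thickness_Cpk) pairs, both larger-the-better.
candidates = {
    "A": (0.9, 1.9),
    "B": (1.4, 1.6),
    "C": (1.7, 0.8),
}

def min_membership(cpks):
    """Min-Max objective: the worst-satisfied goal membership."""
    d, t = cpks
    return min(membership(d, 0.5, 2.0), membership(t, 0.5, 2.0))

# Maximize the minimum membership over the candidate settings.
best = max(candidates, key=lambda k: min_membership(candidates[k]))
```

Setting "B" wins here even though neither of its responses is individually best, which is the balanced compromise the Min-Max formulation is designed to find.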
Article
We study the problem of multiple response optimization (MRO) and focus on the selection of input levels that will produce desirable output quality. We propose an interactive multiple objective optimization approach to the input design. Earlier interactive methods utilized for MRO communicate with the decision maker using only the response variable values in order to improve the current responses, with the corresponding design solution resulting automatically. In their interaction steps of preference articulation, no account is taken of any active changes in design variable values. In contrast, our approach permits the decision maker to change the design variable values in its interaction stage, which makes it possible to consider the preferences or economics of the design variable side. Using some typical value functions, we also demonstrate that our method converges reasonably well to the known optimal solutions.
Article
The concept of the dual response approach based on response surface methodology has been widely investigated and adopted for the purpose of robust design. By separately estimating mean and variability responses, a dual response approach can take advantage of optimization modeling for finding optimum settings of input factors. One of the most typical robust design models is formulated to minimize variability while keeping the process mean at the desired target. Depending on the nature of applications, however, various versions of optimization modeling have been suggested in the literature, since trade-off decisions between mean and variability responses need to be made to accomplish the goal of robust design. Addressing these trade-off issues, this article proposes a robust design scheme for enhanced process capability by applying a dual response approach. Process capability has long been viewed as a critical performance measure indicating how well a process meets specifications and customer requirements. A drug formulation example from the pharmaceutical industry is investigated, where the proposed approach is compared with conventional robust design optimization models to demonstrate the procedures and applicability of the proposed approach.
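A dual response optimization of the kind described above, minimizing dispersion subject to a mean-on-target constraint, might be sketched as follows (the fitted mean and dispersion surfaces are illustrative assumptions, and a simple grid search stands in for a proper nonlinear solver):

```python
import numpy as np

def mean_model(x1, x2):
    """Hypothetical fitted response surface for the process mean."""
    return 50.0 + 4.0 * x1 + 2.0 * x2

def std_model(x1, x2):
    """Hypothetical fitted response surface for the process dispersion."""
    return 3.0 + x1 ** 2 + 0.5 * x2 ** 2

TARGET = 50.0
grid = np.linspace(-2.0, 2.0, 81)
best = None
for x1 in grid:
    for x2 in grid:
        # Mean-on-target constraint with a small tolerance band.
        if abs(mean_model(x1, x2) - TARGET) <= 0.05:
            s = std_model(x1, x2)
            if best is None or s < best[0]:
                best = (s, float(x1), float(x2))
# best holds (minimal dispersion, x1, x2) among feasible settings.
```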
Conference Paper
This research implements the well-known six sigma approach define-measure-analyze-improve-control (DMAIC) to improve the performance of a direct compression process with two quality responses: tablet weight and hardness. At current factor settings, the x-bar and s charts are judged in-control for both responses. However, the process was found capable for hardness but incapable for weight. Three process factors are investigated, including machine speed (S), compression force (F), and filling depth (D). Taguchi's L27 array is adopted to investigate the effects of the three process factors concurrently. Then, grey relational analysis-based ranking is implemented to determine the combination of optimal factor levels, which is found to be S1F3D1. Initially, the process capability values for hardness and weight are 1.5 and 0.587, respectively. The multivariate capability index, MCpk, is calculated and found to be 0.938. After process improvement, the process capability values are found equal to 3.31 and 0.848, and MCpk is enhanced to 1.68. In conclusion, the DMAIC approach is found effective for improving the performance of the direct compression process with tablet weight and hardness.
Article
A serial manufacturing system generally consists of multiple and different dedicated processing stages that are aligned sequentially to produce a specific end product. In such a system, the intermediate and end product quality generally varies due to the setting of in-process variables at a specific stage and also due to interdependency between the stages. In addition, the output quality at each individual stage may be judged by multiple correlated end product characteristics (so-called 'multiple responses'). Thus, achieving the optimal product quality, considering the setting conditions at multiple stages with multiple correlated responses at each individual stage, is a critical and difficult task for practitioners. The solution to such a problem necessitates building data-driven empirical response function(s) at each individual stage. These response function(s) may be nonlinear and multimodal in nature. Although extensive research is reported for single-stage multiple response optimization (MRO) problems, there is little evidence of work addressing the multistage MRO problem with more than two sequential stages. This paper attempts to develop an efficient and simplified solution approach for a typical serial multistage MRO problem. The proposed approach integrates a modified desirability function and an ant colony-based metaheuristic search strategy to determine the best process setting conditions in a serial multistage system. Usefulness of the approach is verified using a real life case on a serial multistage rolled aluminum sheet manufacturing process. © 2015 Elsevier B.V. and Association of European Operational Research Societies (EURO) within the International Federation of Operational Research Societies (IFORS). All rights reserved.
Article
This research implements the well-known Six Sigma approach define-measure-analyse-improve-control (DMAIC) to improve the performance of a direct compression process with two quality responses: tablet weight and hardness. At the define phase, the x-bar and s charts are established and judged in-control for both responses. To measure performance, the values of the actual capability indices for hardness and weight are estimated as 1.5 and 0.587, respectively, which indicate that the tableting process is capable for tablet hardness but incapable for weight. At the analyse and improve phases, designed experiments utilising Taguchi's L27 array followed by the grey relational analysis (GRA) technique are implemented to determine the combination of factor levels that enhances process performance. After process improvement, the capability index values are found equal to 3.31 and 0.848. To control performance, the value of the multivariate capability index is found equal to 0.938 at the initial factor settings, while it is enhanced to 1.68 using the optimal factor settings. Finally, the multivariate control charts for Hotelling T2 and generalised variance are employed to monitor future production. In conclusion, the DMAIC approach including GRA techniques is found effective for improving the performance of direct compression with tablet weight and hardness.
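The grey relational analysis step used in the two DMAIC studies above can be sketched as follows (the treatment data are made up; the distinguishing coefficient zeta = 0.5 is the conventional default):

```python
import numpy as np

# Rows = experimental treatments, columns = (weight Cpk, hardness Cpk),
# both larger-the-better.  Values are illustrative only.
Y = np.array([
    [0.6, 1.5],
    [0.8, 3.3],
    [0.7, 2.0],
])

# Larger-the-better normalization to [0, 1].
Z = (Y - Y.min(axis=0)) / (Y.max(axis=0) - Y.min(axis=0))

# Grey relational coefficients against the ideal sequence (all ones).
delta = 1.0 - Z                       # deviation from the ideal
zeta = 0.5                            # distinguishing coefficient
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Grey relational grade: equal-weight mean across responses; highest wins.
grade = coeff.mean(axis=1)
best_treatment = int(np.argmax(grade))
```

Ranking treatments by the grade is what reduces the two correlated responses to a single ordering of factor-level combinations.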
Article
Full-text available
This work introduces a methodology for the simultaneous optimization of tolerances and parameters when the variables are correlated, and without any distributional assumption on the design factors. Therefore, the proposal covers a very general case. Tolerances can be asymmetric depending not only on the variables' distribution but also on the definition of the specification region. This flexibility allows larger tolerances than competing approaches. A simulation exercise is provided illustrating the advantage of the methodology.
Article
A generic problem encountered in process improvement involves simultaneous optimization of multiple responses (so-called 'critical response/output characteristics'). These types of problems are also referred to as 'multiple response optimization (MRO) problems'. The primary goal of any process improvement initiative is to determine the best process operating conditions that simultaneously optimize the various critical 'response characteristics'. The conventional desirability function approach uses response functions, target values and specifications to convert an MRO problem into a composite single objective optimization problem. The single objective function is maximized to determine near-optimal conditions based on a specific metaheuristic search strategy. The solution quality is expressed in terms of closeness of means to target values and reduced variance around targets. Researchers generally impose hypothetical boundary conditions on variance to achieve satisfactory solutions. In this paper, an unconstrained modified desirability function is proposed, which does not require boundary conditions on variance, to determine efficient solutions for MRO problems. Various case studies from the open literature are selected to verify the superiority of the proposed approach over the conventional desirability approach.
Article
The optimization of manufacturing processes with multiple responses is a practice that needs to become more accessible to managers. Moreover, the correlation (variance-covariance) structures among the responses are unknown and usually neglected by traditional optimization methods, most often leading to inadequate optima. To address these particularities, this work presents a multi-objective optimization method, developed to study the multiple characteristics of the AISI 52100 hardened steel turning process, based on the concept of the Multivariate Mean Squared Error. This concept is developed by combining Principal Component Analysis (PCA) with Response Surface Methodology (RSM), with a focus on multidimensional NTB (nominal-the-best) problems. Cutting speed (V), feed rate (f) and depth of cut (d) were selected as process variables. The optimal values obtained were V = 218 m/min, f = 0.086 mm/rev and d = 0.3424 mm. Experimental runs under the obtained optimal conditions were performed to confirm the theoretical results, indicating good adequacy of the proposed method. Most organizations, especially those devoted to manufacturing, feel the complexity of the large number of variables involved in their processes. Moreover, any variance-covariance structure existing among these variables substantially increases this complexity. Handling these variables while accounting for their relationships, in order to seek adequate but non-trivial solutions, ends up creating barriers for managers, who must be able to absorb this complexity, reason about it and obtain such solutions. The result almost always reflects inadequate decision making and the consequent compromise of organizational efficiency levels.
The question, therefore, is how to compose a mathematical algorithm capable of absorbing all this complexity. The answer lies in applying the various approaches available among optimization methods to treat these relationships: more specifically, the combined use of Design of Experiments (DOE), to build mathematical models of the processes; Response Surface Methodology (RSM), to model approximation functions around optimum points, generally found in regions with curvature, according to their convexity; and the concept of the Multivariate Mean Squared Error (MMSE), to minimize the distances between the responses and their respective targets and variances. The latter combines Principal Component Analysis (PCA) and Response Surface Methodology in multivariate NTB (nominal-the-best) problems.
Article
In robust design studies, location and dispersion effects play different roles, and balancing them when there are multiple responses is an important and challenging task. We propose to use desirability function to simultaneously optimize multiple responses after assigning weights to reflect the relative importance of location and dispersion effects. Our analytical strategy is to plot the solutions obtained from different weights versus the weights and/or some aspects of the solutions versus the weights. These plots provide valuable information that can be useful in choosing one or more compromise solutions to balance the multiple location and dispersion effects. The proposed approach is illustrated using two real problems. Copyright © 2012 John Wiley & Sons, Ltd.
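The weight-sweeping strategy can be illustrated with a minimal sketch. The geometric-mean combination and the specific desirability values below are assumptions for illustration, not the authors' exact formulation:

```python
import numpy as np

def overall_desirability(d_location, d_dispersion, w):
    """Weighted geometric mean of a location desirability and a
    dispersion desirability; w in [0, 1] is the weight on location."""
    return d_location ** w * d_dispersion ** (1.0 - w)

# Sweep the weight and record the trade-off curve, mimicking the
# solutions-versus-weights plots the approach relies on.
d_loc, d_disp = 0.9, 0.6   # illustrative desirability values
curve = [(w, overall_desirability(d_loc, d_disp, w))
         for w in np.linspace(0.0, 1.0, 5)]
```

Plotting `curve` (overall desirability versus w) is the kind of diagnostic the abstract describes for choosing a compromise between location and dispersion effects.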
Article
The most widely used models for studying service quality are the SERVQUAL and SERVPERF models, which measure service mechanisms. This paper estimates service efficiency and consistency for the retail industry. We measured service quality and overall satisfaction using DEA and PCI, together with degree of combination and top-two-box scores, a methodology that differs somewhat from traditional approaches. Rather than converting the service quality index by its mean value, the Service Efficiency Index (SEI) and Service Quality Consistency Index (SQCI) are used to measure the efficiency and consistency levels, which in turn can serve as new service quality indices. The SEI and SQCI results show the efficiency frontier in the retail industry: of the DMUs analyzed, 6 are relatively efficient and 12 are inefficient, and the industry's consistency level is low (0.35-0.47). There are also significant differences in efficiency and consistency within each retail segment. Finally, we summarize the results in an Effi-Con Matrix.
Article
Product quality is generally defined by a family of critical characteristics, or so-called responses. In this article, a new integrated approach is proposed that has the ability to reduce the response space dimensionality and also considers both location and dispersion effects in a correlated multiple response optimization problem. The proposed approach uses a principal component (PC)-based multivariate process capability index (MCpmk;PC) as an objective function for optimization. In addition, the nonlinear search algorithm can efficiently determine the best trade-off solution in the orthogonal PC space. Three different cases are selected to illustrate the effectiveness of the new proposed approach compared to a few selected existing approaches.
Article
A common problem encountered in product or process design is the selection of optimal parameter levels that involves the simultaneous consideration of multiple response characteristics, called a multi-response surface problem. Notwithstanding the importance of multi-response surface problems in practice, the development of an optimization scheme has received little attention. In this paper, we note that Multi-Response surface Optimization (MRO) can be viewed as a Multi-Objective Optimization (MOO) and that various techniques developed in MOO can be successfully utilized to deal with MRO problems. We also show that some of the existing desirability function approaches can, in fact, be characterized as special forms of MOO. We then demonstrate some MOO principles and methods in order to illustrate how these approaches can be employed to obtain more desirable solutions to MRO problems.
Article
Full-text available
A goal attainment approach to optimize multiresponse systems is presented. This approach aims to identify the settings of control factors to minimize the overall weighted maximal distance measure with respect to individual response targets. Based on a nonlinear programming technique, a sequential quadratic programming algorithm, the method is proved to be robust and can achieve good performance for multiresponse optimization problems with multiple conflicting goals. Moreover, the optimization formulation may include some prior work as special cases by assigning proper response targets and weights. Fewer assumptions are needed when using the approach as compared to other techniques. Furthermore, the decision-maker's preference and the model's predictive ability can easily be incorporated into the weights' adjustment schemes with explicit physical interpretation. The proposed approach is investigated and compared with other techniques through various classical examples in the literature.
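The weighted maximal-distance objective can be sketched directly. The two quadratic response models and their targets below are invented for illustration, and a derivative-free solver stands in for the sequential quadratic programming algorithm the paper uses (the minimax objective is nonsmooth):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative fitted response models (invented for this sketch)
def y1(x): return (x[0] - 1.0) ** 2 + x[1] ** 2        # target 0
def y2(x): return x[0] ** 2 + (x[1] - 2.0) ** 2        # target 0

targets = np.array([0.0, 0.0])
weights = np.array([1.0, 1.0])   # relative importance of each response

def max_weighted_distance(x):
    """Goal-attainment objective: largest weighted deviation
    of the responses from their targets."""
    devs = np.abs(np.array([y1(x), y2(x)]) - targets) / weights
    return devs.max()

res = minimize(max_weighted_distance, x0=[0.5, 0.5], method="Nelder-Mead")
```

Adjusting `weights` implements the decision-maker's preferences that the abstract mentions: a smaller weight on a response tightens how far it may stray from its target.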
Article
Service quality is measured by customers’ satisfaction. Traditionally, the degree of satisfaction is calculated from the data obtained from questionnaires that have been filled by customers directly. The percentile of each different level of a customer’s satisfaction is employed to summarize and compare the quality of service provided by different enterprises. This approach does not consider the consistency of the customers’ perceptions, thus making comparison difficult. This paper introduces the concept of a process capability index that considers both the average and the consistency of the data simultaneously. Evaluations of service quality are usually vague and linguistic. We use the fuzzy numbers of linguistic variables developed in fuzzy set theory to modify the process capability index, and then apply it to evaluate the quality of a service. The average and consistency of the data obtained from a service quality evaluation are thus considered simultaneously, making the comparison of the performance of service quality easier. Moreover, the value of the index can be applied to help to point out the direction for improving the performance of service quality whenever it is lower than some default value.
Article
Purchased materials often account for more than 50% of a manufacturer's product nonconformance cost. A common strategy for reducing such costs is to allocate periodic quality improvement targets to suppliers of such materials. Improvement target allocations are often accomplished via ad hoc methods such as prescribing a fixed, across-the-board percentage improvement for all suppliers, which, however, may not be the most effective or efficient approach for allocating improvement targets. We propose a formal modeling and optimization approach for assessing quality improvement targets for suppliers, based on process variance reduction. In our models, a manufacturer has multiple product performance measures that are linear functions of a common set of design variables (factors), each of which is an output from an independent supplier's process. We assume that a manufacturer's quality improvement is a result of reductions in supplier process variances, obtained through learning and experience, which require appropriate investments by both the manufacturer and suppliers. Three learning investment (cost) models for achieving a given learning rate are used to determine the allocations that minimize expected costs for both the supplier and manufacturer and to assess the sensitivity of investment in learning on the allocation of quality improvement targets. Solutions for determining optimal learning rates, and concomitant quality improvement targets are derived for each learning investment function. We also account for the risk that a supplier may not achieve a targeted learning rate for quality improvements. An extensive computational study is conducted to investigate the differences between optimal variance allocations and a fixed percentage allocation. These differences are examined with respect to (i) variance improvement targets and (ii) total expected cost. 
For certain types of learning investment models, the results suggest that orders of magnitude differences in variance allocations and expected total costs occur between optimal allocations and those arrived at via the commonly used rule of fixed percentage allocations. However, for learning investments characterized by a quadratic function, there is surprisingly close agreement with an “across-the-board” allocation of 20% quality improvement targets. © John Wiley & Sons, Inc. Naval Research Logistics 48: 684–709, 2001
Article
Full-text available
Recently, Vining and Myers presented a methodology for achieving some of the goals of the Taguchi philosophy using response surface methods and a dual response optimization approach. This paper shows how the same goals can be achieved using standard nonlinear programming techniques, specifically, the generalized reduced gradient (GRG) algorithm. The procedure is illustrated using examples taken from the literature. It is shown that the proposed method can be more flexible and easier to use than the dual response approach and, in some cases, can give better solutions within the region of interest. In conclusion, it is shown how the method can be applied to multiple response mixture experiments.
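The dual response idea translates readily into a standard constrained program: optimize a fitted mean model subject to a ceiling on a fitted standard-deviation model. The quadratic surfaces and the bound below are invented for illustration, and SciPy's SLSQP stands in for the GRG algorithm the paper uses:

```python
from scipy.optimize import minimize

def mean_model(x):
    """Fitted mean response (negated, since we maximize it)."""
    return -(80 + 4 * x[0] + 8 * x[1] - 2 * x[0] ** 2 - 3 * x[1] ** 2)

def std_model(x):
    """Fitted standard-deviation response."""
    return 4 + 0.5 * x[0] ** 2 + 0.5 * x[1] ** 2

# Primary response: maximize the mean; secondary: keep std <= 5
res = minimize(
    mean_model, x0=[0.0, 0.0], method="SLSQP",
    constraints=[{"type": "ineq", "fun": lambda x: 5.0 - std_model(x)}],
    bounds=[(-2, 2), (-2, 2)],   # region of interest
)
```

Swapping which surface is primary and which is the constraint recovers the other variant of the dual response problem.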
Article
Full-text available
Desirability functions have been used extensively to simultaneously optimize several responses. Since the original formulation of these functions contains non-differentiable points, only search methods can be used to optimize the overall desirability response. Furthermore, all responses are treated as equally important. We present modified desirability functions that are everywhere differentiable so that more efficient gradient-based optimization methods can be used instead. The proposed functions have the extra flexibility of allowing the analyst to assign different priorities among the responses. The methodology is applied to a wire bonding process that occurs in semiconductor manufacturing, an industrial process where multiple responses are common.
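An everywhere-differentiable target-is-best desirability can be built in several ways; the Gaussian-type form below is one illustrative choice, not the paper's own modification, and the weighted geometric mean shows how response priorities enter:

```python
import numpy as np

def smooth_desirability(y, target, scale):
    """Differentiable target-is-best desirability: 1 at the target,
    decaying smoothly away from it (no kinks, unlike the original
    piecewise Derringer-Suich form)."""
    return np.exp(-((y - target) / scale) ** 2)

def overall(ds, weights):
    """Weighted geometric mean of individual desirabilities;
    larger weights make a response's desirability count more."""
    ds, weights = np.asarray(ds, float), np.asarray(weights, float)
    return float(np.exp(np.sum(weights * np.log(ds)) / np.sum(weights)))
```

Because both pieces are smooth, gradient-based optimizers can be applied to the overall desirability, which is the point of the modification described in the abstract.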
Article
Full-text available
Much controversy surrounds the use of process capability indices in quality assurance programs. Some of the controversies include the indices legitimacy and consistency in monitoring improvement, their abuse by practitioners in using estimates as parameters, and the appropriateness of an index in particular applications. Through the use of a weight function, a unified approach for some common indices is presented, and several properties associated with the various indices are examined. Elucidation of the relationships and statistical properties associated with the various indices may assist in reducing some of the controversies that currently exist.
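For reference, the textbook forms of three common indices discussed in this literature can be computed from a sample as follows. Note these are sample estimates, not the underlying parameters, which is exactly the practitioner abuse the abstract cautions against:

```python
import numpy as np

def capability_indices(x, lsl, usl, target):
    """Sample estimates of Cp, Cpk, and Cpm for specification
    limits [lsl, usl] and a target value."""
    mu, sigma = x.mean(), x.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)                     # spread only
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)        # spread + centering
    cpm = (usl - lsl) / (6 * np.sqrt(sigma ** 2 + (mu - target) ** 2))
    return cp, cpk, cpm
```

By construction Cpk and Cpm can never exceed Cp; they penalize off-center and off-target processes, respectively.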
Conference Paper
A product contributes to yield if all of its performance functions fall within their upper and/or lower limits. For example, a piston connecting rod may be required to provide rigidity along several axes. The actual connecting rod deflection will vary, depending on variations in the materials and forging conditions, but the deflection must remain less than an upper limit. Designing for maximum yield for multivariate performance limits is a difficult task. Direct optimization may require excessive computing resources. We discuss two efficient methods for yield improvement: 'performance centering' and a method based on Taguchi's 'parameter design' philosophy. Both are shown to be motivated by the Chebyshev inequality. It is important to remember that these are approximate methods. An example shows that they may produce sub-optimal yield, even when the random components of the performance functions are independent and identically distributed.
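The Chebyshev motivation gives a distribution-free lower bound on yield. A minimal sketch, assuming only that the performance measure has finite mean and variance:

```python
def chebyshev_yield_lower_bound(mu, sigma, lsl, usl):
    """Distribution-free lower bound on yield.  With
    k = min(usl - mu, mu - lsl) / sigma, the interval
    [mu - k*sigma, mu + k*sigma] lies inside [lsl, usl], so
    Chebyshev's inequality gives P(within limits) >= 1 - 1/k^2."""
    k = min(usl - mu, mu - lsl) / sigma
    if k <= 1.0:           # the bound is vacuous for k <= 1
        return 0.0
    return 1.0 - 1.0 / k ** 2
```

Both centering the mean (raising the smaller margin) and reducing sigma increase k, which is why performance centering and variance-reducing parameter design both improve this bound.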
Article
Most manufactured products must conform to multiple "fitness for use" criteria. The manufacturing design must balance these different needs and find settings of design parameters that maximize product quality. This paper proposes the use of quadratic quality loss functions applied to response surface models to solve this multiple criterion problem. The discussion concentrates on the premanufacturing phase, when off-target losses predominate over losses due to random variability, but the methodology is equally applicable to situations in which both sources contribute appreciably to quality losses. It is shown that operating a process at the minimum of the loss function has the additional benefit of minimizing sensitivity of the product to noise variation in the design parameter settings, leading to robust designs automatically.
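A quadratic loss applied to fitted response surfaces reduces to a single scalar objective over the design parameters. The response models, cost weights, and targets below are invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def yhat(x):
    """Illustrative fitted response surfaces for two criteria."""
    return np.array([3 + 2 * x[0] - x[1], 1 + x[0] * x[1]])

targets = np.array([4.0, 2.0])   # nominal values for the two responses
costs = np.array([1.0, 2.0])     # loss coefficients (relative importance)

def loss(x):
    """Quadratic quality loss: cost-weighted squared off-target
    deviations summed over the responses."""
    return float(np.sum(costs * (yhat(x) - targets) ** 2))

res = minimize(loss, x0=[0.0, 0.0], method="Nelder-Mead")
```

The `costs` vector is where the balancing of the different "fitness for use" criteria happens: doubling a coefficient makes off-target deviations of that response twice as expensive.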
Article
Great interest has been stimulated by Genichi Taguchi and his co-workers in the application of statistically designed experiments to industrial product design. Language difficulties (Japanese to English to Statistics) have served to obscure many of the ideas offered by Taguchi. This paper is an attempt to describe some aspects of the “Taguchi Method” and particularly to review some of the tools of statistics employed: the choice of experimental design, the role of interactions, and the effects of data transformations.
Article
This is the introductory article in a special issue of the Journal of Quality Technology devoted to the topic of process capability analysis. Statistical issues generated by the current use of process capability indices are outlined. The contributions of the articles in this issue are highlighted, and the role of related statistical and graphical methods is discussed.
Article
The purpose of off-line quality control is to design robust products using robust manufacturing processes before the actual manufacturing of the product. Most of the research work has focused on determining the optimal level settings of process parameters for products with a single quality characteristic. In this paper, we employ the loss function approach to determine the optimal level settings of the process parameters of the production processes for products with multiple characteristics.
Article
A problem facing the product development community is the selection of a set of conditions which will result in a product with a desirable combination of properties. This essentially is a problem involving the simultaneous optimization of several response variables (the desirable combination of properties) which depend upon a number of independent variables or sets of conditions. Harrington, among others, has addressed this problem and has presented a desirability function approach. This paper will modify his approach and illustrate how several response variables can be transformed into a desirability function, which can be optimized by univariate techniques. Its usage will be illustrated in the development of a rubber compound for tire treads.
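Harrington's one-sided form, which this modified approach builds on, is the double exponential d = exp(-exp(-y')) on a scaled response y'; the overall desirability is the geometric mean of the individual ones:

```python
import math

def harrington_one_sided(y_prime):
    """Harrington's (1965) one-sided desirability: d tends to 1 as the
    scaled response y' grows, with d = exp(-1) ~= 0.37 at y' = 0."""
    return math.exp(-math.exp(-y_prime))

def overall_desirability(ds):
    """Geometric mean of individual desirabilities; any single
    d = 0 drives the overall desirability to 0."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))
```

The zero-propagating geometric mean is the key design choice: a product that fails any one property outright cannot be rescued by excellence in the others.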
Article
G. Taguchi and his school have made significant advances in the use of experimental design and analysis in industry. Of particular significance is their promotion of the use of statistical methods to achieve certain objectives in a mean response while simultaneously minimizing the variance. Unfortunately, many statisticians criticize the Taguchi techniques of analysis, particularly those based on the signal-to-noise ratio. This paper shows how the dual response approach of optimizing a “primary response” function while satisfying conditions on a “secondary response” function can be used to achieve the goals of the Taguchi philosophy within a more rigorous statistical methodology. An example illustrates this point.
Article
A Japanese ceramic tile manufacturer knew in 1953 that it is more costly to control the causes of manufacturing variation than to make a process insensitive to those variations. The Ina Tile Company knew that an uneven temperature distribution in the kiln caused variation in the size of the tiles. Since uneven temperature distribution was an assignable cause of variation, a process quality control approach would have increased manufacturing cost. The company wanted to reduce the size variation without increasing cost. Therefore, instead of controlling temperature distribution, they tried to find a tile formulation that reduced the effect of uneven temperature distribution on the uniformity of tiles. Through a designed experiment, the Ina Tile Company found a cost-effective method for reducing tile size variation caused by uneven temperature distribution in the kiln. The company found that increasing the lime content in the tile formulation from 1% to 5% reduced the tile size variation by a factor of ten. This discovery was a breakthrough for the ceramic tile industry.
Article
Worldwide competition and rapid technological innovation have revitalized interest in efficient techniques for product design for quality and manufacturability. The Japanese approach, popularized by G. Taguchi, uses outcomes of statistical experiments to select settings for design parameters which yield desirable process mean and variance. In this paper we present mathematical models for incorporating the results of statistical performance models along with production costs into product design models. The objective of the models is to minimize the sum of quality loss, material and production costs. Costs are assumed to be functions of the design parameters. Statistical experiments are employed to aid in the development of quality performance models. Pertinent constraints include limits on the bias of the process mean and variance. The proposed approach permits a more general environment and utilizes a more direct, economic objective as compared to the Taguchi method. A product design example is presented.
Article
The allocation of process mean locations for design parameters is important in establishing the mean of a performance characteristic within performance limits. This is an especially difficult problem when there is a multivariate set of performance measures that are all functions of a common set of design parameters. We propose multicriteria models for the allocation of design parameters that either (1) attempt to minimize the deviation of performance measures from prespecified nominal values, or (2) attempt to place performance measure means well within tolerance ranges. Empirical comparisons with established procedures, via case studies and examples, are used to illustrate both the effectiveness of the proposed procedures as well as the degree of flexibility that these procedures possess for modeling a wide range of product/process design decision environments.
Article
The problem of determining an optimal set of values for the parameters of a manufactured product (and/or a manufacturing process), from the point of view of its robustness against various sources of noise, is commonly referred to as the the parameter design problem. We propose an algorithmic strategy for solving the parameter design problem, and present a specific implementation of this strategy which we refer to as the Successive Quadratic Variance Approximation Method (SQVAM). SQVAM is based on conventional optimization techniques, and assumes that the functional relationship between the input parameters and the performance characteristic of interest is either known or could be well approximated. An illustrative numerical example, discussing the design of a Wheatstone bridge, is then presented. This is the same example used by Taguchi to illustrate the orthogonal array method.
Article
Methodology which incorporates the philosophy and methods of Taguchi is developed to address the issues of designing for manufacturability and sensitivity analysis. Specifically, the goal is to obtain responses (outputs) at or near required targets and to minimize output variability when the inputs are subject to manufacturing tolerances. The desirability functions, which generalize the notion of yield, are used to assess how close responses are to their targets. Manufacturing tolerances in the inputs are incorporated in the analysis by means of the expected loss of the desirability function which is estimated by a Taguchi outer array approach. The desirability expected loss is used to determine an optimum nominal input point to give responses close to target with small variability. The sensitivity of the various responses to the inputs is determined by an analytical method and an analysis of variance approach using a Taguchi outer array. The methods are applied to the optimization of a BIMOS NPN transistor
Article
The basis of a method for designing circuits in the face of parameter uncertainties is described. This method is computationally cheaper than those methods which employ Monte Carlo analysis and nonlinear programming techniques, gives more useful information, and more directly addresses the central problem of design centering. The method, called simplicial approximation, locates and approximates the boundary of the feasible region of an n -dimensional design space with a polyhedron of bounding ( n - 1 )-simplices. The design centering problem is solved by determining the location of the center of the maximal hyperellipsoid inscribed within this polyhedron. The axis lengths of this ellipsoid can be used to solve the tolerance assignment problem. In addition, this approximation can be used to estimate the yield by performing an inexpensive Monte Carlo analysis in the parameter space without any need for the usual multitude of circuit simulations.
Article
We survey contemporary optimization techniques and relate these to optimization problems which arise in the design of integrated circuits. Theory, algorithms and programs are reviewed, and an assessment is made of the impact optimization has had and will have on integrated-circuit design. Integrated circuits are characterized by complex tradeoffs between multiple nonlinear objectives with multiple nonlinear and sometimes nonconvex constraints. Function and gradient evaluations require the solution of very large sets of nonlinear differential equations; consequently they are inaccurate and extremely expensive. Furthermore, the parameters to be optimized are subject to inherent statistical fluctuations. We focus on those multiobjective constrained optimization techniques which are appropriate to this environment.
Article
This article describes some methods which enable us to construct small designs for quantitative factors, while maintaining as much orthogonality of the design as possible. To calculate the D-
Harrington, E.C. (1965) The desirability function. Industrial Quality Control, 21(10).
León, R.V., Shoemaker, A.C. and Kacker, R.N. (1987) Performance measures independent of adjustment. Technometrics, 29.
Montgomery, D.C. (1997) Design and Analysis of Experiments. John Wiley, New York, NY.
Pignatiello, J.J. (1993) Strategies for robust multiresponse quality engineering. IIE Transactions, 25(3).
Udler, D.M. and Zaks, A.L. (1997) Use Ppk and Cpk to reduce customer conflicts. Manufacturing Engineering, March.