
Experiments are performed today in many manufacturing organizations to increase our understanding and knowledge of various manufacturing processes.

Experiments in manufacturing companies are often conducted as a series of trials or tests that produce quantifiable outcomes. For continuous improvement in product and process quality, it is fundamental to understand the process behavior, the amount of variability, and its impact on the process. In an engineering environment, experiments are often conducted to explore, estimate, or confirm. Exploration refers to determining the effect of process variables or factors on the output performance characteristic; confirmation implies verifying the results predicted from an experiment. In manufacturing processes, it is often of primary interest to explore the relationships between the key input process variables and the output performance characteristics. One common approach employed by many engineers today is One-Variable-At-a-Time (OVAT), in which one variable is varied while all other variables in the experiment are held fixed. This approach depends upon guesswork, luck, experience, and intuition for its success, and it requires large resources to obtain a limited amount of information about the process. OVAT experiments are often unreliable, inefficient, and time consuming, and may identify a false optimum condition for the process. Statistical thinking and statistical methods play an important role in planning, conducting, analyzing, and interpreting data from engineering experiments. When several variables influence a certain characteristic of a product, the best strategy is to design an experiment so that valid, reliable, and sound conclusions can be drawn effectively, efficiently, and economically. In a designed experiment, the engineer makes deliberate changes in the input variables and then determines how the output functional performance varies accordingly.
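The contrast in run counts and information content between the two strategies can be sketched as follows; the factor names and levels here are hypothetical, chosen only for illustration:

```python
from itertools import product

# Hypothetical two-level settings for three process variables.
factors = {
    "temperature": [150, 180],   # deg C
    "pressure":    [1.0, 1.5],   # bar
    "speed":       [200, 250],   # rpm
}

# OVAT: vary one factor at a time from a baseline -> 1 + 3 runs,
# but no information about interactions between the factors.
baseline = {name: levels[0] for name, levels in factors.items()}
ovat_runs = [baseline] + [
    {**baseline, name: levels[1]} for name, levels in factors.items()
]

# Full factorial: every combination -> 2**3 = 8 runs; all main
# effects and interactions are estimable.
factorial_runs = [
    dict(zip(factors, combo)) for combo in product(*factors.values())
]

print(len(ovat_runs))       # 4
print(len(factorial_runs))  # 8
```

The full factorial plan costs more runs here, but designed experiments such as Taguchi's orthogonal arrays, discussed below, recover much of this information from far fewer runs.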
It is important to note that not all variables affect the performance in the same manner: some may have a strong influence on the output performance, some a medium influence, and some no influence at all. The objective of a carefully planned designed experiment is therefore to understand which set of variables in a process affects the performance most, and then to determine the best levels for these variables to obtain satisfactory output functional performance in products.

The quality engineering methodology of Taguchi, employing design of experiments, is one of the most important statistical tools for designing high-quality systems at reduced cost. Taguchi methods provide an efficient and systematic way to optimize designs for performance, quality, and cost. Optimization of process parameters is the key step in the Taguchi method for achieving high quality without increasing cost, because optimal process parameters improve the quality characteristics while remaining insensitive to variation in environmental conditions and other noise factors. Classical process parameter design is complex and not an easy task. To solve this task, the Taguchi method uses specially designed orthogonal arrays to study the entire process parameter space with only a small number of experiments. Furthermore, Taguchi created a transformation of the repetition data to another value that is a measure of the variation present. The transformation is known as the signal-to-noise (S/N) ratio. The S/N ratio consolidates several repetitions into one value that reflects the amount of variation present. Several S/N ratios are available depending on the type of characteristic: lower is better (LB), nominal is best (NB), or higher is better (HB). The S/N ratio for each level of the process parameters is computed based on the S/N analysis. Regardless of the category of the quality characteristic, a larger S/N ratio corresponds to a better quality characteristic. Therefore, the optimal level of a process parameter is the level with the highest S/N ratio. Finally, a statistical analysis of variance (ANOVA) is performed to see which process parameters are statistically significant.
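The three standard S/N ratios can be sketched as below; the repetition data are hypothetical, and the formulas are the commonly used logarithmic forms (in decibels):

```python
import math

def sn_smaller_is_better(ys):
    # LB: S/N = -10 * log10(mean of y^2); larger S/N means less deviation from zero.
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

def sn_larger_is_better(ys):
    # HB: S/N = -10 * log10(mean of 1/y^2)
    return -10 * math.log10(sum(1 / (y * y) for y in ys) / len(ys))

def sn_nominal_is_best(ys):
    # NB: S/N = 10 * log10(mean^2 / variance); rewards low scatter around the mean.
    n = len(ys)
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    return 10 * math.log10(mean * mean / var)

# Example: three repetitions of a response at one factor-level setting.
reps = [10.1, 9.8, 10.3]
print(round(sn_nominal_is_best(reps), 2))  # → 32.04
```

In a Taguchi analysis, one such ratio is computed per experimental run, and the factor level with the highest average S/N ratio is selected as optimal.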

Generally, the quality of a weld joint is directly influenced by the welding input parameters during the welding process; therefore, welding can be considered a multi-input, multi-output process. Unfortunately, a common problem facing the manufacturer is the control of the process input parameters to obtain a good welded joint with the required bead geometry and weld quality, with minimal detrimental residual stresses and distortion. Traditionally, it has been necessary to determine the weld input parameters for every new welded product to obtain a welded joint with the required specifications. Doing so requires a time-consuming trial-and-error development effort, with weld input parameters chosen by the skill of the engineer or machine operator. The welds are then examined to determine whether they meet the specifications. Finally, weld parameters can be chosen to produce a welded joint that closely meets the joint requirements. What is not achieved, or often even considered, is an optimized combination of welding parameters, which could be used if only it could be determined. To overcome this problem, various optimization methods can be applied to define the desired output variables by developing mathematical models that specify the relationship between the input parameters and the output variables.

G. Taguchi, a Japanese engineer, espoused an influential philosophy for quality control in manufacturing industries. His philosophy has far-reaching consequences, yet it is founded on three very simple fundamental concepts, out of which the whole of the technology and its techniques arise:

1. Quality should be designed into the product, not inspected into it.
2. Quality is best achieved by minimizing the deviation from target. The product should be designed so that it is immune to uncontrollable environmental factors.
3. The cost of quality should be measured as a function of deviation from the standard, and the losses should be measured system-wide.

Taguchi's first belief was that the best way to improve quality is to design and build it into the product. Quality improvement starts at the very beginning, during the design stages of the product or process, and continues through the production phase. He proposed that an off-line strategy for developing quality improvement be used in place of attempts to inspect quality into a product on the production line. The second concept deals with the actual methods affecting quality. It contends that quality is directly related to the deviation of a design parameter from its target value, not to conformance to some fixed specifications. A product may be produced with properties skewed toward one end of the acceptance range and yet show a shorter life expectancy; however, by specifying a target value for the critical property and developing the manufacturing processes to meet that target value with little deviation, the life expectancy may be much improved. The third concept calls for measuring deviations from a given design parameter in terms of the overall life-cycle cost of the product. These costs include the cost of scrap, rework, inspection, returns, warranty service calls, and product replacement, and they provide guidance regarding the major parameters to be controlled. Quality improvement should be considered an ongoing effort: one should continually strive to reduce variation around the target value. A product under investigation may exhibit a distribution whose mean differs from the target value; the first step toward improving quality is to bring the population distribution as close to the target value as possible. To accomplish this, Taguchi designed experiments using specially constructed tables known as orthogonal arrays (OA). The use of these tables makes the design of experiments easy and consistent. A second objective of manufacturing products that conform to ideal values is to reduce the variation, or scatter, around the target. To accomplish this, Taguchi incorporated a unique way of treating noise factors. Noise factors, in Taguchi terminology, are factors that influence the response of a process but cannot be economically controlled; noise factors such as process variation and machinery wear are usually the prime source of variation. Through the use of orthogonal arrays, Taguchi devised an effective way to study their influence with the least number of repetitions.
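The orthogonal-array idea can be sketched with the standard L4(2^3) array, the smallest such table; the measured responses below are hypothetical, used only to show how main effects are separated:

```python
# Standard L4(2^3) orthogonal array: 4 runs, three two-level factors
# (levels coded 1 and 2). Every pair of columns contains each level
# combination equally often, so the columns are balanced against
# one another and main effects can be estimated independently.
L4 = [
    (1, 1, 1),
    (1, 2, 2),
    (2, 1, 2),
    (2, 2, 1),
]

# Hypothetical measured responses for the four runs (illustrative only).
response = [20.0, 24.0, 30.0, 34.0]

def main_effect(column):
    # Average response at level 1 vs. level 2 of one factor column.
    lvl1 = [r for row, r in zip(L4, response) if row[column] == 1]
    lvl2 = [r for row, r in zip(L4, response) if row[column] == 2]
    return sum(lvl1) / len(lvl1), sum(lvl2) / len(lvl2)

for col, name in enumerate(["A", "B", "C"]):
    lo, hi = main_effect(col)
    print(f"Factor {name}: level-1 mean {lo:.1f}, level-2 mean {hi:.1f}")
```

With these numbers, factor A shows the largest gap between its two level means (22 versus 32), so it would be ranked as the dominant factor; a full factorial for three factors would have needed 8 runs instead of 4.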
The end result is a robust design affected minimally by noise, with a high signal-to-noise value. To achieve desirable product quality by design, Taguchi recommends a three-stage process:

1. System design
2. Parameter design
3. Tolerance design

The focus of the system design phase is on determining the suitable working levels of the design factors. It includes designing and testing a system based on the engineer's judgment of selected materials, parts, and nominal product/process parameters. While system design helps to identify the working levels of the design factors, parameter design seeks to determine the factor levels that produce the best performance of the product or process under study. The optimum condition is selected so that the influence of uncontrollable factors causes minimum variation in system performance. Tolerance design is a step used to fine-tune the results of parameter design by tightening the tolerances of factors with a significant influence on the product.

Quadratic loss function


The quadratic loss function can meaningfully approximate the quality loss in most situations. Let y be the quality characteristic of a product and m the target value for y. (Note that the quality characteristic is a product's response that is observed for quantifying the quality level and for optimization in a Robust Design project.) According to the quadratic loss function, the quality loss is given by

L(y) = k(y - m)^2     (2.2)

where k is a constant called the quality loss coefficient. Equation (2.2) is plotted in Figure 2.3(b). Notice that at y = m the loss is zero, and so is the slope of the loss function. This is quite appropriate because m is the best value for y. The loss L(y) increases slowly when we are near m, but as we go farther from m the loss increases more rapidly. Qualitatively, this is exactly the kind of behavior we would like the quality loss function to have, and the quadratic loss function given by Equation (2.2) is the simplest mathematical function that has the desired qualitative behavior. Equation (2.2) is applicable whenever the quality characteristic y has a finite target value, usually nonzero, and the quality loss is symmetric on either side of the target. Such quality characteristics are called nominal-the-best type quality characteristics, and Equation (2.2) is called the nominal-the-best type quality loss function. Some variations of the quadratic loss function are needed to cover certain commonly occurring situations adequately. Three such variations are given below.

Smaller-the-better type characteristic. Some characteristics, such as radiation leakage from a microwave oven, can never take negative values. Their ideal value is zero, and as their value increases the performance becomes progressively worse. Such characteristics are called smaller-the-better type quality characteristics, and their quality loss is

L(y) = k y^2     (2.5)

Note that this is a one-sided loss function because y cannot take negative values. As described earlier, the quality loss coefficient k can be determined from the functional limit and the quality loss at the functional limit by using Equation (2.3).

Larger-the-better type characteristic. Some characteristics, such as the bond strength of adhesives, also do not take negative values. But zero is their worst value, and as their value becomes larger the performance becomes progressively better; that is, the quality loss becomes progressively smaller. Their ideal value is infinity, and at that point the loss is zero. Such characteristics are called larger-the-better type quality characteristics. The reciprocal of such a characteristic has the same qualitative behavior as a smaller-the-better type characteristic, so we approximate the loss function for a larger-the-better type characteristic by substituting 1/y for y in Equation (2.5):

L(y) = k(1/y)^2

Asymmetric loss function. In certain situations, deviation of the quality characteristic in one direction is much more harmful than in the other direction. In such cases, one can use a different coefficient k for the two directions.
Thus, the quality loss would be approximated by the following asymmetric loss function:

L(y) = k1(y - m)^2 for y > m, and L(y) = k2(y - m)^2 for y ≤ m.

The four different versions of the quadratic loss function are plotted in Figure 2.4. For a more detailed discussion of the quality loss function, see Taguchi.
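The four loss-function variants above can be sketched directly; the functional-limit and loss values used to fix k are hypothetical:

```python
def loss_nominal(y, m, k):
    # Nominal-the-best: L(y) = k * (y - m)^2
    return k * (y - m) ** 2

def loss_smaller(y, k):
    # Smaller-the-better: L(y) = k * y^2  (y >= 0, target is zero)
    return k * y ** 2

def loss_larger(y, k):
    # Larger-the-better: L(y) = k / y^2  (loss -> 0 as y -> infinity)
    return k / y ** 2

def loss_asymmetric(y, m, k1, k2):
    # Different coefficients above and below the target m.
    k = k1 if y > m else k2
    return k * (y - m) ** 2

# k is fixed by the loss A0 incurred at the functional limit d0:
# k = A0 / d0^2. The numbers here are hypothetical.
A0, d0 = 50.0, 0.5          # e.g. a 50-unit loss at a +/-0.5 deviation
k = A0 / d0 ** 2            # 200.0
print(round(loss_nominal(10.2, 10.0, k), 2))  # → 8.0
```

Note how the nominal-the-best loss grows with the square of the deviation, so a deviation twice as large costs four times as much; this is the quantitative content of Taguchi's second concept.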

Classification of factors

For manufacturing process optimization problems, the following factors are of interest to experimenters:

1. Control factors
2. Noise factors
3. Signal factors

Control factors are those factors that can be easily controlled during actual production. It is the objective of the design activity to determine the best levels of these factors to achieve product/process robustness; in this sense, robustness refers to making products and processes insensitive to various sources of variation. Noise factors are those factors that are difficult to control during actual production but may be controllable during a designed experiment. These factors cause the performance characteristic of a product to deviate from its target or nominal value. The levels of the noise factors change from one unit to another, from one environmental condition to another, and from time to time. Only statistical characteristics such as the mean and variance of noise factors can be known or specified; their actual values in specific situations cannot be known.

ANALYSIS OF VARIANCE

Different factors affect the response to a different degree. A better feel for the relative effect of the different factors can be obtained by the decomposition of variance, commonly called analysis of variance (ANOVA). ANOVA is also needed for estimating the error variance of the factor effects and the variance of the prediction error.
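The decomposition of variance can be sketched for one factor at three levels; the response data are hypothetical, and the total sum of squares splits exactly into a between-level (factor) part and a within-level (error) part:

```python
# One-way ANOVA sketch: decompose total variation into between-level
# (factor) and within-level (error) sums of squares.
groups = [
    [20.1, 19.8, 20.3],   # responses at level 1 (hypothetical)
    [22.0, 21.7, 22.4],   # responses at level 2
    [24.9, 25.2, 24.6],   # responses at level 3
]

all_y = [y for g in groups for y in g]
grand = sum(all_y) / len(all_y)

ss_total = sum((y - grand) ** 2 for y in all_y)
ss_factor = sum(len(g) * ((sum(g) / len(g)) - grand) ** 2 for g in groups)
ss_error = ss_total - ss_factor   # identity: SS_total = SS_factor + SS_error

df_factor = len(groups) - 1
df_error = len(all_y) - len(groups)
ms_factor = ss_factor / df_factor
ms_error = ss_error / df_error    # estimate of the error variance
f_ratio = ms_factor / ms_error    # compare against F(df_factor, df_error)

print(f"SS_factor={ss_factor:.2f}  SS_error={ss_error:.2f}  F={f_ratio:.1f}")
```

A large F ratio relative to the critical value of the F distribution indicates that the factor's contribution to the variation is statistically significant; in a Taguchi analysis, this same decomposition ranks the process parameters by their contribution.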
