
SIX SIGMA GREENBELT PROGRAM

DMAIC
- Moving towards defect free processes

Alexiv Villas
Oct 6-7, 2010

WWW.ECCINTERNATIONAL.COM

PHILIPPINES MALAYSIA VIETNAM INDONESIA INDIA CHINA 1


DAY 1
• Data Collection Plan
– Operational Definition
– Develop Measurement Plan
– Data Collection
– Data Display and evaluation of Data
• Fundamentals of Minitab
• Basic Statistics
– Measures of Central Tendency
– Measures of Dispersion
– Probability Distribution
• Gage R & R
– Gage R & R for Continuous Data
– Gage R & R for Attribute Data
• Process Capability
• Process Sigma Level Calculations
2
Main Activities

Measure
Step 3:
• Take a snapshot of the process: how the process is performing
currently, and fix the baseline.
Step 4:
• Validate the measurement system from which we collect the data.

Measure Phase Outcomes
• Operational Definitions
• Measurement System Analysis
• Data Collection Formats and Plans
• Process Baseline
• Capability
• Specification limits, target, and defect definition for project Y(s)
3
Why Measure?

If we can’t accurately measure something:

We don’t know enough about it

We can’t control it

We are at the mercy of chance!

4
Science of Six Sigma

When you measure what you are speaking about and express it
in numbers, you know something about it.

Scientific explanation: Very little progress is possible in any
field of investigation without the ability to measure. The
progress of measurement is, in fact, the progress of science!

Non-scientific explanation: If you cannot measure it, just
forget it! It will be a sheer waste of time.

Without data you are just a loud mouth with an opinion…

5
Data Collection Plan

The foundation of Six Sigma is data-based decision making.

Data drives decisions and actions!

6
What is Data?

Data are measurements or observations we record and use to
understand, characterize, optimize or control something, such
as a process.

7
Knowledge is Power

Knowledge is not based on opinion;
rather, it is derived from facts and data.

To efficiently collect data, effectively analyze it,
and extract the maximum knowledge available, one must rely
on statistical techniques.

8
Use of Statistics

Statistics convert Data into Usable Information.

9
Data Collection Plan

• Define a metric

• Decide what will be measured, how, and by whom

• Stick to the procedure/plan

• Plot (graph) the data so the result can be easily understood

10
Operational Definition

An operational definition is a precise description of the specific criteria used
for the measures (the what), the methodology to collect the data (the how),
the amount of data to collect (how much), and who has the responsibility to
measure the data.

When developing an operational definition, it is important for the team to
fully understand and agree that the definition reflects exactly what
information the team is attempting to gather about the process.

Clarity is most important when developing and selecting the measures that
will be used to determine the SIGMA PERFORMANCE of the process.

11
Operational Definition

Example:

Operational definitions may determine whether a team is to count all the
defects on an invoice (required to calculate defects per million
opportunities),
or
the total number of defective invoices (any invoice with any defect),
or
the type of defects encountered on an invoice (to eliminate the most
common defects first).

Each of these cases may require a very different approach for gathering
the data.

12
Operational Definition

An operational definition provides the foundation for the team to:

1. Reach an agreement on what data are to be collected.

2. Build consistency and reliability into data collection.

3. Fully agree on how a particular characteristic of a process is to be
measured.

13
Example of Operational Definition

Poor operational definition:
Cycle time of a transaction

Good operational definition:
Collect data from all transactions processed from 1-Aug-05 to 31-Aug-05.
The cycle time of each transaction is the elapsed time from the date and
time the transaction was downloaded from the client server by an agent
to the date and time the PROCESSED transaction was submitted to the
client server, as per the client server system time.

14
Exercise: Operational Definition

Write operational definitions for the following cases:

1. Maximizing server availability

2. Reducing the attrition rate in ABZ

3. Improving the quality percentage of a process

4. Reducing call handling time

5. Minimizing abandoned calls in a call center

15
Develop Measurement Plan

Measurement Plan
Determining current process performance usually requires the collection
of data. When developing a measurement plan, ensure that:
– The data collected is meaningful
– The data collected is valid
– All relevant data is collected concurrently

Questions to ask when planning the collection:
• What is the purpose of collecting the data? Will it serve the purpose?
• How will you collect the data?
  – What result will you measure?
  – What kind of cause will you analyze for the ineffective process?
• What kind of tool will be required? A form, a check sheet?
• Is all related data collected? What sample size, frequency, and
  sampling method?
• Is the data collection method adequate?
  – Who will collect the data?
  – Where can we collect the data?
  – When will we collect the data?
  – What kind of assistance will be necessary?

16
Data Classification

Before data collection starts, classify the data into different types:
continuous or discrete.

This is important because it will:

– Provide a choice of data display and analysis tools

– Dictate sample size calculation

– Provide performance or cause information

– Determine the appropriate control chart to use

– Determine the appropriate method for calculation of Sigma

17
Types of Data

Continuous Data
Description: Measured on a continuum or scale.
Example: Time (in hours) to process an application.

Discrete Data
Description:
– Binary: classified into one of two categories
– Count: counted discretely
– Ordered categories: rankings or ratings
Examples:
– % of applications with or without errors
– Number of errors in an application
– Customer satisfaction rating of call center service
18
Continuous Data
Data generated by
– Physically measuring the characteristic
– Generally using an instrument
– Assigning a unique value to each item

Examples of Continuous Data:
1. The time it takes to write a proposal
2. The time it takes to conduct a feasibility study
3. The time it takes to close the books each month
4. Invoice amounts
5. Sales order amounts
6. Handling time, time to certify PCs, etc.

Example (Call Waiting Time in Secs):
SL No.  Waiting Time    SL No.  Waiting Time
1       98              11      102
2       103             12      98
3       100             13      101
4       100             14      101
5       99              15      99
6       101             16      100
7       97              17      101
8       102             18      99
9       100             19      100
10      99              20      102

19
Discrete Data

Data generated by

• Classifying the items into different groups based on some criteria

• All the items classified into a group will have the same value

Examples:
• Gender, shade variation, etc.
• Escalations, repeat calls, defective transactions, defects in
transactions, etc.

20
Discrete : Binary Data (Binomial)

• Classifying the items into only two groups based on some criteria

• Each item will fall into one of the two groups

• All the items classified into a group will have the same value

• Expressed or summarized as a proportion p or percentage

Examples :

• Gender, Escalations, Repeat Calls, Defective Transactions, etc


• An invoice is either “complete” or “incomplete”.
• A delivery is either “late” or “not late”.
• A product is either “damaged” or “not damaged”.
• A hotel room is either “dirty” or “clean”.
• A sales pitch is either a “thumbs up” or “thumbs down”.

21
Binary Data : Example
(Month wise Escalation of Transactions)

Month No. of Transaction Processed No. of Transactions Escalated


Jan 2000 20
Feb 2500 30
Mar 1500 14
Apr 3000 27
May 4000 40
June 3500 33

Proportion of Escalated Transactions

p = No. of Transactions Escalated / Total No. of Transactions Processed

= (20 + 30 + 14 + 27 + 40 + 33) / (2000 + 2500 + 1500 + 3000 + 4000 + 3500)

= 164 / 16500 = 0.0099 ≈ 0.01 = 1%

On average, 1% of transactions are escalated.
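The arithmetic above can be reproduced in a few lines of Python. This is a sketch; the monthly figures are taken directly from the table on this slide:

```python
# Monthly escalation data from the slide (Jan-June)
processed = [2000, 2500, 1500, 3000, 4000, 3500]
escalated = [20, 30, 14, 27, 40, 33]

# Proportion p = total escalated / total processed
p = sum(escalated) / sum(processed)

print(sum(escalated), sum(processed))  # 164 16500
print(round(p, 4))                     # 0.0099, i.e. about 1%
```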

22
Discrete : Count Data (Poisson)

Data generated by
• Counting the exact number of occurrences of the characteristics in a
group of items

• It takes integer values: 0, 1, 2, …

• Expressed or summarized as an average number of occurrences

Examples :

• Number of fatal defects in transactions processed


• Number of accidents in the city during June 2005
• Number of suicides in the city during 2004.
• The number of errors on twenty invoices.
• The number of computer system failures in a month

23
Count Data : Example
(Data on Defects found during Transaction Audit)

No. of Items Audited   100   50   76   82   172   150   89
No. of Defects           3    0    4    1     5    10    1

Average Number of Defects = Total No. of Defects / Total No. Audited

= (3 + 0 + 4 + 1 + 5 + 10 + 1) / (100 + 50 + 76 + 82 + 172 + 150 + 89)
= 24 / 719 = 0.033

On average, 0.033 defects are found per transaction,
i.e. 3.3 defects per 100 transactions.
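The same defects-per-unit computation can be sketched in Python, using the item and defect counts from the calculation above:

```python
# Audit data from the slide's calculation
audited = [100, 50, 76, 82, 172, 150, 89]
defects = [3, 0, 4, 1, 5, 10, 1]

# Average defects per unit (DPU) = total defects / total units audited
dpu = sum(defects) / sum(audited)

print(round(dpu, 3))        # 0.033 defects per transaction
print(round(dpu * 100, 1))  # 3.3 defects per 100 transactions
```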

24
Data Measurement Plan Format

Performance Measure: Time to process a transaction
Operational Definition: Date and time the transaction was downloaded
from the client server by an agent to the date and time the PROCESSED
transaction was submitted to the client server, per the client server
system time.
Data Source & Location: Client server system
Sample Size: 256
Who Will Collect The Data: Raju, Simita
Data Collection Period: 1-Aug-05 to 31-Aug-05
How Collected: Random selection
Other data to collect at the same time: Type of transaction, Day of
Week, Agent Name

The data collected to measure performance is called PERFORMANCE DATA.

On the other hand, CAUSE DATA focuses on why the process performs as it does. Cause data
supports problem solving by helping to isolate root causes of the problems.

Most of the time, however, we won’t know enough about potential causes until we have
determined our process’s current performance level. Be prepared to document current
performance first, then brainstorm potential causes and collect additional data related to those
causes at a later date.

25
Data Collection

While collecting data, ensure that the measurement plan is followed.
Note any deviations from the plan.

Avoid bias and ensure consistency.

Use tools such as check sheets to record and group the data.

Ensure that the sample selected is representative of the population. If
there is any concern on this issue, record the things that may cause the
data collected to not be representative of the population.

Ensure Effective and Efficient Data Collection

26
Sampling

27
Sampling Objectives

• Understand the purpose and advantage of sampling

• Understand the application of different sampling techniques to


ensure accurate process representation

• Gain experience in asking appropriate questions to ensure a robust


sampling plan is implemented effectively and efficiently

• Understand guidelines and formulas used to determine sample size

28
Basic Definitions and Symbols

Population (N): the entire set of objects or activities for a process

µ: the mean (arithmetic average) calculated for a population
σ: the standard deviation calculated for a population

Sample (n): a group that is a part or subset of a population

x̄: the mean (arithmetic average) of a sample
s: the standard deviation of a sample

29
Sampling Definition

Sampling is the process of collecting only a portion of the data that is
available or could be available, and drawing conclusions about the total
population (statistical inference).

[Diagram: a population of N = 5000 items, from which a sample of
n = 100 items is drawn]

Example:
Estimating the average height of students in a college by measuring the
heights of only 250 students (250 is a subset of the entire student
population).
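The idea can be simulated in Python. The population of student heights below is synthetic (an assumption for illustration, not data from the course); a random sample of 250 is drawn and its mean is compared with the population mean:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Hypothetical population: 5000 student heights (feet), centered near 5.5
population = [random.gauss(5.5, 0.3) for _ in range(5000)]

# Draw a random sample of n = 250 without replacement
sample = random.sample(population, 250)

pop_mean = statistics.mean(population)
sample_mean = statistics.mean(sample)

# The sample mean is a close estimate of the population mean
print(round(pop_mean, 2), round(sample_mean, 2))
```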

30
Sample …….. When?

When to ……..
• Collecting all the data is impractical or too costly

• Data collection can be a destructive process

• When measuring a high-volume process

When not to ……
• A subset of data may not accurately depict the process, leading to a
wrong conclusion (every unit is unique, e.g., structured deals)

31
Kinds of Sampling

Random Sampling
• Ensures that every item in the population
has an equal chance of being selected, so the
characteristics of the population are represented.

Stratified Sampling
• Make a stratifying plan for population
characteristics (e.g., Group A, Group B).
• Select a sample from each
stratified group.
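The stratified approach above can be sketched in Python. The population, group labels, and per-stratum sample size here are hypothetical, chosen only to show the mechanics:

```python
import random

random.seed(7)  # reproducible illustration

# Hypothetical population tagged with a stratum (Group A / Group B)
population = [("A", i) for i in range(300)] + [("B", i) for i in range(700)]

def stratified_sample(items, per_stratum):
    """Group items by stratum, then draw an equal random sample from each."""
    strata = {}
    for group, value in items:
        strata.setdefault(group, []).append(value)
    return {g: random.sample(vals, per_stratum) for g, vals in strata.items()}

sample = stratified_sample(population, per_stratum=25)
print({g: len(v) for g, v in sample.items()})  # {'A': 25, 'B': 25}
```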

32
Frequency of Sampling

• Sample more often for an unstable process
(systematic, subgroup sampling)

• Sample less often for a stable process

• To make a useful business decision we have to decide the precision
of the data and the frequency of sampling.

33
Sampling Methodology

• Select a sample of items from the population

• Measure the characteristics on each item in the sample

• Calculate the sample statistics

• Provide the sample statistics as an estimate of population statistics.

34
Methodology: Example

To estimate the average height of students in a college

Select a sample of items from the population, say 250 students

Measure the characteristics on each item in the sample

i.e. measure the height of all 250 students in the sample

Calculate the sample statistics

i.e. calculate the average height of the 250 students (= 5.5 feet)

Provide the sample statistics as an estimate of the population statistics

i.e. estimated average height of students in the college = 5.5 feet

35
Measure Phase
Six Sigma Statistics

36
Six Sigma Statistics
Welcome to Measure

Process Discovery

Six Sigma Statistics

Basic Statistics

Descriptive Statistics

Normal Distribution

Assessing Normality

Special Cause / Common Cause

Graphing Techniques

Measurement System Analysis

Process Capability

Wrap Up & Action Items

37
Purpose of Basic Statistics
The purpose of Basic Statistics is to:
• Provide a numerical summary of the data being analyzed.
– Data (n)
• Factual information organized for analysis.
• Numerical or other information represented in a form suitable
for processing by computer
• Values from scientific experiments.
• Provide the basis for making inferences about the future.
• Provide the foundation for assessing process capability.
• Provide a common language to be used throughout an organization to
describe processes.

Relax….it won’t
be that bad!

38
Statistical Notation – Cheat Sheet

Σ     Summation
x     An individual value, an observation
x₁    A particular (1st) individual value
xᵢ    For each, all, individual values
x̄     The Mean, average of sample data
x̿     The grand Mean, grand average
µ     The Mean of population data
s     The Standard Deviation of sample data
σ     The Standard Deviation of population data
s²    The variance of sample data
σ²    The variance of population data
R     The range of data
R̄     The average range of data
p̂     A proportion of sample data
P     A proportion of population data
n     Sample size
N     Population size
k     Multi-purpose notation, i.e. # of subgroups, # of classes
| |   The absolute value of some term
>, <  Greater than, less than
≥, ≤  Greater than or equal to, less than or equal to

39
Parameters vs. Statistics
Population: All the items that have the “property of interest” under study.

Frame: An identifiable subset of the population.

Sample: A significantly smaller subset of the population used to make an inference.

[Diagram: several samples drawn from one population]

Population Parameters:
– Arithmetic descriptions of a population
– µ, σ, P, σ², N

Sample Statistics:
– Arithmetic descriptions of a sample
– x̄, s, p, s², n

40
Types of Data
Attribute Data (Qualitative)
– Is always binary; there are only two possible values (0, 1)
• Yes / No
• Go / No go
• Pass / Fail
Variable Data (Quantitative)
– Discrete (Count) Data
• Can be categorized in a classification and is based on counts.
– Number of defects
– Number of defective units
– Number of customer returns
– Continuous Data
• Can be measured on a continuum; it has decimal subdivisions that are
meaningful
– Time, Pressure, Conveyor Speed, Material feed rate, Money

41
Discrete Variables

Discrete Variable                                     Possible values for the variable

The number of defective needles in boxes of 100       0, 1, 2, …, 100
diabetic syringes

The number of individuals in groups of 30 with a      0, 1, 2, …, 30
Type A personality

The number of surveys returned out of 300             0, 1, 2, …, 300
mailed in a customer satisfaction study

The number of employees in 100 having finished        0, 1, 2, …, 100
high school or obtained a GED

The number of times you need to flip a coin           1, 2, 3, …
before a head appears for the first time
(note: there is no upper limit, because you might
need to flip forever before the first head appears)
42
Continuous Variables

Continuous Variable Possible Values for the Variable

The length of prison time served for individuals All the real numbers between a and b, where a is
convicted of first degree murder the smallest amount of time served and b is the
largest.

The household income for households with All the real numbers between a and $30,000,
incomes less than or equal to $30,000 where a is the smallest household income in the
population

The blood glucose reading for those individuals All real numbers between 200 and b, where b is
having glucose readings equal to or greater than the largest glucose reading in all such individuals
200

43
Definitions of Scaled Data
• Understanding the nature of data and how to represent it can affect the types
of statistical tests possible.

• Nominal Scale – data consists of names, labels, or categories. Cannot be


arranged in an ordering scheme. No arithmetic operations are performed for
nominal data.

• Ordinal Scale – data is arranged in some order, but differences between


data values either cannot be determined or are meaningless.

• Interval Scale – data can be arranged in some order and for which
differences in data values are meaningful. The data can be arranged in an
ordering scheme and differences can be interpreted.

• Ratio Scale – data that can be ranked and for which all arithmetic operations
including division can be performed. (division by zero is of course excluded)
Ratio level data has an absolute zero and a value of zero indicates a
complete absence of the characteristic of interest.
44
Nominal Scale

Qualitative Variable Possible nominal level data values for


the variable

Blood Types A, B, AB, O

State of Residence Alabama, …, Wyoming

Country of Birth United States, China, other

Time to weigh in!


45
Ordinal Scale

Qualitative Variable Possible Ordinal level data


values

Automobile Sizes Subcompact, compact,


intermediate, full size, luxury

Product rating Poor, good, excellent

Baseball team classification Class A, Class AA, Class AAA,


Major League

46
Interval Scale

Interval Variable Possible Scores

IQ scores of students in 100…


BlackBelt Training (the difference between scores
is measurable and has
meaning but a difference of 20
points between 100 and 120
does not indicate that one
student is 1.2 times more
intelligent )

47
Ratio Scale

Ratio Variable Possible Scores

Grams of fat consumed per adult in the 0…


United States (If person A consumes 25 grams of fat and
person B consumes 50 grams, we can say
that person B consumes twice as much fat
as person A. If a person C consumes zero
grams of fat per day, we can say there is a
complete absence of fat consumed on that
day. Note that a ratio is interpretable and
an absolute zero exists.)

48
Converting Attribute Data to Continuous Data

• Continuous Data is always more desirable

• In many cases Attribute Data can be converted to


Continuous

• Which is more useful?


– 15 scratches or Total scratch length of 9.25”
– 22 foreign materials or 2.5 fm/square inch
– 200 defects or 25 defects/hour

49
Descriptive Statistics

Measures of Location (central tendency)


– Mean
– Median
– Mode

Measures of Variation (dispersion)


– Range
– Interquartile Range
– Standard deviation
– Variance
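As a sketch of the measures listed above, Python's standard `statistics` module can compute them for the call-waiting-time data shown on the earlier Continuous Data slide (20 observations):

```python
import statistics

# Call waiting times in seconds (from the Continuous Data slide)
waiting = [98, 103, 100, 100, 99, 101, 97, 102, 100, 99,
           102, 98, 101, 101, 99, 100, 101, 99, 100, 102]

print(statistics.mean(waiting))     # 100.1
print(statistics.median(waiting))   # 100.0
print(statistics.mode(waiting))     # 100
print(max(waiting) - min(waiting))  # range: 6
print(statistics.stdev(waiting))    # sample standard deviation
```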

50
Descriptive Statistics

Open the MINITAB™ Project “Measure Data Sets.mpj” and


select the worksheet “basicstatistics.mtw”

51
Measures of Location
Mean is:
• Commonly referred to as the average.
• The arithmetic balance point of a distribution of data.
Stat>Basic Statistics>Display Descriptive Statistics…>Graphs…
>Histogram of data, with normal curve

[Histogram (with Normal Curve) of Data: Mean 5.000, StDev 0.01007, N 200]

Descriptive Statistics: Data

Variable  N    N*  Mean    SE Mean   StDev   Minimum  Q1      Median  Q3      Maximum
Data      200  0   4.9999  0.000712  0.0101  4.9700   4.9900  5.0000  5.0100  5.0200

52
Measures of Location
Median is:
• The mid-point, or 50th percentile, of a distribution of data.
• Arrange the data from low to high, or high to low.
– It is the single middle value in the ordered list if there is an odd
number of observations
– It is the average of the two middle values in the ordered list if there
are an even number of observations

[Histogram (with Normal Curve) of Data: Mean 5.000, StDev 0.01007, N 200]

Descriptive Statistics: Data

Variable  N    N*  Mean    SE Mean   StDev   Minimum  Q1      Median  Q3      Maximum
Data      200  0   4.9999  0.000712  0.0101  4.9700   4.9900  5.0000  5.0100  5.0200

53
Measures of Location
Trimmed Mean is a:
Compromise between the Mean and Median.
• The Trimmed Mean is calculated by eliminating a specified percentage
of the smallest and largest observations from the data set and then
calculating the average of the remaining observations
• Useful for data with potential extreme values.

Stat>Basic Statistics>Display Descriptive Statistics…>Statistics…> Trimmed Mean

Descriptive Statistics: Data

Variable  N    N*  Mean    SE Mean   TrMean  StDev   Minimum  Q1      Median  Q3      Maximum
Data      200  0   4.9999  0.000712  4.9999  0.0101  4.9700   4.9900  5.0000  5.0100  5.0200
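The trimming idea can be sketched in plain Python. The 5% default and the rounding rule here are assumptions for illustration, not a reimplementation of Minitab's exact TrMean:

```python
def trimmed_mean(data, trim_fraction=0.05):
    """Drop the smallest and largest trim_fraction of observations,
    then average what remains."""
    ordered = sorted(data)
    k = int(len(ordered) * trim_fraction)  # points trimmed from each end
    kept = ordered[k:len(ordered) - k] if k else ordered
    return sum(kept) / len(kept)

# An extreme value pulls the plain mean up, but barely moves the trimmed mean
data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 100]
print(sum(data) / len(data))     # 14.5
print(trimmed_mean(data, 0.10))  # 5.5 (drops 1 and 100)
```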

54
Measures of Location
Mode is:
The most frequently occurring value in a distribution of data.

Mode = 5

[Histogram (with Normal Curve) of Data: Mean 5.000, StDev 0.01007,
N 200; the tallest bar is at 5.00]

55
Measures of Variation
Range is the:
Difference between the largest observation and the smallest
observation in the data set.
• A small range would indicate a small amount of variability and a large
range a large amount of variability.

Descriptive Statistics: Data

Variable  N    N*  Mean    SE Mean   StDev   Minimum  Q1      Median  Q3      Maximum
Data      200  0   4.9999  0.000712  0.0101  4.9700   4.9900  5.0000  5.0100  5.0200

Interquartile Range is the:


Difference between the 75th percentile and the 25th percentile.

Use Range or Interquartile Range when the data distribution is Skewed.
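Both measures can be sketched with the standard `statistics` module; the data set below is hypothetical, chosen so the quartiles land on whole observations:

```python
import statistics

data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]

# Range: largest observation minus smallest observation
data_range = max(data) - min(data)

# statistics.quantiles with n=4 returns [Q1, Q2, Q3]
q1, q2, q3 = statistics.quantiles(data, n=4)
iqr = q3 - q1  # Interquartile Range: 75th percentile - 25th percentile

print(data_range)  # 10
print(iqr)         # 6.0
```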

56
Measures of Variation
Standard Deviation is:
Equivalent of the average deviation of values from the Mean for a
distribution of data.
A “unit of measure” for distances from the Mean.
Use when data are symmetrical.

Sample:      s = √( Σ(xᵢ − x̄)² / (n − 1) )
Population:  σ = √( Σ(xᵢ − µ)² / N )

Descriptive Statistics: Data

Variable  N    N*  Mean    SE Mean   StDev   Minimum  Q1      Median  Q3      Maximum
Data      200  0   4.9999  0.000712  0.0101  4.9700   4.9900  5.0000  5.0100  5.0200

Cannot calculate population Standard Deviation because this is sample data.
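The sample-versus-population distinction (n − 1 versus N in the denominator) is built into Python's `statistics` module. A sketch with a small hypothetical data set:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # hypothetical sample, mean = 5

# Sample statistics divide by n - 1; population statistics divide by N
print(statistics.stdev(data))      # sample s
print(statistics.pstdev(data))     # population sigma: 2.0
print(statistics.variance(data))   # sample s^2
print(statistics.pvariance(data))  # population sigma^2: 4.0
```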

57
Measures of Variation
Variance is the:
Average squared deviation of each individual data point from the
Mean.

Sample:      s² = Σ(xᵢ − x̄)² / (n − 1)
Population:  σ² = Σ(xᵢ − µ)² / N

58
Normal Distribution
The Normal Distribution is the most recognized distribution in
statistics.

What are the characteristics of a Normal Distribution?


– Only random error is present
– Process free of assignable cause
– Process free of drifts and shifts

So what is present when the data is Non-normal?

59
The Normal Curve

The normal curve is a smooth, symmetrical, bell-shaped curve,


generated by the density function.

It is the most useful continuous probability model as many


naturally occurring measurements such as heights, weights,
etc. are approximately Normally Distributed.

60
Normal Distribution
Each combination of Mean and Standard Deviation generates a
unique Normal curve:

“Standard” Normal Distribution

– Has a μ = 0, and σ = 1

– Data from any Normal Distribution can be made to


fit the standard Normal by converting raw scores
to standard scores.

– Z-scores measure how many Standard Deviations from the


mean a particular data-value lies.

61
Normal Distribution
The area under the curve between any 2 points represents the
proportion of the distribution between those points.

The area between the


Mean and any other point
depends upon the Standard
Deviation.

Convert any raw score to a Z-score using the formula:

Z = (x − µ) / σ

Refer to a set of Standard Normal Tables to find the proportion
between µ and x.
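In place of printed tables, the standard Normal CDF is available in Python's `statistics` module. The scores below (µ = 100, σ = 10, x = 120) are hypothetical values for illustration:

```python
from statistics import NormalDist

mu, sigma = 100, 10  # hypothetical distribution
x = 120

z = (x - mu) / sigma        # Z-score: standard deviations from the mean
below = NormalDist().cdf(z)  # proportion of the distribution below x

print(z)               # 2.0
print(round(below, 4))  # 0.9772
```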
62
The Empirical Rule
The Empirical Rule…

[Normal curve marked from −6σ to +6σ]

68.27 % of the data will fall within +/- 1 standard deviation


95.45 % of the data will fall within +/- 2 standard deviations
99.73 % of the data will fall within +/- 3 standard deviations
99.9937 % of the data will fall within +/- 4 standard deviations
99.999943 % of the data will fall within +/- 5 standard deviations
99.9999998 % of the data will fall within +/- 6 standard deviations
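The first three Empirical Rule percentages can be verified with the standard Normal CDF, since P(|Z| ≤ k) = 2·CDF(k) − 1:

```python
from statistics import NormalDist

# P(|Z| <= k) = 2 * CDF(k) - 1 for the standard Normal Distribution
for k in (1, 2, 3):
    pct = 100 * (2 * NormalDist().cdf(k) - 1)
    print(k, round(pct, 2))
# 1 68.27
# 2 95.45
# 3 99.73
```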

63
The Empirical Rule (cont.)
No matter what the shape of your distribution is, as you travel 3 Standard
Deviations from the Mean, the probability of occurrence beyond that point
begins to converge to a very low number.

64
Why Assess Normality?
While many processes in nature behave according to the Normal
Distribution, many processes in business, particularly in the areas
of service and transactions, do not.

There are many types of distributions.

There are many statistical tools that assume Normal Distribution


properties in their calculations.

So understanding just how “Normal” the data are will impact how
we look at the data.

65
Tools for Assessing Normality

The shape of any Normal curve can be calculated based on


the Normal Probability density function.

Tests for Normality basically compare the shape of the


calculated curve to the actual distribution of your data points.

For the purposes of this training, we will focus on 2 ways in


MINITAB™ to assess Normality:
– The Anderson-Darling test
– The Normal Probability Plot

Watch that curve!

66
Goodness-of-Fit
The Anderson-Darling test uses an empirical distribution function.

[Chart: Cumulative Percent vs. Raw Data Scale, comparing the curve
expected for a Normal Distribution with the actual data; the two curves
depart by as much as 20%]

The Anderson-Darling Goodness-of-Fit test assesses the magnitude of the
departures of the actual data from the expected Normal Distribution,
using an Observed minus Expected formula.

67
The Normal Probability Plot

[Probability Plot of Amount (Normal): Mean 84.69, StDev 7.913, N 70,
AD 0.265, P-Value 0.684]

The Anderson-Darling test is a good litmus test for Normality: if the
P-value is more than 0.05, your data are Normal enough for most purposes.

68
Descriptive Statistics

The Anderson-Darling test also appears in this output. Again,


if the P-value is greater than .05, assume the data are Normal.

The reasoning behind the


decision to assume Normality
based on the P-value will be
covered in the Analyze
Phase. For now, just accept
this as a general guideline.

69
Anderson-Darling Caveat
Use the Anderson Darling column to generate these graphs.

[Probability Plot and Graphical Summary for the Anderson Darling column:
Mean 50.031, StDev 4.951, Variance 24.511, N 500, AD 0.18,
P-Value 0.921, Skewness −0.06, Kurtosis −0.18, Minimum 35.727,
Median 50.006, Maximum 62.823]

In this case, both the Histogram and the Normality Plot look very “normal”. However, because
the sample size is so large, the Anderson-Darling test is very sensitive and any slight deviation
from Normal will cause the P-value to be very low. Again, the topic of sensitivity will be covered
in greater detail in the Analyze Phase.

For now, just assume that if N > 100 and the data look
Normal, then they probably are.

70
If the Data Are Not Normal, Don’t Panic!

• Normal Data are not common in the transactional world.

• There are lots of meaningful statistical tools you can use to analyze
your data (more on that later).

• It just means you may have to think about your data in a slightly
different way.

Don’t touch that button!


71
Normality Exercise

Exercise objective: To demonstrate how to test


for Normality.

1. Generate Normal Probability Plots and the
graphical summary using the “Descriptive
Statistics.mtw” file.

2. Use only the columns Dist A and Dist D.

3. Answer the following quiz questions based on


your analysis of this data set.

72
Isolating Special Causes from Common
Causes

Special Cause: Variation is caused by known factors that result in a


non-random distribution of output. Also referred to as “Assignable
Cause”.

Common Cause: Variation caused by unknown factors resulting in a


steady but random distribution of output around the average of the
data. It is the variation left over after Special Cause variation has been
removed and typically (not always) follows a Normal Distribution.

If we know that the basic structure of the data should follow a Normal
Distribution, but plots from our data shows otherwise; we know the
data contain Special Causes.

Special Causes = Opportunity

73
Introduction to Graphing
The purpose of Graphing is to:
• Identify potential relationships between variables.
• Identify risk in meeting the critical needs of the Customer,
Business and People.
• Provide insight into the nature of the X’s which may or may not
control Y.
• Show the results of passive data collection.

In this section we will cover…


1. Box Plots
2. Scatter Plots
3. Dot Plots
4. Time Series Plots
5. Histograms

74
Data Sources
Data sources are suggested by many of the tools that have
been covered so far:
– Process Map
– X-Y Matrix
– Fishbone Diagrams
– FMEA

Examples are:
1. Time: shift, day of the week, week of the month, season of the year
2. Location/position: facility, region, office
3. Operator: training, experience, skill, adherence to procedures
4. Any other sources?

75
Graphical Concepts

The characteristics of a good graph include:


• Variety of data
• Selection of
– Variables
– Graph
– Range

Information to interpret relationships

Explore quantitative relationships

76


The Histogram
A Histogram displays data that have been summarized into intervals. It
can be used to assess the symmetry or Skewness of the data.
[Histogram of the “Histogram” data column: frequency bars over the
range 98 to 103]

To construct a Histogram, the horizontal axis is divided into equal


intervals and a vertical bar is drawn at each interval to represent its
frequency (the number of values that fall within the interval).
77
Histogram Caveat
All the Histograms below were generated using random samples of the
data from the worksheet “Graphing Data.mtw”.

[Histograms of H1_20, H2_20, H3_20, H4_20: four small random samples of
the same data, each looking quite different]

Be careful not to determine Normality simply from a Histogram plot, if the


sample size is low the data may not look very Normal.

78
Variation on a Histogram
Using the worksheet “Graphing Data.mtw” create a simple
Histogram for the data column called granular.

[Histogram of Granular: frequency bars over the range 44 to 56]

79
Dot Plot
The Dot Plot can be a useful alternative to the Histogram especially if
you want to see individual values or you want to brush the data.

[Dotplot of Granular: individual values from 44 to 56]

80
Box Plot
Box Plots summarize data about the shape, dispersion and center of the
data and also help spot outliers.

Box Plots require that one of the variables, X or Y, be categorical or Discrete


and the other be Continuous.

A minimum of 10 observations should be included in generating the Box Plot.

[Diagram: the box spans the 25th to 75th percentiles and contains the middle 50% of the data, with a line at the 50th percentile (Median) and a symbol at the Mean; whiskers extend to the most extreme values within 1.5 x the Interquartile Range of the box (or to the minimum/maximum values); points beyond the whiskers are Outliers]
81
Box Plot Anatomy

Outlier (*)
Upper Limit: Q3 + 1.5(Q3 - Q1)
Upper Whisker
Q3: 75th Percentile
Q2: Median (50th Percentile)
Q1: 25th Percentile
Lower Whisker
Lower Limit: Q1 - 1.5(Q3 - Q1)
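The anatomy above can be computed directly. This sketch uses one common median-of-halves quartile convention (MINITABTM interpolates slightly differently, so boundary values may differ); the data are made up:

```python
def quartiles(data):
    """Median-of-halves quartiles (one common convention)."""
    s = sorted(data)
    n = len(s)

    def median(xs):
        m = len(xs)
        mid = m // 2
        return xs[mid] if m % 2 else (xs[mid - 1] + xs[mid]) / 2

    q1 = median(s[: n // 2])       # 25th percentile
    q2 = median(s)                 # 50th percentile (Median)
    q3 = median(s[(n + 1) // 2:])  # 75th percentile
    return q1, q2, q3

data = [5, 7, 8, 9, 10, 11, 12, 13, 30]          # made-up sample
q1, q2, q3 = quartiles(data)
iqr = q3 - q1
upper_limit = q3 + 1.5 * iqr
lower_limit = q1 - 1.5 * iqr
outliers = [x for x in data if x < lower_limit or x > upper_limit]
print(outliers)   # -> [30]
```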

82
Box Plot Examples
What can you tell about the data expressed in a Box Plot?

[Boxplot of Glucoselevel vs SubjectID: Glucoselevel (50–225) for SubjectID 1–9]

[Boxplot of Cholesterol Levels (100–350) for 2-Day, 4-Day and 14-Day groups]

Eat this - then check the Box Plot!
83
Box Plot Example

84
Box Plot Example
The data shows the setup cycle time to complete “Lockout – Tagout”
for 3 individuals in the department.

[Box Plot: Setup Cycle Time for "Lockout - Tagout" (5.0–20.0) by operator: Brian, Greg, Shree]

85
Individual Value Plot Enhancement
The individual value plot shows the individual data points that
are represented in the Box Plot.

[Individual Value Plot of Brian, Greg, Shree: the same data (5.0–20.0) shown as individual points]

86
Attribute Y Box Plot
Box Plot with an Attribute Y (pass/fail) and a Continuous X
Graph> Box Plot…One Y, With Groups…Scale…Transpose value and
category scales

87
Attribute Y Box Plot

[Boxplot of Hydrogen Content (215.0–232.5) vs Pass/Fail]

88
Individual Value Plot
The Individual Value Plot when used with a Categorical X or Y
enhances the information provided in the Box Plot:
– Recall the inherent problem with the Box Plot when a
bimodal distribution exists (Box Plot looks perfectly
symmetrical)
– The Individual Value Plot will highlight the problem
Stat>ANOVA> One-Way (Unstacked )>Graphs…Individual value plot, Box Plots of data

[Side by side: Boxplot and Individual Value Plot of Weibull, Normal and Bi Modal data (0–30) — the Box Plots look similar, but the Individual Value Plot reveals the bimodal column]

89
Jitter Example
Once your graph is created, click once on any of the data points (that
action should select all the data points).
Then go to MINITAB™ menu path: “Editor> Edit Individual
Symbols…Jitter…”
Increase the Jitter in the x-direction to .075, click OK, then click
anywhere on the graph except on the data points to see the results of
the change.
[Individual Value Plot of Weibull, Normal, Bi Modal (0–30) with jitter applied]

90
Time Series Plot
Time Series Plots allow you to examine data over time.
Depending on the shape and frequency of patterns in the
plot, several X’s can be found as critical or eliminated.
Graph> Time Series Plot> Simple...

[Time Series Plot of Time 1: values 597–602 across 100 observations]

91
Time Series Example
Looking at the Time Series Plot below, the response appears to be
very dynamic.

[Time Series Plot of Time 1, shown again: values 597–602 across 100 observations]

What other characteristic is present?

92
Time Series Example (Cont.)
Let’s look at some other Time Series Plots.
What is happening within each plot?
What is different between the two plots?
Graph> Time Series Plot> Multiple...(use variables Time 2 and Time 3)

[Time Series Plot of Time 2 and Time 3: values 596–605 across 100 observations]

93
Curve Fitting Time Series
MINITAB™ allows you to add a smoothed line to your time series
based on a smoothing technique called Lowess.
Lowess means Locally Weighted Scatterplot Smoother.
Graph> Time Series Plot> Simple…(select variable Time 3)…Data View…Smoother…Lowess

[Time Series Plot of Time 3 with a Lowess smoother: values 596–605 across 100 observations]

94
Summary

At this point, you should be able to:

• Explain the various statistics used to express location and spread of data
• Describe characteristics of a Normal Distribution
• Explain Special Cause variation
• Use data to generate various graphs and make interpretations based on their output

95
Measure Phase
Measurement System Analysis

96
Measurement System Analysis

Welcome to Measure

Process Discovery

Six Sigma Statistics

Measurement System Analysis

Basics of MSA

Variables MSA

Attribute MSA

Process Capability

Wrap Up & Action Items

97
Introduction to MSA

So far we have learned that the heart and soul of Six Sigma is
that it is a data-driven methodology.
– How do you know that the data you have used are accurate and precise?
– How do you know if a measurement is repeatable and reproducible?

How good are these?

Measurement System Analysis


or
MSA

98
Measurement System Analysis

MSA is a mathematical procedure to quantify variation introduced to a process or product by the act of measuring.

[Measurement Process diagram: inputs are the Item to be Measured, Operator, Reference, Equipment, Procedure and Environment; the output is the Measurement]

The item to be measured can be a physical part, document or a scenario for customer service.
Operator can refer to a person or can be different instruments measuring the same products.
Reference is a standard that is used to calibrate the equipment.
Procedure is the method used to perform the test.
Equipment is the device used to measure the product.
Environment is the surroundings where the measures are performed.

99
Measurement Purpose

In order to be worth collecting, measurements must provide value


- that is, they must provide us with information and ultimately,
knowledge

The question…
What do I need to know?

…must be answered before we begin to consider issues of


measurements, metrics, statistics, or data collection systems

Too often, organizations build complex data collection and


information management systems without truly understanding
how the data collected and metrics calculated actually benefit the
organization.

100
Purpose
The purpose of MSA is to assess the error due to measurement
systems.

The error can be partitioned into specific sources:


– Precision
• Repeatability - within an operator or piece of equipment
• Reproducibility - operator to operator or attribute gage to
attribute gage
– Accuracy
• Stability - accuracy over time
• Linearity- accuracy throughout the measurement range
• Resolution
• Bias – Off-set from true value
– Constant Bias
– Variable Bias – typically seen with electronic equipment,
amount of Bias changes with setting levels

101
Accuracy and Precision
Accurate but not precise - On average, the shots are in the center of the target, but there is a lot of variability.

Precise but not accurate - The average is not on the center, but the variability is small.

102
MSA Uses
MSA can be used to:

Compare internal inspection standards with the standards of your customer.

Highlight areas where calibration training is required.

Provide a method to evaluate inspector training effectiveness as well as


serves as an excellent training tool.

Provide a great way to:


– Compare existing measurement equipment.
– Qualify new inspection equipment.

103
Why MSA?

Measurement System Analysis is important to:


• Study the % of variation in our process that is caused by our
measurement system.
• Compare measurements between operators.
• Compare measurements between two (or more) measurement
devices.
• Provide criteria to accept new measurement systems (consider
new equipment).
• Evaluate a suspect gage.
• Evaluate a gage before and after repair.
• Determine true process variation.
• Evaluate effectiveness of training program.

104
Appropriate Measures

Appropriate Measures are:

• Sufficient – available to be measured regularly

• Relevant –help to understand/isolate the problems

• Representative - of the process across shifts and people

• Contextual – collected with other relevant information that


might explain process variability.

105
Poor Measures

Poor Measures can result from:


• Poor or non-existent operational definitions
• Difficult measures
• Poor sampling
• Lack of understanding of the definitions
• Inaccurate, insufficient or non-calibrated measurement devices

Measurement Error compromises decisions that affect:


– Customers
– Producers
– Suppliers

106
Examples of What to Measure
Examples of what and when to measure:
• Primary and secondary metrics
• Decision points in Process Maps
• Any and all gauges, measurement devices, instruments, etc
• “X’s” in the process
• Prior to Hypothesis Testing
• Prior to modeling
• Prior to planning designed experiments
• Before and after process changes
• To qualify operators

MSA is a Show Stopper!!!


107
Components of Variation
Whenever you measure anything, the variation that you observe can be
segmented into the following components…

Observed Variation
– Unit-to-unit (true) Variation
– Measurement System Error
  • Precision
    – Repeatability
    – Reproducibility
  • Accuracy
    – Stability
    – Bias
    – Linearity

All measurement systems have error. If you don’t know how much of the variation
you observe is contributed by your measurement system, you cannot make
confident decisions.

If you were one speeding ticket away from losing your license, how fast
would you be willing to drive in a school zone?

108


Precision
A precise metric is one that returns the same value of a given
attribute every time an estimate is made.

Precise data are independent of who estimates them or when the


estimate is made.

Precision can be partitioned into two components:


– Repeatability
– Reproducibility

Repeatability and Reproducibility = Gage R&R

109
Repeatability
Repeatability is the variation in measurements obtained with one
measurement instrument used several times by one appraiser while
measuring the identical characteristic on the same part.

Repeatability

For example:
– Manufacturing: One person measures the purity of multiple
samples of the same vial and gets different purity measures.
– Transactional: One person evaluates a contract multiple times
(over a period of time) and makes different determinations of
errors.
110
Reproducibility
Reproducibility is the variation in the average of the measurements made
by different appraisers using the same measuring instrument when
measuring the identical characteristic on the same part.

[Diagram: Reproducibility shown as two measurement distributions, one for Operator A and one for Operator B, with different averages]

For example:
– Manufacturing: Different people perform purity test on samples
from the same vial and get different results.
– Transactional: Different people evaluate the same contract and
make different determinations.

111
Time Estimate Exercise

Exercise objective: Demonstrate how well you can


estimate a 10 second time interval.

1. Pair up with an associate.


2. One person will say start and stop to indicate how long they think 10 seconds lasts. Do this 6 times.
3. The other person will have a watch with a second
hand to actually measure the duration of the estimate.
Record the value where your partner can’t see it.
4. Switch tasks with partner and do it 6 times also.
5. Record all estimates, what do you notice?

112
Accuracy
Accuracy is the difference between the observed average of the measurements and a reference value.
– When a metric or measurement system consistently over or under
estimates the value of an attribute, it is said to be “inaccurate”
Accuracy can be assessed in several ways:
– Measurement of a known standard
– Comparison with another known measurement method
– Prediction of a theoretical value
What happens if we don’t have standards, comparisons or theories?
[Diagram: Accuracy is the gap between the average of the Measurement distribution and the True Average]

Warning: do not assume your metrology reference is gospel.
113
Accuracy Against a Known Standard
In transactional processes, the measurement system can consist of a
database query.
– For example, you may be interested in measuring product returns
where you will want to analyze the details of the returns over
some time period.
– The query will provide you all the transaction details.

However, before you invest a lot of time analyzing the data, you must
ensure the data has integrity.
– The analysis should include a comparison with known reference
points.
– For the example of product returns, the transaction details should
add up to the same number that appears on financial reports,
such as the income statement.

114
Accuracy vs. Precision
[Target diagrams: ACCURATE + PRECISE = BOTH; a fourth target, NEITHER, is neither accurate nor precise]

Accuracy relates to how close the average of the shots is to the Master or bull's-eye.

Precision relates to the spread of the shots, or Variance.

115
Bias
Bias is defined as the deviation of the measured value from the actual
value.

Calibration procedures can minimize and control Bias within acceptable limits. However, Bias can never be fully eliminated due to material wear and tear!

[Diagram: measurement distributions offset from the true value, illustrating Bias]
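The Bias calculation itself is just the observed average minus the reference value. A sketch with hypothetical numbers (the 10.00 mm standard and the readings are made up for illustration):

```python
# Hypothetical check of a gage against a certified 10.00 mm standard
reference = 10.00
measurements = [10.02, 10.05, 10.01, 10.03, 10.04,
                10.02, 10.06, 10.03, 10.02, 10.02]

observed_average = sum(measurements) / len(measurements)
bias = observed_average - reference   # positive: gage reads high
print(round(bias, 3))   # -> 0.03
```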

116
Stability
Stability of a gauge is defined as error (measured in terms of Standard Deviation) as a function of time. Environmental conditions such as cleanliness, noise, vibration, lighting, chemicals and wear and tear usually influence gauge instability. Gauges can be maintained to give a high degree of Stability, but instability can never be eliminated entirely. Gage Stability studies should be the first exercise after calibration procedures.

Control Charts are commonly used to track the Stability of a measurement system over time.

Stability is Bias characterized as a function of time (Drift)!

117
Linearity
Linearity is defined as the difference in Bias values throughout the
measurement range in which the gauge is intended to be used. This tells
you how accurate your measurements are through the expected range of
the measurements. It answers the question, "Does my gage have the
same accuracy for all sizes of objects being measured?"

Linearity = |Slope| * Process Variation

% Linearity = |Slope| * 100

[Plot: Bias (y) versus Reference Value (x) across the Low, Nominal and High ends of the range, with a fitted line y = a + b·x, where y is the Bias, x is the Reference Value, a is the Intercept and b is the Slope]
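The Linearity fit above is an ordinary least-squares line through (Reference Value, Bias) pairs. A sketch with hypothetical study data (the bias readings and the process variation figure are assumptions for illustration):

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx            # slope
    a = my - b * mx          # intercept
    return a, b

# Hypothetical linearity study: Bias measured at five reference values
reference = [2.0, 4.0, 6.0, 8.0, 10.0]
bias = [0.05, 0.03, 0.01, -0.01, -0.03]

a, b = fit_line(reference, bias)
process_variation = 6.0                  # assumed process spread
linearity = abs(b) * process_variation
pct_linearity = abs(b) * 100
```

A non-zero slope means the gage's accuracy changes across the measurement range; here the bias drifts from positive to negative as the reference value grows.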

118
Types of MSA’s
MSA’s fall into two categories:
– Attribute
– Variable
Attribute:
– Pass/Fail
– Go/No Go
– Document Preparation
– Surface imperfections
– Customer Service Response

Variable:
– Continuous scale
– Discrete scale
– Critical dimensions
– Pull strength
– Warp

Transactional projects typically have Attribute based measurement


systems.
Manufacturing projects generally use Variable studies more often,
but do use Attribute studies to a lesser degree.

119
Variable MSA’s
MINITAB™ calculates a column of variance components (VarComp) which are used to
calculate % Gage R&R using the ANOVA Method.

[Diagram: Measured Value versus True Value]

Estimates for a Gage R&R study are obtained by calculating the variance components for
each term and for error. Repeatability, Operator and Operator*Part components are
summed to obtain a total Variability due to the measuring system.
We use variance components to assess the Variation contributed by each source of
measurement error relative to the total Variation.

120


Session Window Cheat Sheet

% Contribution shows the Contribution of each source of Variation to the total Variation of the study. It is based on the variance components and is calculated by dividing each value in VarComp by the Total Variation, then multiplying the result by 100.

Use % Study Var when you are interested in comparing the measurement system Variation to the total Variation. % Study Var is calculated by dividing each value in Study Var by Total Variation and multiplying by 100. Study Var is calculated as 5.15 times the Standard Deviation for each source (5.15 is used because when data are Normally distributed, 99% of the data fall within a spread of 5.15 Standard Deviations; some MINITABTM output instead uses 6 Standard Deviations, as in the session window later in this module).
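As a sanity check on the arithmetic (this is a sketch, not MINITABTM itself; the variance components are the ones that appear in the session window later in this module):

```python
import math

# Variance components as reported in a Gage R&R session window
varcomp = {"Total Gage R&R": 0.0010458, "Part-To-Part": 0.0349273}
total_var = sum(varcomp.values())

for source, vc in varcomp.items():
    pct_contribution = 100 * vc / total_var          # share of total variance
    pct_study_var = 100 * math.sqrt(vc / total_var)  # share of total std. dev.
    study_var = 5.15 * math.sqrt(vc)                 # 99% spread for this source
    print(f"{source}: {pct_contribution:.2f}% contribution, "
          f"{pct_study_var:.2f}% study variation")
```

Run against these numbers, the sketch reproduces the session window's 2.91% / 97.09% contributions and 17.05% / 98.54% study variations. Note that % Study Var is a ratio of Standard Deviations, so it comes out the same whether Study Var is taken as 5.15 or 6 times the Standard Deviation.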

121
Session Window Cheat Sheet
Session Window explanations

When the process tolerance is entered in the system, MINITABTM calculates % Tolerance, which compares measurement system Variation to the customer specification. This allows us to determine the proportion of the process tolerance that is used by the Variation in the measurement system.

Always round down to the nearest whole number.

122
Number of Distinct Categories
The number of distinct categories tells you how many separate groups of parts the system is able to distinguish.

1 Data Category: Unacceptable for estimating process parameters and indices; only indicates whether the process is producing conforming or nonconforming parts.

2 - 4 Categories: Generally unacceptable for estimating process parameters and indices; only provides coarse estimates.

5 or more Categories: Recommended.
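The number of distinct categories is commonly computed as 1.41 (the square root of 2) times the part-to-part Standard Deviation divided by the Gage R&R Standard Deviation, truncated to a whole number. A sketch using the Standard Deviations from the session window later in this module:

```python
import math

def distinct_categories(sd_part_to_part, sd_gage_rr):
    """AIAG rule of thumb: 1.41 (sqrt 2) * PV / GRR, truncated to an integer."""
    return int(math.sqrt(2) * sd_part_to_part / sd_gage_rr)

# Standard Deviations from the session window later in this module
print(distinct_categories(0.186889, 0.032339))   # -> 8
```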

123
AIAG Standards for Gage Acceptance
Here are the Automotive Industry Action Group’s definitions
for Gage acceptance.
% Tolerance or % Study Variance    % Contribution     System is…
10% or less                        1% or less         Ideal
10% - 20%                          1% - 4%            Acceptable
20% - 30%                          5% - 9%            Marginal
30% or greater                     10% or greater     Poor
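The table translates directly into a small lookup. A sketch (how boundary values at exactly 10%, 20% and 30% are assigned is an assumption, since the table leaves it ambiguous):

```python
def aiag_rating(pct_tolerance=None, pct_contribution=None):
    """Classify a gage per the AIAG acceptance table above.

    Pass either %Tolerance / %Study Variance or %Contribution.
    Boundary values are counted in the better band (an assumption)."""
    if pct_tolerance is not None:
        bands = [(10, "Ideal"), (20, "Acceptable"), (30, "Marginal")]
        value = pct_tolerance
    else:
        bands = [(1, "Ideal"), (4, "Acceptable"), (9, "Marginal")]
        value = pct_contribution
    for limit, rating in bands:
        if value <= limit:
            return rating
    return "Poor"

print(aiag_rating(pct_tolerance=19.4))     # -> Acceptable
print(aiag_rating(pct_contribution=2.91))  # -> Acceptable
```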

124
MINITABTM Graphic Output Cheat Sheet
Gage name: Sample Study - Caliper
Date of study: 2-10-01
Reported by: B Wheat

[Gage R&R (ANOVA) for Data: graphic output with six panels - Components of Variation, R Chart by Operator, Xbar Chart by Operator, By Part, By Operator, and Operator*Part Interaction]

MINITABTM breaks down the Variation in the measurement system into specific sources. Each cluster of bars represents a source of variation. By default, each cluster will have two bars, corresponding to %Contribution and %StudyVar. If you add a tolerance and/or historical sigma, bars for %Tolerance and/or %Process are added.

In a good measurement system, the largest component of Variation is Part-to-Part variation. If instead you have large amounts of Variation attributed to Gage R&R, then corrective action is needed.

125
MINITABTM Graphic Output Cheat Sheet
[Gage R&R (ANOVA) graphic output shown again]

MINITABTM provides an R Chart and Xbar Chart by Operator. The R Chart consists of the following:
- The plotted points are the difference between the largest and smallest measurements on each part for each operator. If the measurements are the same then the range = 0.
- The Center Line is the grand average for the process.
- The Control Limits represent the amount of variation expected for the subgroup ranges. These limits are calculated using the variation within subgroups.

If any of the points on the graph go above the Upper Control Limit (UCL), then that operator is having problems consistently measuring parts. The Upper Control Limit value takes into account the number of measurements by an operator on a part and the variability between parts. If the operators are measuring consistently, then these ranges should be small relative to the data and the points should stay in control.

126
MINITABTM Graphic Output Cheat Sheet
[Gage R&R (ANOVA) graphic output shown again]

MINITABTM provides an R Chart and Xbar Chart by Operator. The Xbar Chart compares the part-to-part variation to repeatability. The Xbar Chart consists of the following:
- The plotted points are the average measurement on each part for each operator.
- The Center Line is the overall average for all part measurements by all operators.
- The Control Limits (UCL and LCL) are based on the variability between parts and the number of measurements in each average.

Because the parts chosen for a Gage R&R study should represent the entire range of possible parts, this graph should ideally show lack-of-control. Lack-of-control exists when many points are above the Upper Control Limit and/or below the Lower Control Limit.

In this case there are only a few points out of control, which indicates the measurement system is inadequate.

127
MINITABTM Graphic Output Cheat Sheet
[Gage R&R (ANOVA) graphic output shown again]

MINITABTM provides an interaction chart that shows the average measurements taken by each operator on each part in the study, arranged by part. Each line connects the averages for a single operator.

Ideally, the lines will follow the same pattern and the part averages will vary enough that differences between parts are clear.

Pattern / Means…:
- Lines are virtually identical: operators are measuring the parts the same.
- One line is consistently higher or lower than the others: that operator is measuring the parts consistently higher or lower than the others.
- Lines are not parallel or they cross: the operator's ability to measure a part depends on which part is being measured (an interaction between operator and part).

128
MINITABTM Graphic Output Cheat Sheet
[Gage R&R (ANOVA) graphic output shown again]

MINITABTM generates a “by operator” chart that helps us determine whether the measurements and their variability are consistent across operators.

The by operator graph shows all the study measurements arranged by operator. Dots represent the measurements; the circle-cross symbols represent the means. The red line connects the average measurements for each operator.

If the red line is… / Then…:
- Parallel to the x-axis: the operators are measuring the parts similarly.
- Not parallel to the x-axis: the operators are measuring the parts differently.

You can also assess whether the overall Variability in part measurement is the same using this graph. Is the spread in the measurements similar? Or is one operator more Variable than the others?

129
MINITABTM Graphic Output Cheat Sheet
[Gage R&R (ANOVA) graphic output shown again]

MINITABTM allows us to analyze all of the measurements taken in the study arranged by part. The measurements are represented by dots; the means by the circle-cross symbol. The red line connects the average measurements for each part.

Ideally:
• Multiple measurements for each individual part have little variation (the dots for one part will be close together)
• Averages will vary enough that differences between parts are clear

130
Practical Conclusions
For this example, the measuring system contributes a great deal to the overall
Variation, as confirmed by both the Gage R&R table and graphs.
The Variation due to the measurement system, as a percent of study Variation is
causing 92.21% of the Variation seen in the process.
By AIAG Standards this gage should not be used. By all standards, the
data being produced by this gage is not valid for analysis.

% Tolerance or % Study Variance    % Contribution     System is…
10% or less                        1% or less         Ideal
10% - 20%                          1% - 4%            Acceptable
20% - 30%                          5% - 9%            Marginal
30% or greater                     10% or greater     Poor

131
Repeatability and Reproducibility Problems
Repeatability Problems:
• Calibrate or replace gage.
• If only occurring with one operator, re-train.

Reproducibility Problems:
• Measurement machines
– Similar machines
• Ensure all have been calibrated and that the standard measurement method
is being utilized.
– Dissimilar machines
• One machine is superior.
• Operators
– Training and skill level of the operators must be assessed.
– Operators should be observed to ensure that standard procedures are followed.
• Operator/machine by part interactions
– Understand why the operator/machine had problems measuring some parts and
not others.
• Re-measure the problem parts
• Problem could be a result of gage linearity
• Problem could be fixture problem
• Problem could be poor gage design

132
Design Types
Crossed Design
• A Crossed Design is used only in non-destructive testing and assumes that all the
parts can be measured multiple times by either operators or multiple machines.
– Gives the ability to separate part-to-part Variation from measurement system
Variation.
– Assesses Repeatability and Reproducibility.
– Assesses the interaction between the operator and the part.

Nested Design
• A Nested Design is used for destructive testing (we will learn about this in MBB
training) and also situations where it is not possible to have all operators or machines
measure all the parts multiple times.
– Destructive testing assumes that all the parts within a single batch are identical
enough to claim they are the same.
– Nested designs are used to test measurement systems where it is not possible
(or desirable) to send operators with parts to different locations.
– Do not include all possible combinations of factors.
– Uses slightly different mathematical model than the Crossed Design.

133
Gage R & R Study
Gage R&R Study
– Is a set of trials conducted to assess the Repeatability and Reproducibility of
the measurement system.
– Multiple people measure the same characteristic of the same set of multiple
units multiple times (a crossed study)

– Example: 10 units are measured by 3 people. These units are then


randomized and a second measure on each unit is taken.

A Blind Study is extremely desirable.


– Best scenario: operator does not know the measurement is a part of a test
– At minimum: operators should not know which of the test parts they are
currently measuring.

NO, not that kind of R&R!

134
Variable Gage R & R Steps

Step 1: Call a team meeting and introduce the concepts of the Gage R&R
Step 2: Select parts for the study across the range of interest
– If the intent is to evaluate the measurement system throughout the process range,
select parts throughout the range
– If only a small improvement is being made to the process, the range of interest is
now the improvement range
Step 3: Identify the inspectors or equipment you plan to use for the analysis
– In the case of inspectors, explain the purpose of the analysis and that the inspection
system is being evaluated not the people
Step 4: Calibrate the gage or gages for the study
– Remember Linearity, Stability and Bias
Step 5: Have the first inspector measure all the samples once in random order
Step 6: Have the second inspector measure all the samples in random order
– Continue this process until all the operators have measured all the parts one time
– This completes the first replicate
Step 7: Repeat steps 5 and 6 for the required number of replicates
– Ensure there is always a delay between the first and second inspection
Step 8: Enter the data into MINITABTM and analyze your results
Step 9: Draw conclusions and make changes if necessary

135
Gage R & R Study

Part Allocation From Any Population

10 x 3 x 2 Crossed Design is shown


A minimum of two measurements/part/operator is required
Three is better!

[Crossed design diagram: Parts 1-10 are each measured in Trial 1 and Trial 2 by Operator 1, Operator 2 and Operator 3]

136
Data Collection Sheet
Create a data collection sheet for:
– 10 parts
– 3 operators
– 2 trials
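A sketch of generating such a sheet, with each operator measuring the parts in random order within each trial (the randomization scheme is an assumption; the Response column would be filled in during the study):

```python
import random

parts, operators, trials = 10, 3, 2

rows = []  # (trial, operator, part) - Response column left to be filled in
for trial in range(1, trials + 1):
    for operator in range(1, operators + 1):
        order = list(range(1, parts + 1))
        random.shuffle(order)   # each operator measures parts in random order
        for part in order:
            rows.append((trial, operator, part))

print(len(rows))   # 10 parts x 3 operators x 2 trials = 60 measurements
```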

137
Data Collection Sheet

138
The Data Collection Sheet

139
Gage R & R
Open the file “Gageaiag2.MTW” to view the worksheet.

Variables:
– Part
– Operator
– Response

140
Gage R & R

Use 1.0 for the tolerance.

141
Graphical Output
Looking at the “Components of Variation” chart, the Part to Part Variation needs to be
larger than Gage Variation.

If in the “Components of Variation” chart the “Gage R&R” bars are larger than the “Part-to-Part” bars, then most of your measurement Variation is in the measuring tool, i.e. “maybe the gage needs to be replaced”. The same concept applies to the “Response by Operator” chart: if there is extreme Variation within operators, then the training of the operators is suspect.

[Chart annotations: Part to Part Variation needs to be larger than Gage Variation; Operator Error]

142
Session Window

Two-Way ANOVA Table With Interaction


Source DF SS MS F P
Part 9 1.89586 0.210651 193.752 0.000
Operator 2 0.00706 0.003532 3.248 0.062
Part * Operator 18 0.01957 0.001087 1.431 0.188
Repeatability 30 0.02280 0.000760
Total 59 1.94529
Gage R&R
%Contribution
Source VarComp (of VarComp)
Total Gage R&R 0.0010458 2.91
Repeatability 0.0007600 2.11
Reproducibility 0.0002858 0.79
Operator 0.0001222 0.34
Operator*Part 0.0001636 0.45
Part-To-Part 0.0349273 97.09
Total Variation 0.0359731 100.00
Number of Distinct Categories = 8
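You can verify how the session window rolls up: Reproducibility = Operator + Operator*Part, and Total Gage R&R = Repeatability + Reproducibility. A quick check in Python using the variance components above:

```python
# Variance components from the session window above
repeatability = 0.0007600
operator = 0.0001222
operator_part = 0.0001636
part_to_part = 0.0349273

reproducibility = operator + operator_part
total_gage_rr = repeatability + reproducibility
total_variation = total_gage_rr + part_to_part

print(round(total_gage_rr, 7))                          # 0.0010458
print(round(100 * total_gage_rr / total_variation, 2))  # 2.91 (%Contribution)
```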

I can see clearly now!

143
Session Window

If the Variation due to Gage R&R is high, consider:
• Procedures revision?
• Gage update?
• Operator issue?
• Tolerance validation?

Guidelines (% Tolerance Gage R&R):
• 20% < % Tol GRR < 30% → Gage Unacceptable
• 10% < % Tol GRR < 20% → Gage Acceptable
• 1% < % Tol GRR < 10% → Gage Preferable

Study Var %Study Var %Tolerance


Source StdDev (SD) (6 * SD) (%SV) (SV/Toler)
Total Gage R&R 0.032339 0.19404 17.05 19.40
Repeatability 0.027568 0.16541 14.54 16.54
Reproducibility 0.016907 0.10144 8.91 10.14
Operator 0.011055 0.06633 5.83 6.63
Operator*Part 0.012791 0.07675 6.74 7.67
Part-To-Part 0.186889 1.12133 98.54 112.13
Total Variation 0.189666 1.13800 100.00 113.80

Number of Distinct Categories = 8
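The %Study Var and %Tolerance columns follow directly from the standard deviations (tolerance = 1.0, as entered in the Gage R&R dialog earlier). A quick Python check:

```python
sd_grr, sd_total = 0.032339, 0.189666    # StdDev values from the table above
tolerance = 1.0                          # entered in the Gage R&R dialog

study_var = 6 * sd_grr                       # 6 * SD spread of the gage
pct_study_var = 100 * sd_grr / sd_total      # ~17.05 %
pct_tolerance = 100 * study_var / tolerance  # ~19.40 %
print(round(pct_study_var, 2), round(pct_tolerance, 2))
```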

144
Signal Averaging
Signal Averaging can be used to reduce Repeatability error when a
better gage is not available.
– Uses average of repeat measurements.
– Uses Central Limit theorem to estimate how many repeat
measures are necessary.

Signal Averaging is a method to reduce Repeatability error in a poor gage when a better gage is not available or when a better gage is not possible.

145
Signal Averaging Example
Suppose SV/Tolerance is 35%.

SV/Tolerance must be 15% or less to use gage.

Suppose the Standard Deviation for one part measured by one person
many times is 9.5.

Determine what the new reduced Standard Deviation should be.

146
Signal Averaging Example
Determine sample size:

n = (current %SV/Tol ÷ target %SV/Tol)² = (35 ÷ 15)² ≈ 5.4, rounded up to 6.

Using the average of 6 repeated measures will reduce the Repeatability component of measurement error to the desired 15% level.

This method should be considered temporary!
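The sample size follows from the Central Limit Theorem: averaging n repeats shrinks the Repeatability Standard Deviation (and hence %Tolerance) by √n. A sketch of the calculation above:

```python
import math

current_pct_tol = 35.0   # SV/Tolerance today
target_pct_tol = 15.0    # maximum acceptable SV/Tolerance
sd_single = 9.5          # SD of one part, one person, many measurements

# Averaging n repeats divides the SD (and the %Tolerance) by sqrt(n)
n = math.ceil((current_pct_tol / target_pct_tol) ** 2)
sd_of_average = sd_single / math.sqrt(n)
print(n, round(sd_of_average, 2))  # 6 repeats; SD drops to ~3.88
```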

147
Paper Cutting Exercise
Exercise objective: Perform and Analyze a variable MSA
Study.

1. Cut a piece of paper into 12 different lengths that are all fairly close to one another but not too uniform. Label the back of each piece of paper to designate its “part number”.
2. Perform a variable Gage R&R study as outlined in this module. Use the following guidelines:
   – Number of parts: 12
   – Number of inspectors: 3
   – Number of trials: 5
3. Create a MINITAB™ data sheet and enter the data into the sheet as each inspector performs a measurement. If possible, assign one person to data collection.
4. Analyze the results and discuss with your mentor.

148
Attribute MSA
A methodology used to assess Attribute Measurement Systems.

Attribute Gage Error breaks down into Repeatability, Reproducibility and Calibration.

– They are used in situations where a continuous measure cannot be obtained.


– It requires a minimum of 5x as many samples as a continuous study.
– Disagreements should be used to clarify operational definitions for the
categories.
• Attribute data are usually the result of human judgment (which category
does this item belong in).
• When categorizing items (good/bad; type of call; reason for leaving) you
need a high degree of agreement on which way an item should be
categorized
149
Attribute MSA Purpose

The purpose of an Attribute MSA is:


– To determine if all inspectors use the same criteria to distinguish “pass” from “fail”.
– To assess your inspection standards against your customer’s requirements.
– To determine how well inspectors are conforming to themselves.
– To identify how inspectors are conforming to a “known master,” which includes:
• How often operators ship defective product.
• How often operators dispose of acceptable product.
– Discover areas where:
• Training is required.
• Procedures must be developed.
• Standards are not available.

An Attribute MSA is similar in many ways to the continuous MSA, including the
purposes. Do you have any visual inspections in your processes? In your experience
how effective have they been?

150
Visual Inspection Test
Take 60 seconds and count the number of times “F” appears in this paragraph.

The Necessity of Training Farm Hands for First Class Farms in the Fatherly
Handling of Farm Live Stock is Foremost in the Eyes of Farm Owners. Since
the Forefathers of the Farm Owners Trained the Farm Hands for First Class
Farms in the Fatherly Handling of Farm Live Stock, the Farm Owners Feel
they should carry on with the Family Tradition of Training Farm Hands of First
Class Farmers in the Fatherly Handling of Farm Live Stock Because they
Believe it is the Basis of Good Fundamental Farm Management.

151
How can we Improve Visual Inspection?
Visual Inspection can be improved by:
• Operator Training & Certification
• Develop Visual Aids/Boundary Samples
• Establish Standards
• Establish Set-Up Procedures
• Establish Evaluation Procedures
– Evaluation of the same location on each part.
– Each evaluation performed under the same lighting.
– Ensure all evaluations are made with the same standard.

Look closely now!

152
Excel Attribute R & R Template

Attribute Gage R & R Effectiveness – SCORING REPORT

DATE: 5/10/2006   NAME: Joe Smith   PRODUCT: My Gadget   BUSINESS: Unit 1
Attribute Legend (used in computations): 1 = pass, 2 = fail

                        Operator #1    Operator #2    Operator #3   All operators agree   All operators agree
Sample #  Known Attr.   Try 1  Try 2   Try 1  Try 2   Try 1  Try 2  within & between (Y/N)  with standard (Y/N)
1         pass          pass   pass    pass   pass    fail   fail           N                     N
2         pass          pass   pass    pass   pass    fail   fail           N                     N
3         fail          fail   fail    fail   pass    fail   fail           N                     N
4         fail          fail   fail    fail   fail    fail   fail           Y                     Y
5         fail          fail   fail    pass   fail    fail   fail           N                     N
6         pass          pass   pass    pass   pass    pass   pass           Y                     Y
7         pass          fail   fail    fail   fail    fail   fail           Y                     N
8         pass          pass   pass    pass   pass    pass   pass           Y                     Y
9         fail          pass   pass    pass   pass    pass   pass           Y                     N
10        fail          pass   pass    fail   fail    fail   fail           N                     N
11        pass          pass   pass    pass   pass    pass   pass           Y                     Y
12        pass          pass   pass    pass   pass    pass   pass           Y                     Y
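The two Y/N columns can be scored mechanically: “agree within and between” means all six readings of a sample match, and “agree with standard” additionally requires that common value to equal the known attribute. A Python sketch using the template data:

```python
standard = ["pass","pass","fail","fail","fail","pass",
            "pass","pass","fail","fail","pass","pass"]
# (try 1, try 2) per operator, one tuple per sample, from the template
op1 = [("pass","pass"),("pass","pass"),("fail","fail"),("fail","fail"),
       ("fail","fail"),("pass","pass"),("fail","fail"),("pass","pass"),
       ("pass","pass"),("pass","pass"),("pass","pass"),("pass","pass")]
op2 = [("pass","pass"),("pass","pass"),("fail","pass"),("fail","fail"),
       ("pass","fail"),("pass","pass"),("fail","fail"),("pass","pass"),
       ("pass","pass"),("fail","fail"),("pass","pass"),("pass","pass")]
op3 = [("fail","fail"),("fail","fail"),("fail","fail"),("fail","fail"),
       ("fail","fail"),("pass","pass"),("fail","fail"),("pass","pass"),
       ("pass","pass"),("fail","fail"),("pass","pass"),("pass","pass")]

agree_between = agree_standard = 0
for i, known in enumerate(standard):
    readings = set(op1[i]) | set(op2[i]) | set(op3[i])
    if len(readings) == 1:              # all six readings identical
        agree_between += 1
        if readings == {known}:         # and they match the standard
            agree_standard += 1

print(agree_between, agree_standard)    # 7 of 12 agree; 5 also match the standard
```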

153
Attribute: Precision Assessment Deliverable

Precision: Repeatability, Reproducibility        Precision + Bias: agreement with the known standard

(Chart labels: ACTUAL score, confidence RANGE.)

The green triangle represents the actual score of the appraiser. The range between the red squares is the Confidence Interval, which is a function of the operator's score and the size of the sample they have inspected.

154
Statistical Report

Annotations on the report:
• The Operator agrees with themselves on both trials.
• The Operator agrees on both trials with the known standard.
• All Operators agree within & between themselves.
• All Operators agree within & between themselves and with the standard.

155
M&M Exercise
Exercise objective: Perform and Analyze an Attribute MSA Study.

• You will need the following to complete the study:
  – A bag of M&Ms containing 50 or more “pieces”
  – The attribute value for each piece
  – Three or more inspectors

• Judge each M&M as pass or fail.
  – The customer has indicated that they want a bright and shiny M&M and that they like M’s.

• Pick 50 M&Ms out of a package.

• Enter results into either the Excel template or MINITAB™ and draw conclusions.

• The instructor will represent the customer for the Attribute score.

Example data sheet:
Number  Part  Attribute
1       M&M   Pass
2       M&M   Fail
3       M&M   Pass

156
Summary

At this point, you should be able to:

• Understand Precision & Accuracy

• Understand Bias, Linearity and Stability

• Understand Repeatability & Reproducibility

• Understand the impact of poor gage capability on product quality

• Identify the various components of Variation

• Perform the step-by-step methodology in Variable and Attribute MSA’s

157
Measure Phase
Process Capability

158
Process Capability

Welcome to Measure

Process Discovery

Six Sigma Statistics

Measurement System Analysis

Process Capability

Continuous Capability

Concept of Stability

Attribute Capability

Wrap Up & Action Items

159
Understanding Process Capability
Process Capability:

• The inherent ability of a process to meet the expectations of the customer without any additional efforts*.

• Provides insight as to whether the process has a:
  – Centering Issue (relative to specification limits)
  – Variation Issue
  – A combination of Centering and Variation
  – Inappropriate specification limits

• Allows for a baseline metric for improvement.

*Efforts: Time, Money, Manpower, Technology, and Manipulation

160
Capability as a Statistical Problem

Our Statistical Problem: What is the probability of our process producing a defect?

Define a Practical Problem → Create a Statistical Problem → Correct the Statistical Problem → Apply the Correction to the Practical Problem

161
Capability Analysis
Y = f(X): the X’s (Inputs) feed the Process Function, which produces the Y’s (Outputs).

Variation in the output data is the “Voice of the Process”; the requirements are the “Voice of the Customer” (LSL = 9.96, USL = 10.44 in the illustration).

Critical X(s): any variable(s) which exerts an undue influence on the important outputs (CTQ’s) of a process.

Capability Analysis numerically compares the VOP to the VOC: values falling outside the specification limits are defects.

162
Process Output Categories

Incapable: the process spread exceeds the specification limits (LSL to USL).
Off target: the process fits within the limits but its Average is away from the Target.
Capable and on target: the process fits within the limits and is centered on the Target.

163
Problem Solving Options – Shift the Mean

This involves finding the variables that will shift the process over to the target. This is usually the easiest option.

164
Problem Solving Options – Reduce Variation

This is typically not so easy to accomplish and occurs often in Six Sigma projects.

165
Problem Solving Options – Shift Mean &
Reduce Variation

This occurs often in Six Sigma projects.

166
Problem Solving Options – Move the Specification Limits

Obviously this implies making them wider, not narrower. Customers usually do not go for this option, but if they do…it’s the easiest!

167
Capability Studies

Capability Studies:
• Are intended to be regular, periodic, estimations of a process’s ability to
meet its requirements.
• Can be conducted on both Discrete and Continuous Data.
• Are most meaningful when conducted on stable, predictable processes.
• Are commonly reported as Sigma Level which is optimal (short term)
performance.
• Require a thorough understanding of the following:
– Customer’s or business’s specification limits
– Nature of long-term vs. short-term data
– Mean and Standard Deviation of the process
– Assessment of the Normality of the data (Continuous Data only)
– Procedure for determining Sigma level

168
Steps to Capability

#1 Select Output for Improvement
#2 Verify Customer Requirements
#3 Validate Specification Limits
#4 Collect Sample Data
#5 Determine Data Type (LT or ST)
#6 Check Data for Normality
#7 Calculate Z-Score, PPM, Yield, Capability (Cp, Cpk, Pp, Ppk)
169
Verifying the Specifications

Questions to consider:

• What is the source of the specifications?


– Customer requirements (VOC)
– Business requirements (target, benchmark)
– Compliance requirements (regulations)
– Design requirements (blueprint, system)

• Are they current? Likely to change?

• Are they understood and agreed upon?


– Operational definitions
– Deployed to the work force

170
Data Collection
Capability Studies should include “all” observations (100% sampling) for a specified period.

Short-term data:
• Collected across a narrow inference space.
• Daily, weekly; for one shift, machine, operator, etc.
• Is potentially free of special cause variation.
• Often reflects the optimal performance level.
• Typically consists of 30 – 50 data points.

Long-term data:
• Is collected across a broader inference space.
• Monthly, quarterly; across multiple shifts, machines, operators, etc.
• Subject to both common and special causes of variation.
• More representative of process performance over a period of time.
• Typically consists of at least 100 – 200 data points.

(Chart: several short-term studies of Fill Quantity, Lots 1 through 5, combine into one long-term study.)

171
Baseline Performance
Process Baseline: The average, long-term performance level of a process when all input variables are unconstrained.

(Chart: four short-term performance snapshots drift around the long-term baseline between the LSL, TARGET and USL.)

172
Components of Variation
Even stable processes will drift and shift over time by as much as 1.5
Standard Deviations on the average.

Long Term
Overall Variation

Short Term
Between Group Variation

Short Term
Within Group Variation

173
Sum of the Squares Formulas

SS total = SS between + SS within

(Chart: subgroup means shift over time; the within-group spread is the precision, i.e. short-term capability.)
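The sum-of-squares identity can be verified numerically on any grouped data set; a small Python check with made-up subgroups:

```python
# Three illustrative subgroups (made-up data)
groups = [[10.1, 10.2, 10.0], [10.4, 10.5, 10.6], [9.8, 9.9, 9.7]]
flat = [x for g in groups for x in g]
grand_mean = sum(flat) / len(flat)

ss_total = sum((x - grand_mean) ** 2 for x in flat)
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)

print(round(ss_total, 6), round(ss_between + ss_within, 6))  # equal
```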

174
Stability
A Stable Process is consistent over time. Time Series Plots and Control
Charts are the typical graphs used to determine stability.

At this point in the Measure Phase there is no reason to assume the process is stable.

(Chart: Time Series Plot of PC Data, values ranging roughly 30 to 70 across 480 observations. Tic toc… tic toc…)

175
Measures of Capability
Mathematically Cpk and Ppk are the same and Cp and Pp are the
same.

The only difference is the source of the data, Short-term and Long-
term, respectively.
Hope
– Cp and Pp
• What is Possible if your process is perfectly Centered
• The Best your process can be
• Process Potential (Entitlement)
Reality
– Cpk and Ppk
• The Reality of your process performance
• How the process is actually running
• Process Capability relative to specification limits

176
Capability Formulas

Cp = (USL – LSL) / (6 × s)
     where 6 × s is six times the sample Standard Deviation

Cpk = min [ (USL – x̄) / (3 × s) , (x̄ – LSL) / (3 × s) ]
     where x̄ is the Sample Mean and 3 × s is three times the sample Standard Deviation

LSL – Lower specification limit
USL – Upper specification limit

Note: Consider the “K” value the penalty for being off center.
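These formulas translate directly into code. A Python sketch, checked against the Supplier 1 figures that appear in the MINITAB™ example later in this module:

```python
def cp(lsl, usl, sd):
    """Process potential: what is possible if perfectly centered."""
    return (usl - lsl) / (6 * sd)

def cpk(lsl, usl, mean, sd):
    """Process capability relative to the nearer specification limit."""
    return min((usl - mean) / (3 * sd), (mean - lsl) / (3 * sd))

# Supplier 1: LSL=598, USL=602, mean=599.115, within StDev=0.559239
print(round(cp(598, 602, 0.559239), 2))            # 1.19
print(round(cpk(598, 602, 599.115, 0.559239), 2))  # 0.66
```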

177
MINITAB™ Example
Open worksheet “Camshaft.mtw”. Check for Normality.

By looking at the “P-values”, the data look to be Normal since P is greater than .05.

178
MINITAB™ Example
Create a Capability Analysis for both suppliers, assume long-term
data.
Note the subgroup size for this example is 5.
LSL=598 USL=602

179
MINITAB™ Example

Process Capability of Supplier 1

Process Data                       Potential (Within) Capability
  LSL             598                Cp    1.19
  Target          *                  CPL   0.66
  USL             602                CPU   1.72
  Sample Mean     599.115            Cpk   0.66
  Sample N        100              Overall Capability
  StDev (Within)  0.559239           Pp    1.10
  StDev (Overall) 0.604106           PPL   0.62
                                     PPU   1.59
                                     Ppk   0.62
                                     Cpm   *

Observed Performance     Exp. Within Performance   Exp. Overall Performance
  PPM < LSL 30000.00       PPM < LSL 23088.05        PPM < LSL 32467.79
  PPM > USL     0.00       PPM > USL     0.12        PPM > USL     0.90
  PPM Total 30000.00       PPM Total 23088.18        PPM Total 32468.68

180
MINITAB™ Example

Process Capability of Supplier 2

Process Data                       Potential (Within) Capability
  LSL             598                Cp    0.66
  Target          *                  CPL   0.68
  USL             602                CPU   0.64
  Sample Mean     600.061            Cpk   0.64
  Sample N        100              Overall Capability
  StDev (Within)  1.00606            Pp    0.58
  StDev (Overall) 1.14898            PPL   0.60
                                     PPU   0.56
                                     Ppk   0.56
                                     Cpm   *

Observed Performance     Exp. Within Performance   Exp. Overall Performance
  PPM < LSL  40000.00      PPM < LSL 20251.30        PPM < LSL 36425.88
  PPM > USL  60000.00      PPM > USL 26969.82        PPM > USL 45746.17
  PPM Total 100000.00      PPM Total 47221.11        PPM Total 82172.05

181
MINITAB™ Example
MINITAB™ has a selection to calculate Benchmark Z’s or Sigma levels
along with the Cp and Pp statistics. By selecting these the graph will
display the “Sigma Level” of your process!

Stat>Quality Tools>Capability Analysis>Normal…>Options…Benchmark Z’s (sigma level)

182
MINITAB™ Example

Process Capability of Supplier 1

Process Data                       Potential (Within) Capability
  LSL             598                Z.Bench 1.99
  Target          *                  Z.LSL   1.99
  USL             602                Z.USL   5.16
  Sample Mean     599.115            Cpk     0.66
  Sample N        100              Overall Capability
  StDev (Within)  0.559239           Z.Bench 1.85
  StDev (Overall) 0.604106           Z.LSL   1.85
                                     Z.USL   4.78
                                     Ppk     0.62
                                     Cpm     *

Observed Performance     Exp. Within Performance   Exp. Overall Performance
  PPM < LSL 30000.00       PPM < LSL 23088.05        PPM < LSL 32467.79
  PPM > USL     0.00       PPM > USL     0.12        PPM > USL     0.90
  PPM Total 30000.00       PPM Total 23088.18        PPM Total 32468.68

183
MINITAB™ Example

Process Capability of Supplier 2

Process Data                       Potential (Within) Capability
  LSL             598                Z.Bench 1.67
  Target          *                  Z.LSL   2.05
  USL             602                Z.USL   1.93
  Sample Mean     600.061            Cpk     0.64
  Sample N        100              Overall Capability
  StDev (Within)  1.00606            Z.Bench 1.39
  StDev (Overall) 1.14898            Z.LSL   1.79
                                     Z.USL   1.69
                                     Ppk     0.56
                                     Cpm     *

Observed Performance     Exp. Within Performance   Exp. Overall Performance
  PPM < LSL  40000.00      PPM < LSL 20251.30        PPM < LSL 36425.88
  PPM > USL  60000.00      PPM > USL 26969.82        PPM > USL 45746.17
  PPM Total 100000.00      PPM Total 47221.11        PPM Total 82172.05

184
Example Short Term

MINITAB™ assumes long-term data.
– When short-term data is taken, do one of the following:

Option 1: Enter “Subgroup size:” = total number of samples
Option 2: Go to “Options”, turn off “Within subgroup analysis”

185
Continuous Variable Caveats
Capability indices assume Normally Distributed data.
Always perform a Normality test before assessing capability.
(Capability output: with StDev(Within) 5.40199 the within indices look healthy, Z.Bench 2.54 and Cpk 0.91, but with StDev(Overall) 20.93958 the overall indices collapse to Z.Bench 0.07 and Ppk 0.24. The probability plot shows the data are not Normal: AD = 11.238, P-Value < 0.005, so these indices are not valid. Observed PPM Total is 866666.67 against an Exp. Within PPM Total of only 5520.18.)
186
Capability Steps

We can follow the steps for calculating capability for Continuous Data until we reach the question about data Normality…

#1 Select Output for Improvement
#2 Verify Customer Requirements
#3 Validate Specification Limits
#4 Collect Sample Data
#5 Determine Data Type (LT or ST)
#6 Check Data for Normality
#7 Calculate Z-Score, PPM, Yield, Capability (Cp, Cpk, Pp, Ppk)
187
Attribute Capability Steps
Notice the difference when we come to step 5…

#1 Select Output for Improvement
#2 Verify Customer Requirements
#3 Validate Specification Limits
#4 Collect Sample Data
#5 Calculate DPU
#6 Find Z-Score
#7 Convert Z-Score to Cp & Cpk

188
Z Scores
Z Score is a measure of the distance in Standard Deviations of a sample from
the Mean.

The Z Score effectively transforms the actual data into standard normal units.
By referring to a standard Z table you can estimate the area under the Normal
curve.
– Given an average of 50 with a Standard Deviation of 3, what is the proportion beyond the upper spec limit of 54?
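With Python's standard library the same lookup can be done without a printed Z table; a sketch of the example above:

```python
from statistics import NormalDist

mean, sd, usl = 50, 3, 54
z = (usl - mean) / sd                  # 1.33 Standard Deviations to the USL
p_beyond = 1 - NormalDist().cdf(z)     # area under the Normal curve past the USL
print(round(z, 2), round(p_beyond, 4)) # 1.33, about 9.1% beyond spec
```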

189
Z Table

190
Attribute Capability
Attribute data is always long-term in the shifted condition since it requires so many samples to get a good estimate with reasonable confidence.

Short-term Capability is typically reported, so a shifting method will be employed to estimate short-term Capability:
• If your data is long-term and you want to estimate ZST: add 1.5 to ZLT.
• If your data is short-term and you want to estimate ZLT: subtract 1.5 from ZST.

Sigma Level   Short-Term DPMO   Long-Term DPMO
    1             158655.3          691462.5
    2              22750.1          308537.5
    3               1350.0           66807.2
    4                 31.7            6209.7
    5                  0.3             232.7
    6                  0.0               3.4
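Each row of the DPMO table comes from a one-sided Normal tail area; a Python sketch that reproduces it (within rounding):

```python
from statistics import NormalDist

def dpmo(sigma_level, long_term=False):
    """One-sided defects per million; long-term applies the 1.5-sigma shift."""
    z = sigma_level - 1.5 if long_term else sigma_level
    return (1 - NormalDist().cdf(z)) * 1_000_000

for s in range(1, 7):
    print(s, round(dpmo(s), 1), round(dpmo(s, long_term=True), 1))
```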

191
Attribute Capability
There is a simple relationship between the Z Scores and the capability indices: by dividing Z short-term by 3 we can determine Cpk, and by dividing Z long-term by 3 we can determine Ppk.

Cpk = ZST / 3
Ppk = ZLT / 3

192
Attribute Capability Example

A customer service group is interested in estimating the Capability of their call center.

A total of 20,000 calls came in during the month, but some of them “dropped” before they were answered (the caller hung up).

Results of the call center data set:
  Samples = 20,000
  Defects = 2,666

They hung up….!

193
Attribute Capability Example

1. Calculate DPU: DPU = 2,666 / 20,000 = 0.1333
2. Look up the DPU value on the Z-Table
3. Find the Z-Score
4. Convert the Z Score to Cpk, Ppk

Example:
Look up ZLT: ZLT = 1.11
Convert ZLT to ZST = 1.11 + 1.5 = 2.61
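The whole calculation can be sketched in Python, with an inverse-Normal lookup in place of the printed Z table:

```python
from statistics import NormalDist

samples, defects = 20_000, 2_666
dpu = defects / samples                   # 0.1333 defects per unit
z_lt = NormalDist().inv_cdf(1 - dpu)      # long-term Z, ~1.11
z_st = z_lt + 1.5                         # shift to short-term, ~2.61
print(round(z_lt, 2), round(z_st, 2))
print(round(z_st / 3, 2), round(z_lt / 3, 2))  # Cpk ~0.87, Ppk ~0.37
```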

194
195
Summary

At this point, you should be able to:

• Estimate Capability for Continuous Data

• Estimate Capability for Attribute Data

• Describe the impact of Non-normal Data on the analysis


presented in this module for Continuous Capability

196
Measure Phase
Wrap Up and Action Items

197
Measure Phase Overview - The Goal
The goal of the Measure Phase is to:

• Define, explore and classify “X” variables using a variety of tools.


– Detailed Process Mapping
– Fishbone Diagrams
– X-Y Matrixes
– FMEA

• Demonstrate a working knowledge of Basic Statistics to use as a communication tool and a basis for inference.

• Perform Measurement Capability studies on output variables.

• Evaluate stability of process and estimate starting point capability.

198
Six Sigma Behaviors

• Being tenacious, courageous

• Being rigorous, disciplined

• Making data-based decisions

• Embracing change & continuous learning
• Sharing best practices

Walk the Walk!

Each “player” in the Six Sigma process must be A ROLE MODEL for the Six Sigma culture.

199
Measure Phase Deliverables
Listed below are the Measure Deliverables that each candidate should
present in a Power Point presentation to their mentor and project
champion.

At this point you should understand what is necessary to provide these deliverables in your presentation.
– Team Members (Team Meeting Attendance)
– Primary Metric
– Secondary Metric(s)
– Process Map – detailed
– FMEA
– X-Y Matrix
– Basic Statistics on Y
– MSA
– Stability graphs
– Capability Analysis
– Project Plan
– Issues and Barriers
200
Measure Phase - The Roadblocks
Look for the potential roadblocks and plan to address them before they
become problems:
– Team members do not have the time to collect data.
– Data presented is the best guess by functional managers.
– Process participants do not participate in the creation of the X-Y
Matrix, FMEA and Process Map.

It won’t all be
smooth
sailing…..

201
DMAIC Roadmap
Champion / Process Owner

Identify Problem Area

Determine Appropriate Project Focus


Define

Estimate COPQ

Establish Team
Measure

Assess Stability, Capability, and Measurement Systems

Identify and Prioritize All X’s


Analyze

Prove/Disprove Impact X’s Have On Problem


Improve

Identify, Prioritize, Select Solutions Control or Eliminate X’s Causing Problems

Implement Solutions to Control or Eliminate X’s Causing Problems


Control

Implement Control Plan to Ensure Problem Doesn’t Return

Verify Financial Impact

202
Measure Phase
Detailed Problem Statement Determined

Detailed Process Mapping

Identify All Process X’s Causing Problems (Fishbone, Process Map)

Select the Vital Few X’s Causing Problems (X-Y Matrix, FMEA)

Assess Measurement System

Repeatable & Reproducible?
  N → Implement Changes to Make the System Acceptable (then re-assess)
  Y → continue

Assess Stability (Statistical Control)

Assess Capability (Problem with Centering/Spread)

Estimate Process Sigma Level

Review Progress with Champion

Ready for Analyze

203
Measure Phase Checklist
Measure Questions
Identify critical X’s and potential failure modes
• Is the “as is” Process Map created?
• Are the decision points identified?
• Where are the data collection points?
• Is there an analysis of the measurement system?
• Where did you get the data?
Identify critical X’s and potential failure modes
• Is there a completed X-Y Matrix?
• Who participated in these activities?
• Is there a completed FMEA?
• Has the Problem Statement changed?
• Have you identified more COPQ?
Stability Assessment
• Is the “Voice of the Process” stable?
• If not, have the special causes been acknowledged?
• Can the good signals be incorporated into the process?
• Can the bad signals be removed from the process?
• How stable can you make the process?
Capability Assessment
• What is the short-term and long-term Capability of the process?
• What is the problem, one of centering, spread or some combination?
General Questions
• Are there any issues or barriers that prevent you from completing this phase?
• Do you have adequate resources to complete the project?

204
Planning for Action
WHAT WHO WHEN WHY WHY NOT HOW
Identify the complexity of the process
Focus on the problem solving process
Define Characteristics of Data
Validate Financial Benefits
Balance and Focus Resources

Establish potential relationships between variables
Quantify risk of meeting critical needs of Customer, Business and People
Predict the Risk of sustainability
Chart a plan to accomplish the desired state of the culture


What is your defect?
When does your defect occur?
How is your defect measured?
What is your project financial goal (target & time) to reach it?
What is your Primary metric?
What are your Secondary metrics?
Define the appropriate elements of waste

205
Summary

At this point, you should:

• Have a clear understanding of the specific action items

• Have started to develop a project plan to complete the action items

• Have identified ways to deal with potential roadblocks

• Be ready to apply the Six Sigma method within your business

206
Thank you!!!
QUESTIONS
AND
DISCUSSIONS

207
