
5.0 DMAIC

DMAIC stands for define, measure, analyze, improve and control, and all continuous applications (CA's) in LSSQTT are tied to parts of DMAIC. DMAIC principles, applied through CA's, are a key part of the way we create and manage projects toward a useful and productive conclusion for improvement. Work in any part of DMAIC is not necessarily linear; a team may move about in the various CA's, and within DMAIC, as needed to complete a project in mature, cost-effective ways that account for team-based teaching and learning. Regardless of which worksheets are selected, assigned, or used to build your team model, all are continuously developed and added to, in data and documentation, over the course, with everyone on the team contributing to each, building on what was started in the "define" DMAIC workbook. Worksheets 5.1-5.12 all illustrate various important "control" principles, but there are certainly other ways to control, as may be defined by your team through your model and project.

Control, Part of DMAIC
"Improve" worksheets, similar to all parts of DMAIC, are assigned by team leader for the tool or phase being completed according to the tool/phase rollout in the syllabus. Team leader rotates alphabetically for each tool or phase, letting various persons do this (at least once) and to improve all systems appropriately/cross functionally, as documented/grown over time. All tool assignments, engage LSSQTT content for review and action, in concert with "define" documentation. Some work (and worksheets) should be modified as the project is further defined, but much will be used to explain the work of the team, and to meet course outcomes, cumulatively. Each person on the team does not need to work on each worksheet at each assignment, but should have participated with each, at least once, by the end of the course. All on the team should have participated in each application worksheet selected by the team, demonstrably adding data and/or documentation to it, and /or added value to the worksheet in ways which have improved the DMAIC LSSQTS system.

Team Assignments, Flow Of Work, Individual And Grand Compilations


All review assigned content in LSSQTT, individually do applications, and post responses in the team work area (created by the instructor) as assigned by team leaders (and from the syllabus). Chats are configured and led by the team leader, who assures the chat room is ready to go, archiving is running, the agenda is posted, etc. Several items in applications should be "standing" items in chat, done early in the posting cycle, with a second chat as may be needed for best practices at the end of the cycle (inviting the instructor may also be helpful). The team leader creates posting threads, and all on the team post their individual tool work, according to schedule, as suppliers. The team leader also assigns specific compiling work for a "grand" compilation in a continuously updated and improving project portfolio. All post at threads in the team work area, and those not posting on schedule should have points reduced by team assessment. Work must be well planned, articulated, and managed, with all doing separate worksheets in initial postings, and compiled data/documentation reflecting team best practices.

Continuous Applications, FACR Relationships, Phases I and II


The project evolves as a prototypically developed model, based on the FACR's derived and on LSSQTT content and applications, started and continued in the "Define" section of DMAIC. The FACR in the "define" application workbook should be used to track all main, evolving findings, analyses, and related conclusions and recommendations, ongoing and incrementally evolved. This requires a continuing and emerging plan, configured by the team leader, with timelines adhered to, and where all "grow" knowledge in team, culture, and total portfolio as a vision for improvement, evidenced particularly at phases I and II. Multiple tools and DMAIC CA's will be provided incrementally at the course information area to help assess systems in a change and improvement context. Teams continuously evolve applications, all integrated into an evolving team model, as project portfolio documentation. Each tool and DMAIC CA worksheet selected by your team should be trialed, applied in your project model, and changed as deemed appropriate, based on FACR analysis. Work started in an earlier tool must be gone back through, cleaned up, added to, and improved, as all learn and grow team knowledge, focus the project, etc.

5.1 Pareto Charting.


Pareto charting is often done early in analyses, piggy-backed on histograms and other preliminary data collection and problem identification. It is used to show areas needing attention versus those we can postpone. Thus, it is a good decision tool for getting to the root of the problem associated with a characteristic or attribute that is indicating a defect or defective. The general shape of the chart is constant, but the frequency and the % values shift to present the relationships inherent in the facts being shown.

General Directions.
Identify the problems or attributes to study. Collect data on the frequency of the attributes. List the attributes in the table in descending order, with the highest frequency at the top and the lowest at the bottom. The percentage of total occurrence for each attribute, relative to the total of all occurrences, will be automatically calculated in the third column. This will automatically generate a Pareto chart with the percentage frequency listed on the vertical axis and the attributes listed on the horizontal axis. Each column represents an individual attribute and is shaded with a different color. No single attribute's % frequency will generally reach 100%, since it is very unlikely that one attribute would account for every occurrence.
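For teams working outside the worksheet, the following is a minimal sketch of what the worksheet automates: tally attribute frequencies, sort them in descending order, and compute each attribute's share of total occurrences. The counts here are illustrative (hypothetical), chosen so the shares match the hospital-lab table below.

```python
# Illustrative (hypothetical) frequency counts for the hospital-lab attributes.
counts = {
    "Contamination": 37, "Glassware Breakage": 24, "Sloppy Housekeeping": 15,
    "Inventory Mismanagement": 10, "Improper Sterilization": 7,
    "Disposable Apparatus": 5, "Other": 2,
}

total = sum(counts.values())
# Sort by frequency, highest first, and print each attribute's % of total.
for attr, freq in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{attr:<24} {freq:>4}  {100 * freq / total:5.1f}%")
```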

Attribute Table.
The following table lists the data from a study of the causes of expenses incurred in a simulated hospital lab. The most probable problem areas are identified, and their frequencies of occurrence are tabulated in decreasing order. The frequency is listed according to the extra expenses incurred in a month's period due to the corresponding attribute. The attribute with the highest frequency was the major culprit in increasing extra expense during the period being analyzed, and thus is likely the area to attack first for improvement, with other areas following in decreasing order. A graphical analysis of the relationship between individual attributes is shown in the Pareto chart below.

Attributes Identified and % of Total Occurrence:

Attribute                  % of Total Occurrence
Contamination              37%
Glassware Breakage         24%
Sloppy Housekeeping        15%
Inventory Mismanagement    10%
Improper Sterilization     7%
Disposable Apparatus       5%
Other                      2%

[Pareto chart: % of total occurrence (vertical axis, 0-40%) by attribute (horizontal axis), bars in decreasing order.]

5.2 General Safety Inspection Checklist

Each checklist item is rated OK, Not Safe, or Not Applicable, with space to record the actual site and explanations/action/other notes:

Good Housekeeping/Cleanliness
Piling and Storage/Tagging Systems
Aisles, Walkways, and Exits
Tools and Supplies
Ladders and Stairs
Machinery and Equipment
Floors, Platforms, and Railings
Electrical Fixtures/Equipment
Dust, Ventilation, and Explosives
Overhead Valves, Pipes, Markings
Protective Clothing/Equipment
Washroom, Lockers, Shower, Deluge
Unsafe Practices/Horseplay/SOPs
First Aid Facilities
Vehicles, Hand and Power Trucks
Fire Fighting Equipment
Guards and Safety Devices
Lighting, Work Tables/Areas
General Maintenance
Safety Training, Communication
Company/OSHA Standards Compliance
Cranes, Hoists, Conveyors
Scrap and Rubbish
Other Items, Circumstances
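If a team wants to keep the checklist electronically, a sketch of one way to represent entries follows; the field names mirror the worksheet columns above, and the two example findings are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ChecklistEntry:
    item: str              # a checklist item, e.g. "Aisles, Walkways, and Exits"
    status: str            # "OK", "Not Safe", or "Not App."
    actual_site: str = ""  # Actual Site column
    explanation: str = ""  # Explanations / Action / Other column

# Hypothetical inspection findings, for illustration only.
findings = [
    ChecklistEntry("Good Housekeeping/Cleanliness", "OK"),
    ChecklistEntry("Ladders and Stairs", "Not Safe", "Dock 2",
                   "Missing handrail; work order issued"),
]

# Flag anything rated Not Safe for follow-up.
unsafe = [f for f in findings if f.status == "Not Safe"]
print(f"{len(unsafe)} unsafe condition(s) found")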

5.3 Pie Chart


A pie chart is a graphical representation of the defects or areas needing improvement in a process. The chart below shows 100 product defects by category, sorted and organized within the pie chart by count and percentage. This analytical tool gives a simple "slice out of the pie" for each area represented. Again, the power in this approach is "seen" clearly by comparing, in rather straightforward ways, the larger versus smaller areas--and where we need to go to work to make improvements. This is particularly true for attribute data at the workplace, as a quick and easy analytical aid for operators.

General Directions
Identify defects in a fixed number of final products of the process. Categorize the defects and quantify the number of products associated with each defect. Place the defects in the first column of the table and the number of products with the corresponding defect in the second column. The third column will automatically calculate the % frequency of each defect by dividing each defect's frequency by the total number of defects in the study. These values will automatically appear in the pie chart, with each slice of the pie representing a problem area. Using this graphical tool, a comparison between the various defects can be made. The defect occupying the largest area in the pie chart is the most problematic and requires immediate attention.
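A minimal matplotlib sketch of the same chart, using the defect counts from the attribute table below (the fractional "Too small" count is carried over from the worksheet as displayed):

```python
import matplotlib.pyplot as plt

defects = {"Too small": 19.5, "Too Large": 5, "Out of round": 6,
           "Wrong Color": 10, "Surface Problems": 12}

# One slice per defect, labeled with its % frequency of the total.
plt.pie(list(defects.values()), labels=list(defects.keys()), autopct="%1.0f%%")
plt.title("Pie Chart Showing Defects & % Frequency")
plt.show()
```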

Attribute Table
Attributes/Defects    Number of Products with the Defect    % Frequency
Too small             19.5477707                            37%
Too Large             5                                     10%
Out of round          6                                     11%
Wrong Color           10                                    19%
Surface Problems      12                                    23%

[Pie chart showing defects & % frequency: Too small 37%, Surface Problems 23%, Wrong Color 19%, Out of round 11%, Too Large 10%.]

ONGOING PROCESS CONTROL PLAN

Plan stage: Prototype / Pre-Launch / Production
Customer:    Control Plan Number:    Part Number/Latest Change Level:
Part Name/Description:    Supplier/Plant:    Supplier Code:    Core Team:
Key Contact/Phone:    Date (Original):    Date (Revision):    Page 1 of 1
Approvals/Dates: Supplier/Plant; Customer Engineering; Customer Quality; Other

Column headings: Part/Process Number; Process Name/Operation Description; Machine, Device, Jig, Tools for Mfg.; Characteristics (No./Product/Process); Special Char. Class; Methods (Product/Process Evaluation Measurement Technique; Specification/Tolerances); Sample Size/Frequency; Control Method; Reaction Plan.

1. Receiving -- Receiving Dock
   Characteristic: Receipt & verification (process); special char. class: *
   Evaluation: Quantity, delivery & damage; visual. Sample: All, per lot.
   Control method: Certified vendors; purchase orders/packing slips; procedures/work instructions; trained material handlers.
   Reaction plan: Notify supplier; isolate material until disposition is reached; corrective action.

2. Move Raw Material -- Warehouse
   Characteristic: Relocation to warehouse (process); FIFO. Evaluation: Visual. Sample: All, per lot.
   Control method: Procedures/work instructions; trained material handlers.
   Reaction plan: Notify Material Handling Supervisor; initiate a corrective action.

3. Store Raw Material -- Warehouse
   Characteristic: Storage (process); FIFO / storage & preservation procedure. Evaluation: Visual. Sample: All, per lot.
   Control method: Fork lift operator verifies goods against storage floor plan.
   Reaction plan: Notify supervisor; initiate corrective action.

4. Move Raw Material -- Bruno Press
   Characteristic: Relocation to machine (process); FIFO. Evaluation: Visual. Sample: All, per lot.
   Control method: Procedures/work instructions; trained material handlers.
   Reaction plan: Notify Material Handling Supervisor; initiate a corrective action.

5 & 6. Die Cutting & Inspection -- Bruno Press
   Characteristics: Length 759.4 mm +/- 5.0 mm; width 539.9 mm +/- 5.0 mm. Evaluation: Tape measure. Sample: 1 piece -- first, last & per cavity hourly.
   Control method: Job control plan; work order record.
   Reaction plan: Stop production; inform supervisor; sort back through until parts conform.

7 & 8. Packaging & Labeling -- Bruno Press
   Characteristics: Amount per container (must have correct amount of parts; visual/counting; all, per order); label information (bar-coded label must reflect the proper part #, serial # and quantity; FIFO).
   Control method: Job control plan; work instruction; work order record.
   Reaction plan: Re-count & re-pack parts; change label to reflect the proper information.

9. Move Finished Goods -- Warehouse
   Characteristic: Relocation to warehouse storage (process). Evaluation: Visual. Sample: All, per lot.
   Control method: Procedures/work instructions; trained material handlers.
   Reaction plan: Notify Material Handling Supervisor; initiate a corrective action.

10. Store Finished Goods -- Warehouse/Shipping Area
   Characteristic: Storage (process); storage & preservation procedure, as per work instruction. Evaluation: Visual. Sample: All, per lot.
   Control method: Storage & preservation procedure.
   Reaction plan: Notify supervisor; initiate corrective action.

11. Dock Audit -- Shipping Dock
   Characteristics: Overall condition of parts, labels, cartons, amounts, etc.; part verification; parts in carton must match work order/label. Evaluation: Check sheets, JCPs, blue prints. Sample: As per dock audit procedure.
   Control method: Dock audit check sheet.
   Reaction plan: Isolate order; hold for 100% inspection.

12. Ship to Customer -- Shipping Dock
   Characteristic: Part delivered to the customer, on time and free of damage. Evaluation: Visual; work order; label; Mapics tracking system. Sample: 100%, all orders.
   Control method: Shipping log; on-time delivery policy.
   Reaction plan: Corrective action; notify customer.
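For teams capturing the plan outside the spreadsheet, a sketch of a control-plan row as a typed record follows; the field names follow the column headings above, and the example row is the die-cutting entry from the plan.

```python
from dataclasses import dataclass

@dataclass
class ControlPlanRow:
    process_no: str      # Part/Process Number, e.g. "5 & 6"
    operation: str       # Process Name / Operation Description
    machine: str         # Machine, Device, Jig, Tools for Mfg.
    characteristic: str
    spec_tolerance: str  # Specification / Tolerances
    technique: str       # Evaluation / Measurement Technique
    sample: str          # Sample Size / Frequency
    control_method: str
    reaction_plan: str

row = ControlPlanRow("5 & 6", "Die Cutting & Inspection", "Bruno Press",
                     "Length", "759.4 mm +/- 5.0 mm", "Tape measure",
                     "1 piece: first, last & per cavity hourly",
                     "Job control plan; work order record",
                     "Stop production; inform supervisor; sort until parts conform")
print(row.operation, "-", row.spec_tolerance)
```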

Potential Failure Mode and Effects Analysis (PROCESS FMEA)

Item: Die Cutting Application    FMEA Number:    Prepared by:
Model Year(s)/Vehicles:    Core Team:    Process Responsibility:    Key Date:
FMEA Date (Orig.):    (Rev.) 01

Column headings: Process Description; Potential Failure Mode; Effects of Failure; SEV; Class; Potential Causes of Failure; OCC; Current Controls (Prevention); Current Controls (Detection); DET; RPN; Recommended Action; Resp. & Comp. Date; Action Taken; resulting SEV/OCC/DET/RPN.

1. RECEIVING (VersaMat)
   Failure modes: receive wrong material; wrong thickness of material is received; VersaMat has not been certified; material arrives damaged.
   Effects: will not meet customer specifications (internal reject); may not meet flammability requirements; may lose the proper yield; may cause a slowdown while processing. SEV as recorded: 9, 4.
   Causes: vendor labeling problem; miscommunication between order desk and vendor; vendor failure (material is not certifiable); not loaded in truck correctly; truck leaked; etc. OCC as recorded: 1, 2.
   Controls (detection): vendor certification; receipt & verification to packing slip; A2LA certification. DET as recorded: 9, 9.
   RPN: 45, 63, 81, 72. Recommended action: none.

2. STORE Raw Material / 3. MOVE Raw Material
   Failure modes: damage during movement; stored in the wrong location.
   Effects: will not meet customer specifications (customer rejection); will not meet the customer's delivery. SEV as recorded: 6, 7, 6.
   Causes: mis-handling of material; lack of training; lack of knowledge of storage location. OCC as recorded: 1.
   Controls: forklift training; training records/certification. DET as recorded: 9, 9, 8.
   RPN: 54, 63, 48, 56. Recommended action: none.

4. DIE CUTTING / 5. INSPECTION
   Failure modes: miscut; jagged cuts; dirty parts; incomplete cuts; used wrong die; gage out of calibration.
   Effects: will not meet customer specifications (customer rejection); parts unusable to customer; possible bad products.
   Causes: improper set-up / material mis-fed; dull/bent/damaged die rule; improper head height setting; poor housekeeping; no label on die; calibration date expired. OCC as recorded: 1.
   Controls: housekeeping audits; tooling die procedure (tooling approval log); gage calibration system (gage calibration log); work instructions, job control plans, quality records. DET as recorded: 8.
   RPN: 56, 56, 56, 56, 32. Recommended action: none.

6. PACKAGING / 7. LABELING
   Failure modes: wrong part is produced / wrong amount of parts; too many parts in container; not enough parts in container.
   Effects: customer rejection; customer may not have enough to fill orders; container may not be able to hold up and could break during shipping; increased shipping costs. SEV as recorded: 7, 4.
   Causes: not following work instructions; work order discrepancy; packer not following JCP work instruction.
   Controls: production planner reviews work orders; work instructions; shipper's pick list.
   RPN: 56, 72, 72. Recommended action: none.



7. LABELING (continued)
   Failure modes: mislabeled parts / mislabeled carton; wrong parts arrive at customer; parts are all stuck together.
   Effects: customer shutdown (customer unable to keep their process going); customer rejection (parts not fit for use); parts damaged / slid in carton. SEV as recorded: 4.
   Causes: parts were not properly orientated in carton; work order discrepancy.
   Controls: production planner reviews work orders; work instructions; shipper's pick list.
   RPN: 36, 36. Recommended action: none.

8. MOVE Finished Goods / 9. STORE Finished Goods
   Failure modes: damage during movement; stored in the wrong location; procedure not followed correctly.
   Effects: mis-handling of material; will not meet the customer's delivery; customer rejection. SEV as recorded: 6, 6, 7, 4.
   Causes: lack of training; lack of knowledge of storage location; lack of interest in the dock audit review. OCC as recorded: 1.
   Controls: forklift training; training records/certification; dock audit sheet. DET as recorded: 8, 9, 9, 9.
   RPN: 48, 54, 63, 36. Recommended action: none.

10. DOCK AUDIT / 11. MOVE Finished Goods / 12. SHIPPING
   Failure modes: damage during movement; parts arrive at customer late; parts arrive at customer too early; parts arrive at customer damaged.
   Effects: could cause some downtime for the customer; could cause a storage problem for the customer; customer rejection. SEV as recorded: 6, 2, 5.
   Causes: parts not cut/assembled on time; trucking problem; work order discrepancy; shipping problem; parts not loaded on truck properly; cartons not sealed/packed correctly. OCC as recorded: 1.
   Controls: on-time delivery policy; delivery tracking system; loading instructions, packaging instructions. DET as recorded: 8, 9, 9.
   RPN: 48, 54, 18, 45. Recommended action: none.
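The RPN arithmetic behind the FMEA's numeric columns is the product of the severity, occurrence, and detection ratings (each typically scored 1-10). The example combination below is illustrative, not a row from the table:

```python
def rpn(sev: int, occ: int, det: int) -> int:
    # Risk Priority Number = severity x occurrence x detection.
    return sev * occ * det

# One plausible combination: severity 9, occurrence 1, detection 5.
print(rpn(9, 1, 5))   # -> 45, matching one of the RPN values above
```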


Variable Statistical Process Control (VSPC)


Researcher(s):    Part:    Characteristic:    Operator:    Operation:
Compiler(s): JWS and Prakriti    Team:    Phase:    Date:    Tool:

Inspection, Data Gathering Description (Attach Pertinent Information): The description identified here includes a main emphasis on 100% inspection by the operator.

Sample   Measured values          Sum    Average  Range
1        501 500 496 495 502      2494   499      7
2        502 496 497 498 501      2494   499      6
3        498 508 502 500 498      2506   501      10
4        497 500 496 497 496      2486   497      4
5        502 500 501 496 498      2497   499      6
6        502 500 501 496 498      2497   499      6
7        498 508 502 500 498      2506   501      10
8        502 500 501 496 498      2497   499      6
9        502 500 501 496 498      2497   499      6
10       502 500 501 496 498      2497   499      6
11       497 500 496 496 498      2487   497      4
12       497 500 496 497 496      2486   497      4
13       502 500 501 500 498      2501   500      4

Calculate Grand Mean or X Double Bar = 499.2 (plotted as 499); Calculate R Bar = 6
Calculate UCL = X Double Bar + (A2)(R Bar) = 499 + 0.58(6) = 503
Calculate LCL = X Double Bar - (A2)(R Bar) = 499 - 0.58(6) = 496   (where A2 = .58 constant)
Calculate UCL R = (D4)(R Bar) = 12.82; and LCL R = (D3)(R Bar) = 0   (where D3 = 0 and D4 = 2.11 constants)
**Configure graphs to accommodate the values calculated, and to fit within work areas, illustrating the process across the samples.

[X-bar chart: center line X Double Bar = 499, with UCL = 503 and LCL = 496, plotted across 18 samples. R chart: center line R Bar = 6, with UCL R = 12.82 and LCL R = 0.]
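A sketch of the same X-bar/R control-limit arithmetic, using the 13 samples tabulated above (n = 5 per sample); A2, D3, and D4 are the worksheet's constants for subgroup size 5. The small differences from the worksheet's 503/496 come from its rounding of the grand mean and R bar.

```python
samples = [
    [501, 500, 496, 495, 502], [502, 496, 497, 498, 501],
    [498, 508, 502, 500, 498], [497, 500, 496, 497, 496],
    [502, 500, 501, 496, 498], [502, 500, 501, 496, 498],
    [498, 508, 502, 500, 498], [502, 500, 501, 496, 498],
    [502, 500, 501, 496, 498], [502, 500, 501, 496, 498],
    [497, 500, 496, 496, 498], [497, 500, 496, 497, 496],
    [502, 500, 501, 500, 498],
]
A2, D3, D4 = 0.58, 0.0, 2.11   # constants for subgroup size 5

xbars = [sum(s) / len(s) for s in samples]
ranges = [max(s) - min(s) for s in samples]
x_dbar = sum(xbars) / len(xbars)    # grand mean, X double bar
r_bar = sum(ranges) / len(ranges)   # average range, R bar

print(f"X-bar chart: UCL = {x_dbar + A2 * r_bar:.1f}, LCL = {x_dbar - A2 * r_bar:.1f}")
print(f"R chart:     UCL = {D4 * r_bar:.2f}, LCL = {D3 * r_bar:.2f}")
# -> about 502.7 / 495.6 and 12.82 / 0.00 (the worksheet rounds to 503 / 496)
```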


Attribute Gage R&R Study


Operation Characteristic Gage Name Gage No. Name Operator A Operator B Operator C **NOTES:
1. Select at least 20 parts, some slightly below and some above both

Date

spec. limits. Legend Attribute Legend A Accept B D G G N Bad Defect Go


Operator A 2. Select at least 2 operators for this study. 3. The gage is acceptable if all measurement decisions agree.

Good NoGo
Operator C Trial 1 Trial 2 Score

Calculation Field
Operator A Operator B Operator C Trail 1 Trial 2 Trial 1 Trial 2 Trial 1 Trial 2

Known Attribute

Operator B

Sample ID Attribute Trail 1 Trial 2 Trial 1 Trial 2

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20

Sample   Known   A Trial 1   A Trial 2   B Trial 1   B Trial 2   All agree
1        G       G           G           G           G           Y
2        G       N           G           G           G           N
3        G       G           G           G           G           Y
4        G       G           G           G           G           Y
5        N       G           N           N           N           N
6        G       G           G           G           G           Y
7        G       G           G           G           G           Y
8        G       G           G           G           G           Y
9        N       N           N           N           N           Y
10       G       G           G           G           G           Y

(Only 10 of the 20 sample rows were filled in; no readings were recorded for Operator C, so its calculation columns score 0.)

Score via trial/operator (as recorded): Operator A: 80%, 100%, 90%; Operator B: 100%, 100%, 100%; Operator C: 0%, 0%, 0%; overall score: 80%.
Summary % appraiser effectiveness (as recorded): 0.8, 0.9, 1, 0.

**Notes: The gage is acceptable if all measurement decisions agree. If measurement decisions do not agree, the gage must be improved and re-evaluated. If the gage cannot be improved, it is unacceptable, and an alternate measurement system should be found.
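A sketch of the scoring rule the study uses: an appraiser is counted effective on a part only if every trial matches the known attribute. The data are the first ten parts from the table above (G = go/good, N = no-go); Operator C is omitted since no readings were recorded.

```python
known = list("GGGGNGGGNG")   # known attributes for the ten parts
trials = {
    "Operator A": [list("GNGGGGGGNG"), list("GGGGNGGGNG")],
    "Operator B": [list("GGGGNGGGNG"), list("GGGGNGGGNG")],
}

for op, (t1, t2) in trials.items():
    # Effective on a part only if both trials agree with the known attribute.
    agree = sum(a == b == k for a, b, k in zip(t1, t2, known))
    print(f"{op}: {100 * agree / len(known):.0f}% effective")
# Operator A scores 80% (parts 2 and 5 disagree); Operator B scores 100%.
```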

Variable Data Gage Reproducibility And Repeatability System (VDGRARS)


Part Description:

Inspection Description:

Operation:

Tolerance: 0.032

Characteristic:

Charts Used:

Gage/Device:


Gage R&R Data Table:


Operator A:
Sample   Trial 1   Trial 2   Trial 3   Range
1        1.300     1.301     1.300     0.001
2        1.303     1.303     1.302     0.001
3        1.306     1.305     1.306     0.001
4        1.305     1.305     1.305     0.000
5        1.300     1.300     1.301     0.001
6        1.300     1.300     1.300     0.000
7        1.297     1.300     1.300     0.003
8        1.300     1.306     1.299     0.007
9        1.306     1.306     1.306     0.000
10       1.305     1.306     1.306     0.001

Operator B:
Sample   Trial 1   Trial 2   Trial 3   Range
1        1.300     1.301     1.300     0.001
2        1.303     1.303     1.302     0.001
3        1.306     1.305     1.306     0.001
4        1.305     1.305     1.305     0.000
5        1.300     1.300     1.301     0.001
6        1.300     1.300     1.300     0.000
7        1.297     1.300     1.300     0.003
8        1.300     1.306     1.299     0.007
9        1.306     1.306     1.306     0.000
10       1.305     1.306     1.306     0.001

Operator C:
Sample   Trial 1   Trial 2   Trial 3   Range
1        1.300     1.300     1.305     0.005
2        1.302     1.302     1.305     0.003
3        1.306     1.306     1.305     0.001
4        1.305     1.305     1.305     0.000
5        1.301     1.301     1.305     0.004
6        1.300     1.300     1.305     0.005
7        1.300     1.300     1.305     0.005
8        1.299     1.299     1.305     0.006
9        1.306     1.306     1.305     0.001
10       1.306     1.306     1.305     0.001

Trial averages: Operator A: 1.302, 1.303, 1.303 (overall 1.302, RA = 0.002); Operator B: 1.303, 1.303, 1.303 (overall 1.303, RB = 0.002); Operator C: 1.303, 1.303, 1.305 (overall 1.303, RC = 0.003).

R Average: 0.002    Minimum operator average: 1.303    Maximum operator average: 1.305    Difference: 0.002
Number of operators (r): 3    Number of trials (n): 3    Constants: D4 = 2.58, K1 = 3.05, K2 = 2.7    Tolerance: 0.032

Calculations:
Calculate UCL R = (R Average)(D4):  UCL R = 0.0052
Calculate Equipment Variation (EV) = (R Average)(K1):  EV = 0.0062
Calculate Appraiser Variation (AV) = sqrt{[(Diff Max-Min)(K2)]^2 - [(EV)^2 / (n x r)]}:  AV = 0.0064
Calculate R&R = sqrt[(EV)^2 + (AV)^2]:  R&R = 0.0089

Calculate %EV = 100[(EV)/(Tolerance)]:  %EV = 19.38
Calculate %AV = 100[(AV)/(Tolerance)]:  %AV = 44.67
Calculate %R&R = sqrt[(%EV)^2 + (%AV)^2]:  %R&R = 48.69
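A sketch of the EV/AV/R&R arithmetic with the worksheet's constants. The R-average and operator-difference inputs below are assumed unrounded values (the sheet displays both as 0.002) chosen so the outputs land on the worksheet's 0.0062 / 0.0064 / 0.0089.

```python
from math import sqrt

r_avg, x_diff = 0.00203, 0.0025   # assumed unrounded R average and max-min diff
n, r = 3, 3                        # trials per part, number of operators
K1, K2 = 3.05, 2.7                 # worksheet constants for 3 trials / 3 operators

EV = r_avg * K1                                             # equipment variation
AV = sqrt(max((x_diff * K2) ** 2 - EV ** 2 / (n * r), 0))   # appraiser variation
RR = sqrt(EV ** 2 + AV ** 2)                                # combined R&R
print(f"EV = {EV:.4f}  AV = {AV:.4f}  R&R = {RR:.4f}")
```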


P Attribute Statistical Process Control (PASPC)


Part:    Characteristic:    Date:

Sample   n    # def (np)   P = np/n
1        60   12           0.200
2        60   8            0.133
3        60   6            0.100
4        60   5            0.083
5        60   10           0.167
6        60   8            0.133
7        60   8            0.133
8        60   11           0.183
9        60   12           0.200
10       60   7            0.117

Operator: Operation:

Inspection, Data Gathering (Attach Pertinent Information) :

[p chart: the ten sample proportions plotted against UCL = 0.281, P bar = 0.145, and LCL = 0.009.]

Calculate Sample n = all n summed:  Sn = 600
Calculate number defective (np) = all defectives summed:  Snp = 87
Calculate P bar = np/n summed / number of samples:  P bar = 0.145
Calculate UCL and LCL = P bar +/- 3 sqrt[P bar (1 - P bar)/n]:  UCL = 0.281, LCL = 0.009

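A sketch of the p-chart limit calculation above, reproducing the worksheet's numbers from the sample data:

```python
from math import sqrt

n = 60                                      # sample size per subgroup
defectives = [12, 8, 6, 5, 10, 8, 8, 11, 12, 7]

p_bar = sum(defectives) / (n * len(defectives))   # 87 / 600 = 0.145
sigma = sqrt(p_bar * (1 - p_bar) / n)
ucl = p_bar + 3 * sigma
lcl = max(p_bar - 3 * sigma, 0.0)                 # floor at zero for proportions
print(f"P bar = {p_bar:.3f}  UCL = {ucl:.3f}  LCL = {lcl:.3f}")
# -> 0.145, 0.281, 0.009, matching the worksheet
```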

5.10 Cpk

Directions:
1. Input the data in the Data column (below); it will accommodate up to 30 points. Delete data that is not yours.
2. Input the Upper Spec & Lower Spec in the cells below the graph. You will find the Cpk at the bottom of the page, in the double-lined box.

Data (in mm), points 1-30:
539.8  539.8  539.8  541.3  541.3  539.8  541.3  541.3  541.3  539.8
541.3  541.3  540.0  539.8  541.3  539.8  541.3  539.8  541.3  539.8
539.8  539.8  539.8  539.8  541.3  539.8  542.0  541.3  539.8  541.3

[Run chart: data points 1-30 plotted as values (mm), vertical axis 520.0-550.0.]

Upper Spec: 544.9    Lower Spec: 534.9    Center Spec: 539.9
Mean: 540.518    Variance: 0.68564    StDev: 0.828    Cp: 2.01    Cpk: 1.76
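A sketch of the Cp/Cpk arithmetic on the 30 data points above. The worksheet reports mean 540.518, StDev 0.828, Cp 2.01, and Cpk 1.76; running this sketch gives slightly different values because the data column above is displayed rounded to one decimal.

```python
import statistics

data = [539.8, 539.8, 539.8, 541.3, 541.3, 539.8, 541.3, 541.3, 541.3,
        539.8, 541.3, 541.3, 540.0, 539.8, 541.3, 539.8, 541.3, 539.8,
        541.3, 539.8, 539.8, 539.8, 539.8, 539.8, 541.3, 539.8, 542.0,
        541.3, 539.8, 541.3]
usl, lsl = 544.9, 534.9   # Upper Spec and Lower Spec from the worksheet

mean = statistics.mean(data)
sd = statistics.stdev(data)                 # sample standard deviation
cp = (usl - lsl) / (6 * sd)                 # process capability
cpk = min(usl - mean, mean - lsl) / (3 * sd)  # capability adjusted for centering
print(f"mean = {mean:.3f}  sd = {sd:.3f}  Cp = {cp:.2f}  Cpk = {cpk:.2f}")
```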


Six Sigma Analysis (SSA), Two Factor ANOVA


As six sigma analysis is implemented in an organization, typically one of the areas pursued is designed experiments, often referred to in quality systems as design of experiments (DOE). Generally this is done based on discrete variable data being collected from various sources for other purposes, such as statistical process control (SPC) tracked for customer-oriented purposes. This is an important point, since the maturation which exists in SPC is a foundation from which other, more robust areas of testing and analysis can be pursued. If some reasonable level of SPC maturation does not exist in an organization, it is highly suspect why the organization would want to begin with ANOVA or DOE.

ANOVA and DOE, in just about any form, are fairly complex and robust quality tools used to help further analyze process performance in data-driven circumstances. In the best case, data would already be collected and documented, from which samples could be selected and/or configured for various purposes. Usual purposes relate to comparing the performance of one approach to another, to determine which process approach is best under some set of predetermined conditions. It is important that the processes being analyzed experimentally in the ways discussed here be "under control," as evidenced by traditional SPC methods such as gage R&R, variable data SPC, Cpk indices, and so on. Thus, DOE and ANOVA tools are usually reserved for applications where we need to continuously improve, and where we have already spent considerable time improving.

ANOVA stands for analysis of variance. As it is displayed and applied here, ANOVA is in a fairly simple form known as two factor. The illustrations used are simple, as ANOVA examples go, but still fairly complex, since any form of data has real costs associated with collection and with assuring a reasonable degree of validity and trustworthiness. These illustrations are used, also, since they can actually be run with Excel on a PC, as you can discover by using the systems provided. Other systems and tools are available, but all require other software and setups to make them run and do the work needed. The ANOVA examples provided should help show you ways to configure your data to test at higher levels, and to continue to improve process performance.

As with any ANOVA and/or DOE situation, the numbers and types of factors must be considered. In the examples shown on the pages that follow, two factors are used at any given time. Levels of factors will vary, contingent upon the nature of the work, what we are trying to determine, resources available, etc. Factors and levels are at the root of any situation where we are trying to improve based on comparative analysis. DOE software typically takes some of the guesswork out of the decision making in the setup and running of experiments and/or iterations, and this is a main reason DOE is used as it is. DOE is also commonly used where less scientific, and more technical (process improvement at the factory floor level), analyses are being pursued. As a general rule, DOE software systems provide a less precise analysis relative to standard forms of ANOVA.

The worksheet assumes you are reasonably familiar with Excel, with ANOVA, and with how to manipulate data for statistical purposes. Manipulation of data sets requires taking existing data collection circumstances and modifying them in ways necessary to provide meaningful inputs and outputs for analysis. Hints are provided near the data sets in the examples which follow, intended to guide your decisions and pursuits in solving problems and/or continuously improving. These are only provided as examples and guides, and it is intended that you will become intimately involved in setting up the tables and organizing the data per the parameters allowed in the Excel systems.

When these systems are being analyzed and explored for applicability in your project, note that it is intended that you will use information provided in various LSSQT long and short form tools, and that you and others on the team will focus ROL information and exploration on the same or related topics. As with most other tools and worksheets, there are no absolute guides or definitive examples to "plug in" and use on any one set of circumstances. Rather, as with most lean six sigma quality transformation circumstances, you will need to learn the basics and experiment with what works best, how to use it, and so on.

Set 1: .50% sodium bisulfate addition rate; zeolite amendment rate between .00 and 1.50%.

DAY   Control   0% Z    .75% Z   1% Z    1.25% Z   1.5% Z
1     115.0     300.0   218.0    74.2    113.0     145.0
1     147.0     111.0   114.0    64.0    75.8      98.0
1     103.0     38.6    98.8     111.0   65.9      153.0
2     142.0     388.0   243.0    96.0    196.0     358.0
2     357.0     288.0   160.0    141.0   158.0     198.0
2     177.0     243.0   224.0    178.0   145.0     224.0
3     115.0     139.0   22.2     44.5    94.8      122.0
3     215.0     128.0   91.8     53.2    80.8      249.0
3     136.0     144.0   160.0    52.2    77.0      194.0
4     46.7      38.8    1.1      16.9    17.8      3.4
4     54.8      50.4    4.1      16.8    14.4      18.8
4     40.8      52.5    8.6      14.6    31.6      10.5

Set 2: .75% sodium bisulfate addition rate; zeolite amendment rate between .00 and 1.25%.

DAY   Control   0% Z    .50% Z   .75% Z   1% Z    1.25% Z
1     115.0     86.6    64.0     128.0    47.0    6.9
1     147.0     27.2    68.2     157.0    3.5     40.6
1     103.0     128.0   36.6     45.6     87.0    157.0
2     142.0     276.0   224.0    204.0    128.0   174.0
2     357.0     229.0   265.0    209.0    4.4     129.0
2     177.0     168.0   190.0    160.0    262.0   212.0
3     115.0     50.2    104.0    96.5     96.8    90.2
3     215.0     213.0   96.8     129.0    9.2     100.0
3     136.0     80.0    61.6     73.2     20.4    126.0
4     46.7      6.0     48.8     53.4     34.2    35.9
4     54.8      57.5    19.1     55.0     14.6    30.9
4     40.8      43.6    14.7     26.7     100.0   55.2

Set 3: 1.25% sodium bisulfate addition rate; zeolite amendment rate 1.25%.

DAY   Control   1.25% Z
1     115.0     56.0
1     147.0     26.8
1     103.0     30.6
2     142.0     170.0
2     357.0     129.0
2     177.0     246.0
3     115.0     145.0
3     215.0     141.0
3     136.0     145.0
4     46.7      64.8
4     54.8      89.8
4     40.8      55.0

1. Run at Tools -> Data Analysis -> Anova: Two-Factor With Replication.
2. Highlight the variable names when transposing raw data.
3. When interpreting P-values from the ANOVA calculation sheets, sample and column values below .05 are significant, meaning the variance among and between means was significantly different.
4. When interpreting the P-value for interaction, a value greater than .05 indicates no interaction.
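The worksheet runs Excel's Tools -> Data Analysis -> Anova: Two-Factor With Replication. For teams working outside Excel, the following is a sketch of an alternative route through statsmodels, shown on the day-1 and day-2 rows of Set 3 (control vs 1.25% zeolite, three replicates per cell):

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Long-format rows: (day, amendment rate, response), from Set 3 above.
rows = [(1, "Control", 115.0), (1, "Control", 147.0), (1, "Control", 103.0),
        (1, "Z1.25", 56.0), (1, "Z1.25", 26.8), (1, "Z1.25", 30.6),
        (2, "Control", 142.0), (2, "Control", 357.0), (2, "Control", 177.0),
        (2, "Z1.25", 170.0), (2, "Z1.25", 129.0), (2, "Z1.25", 246.0)]
df = pd.DataFrame(rows, columns=["day", "rate", "response"])

# Two factors with interaction, mirroring "Two-Factor With Replication".
model = ols("response ~ C(day) * C(rate)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # SS, df, F, and p-value per source
```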

Anova: Two-Factor With Replication (Set 1: .50% sodium bisulfate addition rate; zeolite amendment rate variable between .00% and 1.50%)

SUMMARY            Control   0% Z      .75% Z    1% Z     1.25% Z   1.5% Z    Total
Day 1   Count      3         3         3         3        3         3         18
        Sum        365       449.6     430.8     249.2    254.7     396       2145.3
        Average    121.7     149.9     143.6     83.1     84.9      132.0     119.2
        Variance   517.3     18215.5   4209.3    611.2    616.7     883.0     3686.4
Day 2   Count      3         3         3         3        3         3         18
        Sum        676       919       627       415      499       780       3916
        Average    225.3     306.3     209       138.3    166.3     260       217.6
        Variance   13308.3   5508.3    1891      1686.3   702.3     7372      6887.4
Day 3   Count      3         3         3         3        3         3         18
        Sum        466       411       274       149.9    252.6     565       2118.5
        Average    155.3     137       91.3      50.0     84.2      188.3     117.7
        Variance   2780.3    67        4747.4    22.7     87.9      4056.3    3710.1
Day 4   Count      3         3         3         3        3         3         18
        Sum        142.3     141.7     13.71     48.3     63.8      32.73     442.54
        Average    47.4      47.2      4.57      16.1     21.3      10.91     24.59
        Variance   49.4      54.4      14.15     1.69     82.97     59.19     331.79
Total   Count      12        12        12        12       12        12
        Sum        1649.3    1921.3    1345.51   862.4    1070.1    1773.73
        Average    137.44    160.11    112.13    71.87    89.18     147.81
        Variance   7499.7    13815.9   8077.4    2640.2   3164.0    11309.3

ANOVA
Source of Variation   SS         df   MS          F        P-value    F crit
Sample (day)          335281.4   3    111760.47   39.711   4.74E-13   2.798
Columns (rate)        72179.67   5    14435.93    5.129    0.000756   2.409
Interaction           41199.67   15   2746.64     0.976    0.4937     1.880
Within                135089.5   48   2814.36
Total                 583750.2   71

Anova: Two-Factor With Replication (Set 2: .75% sodium bisulfate addition rate; zeolite amendment rate between .00 and 1.25%)

SUMMARY            Control   0% Z      .50% Z    .75% Z   1% Z      1.25% Z   Total
Day 1   Count      3         3         3         3        3         3         18
        Sum        365       241.8     168.8     330.6    137.5     204.5     1448.2
        Average    121.7     80.6      56.3      110.2    45.8      68.2      80.5
        Variance   517.3     2567.2    294.5     3340.1   1744.1    6202.4    2522.6
Day 2   Count      3         3         3         3        3         3         18
        Sum        676       673       679       573      394.4     515       3510.4
        Average    225.3     224.3     226.3     191      131.5     171.7     195.0
        Variance   13308.3   2932.3    1410.3    727      16598.5   1726.3    5616.7
Day 3   Count      3         3         3         3        3         3         18
        Sum        466       343.2     262.4     298.7    126.4     316.2     1812.9
        Average    155.3     114.4     87.5      99.6     42.1      105.4     100.7
        Variance   2780.3    7513.5    514.8     785.5    2272.7    342.3     2871.8
Day 4   Count      3         3         3         3        3         3         18
        Sum        142.3     107.1     82.6      135.1    148.8     122       737.9
        Average    47.4      35.7      27.5      45.0     49.6      40.7      41.0
        Variance   49.4      709.9     344.0     252.7    2001.2    164.7     474.5
Total   Count      12        12        12        12       12        12
        Sum        1649.3    1365.1    1192.8    1337.4   807.1     1157.7
        Average    137.44    113.76    99.4      111.45   67.26     96.48
        Variance   7499.7    7791.4    6815.1    3896.1   5618.8    4165.4

ANOVA
Source of Variation   SS         df   MS         F        P-value      F crit
Sample (day)          230751.2   3    76917.08   26.715   2.61E-10     2.798
Columns (rate)        32355.34   5    6471.07    2.248    0.0646       2.409
Interaction           24702.84   15   1646.86    0.572    0.8817       1.880
Within                138198.6   48   2879.14
Total                 426008     71

Anova: Two-Factor With Replication (Set 3: 1.25% sodium bisulfate addition rate; zeolite amendment rate 1.25%)

SUMMARY            Control   1.25% Z   Total
Day 1   Count      3         3         6
        Sum        365       113.4     478.4
        Average    121.7     37.8      79.7
        Variance   517.3     252.0     2417.8
Day 2   Count      3         3         6
        Sum        676       545       1221
        Average    225.3     181.7     203.5
        Variance   13308.3   3524.3    7305.1
Day 3   Count      3         3         6
        Sum        466       431       897
        Average    155.3     143.7     149.5
        Variance   2780.3    5.3       1155.1
Day 4   Count      3         3         6
        Sum        142.3     209.6     351.9
        Average    47.4      69.9      58.65
        Variance   49.4      322.0     299.5
Total   Count      12        12
        Sum        1649.3    1299
        Average    137.44    108.25
        Variance   7499.7    4313.6

ANOVA
Source of Variation   SS         df   MS         F        P-value      F crit
Sample (day)          79171.99   3    26390.66   10.170   0.000545     3.239
Columns (rate)        5112.92    1    5112.92    1.970    0.1795       4.494
Interaction           9256.72    3    3085.57    1.189    0.3453       3.239
Within                41518.25   16   2594.89
Total                 135059.9   23


FACR's are used for each individual tool, and for each assigned application in the toolset. Generally these are associated with a specific objective, although they may address more than one objective in the project. The goal is to determine how effectively the tool application worked, and in what ways. Also, what was found when it was used, what could be concluded, and what was the recommendation based on use? Note that rows can be added (and inserted), and that a general summary statement should be added at the bottom of the worksheet on this same page. Note that FACR's should be completed for all tools and all applications by all on the team, per the syllabus assignment rollout, although final compilation is assigned by the team leader for a given tool or phase under construction. Iterations for different information and/or data sets for the same application are intentionally configured as multiple sets of data for the same application, as a way to "test" and/or further develop the application. It is also true that applications can be modified and improved as part of the work of the team, if so recommended. All team members first use the application with the project and the tool, with a separate data set generated. After all on the team have used the application, and posted in the appropriate thread area, one person on the team is assigned by the team leader to compile all of the same applications. After all separate applications for a given tool are compiled, the team leader compiles all of these into the tool FACR system as part of the grand portfolio compilation. As each tool is added and the toolset completed across the semester and course, these are added, cumulatively.
Findings, Analysis, Conclusions and Recommendations (FACR's)

Column headings: Tool/App/Team Mem; Project Objective; Analysis Method; Findings; Conclusion; Recommendation.

Directions and Explanation (row guidance):
Project Objective: Each project objective is shown in a similar manner, as appropriate, and relationships to the tool and applications are addressed and explained objectively.
Analysis Method: Analysis methods are typically directly and specifically related to the nature of the application and tool under study, objectively analyzed discretely, clearly, and in detail.
Findings: Findings relate to how the application and tool were found to behave--did they work, and how well? Findings only reflect what was directly addressed, objectively.
Conclusion: Conclusions are directly related to, and based on, the findings and analysis method behaviors. Conclusions only reflect what was directly addressed, objectively.
Recommendation: Recommendations are based on analysis, findings, and conclusions logically derived from the application to the objective, within the context of the tool.

Example row -- Tool __, VSPC, by J. Doe and B. Anybody:
Project Objective: To reduce variation in the ABC production system, specifically at the ______ operation.
Analysis Method: The analysis method was to use the VSPC application as a systematic approach, gathering data iteratively with each team member filling a practice sheet.
Findings: Based on various iterations, it was found that the application was appropriate for monitoring the drill operation for continuous improvement.
Conclusion: The conclusion drawn was that the VSPC application appears to be an appropriate data gathering and monitoring system for continuous improvement and variation reduction.
Recommendation: The recommendation was to continue to collect data and pursue the systematic analysis for variation reduction and continuous improvement.

General information

Name(s) shown at a row for Tool/App/Team Mem may be one person or multiple persons, since, when compiled, the information may be merged (a greater sum vs. individual entries). Rows are added by placing the cursor in the row desired to be added to and using the insert prompt above, dragged down to add rows. Rows can be expanded for more verbiage by going to the row line at the left, under greater numbers of rows.

One indication that a team is working well together, with all "pulling their weight," is that all names are represented in the total final tool compilation and all objectives are addressed.

Typically, the FACR will be several pages in length, all one worksheet, for a given tool, addressing all applications for that tool.

FACR Summary

The FACR summary is a general statement which summarizes and overviews the more detailed listing of FACR information provided by all on the team. The main findings, analyses, conclusions, and recommendations provided, and how they have impacted the project, are appropriate for the summary on each overall tool and/or application, depending on levels of significance as judged by the team. The summary is usually best done by the team leader for that tool, based on compiled input and feedback from all on the team.